WorldWideScience

Sample records for model reproduces broad

  1. The diverse broad-band light-curves of Swift GRBs reproduced with the cannonball model

    CERN Document Server

    Dado, Shlomo; De Rújula, A

    2009-01-01

    Two radiation mechanisms, inverse Compton scattering (ICS) and synchrotron radiation (SR), suffice within the cannonball (CB) model of long gamma ray bursts (LGRBs) and X-ray flashes (XRFs) to provide a very simple and accurate description of their observed prompt emission and afterglows. Simple as they are, the two mechanisms and the burst environment generate the rich structure of the light curves at all frequencies and times. This is demonstrated for 33 selected Swift LGRBs and XRFs, which are well sampled from early time until late time and well represent the entire diversity of the broad band light curves of Swift LGRBs and XRFs. Their prompt gamma-ray and X-ray emission is dominated by ICS of glory light. During their fast decline phase, ICS is taken over by SR which dominates their broad band afterglow. The pulse shape and spectral evolution of the gamma-ray peaks and the early-time X-ray flares, and even the delayed optical `humps' in XRFs, are correctly predicted. The canonical and non-canonical X-ra...

  2. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  3. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  4. Reproducing the hierarchy of disorder for Morpho-inspired, broad-angle color reflection

    DEFF Research Database (Denmark)

    Song, Bokwang; Johansen, Villads Egede; Sigmund, Ole

    2017-01-01

    …on the positional disorder among the identical, multilayered ridges as the critical factor for producing angle-independent color. Realizing such positional disorder of identical nanostructures is difficult, which in turn has limited experimental verification of the different physical mechanisms that have been proposed. In this paper, we suggest an alternative model of inter-structural disorder that can achieve the same broad-angle color reflection, and is applicable to wafer-scale fabrication using conventional thin film technologies. Fabrication of a thin film that produces a pure, stable blue across a viewing angle of more than 120° is demonstrated, together with a robust, conformal color coating.

  5. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    Science.gov (United States)

    Turner, M. A.

    2015-12-01

    Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of

  6. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but also for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility makes using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
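
    The checksum-on-output idea described above lends itself to a very small script. The sketch below uses a hypothetical file layout and function names (it is not the MOM6/SIS2 tooling): it hashes every file under an experiment output directory so that a later rerun can be compared against the recorded manifest.

    ```python
    import hashlib
    import json
    from pathlib import Path

    def checksum_manifest(output_dir: str) -> dict:
        """Return {relative path: sha256 digest} for every file under output_dir."""
        manifest = {}
        root = Path(output_dir)
        for path in sorted(root.rglob("*")):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                manifest[str(path.relative_to(root))] = digest
        return manifest

    # Record the manifest alongside the experiment so it can be version controlled,
    # then compare a rerun against it to detect answer changes.
    if __name__ == "__main__":
        current = checksum_manifest("experiment/output")
        reference = json.loads(Path("experiment/checksums.json").read_text())
        changed = {f for f in current if current[f] != reference.get(f)}
        print("changed files:", sorted(changed) or "none")
    ```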

  7. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  8. Properties of galaxies reproduced by a hydrodynamic simulation

    Science.gov (United States)

    Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.

  9. Broad line regions in Seyfert-1 galaxies

    International Nuclear Information System (INIS)

    Groningen, E. van.

    1984-01-01

    To reproduce observed emission profiles of Seyfert galaxies, rotation in an accretion disk has been proposed. In this thesis, the profiles emitted by such an accretion disk are investigated. Detailed comparison with the observed profiles yields that a considerable fraction can be fitted with a power-law function, as predicted by the model. The author analyzes a series of high quality spectra of Seyfert galaxies, obtained with the 2.5m telescope at Las Campanas. He presents detailed analyses of two objects: Mkn335 and Akn120. In both cases, strong evidence is presented for the presence of two separate broad line zones. These zones are identified with an accretion disk and an outflowing wind. The disk contains gas with very high densities and emits predominantly the lower ionization lines. He reports on the discovery of very broad wings beneath the strong forbidden line 5007. (Auth.)

  10. Anatomical Reproducibility of a Head Model Molded by a Three-dimensional Printer.

    Science.gov (United States)

    Kondo, Kosuke; Nemoto, Masaaki; Masuda, Hiroyuki; Okonogi, Shinichi; Nomoto, Jun; Harada, Naoyuki; Sugo, Nobuo; Miyazaki, Chikao

    2015-01-01

    We prepared rapid prototyping models of heads with unruptured cerebral aneurysm based on image data of computed tomography angiography (CTA) using a three-dimensional (3D) printer. The objective of this study was to evaluate the anatomical reproducibility and accuracy of these models by comparison with the CTA images on a monitor. The subjects were 22 patients with unruptured cerebral aneurysm who underwent preoperative CTA. Reproducibility of the microsurgical anatomy of skull bone and arteries, the length and thickness of the main arteries, and the size of cerebral aneurysm were compared between the CTA image and rapid prototyping model. The microsurgical anatomy and arteries were favorably reproduced, apart from a few minute regions, in the rapid prototyping models. No significant difference was noted in the measured lengths of the main arteries between the CTA image and rapid prototyping model, but errors were noted in their thickness (p < …). It was concluded that these models are useful tools for neurosurgical simulation. The thickness of the main arteries and the size of the cerebral aneurysm should be judged comprehensively, together with other neuroimaging, in consideration of these errors.

  11. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing step by step the modeling process and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature to public scrutiny is also emphasized. 16 refs

  12. Particle acceleration model for the broad-band baseline spectrum of the Crab nebula

    Science.gov (United States)

    Fraschetti, F.; Pohl, M.

    2017-11-01

    We develop a simple one-zone model of the steady-state Crab nebula spectrum encompassing both the radio/soft X-ray and the GeV/multi-TeV observations. By solving the transport equation for GeV-TeV electrons injected at the wind termination shock as a log-parabola momentum distribution and evolved via energy losses, we determine analytically the resulting differential energy spectrum of photons. We find an impressive agreement with the observed spectrum of synchrotron emission, and the synchrotron self-Compton component reproduces the previously unexplained broad 200-GeV peak that matches the Fermi/Large Area Telescope (LAT) data beyond 1 GeV with the Major Atmospheric Gamma Imaging Cherenkov (MAGIC) data. We determine the parameters of the single log-parabola electron injection distribution, in contrast with multiple broken power-law electron spectra proposed in the literature. The resulting photon differential spectrum provides a natural interpretation of the deviation from power law customarily fitted with empirical multiple broken power laws. Our model can be applied to the radio-to-multi-TeV spectrum of a variety of astrophysical outflows, including pulsar wind nebulae and supernova remnants, as well as to interplanetary shocks.
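
    The log-parabola injection spectrum referred to above has the standard curved power-law form shown below; the normalization N_0, reference momentum p_0, slope s and curvature r are generic symbols, not the values fitted by the authors.

    ```latex
    % Log-parabola particle momentum distribution (generic form):
    N(p) = N_0 \left(\frac{p}{p_0}\right)^{-\left[s + r\,\log_{10}(p/p_0)\right]}
    ```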

  13. Modeling and evaluating repeatability and reproducibility of ordinal classifications

    NARCIS (Netherlands)

    de Mast, J.; van Wieringen, W.N.

    2010-01-01

    This paper argues that currently available methods for the assessment of the repeatability and reproducibility of ordinal classifications are not satisfactory. The paper aims to study whether we can modify a class of models from Item Response Theory, well established for the study of the reliability

  14. The construction of a two-dimensional reproducing kernel function and its application in a biomedical model.

    Science.gov (United States)

    Guo, Qi; Shen, Shu-Ting

    2016-04-29

    There are two major classes of cardiac tissue models: the ionic model and the FitzHugh-Nagumo model. During computer simulation, each model entails solving a system of complex ordinary differential equations and a partial differential equation with no-flux boundary conditions. The reproducing kernel method possesses significant applications in solving partial differential equations. The derivative of the reproducing kernel function is a wavelet function, which has local properties and sensitivity to singularities. Therefore, studying the application of the reproducing kernel method would be advantageous. The objective is to apply new mathematical theory to the numerical solution of the ventricular muscle model so as to improve its precision in comparison with other current methods. A two-dimensional reproducing kernel function is constructed in a suitable reproducing kernel space and applied to computing the solution of the two-dimensional cardiac tissue model, using the difference method in time and the reproducing kernel method in space. Compared with other methods, this method holds several advantages such as high accuracy in computing solutions, insensitivity to different time steps and a slow propagation speed of error. It is suitable for disorderly scattered node systems without meshing, and can arbitrarily change the location and density of the solution on different time layers. The reproducing kernel method has higher solution accuracy and stability in the solutions of the two-dimensional cardiac tissue model.
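
    For readers unfamiliar with the terminology, the defining property that gives the method its name can be stated in one line. This is the general textbook definition, not the specific two-dimensional kernel constructed in the paper.

    ```latex
    % Reproducing property of a reproducing kernel Hilbert space H on a domain \Omega
    % with kernel K: point evaluation is an inner product against the kernel section.
    \forall f \in H,\ \forall x \in \Omega : \quad f(x) = \langle f,\, K(\cdot, x) \rangle_{H}
    ```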

  15. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  16. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omit the implications of cohort differences on prediction model performance. In this work, we will perform a prospective validation of three pCR models, including information whether this validation will target transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics meaning we would validate the transferability of the model rather than reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohort for one of the three tested models [Area under the Receiver Operating Curve (AUC) cohort differences model: 0.85], signaling the validation leans towards transferability. Two out of three models had a lower AUC for validation (0.66 and 0.58), one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate if a validation targeted transferability (large differences between training/validation cohort) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
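
    A rough sketch of the cohort-differences idea using scikit-learn is given below (the feature matrices X_train and X_valid are hypothetical; this is not the authors' implementation). A cross-validated AUC near 0.5 indicates similar cohorts, so the validation speaks to reproducibility; an AUC near 1.0 indicates distinct cohorts, so it speaks to transferability.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    def cohort_differences_auc(X_train, X_valid):
        """Fit a classifier to tell training-cohort patients from validation-cohort
        patients; the cross-validated AUC measures how different the cohorts are."""
        X = np.vstack([X_train, X_valid])
        y = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_valid))])
        clf = LogisticRegression(max_iter=1000)
        scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
        return roc_auc_score(y, scores)
    ```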

  17. The broad component of hydrogen emission lines in nuclei of Seyfert galaxies: Comments on a charge exchange model

    International Nuclear Information System (INIS)

    Katz, A.

    1975-01-01

    A model to account for the broad hydrogen line emission from the nuclei of Seyfert galaxies based on charge exchange and collisional processes, as proposed by Ptak and Stoner, is investigated. The model consists of a source of fast (E approx. 10^5 eV) protons streaming through a medium of quiescent gas. One of the major problems that results from such a model concerns the strong narrow hydrogen line core that would be produced, in direct conflict with the observations. The lines cannot arise from gas arranged throughout a spherical volume surrounding the source of the fast particles because the fast protons would produce far more ionizations than the possible number of recombinations. A very dense shell source of less than 1 AU in thickness and at least several tens of parsecs in radius must be invoked to reproduce the asymmetric broad profiles observed. There must be absorption throughout the center of the shell to account for the line profiles. The gas cannot be arranged in dense clumps throughout a large volume because momentum exchange of the gas with the primary particles would quickly accelerate any clumps. The energy balance and energy requirements of such a model are investigated, and it is found that an energy equal to or greater than the total luminosity of most Seyfert galaxies is required to produce the hydrogen line alone. The gas must be mostly neutral and dense (N approx. 10^7) if a reasonable temperature is to be maintained.

  18. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    Science.gov (United States)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed aimed at reproducing various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes for peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers. These latter were reproduced in the models by silicone. The sand forming the models has been previously mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The characteristic components of magnetization isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms

  19. The Accuracy and Reproducibility of Linear Measurements Made on CBCT-derived Digital Models.

    Science.gov (United States)

    Maroua, Ahmad L; Ajaj, Mowaffak; Hajeer, Mohammad Y

    2016-04-01

    To evaluate the accuracy and reproducibility of linear measurements made on cone-beam computed tomography (CBCT)-derived digital models. A total of 25 patients (44% female, 18.7 ± 4 years) who had CBCT images for diagnostic purposes were included. Plaster models were obtained and digital models were extracted from CBCT scans. Seven linear measurements from predetermined landmarks were measured and analyzed on plaster models and the corresponding digital models. The measurements included arch length and width at different sites. Paired t test and Bland-Altman analysis were used to evaluate the accuracy of measurements on digital models compared to the plaster models. Also, intraclass correlation coefficients (ICCs) were used to evaluate the reproducibility of the measurements in order to assess the intraobserver reliability. The statistical analysis showed significant differences on 5 out of 14 variables, and the mean differences ranged from -0.48 to 0.51 mm. The Bland-Altman analysis revealed that the mean difference between variables was (0.14 ± 0.56) and (0.05 ± 0.96) mm and limits of agreement between the two methods ranged from -1.2 to 0.96 and from -1.8 to 1.9 mm in the maxilla and the mandible, respectively. The intraobserver reliability values were determined for all 14 variables of two types of models separately. The mean ICC value for the plaster models was 0.984 (0.924-0.999), while it was 0.946 for the CBCT models (range from 0.850 to 0.985). Linear measurements obtained from the CBCT-derived models appeared to have a high level of accuracy and reproducibility.
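
    The Bland-Altman comparison used above reduces to a mean difference (bias) and limits of agreement at ±1.96 standard deviations of the paired differences. A minimal sketch with made-up example arrays (not the study data):

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bias and 95% limits of agreement between two paired measurement methods."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    plaster = [34.1, 35.6, 28.9, 30.2, 33.0]   # mm, hypothetical plaster-model values
    digital = [34.3, 35.2, 29.1, 30.6, 32.8]   # mm, hypothetical digital-model values
    bias, (low, high) = bland_altman(plaster, digital)
    print(f"bias = {bias:.2f} mm, limits of agreement = ({low:.2f}, {high:.2f}) mm")
    ```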

  20. Examination of reproducibility in microbiological degredation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy source. Toluene was degraded under aerobic conditions at a constant temperature of 28 °C. The experiments were modelled by a Monod model, extended to describe the air/liquid system, and the parameter values were estimated using a statistical nonlinear estimation procedure. Model reduction analysis resulted in a simpler model without the biomass decay term. In order to test for model reduction and reproducibility of parameter estimates, a likelihood ratio test was employed. The limited reproducibility of these experiments implied that all 9 batch experiments could not be described by the same set of parameters.
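
    As an illustration of the kinetics involved, the sketch below integrates a basic Monod model for a liquid batch culture. It omits the air/liquid mass-transfer extension and the biomass decay term discussed above, and the parameter values are placeholders rather than the estimates from these experiments.

    ```python
    from scipy.integrate import solve_ivp

    def monod_rhs(t, y, mu_max, Ks, Y):
        """Basic Monod batch kinetics: S = substrate (toluene), X = biomass."""
        S, X = y
        mu = mu_max * S / (Ks + S)        # specific growth rate
        dS = -mu * X / Y                  # substrate consumption
        dX = mu * X                       # biomass growth (no decay term)
        return [dS, dX]

    # Placeholder parameters: mu_max [1/h], Ks [mg/L], yield Y [mg biomass / mg toluene]
    sol = solve_ivp(monod_rhs, (0.0, 24.0), y0=[50.0, 1.0], args=(0.5, 2.0, 0.6))
    print("toluene remaining after 24 h:", sol.y[0, -1])
    ```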

  1. NRFixer: Sentiment Based Model for Predicting the Fixability of Non-Reproducible Bugs

    Directory of Open Access Journals (Sweden)

    Anjali Goyal

    2017-08-01

    Software maintenance is an essential step in the software development life cycle. Nowadays, software companies spend approximately 45% of total cost on maintenance activities. Large software projects maintain bug repositories to collect, organize and resolve bug reports. Sometimes it is difficult to reproduce the reported bug with the information present in a bug report and thus this bug is marked with resolution non-reproducible (NR). When NR bugs are reconsidered, a few of them might get fixed (NR-to-fix), leaving the others with the same resolution (NR). To analyse the behaviour of developers towards NR-to-fix and NR bugs, the sentiment analysis of NR bug report textual contents has been conducted. The sentiment analysis of bug reports shows that NR bugs' sentiments incline towards more negativity than reproducible bugs. Also, there is a noticeable opinion drift found in the sentiments of NR-to-fix bug reports. Observations drawn from this analysis were an inspiration to develop a model that can judge the fixability of NR bugs. Thus a framework, NRFixer, which predicts the probability of NR bug fixation, is proposed. NRFixer was evaluated along two dimensions. The first dimension considers meta-fields of bug reports (model-1) and the other dimension additionally incorporates the sentiments (model-2) of developers for prediction. Both models were compared using various machine learning classifiers (Zero-R, naive Bayes, J48, random tree and random forest). The bug reports of the Firefox and Eclipse projects were used to test NRFixer. In the Firefox and Eclipse projects, the J48 and naive Bayes classifiers achieve the best prediction accuracy, respectively. It was observed that the inclusion of sentiments in the prediction model yields a rise in prediction accuracy ranging from 2 to 5% for the various classifiers.

  2. The case for inflow of the broad-line region of active galactic nuclei

    Science.gov (United States)

    Gaskell, C. Martin; Goosmann, René W.

    2016-02-01

    The high-ionization lines of the broad-line region (BLR) of thermal active galactic nuclei (AGNs) show blueshifts of a few hundred km/s to several thousand km/sec with respect to the low-ionization lines. This has long been thought to be due to the high-ionization lines of the BLR arising in a wind of which the far side of the outflow is blocked from our view by the accretion disc. Evidence for and against the disc-wind model is discussed. The biggest problem for the model is that velocity-resolved reverberation mapping repeatedly fails to show the expected kinematic signature of outflow of the BLR. The disc-wind model also cannot readily reproduce the red side of the line profiles of high-ionization lines. The rapidly falling density in an outflow makes it difficult to obtain high equivalent widths. We point out a number of major problems with associating the BLR with the outflows producing broad absorption lines. An explanation which avoids all these problems and satisfies the constraints of both the line profiles and velocity-resolved reverberation-mapping is a model in which the blueshifting is due to scattering off material spiraling inwards with an inflow velocity of half the velocity of the blueshifting. We discuss how recent reverberation mapping results are consistent with the scattering-plus-inflow model but do not support a disc-wind model. We propose that the anti-correlation of the apparent redshifting of Hβ with the blueshifting of C iv is a consequence of contamination of the red wings of Hβ by the broad wings of [O iii].

  3. Assessment of the potential forecasting skill of a global hydrological model in reproducing the occurrence of monthly flow extremes

    Directory of Open Access Journals (Sweden)

    N. Candogan Yossef

    2012-11-01

    As an initial step in assessing the prospect of using global hydrological models (GHMs) for hydrological forecasting, this study investigates the skill of the GHM PCR-GLOBWB in reproducing the occurrence of past extremes in monthly discharge on a global scale. Global terrestrial hydrology from 1958 until 2001 is simulated by forcing PCR-GLOBWB with daily meteorological data obtained by downscaling the CRU dataset to daily fields using the ERA-40 reanalysis. Simulated discharge values are compared with observed monthly streamflow records for a selection of 20 large river basins that represent all continents and a wide range of climatic zones.

    We assess model skill in three ways, all of which contribute different information on the potential forecasting skill of a GHM. First, the general skill of the model in reproducing hydrographs is evaluated. Second, model skill in reproducing significantly higher and lower flows than the monthly normals is assessed in terms of skill scores used for forecasts of categorical events. Third, model skill in reproducing flood and drought events is assessed by constructing binary contingency tables for floods and droughts for each basin. The skill is then compared to that of a simple estimation of discharge from the water balance (P-E).

    The results show that the model has skill in all three types of assessments. After bias correction the model skill in simulating hydrographs is improved considerably. For most basins it is higher than that of the climatology. The skill is highest in reproducing monthly anomalies. The model also has skill in reproducing floods and droughts, with a markedly higher skill in floods. The model skill far exceeds that of the water balance estimate. We conclude that the prospect for using PCR-GLOBWB for monthly and seasonal forecasting of the occurrence of hydrological extremes is positive. We argue that this conclusion applies equally to other similar GHMs and
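
    The categorical-event skill mentioned above reduces to 2x2 contingency-table statistics. The sketch below computes two common ones (hit rate and the Peirce skill score) from counts of hits, false alarms, misses and correct negatives; it is a generic illustration, not the exact score set used in the study.

    ```python
    def categorical_skill(hits, false_alarms, misses, correct_negatives):
        """Skill of binary event forecasts (e.g. flood / no flood) from a 2x2 contingency table."""
        hit_rate = hits / (hits + misses)                        # probability of detection
        false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
        peirce = hit_rate - false_alarm_rate                     # Peirce / Hanssen-Kuipers score
        return {"hit_rate": hit_rate,
                "false_alarm_rate": false_alarm_rate,
                "peirce_skill_score": peirce}

    # Example: 18 observed floods, 14 forecast correctly; 9 false alarms in 400 months.
    print(categorical_skill(hits=14, false_alarms=9, misses=4, correct_negatives=373))
    ```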

  4. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    International Nuclear Information System (INIS)

    Tian, Jun-fang; Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao

    2012-01-01

    Velocity effect and critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown has been explained. Various congested patterns induced by the on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and of free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► Velocity effect is added into the average space gap cellular automaton model. ► The physics of traffic breakdown has been explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.
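
    For readers unfamiliar with this class of models, the sketch below implements the classic Nagel-Schreckenberg cellular automaton on a ring road. It is not the average space gap model of Tian et al., only a minimal example of the accelerate/brake/randomize/move update cycle that such three-phase models extend.

    ```python
    import numpy as np

    def nasch_step(pos, vel, road_length, v_max=5, p_slow=0.3, rng=None):
        """One parallel update of the Nagel-Schreckenberg CA (periodic road)."""
        rng = rng or np.random.default_rng()
        order = np.argsort(pos)                              # sort cars by position
        pos, vel = pos[order], vel[order]
        gaps = (np.roll(pos, -1) - pos - 1) % road_length    # empty cells to the car ahead
        vel = np.minimum(vel + 1, v_max)                     # 1. acceleration
        vel = np.minimum(vel, gaps)                          # 2. braking (no collisions)
        slow = rng.random(vel.size) < p_slow
        vel = np.where(slow, np.maximum(vel - 1, 0), vel)    # 3. random slowdown
        pos = (pos + vel) % road_length                      # 4. movement
        return pos, vel

    # 100-cell ring with 20 cars, all initially at rest.
    rng = np.random.default_rng(0)
    pos = np.sort(rng.choice(100, size=20, replace=False))
    vel = np.zeros(20, dtype=int)
    for _ in range(200):
        pos, vel = nasch_step(pos, vel, 100, rng=rng)
    print("mean speed:", vel.mean())
    ```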

  5. COMBINE archive and OMEX format : One file to share all information to reproduce a modeling project

    NARCIS (Netherlands)

    Bergmann, Frank T.; Olivier, Brett G.; Soiland-Reyes, Stian

    2014-01-01

    Background: With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models,

  6. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using commercial software STAR-CCM+, both with constant and randomly varying injection rate. Stochastic simulations are able to capture variability in the experiments associated with the varying pump injection rate.

  7. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    Science.gov (United States)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, the question is raised by industry and AM users on how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper aims to investigate and evaluate the repeatability and reproducibility of the FDM process through a systematic approach to answer this frequently asked question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in the printed parts of the FDM process. After running the simulation and analysis of the data, the FDM process capability is evaluated, which would help the industry for better understanding the performance of FDM technology.
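
    Gage R&R as used above separates measurement variance into repeatability (equipment) and reproducibility (operator) components. The sketch below is a generic ANOVA-based crossed gage R&R computation on a parts x operators x replicates array; it is illustrative only and not tied to the FDM dataset or software used in the paper.

    ```python
    import numpy as np

    def gage_rr(data):
        """ANOVA-based gage R&R for a crossed study.
        data: array of shape (parts, operators, replicates) of measurements."""
        p, o, r = data.shape
        grand = data.mean()
        part_m = data.mean(axis=(1, 2))
        oper_m = data.mean(axis=(0, 2))
        cell_m = data.mean(axis=2)

        ss_part = o * r * ((part_m - grand) ** 2).sum()
        ss_oper = p * r * ((oper_m - grand) ** 2).sum()
        ss_po = r * ((cell_m - part_m[:, None] - oper_m[None, :] + grand) ** 2).sum()
        ss_rep = ((data - cell_m[:, :, None]) ** 2).sum()

        ms_part = ss_part / (p - 1)
        ms_oper = ss_oper / (o - 1)
        ms_po = ss_po / ((p - 1) * (o - 1))
        ms_rep = ss_rep / (p * o * (r - 1))

        var_rep = ms_rep                                   # repeatability (equipment)
        var_po = max((ms_po - ms_rep) / r, 0.0)
        var_oper = max((ms_oper - ms_po) / (p * r), 0.0)   # reproducibility (operator)
        var_part = max((ms_part - ms_po) / (o * r), 0.0)
        grr = var_rep + var_oper + var_po
        return {"repeatability": var_rep, "reproducibility": var_oper + var_po,
                "GRR": grr, "percent_contribution": 100 * grr / (grr + var_part)}
    ```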

  8. Can a coupled meteorology–chemistry model reproduce the ...

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  9. Pharmacokinetic Modelling to Predict FVIII:C Response to Desmopressin and Its Reproducibility in Nonsevere Haemophilia A Patients.

    Science.gov (United States)

    Schütte, Lisette M; van Hest, Reinier M; Stoof, Sara C M; Leebeek, Frank W G; Cnossen, Marjon H; Kruip, Marieke J H A; Mathôt, Ron A A

    2018-04-01

    Nonsevere haemophilia A (HA) patients can be treated with desmopressin. Response of factor VIII activity (FVIII:C) differs between patients and is difficult to predict. Our aims were to describe FVIII:C response after desmopressin and its reproducibility by population pharmacokinetic (PK) modelling. Retrospective data of 128 nonsevere HA patients (age 7-75 years) receiving an intravenous or intranasal dose of desmopressin were used. PK modelling of FVIII:C was performed by nonlinear mixed effect modelling. Reproducibility of FVIII:C response was defined as less than 25% difference in peak FVIII:C between administrations. A total of 623 FVIII:C measurements from 142 desmopressin administrations were available; 14 patients had received two administrations on different occasions. The FVIII:C time profile was best described by a two-compartment model with first-order absorption and elimination. Interindividual variability of the estimated baseline FVIII:C, central volume of distribution and clearance was 37, 43 and 50%, respectively. The most recently measured FVIII:C (FVIII-recent) was significantly associated with FVIII:C response to desmopressin (p < …), with an FVIII:C increase of 0.47 IU/mL (median, interquartile range: 0.32-0.65 IU/mL, n = 142). FVIII:C response was reproducible in 6 out of 14 patients receiving two desmopressin administrations. FVIII:C response to desmopressin in nonsevere HA patients was adequately described by a population PK model. Large variability in FVIII:C response was observed, which could only partially be explained by FVIII-recent. FVIII:C response was not reproducible in a small subset of patients. Therefore, monitoring FVIII:C around surgeries or bleeding might be considered. Research is needed to study this further.
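
    The structural model named above (two compartments, first-order absorption and elimination) is written out below in its generic textbook form; the symbols k_a, CL, Q, V_c and V_p are generic PK parameters, not the published desmopressin/FVIII:C estimates.

    ```latex
    % Generic two-compartment model with first-order absorption and elimination.
    % A_a: absorption (depot) amount, A_c: central amount, A_p: peripheral amount.
    \frac{dA_a}{dt} = -k_a A_a, \qquad
    \frac{dA_c}{dt} = k_a A_a - \frac{CL + Q}{V_c}\,A_c + \frac{Q}{V_p}\,A_p, \qquad
    \frac{dA_p}{dt} = \frac{Q}{V_c}\,A_c - \frac{Q}{V_p}\,A_p, \qquad
    C(t) = \frac{A_c(t)}{V_c}
    ```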

  10. From alginate impressions to digital virtual models: accuracy and reproducibility.

    Science.gov (United States)

    Dalstra, Michel; Melsen, Birte

    2009-03-01

    To compare the accuracy and reproducibility of measurements performed on digital virtual models with those taken on plaster casts from models poured immediately after the impression was taken, the 'gold standard', and from plaster models poured following a 3-5 day shipping procedure of the alginate impression. Direct comparison of two measuring techniques. The study was conducted at the Department of Orthodontics, School of Dentistry, University of Aarhus, Denmark in 2006/2007. Twelve randomly selected orthodontic graduate students with informed consent. Three sets of alginate impressions were taken from the participants within 1 hour. Plaster models were poured immediately from two of the sets, while the third set was kept in transit in the mail for 3-5 days. Upon return a plaster model was poured as well. Finally digital models were made from the plaster models. A number of measurements were performed on the plaster casts with a digital calliper and on the corresponding digital models using the virtual measuring tool of the accompanying software. Afterwards these measurements were compared statistically. No statistical differences were found between the three sets of plaster models. The intra- and inter-observer variability are smaller for the measurements performed on the digital models. Sending alginate impressions by mail does not affect the quality and accuracy of plaster casts poured from them afterwards. Virtual measurements performed on digital models display less variability than the corresponding measurements performed with a calliper on the actual models.

  11. Effect of Initial Conditions on Reproducibility of Scientific Research

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered as explanations for failure to reproduce scientific research findings, ranging from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependency on initial conditions by which small changes can result in large differences in research findings when reproduction is attempted at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model stability of the results in repeated experiments over time. We illustrate the approach by modeling effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and the rate of change in the baseline conditions, the latter being more important. When the rate of change in the baseline conditions between experiments is about 3.5 to 4, no research findings could be reproduced. However, when the rate of change between the experiments is ≤2.5, the results become highly predictable between the experiments. Conclusions: Many results cannot be reproduced because of the changes in the initial conditions between the experiments. Better control of the baseline conditions in-between the experiments may help improve reproducibility of scientific findings. PMID:25132705
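
    The stability claim above maps directly onto the behaviour of the logistic map x_{n+1} = r·x_n·(1 − x_n): for growth rates around 2.5 the iterates settle to a fixed point, while for rates between roughly 3.5 and 4 they become chaotic and tiny perturbations diverge. A minimal sketch (generic parameters, not the authors' covariate model):

    ```python
    def logistic_map(r, x0, n_steps=100):
        """Iterate x_{n+1} = r * x_n * (1 - x_n) and return the trajectory."""
        xs = [x0]
        for _ in range(n_steps):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    # Stable regime: trajectories from slightly different starting points converge (~0.6).
    print(logistic_map(2.5, 0.200)[-1], logistic_map(2.5, 0.201)[-1])
    # Chaotic regime: the same tiny difference in x0 leads to unrelated outcomes.
    print(logistic_map(3.9, 0.200)[-1], logistic_map(3.9, 0.201)[-1])
    ```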

  12. Accuracy, reproducibility, and time efficiency of dental measurements using different technologies.

    Science.gov (United States)

    Grünheid, Thorsten; Patel, Nishant; De Felippe, Nanci L; Wey, Andrew; Gaillard, Philippe R; Larson, Brent E

    2014-02-01

    Historically, orthodontists have taken dental measurements on plaster models. Technological advances now allow orthodontists to take these measurements on digital models. In this study, we aimed to assess the accuracy, reproducibility, and time efficiency of dental measurements taken on 3 types of digital models. emodels (GeoDigm, Falcon Heights, Minn), SureSmile models (OraMetrix, Richardson, Tex), and AnatoModels (Anatomage, San Jose, Calif) were made for 30 patients. Mesiodistal tooth-width measurements taken on these digital models were timed and compared with those on the corresponding plaster models, which were used as the gold standard. Accuracy and reproducibility were assessed using the Bland-Altman method. Differences in time efficiency were tested for statistical significance with 1-way analysis of variance. Measurements on SureSmile models were the most accurate, followed by those on emodels and AnatoModels. Measurements taken on SureSmile models were also the most reproducible. Measurements taken on SureSmile models and emodels were significantly faster than those taken on AnatoModels and plaster models. Tooth-width measurements on digital models can be as accurate as, and might be more reproducible and significantly faster than, those taken on plaster models. Of the models studied, the SureSmile models provided the best combination of accuracy, reproducibility, and time efficiency of measurement. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  13. NetBenchmark: a bioconductor package for reproducible benchmarks of gene regulatory network inference.

    Science.gov (United States)

    Bellot, Pau; Olsen, Catharina; Salembier, Philippe; Oliveras-Vergés, Albert; Meyer, Patrick E

    2015-09-29

    In the last decade, a great number of methods for reconstructing gene regulatory networks from expression data have been proposed. However, very few tools and datasets allow those methods to be evaluated accurately and reproducibly. Hence, we propose here a new tool, able to perform a systematic, yet fully reproducible, evaluation of transcriptional network inference methods. Our open-source and freely available Bioconductor package aggregates a large set of tools to assess the robustness of network inference algorithms against different simulators, topologies, sample sizes and noise intensities. The benchmarking framework that uses various datasets highlights the specialization of some methods toward network types and data. As a result, it is possible to identify the techniques that have broad overall performance.

  14. MICROLENSING OF QUASAR BROAD EMISSION LINES: CONSTRAINTS ON BROAD LINE REGION SIZE

    Energy Technology Data Exchange (ETDEWEB)

    Guerras, E.; Mediavilla, E. [Instituto de Astrofisica de Canarias, Via Lactea S/N, La Laguna E-38200, Tenerife (Spain); Jimenez-Vicente, J. [Departamento de Fisica Teorica y del Cosmos, Universidad de Granada, Campus de Fuentenueva, E-18071 Granada (Spain); Kochanek, C. S. [Department of Astronomy and the Center for Cosmology and Astroparticle Physics, The Ohio State University, 4055 McPherson Lab, 140 West 18th Avenue, Columbus, OH 43221 (United States); Munoz, J. A. [Departamento de Astronomia y Astrofisica, Universidad de Valencia, E-46100 Burjassot, Valencia (Spain); Falco, E. [Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Motta, V. [Departamento de Fisica y Astronomia, Universidad de Valparaiso, Avda. Gran Bretana 1111, Valparaiso (Chile)

    2013-02-20

    We measure the differential microlensing of the broad emission lines between 18 quasar image pairs in 16 gravitational lenses. We find that the broad emission lines are in general weakly microlensed. The results show, at a modest level of confidence (1.8σ), that high ionization lines such as C IV are more strongly microlensed than low ionization lines such as Hβ, indicating that the high ionization line emission regions are more compact. If we statistically model the distribution of microlensing magnifications, we obtain estimates for the broad line region size of r_s = 24 (+22/-15) lt-day and r_s = 55 (+150/-35) lt-day (90% confidence) for the high and low ionization lines, respectively. When the samples are divided into higher and lower luminosity quasars, we find that the line emission regions of more luminous quasars are larger, with a slope consistent with the expected scaling from photoionization models. Our estimates also agree well with the results from local reverberation mapping studies.

  15. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Science.gov (United States)

    Sanchez-Gomez, Emilia; Somot, S.; Déqué, M.

    2009-10-01

    One of the main concerns in regional climate modeling is to which extent limited-area regional climate models (RCM) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the time period 1961-2000. Two sets of experiments have been completed with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the weather regimes behavior in terms of composite pattern, mean frequency of occurrence and persistence reasonably well. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models which is stronger in summer than in winter. This spread is due to two reasons: (1) we are dealing with different models and (2) each RCM produces an internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At daily time scale, the model spread has also a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large-scale of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not affect significantly the model performance for large-scale circulation.

  16. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Energy Technology Data Exchange (ETDEWEB)

    Somot, S.; Deque, M. [Meteo-France CNRM/GMGEC CNRS/GAME, Toulouse (France); Sanchez-Gomez, Emilia

    2009-10-15

    One of the main concerns in regional climate modeling is to which extent limited-area regional climate models (RCM) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the time period 1961-2000. Two sets of experiments have been completed with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the weather regimes behavior in terms of composite pattern, mean frequency of occurrence and persistence reasonably well. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models which is stronger in summer than in winter. This spread is due to two reasons: (1) we are dealing with different models and (2) each RCM produces an internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At daily time scale, the model spread has also a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large-scale of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not affect significantly the model performance for large-scale circulation. (orig.)

  17. Spatiotemporal exploratory models for broad-scale survey data.

    Science.gov (United States)

    Fink, Daniel; Hochachka, Wesley M; Zuckerberg, Benjamin; Winkler, David W; Shaby, Ben; Munson, M Arthur; Hooker, Giles; Riedewald, Mirek; Sheldon, Daniel; Kelling, Steve

    2010-12-01

    The distributions of animal populations change and evolve through time. Migratory species exploit different habitats at different times of the year. Biotic and abiotic features that determine where a species lives vary due to natural and anthropogenic factors. This spatiotemporal variation needs to be accounted for in any modeling of species' distributions. In this paper we introduce a semiparametric model that provides a flexible framework for analyzing dynamic patterns of species occurrence and abundance from broad-scale survey data. The spatiotemporal exploratory model (STEM) adds essential spatiotemporal structure to existing techniques for developing species distribution models through a simple parametric structure without requiring a detailed understanding of the underlying dynamic processes. STEMs use a multi-scale strategy to differentiate between local and global-scale spatiotemporal structure. A user-specified species distribution model accounts for spatial and temporal patterning at the local level. These local patterns are then allowed to "scale up" via ensemble averaging to larger scales. This makes STEMs especially well suited for exploring distributional dynamics arising from a variety of processes. Using data from eBird, an online citizen science bird-monitoring project, we demonstrate that monthly changes in distribution of a migratory species, the Tree Swallow (Tachycineta bicolor), can be more accurately described with a STEM than a conventional bagged decision tree model in which spatiotemporal structure has not been imposed. We also demonstrate that there is no loss of model predictive power when a STEM is used to describe a spatiotemporal distribution with very little spatiotemporal variation; the distribution of a nonmigratory species, the Northern Cardinal (Cardinalis cardinalis).
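
    The "fit local models on overlapping regions, then average" idea can be sketched in a few lines. The code below is a heavily simplified stand-in for a STEM (random rectangular spatial blocks, a decision tree per block, ensemble averaging at prediction time); it ignores the temporal dimension, and every name and parameter in it is hypothetical.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    def fit_local_ensemble(X, y, coords, n_blocks=100, block_size=2.0, seed=0):
        """Fit one small model per randomly placed lon/lat block (rough STEM-like idea)."""
        rng = np.random.default_rng(seed)
        lo, hi = coords.min(axis=0), coords.max(axis=0)
        models = []
        for _ in range(n_blocks):
            corner = lo + rng.random(2) * np.maximum(hi - lo - block_size, 0.0)
            inside = np.all((coords >= corner) & (coords < corner + block_size), axis=1)
            if inside.sum() >= 20:  # need enough local data to fit anything
                tree = DecisionTreeRegressor(min_samples_leaf=10).fit(X[inside], y[inside])
                models.append((corner, tree))
        return models

    def predict_local_ensemble(models, X, coords, block_size=2.0):
        """Average the predictions of every local model whose block covers the point."""
        preds = np.full(len(X), np.nan)
        for i, (x, c) in enumerate(zip(X, coords)):
            votes = [tree.predict(x[None, :])[0] for corner, tree in models
                     if np.all((c >= corner) & (c < corner + block_size))]
            if votes:
                preds[i] = np.mean(votes)
        return preds
    ```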

  18. Thermal wind model for the broad emission line region of quasars

    International Nuclear Information System (INIS)

    Weymann, R.J.; Scott, J.S.; Schiano, A.V.R.; Christiansen, W.A.

    1982-01-01

    Arguments are summarized for supposing that the clouds giving rise to the broad emission lines of QSOs are confined by the pressure of an expanding thermal gas and that a flux of relativistic particles with luminosity comparable to the photon luminosity streams through this gas. The resulting heating and momentum deposition produce a transonic thermal wind whose dynamical properties are calculated in detail. This wind accelerates and confines the emission line clouds, thereby producing the broad emission line (BEL) profiles. In a companion paper, the properties of the wind at much larger distances (approx. kpc) than the BEL region are used to explain the production of the broad absorption lines (BAL) observed in some QSOs. The same set of wind parameters can account for the properties of both the BEL and BAL regions, and this unification in the physical description of the BEL and BAL regions is one of the most important advantages of this model. A characteristic size of approx. 1 pc for the QSO emission line region is one consequence of the model. This characteristic size is shown to depend upon luminosity in such a way that the ionization parameter is roughly constant over a wide range of luminosities. An X-ray luminosity due to thermal bremsstrahlung of approx. 1%-10% of the optical luminosity is another consequence of the model. The trajectories of clouds under the combined influence of ram pressure acceleration and radiative acceleration are calculated. From these trajectories, emission line profiles are also calculated, and the wind and cloud parameters yielding profiles in fair agreement with observed profiles are explored. Opacity in the wind due to electron scattering displaces the line cores of optically thin lines to the blue; this is roughly compensated by the redward skewing of optically thick lines due to preferential emission of photons from the back side of the clouds. The conditions required for the relativistic particles to avoid rapid depletion due to Compton losses are also discussed

  19. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination or reproducibility of results is very poor. On the other hand, a defect can be detected on each subsequent examination, giving high reliability, while the results still have poor reproducibility

  20. Multichannel calculation of the very narrow Ds0 *(2317) and the very broad D0 *(2300-2400)

    Science.gov (United States)

    Rupp, G.; van Beveren, E.

    2007-03-01

    The narrow Ds0*(2317) and broad D0*(2300-2400) charmed scalar mesons and their radial excitations are described in a coupled-channel quark model that also reproduces the properties of the light scalar nonet. All two-meson channels containing ground-state pseudoscalars and vectors are included. The parameters are kept fixed at published values, except for the overall coupling constant λ, which is fine-tuned to reproduce the Ds0*(2317) mass, and a damping constant α for subthreshold contributions. Variations of λ and of the D0*(2300-2400) pole positions are studied for different α values. Calculated cross sections for S-wave DK and Dπ scattering, as well as resonance pole positions, are given for the value of α that fits the light scalars. The radially excited state Ds0*′(2850) predicted in this way, with a width of about 50 MeV, seems to have been observed already.

  1. [NDVI difference rate recognition model of deciduous broad-leaved forest based on HJ-CCD remote sensing data].

    Science.gov (United States)

    Wang, Yan; Tian, Qing-Jiu; Huang, Yan; Wei, Hong-Wei

    2013-04-01

    The present paper takes Chuzhou in Anhui Province as the research area and deciduous broad-leaved forest as the research object. A recognition model for deciduous broad-leaved forest was constructed using the NDVI difference rate between the leaf-expansion stage and the flowering and fruit-bearing stage, and the model was applied to HJ-CCD remote sensing images from April 1, 2012 and May 4, 2012. The spatial distribution map of deciduous broad-leaved forest was then extracted effectively, and the extraction results were verified and evaluated. The results show the validity of the NDVI difference-rate extraction method proposed in this paper and also verify the applicability of HJ-CCD data for vegetation classification and recognition.
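
    As a rough illustration of the method described above, the sketch below computes NDVI for two dates, forms a difference rate, and thresholds it to flag deciduous broad-leaved forest. The threshold value and the synthetic reflectance bands are purely illustrative and are not taken from the paper.

```python
# Sketch: NDVI difference-rate classification of deciduous broad-leaved forest
# from two acquisition dates (leaf expansion vs. flowering/fruit-bearing).
import numpy as np

def ndvi(nir, red, eps=1e-6):
    return (nir - red) / (nir + red + eps)

def ndvi_difference_rate(ndvi_t1, ndvi_t2, eps=1e-6):
    """Relative change in NDVI between the two phenological stages."""
    return (ndvi_t2 - ndvi_t1) / (np.abs(ndvi_t1) + eps)

# Synthetic reflectance bands standing in for two HJ-CCD scenes
rng = np.random.default_rng(0)
red_apr = rng.uniform(0.05, 0.20, (100, 100))
nir_apr = rng.uniform(0.20, 0.40, (100, 100))
red_may = rng.uniform(0.03, 0.15, (100, 100))
nir_may = rng.uniform(0.30, 0.60, (100, 100))

rate = ndvi_difference_rate(ndvi(nir_apr, red_apr), ndvi(nir_may, red_may))
forest_mask = rate > 0.25   # hypothetical threshold, not the paper's value
print("classified pixels:", int(forest_mask.sum()))
```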

  2. Predicting Graduation Rates at 4-Year Broad Access Institutions Using a Bayesian Modeling Approach

    Science.gov (United States)

    Crisp, Gloria; Doran, Erin; Salis Reyes, Nicole A.

    2018-01-01

    This study models graduation rates at 4-year broad access institutions (BAIs). We examine the student body, structural-demographic, and financial characteristics that best predict 6-year graduation rates across two time periods (2008-2009 and 2014-2015). A Bayesian model averaging approach is utilized to account for uncertainty in variable…
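
    The Bayesian model averaging step can be sketched with a BIC-weighted average over all subsets of candidate predictors, which is one standard approximation. The predictor names and data below are hypothetical, and the sketch is only a minimal stand-in for the full analysis.

```python
# Minimal Bayesian-model-averaging sketch: BIC weights over all predictor
# subsets for a hypothetical graduation-rate regression.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 300
predictors = ["pct_pell", "expend_per_fte", "pct_part_time", "avg_faculty_salary"]
X = rng.normal(size=(n, len(predictors)))
y = 0.4 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(scale=0.5, size=n)

def bic_of_subset(cols):
    """Fit OLS on a predictor subset and return (BIC, coefficients)."""
    Xs = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    k = Xs.shape[1]
    return n * np.log(rss / n) + k * np.log(n), beta

results = []
for r in range(len(predictors) + 1):
    for cols in itertools.combinations(range(len(predictors)), r):
        results.append((cols, *bic_of_subset(cols)))

bics = np.array([b for _, b, _ in results])
weights = np.exp(-0.5 * (bics - bics.min()))
weights /= weights.sum()

# Posterior inclusion probability of each candidate predictor
for j, name in enumerate(predictors):
    pip = sum(w for (cols, _, _), w in zip(results, weights) if j in cols)
    print(f"P(include {name}) = {pip:.2f}")
```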

  3. Reproducibility in cyclostratigraphy: initiating an intercomparison project

    Science.gov (United States)

    Sinnesael, Matthias; De Vleeschouwer, David; Zeeden, Christian; Claeys, Philippe

    2017-04-01

    The study of astronomical climate forcing and the application of cyclostratigraphy have experienced spectacular growth over the last decades. In the field of cyclostratigraphy a broad range of methodological approaches exists, but comparative study between the different approaches is lacking. Different cases demand different approaches, but with the growing importance of the field, questions arise about reproducibility, uncertainties and standardization of results. The radioisotopic dating community, in particular, has made far-reaching efforts to improve the reproducibility and intercomparison of radioisotopic dates and their errors. To satisfy this need in cyclostratigraphy, we initiate a comparable framework for the community. The aims are to investigate and quantify the reproducibility of, and uncertainties related to, cyclostratigraphic studies and to provide a platform to discuss the merits and pitfalls of different methodologies and their applicability. With this poster, we ask for feedback from the community on how to design this comparative framework in a useful, meaningful and productive manner. In parallel, we would like to discuss how reproducibility should be tested and what uncertainties should stand for in cyclostratigraphy. On the other hand, we intend to trigger interest in a cyclostratigraphic intercomparison project. This intercomparison project would involve the analysis of artificial and genuine geological records by individual researchers. All participants would be free to determine their method of choice, but a handful of criteria would be required for an outcome to be comparable. The different results would be compared (e.g. during a workshop or a special session), and the lessons learned from the comparison could potentially be reported in a review paper. The aim of an intercomparison project is not to rank the different methods according to their merits, but to get insight into which specific methods are most suitable for which

  4. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least-squares fit between the measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Also, owing to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting the historical data and fitting the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
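
    The candidate-competition idea can be illustrated with a toy example in which an iterative fit of a simple one-tissue compartment model competes against the closest curve from a small historical database, and the candidate with the lower residual is kept. The compartment model, plasma input, database entries, and noise level below are all assumptions for illustration, not the authors' method.

```python
# Toy "candidate competition": iterative fit (IF) vs. nearest database curve.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 60, 61)                      # minutes
cp = 10 * t * np.exp(-t / 3)                    # toy plasma input function
dt = t[1] - t[0]

def one_tissue(t, K1, k2):
    """C_t(t) = K1 * exp(-k2 t) convolved with the plasma input C_p(t)."""
    return K1 * np.convolve(np.exp(-k2 * t), cp)[: len(t)] * dt

rng = np.random.default_rng(0)
tac = one_tissue(t, 0.12, 0.08) + rng.normal(scale=2.0, size=t.size)

# Candidate 1: iterative fitting
(K1_if, k2_if), _ = curve_fit(one_tissue, t, tac, p0=(0.1, 0.1),
                              bounds=(0, [1.0, 1.0]))
fit_if = one_tissue(t, K1_if, k2_if)

# Candidate 2: nearest curve from a hypothetical historical reference database
db_params = [(0.05, 0.05), (0.10, 0.08), (0.15, 0.10), (0.20, 0.15)]
db_curves = [one_tissue(t, *p) for p in db_params]
fit_db = min(db_curves, key=lambda c: np.sum((tac - c) ** 2))

# Competition: keep the candidate with the smaller residual sum of squares
rss_if = np.sum((tac - fit_if) ** 2)
rss_db = np.sum((tac - fit_db) ** 2)
winner = "iterative fit" if rss_if < rss_db else "database candidate"
print(f"IF RSS={rss_if:.1f}, DB RSS={rss_db:.1f} -> {winner}")
```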

  5. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least-squares fit between the measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Also, owing to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting the historical data and fitting the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  6. GeoTrust Hub: A Platform For Sharing And Reproducing Geoscience Applications

    Science.gov (United States)

    Malik, T.; Tarboton, D. G.; Goodall, J. L.; Choi, E.; Bhatt, A.; Peckham, S. D.; Foster, I.; Ton That, D. H.; Essawy, B.; Yuan, Z.; Dash, P. K.; Fils, G.; Gan, T.; Fadugba, O. I.; Saxena, A.; Valentic, T. A.

    2017-12-01

    Recent requirements in scholarly communication emphasize the reproducibility of scientific claims. Text-based research papers are considered poor media for establishing reproducibility. Papers must be accompanied by "research objects", aggregations of digital artifacts that together with the paper provide an authoritative record of a piece of research. We will present GeoTrust Hub (http://geotrusthub.org), a platform for creating, sharing, and reproducing reusable research objects. GeoTrust Hub provides tools for scientists to create `geounits'--reusable research objects. Geounits are self-contained, annotated, and versioned containers that describe and package computational experiments in an efficient and light-weight manner. Geounits can be shared on public repositories such as HydroShare and FigShare and, using the repositories' APIs, reproduced on provisioned clouds. The latter feature gives science applications a lifetime beyond sharing, wherein they can be independently verified and trust can be established as they are repeatedly reused. Through research use cases from several geoscience laboratories across the United States, we will demonstrate how the tools provided by GeoTrust Hub, along with HydroShare as its public repository for geounits, are advancing the state of reproducible research in the geosciences. For each use case, we will address different computational reproducibility requirements. Our first use case will be an example of setup reproducibility, which enables a scientist to set up and reproduce an output from a model with complex configuration and development environments. Our second use case will be an example of algorithm/data reproducibility, wherein a shared data science model/dataset can be substituted with an alternative one to verify model output results, and our final use case will be an example of interactive reproducibility, in which an experiment depends on specific versions of data to produce the result. Toward this we will use software and data

  7. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    Directory of Open Access Journals (Sweden)

    Marit Kramski

    Full Text Available The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bio-terroristic potential. Moreover, zoonotic infections with Cowpox (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after onset of symptoms, even when very low infectious viral doses of 5x10^2 pfu were applied intranasally. Infectious virus was demonstrated in blood, saliva and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation, mimicking the natural route of smallpox infection, led to reproducible infection. In vivo titration resulted in an MID50 (minimal monkey infectious dose 50%) of 8.3x10^2 pfu of calpox virus, which is approximately 10,000-fold lower than the MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs. Furthermore, this model can help study mechanisms of OPV pathogenesis.

  8. Reproducibility of somatosensory spatial perceptual maps.

    Science.gov (United States)

    Steenbergen, Peter; Buitenweg, Jan R; Trojan, Jörg; Veltink, Peter H

    2013-02-01

    Various studies have shown subjects to mislocalize cutaneous stimuli in an idiosyncratic manner. Spatial properties of individual localization behavior can be represented in the form of perceptual maps. Individual differences in these maps may reflect properties of internal body representations, and perceptual maps may therefore be a useful method for studying these representations. For this to be the case, individual perceptual maps need to be reproducible, which has not yet been demonstrated. We assessed the reproducibility of localizations measured twice on subsequent days. Ten subjects participated in the experiments. Non-painful electrocutaneous stimuli were applied at seven sites on the lower arm. Subjects localized the stimuli on a photograph of their own arm, which was presented on a tablet screen overlaying the real arm. Reproducibility was assessed by calculating intraclass correlation coefficients (ICC) for the mean localizations of each electrode site and the slope and offset of regression models of the localizations, which represent scaling and displacement of perceptual maps relative to the stimulated sites. The ICCs of the mean localizations ranged from 0.68 to 0.93; the ICCs of the regression parameters were 0.88 for the intercept and 0.92 for the slope. These results indicate a high degree of reproducibility. We conclude that localization patterns of non-painful electrocutaneous stimuli on the arm are reproducible on subsequent days. Reproducibility is a necessary property of perceptual maps for these to reflect properties of a subject's internal body representations. Perceptual maps are therefore a promising method for studying body representations.
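
    For reference, a two-way random-effects, absolute-agreement ICC of the kind typically used for such test-retest designs can be computed as in the minimal sketch below; the synthetic data and the specific ICC form (ICC(2,1)) are assumptions, not the study's code.

```python
# Sketch: ICC(2,1) (two-way random effects, absolute agreement) for
# day-to-day test-retest data.
import numpy as np

def icc_2_1(X):
    """X has shape (n_subjects, k_sessions)."""
    n, k = X.shape
    grand = X.mean()
    ms_rows = k * np.sum((X.mean(axis=1) - grand) ** 2) / (n - 1)   # subjects
    ms_cols = n * np.sum((X.mean(axis=0) - grand) ** 2) / (k - 1)   # sessions
    sse = np.sum((X - X.mean(axis=1, keepdims=True)
                    - X.mean(axis=0, keepdims=True) + grand) ** 2)
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

rng = np.random.default_rng(0)
subject_effect = rng.normal(scale=1.0, size=(10, 1))   # stable individual bias
day1 = subject_effect + rng.normal(scale=0.3, size=(10, 1))
day2 = subject_effect + rng.normal(scale=0.3, size=(10, 1))
print(f"ICC(2,1) = {icc_2_1(np.hstack([day1, day2])):.2f}")
```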

  9. DETECTION OF EXTREMELY BROAD WATER EMISSION FROM THE MOLECULAR CLOUD INTERACTING SUPERNOVA REMNANT G349.7+0.2

    Energy Technology Data Exchange (ETDEWEB)

    Rho, J. [SETI Institute, 189 N. Bernardo Avenue, Mountain View, CA 94043 (United States); Hewitt, J. W. [CRESST/University of Maryland, Baltimore County, Baltimore, MD 21250 (United States); Boogert, A. [SOFIA Science Center, NASA Ames Research Center, MS 232-11, Moffett Field, CA 94035 (United States); Kaufman, M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192-0106 (United States); Gusdorf, A., E-mail: jrho@seti.org, E-mail: john.w.hewitt@nasa.gov, E-mail: aboogert@sofia.usra.edu, E-mail: michael.kaufman@sjsu.edu, E-mail: antoine.gusdorf@lra.ens.fr [LERMA, UMR 8112 du CNRS, Observatoire de Paris, École Normale Suprieure, 24 rue Lhomond, F-75231 Paris Cedex 05 (France)

    2015-10-10

    We performed Herschel HIFI, PACS, and SPIRE observations toward the molecular cloud interacting supernova remnant G349.7+0.2. An extremely broad emission line was detected at 557 GHz from the ground state transition 1_10-1_01 of ortho-water. This water line can be separated into three velocity components with widths of 144, 27, and 4 km s^-1. The 144 km s^-1 component is the broadest water line detected to date in the literature. This extremely broad line width shows the importance of probing shock dynamics. PACS observations revealed three additional ortho-water lines, as well as numerous high-J carbon monoxide (CO) lines. No para-water lines were detected. The extremely broad water line is indicative of a high velocity shock, which is supported by the observed CO rotational diagram that was reproduced with a J-shock model with a density of 10^4 cm^-3 and a shock velocity of 80 km s^-1. Two far-infrared fine-structure lines, [O I] at 145 μm and [C II] at 157 μm, are also consistent with the high velocity J-shock model. The extremely broad water line could simply arise from short-lived molecules that have not yet been destroyed in high velocity J-shocks; however, it may instead reflect more complicated geometry, such as high-velocity water bullets or a shell expanding at high velocity. We estimate the CO and H2O densities, column densities, and temperatures by comparison with RADEX and detailed shock models.

  10. Modelling soil erosion at European scale: towards harmonization and reproducibility

    Science.gov (United States)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to a decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical, and research is therefore needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite these efforts, the predictive value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is proposed here. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture is based on the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented, merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
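
    The structure of the computation, an extended RUSLE with an ensemble of rainfall-erosivity estimates, can be sketched as below. The two erosivity relations and the stoniness correction are illustrative placeholders, not the equations used in the paper.

```python
# Sketch of a RUSLE-type computation, A = R * K * LS * C * P, with R taken
# as the ensemble mean of several (here hypothetical) erosivity relations.
import numpy as np

def erosivity_a(annual_precip_mm):
    return 0.5 * annual_precip_mm            # hypothetical empirical relation 1

def erosivity_b(annual_precip_mm):
    return 0.04 * annual_precip_mm ** 1.3    # hypothetical empirical relation 2

def rusle_soil_loss(P_mm, K, LS, C, P_factor, stone_cover_frac):
    """Soil loss with ensemble-mean erosivity and a crude stoniness factor."""
    R = np.mean([erosivity_a(P_mm), erosivity_b(P_mm)], axis=0)
    stoniness = 1.0 - stone_cover_frac       # placeholder stoniness correction
    return R * K * LS * C * P_factor * stoniness

# One grid cell: 800 mm/yr rainfall, erodible silty soil, moderate slope, cropland
A = rusle_soil_loss(P_mm=800, K=0.035, LS=1.8, C=0.2, P_factor=1.0,
                    stone_cover_frac=0.1)
print(f"estimated soil loss: {A:.1f} t ha^-1 yr^-1")
```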

  11. Reproducing tailing in breakthrough curves: Are statistical models equally representative and predictive?

    Science.gov (United States)

    Pedretti, Daniele; Bianchi, Marco

    2018-03-01

    Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models both for the empirical fitting of these curves and for the prediction of transport using upscaling models based on best-fitted estimated parameters (e.g. the power law slope or exponent). The predictive capacity of power-law-based upscaling models can, however, be questioned because of the difficulty of linking model parameters to the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implication of statistical subsampling, which often renders power laws indistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulty of reconciling fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (αPL > 1.5), with the fitted exponent varying with the degree of tailing. Strong fluctuations occur when the number of samples is limited, due to the effects of subsampling. On the other hand, when the power law model embeds a cutoff (PLCO), the best-fitted exponent (αCO) is insensitive to the degree of tailing and to the effects of subsampling and tends to a constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated with the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple
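
    The difference between the two parameterizations can be made concrete by fitting both to the late-time tail of a synthetic breakthrough curve, as in the sketch below; the generated data, starting values, and bounds are illustrative only.

```python
# Sketch: fit a late-time BTC tail with a pure power law (PL) and a power law
# with exponential cutoff (PLCO), the two competing models discussed above.
import numpy as np
from scipy.optimize import curve_fit

def pl(t, a, alpha):
    return a * t ** (-alpha)

def plco(t, a, alpha, lam):
    return a * t ** (-alpha) * np.exp(-lam * t)

rng = np.random.default_rng(0)
t = np.logspace(0, 3, 80)                       # late-time window
c_obs = plco(t, 1.0, 1.0, 2e-3) * rng.lognormal(sigma=0.1, size=t.size)

p_pl, _ = curve_fit(pl, t, c_obs, p0=(1.0, 1.5))
p_plco, _ = curve_fit(plco, t, c_obs, p0=(1.0, 1.5, 1e-3),
                      bounds=(0, [10.0, 5.0, 1.0]))

print(f"PL:   alpha = {p_pl[1]:.2f}")
print(f"PLCO: alpha = {p_plco[1]:.2f}, cutoff rate lambda = {p_plco[2]:.2e}")
```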

  12. A reproducible brain tumour model established from human glioblastoma biopsies

    International Nuclear Information System (INIS)

    Wang, Jian; Chekenya, Martha; Bjerkvig, Rolf; Enger, Per Ø; Miletic, Hrvoje; Sakariassen, Per Ø; Huszthy, Peter C; Jacobsen, Hege; Brekkå, Narve; Li, Xingang; Zhao, Peng; Mørk, Sverre

    2009-01-01

    Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. In this work, we collected data on growth kinetics from a material of 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression

  13. A reproducible brain tumour model established from human glioblastoma biopsies

    Directory of Open Access Journals (Sweden)

    Li Xingang

    2009-12-01

    Full Text Available Abstract Background: Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. Methods: In this work, we collected data on growth kinetics from a material of 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. Results: The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. Conclusions: In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  14. Broad-band near-field ground motion simulations in 3-dimensional scattering media

    KAUST Repository

    Imperatori, W.

    2012-12-06

    The heterogeneous nature of Earth's crust is manifested in the scattering of propagating seismic waves. In recent years, different techniques have been developed to include this phenomenon in broad-band ground-motion calculations, considering scattering either as a semi-stochastic or as a purely stochastic process. In this study, we simulate broad-band (0–10 Hz) ground motions with a 3-D finite-difference wave propagation solver using several 3-D media characterized by von Karman correlation functions with different correlation lengths and standard deviation values. Our goal is to investigate scattering characteristics and their influence on the seismic wavefield at short and intermediate distances from the source in terms of ground motion parameters. We also examine scattering phenomena related to the loss of radiation pattern and the breakdown of directivity. We first simulate broad-band ground motions for a point source characterized by a classic ω² spectrum model. Fault finiteness is then introduced by means of a Haskell-type source model with both subshear and supershear rupture speeds. Results indicate that scattering plays an important role in ground motion even at short distances from the source, where source effects are thought to be dominant. In particular, peak ground motion parameters can be affected even at relatively low frequencies, implying that earthquake ground-motion simulations should include scattering also for peak ground velocity (PGV) calculations. At the same time, we find a gradual loss of the source signature in the 2–5 Hz frequency range, together with a distortion of the Mach cones in the case of supershear rupture. For more complex source models and a truly heterogeneous Earth, these effects may occur even at lower frequencies. Our simulations suggest that von Karman correlation functions with correlation length between several hundred metres and a few kilometres, Hurst exponent around 0.3 and standard deviation in the 5–10 per cent

  15. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility; specifically, whether bitwise reproducible computation is possible, whether computational research in DOE can improve its publication process, and whether reproducible results can be achieved apart from the peer review process.

  16. Broad-Band Variability in Accreting Compact Objects

    Directory of Open Access Journals (Sweden)

    S. Scaringi

    2015-02-01

    Full Text Available Cataclysmic variable stars are in many ways similar to X-ray binaries. Both types of systems possess an accretion disk, which in most cases can reach the surface (or event horizon) of the central compact object. The main difference is that the gravitational potential well in X-ray binaries is much deeper than that in cataclysmic variables. As a result, X-ray binaries emit most of their radiation at X-ray wavelengths, whereas cataclysmic variables emit mostly at optical/ultraviolet wavelengths. Both types of systems display aperiodic broad-band variability which can be associated with the accretion disk. Here, the properties of the observed X-ray variability in XRBs are compared to those observed at optical wavelengths in CVs. In most cases the variability properties of both types of systems are qualitatively similar once the relevant timescales associated with the inner accretion disk regions have been taken into account. The similarities include the observed power spectral density shapes, the rms-flux relation as well as Fourier-dependent time lags. Here a brief overview of these similarities is given, placing them in the context of the fluctuating accretion disk model which seeks to reproduce the observed variability.
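
    The rms-flux relation mentioned above is usually measured by cutting a light curve into short segments and relating the rms to the mean flux in each segment. The sketch below does this for a toy multiplicative light curve; the light-curve construction and segment length are assumptions, chosen only to produce the expected roughly linear relation.

```python
# Sketch: measure the rms-flux relation of a toy multiplicative light curve.
import numpy as np

rng = np.random.default_rng(0)
n, seg_len = 50_000, 500

# Toy multiplicative light curve: product of smoothed random modulations on
# several timescales, a crude stand-in for the fluctuating-accretion picture.
flux = np.ones(n)
for tau in (51, 501, 2001):
    kernel = np.ones(tau) / tau
    flux *= 1.0 + 0.3 * np.convolve(rng.normal(size=n), kernel, mode="same")
flux = np.abs(flux)

segments = flux[: n - n % seg_len].reshape(-1, seg_len)
mean_flux = segments.mean(axis=1)
rms = segments.std(axis=1)

# Bin by flux and fit a straight line rms = k * flux + c
order = np.argsort(mean_flux)
bins = np.array_split(order, 10)
f_bin = np.array([mean_flux[b].mean() for b in bins])
r_bin = np.array([rms[b].mean() for b in bins])
k, c = np.polyfit(f_bin, r_bin, 1)
print(f"rms-flux slope = {k:.3f}, intercept = {c:.4f}")
```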

  17. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    Science.gov (United States)

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever-increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute the various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
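
    A minimal COMBINE Archive can be assembled by hand as a ZIP file containing a manifest plus the model and simulation description files, as sketched below. The file names and placeholder contents are hypothetical, and in practice a dedicated library (e.g. libCombine) and format URIs checked against the specification would be preferable to writing the manifest manually.

```python
# Sketch: assemble a minimal OMEX/COMBINE Archive with the standard library.
import zipfile

manifest = """<?xml version="1.0" encoding="UTF-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="." format="http://identifiers.org/combine.specifications/omex"/>
  <content location="./manifest.xml"
           format="http://identifiers.org/combine.specifications/omex-manifest"/>
  <content location="./model.xml"
           format="http://identifiers.org/combine.specifications/sbml"/>
  <content location="./simulation.sedml"
           format="http://identifiers.org/combine.specifications/sed-ml"/>
</omexManifest>
"""

with zipfile.ZipFile("experiment.omex", "w", zipfile.ZIP_DEFLATED) as omex:
    omex.writestr("manifest.xml", manifest)
    omex.writestr("model.xml", "<sbml><!-- model would go here --></sbml>")
    omex.writestr("simulation.sedml", "<sedML><!-- simulation setup --></sedML>")

with zipfile.ZipFile("experiment.omex") as omex:
    print(omex.namelist())
```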

  18. Improving the Pattern Reproducibility of Multiple-Point-Based Prior Models Using Frequency Matching

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2014-01-01

    Some multiple-point-based sampling algorithms, such as the snesim algorithm, rely on sequential simulation. The conditional probability distributions that are used for the simulation are based on statistics of multiple-point data events obtained from a training image. During the simulation, data events with zero probability in the training image statistics may occur. This is handled by pruning the set of conditioning data until an event with non-zero probability is found. The resulting probability distribution sampled by such algorithms is a pruned mixture model. The pruning strategy leads to a probability distribution that lacks some of the information provided by the multiple-point statistics from the training image, which reduces the reproducibility of the training image patterns in the outcome realizations. When pruned mixture models are used as prior models for inverse problems, local re...

  19. Reproducing the nonlinear dynamic behavior of a structured beam with a generalized continuum model

    Science.gov (United States)

    Vila, J.; Fernández-Sáez, J.; Zaera, R.

    2018-04-01

    In this paper we study the coupled axial-transverse nonlinear vibrations of a kind of one-dimensional structured solid by application of the so-called Inertia Gradient Nonlinear continuum model. To show the accuracy of this axiomatic model, previously proposed by the authors, its predictions are compared with numerical results from a previously defined finite discrete chain of lumped masses and springs, for several numbers of particles. A continualization of the discrete model equations based on Taylor series allowed us to set equivalent values of the mechanical properties in both the discrete and the axiomatic continuum models. Contrary to the classical continuum model, the inertia gradient nonlinear continuum model used herein is able to capture scale effects, which arise for modes in which the wavelength is comparable to the characteristic distance of the structured solid. The main conclusion of the work is that the proposed generalized continuum model captures the scale effects in both linear and nonlinear regimes, reproducing the behavior of the 1D nonlinear discrete model adequately.

  20. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even when they are, there remains a multitude of generally unreported choices made by the individual scientist that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. 2016 [1], we argue that a cultural change is required in the computational hydrological community in order to advance, and make more robust, the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  1. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10-minute burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For the 5-second burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves the consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  2. Reproducibility of Carbon and Water Cycle by an Ecosystem Process Based Model Using a Weather Generator and Effect of Temporal Concentration of Precipitation on Model Outputs

    Science.gov (United States)

    Miyauchi, T.; Machimura, T.

    2014-12-01

    GCM output is generally used to produce input weather data for the simulation of carbon and water cycles by ecosystem process based models under climate change; however, its temporal resolution is sometimes incompatible with model requirements. A weather generator (WG) is used for temporal downscaling of input weather data for such models, and the effect of WG algorithms on the reproducibility of ecosystem model outputs must therefore be assessed. In this study, carbon and water cycles simulated by the Biome-BGC model using measured weather data and weather data generated by the CLIMGEN weather generator were compared. The measured weather data (daily precipitation, maximum and minimum air temperature) at a few sites for 30 years were collected from NNDC Online weather data. The generated weather data were produced by CLIMGEN parameterized using the measured weather data. NPP, heterotrophic respiration (HR), NEE and water outflow were simulated by Biome-BGC using measured and generated weather data. In the case of a deciduous broad-leaved forest in Lushi, Henan Province, China, the 30-year average monthly NPP by WG was 10% larger than that by measured weather in the growing season. HR by WG was larger than that by measured weather in all months, by 15% on average. NEE by WG was more negative in winter and was close to that by measured weather in summer. These differences in the carbon cycle arose because the soil water content by WG was larger than that by measured weather. The difference between monthly water outflow by WG and by measured weather was large and variable, and annual outflow by WG was 50% of that by measured weather. The inconsistency between the carbon and water cycles driven by WG and by measured weather was suggested to be affected by the difference in temporal concentration of precipitation, which was assessed.
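
    One simple way to quantify the temporal concentration of precipitation compared above is the fraction of annual rainfall falling on the wettest 10% of wet days; the sketch below applies such an index to a measured-like and a generator-like daily series. The index choice and the synthetic series are assumptions, as the abstract does not specify which measure was used.

```python
# Sketch: a simple precipitation-concentration index applied to two series.
import numpy as np

def concentration_index(daily_precip_mm, wet_threshold=0.1, top_frac=0.10):
    """Fraction of total wet-day precipitation falling on the wettest days."""
    wet = daily_precip_mm[daily_precip_mm > wet_threshold]
    if wet.size == 0:
        return np.nan
    n_top = max(1, int(np.ceil(top_frac * wet.size)))
    return np.sort(wet)[-n_top:].sum() / wet.sum()

rng = np.random.default_rng(0)
# Measured-like series: many dry days, occasional heavy events
measured = np.where(rng.random(365) < 0.3, rng.gamma(0.5, 12.0, 365), 0.0)
# Generator-like series: same wet-day frequency but less extreme intensities
generated = np.where(rng.random(365) < 0.3, rng.gamma(2.0, 3.0, 365), 0.0)

print(f"measured  CI = {concentration_index(measured):.2f}")
print(f"generated CI = {concentration_index(generated):.2f}")
```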

  3. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.

  4. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes, as well as overall preference, was based on consistency tests of binary paired-comparison judgments and on modeling the choice frequencies using probabilistic choice models. As a result, the preferences of non-expert listeners could be measured reliably at a ratio scale level. Principal components derived

  5. Demography-based adaptive network model reproduces the spatial organization of human linguistic groups

    Science.gov (United States)

    Capitán, José A.; Manrubia, Susanna

    2015-12-01

    The distribution of human linguistic groups presents a number of interesting and nontrivial patterns. The distributions of the number of speakers per language and the area each group covers follow log-normal distributions, while population and area fulfill an allometric relationship. The topology of networks of spatial contacts between different linguistic groups has recently been characterized, showing atypical properties of the degree distribution and clustering, among others. Human demography, spatial conflicts, and the construction of networks of contacts between linguistic groups are mutually dependent processes. Here we introduce an adaptive network model that takes all of them into account and successfully reproduces, using only four model parameters, not only those features of linguistic groups already described in the literature, but also correlations between demographic and topological properties uncovered in this work. Besides its relevance for modeling and understanding processes related to human biogeography, our adaptive network model admits a number of generalizations that broaden its scope and make it suitable to represent interactions between agents based on population dynamics and competition for space.

  6. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models

  7. Amplifying modeling for broad bandwidth pulse in Nd:glass based on hybrid-broaden mechanism

    International Nuclear Information System (INIS)

    Sujingqin; Lanqin, L; Wenyi, W; Feng, J; Xiaofeng, W; Xiaomin, Z; Bin, L

    2008-01-01

    In this paper, the cross-relaxation time is proposed to combine the homogeneous and inhomogeneous broadening mechanisms in a model of broad-bandwidth pulse amplification. The corresponding rate equation, which describes the response of the population inversion on the upper and lower energy levels of the gain medium to the different frequency components of the pulse, is also put forward. Gain saturation and energy relaxation effects are included in the rate equation. A code named CPAP has been developed to simulate the amplification of broad-bandwidth pulses in a multi-pass laser system. The amplifying capability of the multi-pass laser system is evaluated, and gain narrowing and temporal shape distortion are investigated for different pulse bandwidths and cross-relaxation times of the gain medium. The results can benefit the design of the high-energy PW laser system at LFRC, CAEP.

  8. Amplifying modeling for broad bandwidth pulse in Nd:glass based on hybrid-broaden mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Sujingqin; Lanqin, L; Wenyi, W; Feng, J; Xiaofeng, W; Xiaomin, Z [Research Center of Laser Fusion, China Academy of Engineering Physics, P. O. Box 919-988, Mianyang, China, 621900 (China); Bin, L [School of Computer and Communication Engineering, Southwest Jiaotong University, Chengdu. China, 610031 (China)], E-mail: sujingqin@tom.com

    2008-05-15

    In this paper, the cross-relaxation time is proposed to combine the homogeneous and inhomogeneous broadening mechanisms in a model of broad-bandwidth pulse amplification. The corresponding rate equation, which describes the response of the population inversion on the upper and lower energy levels of the gain medium to the different frequency components of the pulse, is also put forward. Gain saturation and energy relaxation effects are included in the rate equation. A code named CPAP has been developed to simulate the amplification of broad-bandwidth pulses in a multi-pass laser system. The amplifying capability of the multi-pass laser system is evaluated, and gain narrowing and temporal shape distortion are investigated for different pulse bandwidths and cross-relaxation times of the gain medium. The results can benefit the design of the high-energy PW laser system at LFRC, CAEP.

  9. Reproducibility analysis of measurements with a mechanical semiautomatic eye model for evaluation of intraocular lenses

    Science.gov (United States)

    Rank, Elisabet; Traxler, Lukas; Bayer, Natascha; Reutterer, Bernd; Lux, Kirsten; Drauschke, Andreas

    2014-03-01

    Mechanical eye models are used to validate ex vivo the optical quality of intraocular lenses (IOLs). The quality measurements and test instructions for IOLs are defined in ISO 11979-2. However, it has been noted in the literature that these test instructions can lead to inaccurate measurements for some modern IOL designs. Reproducibility of the alignment and measurement processes is presented, performed with a semiautomatic mechanical ex vivo eye model based on the optical properties published by Liou and Brennan at a scale of 1:1. The cornea, the iris aperture and the IOL itself are separately changeable within the eye model. The adjustment of the IOL can be manipulated by automatic decentration and tilt of the IOL with reference to the optical axis of the whole system, which is defined by the line connecting the central point of the artificial cornea and the iris aperture. With the presented measurement setup two quality criteria are measurable: the modulation transfer function (MTF) and the Strehl ratio. First, the reproducibility of the alignment process used to define the initial lateral position and tilt with reference to the optical axis of the system is investigated. Furthermore, different IOL holders are tested with regard to stable holding of the IOL. The measurement is performed by a before-after comparison of the lens position using a typical decentration and tilt tolerance analysis path. The modulation transfer function MTF and Strehl ratio S before and after this tolerance analysis are compared, and requirements for lens holder construction are deduced from the presented results.

  10. Reproducibility of temporomandibular joint tomography. Influence of shifted X-ray beam and tomographic focal plane on reproducibility

    International Nuclear Information System (INIS)

    Saito, Masashi

    1999-01-01

    Proper tomographic focal plane and x-ray beam direction are the most important factors in obtaining accurate images of the temporomandibular joint (TMJ). In this study, to clarify the magnitude of the effect of these two factors on image quality, we evaluated the reproducibility of tomograms by measuring the distortion when the x-ray beam was shifted from the correct center of the object. The effects of deviation of the tomographic focal plane on image quality were evaluated by the MTF (Modulation Transfer Function). Two types of tomograms, one of the plane type and the other of the rotational type, were used in this study. A TMJ model was made from Teflon for the purpose of evaluation by shifting the x-ray beam. The x-ray images were obtained by tilting the model from 0 to 10 degrees in 2-degree increments. These x-ray images were processed by computer image analysis, and the distance between the condyle and the joint space was then measured. To evaluate the influence of a shifted tomographic focal plane on image sharpness, the x-ray images from each setting were analyzed by MTF. To obtain the MTF, a "knife-edge" made from Pb was used. The images were scanned with a microdensitometer at the central focal plane and at 0.5 and 1 mm away from it. The density curves were analyzed by Fourier analysis and the MTF was calculated. The reproducibility of the images worsened when the x-ray beam was shifted. This tendency was similar for both tomogram types. Object characteristics, such as the anterior and posterior portions of the joint space, affected the deterioration of the reproducibility of the tomography. Deviation of the tomographic focal plane also decreased the reproducibility of the x-ray images. The rotational type showed a better MTF, but it degraded seriously with slight changes of the tomographic focal plane. In contrast, the plane type showed a lower MTF, but the image was stable under shifts of the tomographic focal plane. (author)
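
    The knife-edge MTF computation described above, scanning the edge image, differentiating the edge-spread function, and Fourier-transforming the resulting line-spread function, can be sketched as follows with a synthetic blurred edge; the blur width and sampling step are assumptions.

```python
# Sketch: MTF from a knife-edge scan. The measured edge-spread function (ESF)
# is differentiated to a line-spread function (LSF); the normalized modulus
# of its Fourier transform is the MTF.
import numpy as np
from math import erf

dx = 0.01                                   # mm per sample along the scan
x = np.arange(-5, 5, dx)

# Synthetic ESF: an ideal edge blurred by a Gaussian, standing in for the
# density profile scanned across the Pb knife-edge image.
sigma = 0.15                                # mm, assumed blur of the tomogram
esf = np.array([0.5 * (1 + erf(xi / (np.sqrt(2) * sigma))) for xi in x])

lsf = np.gradient(esf, dx)                  # differentiate ESF -> LSF
lsf /= lsf.sum() * dx                       # normalize the LSF area to 1

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                               # enforce MTF(0) = 1
freqs = np.fft.rfftfreq(len(lsf), d=dx)     # spatial frequency, cycles per mm

for f in (0.5, 1.0, 2.0):
    print(f"MTF at {f:.1f} lp/mm: {mtf[np.argmin(np.abs(freqs - f))]:.2f}")
```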

  11. Contrasting response to nutrient manipulation in Arctic mesocosms are reproduced by a minimum microbial food web model.

    Science.gov (United States)

    Larsen, Aud; Egge, Jorun K; Nejstgaard, Jens C; Di Capua, Iole; Thyrhaug, Runar; Bratbak, Gunnar; Thingstad, T Frede

    2015-03-01

    A minimum mathematical model of the marine pelagic microbial food web has previously been shown to reproduce central aspects of the observed system response to different bottom-up manipulations in the Microbial Ecosystem Dynamics (MEDEA) mesocosm experiment in Danish waters. In this study, we apply this model to two mesocosm experiments (Polar Aquatic Microbial Ecology, PAME-I and PAME-II) conducted at the Arctic location Kongsfjorden, Svalbard. The different responses of the microbial community to similar nutrient manipulation in the three mesocosm experiments may be described as diatom-dominated (MEDEA), bacteria-dominated (PAME-I), and flagellate-dominated (PAME-II). When ciliates are allowed to feed on small diatoms, the model describing the diatom-dominated MEDEA experiment gives a bacteria-dominated response as observed in PAME-I, in which the diatom community comprised almost exclusively small-sized cells. Introducing the high initial mesozooplankton stock observed in PAME-II, the model gives a flagellate-dominated response, in accordance with the observed response of this experiment. The ability of the model, originally developed for temperate waters, to reproduce population dynamics in an Arctic fjord 10°C colder does not support the existence of important shifts in population balances over this temperature range. Rather, it suggests a quite resilient microbial food web when adapted to in situ temperature. The sensitivity of the model response to its mesozooplankton component suggests, however, that the seasonal vertical migration of Arctic copepods may be a strong forcing factor on Arctic microbial food webs.

  12. Self-Consistent Dynamical Model of the Broad Line Region

    Energy Technology Data Exchange (ETDEWEB)

    Czerny, Bozena [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Li, Yan-Rong [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, Beijing (China); Sredzinska, Justyna; Hryniewicz, Krzysztof [Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Panda, Swayam [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Wildy, Conor [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Karas, Vladimir, E-mail: bcz@cft.edu.pl [Astronomical Institute, Czech Academy of Sciences, Prague (Czech Republic)

    2017-06-22

    We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.

  13. Self-Consistent Dynamical Model of the Broad Line Region

    Directory of Open Access Journals (Sweden)

    Bozena Czerny

    2017-06-01

    Full Text Available We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.

  14. Can a coupled meteorology–chemistry model reproduce the historical trend in aerosol direct radiative effects over the Northern Hemisphere?

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere h...

  15. Reproducibility in a multiprocessor system

    Science.gov (United States)

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  16. Contextual sensitivity in scientific reproducibility

    Science.gov (United States)

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  17. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  18. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  19. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  20. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...

  1. Power scaling and experimentally fitted model for broad area quantum cascade lasers in continuous wave operation

    Science.gov (United States)

    Suttinger, Matthew; Go, Rowel; Figueiredo, Pedro; Todi, Ankesh; Shu, Hong; Leshin, Jason; Lyakh, Arkadiy

    2018-01-01

    Experimental and model results for 15-stage broad area quantum cascade lasers (QCLs) are presented. Continuous wave (CW) power scaling from 1.62 to 2.34 W has been experimentally demonstrated for 3.15-mm long, high reflection-coated QCLs for an active region width increased from 10 to 20 μm. A semiempirical model for broad area devices operating in CW mode is presented. The model uses measured pulsed transparency current, injection efficiency, waveguide losses, and differential gain as input parameters. It also takes into account active region self-heating and sublinearity of pulsed power versus current laser characteristic. The model predicts that an 11% improvement in maximum CW power and increased wall-plug efficiency can be achieved from 3.15 mm×25 μm devices with 21 stages of the same design, but half doping in the active region. For a 16-stage design with a reduced stage thickness of 300 Å, pulsed rollover current density of 6 kA/cm2, and InGaAs waveguide layers, an optical power increase of 41% is projected. Finally, the model projects that power level can be increased to ˜4.5 W from 3.15 mm×31 μm devices with the baseline configuration with T0 increased from 140 K for the present design to 250 K.
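
    As an illustration of the kind of semi-empirical CW power calculation described above, the sketch below iterates the active-region temperature to self-consistency so that thermal rollover can emerge from an exponential threshold-current dependence. All device parameters (threshold current, slope efficiency, T0, thermal resistance, bias voltage) are hypothetical placeholders, not the fitted values reported in the paper.

```python
import numpy as np

# Hypothetical device parameters (illustrative only, not the fitted values from the paper)
I_TH_300K = 0.9      # pulsed threshold current at 300 K active-region temperature (A)
SLOPE_EFF = 2.0      # slope efficiency (W/A), assumed temperature independent here
T0 = 140.0           # characteristic temperature of the threshold current (K)
R_TH = 2.0           # thermal resistance, active region to heat sink (K/W)
V_BIAS = 12.0        # approximate operating voltage (V)
T_SINK = 293.0       # heat-sink temperature (K)

def cw_power(current_a):
    """CW optical power at a given drive current, with self-consistent self-heating."""
    t_active = T_SINK
    p_opt = 0.0
    for _ in range(200):                      # fixed-point iteration on active-region temperature
        i_th = I_TH_300K * np.exp((t_active - 300.0) / T0)
        p_opt = max(0.0, SLOPE_EFF * (current_a - i_th))
        p_dissipated = V_BIAS * current_a - p_opt
        t_new = T_SINK + R_TH * p_dissipated  # self-heating raises the threshold current,
        if abs(t_new - t_active) < 1e-4:      # which in turn lowers the optical power
            break
        t_active = t_new
    return p_opt

for i in (1.0, 1.5, 2.0, 2.5, 3.0):
    print(f"I = {i:.1f} A  ->  CW power ~ {cw_power(i):.2f} W")
```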

  2. DISCOVERY OF BROAD MOLECULAR LINES AND OF SHOCKED MOLECULAR HYDROGEN FROM THE SUPERNOVA REMNANT G357.7+0.3: HHSMT, APEX, SPITZER , AND SOFIA OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Rho, J. [SETI Institute, 189 N. Bernardo Ave., Mountain View, CA 94043 (United States); Hewitt, J. W. [CRESST/University of Maryland, Baltimore County, Baltimore, MD 21250 and NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Bieging, J. [Steward Observatory, The University of Arizona, Tucson AZ 85721 (United States); Reach, W. T. [Universities Space Research Association, SOFIA Science Center, NASA Ames Research Center, MS 232, Moffett Field, CA 94034 (United States); Andersen, M. [Gemini Observatory, Casilla 603, La Serena (Chile); Güsten, R., E-mail: jrho@seti.org, E-mail: john.w.hewitt@unf.edu, E-mail: jbieging@as.arizona.edu, E-mail: wreach@sofia.usra.edu, E-mail: manderse@gemini.edu, E-mail: guesten@mpifr-bonn.mpg.de [Max Planck Institut für Radioastronomie, Auf dem Hugel 69, D-53121 Bonn (Germany)

    2017-01-01

    We report a discovery of shocked gas from the supernova remnant (SNR) G357.7+0.3. Our millimeter and submillimeter observations reveal broad molecular lines of CO(2-1), CO(3-2), CO(4-3), ¹³CO(2-1) and ¹³CO(3-2), HCO⁺, and HCN using the Heinrich Hertz Submillimeter Telescope, the Arizona 12 m Telescope, APEX, and the MOPRA Telescope. The widths of the broad lines are 15–30 km s⁻¹, and the detection of such broad lines is unambiguous, dynamic evidence showing that the SNR G357.7+0.3 is interacting with molecular clouds. The broad lines appear in extended regions (>4.5′ × 5′). We also present the detection of shocked H₂ emission in the mid-infrared, but lacking ionic lines, using Spitzer/IRS observations to map a few-arcminute area. The H₂ excitation diagram shows a best fit with a two-temperature local thermal equilibrium model with temperatures of ∼200 and 660 K. We observed [C ii] at 158 μm and high-J CO(11-10) with the German Receiver for Astronomy at Terahertz Frequencies (GREAT) on the Stratospheric Observatory for Infrared Astronomy. The GREAT spectrum of [C ii], a 3σ detection, shows a broad line profile with a width of 15.7 km s⁻¹ that is similar to those of the broad CO molecular lines. The line width of [C ii] implies that ionic lines can come from a low-velocity C-shock. Comparison of H₂ emission with shock models shows that a combination of two C-shock models is favored over a combination of C- and J-shocks or a single shock. We estimate the CO density, column density, and temperature using a RADEX model. The best-fit model with n(H₂) = 1.7 × 10⁴ cm⁻³, N(CO) = 5.6 × 10¹⁶ cm⁻², and T = 75 K can reproduce the observed millimeter CO brightnesses.

  3. Validation of EURO-CORDEX regional climate models in reproducing the variability of precipitation extremes in Romania

    Science.gov (United States)

    Dumitrescu, Alexandru; Busuioc, Aristita

    2016-04-01

    EURO-CORDEX is the European branch of the international CORDEX initiative that aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of the individual models in reproducing the variability of precipitation extremes in Romania. Here an ensemble of three EURO-CORDEX regional climate models (RCMs) (scenario RCP4.5) is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. Compared to previous studies, in which RCM validation for the Romanian climate has mainly been made on the mean state and at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to obtain a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and observed grid-point values has been made by calculating three extreme precipitation indices, recommended by the Expert Team on Climate Change Detection Indices (ETCCDI), for the 1976-2005 period: R10MM, the annual count of days when precipitation ≥ 10 mm; RX5DAY, the annual maximum 5-day precipitation; and R95P%, the fraction of annual total precipitation due to daily precipitation > the 95th percentile. The RCMs' capability to reproduce the mean state of these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions). This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian
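
    For reference, the three ETCCDI indices used in this comparison can be computed directly from a daily precipitation series. The sketch below follows the definitions quoted above on a synthetic series standing in for one CLIMHYDEX grid point; the wet-day threshold (1 mm) and the use of the full 1976-2005 sample as the percentile base period are simplifying assumptions.

```python
import numpy as np
import pandas as pd

# Synthetic daily precipitation (mm/day) standing in for one grid point, 1976-2005
rng = np.random.default_rng(0)
dates = pd.date_range("1976-01-01", "2005-12-31", freq="D")
precip = pd.Series(rng.gamma(shape=0.4, scale=8.0, size=len(dates)), index=dates)

# R10MM: annual count of days with precipitation >= 10 mm
r10mm = (precip >= 10.0).groupby(precip.index.year).sum()

# RX5DAY: annual maximum 5-day accumulated precipitation
rx5day = precip.rolling(5).sum().groupby(precip.index.year).max()

# R95P%: fraction of the annual total falling on days wetter than the 95th
# percentile of wet days (>= 1 mm), here taken over the whole period
p95 = precip[precip >= 1.0].quantile(0.95)
above = precip.where(precip > p95, 0.0)
r95p_pct = 100.0 * above.groupby(above.index.year).sum() / precip.groupby(precip.index.year).sum()

print(pd.DataFrame({"R10MM": r10mm, "RX5DAY": rx5day, "R95P%": r95p_pct}).head())
```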

  4. Measurement of cerebral blood flow by intravenous xenon-133 technique and a mobile system. Reproducibility using the Obrist model compared to total curve analysis

    DEFF Research Database (Denmark)

    Schroeder, T; Holstein, P; Lassen, N A

    1986-01-01

    and side-to-side asymmetry. Data were analysed according to the Obrist model and the results compared with those obtained using a model correcting for the air passage artifact. Reproducibility was of the same order of magnitude as reported using stationary equipment. The side-to-side CBF asymmetry was considerably more reproducible than CBF level. Using a single detector, instead of averaging five regional values as the hemispheric flow, increased the standard deviation of CBF level by 10-20%, while the variation in asymmetry was doubled. In optimal measuring conditions the two models revealed no significant differences, but in low flow situations the artifact model yielded significantly more stable results. The present apparatus, equipped with 3-5 detectors covering each hemisphere, offers the opportunity of performing serial CBF measurements in situations not otherwise feasible.

  5. Improving Students' Understanding of Molecular Structure through Broad-Based Use of Computer Models in the Undergraduate Organic Chemistry Lecture

    Science.gov (United States)

    Springer, Michael T.

    2014-01-01

    Several articles suggest how to incorporate computer models into the organic chemistry laboratory, but relatively few papers discuss how to incorporate these models broadly into the organic chemistry lecture. Previous research has suggested that "manipulating" physical or computer models enhances student understanding; this study…

  6. A stable and reproducible human blood-brain barrier model derived from hematopoietic stem cells.

    Directory of Open Access Journals (Sweden)

    Romeo Cecchelli

    Full Text Available The human blood brain barrier (BBB) is a selective barrier formed by human brain endothelial cells (hBECs), which is important to ensure adequate neuronal function and protect the central nervous system (CNS) from disease. The development of human in vitro BBB models is thus of utmost importance for drug discovery programs related to CNS diseases. Here, we describe a method to generate a human BBB model using cord blood-derived hematopoietic stem cells. The cells were initially differentiated into ECs followed by the induction of BBB properties by co-culture with pericytes. The brain-like endothelial cells (BLECs) express tight junctions and transporters typically observed in brain endothelium and maintain expression of most in vivo BBB properties for at least 20 days. The model is very reproducible since it can be generated from stem cells isolated from different donors and in different laboratories, and could be used to predict CNS distribution of compounds in human. Finally, we provide evidence that Wnt/β-catenin signaling pathway mediates in part the BBB inductive properties of pericytes.

  7. Prediction of lung tumour position based on spirometry and on abdominal displacement: Accuracy and reproducibility

    International Nuclear Information System (INIS)

    Hoisak, Jeremy D.P.; Sixel, Katharina E.; Tirona, Romeo; Cheung, Patrick C.F.; Pignol, Jean-Philippe

    2006-01-01

    Background and purpose: A simulation investigating the accuracy and reproducibility of a tumour motion prediction model over clinical time frames is presented. The model is formed from surrogate and tumour motion measurements, and used to predict the future position of the tumour from surrogate measurements alone. Patients and methods: Data were acquired from five non-small cell lung cancer patients, on 3 days. Measurements of respiratory volume by spirometry and abdominal displacement by a real-time position tracking system were acquired simultaneously with X-ray fluoroscopy measurements of superior-inferior tumour displacement. A model of tumour motion was established and used to predict future tumour position, based on surrogate input data. The calculated position was compared against true tumour motion as seen on fluoroscopy. Three different imaging strategies, pre-treatment, pre-fraction and intrafractional imaging, were employed in establishing the fitting parameters of the prediction model. The impact of each imaging strategy upon accuracy and reproducibility was quantified. Results: When establishing the predictive model using pre-treatment imaging, four of five patients exhibited poor interfractional reproducibility for either surrogate in subsequent sessions. Simulating the formulation of the predictive model prior to each fraction resulted in improved interfractional reproducibility. The accuracy of the prediction model was only improved in one of five patients when intrafractional imaging was used. Conclusions: Employing a prediction model established from measurements acquired at planning resulted in localization errors. Pre-fractional imaging improved the accuracy and reproducibility of the prediction model. Intrafractional imaging was of less value, suggesting that the accuracy limit of a surrogate-based prediction model is reached with once-daily imaging
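
    A minimal sketch of a surrogate-based prediction model of the type evaluated here: superior-inferior tumour position modelled as a linear function of respiratory volume and abdominal displacement, fitted against fluoroscopy at planning and then reused on later surrogate data. The signals below are synthetic, the linear form is an assumption rather than the authors' model, and the deliberate drift stands in for the interfractional changes reported.

```python
import numpy as np

rng = np.random.default_rng(1)

def breathing(n, drift=0.0):
    """Synthetic surrogate/tumour traces: respiratory volume, abdominal displacement, SI tumour position."""
    t = np.arange(n) * 0.1
    volume = np.sin(2 * np.pi * 0.25 * t) + 0.05 * rng.standard_normal(n)
    abdomen = 0.8 * np.sin(2 * np.pi * 0.25 * t - 0.3) + 0.05 * rng.standard_normal(n)
    tumour = 6.0 * volume + 2.0 * abdomen + drift + 0.3 * rng.standard_normal(n)  # mm
    return volume, abdomen, tumour

# "Pre-treatment imaging": fit the model once, at planning
v0, a0, y0 = breathing(600)
X0 = np.column_stack([v0, a0, np.ones_like(v0)])
coeffs, *_ = np.linalg.lstsq(X0, y0, rcond=None)

# A later fraction in which the surrogate-tumour relationship has drifted
v1, a1, y1 = breathing(600, drift=3.0)
X1 = np.column_stack([v1, a1, np.ones_like(v1)])
err_pretreatment = np.sqrt(np.mean((X1 @ coeffs - y1) ** 2))

# "Pre-fraction imaging": refit from a short fluoroscopy acquisition at the start of the fraction
coeffs_refit, *_ = np.linalg.lstsq(X1[:100], y1[:100], rcond=None)
err_prefraction = np.sqrt(np.mean((X1[100:] @ coeffs_refit - y1[100:]) ** 2))

print(f"RMS error, planning-only model: {err_pretreatment:.2f} mm")
print(f"RMS error, pre-fraction refit:  {err_prefraction:.2f} mm")
```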

  8. Field Validation of Habitat Suitability Models for Vulnerable Marine Ecosystems in the South Pacific Ocean: Implications for the use of Broad-scale Models in Fisheries Management

    Science.gov (United States)

    Anderson, O. F.; Guinotte, J. M.; Clark, M. R.; Rowden, A. A.; Mormede, S.; Davies, A. J.; Bowden, D.

    2016-02-01

    Spatial management of vulnerable marine ecosystems requires accurate knowledge of their distribution. Predictive habitat suitability modelling, using species presence data and a suite of environmental predictor variables, has emerged as a useful tool for inferring distributions outside of known areas. However, validation of model predictions is typically performed with non-independent data. In this study, we describe the results of habitat suitability models constructed for four deep-sea reef-forming coral species across a large region of the South Pacific Ocean using MaxEnt and Boosted Regression Tree modelling approaches. In order to validate model predictions we conducted a photographic survey on a set of seamounts in an un-sampled area east of New Zealand. The likelihood of habitat suitable for reef forming corals on these seamounts was predicted to be variable, but very high in some regions, particularly where levels of aragonite saturation, dissolved oxygen, and particulate organic carbon were optimal. However, the observed frequency of coral occurrence in analyses of survey photographic data was much lower than expected, and patterns of observed versus predicted coral distribution were not highly correlated. The poor performance of these broad-scale models is attributed to lack of recorded species absences to inform the models, low precision of global bathymetry models, and lack of data on the geomorphology and substrate of the seamounts at scales appropriate to the modelled taxa. This demonstrates the need to use caution when interpreting and applying broad-scale, presence-only model results for fisheries management and conservation planning in data poor areas of the deep sea. Future improvements in the predictive performance of broad-scale models will rely on the continued advancement in modelling of environmental predictor variables, refinements in modelling approaches to deal with missing or biased inputs, and incorporation of true absence data.
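
    A hedged sketch of the boosted-regression-tree side of such a workflow, using scikit-learn's GradientBoostingClassifier on synthetic presence/background points with three environmental predictors (aragonite saturation, dissolved oxygen, particulate organic carbon). The data, predictor names and tuning values are placeholders; the study's actual MaxEnt/BRT configuration is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 2000

# Synthetic environmental predictors at presence and background locations
aragonite = rng.normal(1.2, 0.4, n)
oxygen = rng.normal(180.0, 40.0, n)
poc = rng.lognormal(0.0, 0.5, n)

# Synthetic "suitability" used only to label presences in this toy example
logit = 3.0 * (aragonite - 1.0) + 0.01 * (oxygen - 160.0) + 0.8 * np.log(poc)
presence = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([aragonite, oxygen, poc])
X_tr, X_te, y_tr, y_te = train_test_split(X, presence, test_size=0.3, random_state=0)

brt = GradientBoostingClassifier(n_estimators=500, learning_rate=0.01, max_depth=3)
brt.fit(X_tr, y_tr)

# Internal (non-independent) evaluation, the kind the study cautions against relying on
print("held-out AUC:", round(roc_auc_score(y_te, brt.predict_proba(X_te)[:, 1]), 3))
print("relative influence:", dict(zip(["aragonite", "oxygen", "POC"],
                                      brt.feature_importances_.round(3))))
```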

  9. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Science.gov (United States)

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics—such as mass or velocity-- there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results—by making it difficult to determine the specifics of the intended meanings of terms such as deforestation or carbon flux -- as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as percent cover of forest, where the term "forest" is undefined; or where a reported output of "total carbon-emissions" might just include CO2 emissions, but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org), is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon cycling concepts using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth and life science domains, including OBO-foundry ontologies such as ENVO and BCO; the ISO/OGC O&M; and the NSF Earthcube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own
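
    To make the 'carbon emissions' ambiguity concrete, the sketch below uses the rdflib Python library to tie a data label to an explicit concept with RDF/OWL. The namespace URI and class names are invented for illustration; they are not DataONE's actual vocabulary, nor terms from ENVO, BCO or GeoLink.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL

# Hypothetical namespace (not the real DataONE/ENVO/GeoLink vocabulary)
EX = Namespace("http://example.org/carbon#")

g = Graph()
g.bind("ex", EX)

# Distinguish two concepts that the bare label "carbon emissions" conflates
g.add((EX.CarbonEmission, RDF.type, OWL.Class))
g.add((EX.CO2Emission, RDF.type, OWL.Class))
g.add((EX.MethaneEmission, RDF.type, OWL.Class))
g.add((EX.CO2Emission, RDFS.subClassOf, EX.CarbonEmission))
g.add((EX.MethaneEmission, RDFS.subClassOf, EX.CarbonEmission))
g.add((EX.CO2Emission, RDFS.label, Literal("carbon dioxide emission")))
g.add((EX.MethaneEmission, RDFS.label, Literal("methane emission")))

# A dataset column annotated with the narrower concept, so a reported
# "total carbon-emissions" output is unambiguous when the data are reused
g.add((EX.column_total_emissions, RDF.type, EX.CO2Emission))

print(g.serialize(format="turtle"))
```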

  10. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  11. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    Science.gov (United States)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting agent hypothesis instead of the widely-used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time-scales. We find also that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturbate the financial system.
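
    A minimal sketch of how volatility return intervals are extracted from a model of this general class: a nonlinear (multiplicative-noise) stochastic differential equation is integrated with an Euler-Maruyama scheme and the waiting times between exceedances of a high volatility threshold are collected. The exponential Ornstein-Uhlenbeck form and the parameters below are simple placeholders, not the herding-derived equations of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for the volatility process: an exponential Ornstein-Uhlenbeck SDE
# integrated with Euler-Maruyama (placeholder dynamics, not the authors' model).
theta, mu, sigma = 0.05, 0.0, 0.4
dt = 1.0
n_steps = 200_000

log_v = np.empty(n_steps)
log_v[0] = mu
noise = rng.standard_normal(n_steps - 1)
for i in range(1, n_steps):
    log_v[i] = log_v[i - 1] - theta * (log_v[i - 1] - mu) * dt + sigma * np.sqrt(dt) * noise[i - 1]
volatility = np.exp(log_v)

# Volatility return intervals: waiting times between exceedances of a high quantile q
q = np.quantile(volatility, 0.95)
exceed = np.flatnonzero(volatility > q)
intervals = np.diff(exceed)
intervals = intervals[intervals > 1]    # drop consecutive exceedances within one cluster
print(f"threshold q = {q:.2f}, mean return interval = {intervals.mean():.1f} steps")
```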

  12. A novel, comprehensive, and reproducible porcine model for determining the timing of bruises in forensic pathology

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2016-01-01

    Purpose Calculating the timing of bruises is crucial in forensic pathology but is a challenging discipline in both human and veterinary medicine. A mechanical device for inflicting bruises in pigs was developed and validated, and the pathological reactions in the bruises were studied over time......-dependent response. Combining these parameters, bruises could be grouped as being either less than 4 h old or between 4 and 10 h of age. Gross lesions and changes in the epidermis and dermis were inconclusive with respect to time determination. Conclusions The model was reproducible and resembled forensic cases...

  13. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    Full Text Available The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs-locomotor bouts-matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.
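
    A toy version of the fluctuation-driven mechanism described: a single noisy rate unit with a stable low-activity equilibrium emits a 'locomotor bout' whenever ongoing fluctuations carry its activity over a threshold, and an 'odor' input that shifts the equilibrium changes the bout rate. The dynamics and parameters are illustrative and are not the authors' fitted network models.

```python
import numpy as np

rng = np.random.default_rng(4)

def bout_rate(drive=0.0, noise_sd=0.5, n_steps=200_000, dt=0.01):
    """Noisy rate unit dx/dt = (-x + drive)/tau + noise; a bout starts on an upward threshold crossing."""
    tau, threshold = 1.0, 0.8
    x, bouts, above = 0.0, 0, False
    for _ in range(n_steps):
        x += dt / tau * (-x + drive) + noise_sd * np.sqrt(dt) * rng.standard_normal()
        if x > threshold and not above:
            bouts += 1              # count a bout only on the upward crossing
            above = True
        elif x < 0.5 * threshold:
            above = False           # hysteresis: the bout ends well below threshold
    return bouts / (n_steps * dt)   # bouts per unit time

print("basal bout rate:       ", round(bout_rate(drive=0.0), 3))
print("odor-shifted bout rate:", round(bout_rate(drive=0.4), 3))
```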

  14. Repeatability and reproducibility of Population Viability Analysis (PVA) and the implications for threatened species management

    Directory of Open Access Journals (Sweden)

    Clare Morrison

    2016-08-01

    Full Text Available Conservation triage focuses on prioritizing species, populations or habitats based on urgency, biodiversity benefits, recovery potential as well as cost. Population Viability Analysis (PVA) is frequently used in population focused conservation prioritizations. The critical nature of many of these management decisions requires that PVA models are repeatable and reproducible to reliably rank species and/or populations quantitatively. This paper assessed the repeatability and reproducibility of a subset of previously published PVA models. We attempted to rerun baseline models from 90 publicly available PVA studies published between 2000-2012 using the two most common PVA modelling software programs, VORTEX and RAMAS-GIS. Forty percent (n = 36) failed, 50% (45) were both repeatable and reproducible, and 10% (9) had missing baseline models. Repeatability was not linked to taxa, IUCN category, PVA program version used, year published or the quality of publication outlet, suggesting that the problem is systemic within the discipline. Complete and systematic presentation of PVA parameters and results are needed to ensure that the scientific input into conservation planning is both robust and reliable, thereby increasing the chances of making decisions that are both beneficial and defensible. The implications for conservation triage may be far reaching if population viability models cannot be reproduced with confidence, thus undermining their intended value.

  15. On the origin of reproducible sequential activity in neural circuits

    Science.gov (United States)

    Afraimovich, V. S.; Zhigulin, V. P.; Rabinovich, M. I.

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses is an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model, that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in presence of noise in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.
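
    A minimal sketch of the generalized Lotka-Volterra rate model referred to above: with an asymmetric inhibition matrix (weak inhibition onto the next unit in the sequence, strong inhibition onto all others) the activity is handed from one population to the next along a reproducible order, and a small positive noise term keeps the transient moving rather than freezing at a saddle. The connection strengths are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

N = 5                         # number of competing neural populations
sigma = np.ones(N)            # intrinsic growth rates
rho = np.full((N, N), 1.5)    # strong mutual inhibition by default
np.fill_diagonal(rho, 1.0)
for i in range(N):            # weak inhibition onto the *next* unit in the sequence,
    rho[(i + 1) % N, i] = 0.5 # so activity is handed from unit i to unit i+1

dt, n_steps, noise = 0.01, 60_000, 1e-6
a = np.full(N, 1e-3)
a[0] = 0.5                    # start the sequence at unit 0
winners = []
for _ in range(n_steps):
    da = a * (sigma - rho @ a)
    a = np.clip(a + dt * da + noise * rng.random(N), 1e-9, None)
    winners.append(int(np.argmax(a)))

# The dominant unit visits 0 -> 1 -> 2 -> 3 -> 4 -> 0 ... in a reproducible order
changes = [winners[0]] + [w for k, w in enumerate(winners[1:], 1) if w != winners[k - 1]]
print("switching order of the dominant unit:", changes[:12])
```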

  16. Mouse Models of Diet-Induced Nonalcoholic Steatohepatitis Reproduce the Heterogeneity of the Human Disease

    Science.gov (United States)

    Machado, Mariana Verdelho; Michelotti, Gregory Alexander; Xie, Guanhua; de Almeida, Thiago Pereira; Boursier, Jerome; Bohnic, Brittany; Guy, Cynthia D.; Diehl, Anna Mae

    2015-01-01

    Background and aims Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Methods Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. Results The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Conclusion Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH. PMID:26017539

  17. Mouse models of diet-induced nonalcoholic steatohepatitis reproduce the heterogeneity of the human disease.

    Directory of Open Access Journals (Sweden)

    Mariana Verdelho Machado

    Full Text Available Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH.

  18. Broad spectrum microarray for fingerprint-based bacterial species identification

    Directory of Open Access Journals (Sweden)

    Frey Jürg E

    2010-02-01

    Full Text Available Abstract Background Microarrays are powerful tools for DNA-based molecular diagnostics and identification of pathogens. Most target a limited range of organisms and are based on only one or a very few genes for specific identification. Such microarrays are limited to organisms for which specific probes are available, and often have difficulty discriminating closely related taxa. We have developed an alternative broad-spectrum microarray that employs hybridisation fingerprints generated by high-density anonymous markers distributed over the entire genome for identification based on comparison to a reference database. Results A high-density microarray carrying 95,000 unique 13-mer probes was designed. Optimized methods were developed to deliver reproducible hybridisation patterns that enabled confident discrimination of bacteria at the species, subspecies, and strain levels. High correlation coefficients were achieved between replicates. A sub-selection of 12,071 probes, determined by ANOVA and class prediction analysis, enabled the discrimination of all samples in our panel. Mismatch probe hybridisation was observed but was found to have no effect on the discriminatory capacity of our system. Conclusions These results indicate the potential of our genome chip for reliable identification of a wide range of bacterial taxa at the subspecies level without laborious prior sequencing and probe design. With its high resolution capacity, our proof-of-principle chip demonstrates great potential as a tool for molecular diagnostics of broad taxonomic groups.

  19. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    osteosarcoma model was shown to be feasible: the take rate was high, surgical mortality was negligible and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement and may be recommended for monitoring tumor growth in this model.

  20. Reproducibility study of TLD-100 micro-cubes at radiotherapy dose level

    International Nuclear Information System (INIS)

    Rosa, Luiz Antonio R. da; Regulla, Dieter F.; Fill, Ute A.

    1999-01-01

    The precision of the thermoluminescent response of Harshaw micro-cube dosimeters (TLD-100), evaluated in both Harshaw thermoluminescent readers 5500 and 3500, for 1 Gy dose value, was investigated. The mean reproducibility for micro-cubes, pre-readout annealed at 100 deg. C for 15 min, evaluated with the manual planchet reader 3500, is 0.61% (1 standard deviation). When micro-cubes are evaluated with the automated hot-gas reader 5500, reproducibility values are undoubtedly worse, mean reproducibility for numerically stabilised dosimeters being equal to 3.27% (1 standard deviation). These results indicate that the reader model 5500, or, at least, the instrument used for the present measurements, is not adequate for micro-cube evaluation, if precise and accurate dosimetry is required. The difference in precision is apparently due to geometry inconsistencies in the orientation of the imperfect micro-cube faces during readout, requiring careful and manual reproducible arrangement of the selected micro-cube faces in contact with the manual reader planchet

  1. Quantifying the Accuracy of Digital Hemispherical Photography for Leaf Area Index Estimates on Broad-Leaved Tree Species.

    Science.gov (United States)

    Gilardelli, Carlo; Orlando, Francesca; Movedi, Ermes; Confalonieri, Roberto

    2018-03-29

    Digital hemispherical photography (DHP) has been widely used to estimate leaf area index (LAI) in forestry. Despite the advancement in the processing of hemispherical images with dedicated tools, several steps are still manual and thus easily affected by user's experience and sensibility. The purpose of this study was to quantify the impact of user's subjectivity on DHP LAI estimates for broad-leaved woody canopies using the software Can-Eye. Following the ISO 5725 protocol, we quantified the repeatability and reproducibility of the method, thus defining its precision for a wide range of broad-leaved canopies markedly differing for their structure. To get a complete evaluation of the method accuracy, we also quantified its trueness using artificial canopy images with known canopy cover. Moreover, the effect of the segmentation method was analysed. The best results for precision (restrained limits of repeatability and reproducibility) were obtained for high LAI values (>5) with limits corresponding to a variation of 22% in the estimated LAI values. Poorer results were obtained for medium and low LAI values, with a variation of the estimated LAI values that exceeded the 40%. Regardless of the LAI range explored, satisfactory results were achieved for trees in row-structured plantations (limits almost equal to the 30% of the estimated LAI). Satisfactory results were achieved for trueness, regardless of the canopy structure. The paired t -test revealed that the effect of the segmentation method on LAI estimates was significant. Despite a non-negligible user effect, the accuracy metrics for DHP are consistent with those determined for other indirect methods for LAI estimates, confirming the overall reliability of DHP in broad-leaved woody canopies.

  2. Quantifying the Accuracy of Digital Hemispherical Photography for Leaf Area Index Estimates on Broad-Leaved Tree Species

    Directory of Open Access Journals (Sweden)

    Carlo Gilardelli

    2018-03-01

    Full Text Available Digital hemispherical photography (DHP) has been widely used to estimate leaf area index (LAI) in forestry. Despite the advancement in the processing of hemispherical images with dedicated tools, several steps are still manual and thus easily affected by user’s experience and sensibility. The purpose of this study was to quantify the impact of user’s subjectivity on DHP LAI estimates for broad-leaved woody canopies using the software Can-Eye. Following the ISO 5725 protocol, we quantified the repeatability and reproducibility of the method, thus defining its precision for a wide range of broad-leaved canopies markedly differing for their structure. To get a complete evaluation of the method accuracy, we also quantified its trueness using artificial canopy images with known canopy cover. Moreover, the effect of the segmentation method was analysed. The best results for precision (restrained limits of repeatability and reproducibility) were obtained for high LAI values (>5) with limits corresponding to a variation of 22% in the estimated LAI values. Poorer results were obtained for medium and low LAI values, with a variation of the estimated LAI values that exceeded the 40%. Regardless of the LAI range explored, satisfactory results were achieved for trees in row-structured plantations (limits almost equal to the 30% of the estimated LAI). Satisfactory results were achieved for trueness, regardless of the canopy structure. The paired t-test revealed that the effect of the segmentation method on LAI estimates was significant. Despite a non-negligible user effect, the accuracy metrics for DHP are consistent with those determined for other indirect methods for LAI estimates, confirming the overall reliability of DHP in broad-leaved woody canopies.
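
    The precision figures quoted in these two records follow the ISO 5725 convention of repeatability and reproducibility limits (limit ≈ 2.8 × the corresponding standard deviation). The sketch below shows the computation on synthetic LAI estimates from several hypothetical operators re-processing the same images; the data and operator structure are invented, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic LAI estimates: n_ops operators each process the same image set n_rep times
n_ops, n_rep, true_lai = 6, 3, 4.0
operator_bias = rng.normal(0.0, 0.25, n_ops)          # between-operator effect
estimates = true_lai + operator_bias[:, None] + rng.normal(0.0, 0.15, (n_ops, n_rep))

# Repeatability variance: pooled within-operator variance
s_r2 = estimates.var(axis=1, ddof=1).mean()
# Between-operator variance component (one-way ANOVA decomposition)
s_L2 = max(0.0, estimates.mean(axis=1).var(ddof=1) - s_r2 / n_rep)
# Reproducibility variance combines both
s_R2 = s_r2 + s_L2

repeatability_limit = 2.8 * np.sqrt(s_r2)    # ISO 5725: r = 2.8 * s_r
reproducibility_limit = 2.8 * np.sqrt(s_R2)  # ISO 5725: R = 2.8 * s_R
print(f"r = {repeatability_limit:.2f} LAI units, R = {reproducibility_limit:.2f} LAI units")
print(f"as % of mean LAI: r = {100*repeatability_limit/estimates.mean():.0f}%, "
      f"R = {100*reproducibility_limit/estimates.mean():.0f}%")
```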

  3. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare.As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  4. Screening of broad spectrum natural pesticides against conserved target arginine kinase in cotton pests by molecular modeling.

    Science.gov (United States)

    Sakthivel, Seethalakshmi; Habeeb, S K M; Raman, Chandrasekar

    2018-03-12

    Cotton is an economically important crop and its production is challenged by the diversity of pests and related insecticide resistance. Identification of the conserved target across the cotton pest will help to design broad spectrum insecticide. In this study, we have identified conserved sequences by Expressed Sequence Tag profiling from three cotton pests namely Aphis gossypii, Helicoverpa armigera, and Spodoptera exigua. One target protein arginine kinase having a key role in insect physiology and energy metabolism was studied further using homology modeling, virtual screening, molecular docking, and molecular dynamics simulation to identify potential biopesticide compounds from the Zinc natural database. We have identified four compounds having excellent inhibitor potential against the identified broad spectrum target which are highly specific to invertebrates.

  5. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    Science.gov (United States)

    2017-08-09

    project.org/) and SPSS (IBM Corp., Armonk, NY) for data analysis. Mean and confidence intervals for each measure are found in Tables 1–7. To assess...visits, and was calculated using a two-way mixed model in SPSS. MCV and MRD values closer to 0 are considered to be the most reproducible, and ICC

  6. Hopping models for ion conduction in noncrystals

    DEFF Research Database (Denmark)

    Dyre, Jeppe; Schrøder, Thomas

    2007-01-01

    semiconductors). These universalities are the subject of much current interest, for instance interpreted in the context of simple hopping models. In the present paper we first discuss the temperature dependence of the dc conductivity in hopping models and the importance of the percolation phenomenon. Next, the experimental (quasi)universality of the ac conductivity is discussed. It is shown that hopping models are able to reproduce the experimental finding that the response obeys time-temperature superposition, while at the same time a broad range of activation energies is involved in the conduction process. Again...

  7. Relevant principal factors affecting the reproducibility of insect primary culture.

    Science.gov (United States)

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.
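
    A hedged sketch of the kind of linear mixed model analysis described, using statsmodels' MixedLM on a synthetic data set in which cell yield depends on surgical duration (a fixed effect) while donor-to-donor differences enter as a random intercept. Column names, effect sizes and the yield measure are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Synthetic primary-culture data: several explants per donor larva
n_donors, n_per_donor = 12, 6
donor = np.repeat(np.arange(n_donors), n_per_donor)
donor_effect = rng.normal(0.0, 1.0, n_donors)             # biological variation between donors
surgical_min = rng.uniform(5.0, 30.0, n_donors * n_per_donor)
cell_yield = (10.0 - 0.15 * surgical_min + donor_effect[donor]
              + rng.normal(0.0, 0.5, n_donors * n_per_donor))  # technical variation

df = pd.DataFrame({"cell_yield": cell_yield, "surgical_min": surgical_min, "donor": donor})

# Random intercept per donor separates interindividual from technical variance
model = smf.mixedlm("cell_yield ~ surgical_min", df, groups=df["donor"])
result = model.fit()
print(result.summary())
print("donor (random-intercept) variance:", float(result.cov_re.iloc[0, 0]))
print("residual (technical) variance:   ", result.scale)
```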

  8. Silver nanowires for highly reproducible cantilever based AFM-TERS microscopy: towards a universal TERS probe.

    Science.gov (United States)

    Walke, Peter; Fujita, Yasuhiko; Peeters, Wannes; Toyouchi, Shuichi; Frederickx, Wout; De Feyter, Steven; Uji-I, Hiroshi

    2018-04-26

    Tip-enhanced Raman scattering (TERS) microscopy is a unique analytical tool to provide complementary chemical and topographic information of surfaces with nanometric resolution. However, difficulties in reliably producing the necessary metallized scanning probe tips has limited its widespread utilisation, particularly in the case of cantilever-based atomic force microscopy. Attempts to alleviate tip related issues using colloidal or bottom-up engineered tips have so far not reported consistent probes for both Raman and topographic imaging. Here we demonstrate the reproducible fabrication of cantilever-based high-performance TERS probes for both topographic and Raman measurements, based on an approach that utilises noble metal nanowires as the active TERS probe. The tips show 10 times higher TERS contrasts than the most typically used electrochemically-etched tips, and show a reproducibility for TERS greater than 90%, far greater than found with standard methods. We show that TERS can be performed in tapping as well as contact AFM mode, with optical resolutions around or below 15 nm, and with a maximum resolution achieved in tapping-mode of 6 nm. Our work illustrates that superior TERS probes can be produced in a fast and cost-effective manner using simple wet-chemistry methods, leading to reliable and reproducible high-resolution and high-sensitivity TERS, and thus renders the technique applicable for a broad community.

  9. Reproducible Hydrogeophysical Inversions through the Open-Source Library pyGIMLi

    Science.gov (United States)

    Wagner, F. M.; Rücker, C.; Günther, T.

    2017-12-01

    Many tasks in applied geosciences cannot be solved by a single measurement method and require the integration of geophysical, geotechnical and hydrological methods. In the emerging field of hydrogeophysics, researchers strive to gain quantitative information on process-relevant subsurface parameters by means of multi-physical models, which simulate the dynamic process of interest as well as its geophysical response. However, such endeavors are associated with considerable technical challenges, since they require coupling of different numerical models. This represents an obstacle for many practitioners and students. Even technically versatile users tend to build individually tailored solutions by coupling different (and potentially proprietary) forward simulators at the cost of scientific reproducibility. We argue that the reproducibility of studies in computational hydrogeophysics, and therefore the advancement of the field itself, requires versatile open-source software. To this end, we present pyGIMLi - a flexible and computationally efficient framework for modeling and inversion in geophysics. The object-oriented library provides management for structured and unstructured meshes in 2D and 3D, finite-element and finite-volume solvers, various geophysical forward operators, as well as Gauss-Newton based frameworks for constrained, joint and fully-coupled inversions with flexible regularization. In a step-by-step demonstration, it is shown how the hydrogeophysical response of a saline tracer migration can be simulated. Tracer concentration data from boreholes and measured voltages at the surface are subsequently used to estimate the hydraulic conductivity distribution of the aquifer within a single reproducible Python script.
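
    Independently of the pyGIMLi API, the Gauss-Newton machinery mentioned above can be illustrated with a small NumPy sketch of a damped (Tikhonov-regularized) update for a generic nonlinear forward operator; the toy forward model and parameters below are placeholders rather than a geophysical problem, and the Jacobian is formed by finite differences.

```python
import numpy as np

def forward(m, t):
    """Toy nonlinear forward operator: decaying exponential plus a constant offset."""
    a, k, c = m
    return a * np.exp(-k * t) + c

def jacobian(m, t, eps=1e-6):
    """Finite-difference Jacobian of the forward response with respect to the model parameters."""
    J = np.empty((t.size, m.size))
    f0 = forward(m, t)
    for j in range(m.size):
        dm = m.copy()
        dm[j] += eps
        J[:, j] = (forward(dm, t) - f0) / eps
    return J

# Synthetic data from a known model, with noise
rng = np.random.default_rng(8)
t = np.linspace(0.0, 5.0, 60)
m_true = np.array([2.0, 0.8, 0.5])
d_obs = forward(m_true, t) + 0.02 * rng.standard_normal(t.size)

# Gauss-Newton iterations minimizing ||d_obs - f(m)||^2 + lam * ||m - m_ref||^2
m = np.array([1.0, 1.0, 0.0])      # starting model, also used as the reference model
m_ref, lam = m.copy(), 1e-2
for _ in range(20):
    r = d_obs - forward(m, t)
    J = jacobian(m, t)
    A = J.T @ J + lam * np.eye(m.size)
    b = J.T @ r - lam * (m - m_ref)
    dm = np.linalg.solve(A, b)
    m = m + dm
    if np.linalg.norm(dm) < 1e-8:
        break

print("recovered model:", np.round(m, 3), " true model:", m_true)
```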

  10. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in the investigation and reporting of behavioral phenotypes.

  11. Reproducing Epidemiologic Research and Ensuring Transparency.

    Science.gov (United States)

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  13. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

    Full Text Available The reproduction and replication of research results have become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking and the lack of open, transparent and fair benchmark sets present another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  14. Reproducing the Wechsler Intelligence Scale for Children-Fifth Edition: Factor Model Results

    Science.gov (United States)

    Beaujean, A. Alexander

    2016-01-01

    One of the ways to increase the reproducibility of research is for authors to provide a sufficient description of the data analytic procedures so that others can replicate the results. The publishers of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) do not follow these guidelines when reporting their confirmatory factor…

  15. Photoionization Modeling

    Science.gov (United States)

    Kallman, T.

    2010-01-01

    Warm absorber spectra are characterized by the many lines from partially ionized intermediate-Z elements, and iron, detected with the grating instruments on Chandra and XMM-Newton. If these ions are formed in a gas which is in photoionization equilibrium, they correspond to a broad range of ionization parameters, although there is evidence for certain preferred values. A test for any dynamical model for these outflows is to reproduce these properties, at some level of detail. In this paper we present a statistical analysis of the ionization distribution which can be applied both to the observed spectra and to theoretical models. As an example, we apply it to our dynamical models for warm absorber outflows, based on evaporation from the molecular torus.

  16. Hot-Volumes as Uniform and Reproducible SERS-Detection Enhancers in Weakly-Coupled Metallic Nanohelices

    Science.gov (United States)

    Caridad, José M.; Winters, Sinéad; McCloskey, David; Duesberg, Georg S.; Donegan, John F.; Krstić, Vojislav

    2017-03-01

    Reproducible and enhanced optical detection of molecules in low concentrations demands simultaneously intense and homogeneous electric fields acting as robust signal amplifiers. To generate such sophisticated optical near-fields, different plasmonic nanostructures were investigated in recent years. These, however, exhibit either high enhancement factor (EF) or spatial homogeneity but not both. Small interparticle gaps or sharp nanostructures show enormous EFs but no near-field homogeneity. Meanwhile, approaches using rounded and separated monomers create uniform near-fields with moderate EFs. Here, guided by numerical simulations, we show how arrays of weakly-coupled Ag nanohelices achieve both homogeneous and strong near-field enhancements, reaching even the limit for reproducible detection of individual molecules. The unique near-field distribution of a single nanohelix consists of broad hot-spots, merging with those from neighbouring nanohelices in specific array configurations and generating a wide and uniform detection zone (“hot-volume”). We experimentally assessed these nanostructures via surface-enhanced Raman spectroscopy, obtaining a corresponding EF of ~10⁷ and a relative standard deviation <10%. These values demonstrate arrays of nanohelices as state-of-the-art substrates for reproducible optical detection as well as compelling nanostructures for related fields such as near-field imaging.

  17. Broad-Band Spectroscopy of Hercules X-1 with Suzaku

    Science.gov (United States)

    Asami, Fumi; Enoto, Teruaki; Iwakiri, Wataru; Yamada, Shin'ya; Tamagawa, Toru; Mihara, Tatehiro; Nagase, Fumiaki

    2014-01-01

    Hercules X-1 was observed with Suzaku in the main-on state from 2005 to 2010. The 0.4–100 keV wide-band spectra obtained in four observations showed a broad hump around 4-9 keV in addition to narrow Fe lines at 6.4 and 6.7 keV. The hump was seen in all four observations regardless of the selection of the continuum models. Thus it is considered a stable and intrinsic spectral feature in Her X-1. The broad hump lacked a sharp structure like an absorption edge. Thus it was represented by two different spectral models: an ionized partial covering or an additional broad line at 6.5 keV. The former required a persistently existing ionized absorber, whose origin was unclear. In the latter case, the Gaussian fitting of the 6.5-keV line needs a large width of sigma = 1.0-1.5 keV and a large equivalent width of 400-900 eV. If the broad line originates from Fe fluorescence of accreting matter, its large width may be explained by the Doppler broadening in the accretion flow. However, the large equivalent width may be inconsistent with a simple accretion geometry.

  18. Enriched reproducing kernel particle method for fractional advection-diffusion equation

    Science.gov (United States)

    Ying, Yuping; Lian, Yanping; Tang, Shaoqiang; Liu, Wing Kam

    2018-06-01

    The reproducing kernel particle method (RKPM) has been efficiently applied to problems with large deformations, high gradients and high modal density. In this paper, it is extended to solve a nonlocal problem modeled by a fractional advection-diffusion equation (FADE), which exhibits a boundary layer with low regularity. We formulate this method based on a moving least-squares approach. By enriching the traditional integer-order basis of RKPM with fractional-order power functions, the leading terms of the solution to the FADE can be exactly reproduced, which guarantees a good approximation to the boundary layer. Numerical tests are performed to verify the proposed approach.
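
    As a rough illustration of the enrichment idea, the sketch below builds one-dimensional moving least-squares shape functions from a basis [1, x, x^alpha] and checks that x^alpha is reproduced exactly. The node layout, kernel, support size and alpha value are illustrative assumptions, not the authors' discretization.

    import numpy as np

    alpha = 0.5                        # assumed fractional order of the enrichment
    nodes = np.linspace(0.0, 1.0, 21)  # assumed 1-D node layout
    support = 0.15                     # assumed kernel support radius

    def cubic_spline_weight(r):
        # Cubic spline kernel on the normalized distance r = |x - x_I| / support.
        w = np.zeros_like(r)
        m1 = r <= 0.5
        m2 = (r > 0.5) & (r <= 1.0)
        w[m1] = 2.0/3.0 - 4.0*r[m1]**2 + 4.0*r[m1]**3
        w[m2] = 4.0/3.0 - 4.0*r[m2] + 4.0*r[m2]**2 - (4.0/3.0)*r[m2]**3
        return w

    def enriched_basis(x):
        # Enriched basis [1, x, x**alpha]; the last entry is the fractional term.
        x = np.atleast_1d(np.asarray(x, dtype=float))
        return np.vstack([np.ones_like(x), x, np.power(x, alpha)])

    def shape_functions(x):
        # MLS/RKPM shape functions psi_I(x) = p(x)^T A(x)^{-1} w_I p(x_I).
        w = cubic_spline_weight(np.abs(x - nodes) / support)
        P = enriched_basis(nodes)          # 3 x n_nodes
        A = (P * w) @ P.T                  # moment matrix
        return enriched_basis(x)[:, 0] @ np.linalg.solve(A, P * w)

    # Reproduction check: x**alpha is recovered exactly (up to round-off)
    # because it lies in the span of the enriched basis.
    x_eval = 0.4
    psi = shape_functions(x_eval)
    print(psi @ np.power(nodes, alpha), x_eval**alpha)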

  19. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  20. Artificial neural network model to predict slag viscosity over a broad range of temperatures and slag compositions

    Energy Technology Data Exchange (ETDEWEB)

    Duchesne, Marc A. [Chemical and Biological Engineering Department, University of Ottawa, 161 Louis Pasteur, Ottawa, Ont. (Canada); CanmetENERGY, 1 Haanel Drive, Ottawa, Ontario (Canada); Macchi, Arturo [Chemical and Biological Engineering Department, University of Ottawa, 161 Louis Pasteur, Ottawa, Ont. (Canada); Lu, Dennis Y.; Hughes, Robin W.; McCalden, David; Anthony, Edward J. [CanmetENERGY, 1 Haanel Drive, Ottawa, Ontario (Canada)

    2010-08-15

    Threshold slag viscosity heuristics are often used for the initial assessment of coal gasification projects. Slag viscosity predictions are also required for advanced combustion and gasification models. Due to the unsatisfactory performance of theoretical equations, an artificial neural network model was developed to predict slag viscosity over a broad range of temperatures and slag compositions. This model outperforms other slag viscosity models, resulting in an average error factor of 5.05, which is lower than the best obtained with other available models. Genesee coal ash viscosity predictions were made to investigate the effect of adding Canadian limestone and dolomite. The results indicate that magnesium in the fluxing agent provides a greater viscosity reduction than calcium for the threshold slag tapping temperature range. (author)
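
    A minimal sketch of the kind of network described above is given below, using scikit-learn's MLPRegressor on synthetic composition-temperature data. The input features, network size and training data are assumptions for illustration only and do not reproduce the published model.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 500
    # Assumed inputs: temperature (K) and four oxide mass fractions (e.g. SiO2, Al2O3, CaO, MgO).
    temperature = rng.uniform(1500.0, 1900.0, n)
    composition = rng.dirichlet([4.0, 2.0, 2.0, 1.0], size=n)
    X = np.column_stack([temperature, composition])

    # Synthetic Arrhenius-like log-viscosity target, used only to exercise the model.
    log_viscosity = (-5.0 + 12000.0 / temperature
                     + 3.0 * composition[:, 0] - 2.0 * composition[:, 3]
                     + rng.normal(0.0, 0.05, n))

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16),
                                       max_iter=5000, random_state=0))
    model.fit(X, log_viscosity)
    print("training R^2:", round(model.score(X, log_viscosity), 3))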

  1. Entangled states that cannot reproduce original classical games in their quantum version

    International Nuclear Information System (INIS)

    Shimamura, Junichi; Oezdemir, S.K.; Morikoshi, Fumiaki; Imoto, Nobuyuki

    2004-01-01

    A model of a quantum version of classical games should reproduce the original classical games in order to be able to make a comparative analysis of quantum and classical effects. We analyze a class of symmetric multipartite entangled states and their effect on the reproducibility of the classical games. We present the necessary and sufficient condition for the reproducibility of the original classical games. Satisfying this condition means that complete orthogonal bases can be constructed from a given multipartite entangled state provided that each party is restricted to two local unitary operators. We prove that most of the states belonging to the class of symmetric states with respect to permutations, including the N-qubit W state, do not satisfy this condition

  2. Broad beam ion sources and some surface processes

    International Nuclear Information System (INIS)

    Neumann, H.; Scholze, F.; Tarz, M.; Schindler, A.; Wiese, R.; Nestler, M.; Blum, T.

    2005-01-01

    Modern broad-beam multi-aperture ion sources are widely used in material and surface technology applications. Customizing the generated ion beam properties (i.e. the ion current density profile) for the specific demands of the application is a main challenge in the improvement of ion beam technologies. We first briefly introduce ion sources based on different plasma excitation principles. An overview of source plasma and ion beam measurement methods delivers input data for modelling. Beam profile modelling using numerical trajectory codes, and the validation of the results by Faraday cup measurements as a basis for ion beam profile design, are described. Furthermore, possibilities for ex situ and in situ beam profile control are demonstrated, such as a special method for in situ control of a linear ion source beam profile, a grid modification for circular beam profile design and a cluster principle for broad beam sources. By means of these methods, the beam shape may be adapted to specific technological demands. Examples of broad beam source applications in ion beam figuring of optical surfaces, modification of stainless steel, photovoltaic processes and deposition of EUVL multilayer stacks are finally presented. (Author)

  3. Efficient and reproducible myogenic differentiation from human iPS cells: prospects for modeling Miyoshi Myopathy in vitro.

    Directory of Open Access Journals (Sweden)

    Akihito Tanaka

    Full Text Available The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70-90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs.

  4. Efficient and Reproducible Myogenic Differentiation from Human iPS Cells: Prospects for Modeling Miyoshi Myopathy In Vitro

    Science.gov (United States)

    Tanaka, Akihito; Woltjen, Knut; Miyake, Katsuya; Hotta, Akitsu; Ikeya, Makoto; Yamamoto, Takuya; Nishino, Tokiko; Shoji, Emi; Sehara-Fujisawa, Atsuko; Manabe, Yasuko; Fujii, Nobuharu; Hanaoka, Kazunori; Era, Takumi; Yamashita, Satoshi; Isobe, Ken-ichi; Kimura, En; Sakurai, Hidetoshi

    2013-01-01

    The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70–90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs. PMID:23626698

  5. How well can DFT reproduce key interactions in Ziegler-Natta systems?

    KAUST Repository

    Correa, Andrea; Bahri-Laleh, Naeimeh; Cavallo, Luigi

    2013-01-01

    The performance of density functional theory in reproducing some of the main interactions occurring in MgCl2-supported Ziegler-Natta catalytic systems is assessed. Eight model systems, representatives of key interactions occurring in Ziegler

  6. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but

  7. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join
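
    To make the idea of a machine-readable dataset setup concrete, the sketch below serializes a toy dataset description to XML with Python's standard library. The element and attribute names are hypothetical placeholders and do not follow the actual QSAR-ML schema described above.

    import xml.etree.ElementTree as ET

    # Hypothetical element names, chosen only to illustrate the structure of a
    # dataset descriptor: structures plus versioned descriptor implementations.
    dataset = ET.Element("qsarDataset", name="example-set")

    structures = ET.SubElement(dataset, "structures")
    ET.SubElement(structures, "structure", id="mol1", smiles="CCO")
    ET.SubElement(structures, "structure", id="mol2", smiles="c1ccccc1")

    descriptors = ET.SubElement(dataset, "descriptors")
    # Recording implementation and version is what keeps the setup reproducible.
    ET.SubElement(descriptors, "descriptor",
                  id="XLogP",
                  ontologyRef="http://example.org/descriptor-ontology#XLogP",
                  implementation="CDK", version="1.4.0")

    ET.indent(dataset)  # pretty-printing; requires Python 3.9+
    print(ET.tostring(dataset, encoding="unicode"))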

  8. Reproducibility of central lumbar vertebral BMD

    International Nuclear Information System (INIS)

    Chan, F.; Pocock, N.; Griffiths, M.; Majerovic, Y.; Freund, J.

    1997-01-01

    Full text: Lumbar vertebral bone mineral density (BMD) using dual X-ray absorptiometry (DXA) has generally been calculated from a region of interest which includes the entire vertebral body. Although this region excludes part of the transverse processes, it does include the outer cortical shell of the vertebra. Recent software has been devised to calculate BMD in a central vertebral region of interest which excludes the outer cortical envelope. Theoretically this area may be more sensitive to detecting osteoporosis which affects trabecular bone to a greater extent than cortical bone. Apart from the sensitivity of BMD estimation, the reproducibility of any measurement is important owing to the slow rate of change of bone mass. We have evaluated the reproducibility of this new vertebral region of interest in 23 women who had duplicate lumbar spine DXA scans performed on the same day. The patients were repositioned between each measurement. Central vertebral analysis was performed for L2-L4 and the reproducibility of area, bone mineral content (BMC) and BMD calculated as the coefficient of variation; these values were compared with those from conventional analysis. Thus we have shown that the reproducibility of the central BMD is comparable to that of the conventional analysis, which is essential if this technique is to provide any additional clinical data. The reasons for the decrease in reproducibility of the area, and hence BMC, require further investigation

  9. Three-dimensional surgical modelling with an open-source software protocol: study of precision and reproducibility in mandibular reconstruction with the fibula free flap.

    Science.gov (United States)

    Ganry, L; Quilichini, J; Bandini, C M; Leyder, P; Hersant, B; Meningaud, J P

    2017-08-01

    Very few surgical teams currently use totally independent and free solutions to perform three-dimensional (3D) surgical modelling for osseous free flaps in reconstructive surgery. This study assessed the precision and technical reproducibility of a 3D surgical modelling protocol using free open-source software in mandibular reconstruction with fibula free flaps and surgical guides. Precision was assessed through comparisons of the 3D surgical guide to the sterilized 3D-printed guide, determining accuracy to the millimetre level. Reproducibility was assessed in three surgical cases by volumetric comparison to the millimetre level. For the 3D surgical modelling, a difference of less than 0.1 mm was observed, with almost no deformations; the precision of the free flap modelling was between 0.1 mm and 0.4 mm, and the average precision of the complete reconstructed mandible was less than 1 mm. The open-source software protocol demonstrated high accuracy without complications. However, the precision of the surgical case depends on the surgeon's 3D surgical modelling. Therefore, surgeons need training on the use of this protocol before applying it to surgical cases; this constitutes a limitation. Further studies should address the transfer of expertise. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  10. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm² and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced
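
    For readers unfamiliar with these histogram parameters, the sketch below computes the peak location, peak height and mean ADC from a masked ADC map. The array, mask and bin settings are illustrative assumptions rather than the acquisition settings of the study.

    import numpy as np

    rng = np.random.default_rng(1)
    adc_map = rng.normal(0.8e-3, 0.1e-3, size=(64, 64, 32))   # mm^2/s, synthetic map
    brain_mask = np.ones_like(adc_map, dtype=bool)            # stand-in for a real brain mask

    values = adc_map[brain_mask]
    counts, edges = np.histogram(values, bins=100,
                                 range=(0.0, 2.0e-3), density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])

    peak_index = np.argmax(counts)
    peak_location = centers[peak_index]   # ADC value at the histogram peak
    peak_height = counts[peak_index]      # normalized peak height
    mean_adc = values.mean()

    print(f"peak location: {peak_location:.2e} mm^2/s")
    print(f"peak height:   {peak_height:.3g}")
    print(f"mean ADC:      {mean_adc:.2e} mm^2/s")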

  11. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, 17 individual reflections, and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections using eight subjects for noise and speech. The results have shown that the first-order floor and ceiling reflections are likely to individually contribute to the timbre of reproduced speech. For a noise signal, additional reflections from the left sidewall will contribute individually. The level of the reverberant field has been found...

  12. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description.

    Science.gov (United States)

    Sahoo, Satya S; Valdez, Joshua; Rueschman, Michael

    2016-01-01

    Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled "Rigor and Reproducibility " for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data and it has been long used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing details of research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by World Wide Web Consortium (W3C), to represent both: (a) data provenance, and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project.

  13. Why are models unable to reproduce multi-decadal trends in lower tropospheric baseline ozone levels?

    Science.gov (United States)

    Hu, L.; Liu, J.; Mickley, L. J.; Strahan, S. E.; Steenrod, S.

    2017-12-01

    Assessments of tropospheric ozone radiative forcing rely on accurate model simulations. Parrish et al (2014) found that three chemistry-climate models (CCMs) overestimate present-day O3 mixing ratios and capture only 50% of the observed O3 increase over the last five decades at 12 baseline sites in the northern mid-latitudes, indicating large uncertainties in our understanding of the ozone trends and their implications for radiative forcing. Here we present comparisons of outputs from two chemical transport models (CTMs) - GEOS-Chem and the Global Modeling Initiative model - with O3 observations from the same sites and from the global ozonesonde network. Both CTMs are driven by reanalysis meteorological data (MERRA or MERRA2) and thus are expected to be different in atmospheric transport processes relative to those freely running CCMs. We test whether recent model developments leading to more active ozone chemistry affect the computed ozone sensitivity to perturbations in emissions. Preliminary results suggest these CTMs can reproduce present-day ozone levels but fail to capture the multi-decadal trend since 1980. Both models yield widespread overpredictions of free tropospheric ozone in the 1980s. Sensitivity studies in GEOS-Chem suggest that the model estimate of natural background ozone is too high. We discuss factors that contribute to the variability and trends of tropospheric ozone over the last 30 years, with a focus on intermodel differences in spatial resolution and in the representation of stratospheric chemistry, stratosphere-troposphere exchange, halogen chemistry, and biogenic VOC emissions and chemistry. We also discuss uncertainty in the historical emission inventories used in models, and how these affect the simulated ozone trends.

  14. Inter- and intra-laboratory study to determine the reproducibility of toxicogenomics datasets.

    Science.gov (United States)

    Scott, D J; Devonshire, A S; Adeleye, Y A; Schutte, M E; Rodrigues, M R; Wilkes, T M; Sacco, M G; Gribaldo, L; Fabbri, M; Coecke, S; Whelan, M; Skinner, N; Bennett, A; White, A; Foy, C A

    2011-11-28

    The application of toxicogenomics as a predictive tool for chemical risk assessment has been under evaluation by the toxicology community for more than a decade. However, it predominately remains a tool for investigative research rather than for regulatory risk assessment. In this study, we assessed whether the current generation of microarray technology in combination with an in vitro experimental design was capable of generating robust, reproducible data of sufficient quality to show promise as a tool for regulatory risk assessment. To this end, we designed a prospective collaborative study to determine the level of inter- and intra-laboratory reproducibility between three independent laboratories. All test centres (TCs) adopted the same protocols for all aspects of the toxicogenomic experiment including cell culture, chemical exposure, RNA extraction, microarray data generation and analysis. As a case study, the genotoxic carcinogen benzo[a]pyrene (B[a]P) and the human hepatoma cell line HepG2 were used to generate three comparable toxicogenomic data sets. High levels of technical reproducibility were demonstrated using a widely employed gene expression microarray platform. While differences at the global transcriptome level were observed between the TCs, a common subset of B[a]P responsive genes (n=400 gene probes) was identified at all TCs which included many genes previously reported in the literature as B[a]P responsive. These data show promise that the current generation of microarray technology, in combination with a standard in vitro experimental design, can produce robust data that can be generated reproducibly in independent laboratories. Future work will need to determine whether such reproducible in vitro model(s) can be predictive for a range of toxic chemicals with different mechanisms of action and thus be considered as part of future testing regimes for regulatory risk assessment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  15. Isolated heart models: cardiovascular system studies and technological advances.

    Science.gov (United States)

    Olejnickova, Veronika; Novakova, Marie; Provaznik, Ivo

    2015-07-01

    The isolated heart model is a relevant tool for cardiovascular system studies. It represents a highly reproducible model for studying a broad spectrum of biochemical, physiological, morphological, and pharmaceutical parameters, including analysis of intrinsic heart mechanics, metabolism, and coronary vascular response. Results obtained in this model are free of the influence of other organ systems, of plasma concentrations of hormones or ions, and of the autonomic nervous system. The review describes various isolated heart models, the modes of heart perfusion, and the advantages and limitations of various experimental setups. It reports the improvements to the Langendorff perfusion setup introduced by the authors.

  16. Endovascular Broad-Neck Aneurysm Creation in a Porcine Model Using a Vascular Plug

    International Nuclear Information System (INIS)

    Mühlenbruch, Georg; Nikoubashman, Omid; Steffen, Björn; Dadak, Mete; Palmowski, Moritz; Wiesmann, Martin

    2013-01-01

    Ruptured cerebral arterial aneurysms require prompt treatment by either surgical clipping or endovascular coiling. Training for these sophisticated endovascular procedures is essential and ideally performed in animals before their use in humans. Simulators and established animal models have shown drawbacks with respect to degree of reality, size of the animal model and aneurysm, or time and effort needed for aneurysm creation. We therefore aimed to establish a realistic and readily available aneurysm model. Five anticoagulated domestic pigs underwent endovascular intervention through right femoral access. A total of 12 broad-neck aneurysms were created in the carotid, subclavian, and renal arteries using the Amplatzer vascular plug. With dedicated vessel selection, cubic, tubular, and side-branch aneurysms could be created. Three of the 12 implanted occluders, two of them implanted over a side branch of the main vessel, did not induce complete vessel occlusion. However, all aneurysms remained free of intraluminal thrombus formation and were available for embolization training during a surveillance period of 6 h. Two aneurysms underwent successful exemplary treatment: one was stent-assisted, and one was performed with conventional endovascular coil embolization. The new porcine aneurysm model proved to be a straightforward approach that offers a wide range of training and scientific applications that might help further improve endovascular coil embolization therapy in patients with cerebral aneurysms.

  17. Endovascular Broad-Neck Aneurysm Creation in a Porcine Model Using a Vascular Plug

    Energy Technology Data Exchange (ETDEWEB)

    Muehlenbruch, Georg, E-mail: gmuehlenbruch@ukaachen.de; Nikoubashman, Omid; Steffen, Bjoern; Dadak, Mete [RWTH Aachen University, Department of Diagnostic and Interventional Neuroradiology, University Hospital (Germany); Palmowski, Moritz [RWTH Aachen University, Department of Nuclear Medicine, University Hospital (Germany); Wiesmann, Martin [RWTH Aachen University, Department of Diagnostic and Interventional Neuroradiology, University Hospital (Germany)

    2013-02-15

    Ruptured cerebral arterial aneurysms require prompt treatment by either surgical clipping or endovascular coiling. Training for these sophisticated endovascular procedures is essential and ideally performed in animals before their use in humans. Simulators and established animal models have shown drawbacks with respect to degree of reality, size of the animal model and aneurysm, or time and effort needed for aneurysm creation. We therefore aimed to establish a realistic and readily available aneurysm model. Five anticoagulated domestic pigs underwent endovascular intervention through right femoral access. A total of 12 broad-neck aneurysms were created in the carotid, subclavian, and renal arteries using the Amplatzer vascular plug. With dedicated vessel selection, cubic, tubular, and side-branch aneurysms could be created. Three of the 12 implanted occluders, two of them implanted over a side branch of the main vessel, did not induce complete vessel occlusion. However, all aneurysms remained free of intraluminal thrombus formation and were available for embolization training during a surveillance period of 6 h. Two aneurysms underwent successful exemplary treatment: one was stent-assisted, and one was performed with conventional endovascular coil embolization. The new porcine aneurysm model proved to be a straightforward approach that offers a wide range of training and scientific applications that might help further improve endovascular coil embolization therapy in patients with cerebral aneurysms.

  18. On the solutions of electrohydrodynamic flow with fractional differential equations by reproducing kernel method

    Directory of Open Access Journals (Sweden)

    Akgül Ali

    2016-01-01

    Full Text Available In this manuscript we investigate electrohydrodynamic flow. For several values of the intimate parameters we proved that the approximate solution depends on a reproducing kernel model. The obtained results show that the reproducing kernel method (RKM) is very effective. We obtain good results without any transformation or discretization. Numerical experiments on test examples show that our proposed schemes are of high accuracy and strongly support the theoretical results.

  19. Ratio-scaling of listener preference of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian

    2005-01-01

    -trivial assumption in the case of complex spatial sounds. In the present study the Bradley-Terry-Luce (BTL) model was employed to investigate the unidimensionality of preference judgments made by 40 listeners on multichannel reproduced sound. Short musical excerpts played back in eight reproduction modes (mono...... music). As a main result, the BTL model was found to predict the choice frequencies well. This implies that listeners were able to integrate the complex nature of the sounds into a unidimensional preference judgment. It further implies the existence of a preference scale on which the reproduction modes...
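
    As background for how BTL worth parameters are obtained from paired-comparison data, the sketch below fits the model to a small choice-count matrix with the classical iterative (MM) algorithm. The 4 x 4 counts are an illustrative stand-in for the study's eight reproduction modes and 40 listeners, not the actual data.

    import numpy as np

    # wins[i, j] = number of times item i was preferred over item j (assumed counts)
    wins = np.array([[0, 12, 15, 18],
                     [ 8,  0, 11, 14],
                     [ 5,  9,  0, 13],
                     [ 2,  6,  7,  0]], dtype=float)

    n_items = wins.shape[0]
    n_pairs = wins + wins.T                 # comparisons per pair
    total_wins = wins.sum(axis=1)
    p = np.ones(n_items) / n_items          # initial worth parameters

    for _ in range(200):
        denom = np.zeros(n_items)
        for i in range(n_items):
            for j in range(n_items):
                if i != j:
                    denom[i] += n_pairs[i, j] / (p[i] + p[j])
        p = total_wins / denom
        p /= p.sum()                        # fix the scale (BTL is scale-invariant)

    print("estimated BTL worths:", np.round(p, 3))
    # The fitted model predicts choice probabilities P(i preferred over j) = p_i / (p_i + p_j).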

  20. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  1. Prognostic Value and Reproducibility of Pretreatment CT Texture Features in Stage III Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Fried, David V. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States); Tucker, Susan L. [Department of Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhou, Shouhao [Division of Quantitative Sciences, Department of Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liao, Zhongxing [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Mawlawi, Osama [Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States); Ibbott, Geoffrey [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States); Court, Laurence E., E-mail: LECourt@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States)

    2014-11-15

    Purpose: To determine whether pretreatment CT texture features can improve patient risk stratification beyond conventional prognostic factors (CPFs) in stage III non-small cell lung cancer (NSCLC). Methods and Materials: We retrospectively reviewed 91 cases with stage III NSCLC treated with definitive chemoradiation therapy. All patients underwent pretreatment diagnostic contrast enhanced computed tomography (CE-CT) followed by 4-dimensional CT (4D-CT) for treatment simulation. We used the average-CT and expiratory (T50-CT) images from the 4D-CT along with the CE-CT for texture extraction. Histogram, gradient, co-occurrence, gray tone difference, and filtration-based techniques were used for texture feature extraction. Penalized Cox regression implementing cross-validation was used for covariate selection and modeling. Models incorporating texture features from the 33 image types and CPFs were compared to those with models incorporating CPFs alone for overall survival (OS), local-regional control (LRC), and freedom from distant metastases (FFDM). Predictive Kaplan-Meier curves were generated using leave-one-out cross-validation. Patients were stratified based on whether their predicted outcome was above or below the median. Reproducibility of texture features was evaluated using test-retest scans from independent patients and quantified using concordance correlation coefficients (CCC). We compared models incorporating the reproducibility seen on test-retest scans to our original models and determined the classification reproducibility. Results: Models incorporating both texture features and CPFs demonstrated a significant improvement in risk stratification compared to models using CPFs alone for OS (P=.046), LRC (P=.01), and FFDM (P=.005). The average CCCs were 0.89, 0.91, and 0.67 for texture features extracted from the average-CT, T50-CT, and CE-CT, respectively. Incorporating reproducibility within our models yielded 80.4% (±3.7% SD), 78.3% (±4.0% SD), and 78

  2. Prognostic Value and Reproducibility of Pretreatment CT Texture Features in Stage III Non-Small Cell Lung Cancer

    International Nuclear Information System (INIS)

    Fried, David V.; Tucker, Susan L.; Zhou, Shouhao; Liao, Zhongxing; Mawlawi, Osama; Ibbott, Geoffrey; Court, Laurence E.

    2014-01-01

    Purpose: To determine whether pretreatment CT texture features can improve patient risk stratification beyond conventional prognostic factors (CPFs) in stage III non-small cell lung cancer (NSCLC). Methods and Materials: We retrospectively reviewed 91 cases with stage III NSCLC treated with definitive chemoradiation therapy. All patients underwent pretreatment diagnostic contrast enhanced computed tomography (CE-CT) followed by 4-dimensional CT (4D-CT) for treatment simulation. We used the average-CT and expiratory (T50-CT) images from the 4D-CT along with the CE-CT for texture extraction. Histogram, gradient, co-occurrence, gray tone difference, and filtration-based techniques were used for texture feature extraction. Penalized Cox regression implementing cross-validation was used for covariate selection and modeling. Models incorporating texture features from the 33 image types and CPFs were compared to those with models incorporating CPFs alone for overall survival (OS), local-regional control (LRC), and freedom from distant metastases (FFDM). Predictive Kaplan-Meier curves were generated using leave-one-out cross-validation. Patients were stratified based on whether their predicted outcome was above or below the median. Reproducibility of texture features was evaluated using test-retest scans from independent patients and quantified using concordance correlation coefficients (CCC). We compared models incorporating the reproducibility seen on test-retest scans to our original models and determined the classification reproducibility. Results: Models incorporating both texture features and CPFs demonstrated a significant improvement in risk stratification compared to models using CPFs alone for OS (P=.046), LRC (P=.01), and FFDM (P=.005). The average CCCs were 0.89, 0.91, and 0.67 for texture features extracted from the average-CT, T50-CT, and CE-CT, respectively. Incorporating reproducibility within our models yielded 80.4% (±3.7% SD), 78.3% (±4.0% SD), and 78
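
    Since the abstract relies on the concordance correlation coefficient (CCC) to quantify test-retest reproducibility of texture features, a minimal sketch of Lin's CCC on synthetic paired feature values is given below; the data are illustrative only.

    import numpy as np

    def concordance_ccc(x, y):
        # Lin's concordance correlation coefficient between paired measurements.
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        mx, my = x.mean(), y.mean()
        vx, vy = x.var(), y.var()                     # population variances
        covariance = np.mean((x - mx) * (y - my))
        return 2.0 * covariance / (vx + vy + (mx - my) ** 2)

    rng = np.random.default_rng(42)
    feature_test = rng.normal(1.0, 0.2, 30)
    feature_retest = feature_test + rng.normal(0.0, 0.05, 30)   # small retest noise
    print("CCC:", round(concordance_ccc(feature_test, feature_retest), 3))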

  3. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  4. Does systematic variation improve the reproducibility of animal experiments?

    NARCIS (Netherlands)

    Jonker, R.M.; Guenther, A.; Engqvist, L.; Schmoll, T.

    2013-01-01

    Reproducibility of results is a fundamental tenet of science. In this journal, Richter et al. [1] tested whether systematic variation in experimental conditions (heterogenization) affects the reproducibility of results. Comparing this approach with the current standard of ensuring reproducibility

  5. The 2010 Broad Prize

    Science.gov (United States)

    Education Digest: Essential Readings Condensed for Quick Review, 2011

    2011-01-01

    A new data analysis, based on data collected as part of The Broad Prize process, provides insights into which large urban school districts in the United States are doing the best job of educating traditionally disadvantaged groups: African-American, Hispanics, and low-income students. Since 2002, The Eli and Edythe Broad Foundation has awarded The…

  6. TU-AB-BRC-05: Creation of a Monte Carlo TrueBeam Model by Reproducing Varian Phase Space Data

    International Nuclear Information System (INIS)

    O’Grady, K; Davis, S; Seuntjens, J

    2016-01-01

    Purpose: To create a Varian TrueBeam 6 MV FFF Monte Carlo model using BEAMnrc/EGSnrc that accurately reproduces the Varian representative dataset, followed by tuning the model’s source parameters to accurately reproduce in-house measurements. Methods: A BEAMnrc TrueBeam model for 6 MV FFF has been created by modifying a validated 6 MV Varian CL21EX model. Geometric dimensions and materials were adjusted in a trial and error approach to match the fluence and spectra of TrueBeam phase spaces output by the Varian VirtuaLinac. Once the model’s phase space matched Varian’s counterpart using the default source parameters, it was validated to match 10 × 10 cm² Varian representative data obtained with the IBA CC13. The source parameters were then tuned to match in-house 5 × 5 cm² PTW microDiamond measurements. All dose to water simulations included detector models to include the effects of volume averaging and the non-water equivalence of the chamber materials, allowing for more accurate source parameter selection. Results: The Varian phase space spectra and fluence were matched with excellent agreement. The in-house model’s PDD agreement with CC13 TrueBeam representative data was within 0.9% local percent difference beyond the first 3 mm. Profile agreement at 10 cm depth was within 0.9% local percent difference and 1.3 mm distance-to-agreement in the central axis and penumbra regions, respectively. Once the source parameters were tuned, PDD agreement with microDiamond measurements was within 0.9% local percent difference beyond 2 mm. The microDiamond profile agreement at 10 cm depth was within 0.6% local percent difference and 0.4 mm distance-to-agreement in the central axis and penumbra regions, respectively. Conclusion: An accurate in-house Monte Carlo model of the Varian TrueBeam was achieved independently of the Varian phase space solution and was tuned to in-house measurements. KO acknowledges partial support by the CREATE Medical Physics Research

  7. A Reliable and Reproducible Model for Assessing the Effect of Different Concentrations of α-Solanine on Rat Bone Marrow Mesenchymal Stem Cells

    Directory of Open Access Journals (Sweden)

    Adriana Ordóñez-Vásquez

    2017-01-01

    Full Text Available Alpha-solanine (α-solanine) is a glycoalkaloid present in potato (Solanum tuberosum). It has been of particular interest because of its toxicity and potential teratogenic effects that include abnormalities of the central nervous system, such as exencephaly, encephalocele, and anophthalmia. Various types of cell culture have been used as experimental models to determine the effect of α-solanine on cell physiology. The morphological changes in the mesenchymal stem cell upon exposure to α-solanine have not been established. This study aimed to describe a reliable and reproducible model for assessing the structural changes induced by exposure of mouse bone marrow mesenchymal stem cells (MSCs) to different concentrations of α-solanine for 24 h. The results demonstrate that nonlethal concentrations of α-solanine (2–6 μM) changed the morphology of the cells, including an increase in the number of nucleoli, suggesting elevated protein synthesis, and the formation of spicules. In addition, treatment with α-solanine reduced the number of adherent cells and the formation of colonies in culture. Immunophenotypic characterization and staining of MSCs are proposed as a reproducible method that allows description of cells exposed to the glycoalkaloid, α-solanine.

  8. Reproducibility of image quality for moving objects using respiratory-gated computed tomography. A study using a phantom model

    International Nuclear Information System (INIS)

    Fukumitsu, Nobuyoshi; Ishida, Masaya; Terunuma, Toshiyuki

    2012-01-01

    Investigating the reproducibility of computed tomography (CT) image quality in respiratory-gated radiation treatment planning is essential in the radiotherapy of movable tumors. Seven series of regular and six series of irregular respiratory motions were performed using a thorax dynamic phantom. For the regular respiratory motions, the respiratory cycle was changed from 2.5 to 4 s and the amplitude was changed from 4 to 10 mm. For the irregular respiratory motions, a cycle of 2.5 to 4 s or an amplitude of 4 to 10 mm was added to the base data (i.e., 3.5-s cycle, 6-mm amplitude) every three cycles. Images of the object were acquired six times using respiratory-gated data acquisition. The volume of the object was calculated, and the reproducibility of the volume was assessed from its variability. The registered images of the object were added, and the reproducibility of the shape was assessed from the degree of overlap of the objects. The variability in the volumes and shapes differed significantly as the respiratory cycle changed for the regular respiratory motions. For irregular respiratory motion, shape reproducibility was further inferior, and the percentage of overlap among the six images was 35.26% in the 2.5- and 3.5-s cycle mixed group. Amplitude changes did not produce significant differences in the variability of the volumes and shapes. Respiratory cycle changes reduced the reproducibility of image quality in respiratory-gated CT. (author)
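
    To illustrate how such a shape-overlap measure can be computed, the sketch below overlays repeated binary object masks and reports their common volume. The masks are synthetic, and the metric (voxels common to all acquisitions divided by voxels present in any acquisition) is one plausible reading of the "percentage of overlap" above, not necessarily the authors' exact definition.

    import numpy as np

    rng = np.random.default_rng(3)

    def synthetic_sphere_mask(shape, center, radius):
        # Binary mask of a sphere, standing in for the segmented phantom object.
        grid = np.indices(shape)
        dist2 = sum((g - c) ** 2 for g, c in zip(grid, center))
        return dist2 <= radius ** 2

    shape = (64, 64, 64)
    # Six acquisitions with slightly jittered object positions (synthetic residual motion).
    masks = [synthetic_sphere_mask(shape, (32 + rng.integers(-2, 3),
                                           32 + rng.integers(-2, 3), 32), 10)
             for _ in range(6)]

    volumes_ml = [m.sum() * 1.0e-3 for m in masks]      # assuming 1 mm^3 voxels
    intersection = np.logical_and.reduce(masks)
    union = np.logical_or.reduce(masks)
    overlap_percent = 100.0 * intersection.sum() / union.sum()

    print("object volumes (ml):", np.round(volumes_ml, 2))
    print(f"overlap among the six masks: {overlap_percent:.1f}%")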

  9. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  10. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  11. Batch-batch stable microbial community in the traditional fermentation process of huyumei broad bean pastes.

    Science.gov (United States)

    Zhu, Linjiang; Fan, Zihao; Kuai, Hui; Li, Qi

    2017-09-01

    During natural fermentation processes, a characteristic microbial community structure (MCS) is naturally formed, and it is interesting to know about its batch-batch stability. This issue was explored in a traditional semi-solid-state fermentation process of huyumei, a Chinese broad bean paste product. The results showed that this MCS mainly contained four aerobic Bacillus species (8 log CFU per g), including B. subtilis, B. amyloliquefaciens, B. methylotrophicus, and B. tequilensis, and the facultative anaerobe B. cereus with a low concentration (4 log CFU per g), besides a very small amount of the yeast Zygosaccharomyces rouxii (2 log CFU per g). The dynamic change of the MCS in the brine fermentation process showed that the abundance of dominant species varied within a small range, and at the beginning of the process the growth of lactic acid bacteria was inhibited and Staphylococcus spp. lost their viability. Also, the MCS and its dynamic change were proved to be highly reproducible among seven batches of fermentation. Therefore, the MCS naturally and stably forms between different batches of the traditional semi-solid-state fermentation of huyumei. Revealing microbial community structure and its batch-batch stability is helpful for understanding the mechanisms of community formation and flavour production in a traditional fermentation. This issue in a traditional semi-solid-state fermentation of huyumei broad bean paste was explored here for the first time. This fermentation process was revealed to be dominated by a high concentration of four aerobic species of Bacillus, a low concentration of B. cereus and a small amount of Zygosaccharomyces rouxii. Lactic acid bacteria and Staphylococcus spp. lost their viability at the beginning of fermentation. This community structure proved to be highly reproducible among seven batches. © 2017 The Society for Applied Microbiology.

  12. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  13. A broad scope knowledge based model for optimization of VMAT in esophageal cancer: validation and assessment of plan quality among different treatment centers

    International Nuclear Information System (INIS)

    Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca

    2015-01-01

    To evaluate the performance of a broad scope model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer. A set of 70 patients previously treated in two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients from the same institution and from another clinic that did not provide patients for the training phase. Comparison of the automated plans was done against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3 %) the reference plans failed to respect the constraints while the model-based plans succeeded. Only in 3 cases (<0.5 %) did the reference plans pass the criteria while the model-based plans failed. In 5.3 % of the cases both groups of plans failed, and in the remaining cases both passed the tests. Plans were optimised using a broad scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, resulted in superior quality. The data suggest that the new engine is reliable and could encourage its application to clinical practice. The online version of this article (doi:10.1186/s13014-015-0530-5) contains supplementary material, which is available to authorized users

  14. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle's Capacity to Generate Force.

    Science.gov (United States)

    Call, Jarrod A; Lowe, Dawn A

    2016-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration, an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allow researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury.

  15. Reproducibility of precipitation distributions over extratropical continental regions in the CMIP5

    Science.gov (United States)

    Hirota, Nagio; Takayabu, Yukari

    2013-04-01

    The reproducibility of precipitation distributions over extratropical continental regions by CMIP5 climate models in their historical runs is evaluated, in comparison with GPCP (V2.2), CMAP (V0911), and the daily gridded gauge data APHRODITE. Surface temperature, cloud radiative forcing, and atmospheric circulations are also compared with observations from CRU-UEA, CERES, and ERA-Interim/ERA40/JRA reanalysis data. It is shown that many CMIP5 models underestimate and overestimate summer precipitation over West and East Eurasia, respectively. These precipitation biases correspond to moisture transport associated with a cyclonic circulation bias over the whole Eurasian continent. Meanwhile, many models underestimate cloud over the Eurasian continent, and the associated shortwave cloud radiative forcing results in a significant warm bias. Evaporation feedback amplifies the warm bias over West Eurasia. These processes consistently explain the precipitation biases over the Eurasian continent in summer. We also examined the reproducibility of winter precipitation, but robust results have not been obtained yet due to the large uncertainty in the observations associated with the adjustment of snow measurements in windy conditions. Better observational datasets are necessary for further model validation. Acknowledgment: This study is supported by the PMM RA of JAXA, the Green Network of Excellence (GRENE) Program of the Ministry of Education, Culture, Sports, Science and Technology, Japan, and the Environment Research and Technology Development Fund (A-1201) of the Ministry of the Environment, Japan.

  16. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude readings for both artificial and natural reflectors was studied for several combinations of instrument/search unit, all of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (confidence interval of 95%). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study. [fr]

  17. 78 FR 20119 - Broad Stakeholder Survey

    Science.gov (United States)

    2013-04-03

    ... DEPARTMENT OF HOMELAND SECURITY [Docket No. DHS-2012-0042] Broad Stakeholder Survey AGENCY... concerning the Broad Stakeholder Survey. DHS previously published this ICR in the Federal Register on August... across the Nation. The Broad Stakeholder Survey is designed to gather stakeholder feedback on the...

  18. Broad band exciplex dye lasers

    International Nuclear Information System (INIS)

    Dienes, A.; Shank, C.V.; Trozzolo, A.M.

    1975-01-01

    The disclosure is concerned with exciplex dye lasers, i.e., lasers in which the emitting species is a complex formed only from a constituent in an electronically excited state. Noting that an exciplex laser favorable from the standpoint of broad tunability results from a broad shift in the peak emission wavelength of the exciplex relative to the unreacted species, a desirable class of media producing such a broad shift is described. Preferred classes of laser media utilizing specified resonant molecules are set forth. (auth)

  19. EVOLUTION AND HYDRODYNAMICS OF THE VERY BROAD X-RAY LINE EMISSION IN SN 1987A

    Energy Technology Data Exchange (ETDEWEB)

    Dewey, D.; Canizares, C. R. [MIT Kavli Institute, Cambridge, MA 02139 (United States); Dwarkadas, V. V. [Department of Astronomy and Astrophysics, University of Chicago, Chicago, IL 60637 (United States); Haberl, F.; Sturm, R., E-mail: dd@space.mit.edu, E-mail: vikram@oddjob.uchicago.edu [Max-Planck-Institut fuer extraterrestrische Physik, Giessenbachstrasse, Garching D-85748 (Germany)

    2012-06-20

    Observations of SN 1987A by the Chandra High Energy Transmission Grating (HETG) in 1999 and the XMM-Newton Reflection Grating Spectrometer (RGS) in 2003 show very broad (v-b) lines with a full width at half-maximum (FWHM) of order 10^4 km s^-1; at these times the blast wave (BW) was primarily interacting with the H II region around the progenitor. Since then, the X-ray emission has been increasingly dominated by narrower components as the BW encounters dense equatorial ring (ER) material. Even so, continuing v-b emission is seen in the grating spectra, suggesting that the interaction with H II region material is ongoing. Based on the deep HETG 2007 and 2011 data sets, and confirmed by RGS and other HETG observations, the v-b component has a width of 9300 ± 2000 km s^-1 FWHM and contributes of order 20% of the current 0.5-2 keV flux. Guided by this result, SN 1987A's X-ray spectra are modeled as the weighted sum of the non-equilibrium-ionization emission from two simple one-dimensional hydrodynamic simulations; this '2 × 1D' model reproduces the observed radii, light curves, and spectra with a minimum of free parameters. The interaction with the H II region (ρ_init ≈ 130 amu cm^-3, ±15° opening angle) produces the very broad emission lines and most of the 3-10 keV flux. Our ER hydrodynamics, admittedly a crude approximation to the multi-D reality, gives ER densities of ~10^4 amu cm^-3, requires dense clumps (×5.5 density enhancement in ~30% of the volume), and predicts that the 0.5-2 keV flux will drop at a rate of ~17% per year once no new dense ER material is being shocked.

  20. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Vol. 539, No. 7628 (2016), p. 168. ISSN 0028-0836. Institutional support: RVO:60077344. Keywords: reproducibility * specimen * biodiversity. Subject RIV: EH - Ecology, Behaviour. Impact factor: 40.137, year: 2016. http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  1. Rogeaulito: A World Energy Scenario Modeling Tool for Transparent Energy System Thinking

    International Nuclear Information System (INIS)

    Benichou, Léo; Mayr, Sebastian

    2014-01-01

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It’s a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty it computes energy supply and demand independently from each other revealing potentially missing energy supply by 2100. It is further simple to use, didactic, and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.

  2. Rogeaulito: A World Energy Scenario Modeling Tool for Transparent Energy System Thinking

    Energy Technology Data Exchange (ETDEWEB)

    Benichou, Léo, E-mail: leo.benichou@theshiftproject.org [The Shift Project, Paris (France); Mayr, Sebastian, E-mail: communication@theshiftproject.org [Paris School of International Affairs, Sciences Po., Paris (France)

    2014-01-13

    Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It’s a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty it computes energy supply and demand independently from each other revealing potentially missing energy supply by 2100. It is further simple to use, didactic, and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.

  3. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  4. Undefined cellulase formulations hinder scientific reproducibility.

    Science.gov (United States)

    Himmel, Michael E; Abbas, Charles A; Baker, John O; Bayer, Edward A; Bomble, Yannick J; Brunecky, Roman; Chen, Xiaowen; Felby, Claus; Jeoh, Tina; Kumar, Rajeev; McCleary, Barry V; Pletschke, Brett I; Tucker, Melvin P; Wyman, Charles E; Decker, Stephen R

    2017-01-01

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, being a critical tenet of science publishing, experimental reproducibility is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  5. [Parameter optimization of BEPS model based on the flux data of the temperate deciduous broad-leaved forest in Northeast China].

    Science.gov (United States)

    Lu, Wei; Fan, Wen Yi; Tian, Tian

    2016-05-01

    Keeping other parameters as empirical constants, different numerical combinations of the main photosynthetic parameters Vcmax and Jmax were used to estimate daily GPP by an iterative method in this paper. To optimize Vcmax and Jmax in the BEPSHourly model at hourly time steps, daily GPP simulated with the different parameter combinations was compared with the flux tower data obtained from the temperate deciduous broad-leaved forest of the Maoershan Forest Farm in Northeast China. Comparing the simulated daily GPP with the observed flux data for 2011, the results showed that the optimal Vcmax and Jmax for the deciduous broad-leaved forest in Northeast China were 41.1 μmol·m^-2·s^-1 and 82.8 μmol·m^-2·s^-1, respectively, with a minimal RMSE of 1.10 g C·m^-2·d^-1 and a maximum R^2 of 0.95. After Vcmax and Jmax optimization, the BEPSHourly model simulated the seasonal variation of GPP better.
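
    As a rough illustration of this kind of calibration (not the authors' BEPSHourly code), the following Python sketch runs a brute-force search over (Vcmax, Jmax) pairs and keeps the pair with the lowest RMSE against flux-tower GPP; simulate_daily_gpp is a made-up placeholder standing in for a real model run, and all driver values are synthetic.

```python
import numpy as np

def rmse(sim, obs):
    """Root-mean-square error between simulated and observed daily GPP."""
    return np.sqrt(np.mean((sim - obs) ** 2))

def simulate_daily_gpp(vcmax, jmax, par):
    """Hypothetical placeholder for a BEPSHourly run: crude co-limitation
    between a carboxylation-limited and a light-limited assimilation rate."""
    a_c = 0.03 * vcmax                        # carboxylation-limited term
    a_j = 0.03 * jmax * par / (par + 400.0)   # light/electron-transport-limited term
    return np.minimum(a_c, a_j)

def calibrate(obs_gpp, par, vcmax_grid, jmax_grid):
    """Brute-force search over (Vcmax, Jmax), keeping the lowest-RMSE pair."""
    best = (np.nan, np.nan, np.inf)
    for vc in vcmax_grid:
        for jm in jmax_grid:
            err = rmse(simulate_daily_gpp(vc, jm, par), obs_gpp)
            if err < best[2]:
                best = (vc, jm, err)
    return best

rng = np.random.default_rng(0)
par = rng.uniform(100.0, 900.0, 365)                       # stand-in daily radiation driver
obs = simulate_daily_gpp(41.1, 82.8, par) + rng.normal(0.0, 0.3, 365)
vc, jm, err = calibrate(obs, par, np.arange(20, 80, 1.0), np.arange(40, 160, 1.0))
print(f"Vcmax = {vc:.1f}, Jmax = {jm:.1f}, RMSE = {err:.2f} g C m^-2 d^-1")
```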

  6. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    Full Text Available This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed in 324 students, whereas reproducibility was studied in a different random sample of 162 who were exposed to the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a sample of 18 girls and 18 boys randomly selected, who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurement results of the peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean TEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 the values were 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake.
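
    A minimal sketch of the Bland-Altman part of such a test-retest analysis is given below, assuming paired DEE estimates from two administrations of the questionnaire; the sample size and kcal/day values are invented for illustration and are not the study's data.

```python
import numpy as np

def bland_altman_limits(test, retest):
    """Bias (mean difference) and 95% limits of agreement for paired
    test-retest measurements, as drawn on a Bland-Altman plot."""
    diff = np.asarray(test, float) - np.asarray(retest, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

rng = np.random.default_rng(1)
dee_t0 = rng.normal(2000.0, 300.0, 162)        # hypothetical DEE (kcal/day), 1st administration
dee_t1 = dee_t0 + rng.normal(0.0, 80.0, 162)   # 2nd administration, one week later
bias, lower, upper = bland_altman_limits(dee_t0, dee_t1)
print(f"bias = {bias:.1f} kcal/day, limits of agreement = [{lower:.1f}, {upper:.1f}]")
```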

  7. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    Science.gov (United States)

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies in consistently quantifying the performance of three commercial intracranial stents, and contribute to reinforce the confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and

  8. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use these programs together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  9. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature-three each from the domains of perception/action, memory, and language, respectively-and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as testing situation and prior recent experience with the experiment to yield highly robust effects.

  10. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  11. Multi-epitope Models Explain How Pre-existing Antibodies Affect the Generation of Broadly Protective Responses to Influenza.

    Directory of Open Access Journals (Sweden)

    Veronika I Zarnitsyna

    2016-06-01

    Full Text Available The development of next-generation influenza vaccines that elicit strain-transcendent immunity against both seasonal and pandemic viruses is a key public health goal. Targeting the evolutionarily conserved epitopes on the stem of influenza's major surface molecule, hemagglutinin, is an appealing prospect, and novel vaccine formulations show promising results in animal model systems. However, studies in humans indicate that natural infection and vaccination result in limited boosting of antibodies to the stem of HA, and the level of stem-specific antibody elicited is insufficient to provide broad strain-transcendent immunity. Here, we use mathematical models of the humoral immune response to explore how pre-existing immunity affects the ability of vaccines to boost antibodies to the head and stem of HA in humans, and, in particular, how it leads to the apparent lack of boosting of broadly cross-reactive antibodies to the stem epitopes. We consider hypotheses where binding of antibody to an epitope: (i) results in more rapid clearance of the antigen; (ii) leads to the formation of antigen-antibody complexes which inhibit B cell activation through an Fcγ receptor-mediated mechanism; and (iii) masks the epitope and prevents the stimulation and proliferation of specific B cells. We find that only epitope masking, but not the former two mechanisms, is key in recapitulating patterns in the data. We discuss the ramifications of our findings for the development of vaccines against both seasonal and pandemic influenza.
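
    The epitope-masking idea can be illustrated with a toy ordinary-differential-equation sketch (an assumption-laden illustration, not the authors' model): pre-existing antibody against one epitope reduces the free antigen available to stimulate the matching B cells, so that epitope is boosted less on re-exposure. All parameter values below are arbitrary.

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, kd=1.0, prolif=0.5, decay=0.1, secrete=0.5, clear=0.2):
    """Toy dynamics: two B-cell populations (head, stem), their antibodies,
    and a decaying antigen. Antibody masks its own epitope, so only the
    unmasked fraction of antigen stimulates the matching B cells."""
    b_head, b_stem, ab_head, ab_stem, antigen = y
    free_head = antigen * kd / (kd + ab_head)   # unmasked head epitope
    free_stem = antigen * kd / (kd + ab_stem)   # unmasked stem epitope
    return [
        prolif * free_head * b_head - decay * b_head,
        prolif * free_stem * b_stem - decay * b_stem,
        secrete * b_head - decay * ab_head,
        secrete * b_stem - decay * ab_stem,
        -clear * antigen,
    ]

# Abundant pre-existing anti-head antibody, almost no anti-stem antibody.
y0 = [1.0, 1.0, 10.0, 0.1, 5.0]
sol = solve_ivp(rhs, (0.0, 30.0), y0, max_step=0.1)
print(f"fold change, head antibody: {sol.y[2, -1] / y0[2]:.1f}")
print(f"fold change, stem antibody: {sol.y[3, -1] / y0[3]:.1f}")
```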

  12. Evaluation of Oceanic Surface Observation for Reproducing the Upper Ocean Structure in ECHAM5/MPI-OM

    Science.gov (United States)

    Luo, Hao; Zheng, Fei; Zhu, Jiang

    2017-12-01

    Better constraints of initial conditions from data assimilation are necessary for climate simulations and predictions, and they are particularly important for the ocean due to its long climate memory; as such, ocean data assimilation (ODA) is regarded as an effective tool for seasonal to decadal predictions. In this work, an ODA system is established for a coupled climate model (ECHAM5/MPI-OM), which can assimilate all available oceanic observations using an ensemble optimal interpolation approach. To validate and isolate the performance of different surface observations in reproducing air-sea climate variations in the model, a set of observing system simulation experiments (OSSEs) was performed over 150 model years. Generally, assimilating sea surface temperature, sea surface salinity, and sea surface height (SSH) can reasonably reproduce the climate variability and vertical structure of the upper ocean, and assimilating SSH achieves the best results compared to the true states. For the El Niño-Southern Oscillation (ENSO), assimilating different surface observations captures true aspects of ENSO well, but assimilating SSH can further enhance the accuracy of ENSO-related feedback processes in the coupled model, leading to a more reasonable ENSO evolution and air-sea interaction over the tropical Pacific. For ocean heat content, there are still limitations in reproducing the long time-scale variability in the North Atlantic, even if SSH has been taken into consideration. These results demonstrate the effectiveness of assimilating surface observations in capturing the interannual signal and, to some extent, the decadal signal but still highlight the necessity of assimilating profile data to reproduce specific decadal variability.
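
    For readers unfamiliar with ensemble optimal interpolation, the following minimal sketch shows the generic EnOI analysis step, x_a = x_b + K(y - Hx_b), with the gain K built from a scaled static-ensemble covariance. It is a didactic stand-in, not the assimilation code used with ECHAM5/MPI-OM; all dimensions and values are invented.

```python
import numpy as np

def enoi_update(xb, ens, H, y, obs_err_var, alpha=0.7):
    """Minimal ensemble optimal interpolation (EnOI) analysis step.

    xb          : background state vector (n,)
    ens         : static ensemble of states (n, m) used to estimate covariances
    H           : observation operator matrix (p, n)
    y           : observation vector (p,)
    obs_err_var : observation error variance (scalar; diagonal R assumed)
    alpha       : scaling factor applied to the static ensemble covariance
    """
    m = ens.shape[1]
    A = ens - ens.mean(axis=1, keepdims=True)          # ensemble anomalies
    B = alpha * (A @ A.T) / (m - 1)                    # static background covariance
    R = obs_err_var * np.eye(H.shape[0])
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)       # Kalman gain
    return xb + K @ (y - H @ xb)                       # analysis state

rng = np.random.default_rng(0)
n, m = 50, 20                                          # state size, static ensemble size
truth = np.sin(np.linspace(0, 2 * np.pi, n))
ens = truth[:, None] + 0.3 * rng.standard_normal((n, m))
xb = np.zeros(n)                                       # poor first guess
H = np.eye(n)[::5]                                     # observe every 5th point ("surface" obs)
y = H @ truth + 0.1 * rng.standard_normal(H.shape[0])
xa = enoi_update(xb, ens, H, y, obs_err_var=0.01)
print("background RMSE:", np.sqrt(np.mean((xb - truth) ** 2)))
print("analysis RMSE:  ", np.sqrt(np.mean((xa - truth) ** 2)))
```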

  13. The use of real-time cell analyzer technology in drug discovery: defining optimal cell culture conditions and assay reproducibility with different adherent cellular models.

    Science.gov (United States)

    Atienzar, Franck A; Tilmant, Karen; Gerets, Helga H; Toussaint, Gaelle; Speeckaert, Sebastien; Hanon, Etienne; Depelchin, Olympe; Dhalluin, Stephane

    2011-07-01

    The use of impedance-based label-free technology applied to drug discovery is nowadays receiving more and more attention. Indeed, such a simple and noninvasive assay that interferes minimally with cell morphology and function allows one to perform kinetic measurements and to obtain information on proliferation, migration, cytotoxicity, and receptor-mediated signaling. The objective of the study was to further assess the usefulness of a real-time cell analyzer (RTCA) platform based on impedance in the context of quality control and data reproducibility. The data indicate that this technology is useful to determine the best coating and cellular density conditions for different adherent cellular models including hepatocytes, cardiomyocytes, fibroblasts, and hybrid neuroblastoma/neuronal cells. Based on 31 independent experiments, the reproducibility of cell index data generated from HepG2 cells exposed to DMSO and to Triton X-100 was satisfactory, with a coefficient of variation close to 10%. Cell index data were also well reproduced when cardiomyocytes and fibroblasts were exposed to 21 compounds three times (correlation >0.91). Overall, the RTCA technology appears to be a powerful and reliable tool in drug discovery because of its reasonable throughput, rapid and efficient performance, technical optimization, and cell quality control.

  14. Intestinal microdialysis--applicability, reproducibility and local tissue response in a pig model

    DEFF Research Database (Denmark)

    Emmertsen, K J; Wara, P; Sørensen, Flemming Brandt

    2005-01-01

    BACKGROUND AND AIMS: Microdialysis has been applied to the intestinal wall for the purpose of monitoring local ischemia. The aim of this study was to investigate the applicability, reproducibility and local response to microdialysis in the intestinal wall. MATERIALS AND METHODS: In 12 pigs two...... the probes were processed for histological examination. RESULTS: Large intra- and inter-group differences in the relative recovery were found between all locations. Absolute values of metabolites showed no significant changes during the study period. The lactate in blood was 25-30% of the intra-tissue values...

  15. Broad-Band Visually Evoked Potentials: Re(con)volution in Brain-Computer Interfacing.

    Directory of Open Access Journals (Sweden)

    Jordy Thielen

    Full Text Available Brain-Computer Interfaces (BCIs) allow users to control devices and communicate by using brain activity only. BCIs based on broad-band visual stimulation can outperform BCIs using other stimulation paradigms. Visual stimulation with pseudo-random bit-sequences evokes specific Broad-Band Visually Evoked Potentials (BBVEPs) that can be reliably used in BCI for high-speed communication in speller applications. In this study, we report a novel paradigm for a BBVEP-based BCI that utilizes a generative framework to predict responses to broad-band stimulation sequences. We designed a BBVEP-based BCI using modulated Gold codes to mark cells in a visual speller BCI. We defined a linear generative model that decomposes full responses into overlapping single-flash responses. These single-flash responses are used to predict responses to novel stimulation sequences, which in turn serve as templates for classification. The linear generative model explains on average 50% and up to 66% of the variance of responses to both seen and unseen sequences. In an online experiment, 12 participants tested a 6 × 6 matrix speller BCI. On average, an online accuracy of 86% was reached with trial lengths of 3.21 seconds. This corresponds to an Information Transfer Rate of 48 bits per minute (approximately 9 symbols per minute). This study indicates the potential to model and predict responses to broad-band stimulation. These predicted responses are proven to be well-suited as templates for a BBVEP-based BCI, thereby enabling communication and control by brain activity only.
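
    The decomposition-and-prediction idea can be sketched with ordinary least squares on a lagged design matrix: estimate a single-flash response from a training sequence, then re-convolve it with an unseen bit-sequence to obtain a classification template. This is a simplified single-event version (the published method distinguishes different flash types and uses modulated Gold codes), with synthetic data throughout.

```python
import numpy as np

def lagged_design(bits, length):
    """Design matrix whose columns are time-lagged copies of the binary
    stimulus sequence: one column per sample of the single-flash response."""
    n = len(bits)
    X = np.zeros((n, length))
    for lag in range(length):
        X[lag:, lag] = bits[: n - lag]
    return X

rng = np.random.default_rng(0)
flash_len = 18                                        # single-flash response length (samples)
true_flash = np.hanning(flash_len) * np.sin(np.linspace(0, 3 * np.pi, flash_len))

train_bits = rng.integers(0, 2, 360).astype(float)    # stand-in for a modulated Gold code
test_bits = rng.integers(0, 2, 360).astype(float)     # unseen stimulation sequence

# Synthetic "EEG": convolution of the bit sequence with the single-flash response plus noise.
X_train = lagged_design(train_bits, flash_len)
eeg_train = X_train @ true_flash + 0.5 * rng.standard_normal(len(train_bits))

# Deconvolution: estimate the single-flash response by least squares.
flash_hat, *_ = np.linalg.lstsq(X_train, eeg_train, rcond=None)

# Reconvolution: predict the template for the unseen sequence and compare with a noisy response.
X_test = lagged_design(test_bits, flash_len)
template = X_test @ flash_hat
eeg_test = X_test @ true_flash + 0.5 * rng.standard_normal(len(test_bits))
print(f"template vs. unseen response correlation: {np.corrcoef(template, eeg_test)[0, 1]:.2f}")
```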

  16. Development of a Three-Dimensional Hand Model Using Three-Dimensional Stereophotogrammetry: Assessment of Image Reproducibility.

    Directory of Open Access Journals (Sweden)

    Inge A Hoevenaren

    Full Text Available Using three-dimensional (3D) stereophotogrammetry, precise images and reconstructions of the human body can be produced. Over the last few years, this technique has mainly been developed in the field of maxillofacial reconstructive surgery, creating fusion images with computed tomography (CT) data for precise planning and prediction of treatment outcome. In hand surgery, however, 3D stereophotogrammetry is not yet being used in clinical settings. A total of 34 three-dimensional hand photographs were analyzed to investigate the reproducibility. For every individual, 3D photographs were captured at two different time points (baseline T0 and one week later T1). Using two different registration methods, the reproducibility of the methods was analyzed. Furthermore, the differences between 3D photos of men and women were compared in a distance map as a first clinical pilot testing our registration method. The absolute mean registration error for the complete hand was 1.46 mm. This reduced to an error of 0.56 mm when isolating the region to the palm of the hand. When comparing hands of both sexes, it was seen that the male hand was larger (broader base and longer fingers) than the female hand. This study shows that 3D stereophotogrammetry can produce reproducible images of the hand without harmful side effects for the patient, thus proving to be a reliable method for soft tissue analysis. Its potential use in everyday practice of hand surgery needs to be further explored.

  17. An antibiotic-responsive mouse model of fulminant ulcerative colitis.

    Directory of Open Access Journals (Sweden)

    Silvia S Kang

    2008-03-01

    Full Text Available BACKGROUND: The constellation of human inflammatory bowel disease (IBD) includes ulcerative colitis and Crohn's disease, which both display a wide spectrum in the severity of pathology. One theory is that multiple genetic hits to the host immune system may contribute to the susceptibility and severity of IBD. However, experimental proof of this concept is still lacking. Several genetic mouse models that each recapitulate some aspects of human IBD have utilized a single gene defect to induce colitis. However, none have produced pathology clearly distinguishable as either ulcerative colitis or Crohn's disease, in part because none of them reproduce the most severe forms of disease that are observed in human patients. This lack of severe IBD models has posed a challenge for research into pathogenic mechanisms and development of new treatments. We hypothesized that multiple genetic hits to the regulatory machinery that normally inhibits immune activation in the intestine would generate more severe, reproducible pathology that would mimic either ulcerative colitis or Crohn's disease. METHODS AND FINDINGS: We generated a novel mouse line (dnKO) that possessed defects in both TGFβRII and IL-10R2 signaling. These mice rapidly and reproducibly developed a disease resembling fulminant human ulcerative colitis that was quite distinct from the much longer and more variable course of pathology observed previously in mice possessing only single defects. Pathogenesis was driven by uncontrolled production of proinflammatory cytokines resulting in large part from T cell activation. The disease process could be significantly ameliorated by administration of antibodies against IFNγ and TNFα and was completely inhibited by a combination of broad-spectrum antibiotics. CONCLUSIONS: Here, we develop to our knowledge the first mouse model of fulminant ulcerative colitis by combining multiple genetic hits in immune regulation and demonstrate that the resulting

  18. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  19. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built–up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....

  20. Acute multi-sgRNA knockdown of KEOPS complex genes reproduces the microcephaly phenotype of the stable knockout zebrafish model.

    Directory of Open Access Journals (Sweden)

    Tilman Jobst-Schwan

    Full Text Available Until recently, morpholino oligonucleotides have been widely employed in zebrafish as an acute and efficient loss-of-function assay. However, off-target effects and reproducibility issues when compared to stable knockout lines have compromised their further use. Here we employed an acute CRISPR/Cas approach using multiple single guide RNAs simultaneously targeting different positions in two exemplar genes (osgep or tprkb) to increase the likelihood of generating mutations on both alleles in the injected F0 generation and to achieve a similar effect as morpholinos but with the reproducibility of stable lines. This multi-single-guide-RNA approach resulted in median likelihoods for at least one mutation on each allele of >99% and sgRNA-specific insertion/deletion profiles as revealed by deep sequencing. Immunoblot showed a significant reduction of the Osgep and Tprkb proteins. For both genes, the acute multi-sgRNA knockout recapitulated the microcephaly phenotype and reduction in survival that we observed previously in stable knockout lines, though milder in the acute multi-sgRNA knockout. Finally, we quantify the degree of mutagenesis by deep sequencing and provide a mathematical model to quantitate the chance of a biallelic loss-of-function mutation. Our findings can be generalized to acute and stable CRISPR/Cas targeting for any zebrafish gene of interest.
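
    A simple way to see how multiple sgRNAs push the biallelic knockout probability towards 1 is the following illustrative calculation under an independence assumption; it is not necessarily the exact model used in the paper, and the 70% per-allele mutation rate is hypothetical.

```python
import numpy as np

def p_biallelic_knockout(per_sgrna_probs):
    """Probability that both alleles carry at least one loss-of-function mutation,
    assuming each sgRNA mutates each allele independently with the given probability."""
    p = np.asarray(per_sgrna_probs, dtype=float)
    p_one_allele = 1.0 - np.prod(1.0 - p)   # at least one sgRNA hits a given allele
    return p_one_allele ** 2                # both alleles, assumed independent

# Example: four sgRNAs, each with a hypothetical 70% per-allele mutation rate.
print(f"P(biallelic loss of function) = {p_biallelic_knockout([0.7, 0.7, 0.7, 0.7]):.3f}")
```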

  1. Reproducibility of summertime diurnal precipitation over northern Eurasia simulated by CMIP5 climate models

    Science.gov (United States)

    Hirota, N.; Takayabu, Y. N.

    2015-12-01

    The reproducibility of diurnal precipitation over northern Eurasia simulated by CMIP5 climate models in their historical runs was evaluated, in comparison with station data (NCDC-9813) and satellite data (GSMaP-V5). We first calculated diurnal cycles by averaging precipitation at each local solar time (LST) in June-July-August during 1981-2000 over the continent of northern Eurasia (0-180E, 45-90N). We then examined the occurrence time of maximum precipitation and the contribution of diurnally varying precipitation to the total precipitation. The contribution of diurnal precipitation was about 21% in both NCDC-9813 and GSMaP-V5. The maximum precipitation occurred at 18 LST in NCDC-9813 but at 16 LST in GSMaP-V5, indicating some uncertainties even in the observational datasets. The diurnal contribution of the CMIP5 models varied widely from 11% to 62%, and their timing of the precipitation maximum ranged from 11 LST to 20 LST. Interestingly, the contribution and the timing had a strong negative correlation of -0.65: the models with larger diurnal precipitation showed a precipitation maximum earlier, around noon. Next, we compared the sensitivity of precipitation to surface temperature and tropospheric humidity between 5 models with large diurnal precipitation (LDMs) and 5 models with small diurnal precipitation (SDMs). Precipitation in LDMs showed high sensitivity to surface temperature, indicating its close relationship with local instability. On the other hand, synoptic disturbances were more active in SDMs, with a dominant role of large-scale condensation, and precipitation in SDMs was more related to tropospheric moisture. Therefore, the relative importance of local instability and synoptic disturbances is suggested to be an important factor in determining the contribution and timing of diurnal precipitation. Acknowledgment: This study is supported by Green Network of Excellence (GRENE) Program by the Ministry of Education, Culture, Sports, Science and Technology
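
    The compositing step can be sketched as follows; the "contribution" metric used here (mean absolute deviation of the composite cycle relative to the daily mean) is an assumption for illustration and may differ from the definition used in the study, and the hourly data are synthetic.

```python
import numpy as np

def diurnal_cycle(precip, lst_hours):
    """Composite diurnal cycle: mean precipitation at each local solar time (0-23)."""
    return np.array([precip[lst_hours == h].mean() for h in range(24)])

def diurnal_contribution(cycle):
    """Illustrative metric: mean absolute deviation of the composite cycle
    from the daily mean, expressed as a fraction of the daily mean."""
    return np.mean(np.abs(cycle - cycle.mean())) / cycle.mean()

rng = np.random.default_rng(0)
hours = np.tile(np.arange(24), 92)                            # one JJA season of hourly data
base = 0.10 + 0.04 * np.cos(2 * np.pi * (hours - 16) / 24)    # mean rate peaking near 16 LST
precip = rng.gamma(shape=2.0, scale=base / 2.0)               # synthetic hourly precipitation
cycle = diurnal_cycle(precip, hours)
print("hour of maximum precipitation (LST):", int(cycle.argmax()))
print(f"diurnal contribution: {100 * diurnal_contribution(cycle):.0f}%")
```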

  2. Two-Finger Tightness: What Is It? Measuring Torque and Reproducibility in a Simulated Model.

    Science.gov (United States)

    Acker, William B; Tai, Bruce L; Belmont, Barry; Shih, Albert J; Irwin, Todd A; Holmes, James R

    2016-05-01

    Residents in training are often directed to insert screws using "two-finger tightness" to impart adequate torque but minimize the chance of a screw stripping in bone. This study seeks to quantify and describe two-finger tightness and to assess the variability of its application by residents in training. Cortical bone was simulated using a polyurethane foam block (30-pcf density) that was prepared with predrilled holes for tightening 3.5 × 14-mm long cortical screws and mounted to a custom-built apparatus on a load cell to capture torque data. Thirty-three residents in training, ranging from the first through fifth years of residency, along with 8 staff members, were directed to tighten 6 screws to two-finger tightness in the test block, and peak torque values were recorded. The participants were blinded to their torque values. Stripping torque (2.73 ± 0.56 N·m) was determined from 36 trials and served as a threshold for failed screw placement. The average torques varied substantially with regard to absolute torque values, thus poorly defining two-finger tightness. Junior residents less consistently reproduced torque compared with other groups (0.29 and 0.32, respectively). These data quantify absolute values of two-finger tightness but demonstrate considerable variability in absolute torque values, percentage of stripping torque, and ability to consistently reproduce given torque levels. Increased years in training are weakly correlated with reproducibility, but experience does not seem to affect absolute torque levels. These results question the usefulness of two-finger tightness as a teaching tool and highlight the need for improvement in resident motor skill training and development within a teaching curriculum. Torque measuring devices may be a useful simulation tools for this purpose.

  3. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Science.gov (United States)

    Nada, Rania M; Maal, Thomas J J; Breuning, K Hero; Bergé, Stefaan J; Mostafa, Yehya A; Kuijpers-Jagtman, Anne Marie

    2011-02-09

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.
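
    The mean-absolute-distance comparison between two registered models can be sketched with a closest-point query over the vertex clouds; this is a generic illustration with synthetic points, not the authors' software, and the noise level is arbitrary.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_absolute_distance(points_a, points_b):
    """Symmetric mean closest-point distance between two registered vertex clouds,
    a common way to score agreement between superimposed 3D models."""
    d_ab = cKDTree(points_b).query(points_a)[0]   # A -> nearest point in B
    d_ba = cKDTree(points_a).query(points_b)[0]   # B -> nearest point in A
    return 0.5 * (d_ab.mean() + d_ba.mean())

rng = np.random.default_rng(0)
model_t0 = rng.uniform(-50.0, 50.0, size=(5000, 3))            # stand-in vertices (mm)
model_t1 = model_t0 + rng.normal(0.0, 0.2, size=(5000, 3))     # registered follow-up model
print(f"mean absolute distance: {mean_absolute_distance(model_t0, model_t1):.2f} mm")
```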

  4. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    Full Text Available The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties of networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC = 0.86), global efficiency (ICC = 0.83), path length (ICC = 0.79), and local efficiency (ICC = 0.75); the ICC score for degree was found to be low (ICC = 0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that, with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
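
    A minimal sketch of the ICC computation for a two-run test-retest design is shown below, using the two-way random-effects, absolute-agreement, single-measurement form ICC(2,1); the data are simulated, and the specific ICC variant used in the study is an assumption.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    data : array of shape (n_subjects, k_runs) holding one graph metric per run."""
    data = np.asarray(data, float)
    n, k = data.shape
    grand = data.mean()
    ss_total = ((data - grand) ** 2).sum()
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between-subject sum of squares
    ss_run = n * ((data.mean(axis=0) - grand) ** 2).sum()    # between-run sum of squares
    ss_err = ss_total - ss_subj - ss_run
    ms_subj = ss_subj / (n - 1)
    ms_run = ss_run / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err + k * (ms_run - ms_err) / n)

rng = np.random.default_rng(0)
true_metric = rng.normal(0.45, 0.05, 45)     # e.g. clustering coefficient for 45 subjects
runs = np.column_stack([true_metric + rng.normal(0.0, 0.02, 45) for _ in range(2)])
print(f"ICC(2,1) = {icc_2_1(runs):.2f}")
```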

  5. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  6. Broad-Band Analysis of Polar Motion Excitations

    Science.gov (United States)

    Chen, J.

    2016-12-01

    Earth rotational changes, i.e. polar motion and length-of-day (LOD), are driven by two types of geophysical excitations: 1) mass redistribution within the Earth system, and 2) angular momentum exchange between the solid Earth (more precisely the crust) and other components of the Earth system. Accurate quantification of Earth rotational excitations has been difficult, due to the lack of global-scale observations of mass redistribution and angular momentum exchange. More than 14 years of time-variable gravity measurements from the Gravity Recovery and Climate Experiment (GRACE) have provided a unique means for quantifying Earth rotational excitations from mass redistribution in different components of the climate system. Comparisons between observed Earth rotational changes and geophysical excitations estimated from GRACE, satellite laser ranging (SLR) and climate models show that GRACE-derived excitations agree remarkably well with polar motion observations over a broad band of frequencies. GRACE estimates also suggest that accelerated polar region ice melting in recent years and the corresponding sea level rise have played an important role in driving long-term polar motion as well. With several estimates of polar motion excitations, it is possible to estimate broad-band noise variance and noise power spectra in each, given reasonable assumptions about noise independence. Results based on GRACE CSR RL05 solutions clearly outperform other estimates, with the lowest noise levels over a broad band of frequencies.

  7. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  8. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility of natural head position with time on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies confirmed this reproducibility with time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  9. Audiovisual biofeedback improves diaphragm motion reproducibility in MRI

    Science.gov (United States)

    Kim, Taeho; Pollock, Sean; Lee, Danny; O’Brien, Ricky; Keall, Paul

    2012-01-01

    Purpose: In lung radiotherapy, variations in cycle-to-cycle breathing result in four-dimensional computed tomography imaging artifacts, leading to inaccurate beam coverage and tumor targeting. In previous studies, the effect of audiovisual (AV) biofeedback on the external respiratory signal reproducibility has been investigated, but the internal anatomy motion has not been fully studied. The aim of this study is to test the hypothesis that AV biofeedback improves diaphragm motion reproducibility of internal anatomy using magnetic resonance imaging (MRI). Methods: To test the hypothesis, 15 healthy human subjects were enrolled in an ethics-approved AV biofeedback study consisting of two imaging sessions spaced ∼1 week apart. Within each session MR images were acquired under free breathing and AV biofeedback conditions. The respiratory signal to the AV biofeedback system utilized optical monitoring of an external marker placed on the abdomen. Synchronously, serial thoracic 2D MR images were obtained to measure the diaphragm motion using a fast gradient-recalled-echo MR pulse sequence in both coronal and sagittal planes. The improvement in the diaphragm motion reproducibility using the AV biofeedback system was quantified by comparing cycle-to-cycle variability in displacement, respiratory period, and baseline drift. Additionally, the variation in improvement between the two sessions was also quantified. Results: The average root mean square error (RMSE) of diaphragm cycle-to-cycle displacement was reduced from 2.6 mm with free breathing to 1.6 mm (38% reduction) with the implementation of AV biofeedback; corresponding improvements under AV biofeedback were also observed for the respiratory period and baseline drift (p-value = 0.012). The diaphragm motion reproducibility improvements with AV biofeedback were consistent with the abdominal motion reproducibility that was observed from the external marker motion variation. Conclusions: This study was the first to investigate the potential of AV biofeedback to improve the motion

  10. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

    Full Text Available Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  11. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.

    Science.gov (United States)

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.
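
    One way to make the "predicted motion" readout concrete is sketched below. The `predict_next_frame` function is only a placeholder for a trained PredNet-style network (not its actual API), and reading out motion with dense optical flow on the predicted frame is one plausible analysis choice assumed here for illustration.

    ```python
    import cv2
    import numpy as np

    def predict_next_frame(frame_sequence):
        """Placeholder: a trained predictive-coding network (e.g., PredNet)
        would return its predicted next frame for the input sequence."""
        raise NotImplementedError

    def apparent_motion(last_frame, predicted_frame):
        """Mean motion vector between the last observed frame and the
        network's predicted next frame, estimated by dense optical flow."""
        prev = cv2.cvtColor(last_frame, cv2.COLOR_BGR2GRAY)
        nxt = cv2.cvtColor(predicted_frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        # A consistent rotational flow pattern on a physically static illusion
        # image would indicate that the network represents illusory motion.
        return flow.mean(axis=(0, 1))  # (mean dx, mean dy) in pixels
    ```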

  12. Short- and long-term reproducibility of radioisotopic examination of gastric emptying

    Energy Technology Data Exchange (ETDEWEB)

    Jonderko, K. (Silesian School of Medicine, Katowice (Poland). Dept. of Gastroenterology)

    1990-01-01

    Reproducibility of gastric emptying (GE) of a radiolabelled solid meal was assessed. The short-term reproducibility was evaluated on the basis of 12 paired GE examinations performed 1-3 days apart. Twelve paired GE examinations taken 3-8 months apart enabled long-term reproducibility assessment. Reproducibility of GE parameters was expressed in terms of the coefficient of variation, CV. No significant between-day variation of solid GE was found either regarding the short-term or the long-term reproducibility. Although slightly higher CV values characterized the long-term reproducibility of the GE parameters considered, the variations of the differences between repeated GE examinations did not differ significantly between short- and long-term GE reproducibility. The results obtained justify the use of radioisotopic GE measurement for the assessment of early and late results of pharmacologic or surgical management. (author).
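
    For reference, the coefficient of variation used here to express reproducibility is the usual ratio of the standard deviation to the mean of repeated measurements (the exact form used by the author is not specified in this record):

    ```latex
    \[
      \mathrm{CV} \;=\; \frac{\sigma}{\mu}\times 100\,\%,
      \qquad
      \mu = \frac{1}{n}\sum_{i=1}^{n} x_i ,\quad
      \sigma = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i-\mu\right)^{2}} .
    \]
    ```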

  13. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    Full Text Available The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, together with its Git repository, constitutes the primary reproducibility output of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
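
    The "single serialized object plus Git" pattern described above can be illustrated generically. The sketch below is not ReproPhylo's API; it only shows the general idea of pickling a workflow object that carries its own provenance and committing every iteration to Git.

    ```python
    import pickle
    import subprocess
    import sys
    import time

    class Workflow:
        """Generic stand-in for a serialized analysis workflow object."""
        def __init__(self, alignment_file, tree_method):
            self.alignment_file = alignment_file
            self.tree_method = tree_method
            # Provenance and environment information travel with the object.
            self.provenance = {"created": time.strftime("%Y-%m-%dT%H:%M:%S"),
                               "python": sys.version,
                               "history": []}

        def run_step(self, name):
            # ... perform the actual analysis step here ...
            self.provenance["history"].append((name, time.time()))

    def checkpoint(workflow, path="workflow.pkl", message="analysis iteration"):
        """Serialize the whole workflow and record the iteration in Git."""
        with open(path, "wb") as fh:
            pickle.dump(workflow, fh)
        subprocess.run(["git", "add", path], check=True)
        subprocess.run(["git", "commit", "-m", message], check=True)

    wf = Workflow("loci.fasta", tree_method="raxml")
    wf.run_step("align")
    checkpoint(wf, message="after alignment")
    ```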

  14. Reproducing an extreme flood with uncertain post-event information

    Directory of Open Access Journals (Sweden)

    D. Fuentes-Andino

    2017-07-01

    Full Text Available Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that it is possible to estimate, in a trustworthy way that accounts for large data uncertainties, this extreme 1998 flood discharge and the extent of the inundations that followed, from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum–Cunge–Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identification of these locations is useful to improve model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak, and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event. The method proposed here can be part of a Bayesian framework in which more events
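
    The GLUE procedure referred to above follows a simple recipe: sample many parameter sets, run the model chain, score each run against the post-event observations with an informal likelihood, and keep the "behavioural" runs to form uncertainty bounds. The sketch below replaces the actual TOPMODEL/routing/LISFLOOD-FP chain with a toy surrogate, and the prior ranges, likelihood measure, and threshold are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def run_model(params, rainfall):
        """Toy surrogate for the rainfall-runoff + routing + inundation chain;
        returns simulated high-water marks at the observation locations."""
        a, b = params
        return a * rainfall + b

    observed = np.array([3.2, 4.1, 2.8])        # post-event high-water marks (m)
    rainfall = np.array([120.0, 150.0, 100.0])  # event rainfall proxy (mm)

    n_samples, threshold = 10_000, 0.2
    behavioural = []
    for _ in range(n_samples):
        params = rng.uniform([0.0, -1.0], [0.05, 1.0])   # assumed prior ranges
        sim = run_model(params, rainfall)
        likelihood = 1.0 / (1.0 + np.sum((sim - observed) ** 2))  # informal measure
        if likelihood > threshold:
            behavioural.append(sim)

    sims = np.array(behavioural)
    lower, upper = np.percentile(sims, [5, 95], axis=0)  # simple prediction bounds
    print(len(behavioural), "behavioural runs")
    print("5-95% bounds per location:", list(zip(lower.round(2), upper.round(2))))
    ```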

  15. A computational model incorporating neural stem cell dynamics reproduces glioma incidence across the lifespan in the human population.

    Directory of Open Access Journals (Sweden)

    Roman Bauer

    Full Text Available Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way, and do not directly relate to empirical findings. To decrease this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically-determined estimates of neural stem cell number, cell division rate, mutation rate and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert differential susceptibility throughout the population. Overall, our model supports the hypothesis that glioma is caused by randomly-occurring oncogenic mutations within the neural stem cell population. Based on this model, we assess the influence of the (experimentally indicated) decrease in the number of neural stem cells and increase of cell division rate during aging. Our model provides multiple testable predictions, and suggests that different temporal sequences of oncogenic mutations can lead to tumorigenesis. Finally, we conclude that four or five oncogenic mutations are sufficient for the formation of glioma.
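
    The kind of model described, randomly occurring oncogenic mutations accumulating in an aging neural stem cell pool, can be sketched as a crude stochastic simulation. The pool size, division rate, mutation probability, and number of required hits below are placeholders rather than the authors' calibrated estimates, and mutations are pooled across cells rather than tracked per lineage.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_incidence(n_people=5_000, max_age=100, hits_required=5,
                           p_mut_per_division=9e-6):
        """Fraction of simulated individuals first developing glioma at each age."""
        incidence = np.zeros(max_age)
        for _ in range(n_people):
            hits = 0
            for age in range(max_age):
                # Placeholder age dependence: the stem cell pool shrinks with age
                # while the per-cell division rate increases slightly.
                n_cells = int(10_000 * np.exp(-age / 60))
                divisions = n_cells * (1.0 + age / 100)
                hits += rng.poisson(divisions * p_mut_per_division)
                if hits >= hits_required:
                    incidence[age] += 1
                    break
        return incidence / n_people

    inc = simulate_incidence()
    print("age of peak simulated incidence:", int(np.argmax(inc)))
    ```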

  16. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be difficult to get right, as it is a time-consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, it can be a poor substitute for good technical documentation and is often more difficult for a third party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  17. Flow structure in front of the broad-crested weir

    Directory of Open Access Journals (Sweden)

    Zachoval Zbyněk

    2015-01-01

    Full Text Available The paper deals with research focused on describing the flow structure in front of a broad-crested weir. Based on experimental measurements, the flow structure in front of the weir (the recirculation zone and tornado vortices) and the flow structure on the weir crest have been described. The observed flow character has been simulated using a numerical model, and based on a comparison of the results a suitable turbulence model has been recommended.

  18. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of the computer-aided detection (CAD) system for digital mammograms. We applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full-field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean: 25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7, age range: 20-66 years, mean age: 47.9 years). On the mammograms, lesions were visible in 19 patients and these were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR), and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15) and 67% (10 of 15) for masses, and 100% (12 of 12) for calcification clusters. The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR for the masses was 0.21-0.22/image, the FPR for the clusters was 0.03-0.04/image, and the total FPR was 0.24-0.26/image. Among 132 mammography images, the proportion of identical images regardless of the existence of CAD marks was 59% (78 of 132), and the proportion of identical images with CAD marks was 22% (15 of 69). The reproducibility of the CAD marks for true positive masses was 67% (12 of 18) and 71% (17 of 24) for true positive clusters. The reproducibility of CAD marks for false positive masses was 8% (4 of 53), and for false positive clusters it was 14% (1 of 7). The reproducibility of the total mass marks was 23% (16 of 71), and the reproducibility of the total cluster marks was 58% (18 of 31). The CAD system showed higher sensitivity and reproducibility of CAD marks for the calcification clusters, which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied with this limitation in mind.

  19. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Directory of Open Access Journals (Sweden)

    Wilke Daniel N.

    2017-01-01

    Full Text Available The packing density and flat bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e., the effective particle density can be adjusted. In this study the particle volume reduces drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass within 2% of each other.

  20. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Science.gov (United States)

    Wilke, Daniel N.; Pizette, Patrick; Govender, Nicolin; Abriak, Nor-Edine

    2017-06-01

    The packing density and flat bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling, used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e., the effective particle density can be adjusted. In this study the particle volume reduces drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass within 2% of each other.

  1. Rogeaulito: a world energy scenario modeling tool for transparent energy system thinking

    Directory of Open Access Journals (Sweden)

    Léo eBenichou

    2014-01-01

    Full Text Available Rogeaulito is a world energy model for scenario building developed by the European think tank The Shift Project. It is a tool to explore world energy choices from a very long-term and systematic perspective. As a key feature and novelty, it computes energy supply and demand independently of each other, revealing potentially missing energy supply by 2100. It is also simple to use, didactic, and open source. As such, it targets a broad user group and advocates for reproducibility and transparency in scenario modeling as well as model-based learning. Rogeaulito applies an engineering approach using disaggregated data in a spreadsheet model.
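
    The core design choice, building supply and demand scenarios independently and only then comparing them, can be illustrated with a toy sketch; the numbers and category names below are placeholders, not Rogeaulito's data.

    ```python
    # Toy illustration of independently built supply and demand scenarios (EJ/yr).
    demand_by_sector = {"transport": 120.0, "industry": 150.0,
                        "buildings": 110.0, "other": 40.0}
    supply_by_source = {"fossil": 220.0, "nuclear": 30.0,
                        "renewables": 130.0}

    total_demand = sum(demand_by_sector.values())
    total_supply = sum(supply_by_source.values())
    gap = total_demand - total_supply  # > 0 means missing energy supply

    print(f"demand {total_demand:.0f} EJ, supply {total_supply:.0f} EJ, "
          f"gap {gap:+.0f} EJ")
    ```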

  2. Revisiting the scientific method to improve rigor and reproducibility of immunohistochemistry in reproductive science.

    Science.gov (United States)

    Manuel, Sharrón L; Johnson, Brian W; Frevert, Charles W; Duncan, Francesca E

    2018-04-21

    Immunohistochemistry (IHC) is a robust scientific tool whereby cellular components are visualized within a tissue, and this method has been and continues to be a mainstay for many reproductive biologists. IHC is highly informative if performed and interpreted correctly, but studies have shown that the general use and reporting of appropriate controls in IHC experiments is low. This omission of the scientific method can result in data that lacks rigor and reproducibility. In this editorial, we highlight key concepts in IHC controls and describe an opportunity for our field to partner with the Histochemical Society to adopt their IHC guidelines broadly as researchers, authors, ad hoc reviewers, editorial board members, and editors-in-chief. Such cross-professional society interactions will ensure that we produce the highest quality data as new technologies emerge that still rely upon the foundations of classic histological and immunohistochemical principles.

  3. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Full Text Available Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.
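
    As a concrete example of the kind of scriptable, repeatable analysis ITK enables, a minimal sketch using ITK's Pythonic interface is shown below; the filter choice and file names are illustrative, and the snippet assumes the `itk` Python package with ITK 5's snake_case functional API.

    ```python
    import itk

    # Read an image, apply a deterministic smoothing filter, and write the result.
    # Re-running this script on the same input reproduces the same output exactly.
    image = itk.imread("input.nii.gz", itk.F)
    smoothed = itk.median_image_filter(image, radius=2)
    itk.imwrite(smoothed, "smoothed.nii.gz")

    print("input size:", itk.size(image))
    ```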

  4. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Full Text Available Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.

  5. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    Science.gov (United States)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate large amounts of diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a lightweight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components, and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across the geoscience domains of hydrology, space science, and modeling toolkits.

  6. New Constraints on Quasar Broad Absorption and Emission Line Regions from Gravitational Microlensing

    Energy Technology Data Exchange (ETDEWEB)

    Hutsemékers, Damien; Braibant, Lorraine; Sluse, Dominique [Institut d' Astrophysique et de Géophysique, Université de Liège, Liège (Belgium); Anguita, Timo [Departamento de Ciencias Fisicas, Universidad Andres Bello, Santiago (Chile); Goosmann, René, E-mail: hutsemekers@astro.ulg.ac.be [Observatoire Astronomique de Strasbourg, Université de Strasbourg, Strasbourg (France)

    2017-09-29

    Gravitational microlensing is a powerful tool allowing one to probe the structure of quasars on sub-parsec scale. We report recent results, focusing on the broad absorption and emission line regions. In particular microlensing reveals the intrinsic absorption hidden in the P Cygni-type line profiles observed in the broad absorption line quasar H1413+117, as well as the existence of an extended continuum source. In addition, polarization microlensing provides constraints on the scattering region. In the quasar Q2237+030, microlensing differently distorts the Hα and CIV broad emission line profiles, indicating that the low- and high-ionization broad emission lines must originate from regions with distinct kinematical properties. We also present simulations of the effect of microlensing on line profiles considering simple but representative models of the broad emission line region. Comparison of observations to simulations allows us to conclude that the Hα emitting region in Q2237+030 is best represented by a Keplerian disk.

  7. New Constraints on Quasar Broad Absorption and Emission Line Regions from Gravitational Microlensing

    Directory of Open Access Journals (Sweden)

    Damien Hutsemékers

    2017-09-01

    Full Text Available Gravitational microlensing is a powerful tool allowing one to probe the structure of quasars on sub-parsec scale. We report recent results, focusing on the broad absorption and emission line regions. In particular microlensing reveals the intrinsic absorption hidden in the P Cygni-type line profiles observed in the broad absorption line quasar H1413+117, as well as the existence of an extended continuum source. In addition, polarization microlensing provides constraints on the scattering region. In the quasar Q2237+030, microlensing differently distorts the Hα and CIV broad emission line profiles, indicating that the low- and high-ionization broad emission lines must originate from regions with distinct kinematical properties. We also present simulations of the effect of microlensing on line profiles considering simple but representative models of the broad emission line region. Comparison of observations to simulations allows us to conclude that the Hα emitting region in Q2237+030 is best represented by a Keplerian disk.

  8. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    Science.gov (United States)

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3 h, followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase in DNA damage, and the overall response was comparable in all laboratories despite some differences in the doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved, and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  9. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    Ag nanocubes could be synthesized highly reproducibly by conducting the polyol synthesis with an HCl etchant under dark conditions, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth reaction could be promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was performed under light conditions, owing to the photoreduction of AgCl to Ag.

  10. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  11. Shear wave elastography for breast masses is highly reproducible.

    Science.gov (United States)

    Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude

    2012-05-01

    To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound. 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For the interobserver reproducibility, a blinded observer reviewed images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect while interobserver reproducibility proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.
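
    Agreement statistics like those reported here are straightforward to compute; the snippet below shows a weighted kappa for two observers' ordinal ratings using scikit-learn, with invented rating data for illustration.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Ordinal qualitative scores (e.g., an elastographic feature graded 1-3) from
    # two observers on the same set of masses; the values are illustrative only.
    observer_1 = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]
    observer_2 = [1, 2, 3, 3, 1, 1, 3, 2, 1, 2]

    # Linearly weighted kappa penalizes larger disagreements more heavily.
    kappa = cohen_kappa_score(observer_1, observer_2, weights="linear")
    print(f"weighted kappa = {kappa:.2f}")
    ```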

  12. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments will yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR. (author).

  13. Reproducibility, controllability, and optimization of LENR experiments

    International Nuclear Information System (INIS)

    Nagel, David J.

    2006-01-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments will yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  14. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    Directory of Open Access Journals (Sweden)

    Eiji Watanabe

    2018-03-01

    Full Text Available The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  15. Chimeric Hemagglutinin Constructs Induce Broad Protection against Influenza B Virus Challenge in the Mouse Model.

    Science.gov (United States)

    Ermler, Megan E; Kirkpatrick, Ericka; Sun, Weina; Hai, Rong; Amanat, Fatima; Chromikova, Veronika; Palese, Peter; Krammer, Florian

    2017-06-15

    Seasonal influenza virus epidemics represent a significant public health burden. Approximately 25% of all influenza virus infections are caused by type B viruses, and these infections can be severe, especially in children. Current influenza virus vaccines are an effective prophylaxis against infection but are impacted by rapid antigenic drift, which can lead to mismatches between vaccine strains and circulating strains. Here, we describe a broadly protective vaccine candidate based on chimeric hemagglutinins, consisting of globular head domains from exotic influenza A viruses and stalk domains from influenza B viruses. Sequential vaccination with these constructs in mice leads to the induction of broadly reactive antibodies that bind to the conserved stalk domain of influenza B virus hemagglutinin. Vaccinated mice are protected from lethal challenge with diverse influenza B viruses. Results from serum transfer experiments and antibody-dependent cell-mediated cytotoxicity (ADCC) assays indicate that this protection is antibody mediated and based on Fc effector functions. The present data suggest that chimeric hemagglutinin-based vaccination is a viable strategy to broadly protect against influenza B virus infection. IMPORTANCE While current influenza virus vaccines are effective, they are affected by mismatches between vaccine strains and circulating strains. Furthermore, the antiviral drug oseltamivir is less effective for treating influenza B virus infections than for treating influenza A virus infections. A vaccine that induces broad and long-lasting protection against influenza B viruses is therefore urgently needed. Copyright © 2017 American Society for Microbiology.

  16. Reproducing Kernels and Coherent States on Julia Sets

    Energy Technology Data Exchange (ETDEWEB)

    Thirulogasanthar, K., E-mail: santhar@cs.concordia.ca; Krzyzak, A. [Concordia University, Department of Computer Science and Software Engineering (Canada)], E-mail: krzyzak@cs.concordia.ca; Honnouvo, G. [Concordia University, Department of Mathematics and Statistics (Canada)], E-mail: g_honnouvo@yahoo.fr

    2007-11-15

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems.
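
    For context, the defining (reproducing) property of a reproducing kernel Hilbert space H with kernel K, in its standard form, is:

    ```latex
    \[
      K(\cdot,x) \in \mathcal{H}
      \quad\text{and}\quad
      f(x) \;=\; \langle f,\; K(\cdot,x) \rangle_{\mathcal{H}}
      \qquad \text{for all } f \in \mathcal{H} \text{ and all points } x .
    \]
    ```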

  17. Reproducing Kernels and Coherent States on Julia Sets

    International Nuclear Information System (INIS)

    Thirulogasanthar, K.; Krzyzak, A.; Honnouvo, G.

    2007-01-01

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems

  18. In vivo reproducibility of robotic probe placement for an integrated US-CT image-guided radiation therapy system

    Science.gov (United States)

    Lediju Bell, Muyinatu A.; Sen, H. Tutkun; Iordachita, Iulian; Kazanzides, Peter; Wong, John

    2014-03-01

    Radiation therapy is used to treat cancer by delivering high-dose radiation to a pre-defined target volume. Ultrasound (US) has the potential to provide real-time, image-guidance of radiation therapy to identify when a target moves outside of the treatment volume (e.g. due to breathing), but the associated probe-induced tissue deformation causes local anatomical deviations from the treatment plan. If the US probe is placed to achieve similar tissue deformations in the CT images required for treatment planning, its presence causes streak artifacts that will interfere with treatment planning calculations. To overcome these challenges, we propose robot-assisted placement of a real ultrasound probe, followed by probe removal and replacement with a geometrically-identical, CT-compatible model probe. This work is the first to investigate in vivo deformation reproducibility with the proposed approach. A dog's prostate, liver, and pancreas were each implanted with three 2.38-mm spherical metallic markers, and the US probe was placed to visualize the implanted markers in each organ. The real and model probes were automatically removed and returned to the same position (i.e. position control), and CT images were acquired with each probe placement. The model probe was also removed and returned with the same normal force measured with the real US probe (i.e. force control). Marker positions in CT images were analyzed to determine reproducibility, and a corollary reproducibility study was performed on ex vivo tissue. In vivo results indicate that tissue deformations with the real probe were repeatable under position control for the prostate, liver, and pancreas, with median 3D reproducibility of 0.3 mm, 0.3 mm, and 1.6 mm, respectively, compared to 0.6 mm for the ex vivo tissue. For the prostate, the mean 3D tissue displacement errors between the real and model probes were 0.2 mm under position control and 0.6 mm under force control, which are both within acceptable

  19. Highly Reproducible Automated Proteomics Sample Preparation Workflow for Quantitative Mass Spectrometry.

    Science.gov (United States)

    Fu, Qin; Kowalski, Michael P; Mastali, Mitra; Parker, Sarah J; Sobhani, Kimia; van den Broek, Irene; Hunter, Christie L; Van Eyk, Jennifer E

    2018-01-05

    Sample preparation for protein quantification by mass spectrometry requires multiple processing steps, including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions where technical variations can be introduced and propagated. We established an automated sample preparation workflow with a total processing time for 96 samples of 5 h, including a 2 h incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, intraday and interday coefficients of variation (CVs) were assessed for 5 serum and 5 plasma samples over 5 days, and samples repeated on 3 separate days had total CVs below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. An automated trypsin digestion workflow yields uniformly processed samples in less than 5 h. Reproducible quantification of peptides was observed across replicates, days, instruments, and laboratory sites, demonstrating the broad applicability of this approach.

  20. Reproducibility of preclinical animal research improves with heterogeneity of study samples

    Science.gov (United States)

    Vogt, Lucile; Sena, Emily S.; Würbel, Hanno

    2018-01-01

    Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495

  1. Flow characteristics at trapezoidal broad-crested side weir

    Directory of Open Access Journals (Sweden)

    Říha Jaromír

    2015-06-01

    Full Text Available Broad-crested side weirs have been the subject of numerous hydraulic studies; however, the flow field at the weir crest and in front of the weir in the approach channel has still not been fully described. Likewise, the discharge coefficient of broad-crested side weirs, whether slightly inclined towards the stream or lateral, has yet to be clearly determined. Experimental research was carried out to describe the flow characteristics at low Froude numbers in the approach flow channel for various combinations of in- and overflow discharges. Three side weir types with different oblique angles were studied. Their flow characteristics and discharge coefficients were analyzed and assessed based on the results obtained from extensive measurements performed on a hydraulic model. The empirical relation between the angle of side weir obliqueness, the Froude numbers in the up- and downstream channels, and the coefficient of obliqueness was derived.

  2. Visual attention spreads broadly but selects information locally.

    Science.gov (United States)

    Shioiri, Satoshi; Honjyo, Hajime; Kashiwase, Yoshiyuki; Matsumiya, Kazumichi; Kuriki, Ichiro

    2016-10-19

    Visual attention spreads over a range around the focus as the spotlight metaphor describes. Spatial spread of attentional enhancement and local selection/inhibition are crucial factors determining the profile of the spatial attention. Enhancement and ignorance/suppression are opposite effects of attention, and appeared to be mutually exclusive. Yet, no unified view of the factors has been provided despite their necessity for understanding the functions of spatial attention. This report provides electroencephalographic and behavioral evidence for the attentional spread at an early stage and selection/inhibition at a later stage of visual processing. Steady state visual evoked potential showed broad spatial tuning whereas the P3 component of the event related potential showed local selection or inhibition of the adjacent areas. Based on these results, we propose a two-stage model of spatial attention with broad spread at an early stage and local selection at a later stage.

  3. When Quality Beats Quantity: Decision Theory, Drug Discovery, and the Reproducibility Crisis.

    Directory of Open Access Journals (Sweden)

    Jack W Scannell

    Full Text Available A striking contrast runs through the last 60 years of biopharmaceutical discovery, research, and development. Huge scientific and technological gains should have increased the quality of academic science and raised industrial R&D efficiency. However, academia faces a "reproducibility crisis"; inflation-adjusted industrial R&D costs per novel drug increased nearly 100 fold between 1950 and 2010; and drugs are more likely to fail in clinical development today than in the 1970s. The contrast is explicable only if powerful headwinds reversed the gains and/or if many "gains" have proved illusory. However, discussions of reproducibility and R&D productivity rarely address this point explicitly. The main objectives of the primary research in this paper are: (a) to provide quantitatively and historically plausible explanations of the contrast; and (b) to identify factors to which R&D efficiency is sensitive. We present a quantitative decision-theoretic model of the R&D process. The model represents therapeutic candidates (e.g., putative drug targets, molecules in a screening library, etc.) within a "measurement space", with candidates' positions determined by their performance on a variety of assays (e.g., binding affinity, toxicity, in vivo efficacy, etc.) whose results correlate to a greater or lesser degree. We apply decision rules to segment the space, and assess the probability of correct R&D decisions. We find that when searching for rare positives (e.g., candidates that will successfully complete clinical development), changes in the predictive validity of screening and disease models that many people working in drug discovery would regard as small and/or unknowable (i.e., a 0.1 absolute change in the correlation coefficient between model output and clinical outcomes in man) can offset large (e.g., 10-fold, even 100-fold) changes in models' brute-force efficiency. We also show how validity and reproducibility correlate across a population of simulated
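
    The sensitivity of R&D decision quality to predictive validity can be illustrated with a small Monte Carlo sketch in the spirit of the model described; the distributions, prevalence, and selection fraction below are assumptions, not the paper's calibrated values.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def positive_predictive_value(validity, n_candidates=200_000,
                                  top_fraction=0.005, prevalence=0.005):
        """Select the top fraction of candidates by a noisy assay whose correlation
        with the true clinical outcome is `validity`; return the fraction of
        selected candidates that are true positives."""
        truth = rng.normal(size=n_candidates)
        noise = rng.normal(size=n_candidates)
        assay = validity * truth + np.sqrt(1 - validity**2) * noise
        true_positive = truth > np.quantile(truth, 1 - prevalence)
        selected = assay > np.quantile(assay, 1 - top_fraction)
        return float(true_positive[selected].mean())

    # With rank-based selection, enlarging the screen alone leaves this hit
    # quality unchanged, whereas modest gains in validity improve it markedly.
    for validity in (0.4, 0.5, 0.6):
        print(f"validity={validity:.1f}  "
              f"PPV={positive_predictive_value(validity):.3f}")
    ```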

  4. Reproducibility of the sella turcica landmark in three dimensions using a sella turcica-specific reference system

    International Nuclear Information System (INIS)

    Pittayapat, Pisha; Jacobs, Reinhilde; Odri, Guillaume A.; De Faria Vasconcelos, Karla; Willems, Guy; Olszewski, Raphael

    2015-01-01

    This study was performed to assess the reproducibility of identifying the sella turcica landmark in a three-dimensional (3D) model by using a new sella-specific landmark reference system. Thirty-two cone-beam computed tomographic scans (3D Accuitomo 170, J. Morita, Kyoto, Japan) were retrospectively collected. The 3D data were exported into the Digital Imaging and Communications in Medicine standard and then imported into the Maxilim software (Medicim NV, Sint-Niklaas, Belgium) to create 3D surface models. Five observers identified four osseous landmarks in order to create the reference frame and then identified two sella landmarks. The x, y, and z coordinates of each landmark were exported. The observations were repeated after four weeks. Statistical analysis was performed using the multiple paired t-test with Bonferroni correction (intraobserver precision: p<0.005, interobserver precision: p<0.0011). The intraobserver mean precision of all landmarks was <1 mm. Significant differences were found when comparing the intraobserver precision of each observer (p<0.005). For the sella landmarks, the intraobserver mean precision ranged from 0.43±0.34 mm to 0.51±0.46 mm. The intraobserver reproducibility was generally good. The overall interobserver mean precision was <1 mm. Significant differences between each pair of observers for all anatomical landmarks were found (p<0.0011). The interobserver reproducibility of sella landmarks was good, with >50% precision in locating the landmark within 1 mm. A newly developed reference system offers high precision and reproducibility for sella turcica identification in a 3D model without being based on two-dimensional images derived from 3D data.

  5. Reproducibility of the sella turcica landmark in three dimensions using a sella turcica-specific reference system

    Energy Technology Data Exchange (ETDEWEB)

    Pittayapat, Pisha; Jacobs, Reinhilde [University Hospitals Leuven, University of Leuven, Leuven (Belgium); Odri, Guillaume A. [Service de Chirurgie Orthopedique et Traumatologique, Centre Hospitalier Regional d' Orleans, Orleans Cedex2 (France); De Faria Vasconcelos, Karla [Dept. of Oral Diagnosis, Division of Oral Radiology, Piracicaba Dental School, University of Campinas, Sao Paulo (Brazil); Willems, Guy [Dept. of Oral Health Sciences, Orthodontics, KU Leuven and Dentistry, University Hospitals Leuven, University of Leuven, Leuven (Belgium); Olszewski, Raphael [Dept. of Oral and Maxillofacial Surgery, Cliniques Universitaires Saint Luc, Universite Catholique de Louvain, Brussels (Belgium)

    2015-03-15

    This study was performed to assess the reproducibility of identifying the sella turcica landmark in a three-dimensional (3D) model by using a new sella-specific landmark reference system. Thirty-two cone-beam computed tomographic scans (3D Accuitomo 170, J. Morita, Kyoto, Japan) were retrospectively collected. The 3D data were exported into the Digital Imaging and Communications in Medicine standard and then imported into the Maxilim software (Medicim NV, Sint-Niklaas, Belgium) to create 3D surface models. Five observers identified four osseous landmarks in order to create the reference frame and then identified two sella landmarks. The x, y, and z coordinates of each landmark were exported. The observations were repeated after four weeks. Statistical analysis was performed using the multiple paired t-test with Bonferroni correction (intraobserver precision: p<0.005, interobserver precision: p<0.0011). The intraobserver mean precision of all landmarks was <1 mm. Significant differences were found when comparing the intraobserver precision of each observer (p<0.005). For the sella landmarks, the intraobserver mean precision ranged from 0.43±0.34 mm to 0.51±0.46 mm. The intraobserver reproducibility was generally good. The overall interobserver mean precision was <1 mm. Significant differences between each pair of observers for all anatomical landmarks were found (p<0.0011). The interobserver reproducibility of sella landmarks was good, with >50% precision in locating the landmark within 1 mm. A newly developed reference system offers high precision and reproducibility for sella turcica identification in a 3D model without being based on two-dimensional images derived from 3D data.

  6. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the di...

  7. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    textabstractMany argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature—three each from the domains of perception/action, memory, and language, respectively—and found that they are highly reproducible. Not only can

  8. Modeling of leachable 137Cs in throughfall and stemflow for Japanese forest canopies after Fukushima Daiichi Nuclear Power Plant accident

    International Nuclear Information System (INIS)

    Loffredo, Nicolas; Onda, Yuichi; Kawamori, Ayumi; Kato, Hiroaki

    2014-01-01

    The Fukushima accident dispersed significant amounts of radioactive cesium (Cs) in the landscape. Our research investigated, from June 2011 to November 2013, the mobility of leachable Cs in forests canopies. In particular, 137 Cs and 134 Cs activity concentrations were measured in rainfall, throughfall, and stemflow in broad-leaf and cedar forests in an area located 40 km from the power plant. Leachable 137 Cs loss was modeled by a double exponential (DE) model. This model could not reproduce the variation in activity concentration observed. In order to refine the DE model, the main physical measurable parameters (rainfall intensity, wind velocity, and snowfall occurrence) were assessed, and rainfall was identified as the dominant factor controlling observed variation. A corrective factor was then developed to incorporate rainfall intensity in an improved DE model. With the original DE model, we estimated total 137 Cs loss by leaching from canopies to be 72 ± 4%, 67 ± 4%, and 48 ± 2% of the total plume deposition under mature cedar, young cedar, and broad-leaf forests, respectively. In contrast, with the improved DE model, the total 137 Cs loss by leaching was estimated to be 34 ± 2%, 34 ± 2%, and 16 ± 1% of the total plume deposition under mature cedar, young cedar, and broad-leaf forests, respectively. The improved DE model corresponds better to observed data in literature. Understanding 137 Cs and 134 Cs forest dynamics is important for forecasting future contamination of forest soils around the FDNPP. It also provides a basis for understanding forest transfers in future potential nuclear disasters. - Highlights: • A double exponential model was used to model leachable cesium loss from canopies. • The model could not reproduce variation observed. • Rainfall was identified as the dominant factor controlling the variation. • A rainfall parameter was used to develop an improved double exponential model. • The improved model gives a better estimation
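
    A double exponential canopy-loss model of the general kind described above, together with a simple rainfall-intensity correction, can be sketched as follows; the functional form of the correction and all parameter values are illustrative assumptions, not the calibrated model of the study.

```python
import numpy as np

def de_concentration(t_days, a_fast, k_fast, a_slow, k_slow):
    """Double exponential (DE) model: a fast and a slow leachable 137Cs pool
    depleted from the canopy with first-order rate constants."""
    return a_fast * np.exp(-k_fast * t_days) + a_slow * np.exp(-k_slow * t_days)

def de_concentration_rain(t_days, rain_mm, a_fast, k_fast, a_slow, k_slow, beta):
    """'Improved' DE model: the baseline prediction is scaled by a simple
    rainfall-intensity correction factor (hypothetical linear form)."""
    return de_concentration(t_days, a_fast, k_fast, a_slow, k_slow) * (1.0 + beta * rain_mm)

# Illustrative values only (Bq/L and 1/day), not the study's calibrated parameters.
t = np.arange(0, 900, 30.0)          # days since deposition
rain = np.full_like(t, 5.0)          # event rainfall intensity, mm
print(de_concentration_rain(t, rain, a_fast=50.0, k_fast=0.05,
                            a_slow=5.0, k_slow=0.002, beta=0.02)[:3])
```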

  9. Ceramic molar crown reproducibility by digital workflow manufacturing: An in vitro study.

    Science.gov (United States)

    Jeong, Ii-Do; Kim, Woong-Chul; Park, Jinyoung; Kim, Chong-Myeong; Kim, Ji-Hwan

    2017-08-01

    This in vitro study aimed to analyze and compare the reproducibility of zirconia and lithium disilicate crowns manufactured by digital workflow. A typodont model with a prepped upper first molar was set in a phantom head, and a digital impression was obtained with a video intraoral scanner (CEREC Omnicam; Sirona GmbH), from which a single crown was designed and manufactured with CAD/CAM into a zirconia crown and lithium disilicate crown (n=12). Reproducibility of each crown was quantitatively retrieved by superimposing the digitized data of the crown in 3D inspection software, and differences were graphically mapped in color. Areas with large differences were analyzed with digital microscopy. Mean quadratic deviations (RMS) quantitatively obtained from each ceramic group were statistically analyzed with Student's t-test (α=.05). The RMS value of lithium disilicate crown was 29.2 (4.1) µm and 17.6 (5.5) µm on the outer and inner surfaces, respectively, whereas these values were 18.6 (2.0) µm and 20.6 (5.1) µm for the zirconia crown. Reproducibility of zirconia and lithium disilicate crowns had a statistically significant difference only on the outer surface ( P <.001). The outer surface of lithium disilicate crown showed over-contouring on the buccal surface and under-contouring on the inner occlusal surface. The outer surface of zirconia crown showed both over- and under-contouring on the buccal surface, and the inner surface showed under-contouring in the marginal areas. Restoration manufacturing by digital workflow will enhance the reproducibility of zirconia single crowns more than that of lithium disilicate single crowns.
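
    The reproducibility metric used above, the root mean square (RMS) of surface deviations after superimposition, and the accompanying Student's t-test can be sketched as follows; all numerical inputs are hypothetical.

```python
import numpy as np
from scipy import stats

def rms_deviation(deviations_um):
    """Root-mean-square (RMS) of signed point-wise deviations (micrometres)
    between a manufactured crown surface and its CAD reference after superimposition."""
    d = np.asarray(deviations_um, dtype=float)
    return np.sqrt(np.mean(d ** 2))

rng = np.random.default_rng(0)
# Hypothetical deviation map of one crown surface, then per-crown RMS values (um).
print(round(rms_deviation(rng.normal(0.0, 25.0, 10_000)), 1))
zirconia = rng.normal(18.6, 2.0, 12)
disilicate = rng.normal(29.2, 4.1, 12)
t_stat, p_value = stats.ttest_ind(disilicate, zirconia)   # Student's t-test, alpha = .05
print(round(float(t_stat), 2), f"{p_value:.1e}")
```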

  10. 77 FR 50144 - Broad Stakeholder Survey

    Science.gov (United States)

    2012-08-20

    ... DEPARTMENT OF HOMELAND SECURITY [Docket No. DHS-2012-0042] Broad Stakeholder Survey AGENCY... Information Collection Request: 1670-NEW. SUMMARY: The Department of Homeland Security (DHS), National... (Pub. L. 104-13, 44 U.S.C. Chapter 35). NPPD is soliciting comments concerning the Broad Stakeholder...

  11. 76 FR 34087 - Broad Stakeholder Survey

    Science.gov (United States)

    2011-06-10

    ... DEPARTMENT OF HOMELAND SECURITY [Docket No. DHS-2011-0027] Broad Stakeholder Survey AGENCY... Information Collection Request: 1670-NEW. SUMMARY: The Department of Homeland Security (DHS), National... (Pub. L. 104-13, 44 U.S.C. Chapter 35). NPPD is soliciting comments concerning the Broad Stakeholder...

  12. Interpretative intra- and interobserver reproducibility of stress/rest 99mTc-sestamibi myocardial perfusion SPECT using a semiquantitative 20-segment model

    International Nuclear Information System (INIS)

    Fazeli, M.; Firoozi, F.

    2002-01-01

    It is well established that myocardial perfusion SPECT with 201Tl or 99mTc-sestamibi plays an important role in diagnosis and risk assessment in patients with known or suspected coronary artery disease. Both quantitative and qualitative methods are available for interpretation of the images. The use of a semiquantitative scoring system, in which each of 20 segments is scored according to a five-point scheme, provides an approach to interpretation that is more systematic and reproducible than simple qualitative evaluation. Only a limited number of studies have dealt with the interpretive observer reproducibility of 99mTc-sestamibi myocardial perfusion imaging. The aim of this study was to assess the intra- and interobserver variability of semiquantitative SPECT performed with this technique. Among 789 patients who underwent myocardial perfusion SPECT during the last year, 80 patients ultimately required coronary angiography as the gold standard. In this group of patients, a semiquantitative visual interpretation was carried out using short-axis and vertical long-axis myocardial tomograms and a 20-segment model. These segments were assigned to six evenly spaced regions in the apical, mid-ventricular, and basal short-axis views and two apical segments on the mid-ventricular long-axis slice. Uptake in each segment was graded on a 5-point scale (0=normal, 1=equivocal, 2=moderate, 3=severe, 4=absence of uptake). The sestamibi images were interpreted separately twice by two observers without knowledge of each other's findings or of the angiography results. A SPECT study was judged abnormal if there were two or more segments with a stress score equal to or greater than 2. We concluded that semiquantitative visual analysis is a simple and reproducible method of interpretation.
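
    The scoring and classification rule described above is easy to express in code; a minimal sketch (segment ordering and the example reading are hypothetical).

```python
import numpy as np

def is_abnormal(stress_scores, threshold=2, min_segments=2):
    """Decision rule from the abstract: a SPECT study is judged abnormal if two or
    more of the 20 segments have a stress score >= 2 (0 = normal ... 4 = absent uptake)."""
    scores = np.asarray(stress_scores)
    assert scores.shape == (20,), "expects one score per segment of the 20-segment model"
    return int(np.sum(scores >= threshold)) >= min_segments

# Hypothetical reading of one study: two segments with moderate/severe defects.
example = np.zeros(20, dtype=int)
example[[3, 9]] = [2, 3]
print(is_abnormal(example))       # True
```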

  13. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.

  14. Extreme Variability in a Broad Absorption Line Quasar

    Energy Technology Data Exchange (ETDEWEB)

    Stern, Daniel; Jun, Hyunsung D. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Mail Stop 169-221, Pasadena, CA 91109 (United States); Graham, Matthew J.; Djorgovski, S. G.; Donalek, Ciro; Drake, Andrew J.; Mahabal, Ashish A.; Steidel, Charles C. [California Institute of Technology, 1200 E. California Boulevard, Pasadena, CA 91125 (United States); Arav, Nahum; Chamberlain, Carter [Department of Physics, Virginia Tech, Blacksburg, VA 24061 (United States); Barth, Aaron J. [Department of Physics and Astronomy, 4129 Frederick Reines Hall, University of California, Irvine, CA 92697 (United States); Glikman, Eilat, E-mail: daniel.k.stern@jpl.nasa.gov [Department of Physics, Middlebury College, Middlebury, VT 05753 (United States)

    2017-04-20

    CRTS J084133.15+200525.8 is an optically bright quasar at z = 2.345 that has shown extreme spectral variability over the past decade. Photometrically, the source had a visual magnitude of V ∼ 17.3 between 2002 and 2008. Then, over the following five years, the source slowly brightened by approximately one magnitude, to V ∼ 16.2. Only ∼1 in 10,000 quasars show such extreme variability, as quantified by the extreme parameters derived for this quasar assuming a damped random walk model. A combination of archival and newly acquired spectra reveal the source to be an iron low-ionization broad absorption line quasar with extreme changes in its absorption spectrum. Some absorption features completely disappear over the 9 years of optical spectra, while other features remain essentially unchanged. We report the first definitive redshift for this source, based on the detection of broad H α in a Keck/MOSFIRE spectrum. Absorption systems separated by several 1000 km s{sup −1} in velocity show coordinated weakening in the depths of their troughs as the continuum flux increases. We interpret the broad absorption line variability to be due to changes in photoionization, rather than due to motion of material along our line of sight. This source highlights one sort of rare transition object that astronomy will now be finding through dedicated time-domain surveys.
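
    The damped random walk used above to quantify how rare such variability is can be simulated as an Ornstein-Uhlenbeck process in magnitudes; a minimal sketch in which the timescale, amplitude, and mean magnitude are illustrative, not values fitted to CRTS J084133.15+200525.8.

```python
import numpy as np

def simulate_drw(n_steps, dt_days, tau_days, sigma_mag, mean_mag, seed=0):
    """Damped random walk (Ornstein-Uhlenbeck) light curve in magnitudes.
    Exact discrete update: decay toward the mean plus a Gaussian innovation."""
    rng = np.random.default_rng(seed)
    decay = np.exp(-dt_days / tau_days)
    m = np.empty(n_steps)
    m[0] = mean_mag
    for i in range(1, n_steps):
        innovation = rng.normal(0.0, sigma_mag * np.sqrt(1.0 - decay ** 2))
        m[i] = mean_mag + decay * (m[i - 1] - mean_mag) + innovation
    return m

lc = simulate_drw(n_steps=2000, dt_days=2.0, tau_days=300.0,
                  sigma_mag=0.15, mean_mag=17.0)
print(lc.min(), lc.max())
```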

  15. A dynamic model of renal blood flow autoregulation

    DEFF Research Database (Denmark)

    Holstein-Rathlou, N H; Marsh, D J

    1994-01-01

    To test whether a mathematical model combining dynamic models of the tubuloglomerular feedback (TGF) mechanism and the myogenic mechanism was sufficient to explain dynamic autoregulation of renal blood flow, we compared model simulations with experimental data. To assess the dynamic characteristics...... of renal autoregulation, a broad band perturbation of the arterial pressure was employed in both the simulations and the experiments. Renal blood flow and tubular pressure were used as response variables in the comparison. To better approximate the situation in vivo where a large number of individual...... data, which shows a unimodal curve for the admittance phase. The ability of the model to reproduce the experimental data supports the hypothesis that dynamic autoregulation of renal blood flow is due to the combined action of TGF and the myogenic response....

  16. An improved cost-effective, reproducible method for evaluation of bone loss in a rodent model.

    Science.gov (United States)

    Fine, Daniel H; Schreiner, Helen; Nasri-Heir, Cibele; Greenberg, Barbara; Jiang, Shuying; Markowitz, Kenneth; Furgang, David

    2009-02-01

    This study was designed to investigate the utility of two "new" definitions for assessment of bone loss in a rodent model of periodontitis. Eighteen rats were divided into three groups. Group 1 was infected by Aggregatibacter actinomycetemcomitans (Aa), group 2 was infected with an Aa leukotoxin knock-out, and group 3 received no Aa (controls). Microbial sampling and antibody titres were determined. Initially, two examiners measured the distance from the cemento-enamel-junction to alveolar bone crest using the three following methods; (1) total area of bone loss by radiograph, (2) linear bone loss by radiograph, (3) a direct visual measurement (DVM) of horizontal bone loss. Two "new" definitions were adopted; (1) any site in infected animals showing bone loss >2 standard deviations above the mean seen at that site in control animals was recorded as bone loss, (2) any animal with two or more sites in any quadrant affected by bone loss was considered as diseased. Using the "new" definitions both evaluators independently found that infected animals had significantly more disease than controls (DVM system; p<0.05). The DVM method provides a simple, cost effective, and reproducible method for studying periodontal disease in rodents.
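
    The two definitions can be expressed directly as code; a minimal sketch with hypothetical measurements (the site layout and values are invented for illustration).

```python
import numpy as np

def site_bone_loss(infected_mm, control_mean_mm, control_sd_mm):
    """Definition 1: a site shows bone loss if the CEJ-to-crest distance in an
    infected animal exceeds the control mean at that site by more than 2 SD."""
    return infected_mm > control_mean_mm + 2.0 * control_sd_mm

def animal_diseased(site_flags_by_quadrant):
    """Definition 2: an animal is diseased if any quadrant has >= 2 affected sites."""
    return any(int(np.sum(flags)) >= 2 for flags in site_flags_by_quadrant)

# Hypothetical data for one infected animal, one quadrant of four sites (mm).
control_mean = np.array([0.9, 1.0, 1.1, 1.0])
control_sd = np.array([0.1, 0.1, 0.1, 0.1])
infected = np.array([1.3, 1.0, 1.4, 1.1])
flags = site_bone_loss(infected, control_mean, control_sd)
print(animal_diseased([flags]))   # True: two sites exceed the 2-SD threshold
```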

  17. Modeling of leachable {sup 137}Cs in throughfall and stemflow for Japanese forest canopies after Fukushima Daiichi Nuclear Power Plant accident

    Energy Technology Data Exchange (ETDEWEB)

    Loffredo, Nicolas, E-mail: wataiso@free.fr [Center for Research in Isotopes and Environmental Dynamics, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8572 (Japan); Onda, Yuichi [Center for Research in Isotopes and Environmental Dynamics, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8572 (Japan); Kawamori, Ayumi [Graduate School of Life and Environmental Sciences, University of Tsukuba (Japan); Kato, Hiroaki [Center for Research in Isotopes and Environmental Dynamics, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8572 (Japan)

    2014-09-15

    The Fukushima accident dispersed significant amounts of radioactive cesium (Cs) in the landscape. Our research investigated, from June 2011 to November 2013, the mobility of leachable Cs in forests canopies. In particular, {sup 137}Cs and {sup 134}Cs activity concentrations were measured in rainfall, throughfall, and stemflow in broad-leaf and cedar forests in an area located 40 km from the power plant. Leachable {sup 137}Cs loss was modeled by a double exponential (DE) model. This model could not reproduce the variation in activity concentration observed. In order to refine the DE model, the main physical measurable parameters (rainfall intensity, wind velocity, and snowfall occurrence) were assessed, and rainfall was identified as the dominant factor controlling observed variation. A corrective factor was then developed to incorporate rainfall intensity in an improved DE model. With the original DE model, we estimated total {sup 137}Cs loss by leaching from canopies to be 72 ± 4%, 67 ± 4%, and 48 ± 2% of the total plume deposition under mature cedar, young cedar, and broad-leaf forests, respectively. In contrast, with the improved DE model, the total {sup 137}Cs loss by leaching was estimated to be 34 ± 2%, 34 ± 2%, and 16 ± 1% of the total plume deposition under mature cedar, young cedar, and broad-leaf forests, respectively. The improved DE model corresponds better to observed data in literature. Understanding {sup 137}Cs and {sup 134}Cs forest dynamics is important for forecasting future contamination of forest soils around the FDNPP. It also provides a basis for understanding forest transfers in future potential nuclear disasters. - Highlights: • A double exponential model was used to model leachable cesium loss from canopies. • The model could not reproduce variation observed. • Rainfall was identified as the dominant factor controlling the variation. • A rainfall parameter was used to develop an improved double exponential model. • The

  18. Periodically Collapsing Bubbles in Stock Prices Cointegrated with Broad Dividends and Macroeconomic Factors

    Directory of Open Access Journals (Sweden)

    Man Fu

    2011-12-01

    We study fluctuations in stock prices using a framework derived from the present value model augmented with a macroeconomic factor. The fundamental value is derived as the expected present discounted value of broad dividends that include, in addition to traditional cash dividends, other payouts to shareholders. A stochastic discount factor motivated by the consumption-based asset pricing model is utilized. A single macroeconomic factor, namely the output gap, determines the non-fundamental component of stock prices. A resulting trivariate Vector Autoregression (TVAR) model of stock prices, broad dividends, and the output gap shows evidence of cointegration in the DJIA and S&P 500 index data. Nonetheless, a sup augmented Dickey-Fuller test reveals the existence of periodically collapsing bubbles in S&P 500 data during the late 1990s.
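
    A sup augmented Dickey-Fuller (SADF) statistic of the kind mentioned above can be sketched as the maximum ADF statistic over forward-expanding windows; the minimum window length, the simulated series, and the use of statsmodels' adfuller are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

def sup_adf(series, min_window=40):
    """Sup (forward-expanding-window) ADF statistic, the kind of right-tailed
    statistic used to detect periodically collapsing bubbles."""
    x = np.asarray(series, dtype=float)
    stats_ = []
    for end in range(min_window, len(x) + 1):
        adf_stat = adfuller(x[:end], regression="c", autolag="AIC")[0]
        stats_.append(adf_stat)
    return max(stats_)

# Hypothetical log price index with a bubble-like run-up in the middle.
rng = np.random.default_rng(1)
p = np.cumsum(rng.normal(0.0, 0.01, 300))
p[150:200] += 0.02 * np.arange(50)
print(sup_adf(p))
```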

  19. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

    The aim of this study was to measure validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages including; energy drinks; soft drinks/soda; coffee and tea and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then, a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the C-FFQ was compared to itself and showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.

  20. How well do CMIP5 Climate Models Reproduce the Hydrologic Cycle of the Colorado River Basin?

    Science.gov (United States)

    Gautam, J.; Mascaro, G.

    2017-12-01

    The Colorado River, which is the primary source of water for nearly 40 million people in the arid Southwestern states of the United States, has been experiencing an extended drought since 2000, which has led to a significant reduction in water supply. As the water demands increase, one of the major challenges for water management in the region has been the quantification of uncertainties associated with streamflow predictions in the Colorado River Basin (CRB) under potential changes of future climate. Hence, testing the reliability of model predictions in the CRB is critical in addressing this challenge. In this study, we evaluated the performances of 17 General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase Five (CMIP5) and 4 Regional Climate Models (RCMs) in reproducing the statistical properties of the hydrologic cycle in the CRB. We evaluated the water balance components at four nested sub-basins along with the inter-annual and intra-annual changes of precipitation (P), evaporation (E), runoff (R) and temperature (T) from 1979 to 2005. Most of the models captured the net water balance fairly well in the most-upstream basin but simulated a weak hydrological cycle in the evaporation channel at the downstream locations. The simulated monthly variability of P had different patterns, with correlation coefficients ranging from -0.6 to 0.8 depending on the sub-basin and the models from same parent institution clustering together. Apart from the most-upstream sub-basin where the models were mainly characterized by a negative seasonal bias in SON (of up to -50%), most of them had a positive bias in all seasons (of up to +260%) in the other three sub-basins. The models, however, captured the monthly variability of T well at all sites with small inter-model variabilities and a relatively similar range of bias (-7 °C to +5 °C) across all seasons. Mann-Kendall test was applied to the annual P and T time-series where majority of the models
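
    The Mann-Kendall trend test applied to the annual series can be written compactly; a minimal sketch (no tie correction) with hypothetical data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Non-parametric Mann-Kendall trend test (no tie correction):
    returns the S statistic and a two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2.0 * (1.0 - norm.cdf(abs(z)))

# Hypothetical annual-mean temperature series for one sub-basin (degC).
t_annual = np.array([8.1, 8.3, 8.2, 8.5, 8.4, 8.7, 8.6, 8.9, 9.0, 9.1])
print(mann_kendall(t_annual))
```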

  1. Broad Prize: Do the Successes Spread?

    Science.gov (United States)

    Samuels, Christina A.

    2011-01-01

    When the Broad Prize for Urban Education was created in 2002, billionaire philanthropist Eli Broad said he hoped the awards, in addition to rewarding high-performing school districts, would foster healthy competition; boost the prestige of urban education, long viewed as dysfunctional; and showcase best practices. Over the 10 years the prize has…

  2. Application of a process-based shallow landslide hazard model over a broad area in Central Italy

    Science.gov (United States)

    Gioia, Eleonora; Speranza, Gabriella; Ferretti, Maurizio; Godt, Jonathan W.; Baum, Rex L.; Marincioni, Fausto

    2015-01-01

    Process-based models are widely used for rainfall-induced shallow landslide forecasting. Previous studies have successfully applied the U.S. Geological Survey’s Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability (TRIGRS) model (Baum et al. 2002) to compute infiltration-driven changes in the hillslopes’ factor of safety on small scales (i.e., tens of square kilometers). Soil data input for such models are difficult to obtain across larger regions. This work describes a novel methodology for the application of TRIGRS over broad areas with relatively uniform hydrogeological properties. The study area is a 550-km2 region in Central Italy covered by post-orogenic Quaternary sediments. Due to the lack of field data, we assigned mechanical and hydrological property values through a statistical analysis based on literature review of soils matching the local lithologies. We calibrated the model using rainfall data from 25 historical rainfall events that triggered landslides. We compared the variation of pressure head and factor of safety with the landslide occurrence to identify the best fitting input conditions. Using calibrated inputs and a soil depth model, we ran TRIGRS for the study area. Receiver operating characteristic (ROC) analysis, comparing the model’s output with a shallow landslide inventory, shows that TRIGRS effectively simulated the instability conditions in the post-orogenic complex during historical rainfall scenarios. The implication of this work is that rainfall-induced landslides over large regions may be predicted by a deterministic model, even where data on geotechnical and hydraulic properties as well as temporal changes in topography or subsurface conditions are not available.
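
    TRIGRS couples a transient infiltration solution to an infinite-slope stability calculation; the stability part reduces to the standard infinite-slope factor-of-safety expression sketched below (a minimal illustration with hypothetical soil parameters, not the calibrated inputs of the study).

```python
import numpy as np

def factor_of_safety(slope_deg, depth_m, psi_m, cohesion_pa,
                     phi_deg, gamma_soil=20e3, gamma_water=9.81e3):
    """Infinite-slope factor of safety with pressure head psi (m) at depth Z:
        FS = tan(phi)/tan(delta) + (c - psi*gamma_w*tan(phi)) /
             (gamma_s * Z * sin(delta) * cos(delta))
    FS < 1 indicates predicted instability."""
    delta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    return (np.tan(phi) / np.tan(delta)
            + (cohesion_pa - psi_m * gamma_water * np.tan(phi))
            / (gamma_soil * depth_m * np.sin(delta) * np.cos(delta)))

# Hypothetical hillslope: 35 degrees, 1.5 m soil, pressure head rising after rainfall.
for psi in (0.0, 0.5, 1.0):
    print(psi, round(float(factor_of_safety(35, 1.5, psi, cohesion_pa=5e3, phi_deg=30)), 2))
```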

  3. SU-F-BRD-15: Quality Correction Factors in Scanned Or Broad Proton Therapy Beams Are Indistinguishable

    International Nuclear Information System (INIS)

    Sorriaux, J; Lee, J; Testa, M; Paganetti, H; Bertrand, D; Orban de Xivry, J; Palmans, H; Vynckier, S; Sterpin, E

    2015-01-01

    Purpose: The IAEA TRS-398 code of practice details the reference conditions for reference dosimetry of proton beams using ionization chambers and the required beam quality correction factors (kQ). Pencil beam scanning (PBS) requires multiple spots to reproduce the reference conditions. The objective is to demonstrate, using Monte Carlo (MC) calculations, that kQ factors for broad beams can be used for scanned beams under the same reference conditions with no significant additional uncertainty. We consider hereafter the general Alfonso formalism (Alfonso et al, 2008) for non-standard beam. Methods: To approach the reference conditions and the associated dose distributions, PBS must combine many pencil beams with range modulation and shaping techniques different than those used in passive systems (broad beams). This might lead to a different energy spectrum at the measurement point. In order to evaluate the impact of these differences on kQ factors, ion chamber responses are computed with MC (Geant4 9.6) in a dedicated scanned pencil beam (Q-pcsr) producing a 10×10cm2 composite field with a flat dose distribution from 10 to 16 cm depth. Ion chamber responses are also computed by MC in a broad beam with quality Q-ds (double scattering). The dose distribution of Q -pcsr matches the dose distribution of Q-ds. k-(Q-pcsr,Q-ds) is computed for a 2×2×0.2cm 3 idealized air cavity and a realistic plane-parallel ion chamber (IC). Results: Under reference conditions, quality correction factors for a scanned composite field versus a broad beam are the same for air cavity dose response, k-(Q-pcsr,Q-ds) =1.001±0.001 and for a Roos IC, k-(Q-pcsr,Q-ds) =0.999±0.005. Conclusion: Quality correction factors for ion chamber response in scanned and broad proton therapy beams are identical under reference conditions within the calculation uncertainties. The results indicate that quality correction factors published in IAEA TRS-398 can be used for scanned beams in the SOBP of a high

  4. SU-F-BRD-15: Quality Correction Factors in Scanned Or Broad Proton Therapy Beams Are Indistinguishable

    Energy Technology Data Exchange (ETDEWEB)

    Sorriaux, J; Lee, J [Molecular Imaging Radiotherapy & Oncology, Universite Catholique de Louvain, Brussels (Belgium); ICTEAM Institute, Universite catholique de Louvain, Louvain-la-Neuve (Belgium); Testa, M; Paganetti, H [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA 02114, Massachusetts (United States); Bertrand, D; Orban de Xivry, J [Ion Beam Applications, Louvain-la-neuve, Brabant Wallon (Belgium); Palmans, H [EBG MedAustron GmbH, Wiener Neustadt (Austria); National Physical Laboratory, Teddington (United Kingdom); Vynckier, S [Cliniques Universitaires Saint-Luc, Brussels (Belgium); Sterpin, E [Molecular Imaging Radiotherapy & Oncology, Universite Catholique de Louvain, Brussels (Belgium)

    2015-06-15

    Purpose: The IAEA TRS-398 code of practice details the reference conditions for reference dosimetry of proton beams using ionization chambers and the required beam quality correction factors (kQ). Pencil beam scanning (PBS) requires multiple spots to reproduce the reference conditions. The objective is to demonstrate, using Monte Carlo (MC) calculations, that kQ factors for broad beams can be used for scanned beams under the same reference conditions with no significant additional uncertainty. We consider hereafter the general Alfonso formalism (Alfonso et al, 2008) for non-standard beam. Methods: To approach the reference conditions and the associated dose distributions, PBS must combine many pencil beams with range modulation and shaping techniques different than those used in passive systems (broad beams). This might lead to a different energy spectrum at the measurement point. In order to evaluate the impact of these differences on kQ factors, ion chamber responses are computed with MC (Geant4 9.6) in a dedicated scanned pencil beam (Q-pcsr) producing a 10×10cm2 composite field with a flat dose distribution from 10 to 16 cm depth. Ion chamber responses are also computed by MC in a broad beam with quality Q-ds (double scattering). The dose distribution of Q -pcsr matches the dose distribution of Q-ds. k-(Q-pcsr,Q-ds) is computed for a 2×2×0.2cm{sup 3} idealized air cavity and a realistic plane-parallel ion chamber (IC). Results: Under reference conditions, quality correction factors for a scanned composite field versus a broad beam are the same for air cavity dose response, k-(Q-pcsr,Q-ds) =1.001±0.001 and for a Roos IC, k-(Q-pcsr,Q-ds) =0.999±0.005. Conclusion: Quality correction factors for ion chamber response in scanned and broad proton therapy beams are identical under reference conditions within the calculation uncertainties. The results indicate that quality correction factors published in IAEA TRS-398 can be used for scanned beams in the SOBP of a

  5. 3D-modeling of the spine using EOS imaging system: Inter-reader reproducibility and reliability.

    Directory of Open Access Journals (Sweden)

    Johannes Rehm

    To retrospectively assess the interreader reproducibility and reliability of EOS 3D full spine reconstructions in patients with adolescent idiopathic scoliosis (AIS). 73 patients with a mean age of 17 years and a moderate AIS (median Cobb angle 18.2°) obtained low-dose standing biplanar radiographs with EOS. Two independent readers performed "full spine" 3D reconstructions of the spine with the "full-spine" method, adjusting the bone contour of every thoracic and lumbar vertebra (Th1-L5). Interreader reproducibility was assessed regarding rotation of every single vertebra in the coronal (i.e., frontal), sagittal (i.e., lateral), and axial planes, T1/T12 kyphosis, T4/T12 kyphosis, L1/L5 lordosis, L1/S1 lordosis, and pelvic parameters. Radiation exposure, scan time, and 3D reconstruction time were recorded. Intraclass correlation coefficients (ICC) ranged between 0.83 and 0.98 for frontal vertebral rotation, between 0.94 and 0.99 for lateral vertebral rotation, and between 0.51 and 0.88 for axial vertebral rotation. ICC was 0.92 for T1/T12 kyphosis, 0.95 for T4/T12 kyphosis, 0.90 for L1/L5 lordosis, 0.85 for L1/S1 lordosis, 0.97 for pelvic incidence, 0.96 for sacral slope, 0.98 for sagittal pelvic tilt and 0.94 for lateral pelvic tilt. The mean time for reconstruction was 14.9 minutes (reader 1: 14.6 minutes, reader 2: 15.2 minutes; p<0.0001). The mean total absorbed dose was 593.4 μGy ± 212.3 per patient. EOS "full spine" 3D angle measurement of vertebral rotation proved to be reliable and was performed in an acceptable reconstruction time. Interreader reproducibility of axial rotation was limited to some degree in the upper and middle thoracic spine due to the obtuse angulation of the pedicles and the processi spinosi in the frontal view somewhat complicating their delineation.
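
    Interreader agreement of the kind reported above is commonly quantified with an intraclass correlation coefficient; the sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single measurement) for one hypothetical parameter, on the assumption that a formulation of this type underlies the reported values.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.
    `data` is an (n subjects x k readers) array for one parameter,
    e.g. T4/T12 kyphosis measured by both readers."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)   # between readers
    ss_err = np.sum((data - grand) ** 2) - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical kyphosis angles (degrees) for 5 patients, 2 readers.
angles = np.array([[32.1, 31.5], [28.4, 29.0], [40.2, 39.8], [25.0, 26.1], [35.3, 34.9]])
print(round(float(icc_2_1(angles)), 3))
```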

  6. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    Science.gov (United States)

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.

  7. Reproducible and controllable induction voltage adder for scaled beam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko [Department of Energy Sciences, Tokyo Institute of Technology, 4259 Nagatsuta, Midori-ku, Yokohama 226-8502 (Japan)

    2016-08-15

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  8. Estimating carbon dioxide fluxes from temperate mountain grasslands using broad-band vegetation indices

    Directory of Open Access Journals (Sweden)

    G. Wohlfahrt

    2010-02-01

    The broad-band normalised difference vegetation index (NDVI) and the simple ratio (SR) were calculated from measurements of reflectance of photosynthetically active and short-wave radiation at two temperate mountain grasslands in Austria and related to the net ecosystem CO2 exchange (NEE) measured concurrently by means of the eddy covariance method. There was no significant statistical difference between the relationships of midday mean NEE with narrow- and broad-band NDVI and SR, measured during and calculated for that same time window, respectively. The skill of broad-band NDVI and SR in predicting CO2 fluxes was higher for metrics dominated by gross photosynthesis and lowest for ecosystem respiration, with NEE in between. A method based on a simple light response model, whose parameters were parameterised based on broad-band NDVI, made it possible to improve predictions of daily NEE and is suggested to hold promise for filling gaps in the NEE time series. Relationships of CO2 flux metrics with broad-band NDVI and SR, however, generally differed between the two studied grassland sites, indicating an influence of additional factors not yet accounted for.
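
    A hedged sketch of the two ingredients described above: a broad-band NDVI computed from incoming and reflected short-wave and photosynthetically active radiation, and a rectangular-hyperbola light response whose capacity is tied to NDVI. The reflectance derivation, the coupling, and all coefficients are assumptions for illustration, not the parameterisation used by the authors.

```python
import numpy as np

def broadband_ndvi(sw_in, sw_out, par_in, par_out):
    """Broad-band NDVI from hemispherical radiometer data: NIR reflectance is
    approximated from the short-wave minus PAR bands, red/PAR reflectance from PAR."""
    rho_par = par_out / par_in
    rho_nir = (sw_out - par_out) / (sw_in - par_in)
    return (rho_nir - rho_par) / (rho_nir + rho_par)

def nee_light_response(ppfd, ndvi, a=0.03, beta_max=25.0, reco=4.0):
    """Rectangular-hyperbola light response for NEE (umol CO2 m-2 s-1), with the
    maximum gross uptake scaled by NDVI (illustrative coupling, not the paper's fit).
    Sign convention: negative = net uptake."""
    beta = beta_max * ndvi
    gpp = a * ppfd * beta / (a * ppfd + beta)
    return -gpp + reco

ndvi = broadband_ndvi(sw_in=800.0, sw_out=160.0, par_in=380.0, par_out=30.0)
print(round(ndvi, 2), round(nee_light_response(ppfd=1500.0, ndvi=ndvi), 1))
```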

  9. A broad model for demand forecasting of gasoline and fuel alcohol; Um modelo abrangente para a projecao das demandas de gasolina e alcool carburante

    Energy Technology Data Exchange (ETDEWEB)

    Buonfiglio, Antonio [PETROBRAS, Paulinia, SP (Brazil). Dept. Industrial; Bajay, Sergio Valdir [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica

    1992-12-31

    Formulating a broad, mixed econometric/end-use demand forecasting model for gasoline and fuel alcohol is the main objective of this work. In the model, the gasoline and hydrated alcohol demands are calculated as the product of the corresponding fleet and the average annual car mileage, divided by the average specific mileage. Several simulations with the proposed forecasting model are carried out within the context of alternative scenarios for the development of these competing fuels in the Brazilian market. (author) 4 refs., 1 fig., 3 tabs.
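
    The demand identity stated above (fleet size times average annual mileage, divided by average specific mileage) can be written directly; the figures below are hypothetical.

```python
def fuel_demand_m3(fleet_size, annual_km_per_car, km_per_litre):
    """Annual fuel demand in m3/year: fleet times average annual mileage,
    divided by average specific mileage (fuel economy)."""
    litres = fleet_size * annual_km_per_car / km_per_litre
    return litres / 1000.0

# Hypothetical fleets and mileages, for illustration only.
print(f"gasoline: {fuel_demand_m3(10_000_000, 12_000, 10.0):,.0f} m3/yr")
print(f"hydrated alcohol: {fuel_demand_m3(3_000_000, 12_000, 7.0):,.0f} m3/yr")
```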

  10. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer....... The test procedure combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin irritant substance or product is included in each test as a positive control...... high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated twice, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility.

  11. Reproducibility of tumor uptake heterogeneity characterization through textural feature analysis in 18F-FDG PET.

    Science.gov (United States)

    Tixier, Florent; Hatt, Mathieu; Le Rest, Catherine Cheze; Le Pogam, Adrien; Corcos, Laurent; Visvikis, Dimitris

    2012-05-01

    (18)F-FDG PET measurement of standardized uptake value (SUV) is increasingly used for monitoring therapy response and predicting outcome. Alternative parameters computed through textural analysis were recently proposed to quantify the heterogeneity of tracer uptake by tumors as a significant predictor of response. The primary objective of this study was to evaluate the reproducibility of these heterogeneity measurements. Double baseline (18)F-FDG PET scans were acquired within 4 d of each other for 16 patients before any treatment was considered. A Bland-Altman analysis was performed on 8 parameters based on histogram measurements and 17 parameters based on textural heterogeneity features after discretization with values between 8 and 128. The reproducibility of maximum and mean SUV was similar to that in previously reported studies, with a mean percentage difference of 4.7% ± 19.5% and 5.5% ± 21.2%, respectively. By comparison, better reproducibility was measured for some textural features describing local heterogeneity of tracer uptake, such as entropy and homogeneity, with a mean percentage difference of -2% ± 5.4% and 1.8% ± 11.5%, respectively. Several regional heterogeneity parameters such as variability in the intensity and size of regions of homogeneous activity distribution had reproducibility similar to that of SUV measurements, with 95% confidence intervals of -22.5% to 3.1% and -1.1% to 23.5%, respectively. These parameters were largely insensitive to the discretization range. Several parameters derived from textural analysis describing heterogeneity of tracer uptake by tumors on local and regional scales had reproducibility similar to or better than that of simple SUV measurements. These reproducibility results suggest that these (18)F-FDG PET-derived parameters, which have already been shown to have predictive and prognostic value in certain cancer models, may be used to monitor therapy response and predict patient outcome.
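
    The test-retest analysis described above is essentially a Bland-Altman comparison of the two baseline scans; a minimal sketch with hypothetical entropy values.

```python
import numpy as np

def bland_altman_percent(scan1, scan2):
    """Test-retest reproducibility as a per-lesion percentage difference between
    two baseline scans, summarized as mean +/- SD and a 95% agreement interval."""
    scan1, scan2 = np.asarray(scan1, float), np.asarray(scan2, float)
    diff_pct = 100.0 * (scan2 - scan1) / ((scan1 + scan2) / 2.0)
    mean, sd = diff_pct.mean(), diff_pct.std(ddof=1)
    return mean, sd, (mean - 1.96 * sd, mean + 1.96 * sd)

# Hypothetical entropy values for 8 tumors on the two baseline scans.
e1 = np.array([6.1, 5.8, 6.4, 5.9, 6.0, 6.3, 5.7, 6.2])
e2 = np.array([6.0, 5.9, 6.3, 6.0, 5.9, 6.4, 5.8, 6.1])
print(bland_altman_percent(e1, e2))
```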

  12. Can CFMIP2 models reproduce the leading modes of cloud vertical structure in the CALIPSO-GOCCP observations?

    Science.gov (United States)

    Wang, Fang; Yang, Song

    2018-02-01

    Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP), i.e. tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds in middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclonic clouds and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models of phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by CFMIP2 models. The effects of CVS modes on relative cloud radiative forcing (RSCRF/RLCRF) (RSCRF being calculated at the surface while RLCRF at the top of the atmosphere) are studied in terms of the principal component regression method. Results show that CFMIP2 models tend to overestimate (underestimate or simulate with the opposite sign) the RSCRF/RLCRF radiative effects (REs) of ECCM (THCM and SACM) per unit global mean OPC compared to observations. These RE biases may be attributed to two factors: one is underestimation (overestimation) of low/middle clouds (high clouds), i.e. stronger (weaker) REs per unit of low/middle (high) cloud, in the simulated global mean cloud profiles; the other is eigenvector biases in the CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to improvement of CVS, especially cloud parameterization associated with particular physical processes (e.g. downwelling regimes with the Hadley circulation, extratropical storm tracks and others), which

  13. Efficient and reproducible identification of mismatch repair deficient colon cancer

    DEFF Research Database (Denmark)

    Joost, Patrick; Bendahl, Pär-Ola; Halvarsson, Britta

    2013-01-01

    BACKGROUND: The identification of mismatch-repair (MMR) defective colon cancer is clinically relevant for diagnostic, prognostic and potentially also for treatment predictive purposes. Preselection of tumors for MMR analysis can be obtained with predictive models, which need to demonstrate ease...... of application and favorable reproducibility. METHODS: We validated the MMR index for the identification of prognostically favorable MMR deficient colon cancers and compared performance to 5 other prediction models. In total, 474 colon cancers diagnosed ≥ age 50 were evaluated with correlation between...... clinicopathologic variables and immunohistochemical MMR protein expression. RESULTS: Female sex, age ≥60 years, proximal tumor location, expanding growth pattern, lack of dirty necrosis, mucinous differentiation and presence of tumor-infiltrating lymphocytes significantly correlated with MMR deficiency. Presence...

  14. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. 
Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel|info:eu-repo/dai/nl/407629971; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  15. Reproducibility in the analysis of multigated radionuclide studies of left ventricular ejection fraction

    International Nuclear Information System (INIS)

    Gjorup, T.; Kelbaek, H.; Vestergaard, B.; Fogh, J.; Munck, O.; Jensen, A.M.

    1989-01-01

    The authors determined the reproducibility (the standard deviation [SD]) in the analysis of multigated radionuclide studies of left ventricular ejection fraction (LVEF). Radionuclide studies from a consecutive series of 38 patients suspected of ischemic heart disease were analyzed independently by four nuclear medicine physicians and four laboratory technicians. Each study was analyzed three times by each of the observers. Based on the analyses of the eight observers, the SD could be estimated by the use of a variance component model for LVEF determinations calculated as the average of the analyses of an arbitrary number of observers making an arbitrary number of analyses. This study presents the SDs for LVEF determinations based on the analyses of one to five observers making one to five analyses each. The SD of an LVEF determination decreased from 3.96% to 2.98% when an observer increased his number of analyses from one to five. A more pronounced decrease in the SD, from 3.96% to 1.77%, was obtained when the LVEF determinations were based on the average of a single analysis made by one to five observers. However, when dealing with the difference between LVEF determinations from two studies, the highest reproducibility was obtained if the LVEF determinations at both studies were based on the analyses made by the same observer. No significant difference was found in the reproducibility of analyses made by nuclear medicine physicians and laboratory technicians. Our study revealed that to increase the reproducibility of LVEF determinations, special efforts should be made to standardize the outlining of the end-systolic region of interest.
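
    Under a two-level variance component model (between-observer plus within-observer variation), the SD of an averaged LVEF determination follows directly; a minimal sketch in which the variance components are illustrative values chosen to be broadly consistent with the SDs quoted above, not figures taken from the paper.

```python
import numpy as np

def sd_of_mean_lvef(sd_between_observers, sd_within_observer, n_observers, n_analyses):
    """SD of an LVEF value obtained by averaging the analyses of `n_observers`
    observers, each analysing the study `n_analyses` times, under a two-level
    variance component model (between-observer + within-observer/residual)."""
    var = (sd_between_observers ** 2 / n_observers
           + sd_within_observer ** 2 / (n_observers * n_analyses))
    return np.sqrt(var)

# Illustrative variance components (percentage points of LVEF).
sd_b, sd_w = 2.7, 2.9
for n_obs, n_ana in [(1, 1), (1, 5), (5, 1)]:
    print(n_obs, n_ana, round(float(sd_of_mean_lvef(sd_b, sd_w, n_obs, n_ana)), 2))
```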

  16. Reproducibility problems of in-service ultrasonic testing results

    International Nuclear Information System (INIS)

    Honcu, E.

    1974-01-01

    The reproducibility of the results of ultrasonic testing is the basic precondition for its successful application in in-service inspection of changes in the quality of components of nuclear power installations. The results of periodic ultrasonic inspections are not satisfactory from the point of view of reproducibility. Regardless, the ultrasonic pulse-type method is suitable for evaluating the quality of most components of nuclear installations and often the sole method which may be recommended for inspection with regard to its technical and economic aspects. (J.B.)

  17. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.

    Science.gov (United States)

    Añel, Juan A.

    2017-03-01

    Nowadays, the majority of the scientific community is not aware of the risks and problems associated with inadequate use of computer systems for research, particularly for the reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and by insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright laws can have undesirable effects on access to aspects of great importance in the design of experiments, and therefore on the interpretation of results. Plain Language Summary: This article highlights several important issues to ensure the scientific reproducibility of results within the current scientific framework, going beyond simple documentation. Several specific examples are discussed in the field of hydrological modeling.

  18. Effective Form of Reproducing the Total Financial Potential of Ukraine

    Directory of Open Access Journals (Sweden)

    Portna Oksana V.

    2015-03-01

    Development of scientific principles for reproducing the total financial potential of the country, and of its effective form, is an urgent problem in both the theoretical and practical aspects of the study; its solution is intended to ensure the active mobilization and effective use of the total financial potential of Ukraine and, as a result, its expanded reproduction as well, which would contribute to realizing the internal capacities for stabilization of the national economy. The purpose of the article is to disclose the essence of the effective form of reproducing the total financial potential of the country and to analyze the results of reproducing the total financial potential of Ukraine. It has been proved that the basis for the effective form of reproducing the total financial potential of the country is the volume and flow of resources that are associated with the «real» economy and that affect and define the dynamics of GDP, i.e. the resource and process forms of reproducing the total financial potential of Ukraine (which precede the effective one). The analysis of reproducing the total financial potential of Ukraine has shown that in the analyzed period there was an increase in the financial possibilities of the country, but a steady dynamic of reduction of the total financial potential was observed. If only the resources involved in production, creating net value added and GDP, are considered, this reproduction occurs on a restricted basis. Growth of the total financial potential of Ukraine is connected only with extensive quantitative factors rather than intensive qualitative changes.

  19. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  20. Mate-finding as an overlooked critical determinant of dispersal variation in sexually-reproducing animals.

    Science.gov (United States)

    Gilroy, James J; Lockwood, Julie L

    2012-01-01

    Dispersal is a critically important process in ecology, but robust predictive models of animal dispersal remain elusive. We identify a potentially ubiquitous component of variation in animal dispersal that has been largely overlooked until now: the influence of mate encounters on settlement probability. We use an individual-based model to simulate dispersal in sexually-reproducing organisms that follow a simple set of movement rules based on conspecific encounters, within an environment lacking spatial habitat heterogeneity. We show that dispersal distances vary dramatically with fluctuations in population density in such a model, even in the absence of variation in dispersive traits between individuals. In a simple random-walk model with promiscuous mating, dispersal distributions become increasingly 'fat-tailed' at low population densities due to the increasing scarcity of mates. Similar variation arises in models incorporating territoriality. In a model with polygynous mating, we show that patterns of sex-biased dispersal can even be reversed across a gradient of population density, despite underlying dispersal mechanisms remaining unchanged. We show that some widespread dispersal patterns found in nature (e.g. fat tailed distributions) can arise as a result of demographic variability in the absence of heterogeneity in dispersive traits across the population. This implies that models in which individual dispersal distances are considered to be fixed traits might be unrealistic, as dispersal distances vary widely under a single dispersal mechanism when settlement is influenced by mate encounters. Mechanistic models offer a promising means of advancing our understanding of dispersal in sexually-reproducing organisms.
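
    A toy version of the settlement rule, settle only after a mate encounter, on a featureless plane illustrates why dispersal distances lengthen at low density; this is a sketch of the idea, not the authors' individual-based model, and the encounter probability and parameters are assumptions.

```python
import numpy as np

def disperse_until_mate(density_per_cell, encounter_radius=1.0, max_steps=10_000, seed=None):
    """Minimal 2-D random-walk disperser that settles only after encountering a
    potential mate; lower density therefore yields longer, fatter-tailed distances."""
    rng = np.random.default_rng(seed)
    pos = np.zeros(2)
    for _ in range(max_steps):
        pos += rng.normal(0.0, 1.0, size=2)                 # one movement step
        # Probability that at least one mate lies within the encounter radius,
        # assuming a Poisson-distributed local population.
        p_meet = 1.0 - np.exp(-density_per_cell * np.pi * encounter_radius ** 2)
        if rng.random() < p_meet:
            break                                           # mate found: settle here
    return float(np.hypot(*pos))                            # net dispersal distance

for dens in (0.2, 0.02):
    d = [disperse_until_mate(dens, seed=i) for i in range(500)]
    print(dens, round(float(np.mean(d)), 1), round(float(np.percentile(d, 95)), 1))
```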

  1. Energy determines broad pattern of plant distribution in Western Himalaya.

    Science.gov (United States)

    Panda, Rajendra M; Behera, Mukunda Dev; Roy, Partha S; Biradar, Chandrashekhar

    2017-12-01

    Several factors describe the broad pattern of diversity in plant species distribution. We explore these determinants of species richness in the Western Himalayas using high-resolution species data available for the area, relating richness to energy, water, physiography and anthropogenic disturbance. The floral data comprise 1279 species from 1178 spatial locations and 738 sample plots of a national database. We evaluated their correlation with eight environmental variables, selected on the basis of correlation coefficients and principal component loadings, using both linear (structural equation model) and nonlinear (generalised additive model) techniques. There were 645 genera and 176 families, including 815 herbs, 213 shrubs, 190 trees, and 61 lianas. The nonlinear model explained the maximum deviance of 67.4% and showed the dominant contribution of climate to species richness, with a 59% share. Energy variables (potential evapotranspiration and temperature seasonality) explained the deviance better than did water variables (aridity index and precipitation of the driest quarter). Temperature seasonality had the greatest impact on species richness. The structural equation model confirmed the results of the nonlinear model, but less efficiently. The mutual influences of the climatic variables were found to affect the predictions of the model significantly. To our knowledge, the 67.4% deviance explained in the species richness pattern is one of the highest values reported in mountain studies. Broadly, climate described by water-energy dynamics provides the best explanation for the species richness pattern. Both modeling approaches supported the same conclusion that energy is the best predictor of species richness. The dry and cold conditions of the region account for the dominant contribution of energy to species richness.
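
    The deviance-explained statistic quoted above can be illustrated with a much simpler stand-in model. The sketch below fits a Poisson GLM (not the paper's generalised additive model) to synthetic data; the predictor names are illustrative assumptions and the numbers carry no empirical meaning.

    # Sketch of "deviance explained" by climate predictors, using a Poisson GLM
    # in statsmodels as a simple stand-in for the paper's generalised additive
    # model. The data are synthetic; variable names are illustrative.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "pet": rng.normal(0, 1, n),               # potential evapotranspiration (scaled)
        "temp_seasonality": rng.normal(0, 1, n),  # temperature seasonality (scaled)
        "aridity": rng.normal(0, 1, n),           # aridity index (scaled)
    })
    # Synthetic richness dominated by the energy variables, as in the abstract.
    mu = np.exp(2.0 + 0.6 * df["pet"] - 0.8 * df["temp_seasonality"] + 0.2 * df["aridity"])
    df["richness"] = rng.poisson(mu)

    model = smf.glm("richness ~ pet + temp_seasonality + aridity",
                    data=df, family=sm.families.Poisson()).fit()
    deviance_explained = 1.0 - model.deviance / model.null_deviance
    print(f"Deviance explained: {100 * deviance_explained:.1f}%")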

  2. Theoretical Models of Optical Transients. I. A Broad Exploration of the Duration-Luminosity Phase Space

    Science.gov (United States)

    Villar, V. Ashley; Berger, Edo; Metzger, Brian D.; Guillochon, James

    2017-11-01

    The duration-luminosity phase space (DLPS) of optical transients is used, mostly heuristically, to compare various classes of transient events, to explore the origin of new transients, and to influence optical survey observing strategies. For example, several observational searches have been guided by intriguing voids and gaps in this phase space. However, we should ask, do we expect to find transients in these voids given our understanding of the various heating sources operating in astrophysical transients? In this work, we explore a broad range of theoretical models and empirical relations to generate optical light curves and to populate the DLPS. We explore transients powered by adiabatic expansion, radioactive decay, magnetar spin-down, and circumstellar interaction. For each heating source, we provide a concise summary of the basic physical processes, a physically motivated choice of model parameter ranges, an overall summary of the resulting light curves and their occupied range in the DLPS, and how the various model input parameters affect the light curves. We specifically explore the key voids discussed in the literature: the intermediate-luminosity gap between classical novae and supernovae, and short-duration transients (≲ 10 days). We find that few physical models lead to transients that occupy these voids. Moreover, we find that only relativistic expansion can produce fast and luminous transients, while for all other heating sources events with durations ≲ 10 days are dim (M_R ≳ -15 mag). Finally, we explore the detection potential of optical surveys (e.g., Large Synoptic Survey Telescope) in the DLPS and quantify the notion that short-duration and dim transients are exponentially more difficult to discover in untargeted surveys.
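
    As a hedged illustration of how a single heating source maps onto a point in the DLPS, the sketch below evolves a radioactive-decay-powered light curve through the standard one-zone diffusion approximation and reports its duration above half maximum and its peak bolometric magnitude. The decay constants and parameter choices are textbook values, not the paper's implementation.

    # Sketch: place a radioactive-decay-powered transient in the duration-
    # luminosity phase space using the standard one-zone diffusion solution,
    # L_out(t) = (2/t_d^2) exp(-t^2/t_d^2) * int_0^t L_in(t') exp(t'^2/t_d^2) t' dt'.
    # Not the paper's code; decay constants and parameters are illustrative.
    import numpy as np

    DAY = 86400.0
    M_SUN = 1.989e33          # g
    L_SUN = 3.828e33          # erg/s

    def nickel_heating(t, m_ni):
        """56Ni/56Co decay power (erg/s) for nickel mass m_ni in grams."""
        eps_ni, eps_co = 3.9e10, 6.78e9            # erg/s/g (commonly quoted values)
        tau_ni, tau_co = 8.8 * DAY, 111.3 * DAY
        return m_ni * (eps_ni * np.exp(-t / tau_ni)
                       + eps_co * (np.exp(-t / tau_co) - np.exp(-t / tau_ni)))

    def one_zone_light_curve(t_d, m_ni, t_max=300 * DAY, n=6000):
        t = np.linspace(1.0, t_max, n)
        integrand = nickel_heating(t, m_ni) * np.exp((t / t_d) ** 2) * t
        integral = np.concatenate(
            ([0.0], np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(t))))
        lum = (2.0 / t_d ** 2) * np.exp(-(t / t_d) ** 2) * integral
        return t, lum

    t, lum = one_zone_light_curve(t_d=20 * DAY, m_ni=0.1 * M_SUN)
    peak = lum.max()
    above_half = t[lum >= 0.5 * peak]
    duration_days = (above_half[-1] - above_half[0]) / DAY
    peak_mag = 4.74 - 2.5 * np.log10(peak / L_SUN)   # bolometric magnitude
    print(f"duration above half max: {duration_days:.1f} d, peak M_bol ~ {peak_mag:.1f}")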

  3. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests

  4. Using prediction markets to estimate the reproducibility of scientific research

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  5. Using prediction markets to estimate the reproducibility of scientific research.

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
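
    A small worked example of the Bayesian logic behind these statements, using illustrative values rather than the paper's estimates: with a low prior probability of a hypothesis being true, one significant result yields only a modest posterior, while a significant well-powered replication raises it substantially.

    # Worked example (illustrative values, not the paper's estimates).
    def posterior_true(prior, alpha=0.05, power=0.80):
        """P(hypothesis true | significant result) by Bayes' rule."""
        return prior * power / (prior * power + (1.0 - prior) * alpha)

    prior = 0.09                                  # median prior reported for psychology
    after_original = posterior_true(prior)
    after_replication = posterior_true(after_original, power=0.90)

    print(f"after original significant study: {after_original:.2f}")
    print(f"after significant, well-powered replication: {after_replication:.2f}")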

  6. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors […] O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.
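
    The reproducibility criterion used here, the coefficient of variation across the six replicate reactors, is simple to compute; the sketch below does so on synthetic placeholder values, not the study's measurements.

    # Coefficient of variation (CV) across six replicate reactors for a few
    # composting parameters. Numbers are synthetic placeholders.
    import numpy as np

    replicates = {
        "total_organic_matter_loss_pct": [44.1, 47.3, 45.8, 46.9, 45.0, 47.5],
        "hot_water_soluble_loss_pct":    [60.2, 63.5, 61.8, 62.9, 60.7, 63.1],
        "cellulose_like_loss_pct":       [48.4, 51.2, 49.9, 50.6, 49.1, 51.5],
    }

    for name, values in replicates.items():
        v = np.asarray(values, dtype=float)
        cv = 100.0 * v.std(ddof=1) / v.mean()
        print(f"{name}: mean={v.mean():.1f}%, CV={cv:.1f}%")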

  7. Reproducibility of biomarkers in induced sputum and in serum from chronic smokers.

    Science.gov (United States)

    Zuiker, Rob G J A; Kamerling, Ingrid M C; Morelli, Nicoletta; Calderon, Cesar; Boot, J Diderik; de Kam, Marieke; Diamant, Zuzana; Burggraaf, Jacobus; Cohen, Adam F

    2015-08-01

    Soluble inflammatory markers obtained from non-invasive airway sampling such as induced sputum may be useful biomarkers for targeted pharmaceutical interventions. However, before these soluble markers can be used as potential targets, their variability and reproducibility need to be established in distinct study populations. This study aimed to assess the reproducibility of biomarkers obtained from induced sputum and serum in chronic smokers and non-smokers. Sputum and serum samples were obtained from 16 healthy non-smokers and 16 asymptomatic chronic smokers (for both groups: 8M/8F, 30-52 years, FEV1 ≥80% pred.; ≥10 pack years for the smokers) on 2 separate visits 4-10 days apart. Soluble markers in serum and sputum were analysed by ELISA. The differences between smokers and non-smokers were analysed with a t-test, and variability was assessed on log-transformed data by a mixed model ANOVA. Analysable sputum samples could be obtained from all 32 subjects. In both study populations neutrophils and macrophages were the predominant cell types. Serum Pulmonary Surfactant Associated Protein D had favourable reproducibility criteria for reliability ratio (0.99), intra-subject coefficient of variation (11.2%) and the Bland-Altman limits of agreement. Furthermore, chronic smokers, compared to non-smokers, had significantly higher sputum concentrations of IL-8 (1094.6 pg/mL vs 460.8 pg/mL, p = 0.006), higher serum concentrations of Pulmonary Surfactant Associated Protein D (110.9 pg/mL vs 64.7 pg/mL, p = 0.019), and lower concentrations of Serum Amyloid A (1352.4 pg/mL vs 2297.5 pg/mL, p = 0.022). Serum Pulmonary Surfactant Associated Protein D proved to be a biomarker that fulfilled the criteria for reproducibility in both study groups. Copyright © 2015 Elsevier Ltd. All rights reserved.
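
    The reproducibility metrics named above (intra-subject coefficient of variation and Bland-Altman limits of agreement) can be computed as in the sketch below, which uses synthetic paired visits rather than the study data.

    # Visit-to-visit reproducibility metrics on synthetic paired measurements
    # (not the study data). Log-transformation is often used for biomarker
    # concentrations; the raw scale is kept here for brevity.
    import numpy as np

    rng = np.random.default_rng(7)
    true_level = rng.lognormal(mean=4.0, sigma=0.5, size=16)   # 16 subjects
    visit1 = true_level * rng.normal(1.0, 0.10, 16)
    visit2 = true_level * rng.normal(1.0, 0.10, 16)

    diff = visit1 - visit2
    mean_pair = (visit1 + visit2) / 2.0

    bias = diff.mean()
    loa_low = bias - 1.96 * diff.std(ddof=1)
    loa_high = bias + 1.96 * diff.std(ddof=1)
    within_subject_sd = np.sqrt(np.mean(diff ** 2) / 2.0)
    intra_subject_cv = 100.0 * within_subject_sd / mean_pair.mean()

    print(f"bias={bias:.1f}, limits of agreement=({loa_low:.1f}, {loa_high:.1f})")
    print(f"intra-subject CV={intra_subject_cv:.1f}%")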

  8. Determining the optimal number of independent components for reproducible transcriptomic data analysis.

    Science.gov (United States)

    Kairov, Ulykbek; Cantini, Laura; Greco, Alessandro; Molkenov, Askhat; Czerwinska, Urszula; Barillot, Emmanuel; Zinovyev, Andrei

    2017-09-11

    Independent Component Analysis (ICA) is a method that models gene expression data as an action of a set of statistically independent hidden factors. The output of ICA depends on a fundamental parameter: the number of components (factors) to compute. The optimal choice of this parameter, related to determining the effective data dimension, remains an open question in the application of blind source separation techniques to transcriptomic data. Here we address the question of optimizing the number of statistically independent components in the analysis of transcriptomic data for reproducibility of the components in multiple runs of ICA (within the same or within varying effective dimensions) and in multiple independent datasets. To this end, we introduce ranking of independent components based on their stability in multiple ICA computation runs and define a distinguished number of components (Most Stable Transcriptome Dimension, MSTD) corresponding to the point of the qualitative change of the stability profile. Based on a large body of data, we demonstrate that a sufficient number of dimensions is required for biological interpretability of the ICA decomposition and that the most stable components with ranks below MSTD have more chances to be reproduced in independent studies compared to the less stable ones. At the same time, we show that a transcriptomics dataset can be reduced to a relatively high number of dimensions without losing the interpretability of ICA, even though higher dimensions give rise to components driven by small gene sets. We suggest a protocol of ICA application to transcriptomics data with a possibility of prioritizing components with respect to their reproducibility that strengthens the biological interpretation. Computing too few components (much less than MSTD) is not optimal for interpretability of the results. The components ranked within MSTD range have more chances to be reproduced in independent studies.
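
    A simplified stand-in for the stability ranking described above: run FastICA several times with different random seeds and score each component of a reference run by its best absolute correlation with components from the other runs. The data are synthetic and the scoring is deliberately cruder than the authors' MSTD procedure.

    # Simplified stability ranking of ICA components (not the MSTD method).
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n_samples, n_genes, k = 300, 2000, 10
    sources = rng.laplace(size=(n_samples, 4))           # 4 "real" sparse factors
    mixing = rng.normal(size=(4, n_genes))
    X = sources @ mixing + 0.5 * rng.normal(size=(n_samples, n_genes))

    runs = []
    for seed in range(5):
        ica = FastICA(n_components=k, random_state=seed, max_iter=1000)
        runs.append(ica.fit_transform(X))                # (n_samples, k) component scores

    reference = runs[0]
    stability = np.zeros(k)
    for other in runs[1:]:
        corr = np.abs(np.corrcoef(reference.T, other.T)[:k, k:])  # k x k cross-correlations
        stability += corr.max(axis=1)                    # best match per reference component
    stability /= len(runs) - 1

    for rank, idx in enumerate(np.argsort(stability)[::-1], start=1):
        print(f"rank {rank}: component {idx}, stability {stability[idx]:.2f}")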

  9. Investigation of the Intra- and Interlaboratory Reproducibility of a Small Scale Standardized Supersaturation and Precipitation Method

    DEFF Research Database (Denmark)

    Plum, Jakob; Madsen, Cecilie M; Teleki, Alexandra

    2017-01-01

    ... compound available for absorption. However, due to the stochastic nature of nucleation, supersaturating drug delivery systems may lead to inter- and intrapersonal variability. The ability to define a feasible range with respect to the supersaturation level is a crucial factor for a successful formulation...... reproducibility study of felodipine was conducted, after which seven partners contributed with data for three model compounds; aprepitant, felodipine, and fenofibrate, to determine the interlaboratory reproducibility of the SSPM. The first part of the SSPM determines the apparent degrees of supersaturation (a...... order for the three model compounds using the SSPM (aprepitant > felodipine ≈ fenofibrate). The α-value is dependent on the experimental setup and can be used as a parameter to evaluate the uniformity of the data set. This study indicated that the SSPM was able to obtain the same rank order of the β...

  10. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method which evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which can evaluate esophageal function in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first one, the mean coefficients of variation in esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, this method also had good reproducibility, based on the results obtained on the 2 separate days. (author)

  11. Broad-band near-field ground motion simulations in 3-dimensional scattering media

    KAUST Repository

    Imperatori, W.; Mai, Paul Martin

    2012-01-01

    examine scattering phenomena, related to the loss of radiation pattern and the directivity breakdown. We first simulate broad-band ground motions for a point-source characterized by a classic ω2 spectrum model. Fault finiteness is then introduced by means

  12. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  13. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Full Text Available Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  14. Broad spectrum antiangiogenic treatment for ocular neovascular diseases.

    Directory of Open Access Journals (Sweden)

    Ofra Benny

    2010-09-01

    Full Text Available Pathological neovascularization is a hallmark of late-stage neovascular (wet) age-related macular degeneration (AMD) and the leading cause of blindness in people over the age of 50 in the western world. Treatments focus on suppression of choroidal neovascularization (CNV), while currently approved therapies are limited to inhibiting vascular endothelial growth factor (VEGF) exclusively. However, this treatment does not address the underlying cause of AMD, and the loss of VEGF's neuroprotective effect is a potential side effect. Therapy which targets the key processes in AMD (pathological neovascularization, vessel leakage and inflammation) could bring a major shift in the approach to disease treatment and prevention. In this study we demonstrated the efficacy of such broad-spectrum antiangiogenic therapy in a mouse model of AMD. Lodamin, a polymeric formulation of TNP-470, is a potent broad-spectrum antiangiogenic drug. Lodamin significantly reduced key processes involved in AMD progression, as demonstrated in mice and rats. Its suppressive effects on angiogenesis, vascular leakage and inflammation were studied in a wide array of assays including a Matrigel assay, delayed-type hypersensitivity (DTH), the Miles assay, laser-induced CNV and a corneal micropocket assay. Lodamin significantly suppressed the secretion of various pro-inflammatory cytokines in the CNV lesion, including monocyte chemotactic protein-1 (MCP-1/Ccl2). Importantly, Lodamin was found to regress established CNV lesions, unlike soluble fms-like tyrosine kinase-1 (sFlk-1). The drug was found to be safe in mice and to have little toxicity, as demonstrated by electroretinography (ERG) assessing retinal function and by histology. Lodamin, a polymer formulation of TNP-470, was identified as a first in its class, broad-spectrum antiangiogenic drug that can be administered orally or locally to treat corneal and retinal neovascularization. Several unique properties make Lodamin especially beneficial for ophthalmic

  15. Magnet stability and reproducibility

    CERN Document Server

    Marks, N

    2010-01-01

    Magnet stability and reproducibility have become increasingly important as greater precision and beams with smaller dimensions are required for research, medical and other purposes. The observed causes of mechanical and electrical instability are introduced and the engineering arrangements needed to minimize these problems are discussed; the resulting performance of a state-of-the-art synchrotron source (Diamond) is then presented. The need for orbit feedback to obtain the best possible beam stability is briefly introduced, omitting any details of the necessary technical equipment, which are outside the scope of the presentation.

  16. Observations and Simulations of Formation of Broad Plasma Depletions Through Merging Process

    Science.gov (United States)

    Huang, Chao-Song; Retterer, J. M.; Beaujardiere, O. De La; Roddy, P. A.; Hunton, D.E.; Ballenthin, J. O.; Pfaff, Robert F.

    2012-01-01

    Broad plasma depletions in the equatorial ionosphere near dawn are regions in which the plasma density is reduced by 1-3 orders of magnitude over thousands of kilometers in longitude. This phenomenon is observed repeatedly by the Communication/Navigation Outage Forecasting System (C/NOFS) satellite during deep solar minimum. The plasma flow inside the depletion region can be strongly upward. The proposed causal mechanism is that the broad depletions result from the merging of multiple equatorial plasma bubbles. The purpose of this study is to demonstrate the feasibility of the merging mechanism with new observations and simulations. We present C/NOFS observations for two cases. A series of plasma bubbles is first detected by C/NOFS over a longitudinal range of 3300-3800 km around midnight. Each of the individual bubbles has a typical width of approx 100 km in longitude, and the upward ion drift velocity inside the bubbles is 200-400 m/s. The plasma bubbles rotate with the Earth to the dawn sector and become broad plasma depletions. The observations clearly show the evolution from multiple plasma bubbles to broad depletions. Large upward plasma flow occurs inside the depletion region over 3800 km in longitude and persists for approx 5 h. We also present numerical simulations of bubble merging with a physics-based low-latitude ionospheric model. It is found that two separate plasma bubbles join together and form a single, wider bubble. The simulations show that the merging process of plasma bubbles can indeed occur in incompressible ionospheric plasma. The simulation results support the merging mechanism for the formation of broad plasma depletions.

  17. The broad-band overlap problem in atmospheric trace gases

    International Nuclear Information System (INIS)

    Subasilar, B.

    1991-01-01

    In relation to a better understanding of climate change and the related greenhouse problem, one way of projecting for the next decades is through general circulation models (GCMs). The only input driving changes in atmospheric and oceanic circulation patterns is the amount of heat perturbation, whether due to natural or man-made activities. Among these, CO2 concentrations resulting from the latter have been observed to be accelerating at alarmingly high rates, especially since the advent of industrialization in the last century. In addition, the collective effects of other greenhouse gases (freons, SO2, H2O, CH4, etc.) are as important as CO2. Hence, it is evident from the above considerations that, in the predictions of climate models, the heat input which triggers changes in atmospheric patterns should be formulated accurately. In order to realize this objective, in this research, beginning with the available line parameter data, the problems of absorption have been investigated within the framework known as broad-band modeling, since that is the best and fastest manageable representation for GCMs. The first step was the construction of a full broad-band (intra-band overlap) model that was also flexible enough to accommodate the individual peculiarities of the bands. Previously, the well-known and very useful Ramanathan model had a limited applicability in the concentration scale, and it had not been systematically or successfully incorporated into an inter-band overlap picture. The established ideas that have served as bases up to the present were then employed, but were found to have limited practical applicability when it came to predicting the inter-band overlaps
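
    One standard ingredient of such broad-band treatments is the random-overlap approximation, in which the combined transmittance of two gases absorbing in the same spectral interval is taken as the product of their individual band transmittances; the snippet below shows the arithmetic with illustrative numbers.

    # Random-overlap approximation for two overlapping absorbers in one band.
    # Values are illustrative only.
    t_co2 = 0.72   # broad-band transmittance of the interval due to CO2 alone
    t_h2o = 0.85   # broad-band transmittance of the same interval due to H2O alone

    t_combined = t_co2 * t_h2o            # assumes statistically uncorrelated lines
    absorptance = 1.0 - t_combined

    print(f"combined transmittance ~ {t_combined:.2f}, absorptance ~ {absorptance:.2f}")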

  18. Modeling vegetation heights from high resolution stereo aerial photography: an application for broad-scale rangeland monitoring.

    Science.gov (United States)

    Gillan, Jeffrey K; Karl, Jason W; Duniway, Michael; Elaksher, Ahmed

    2014-11-01

    Vertical vegetation structure in rangeland ecosystems can be a valuable indicator for assessing rangeland health and monitoring riparian areas, post-fire recovery, available forage for livestock, and wildlife habitat. Federal land management agencies are directed to monitor and manage rangelands at landscape scales, but traditional field methods for measuring vegetation heights are often too costly and time consuming to apply at these broad scales. Most emerging remote sensing techniques capable of measuring surface and vegetation height (e.g., LiDAR or synthetic aperture radar) are often too expensive, and require specialized sensors. An alternative remote sensing approach that is potentially more practical for managers is to measure vegetation heights from digital stereo aerial photographs. As aerial photography is already commonly used for rangeland monitoring, acquiring it in stereo enables three-dimensional modeling and estimation of vegetation height. The purpose of this study was to test the feasibility and accuracy of estimating shrub heights from high-resolution (HR, 3-cm ground sampling distance) digital stereo-pair aerial images. Overlapping HR imagery was taken in March 2009 near Lake Mead, Nevada and 5-cm resolution digital surface models (DSMs) were created by photogrammetric methods (aerial triangulation, digital image matching) for twenty-six test plots. We compared the heights of individual shrubs and plot averages derived from the DSMs to field measurements. We found strong positive correlations between field and image measurements for several metrics. Individual shrub heights tended to be underestimated in the imagery; however, accuracy was higher for dense, compact shrubs compared with shrubs with thin branches. Plot averages of shrub height from DSMs were also strongly correlated to field measurements but consistently underestimated. Grasses and forbs were generally too small to be detected with the resolution of the DSMs. Estimates of

  19. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  20. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys (Al-12Si, Cu-10Sn and 316L: face-centered cubic structure; CoCrMo and commercially pure Ti (CP-Ti): hexagonal close-packed structure) fabricated by selective laser melting. The room-temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  1. Generation of a Broad-Group HTGR Library for Use with SCALE

    International Nuclear Information System (INIS)

    Ellis, Ronald James; Lee, Deokjung; Wiarda, Dorothea; Williams, Mark L.; Mertyurek, Ugur

    2012-01-01

    With current and ongoing interest in high temperature gas reactors (HTGRs), the U.S. Nuclear Regulatory Commission (NRC) anticipates the need for nuclear data libraries appropriate for use in applications for modeling, assessing, and analyzing HTGR reactor physics and operating behavior. The objective of this work was to develop a broad-group library suitable for production analyses with SCALE for HTGR applications. Several interim libraries were generated from SCALE fine-group 238- and 999-group libraries, and the final broad-group library was created from Evaluated Nuclear Data File/B Version VII Release 0 (ENDF/B-VII.0) cross-section evaluations using new ORNL methodologies with AMPX, SCALE, and other codes. Furthermore, intermediate resonance (IR) methods were applied to the HTGR broad-group library, and lambda factors and f-factors were incorporated into the library's nuclear data files. A new version of the SCALE BONAMI module named BONAMI-IR was developed to process the IR data in the new library and thus eliminate the need for the CENTRM/PMC modules for resonance self-shielding. This report documents the development of the HTGR broad-group nuclear data library and the results of test and benchmark calculations using the new library with SCALE. The 81-group library is shown to model HTGR cases with accuracy similar to the SCALE 238-group library but with significantly faster computational times due to the reduced number of energy groups and the use of BONAMI-IR instead of BONAMI/CENTRM/PMC for resonance self-shielding calculations.
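
    The underlying operation of collapsing fine-group data into broad groups is flux weighting; the sketch below shows that textbook operation on made-up numbers and is not a substitute for the AMPX/SCALE processing chain described above.

    # Generic flux-weighted group collapse: sigma_G = sum_g(sigma_g * phi_g) /
    # sum_g(phi_g) over the fine groups g assigned to broad group G.
    # Illustrative numbers only; not the AMPX/SCALE procedure.
    import numpy as np

    fine_sigma = np.array([12.0, 9.5, 7.2, 5.1, 3.8, 2.9, 2.1, 1.6])  # barns
    fine_flux  = np.array([0.2,  0.4, 0.9, 1.5, 1.8, 1.2, 0.7, 0.3])  # arbitrary units
    broad_group_of_fine = np.array([0, 0, 0, 1, 1, 1, 2, 2])           # 8 fine -> 3 broad

    broad_sigma = np.array([
        np.sum(fine_sigma[broad_group_of_fine == G] * fine_flux[broad_group_of_fine == G])
        / np.sum(fine_flux[broad_group_of_fine == G])
        for G in range(3)
    ])
    print("collapsed broad-group cross sections (barns):", np.round(broad_sigma, 2))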

  2. Can Computational Sediment Transport Models Reproduce the Observed Variability of Channel Networks in Modern Deltas?

    Science.gov (United States)

    Nesvold, E.; Mukerji, T.

    2017-12-01

    River deltas display complex channel networks that can be characterized through the framework of graph theory, as shown by Tejedor et al. (2015). Deltaic patterns may also be useful in a Bayesian approach to uncertainty quantification of the subsurface, but this requires a prior distribution of the networks of ancient deltas. By considering subaerial deltas, one can at least obtain a snapshot in time of the channel network spectrum across deltas. In this study, the directed graph structure is semi-automatically extracted from satellite imagery using techniques from statistical processing and machine learning. Once the network is labeled with vertices and edges, spatial trends and width and sinuosity distributions can also be found easily. Since imagery is inherently 2D, computational sediment transport models can serve as a link between 2D network structure and 3D depositional elements; the numerous empirical rules and parameters built into such models make it necessary to validate the output with field data. For this purpose we have used a set of 110 modern deltas, with average water discharge ranging from 10 - 200,000 m3/s, as a benchmark for natural variability. Both graph-theoretic and more general distributions are established. A key question is whether it is possible to reproduce this deltaic network spectrum with computational models. Delft3D was used to solve the shallow water equations coupled with sediment transport. The experimental setup was relatively simple: incoming channelized flow onto a tilted plane, with varying wave and tidal energy, sediment types and grain size distributions, river discharge and a few other input parameters. Each realization was run until a delta had fully developed: between 50 and 500 years (with a morphology acceleration factor). It is shown that input parameters should not be sampled independently from the natural ranges, since this may result in deltaic output that falls well outside the natural spectrum. Since we are
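
    To illustrate the graph-theoretic side of the analysis, the sketch below encodes a toy delta channel network as a directed graph with networkx and computes a few simple topologic quantities (outlets, bifurcations, independent loops). The network is invented for illustration; it is not an extracted satellite-derived delta.

    # Toy delta channel network as a directed graph; edges point downstream.
    import networkx as nx

    G = nx.DiGraph()
    G.add_edges_from([
        ("apex", "a"), ("a", "b"), ("a", "c"),      # first bifurcation
        ("b", "d"), ("c", "d"),                     # channels rejoin (one loop)
        ("d", "outlet1"), ("c", "outlet2"),
    ])

    outlets = [n for n in G.nodes if G.out_degree(n) == 0]
    bifurcations = [n for n in G.nodes if G.out_degree(n) > 1]
    # cycle rank of the underlying undirected graph: E - N + number of components
    n_loops = (G.number_of_edges() - G.number_of_nodes()
               + nx.number_weakly_connected_components(G))

    print("outlets:", outlets)
    print("bifurcation nodes:", bifurcations)
    print("number of independent loops:", n_loops)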

  3. Application of the spine-layer jet radiation model to outbursts in the broad-line radio galaxy 3C 120

    Science.gov (United States)

    Janiak, M.; Sikora, M.; Moderski, R.

    2016-05-01

    We present a detailed Fermi/LAT data analysis for the broad-line radio galaxy 3C 120. This source has recently entered a state of increased γ-ray activity, which manifested itself in two major flares detected by Fermi/LAT in 2014 September and 2015 April with no significant flux changes reported at other wavelengths. We analyse the available data, focusing our attention on the aforementioned outbursts. We find a very fast variability time-scale during flares (of the order of hours) together with a significant γ-ray flux increase. We show that the ~6.8 yr averaged γ-ray emission of 3C 120 is likely a sum of the external radiation Compton and the synchrotron self-Compton radiative components. To address the problem of violent γ-ray flares and fast variability, we model the jet radiation by dividing the jet structure into two components: a wide and relatively slow outer layer and a fast, narrow spine. We show that with the addition of the fast spine, occasionally bent towards the observer, we are able to explain the observed spectral energy distribution of 3C 120 during flares, with Compton upscattering of broad-line region and dusty torus photons as the main γ-ray emission mechanism.

  4. In-vitro accuracy and reproducibility evaluation of probing depth measurements of selected periodontal probes

    Directory of Open Access Journals (Sweden)

    K.N. Al Shayeb

    2014-01-01

    Conclusion: Depth measurements with the Chapple UB-CF-15 probe were more accurate and reproducible compared to measurements with the Vivacare TPS and Williams 14 W probes. This in vitro model may be useful for intra-examiner calibration or clinician training prior to the clinical evaluation of patients or in longitudinal studies involving periodontal evaluation.

  5. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    Science.gov (United States)

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessels. Histogram metrics are a recognized promising method of quantitative MR imaging that has recently been introduced in the analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. The extended Tofts model and a population-based arterial input function were used to calculate kinetic parameters of RCC tumors. The mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan-rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and the coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in the reproducibility evaluation of DCE-MRI pharmacokinetic parameters (Ktrans and Ve) in renal cell carcinoma, especially for Skewness and Kurtosis, which showed lower intra-observer, inter-observer and scan-rescan reproducibility than the Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
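
    A hedged sketch of one of the reproducibility statistics mentioned above: the two-way random-effects, single-measure intraclass correlation ICC(2,1), computed from mean squares on synthetic two-observer measurements. It is not the study's data or software.

    # ICC(2,1) (Shrout & Fleiss) from mean squares, on synthetic two-observer
    # measurements of a pharmacokinetic parameter.
    import numpy as np

    rng = np.random.default_rng(3)
    n_subjects, n_raters = 21, 2
    true_ktrans = rng.normal(0.25, 0.08, n_subjects)
    ratings = true_ktrans[:, None] + rng.normal(0, 0.02, (n_subjects, n_raters))

    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-observer means

    ss_rows = n_raters * np.sum((row_means - grand) ** 2)
    ss_cols = n_subjects * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((ratings - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n_subjects - 1)
    ms_cols = ss_cols / (n_raters - 1)
    ms_error = ss_error / ((n_subjects - 1) * (n_raters - 1))

    icc_2_1 = (ms_rows - ms_error) / (
        ms_rows + (n_raters - 1) * ms_error
        + n_raters * (ms_cols - ms_error) / n_subjects
    )
    print(f"ICC(2,1) = {icc_2_1:.3f}")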

  6. Validation, automatic generation and use of broad phonetic transcriptions

    NARCIS (Netherlands)

    Bael, Cristophe Patrick Jan Van

    2007-01-01

    Broad phonetic transcriptions represent the pronunciation of words as strings of characters from specifically designed symbol sets. In everyday life, broad phonetic transcriptions are often used as aids to pronounce (foreign) words. In addition, broad phonetic transcriptions are often used for

  7. An evaluation of WRF's ability to reproduce the surface wind over complex terrain based on typical circulation patterns.

    NARCIS (Netherlands)

    Jiménez, P.A.; Dudhia, J.; González-Rouco, J.F.; Montávez, J.P.; Garcia-Bustamante, E.; Navarro, J.; Vilà-Guerau de Arellano, J.; Munoz-Roldán, A.

    2013-01-01

    The performance of the Weather Research and Forecasting (WRF) model to reproduce the surface wind circulations over complex terrain is examined. The atmospheric evolution is simulated using two versions of the WRF model during an over 13-year period (1992 to 2005) over a complex terrain region

  8. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
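
    For orientation, the sketch below implements a minimal well-mixed Gillespie stochastic simulation of a birth-death process, the kind of Monte Carlo kernel such workflows run many times in parallel. It is not PyURDME's API, which targets spatial reaction-diffusion (RDME) models.

    # Minimal well-mixed Gillespie SSA for a birth-death process (illustrative).
    import numpy as np

    def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=0):
        rng = np.random.default_rng(seed)
        t, x = 0.0, x0
        times, counts = [t], [x]
        while t < t_end:
            rates = np.array([k_birth, k_death * x])   # reaction propensities
            total = rates.sum()
            if total == 0.0:
                break
            t += rng.exponential(1.0 / total)          # time to next reaction
            if rng.uniform() < rates[0] / total:
                x += 1                                  # birth
            else:
                x -= 1                                  # death
            times.append(t)
            counts.append(x)
        return np.array(times), np.array(counts)

    times, counts = gillespie_birth_death()
    print(f"final count: {counts[-1]} (theoretical stationary mean = {10.0 / 0.1:.0f})")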

  9. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    Science.gov (United States)

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  10. A non-local hidden-variable model that violates Leggett-type inequalities

    Energy Technology Data Exchange (ETDEWEB)

    Zela, F de [Departamento de Ciencias, Seccion Fisica, Pontificia Universidad Catolica del Peru, Apartado 1761, Lima (Peru)

    2008-12-19

    Recent experiments of Groeblacher et al proved the violation of a Leggett-type inequality that was claimed to be valid for a broad class of non-local hidden-variable theories. The impossibility of constructing a non-local and realistic theory, unless it entails highly counterintuitive features, seems thus to have been experimentally proved. This would bring us close to a definite refutation of realism. Indeed, realism was proved to be also incompatible with locality, according to a series of experiments testing Bell inequalities. The present paper addresses the said experiments of Groeblacher et al and presents an explicit, contextual and realistic, model that reproduces the predictions of quantum mechanics. It thus violates the Leggett-type inequality that was established with the aim of ruling out a supposedly broad class of non-local models. We can thus conclude that plausible contextual, realistic, models are still tenable. This restates the possibility of a future completion of quantum mechanics by a realistic and contextual theory which is not in a class containing only highly counterintuitive models. The class that was ruled out by the experiments of Groeblacher et al is thus proved to be a limited one, arbitrarily separating models that physically belong in the same class.

  11. A non-local hidden-variable model that violates Leggett-type inequalities

    International Nuclear Information System (INIS)

    Zela, F de

    2008-01-01

    Recent experiments of Groeblacher et al proved the violation of a Leggett-type inequality that was claimed to be valid for a broad class of non-local hidden-variable theories. The impossibility of constructing a non-local and realistic theory, unless it entails highly counterintuitive features, seems thus to have been experimentally proved. This would bring us close to a definite refutation of realism. Indeed, realism was proved to be also incompatible with locality, according to a series of experiments testing Bell inequalities. The present paper addresses the said experiments of Groeblacher et al and presents an explicit, contextual and realistic, model that reproduces the predictions of quantum mechanics. It thus violates the Leggett-type inequality that was established with the aim of ruling out a supposedly broad class of non-local models. We can thus conclude that plausible contextual, realistic, models are still tenable. This restates the possibility of a future completion of quantum mechanics by a realistic and contextual theory which is not in a class containing only highly counterintuitive models. The class that was ruled out by the experiments of Groeblacher et al is thus proved to be a limited one, arbitrarily separating models that physically belong in the same class

  12. Reproducibility and consistency of proteomic experiments on natural populations of a non-model aquatic insect.

    Science.gov (United States)

    Hidalgo-Galiana, Amparo; Monge, Marta; Biron, David G; Canals, Francesc; Ribera, Ignacio; Cieslak, Alexandra

    2014-01-01

    Population proteomics has a great potential to address evolutionary and ecological questions, but its use in wild populations of non-model organisms is hampered by uncontrolled sources of variation. Here we compare the response to temperature extremes of two geographically distant populations of a diving beetle species (Agabus ramblae) using 2-D DIGE. After one week of acclimation in the laboratory under standard conditions, a third of the specimens of each population were placed at either 4 or 27°C for 12 h, with another third left as a control. We then compared the protein expression level of three replicated samples of 2-3 specimens for each treatment. Within each population, variation between replicated samples of the same treatment was always lower than variation between treatments, except for some control samples that retained a wider range of expression levels. The two populations had a similar response, without significant differences in the number of protein spots over- or under-expressed in the pairwise comparisons between treatments. We identified exemplary proteins among those differently expressed between treatments, which proved to be proteins known to be related to thermal response or stress. Overall, our results indicate that specimens collected in the wild are suitable for proteomic analyses, as the additional sources of variation were not enough to mask the consistency and reproducibility of the response to the temperature treatments.

  13. The quest for improved reproducibility in MALDI mass spectrometry.

    Science.gov (United States)

    O'Rourke, Matthew B; Djordjevic, Steven P; Padula, Matthew P

    2018-03-01

    Reproducibility has been one of the biggest hurdles faced when attempting to develop quantitative protocols for MALDI mass spectrometry. The heterogeneous nature of sample recrystallization has made automated sample acquisition somewhat "hit and miss" with manual intervention needed to ensure that all sample spots have been analyzed. In this review, we explore the last 30 years of literature and anecdotal evidence that has attempted to address and improve reproducibility in MALDI MS. Though many methods have been attempted, we have discovered a significant publication history surrounding the use of nitrocellulose as a substrate to improve homogeneity of crystal formation and therefore reproducibility. We therefore propose that this is the most promising avenue of research for developing a comprehensive and universal preparation protocol for quantitative MALDI MS analysis. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 37:217-228, 2018. © 2016 Wiley Periodicals, Inc.

  14. Reproducibility of the cutoff probe for the measurement of electron density

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. W.; Oh, W. Y. [Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology, Daejeon 305-701 (Korea, Republic of); You, S. J., E-mail: sjyou@cnu.ac.kr [Department of Physics, Chungnam National University, Daejeon 305-701 (Korea, Republic of); Kwon, J. H.; You, K. H.; Seo, B. H.; Kim, J. H., E-mail: jhkim86@kriss.re.kr [Center for Vacuum Technology, Korea Research Institute of Standards and Science, Daejeon 305-306 (Korea, Republic of); Yoon, J.-S. [Plasma Technology Research Center, National Fusion Research Institute, Gunsan 573-540 (Korea, Republic of)

    2016-06-15

    Since plasma processing control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement becomes quite important in cutoff probe application research. To test the reproducibility of the cutoff probe measurement, in this paper, a comparative study among different cutoff probe measurements was performed. The comparative study revealed a remarkable result: the cutoff probe has great reproducibility for the electron density measurement, i.e., there is little difference among measurements made with different probes by different experimenters. The reason for this result is discussed in this paper using the basic measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.

  15. Reproducibility of the cutoff probe for the measurement of electron density

    International Nuclear Information System (INIS)

    Kim, D. W.; Oh, W. Y.; You, S. J.; Kwon, J. H.; You, K. H.; Seo, B. H.; Kim, J. H.; Yoon, J.-S.

    2016-01-01

    Since plasma processing control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement becomes quite important in cutoff probe application research. To test the reproducibility of the cutoff probe measurement, in this paper, a comparative study among different cutoff probe measurements was performed. The comparative study revealed a remarkable result: the cutoff probe has great reproducibility for the electron density measurement, i.e., there is little difference among measurements made with different probes by different experimenters. The reason for this result is discussed in this paper using the basic measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.

  16. Analysis of mammalian gene function through broad based phenotypic screens across a consortium of mouse clinics

    Science.gov (United States)

    Adams, David J; Adams, Niels C; Adler, Thure; Aguilar-Pimentel, Antonio; Ali-Hadji, Dalila; Amann, Gregory; André, Philippe; Atkins, Sarah; Auburtin, Aurelie; Ayadi, Abdel; Becker, Julien; Becker, Lore; Bedu, Elodie; Bekeredjian, Raffi; Birling, Marie-Christine; Blake, Andrew; Bottomley, Joanna; Bowl, Mike; Brault, Véronique; Busch, Dirk H; Bussell, James N; Calzada-Wack, Julia; Cater, Heather; Champy, Marie-France; Charles, Philippe; Chevalier, Claire; Chiani, Francesco; Codner, Gemma F; Combe, Roy; Cox, Roger; Dalloneau, Emilie; Dierich, André; Di Fenza, Armida; Doe, Brendan; Duchon, Arnaud; Eickelberg, Oliver; Esapa, Chris T; El Fertak, Lahcen; Feigel, Tanja; Emelyanova, Irina; Estabel, Jeanne; Favor, Jack; Flenniken, Ann; Gambadoro, Alessia; Garrett, Lilian; Gates, Hilary; Gerdin, Anna-Karin; Gkoutos, George; Greenaway, Simon; Glasl, Lisa; Goetz, Patrice; Da Cruz, Isabelle Goncalves; Götz, Alexander; Graw, Jochen; Guimond, Alain; Hans, Wolfgang; Hicks, Geoff; Hölter, Sabine M; Höfler, Heinz; Hancock, John M; Hoehndorf, Robert; Hough, Tertius; Houghton, Richard; Hurt, Anja; Ivandic, Boris; Jacobs, Hughes; Jacquot, Sylvie; Jones, Nora; Karp, Natasha A; Katus, Hugo A; Kitchen, Sharon; Klein-Rodewald, Tanja; Klingenspor, Martin; Klopstock, Thomas; Lalanne, Valerie; Leblanc, Sophie; Lengger, Christoph; le Marchand, Elise; Ludwig, Tonia; Lux, Aline; McKerlie, Colin; Maier, Holger; Mandel, Jean-Louis; Marschall, Susan; Mark, Manuel; Melvin, David G; Meziane, Hamid; Micklich, Kateryna; Mittelhauser, Christophe; Monassier, Laurent; Moulaert, David; Muller, Stéphanie; Naton, Beatrix; Neff, Frauke; Nolan, Patrick M; Nutter, Lauryl MJ; Ollert, Markus; Pavlovic, Guillaume; Pellegata, Natalia S; Peter, Emilie; Petit-Demoulière, Benoit; Pickard, Amanda; Podrini, Christine; Potter, Paul; Pouilly, Laurent; Puk, Oliver; Richardson, David; Rousseau, Stephane; Quintanilla-Fend, Leticia; Quwailid, Mohamed M; Racz, Ildiko; Rathkolb, Birgit; Riet, Fabrice; Rossant, Janet; Roux, Michel; Rozman, Jan; Ryder, Ed; Salisbury, Jennifer; Santos, Luis; Schäble, Karl-Heinz; Schiller, Evelyn; Schrewe, Anja; Schulz, Holger; Steinkamp, Ralf; Simon, Michelle; Stewart, Michelle; Stöger, Claudia; Stöger, Tobias; Sun, Minxuan; Sunter, David; Teboul, Lydia; Tilly, Isabelle; Tocchini-Valentini, Glauco P; Tost, Monica; Treise, Irina; Vasseur, Laurent; Velot, Emilie; Vogt-Weisenhorn, Daniela; Wagner, Christelle; Walling, Alison; Weber, Bruno; Wendling, Olivia; Westerberg, Henrik; Willershäuser, Monja; Wolf, Eckhard; Wolter, Anne; Wood, Joe; Wurst, Wolfgang; Yildirim, Ali Önder; Zeh, Ramona; Zimmer, Andreas; Zimprich, Annemarie

    2015-01-01

    The function of the majority of genes in the mouse and human genomes remains unknown. The mouse ES cell knockout resource provides a basis for characterisation of relationships between gene and phenotype. The EUMODIC consortium developed and validated robust methodologies for broad-based phenotyping of knockouts through a pipeline comprising 20 disease-orientated platforms. We developed novel statistical methods for pipeline design and data analysis aimed at detecting reproducible phenotypes with high power. We acquired phenotype data from 449 mutant alleles, representing 320 unique genes, of which half had no prior functional annotation. We captured data from over 27,000 mice finding that 83% of the mutant lines are phenodeviant, with 65% demonstrating pleiotropy. Surprisingly, we found significant differences in phenotype annotation according to zygosity. Novel phenotypes were uncovered for many genes with unknown function providing a powerful basis for hypothesis generation and further investigation in diverse systems. PMID:26214591

  17. Reproducibility between conventional and digital periapical radiography for bone height measurement

    Directory of Open Access Journals (Sweden)

    Miguel Simancas Pallares

    2015-10-01

    Conclusions. Reproducibility between the conventional and digital methods was poor, including in the subgroup analysis; agreement between the two methods is therefore minimal. These methods should be used in periodontics with full knowledge of their technical features and of the advantages of each system.

  18. Reproducibility Between Brain Uptake Ratio Using Anatomic Standardization and Patlak-Plot Methods.

    Science.gov (United States)

    Shibutani, Takayuki; Onoguchi, Masahisa; Noguchi, Atsushi; Yamada, Tomoki; Tsuchihashi, Hiroko; Nakajima, Tadashi; Kinuya, Seigo

    2015-12-01

    The Patlak-plot and conventional methods of determining brain uptake ratio (BUR) have some problems with reproducibility. We formulated a method of determining BUR using anatomic standardization (BUR-AS) in a statistical parametric mapping algorithm to improve reproducibility. The objective of this study was to demonstrate the inter- and intraoperator reproducibility of mean cerebral blood flow as determined using BUR-AS in comparison to the conventional-BUR (BUR-C) and Patlak-plot methods. The images of 30 patients who underwent brain perfusion SPECT were retrospectively used in this study. The images were reconstructed using ordered-subset expectation maximization and processed using an automatic quantitative analysis for cerebral blood flow of ECD tool. The mean SPECT count was calculated from axial basal ganglia slices of the normal side (slices 31-40) drawn using a 3-dimensional stereotactic region-of-interest template after anatomic standardization. The mean cerebral blood flow was calculated from the mean SPECT count. Reproducibility was evaluated using coefficient of variation and Bland-Altman plotting. For both inter- and intraoperator reproducibility, the BUR-AS method had the lowest coefficient of variation and smallest error range about the Bland-Altman plot. Mean CBF obtained using the BUR-AS method had the highest reproducibility. Compared with the Patlak-plot and BUR-C methods, the BUR-AS method provides greater inter- and intraoperator reproducibility of cerebral blood flow measurement. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
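
    The reproducibility statistics reported above (coefficient of variation and Bland-Altman agreement) are standard and straightforward to compute. The sketch below shows one plausible way to derive them for paired repeat measurements; the function names and the example numbers are illustrative, not taken from the study.

```python
import numpy as np

def coefficient_of_variation(a, b):
    """Within-pair coefficient of variation (%) for repeated measurements a and b."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    pair_sd = np.abs(a - b) / np.sqrt(2)      # SD of each pair of repeats
    pair_mean = (a + b) / 2.0
    return 100.0 * np.mean(pair_sd / pair_mean)

def bland_altman(a, b):
    """Return the bias and 95% limits of agreement for paired measurements."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical mean-CBF values (mL/100 g/min) from two operators.
op1 = np.array([42.1, 38.5, 45.2, 40.0, 36.8])
op2 = np.array([41.5, 39.0, 44.8, 40.6, 37.1])
print("CV (%):", coefficient_of_variation(op1, op2))
print("Bland-Altman bias and limits:", bland_altman(op1, op2))
```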

  19. Reproducibility study of [¹⁸F]FPP(RGD)₂ uptake in murine models of human tumor xenografts

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Edwin; Liu, Shuangdong; Chin, Frederick; Cheng, Zhen [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Gowrishankar, Gayatri; Yaghoubi, Shahriar [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Wedgeworth, James Patrick [Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Berndorff, Dietmar; Gekeler, Volker [Bayer Schering Pharma AG, Global Drug Discovery, Berlin (Germany); Gambhir, Sanjiv S. [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Canary Center at Stanford for Cancer Early Detection, Nuclear Medicine, Departments of Radiology and Bioengineering, Molecular Imaging Program at Stanford, Stanford, CA (United States)

    2011-04-15

    An ¹⁸F-labeled PEGylated arginine-glycine-aspartic acid (RGD) dimer, [¹⁸F]FPP(RGD)₂, has been used to image tumor αvβ3 integrin levels in preclinical and clinical studies. Serial positron emission tomography (PET) studies may be useful for monitoring antiangiogenic therapy response or for drug screening; however, the reproducibility of serial scans has not been determined for this PET probe. The purpose of this study was to determine the reproducibility of the integrin αvβ3-targeted PET probe [¹⁸F]FPP(RGD)₂ using small animal PET. Human HCT116 colon cancer xenografts were implanted into nude mice (n = 12) in the breast and scapular region and grown to mean diameters of 5-15 mm for approximately 2.5 weeks. A 3-min acquisition was performed on a small animal PET scanner approximately 1 h after administration of [¹⁸F]FPP(RGD)₂ (1.9-3.8 MBq, 50-100 μCi) via the tail vein. A second small animal PET scan was performed approximately 6 h later after reinjection of the probe to assess for reproducibility. Images were analyzed by drawing an ellipsoidal region of interest (ROI) around the tumor xenograft activity. Percentage injected dose per gram (%ID/g) values were calculated from the mean or maximum activity in the ROIs. Coefficients of variation and differences in %ID/g values between studies from the same day were calculated to determine the reproducibility. The coefficient of variation (mean ± SD) for %IDmean/g and %IDmax/g values between [¹⁸F]FPP(RGD)₂ small animal PET scans performed 6 h apart on the same day were 11.1 ± 7.6% and 10.4 ± 9.3%, respectively. The corresponding differences in %IDmean/g and %IDmax/g values between scans were -0.025 ± 0.067 and -0.039 ± 0.426. Immunofluorescence studies revealed a direct relationship between extent of αvβ3 integrin expression in tumors and tumor vasculature

  20. Reproducibility of 3.0 Tesla Magnetic Resonance Spectroscopy for Measuring Hepatic Fat Content

    NARCIS (Netherlands)

    van Werven, Jochem R.; Hoogduin, Johannes M.; Nederveen, Aart J.; van Vliet, Andre A.; Wajs, Ewa; Vandenberk, Petra; Stroes, Erik S. G.; Stoker, Jaap

    Purpose: To investigate reproducibility of proton magnetic resonance spectroscopy (H-1-MRS) to measure hepatic triglyceride content (HTGC). Materials and Methods: In 24 subjects, HTGC was evaluated using H-1-MRS at 3.0 Tesla. We studied "between-weeks" reproducibility and reproducibility of H-1-MRS

  1. Reproducibility and Practical Adoption of GEOBIA with Open-Source Software in Docker Containers

    Directory of Open Access Journals (Sweden)

    Christian Knoth

    2017-03-01

    Geographic Object-Based Image Analysis (GEOBIA) mostly uses proprietary software, but the interest in Free and Open-Source Software (FOSS) for GEOBIA is growing. This interest stems not only from cost savings, but also from benefits concerning reproducibility and collaboration. Technical challenges hamper practical reproducibility, especially when multiple software packages are required to conduct an analysis. In this study, we use containerization to package a GEOBIA workflow in a well-defined FOSS environment. We explore the approach using two software stacks to perform an exemplary analysis detecting destruction of buildings in bi-temporal images of a conflict area. The analysis combines feature extraction techniques with segmentation and object-based analysis to detect changes using automatically-defined local reference values and to distinguish disappeared buildings from non-target structures. The resulting workflow is published as FOSS comprising both the model and data in a ready-to-use Docker image and a user interface for interaction with the containerized workflow. The presented solution advances GEOBIA in the following aspects: higher transparency of methodology; easier reuse and adaption of workflows; better transferability between operating systems; complete description of the software environment; and easy application of workflows by image analysis experts and non-experts. As a result, it promotes not only the reproducibility of GEOBIA, but also its practical adoption.

  2. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    Directory of Open Access Journals (Sweden)

    Sonal eGoswami

    2012-06-01

    Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze. Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma, PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  3. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome.

    Science.gov (United States)

    Goswami, Sonal; Samuel, Sherin; Sierra, Olga R; Cascardi, Michele; Paré, Denis

    2012-01-01

    Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze (EPM). Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  4. PDB-NMA of a protein homodimer reproduces distinct experimental motility asymmetry

    Science.gov (United States)

    Tirion, Monique M.; ben-Avraham, Daniel

    2018-03-01

    We have extended our analytically derived PDB-NMA formulation, Atomic Torsional Modal Analysis or ATMAN (Tirion and ben-Avraham 2015 Phys. Rev. E 91 032712), to include protein dimers using mixed internal and Cartesian coordinates. A test case on a 1.3 Å resolution model of a small homodimer, ActVA-ORF6, consisting of two 112-residue subunits identically folded in a compact 50 Å sphere, reproduces the distinct experimental Debye-Waller motility asymmetry for the two chains, demonstrating that structure sensitively selects vibrational signatures. The vibrational analysis of this PDB entry, together with biochemical and crystallographic data, demonstrates the cooperative nature of the dimeric interaction of the two subunits and suggests a mechanical model for subunit interconversion during the catalytic cycle.

  5. Optimized broad-histogram simulations for strong first-order phase transitions: droplet transitions in the large-Q Potts model

    International Nuclear Information System (INIS)

    Bauer, Bela; Troyer, Matthias; Gull, Emanuel; Trebst, Simon; Huse, David A

    2010-01-01

    The numerical simulation of strongly first-order phase transitions has remained a notoriously difficult problem even for classical systems due to the exponentially suppressed (thermal) equilibration in the vicinity of such a transition. In the absence of efficient update techniques, a common approach for improving equilibration in Monte Carlo simulations is broadening the sampled statistical ensemble beyond the bimodal distribution of the canonical ensemble. Here we show how a recently developed feedback algorithm can systematically optimize such broad-histogram ensembles and significantly speed up equilibration in comparison with other extended ensemble techniques such as flat-histogram, multicanonical and Wang–Landau sampling. We simulate, as a prototypical example of a strong first-order transition, the two-dimensional Potts model with up to Q = 250 different states in large systems. The optimized histogram develops a distinct multi-peak structure, thereby resolving entropic barriers and their associated phase transitions in the phase coexistence region—such as droplet nucleation and annihilation, and droplet–strip transitions for systems with periodic boundary conditions. We characterize the efficiency of the optimized histogram sampling by measuring round-trip times τ(N, Q) across the phase transition for samples comprised of N spins. While we find power-law scaling of τ versus N for small Q ≲ 50 and N ≲ 40², we observe a crossover to exponential scaling for larger Q. These results demonstrate that despite the ensemble optimization, broad-histogram simulations cannot fully eliminate the supercritical slowing down at strongly first-order transitions.

  6. Optimized broad-histogram simulations for strong first-order phase transitions: droplet transitions in the large-Q Potts model

    Science.gov (United States)

    Bauer, Bela; Gull, Emanuel; Trebst, Simon; Troyer, Matthias; Huse, David A.

    2010-01-01

    The numerical simulation of strongly first-order phase transitions has remained a notoriously difficult problem even for classical systems due to the exponentially suppressed (thermal) equilibration in the vicinity of such a transition. In the absence of efficient update techniques, a common approach for improving equilibration in Monte Carlo simulations is broadening the sampled statistical ensemble beyond the bimodal distribution of the canonical ensemble. Here we show how a recently developed feedback algorithm can systematically optimize such broad-histogram ensembles and significantly speed up equilibration in comparison with other extended ensemble techniques such as flat-histogram, multicanonical and Wang-Landau sampling. We simulate, as a prototypical example of a strong first-order transition, the two-dimensional Potts model with up to Q = 250 different states in large systems. The optimized histogram develops a distinct multi-peak structure, thereby resolving entropic barriers and their associated phase transitions in the phase coexistence region—such as droplet nucleation and annihilation, and droplet-strip transitions for systems with periodic boundary conditions. We characterize the efficiency of the optimized histogram sampling by measuring round-trip times τ(N, Q) across the phase transition for samples comprised of N spins. While we find power-law scaling of τ versus N for small Q ≲ 50 and N ≲ 40², we observe a crossover to exponential scaling for larger Q. These results demonstrate that despite the ensemble optimization, broad-histogram simulations cannot fully eliminate the supercritical slowing down at strongly first-order transitions.
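
    For readers unfamiliar with broad-histogram sampling, the flat-histogram (Wang-Landau) idea referenced above can be illustrated with a minimal sketch for the 2D Q-state Potts model. This is not the feedback-optimized ensemble of the paper, only a toy version of one of the extended-ensemble baselines it compares against; the lattice size, flatness criterion, and modification-factor schedule below are arbitrary choices.

```python
import numpy as np

def potts_energy(spins):
    """Total energy of a 2D Q-state Potts configuration with periodic boundaries."""
    right = np.roll(spins, -1, axis=1)
    down = np.roll(spins, -1, axis=0)
    return -int(np.sum(spins == right) + np.sum(spins == down))

def wang_landau_potts(L=6, Q=10, flatness=0.8, ln_f_min=1e-3, seed=0):
    """Toy flat-histogram (Wang-Landau) estimate of ln g(E) for the L x L Potts model."""
    rng = np.random.default_rng(seed)
    spins = rng.integers(Q, size=(L, L))
    E = potts_energy(spins)
    n_levels = 2 * L * L + 1                    # E runs from -2*L*L (ordered) to 0
    ln_g = np.zeros(n_levels)
    ln_f = 1.0
    while ln_f > ln_f_min:
        hist = np.zeros(n_levels)
        while True:                             # sweep until the histogram is "flat"
            for _ in range(20000):
                i, j = rng.integers(L, size=2)
                old, new = spins[i, j], rng.integers(Q)
                nbrs = [spins[(i + 1) % L, j], spins[(i - 1) % L, j],
                        spins[i, (j + 1) % L], spins[i, (j - 1) % L]]
                dE = sum(n == old for n in nbrs) - sum(n == new for n in nbrs)
                E_new = E + dE
                # accept with probability min(1, g(E)/g(E_new)), in log space
                if np.log(rng.random() + 1e-300) < ln_g[-E] - ln_g[-E_new]:
                    spins[i, j], E = new, E_new
                ln_g[-E] += ln_f                # -E is a non-negative array index
                hist[-E] += 1
            visited = hist[hist > 0]
            if visited.min() > flatness * visited.mean():
                break
        ln_f /= 2.0                             # tighten the modification factor
    return ln_g - ln_g.max()                    # relative ln g(E) only

# ln_g = wang_landau_potts(L=6, Q=10)   # keep L small; this pure-Python sketch is slow
```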

  7. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research

  8. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  9. Completely reproducible description of digital sound data with cellular automata

    International Nuclear Information System (INIS)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-01-01

    A novel method of compressive and completely reproducible description of digital sound data by means of rule dynamics of CA (cellular automata) is proposed. The digital data of spoken words and music recorded with the standard format of a compact disk are reproduced completely by this method with use of only two rules in a one-dimensional CA without loss of information

  10. A novel approach for characterizing broad-band radio spectral energy distributions

    Science.gov (United States)

    Harvey, V. M.; Franzen, T.; Morgan, J.; Seymour, N.

    2018-05-01

    We present a new broad-band radio frequency catalogue across 0.12 GHz ≤ ν ≤ 20 GHz created by combining data from the Murchison Widefield Array Commissioning Survey, the Australia Telescope 20 GHz survey, and the literature. Our catalogue consists of 1285 sources limited by S20 GHz > 40 mJy at 5σ, and contains flux density measurements (or estimates) and uncertainties at 0.074, 0.080, 0.119, 0.150, 0.180, 0.408, 0.843, 1.4, 4.8, 8.6, and 20 GHz. We fit a second-order polynomial in log-log space to the spectral energy distributions of all these sources in order to characterize their broad-band emission. For the 994 sources that are well described by a linear or quadratic model we present a new diagnostic plot arranging sources by the linear and curvature terms. We demonstrate the advantages of such a plot over the traditional radio colour-colour diagram. We also present astrophysical descriptions of the sources found in each segment of this new parameter space and discuss the utility of these plots in the upcoming era of large area, deep, broad-band radio surveys.
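
    Fitting a second-order polynomial in log-log space, as described above, amounts to a three-parameter fit of log S against log ν. The snippet below is a minimal sketch of that step; the frequencies follow the catalogue bands listed above, but the flux densities and the parameter names (alpha for the linear term, q for the curvature) are illustrative assumptions, not values from the catalogue.

```python
import numpy as np

# Illustrative frequencies (GHz) and flux densities (Jy); not the catalogue values.
nu = np.array([0.074, 0.080, 0.119, 0.150, 0.180, 0.408, 0.843, 1.4, 4.8, 8.6, 20.0])
flux = np.array([8.2, 7.9, 6.5, 5.8, 5.2, 3.1, 2.0, 1.4, 0.55, 0.32, 0.12])

x, y = np.log10(nu), np.log10(flux)
# Second-order polynomial in log-log space: log S = c + alpha*log(nu) + q*(log nu)^2
q, alpha, c = np.polyfit(x, y, 2)     # np.polyfit returns the highest-order term first

print(f"spectral index (linear term) alpha = {alpha:.2f}")
print(f"curvature term q = {q:.2f}")

# The fitted model can be evaluated back in linear flux units:
model_flux = 10 ** np.polyval([q, alpha, c], x)
```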

  11. Reproducibility of gene expression across generations of Affymetrix microarrays

    Directory of Open Access Journals (Sweden)

    Haslett Judith N

    2003-06-01

    Abstract Background The development of large-scale gene expression profiling technologies is rapidly changing the norms of biological investigation. But the rapid pace of change itself presents challenges. Commercial microarrays are regularly modified to incorporate new genes and improved target sequences. Although the ability to compare datasets across generations is crucial for any long-term research project, to date no means to allow such comparisons have been developed. In this study the reproducibility of gene expression levels across two generations of Affymetrix GeneChips® (HuGeneFL and HG-U95A) was measured. Results Correlation coefficients were computed for gene expression values across chip generations based on different measures of similarity. Comparing the absolute calls assigned to the individual probe sets across the generations found them to be largely unchanged. Conclusion We show that experimental replicates are highly reproducible, but that reproducibility across generations depends on the degree of similarity of the probe sets and the expression level of the corresponding transcript.

  12. Reproducibility of the Portuguese version of the PEDro Scale

    Directory of Open Access Journals (Sweden)

    Silvia Regina Shiwa

    2011-10-01

    The objective of this study was to test the inter-rater reproducibility of the Portuguese version of the PEDro Scale. Seven physiotherapists rated the methodological quality of 50 reports of randomized controlled trials written in Portuguese indexed on the PEDro database. Each report was also rated using the English version of the PEDro Scale. Reproducibility was evaluated by comparing two separate ratings of reports written in Portuguese and comparing the Portuguese PEDro score with the English version of the scale. Kappa coefficients ranged from 0.53 to 1.00 for individual items, and an intraclass correlation coefficient (ICC) of 0.82 was observed for the total PEDro score. The standard error of measurement of the scale was 0.58. The Portuguese version of the scale was comparable with the English version, with an ICC of 0.78. The inter-rater reproducibility of the Brazilian Portuguese PEDro Scale is adequate and similar to that of the original English version.

  13. A new strategy to deliver synthetic protein drugs: self-reproducible biologics using minicircles.

    Science.gov (United States)

    Yi, Hyoju; Kim, Youngkyun; Kim, Juryun; Jung, Hyerin; Rim, Yeri Alice; Jung, Seung Min; Park, Sung-Hwan; Ju, Ji Hyeon

    2014-08-05

    Biologics are the most successful drugs used in anticytokine therapy. However, they remain partially unsuccessful because of the elevated cost of their synthesis and purification. Development of novel biologics has also been hampered by the high cost. Biologics are made of protein components; thus, theoretically, they can be produced in vivo. Here we tried to invent a novel strategy to allow the production of synthetic drugs in vivo by the host itself. The recombinant minicircles encoding etanercept or tocilizumab, which are synthesized currently by pharmaceutical companies, were injected intravenously into animal models. Self-reproduced etanercept and tocilizumab were detected in the serum of mice. Moreover, arthritis subsided in mice that were injected with minicircle vectors carrying biologics. Self-reproducible biologics need neither factory facilities for drug production nor clinical processes, such as frequent drug injection. Although this novel strategy is in its very early conceptual stage, it seems to represent a potential alternative method for the delivery of biologics.

  14. Intra-and interobserver reproducibility of shear wave elastography for evaluation of the breast lesions

    International Nuclear Information System (INIS)

    Hong, Min Ji; Kim, Hak Hee

    2017-01-01

    To evaluate reproducibility of shear wave elastography (SWE) for breast lesions within and between observers and compare the reproducibility of SWE features. For intraobserver reproducibility, 225 masses with 208 patients were included; and two consecutive SWE images were acquired by each observer. For interobserver reproducibility, SWE images of the same mass were obtained by another observer before surgery in 40 patients. Intraclass correlation coefficients (ICC) were used to determine intra- and interobserver reproducibility. Intraobserver reliability for mean elasticity (Emean) and maximum elasticity (Emax) were excellent (ICC = 0.803, 0.799). ICC for SWE ratio and minimum elasticity (Emin) were fair to good (ICC = 0.703, 0.539). Emean showed excellent ICC regardless of histopathologic type and tumor size. Emax, SWE ratio and Emin represented excellent or fair to good reproducibility based on histopathologic type and tumor size. In interobserver study, ICC for Emean, Emax and SWE ratio were excellent. Emean, Emax and SWE ratio represented excellent ICC irrespective of histopathologic type. ICC for Emean was excellent regardless of tumor size. SWE ratio and Emax showed fair to good interobserver reproducibility based on tumor size. Emin represented poor interobserver reliability. Emean in SWE was highly reproducible within and between observers

  15. Intra-and interobserver reproducibility of shear wave elastography for evaluation of the breast lesions

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Min Ji [Dept. of Radiology, Gil Hospital, Gachon University of Medicine and Science, Incheon (Korea, Republic of); Kim, Hak Hee [Dept. of Radiology, and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul (Korea, Republic of)

    2017-03-15

    To evaluate reproducibility of shear wave elastography (SWE) for breast lesions within and between observers and compare the reproducibility of SWE features. For intraobserver reproducibility, 225 masses with 208 patients were included; and two consecutive SWE images were acquired by each observer. For interobserver reproducibility, SWE images of the same mass were obtained by another observer before surgery in 40 patients. Intraclass correlation coefficients (ICC) were used to determine intra- and interobserver reproducibility. Intraobserver reliability for mean elasticity (Emean) and maximum elasticity (Emax) were excellent (ICC = 0.803, 0.799). ICC for SWE ratio and minimum elasticity (Emin) were fair to good (ICC = 0.703, 0.539). Emean showed excellent ICC regardless of histopathologic type and tumor size. Emax, SWE ratio and Emin represented excellent or fair to good reproducibility based on histopathologic type and tumor size. In interobserver study, ICC for Emean, Emax and SWE ratio were excellent. Emean, Emax and SWE ratio represented excellent ICC irrespective of histopathologic type. ICC for Emean was excellent regardless of tumor size. SWE ratio and Emax showed fair to good interobserver reproducibility based on tumor size. Emin represented poor interobserver reliability. Emean in SWE was highly reproducible within and between observers.
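
    The intraclass correlation coefficients reported in these two records are a standard reliability statistic. As a rough sketch, the two-way random-effects, single-measurement, absolute-agreement form (often written ICC(2,1)) can be computed as below; the choice of this particular ICC form and the example elasticity values are assumptions for illustration, not details taken from the study.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, single rater, absolute agreement.

    `ratings` is an (n_subjects, k_raters) array.
    """
    x = np.asarray(ratings, float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)

    ss_rows = k * np.sum((row_means - grand) ** 2)     # between-subject variation
    ss_cols = n * np.sum((col_means - grand) ** 2)     # between-rater variation
    ss_total = np.sum((x - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical Emean values (kPa) for five lesions scored by two observers.
emean = np.array([[35.0, 37.5],
                  [80.2, 78.0],
                  [12.4, 13.1],
                  [55.0, 52.8],
                  [140.3, 150.1]])
print(f"ICC(2,1) = {icc_2_1(emean):.3f}")
```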

  16. Reproducibility of abdominal fat assessment by ultrasound and computed tomography.

    Science.gov (United States)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaeté; Benedeti, Augusto César Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge

    2017-01-01

    To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects (39 men, 38.6%, and 62 women, 61.4%) with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than had the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility.

  17. An empirical analysis of journal policy effectiveness for computational reproducibility.

    Science.gov (United States)

    Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun

    2018-03-13

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.

  18. Reproducing Kernel Method for Solving Nonlinear Differential-Difference Equations

    Directory of Open Access Journals (Sweden)

    Reza Mokhtari

    2012-01-01

    On the basis of reproducing kernel Hilbert space theory, an iterative algorithm for solving some nonlinear differential-difference equations (NDDEs) is presented. The analytical solution is expressed as a series in a reproducing kernel space, and the approximate solution is constructed by truncating that series after finitely many terms. The convergence of the approximate solution to the analytical solution is also proved. Results obtained by the proposed method imply that it can be considered a simple and accurate method for solving such differential-difference problems.
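
    The method hinges on the reproducing property of the kernel and on a series representation of the solution that is then truncated. The following is a generic statement of these two ingredients in standard notation (H, K, u, u_n and the basis functions are generic symbols, not necessarily the notation used in the paper):

```latex
% Reproducing property: for every x in the domain and every f in the space H,
\[
  f(x) = \langle f,\, K(\cdot, x) \rangle_{\mathcal{H}},
  \qquad
  K(x, y) = \langle K(\cdot, y),\, K(\cdot, x) \rangle_{\mathcal{H}} .
\]
% Series form of the solution and its n-term truncation used as the approximation:
\[
  u(x) = \sum_{i=1}^{\infty} c_i\, \bar{\psi}_i(x),
  \qquad
  u_n(x) = \sum_{i=1}^{n} c_i\, \bar{\psi}_i(x),
  \qquad
  \lVert u_n - u \rVert_{\mathcal{H}} \xrightarrow[n\to\infty]{} 0,
\]
% where the \bar{\psi}_i form an orthonormal system constructed from the kernel.
```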

  19. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  20. Reproducibility of cervical range of motion in patients with neck pain

    NARCIS (Netherlands)

    Pool, JJM; van Mameren, H; Deville, WJLM; Assendelft, WJJ; de Vet, HCW; de Winter, AF; Koes, BW; Bouter, LM; Hoving, J.L.

    2005-01-01

    Background: Reproducibility measurements of the range of motion are an important prerequisite for the interpretation of study results. The aim of the study is to assess the intra-rater and interrater reproducibility of the measurement of active Range of Motion ( ROM) in patients with neck pain using

  1. Reproducibility of clinical research in critical care: a scoping review.

    Science.gov (United States)

    Niven, Daniel J; McCormick, T Jared; Straus, Sharon E; Hemmelgarn, Brenda R; Jeffs, Lianne; Barnes, Tavish R M; Stelfox, Henry T

    2018-02-21

    The ability to reproduce experiments is a defining principle of science. Reproducibility of clinical research has received relatively little scientific attention. However, it is important as it may inform clinical practice, research agendas, and the design of future studies. We used scoping review methods to examine reproducibility within a cohort of randomized trials examining clinical critical care research and published in the top general medical and critical care journals. To identify relevant clinical practices, we searched the New England Journal of Medicine, The Lancet, and JAMA for randomized trials published up to April 2016. To identify a comprehensive set of studies for these practices, included articles informed secondary searches within other high-impact medical and specialty journals. We included late-phase randomized controlled trials examining therapeutic clinical practices in adults admitted to general medical-surgical or specialty intensive care units (ICUs). Included articles were classified using a reproducibility framework. An original study was the first to evaluate a clinical practice. A reproduction attempt re-evaluated that practice in a new set of participants. Overall, 158 practices were examined in 275 included articles. A reproduction attempt was identified for 66 practices (42%, 95% CI 33-50%). Original studies reported larger effects than reproduction attempts (primary endpoint, risk difference 16.0%, 95% CI 11.6-20.5% vs. 8.4%, 95% CI 6.0-10.8%, P = 0.003). More than half of clinical practices with a reproduction attempt demonstrated effects that were inconsistent with the original study (56%, 95% CI 42-68%), among which a large number were reported to be efficacious in the original study and to lack efficacy in the reproduction attempt (34%, 95% CI 19-52%). Two practices reported to be efficacious in the original study were found to be harmful in the reproduction attempt. A minority of critical care practices with research published

  2. Reproducibility of Psychological Experiments as a Problem of Post-Nonclassical Science

    Directory of Open Access Journals (Sweden)

    Vachkov I.V.,

    2016-04-01

    A fundamental project on reproducibility carried out in the USA by Brian Nosek in 2015 (the Reproducibility Project) revealed a serious methodological problem in psychology: the issue of replication of psychological experiments. Reproducibility has traditionally been perceived as one of the basic principles of the scientific method. However, methodological analysis of the modern post-nonclassical stage in the development of science suggests that this might be too uncompromising as applied to psychology. It seems that the very criteria of scientific research need to be reconsidered with regard to the specifics of post-nonclassical science and, as the authors argue, reproducibility might as a result lose its key status or even be abandoned altogether. The reviewed problem and the proposed ways of coping with it are of high importance to research and practice in psychology, as they define the strategies for organizing, conducting, and evaluating experimental research.

  3. Coupled RipCAS-DFLOW (CoRD) Software and Data Management System for Reproducible Floodplain Vegetation Succession Modeling

    Science.gov (United States)

    Turner, M. A.; Miller, S.; Gregory, A.; Cadol, D. D.; Stone, M. C.; Sheneman, L.

    2016-12-01

    We present the Coupled RipCAS-DFLOW (CoRD) modeling system created to encapsulate the workflow to analyze the effects of stream flooding on vegetation succession. CoRD provides an intuitive command-line and web interface to run DFLOW and RipCAS in succession over many years automatically, which is a challenge because, for our application, DFLOW must be run on a supercomputing cluster via the PBS job scheduler. RipCAS is a vegetation succession model, and DFLOW is a 2D open channel flow model. Data adaptors have been developed to seamlessly convert DFLOW outputs into RipCAS inputs, and vice versa. CoRD provides automated statistical analysis and visualization, plus automatic syncing of input and output files and model run metadata to the hydrological data management system HydroShare using its excellent Python REST client. This combination of technologies and data management techniques allows the results to be shared with collaborators and eventually published. Perhaps most importantly, it allows results to be easily reproduced via either the command-line or web user interface. This system is a result of collaboration between software developers and hydrologists participating in the Western Consortium for Watershed Analysis, Visualization, and Exploration (WC-WAVE). Because of the computing-intensive nature of this particular workflow, including automating job submission/monitoring and data adaptors, software engineering expertise is required. However, the hydrologists provide the software developers with a purpose and ensure a useful, intuitive tool is developed. Our hydrologists contribute software, too: RipCAS was developed from scratch by hydrologists on the team as a specialized, open-source version of the Computer Aided Simulation Model for Instream Flow and Riparia (CASiMiR) vegetation model; our hydrologists running DFLOW provided numerous examples and help with the supercomputing system. This project is written in Python, a popular language in the

  4. Human broadly neutralizing antibodies to the envelope glycoprotein complex of hepatitis C virus

    DEFF Research Database (Denmark)

    Giang, Erick; Dorner, Marcus; Prentoe, Jannick C

    2012-01-01

    , and an effective vaccine should target conserved T- and B-cell epitopes of the virus. Conserved B-cell epitopes overlapping the CD81 receptor-binding site (CD81bs) on the E2 viral envelope glycoprotein have been reported previously and provide promising vaccine targets. In this study, we isolated 73 human mAbs recognizing five distinct antigenic regions on the virus envelope glycoprotein complex E1E2 from an HCV-immune phage-display antibody library by using an exhaustive-panning strategy. Many of these mAbs were broadly neutralizing. In particular, the mAb AR4A, recognizing a discontinuous epitope outside the CD81bs on the E1E2 complex, has an exceptionally broad neutralizing activity toward diverse HCV genotypes and protects against heterologous HCV challenge in a small animal model. The mAb panel will be useful for the design and development of vaccine candidates to elicit broadly neutralizing antibodies...

  5. Scapular dyskinesis in trapezius myalgia and intraexaminer reproducibility of clinical tests

    DEFF Research Database (Denmark)

    Juul-Kristensen, Birgit; Hilt, Kenneth; Enoch, Flemming

    2011-01-01

    dyskinesis, general health, and work ability, and 19 cases and 14 controls participated in the reproducibility study. Intraexaminer reproducibility was good to excellent for 6 of 10 clinical variables (Intraclass Correlation Coefficient [ICC] 0.76-0.91; kappa 0.84-1.00), and fair to good for four variables...

  6. Reproducibility of cervical range of motion in patients with neck pain

    NARCIS (Netherlands)

    Hoving, Jan Lucas; Pool, Jan J. M.; van Mameren, Henk; Devillé, Walter J. L. M.; Assendelft, Willem J. J.; de Vet, Henrica C. W.; de Winter, Andrea F.; Koes, Bart W.; Bouter, Lex M.

    2005-01-01

    BACKGROUND: Reproducibility measurements of the range of motion are an important prerequisite for the interpretation of study results. The aim of the study is to assess the intra-rater and inter-rater reproducibility of the measurement of active Range of Motion (ROM) in patients with neck pain using

  7. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data

    Directory of Open Access Journals (Sweden)

    Kuroda Mitzi I

    2010-07-01

    Abstract Background Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome-scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is used often to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. Results We develop the Quantized correlation coefficient (QCC) that is much less dependent on the amount of signal. This involves discretization of data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then properly focuses more on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. Conclusions To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.

  8. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.

    Science.gov (United States)

    Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J

    2010-07-27

    Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome-scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is used often to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the Quantized correlation coefficient (QCC) that is much less dependent on the amount of signal. This involves discretization of data into a set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then properly focuses more on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
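
    The QCC procedure described in these two records (quantize each replicate into quantiles, merge the background quantiles into one level, then recompute the Pearson correlation) can be sketched roughly as follows. The number of quantiles and the background fraction used here are illustrative choices, not the authors' published defaults, and the toy data are synthetic.

```python
import numpy as np

def quantized_correlation(x, y, n_quantiles=20, background_fraction=0.5):
    """Rough sketch of a quantized correlation coefficient for two ChIP replicates.

    1) Rank-transform each replicate into `n_quantiles` bins (quantization).
    2) Merge all bins below `background_fraction` into a single background level,
       so low-signal probes do not dominate the statistic.
    3) Return the Pearson correlation of the quantized values.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)

    def quantize(v):
        ranks = np.argsort(np.argsort(v))                     # 0 .. n-1
        q = np.floor(ranks * n_quantiles / len(v)).astype(int)
        cutoff = int(background_fraction * n_quantiles)
        return np.where(q < cutoff, cutoff, q)                # merge background bins

    qx, qy = quantize(x), quantize(y)
    return np.corrcoef(qx, qy)[0, 1]

# Toy example: two noisy replicates sharing a small set of enriched probes.
rng = np.random.default_rng(1)
signal = np.zeros(10000)
signal[:500] = rng.gamma(4.0, 1.0, 500)
rep1 = signal + rng.normal(0, 0.3, 10000)
rep2 = signal + rng.normal(0, 0.3, 10000)
print("Pearson :", np.corrcoef(rep1, rep2)[0, 1])
print("QCC-like:", quantized_correlation(rep1, rep2))
```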

  9. Reproducibility of the dynamics of facial expressions in unilateral facial palsy.

    Science.gov (United States)

    Alagha, M A; Ju, X; Morley, S; Ayoub, A

    2018-02-01

    The aim of this study was to assess the reproducibility of non-verbal facial expressions in unilateral facial paralysis using dynamic four-dimensional (4D) imaging. The Di4D system was used to record five facial expressions of 20 adult patients. The system captured 60 three-dimensional (3D) images per second; each facial expression took 3-4 seconds, which was recorded in real time. Thus a set of 180 3D facial images was generated for each expression. The procedure was repeated after 30 min to assess the reproducibility of the expressions. A mathematical facial mesh consisting of thousands of quasi-point 'vertices' was conformed to the face in order to determine the morphological characteristics in a comprehensive manner. The vertices were tracked throughout the sequence of the 180 images. Five key 3D facial frames from each sequence of images were analyzed. Comparisons were made between the first and second capture of each facial expression to assess the reproducibility of facial movements. Corresponding images were aligned using partial Procrustes analysis, and the root mean square distance between them was calculated and analyzed statistically (paired Student t-test). Facial expressions of lip purse, cheek puff, and raising of eyebrows were reproducible. Facial expressions of maximum smile and forceful eye closure were not reproducible. The limited coordination of various groups of facial muscles contributed to the lack of reproducibility of these facial expressions. 4D imaging is a useful clinical tool for the assessment of facial expressions. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
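
    The alignment-and-distance step described above (Procrustes superimposition of corresponding meshes followed by a root mean square distance) can be approximated with standard tooling. The sketch below uses scipy's ordinary Procrustes routine, which also rescales the shapes, so it is only an approximation of the partial Procrustes used in the study; the mesh arrays are random stand-ins, and the resulting RMS value is in normalized shape units.

```python
import numpy as np
from scipy.spatial import procrustes

def rms_distance_after_alignment(mesh_a, mesh_b):
    """Align two (n_vertices, 3) meshes and return the RMS vertex-to-vertex distance.

    scipy's `procrustes` applies translation, rotation and scaling; partial Procrustes
    (no scaling) would differ slightly, so treat this as an approximation.
    """
    aligned_a, aligned_b, _ = procrustes(mesh_a, mesh_b)
    return np.sqrt(np.mean(np.sum((aligned_a - aligned_b) ** 2, axis=1)))

# Hypothetical stand-ins for two captures of the same expression frame.
rng = np.random.default_rng(0)
capture1 = rng.normal(size=(5000, 3))
capture2 = capture1 + rng.normal(scale=0.01, size=(5000, 3))   # small repeat error
print(f"RMS distance (normalized units): {rms_distance_after_alignment(capture1, capture2):.4f}")
```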

  10. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  11. Reproducibility, Controllability, and Optimization of Lenr Experiments

    Science.gov (United States)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  12. Respiratory-Gated Helical Computed Tomography of Lung: Reproducibility of Small Volumes in an Ex Vivo Model

    International Nuclear Information System (INIS)

    Biederer, Juergen; Dinkel, Julien; Bolte, Hendrik; Welzel, Thomas; Hoffmann, Beata M.Sc.; Thierfelder, Carsten; Mende, Ulrich; Debus, Juergen; Heller, Martin; Kauczor, Hans-Ulrich

    2007-01-01

    Purpose: Motion-adapted radiotherapy with gated irradiation or tracking of tumor positions requires dedicated imaging techniques such as four-dimensional (4D) helical computed tomography (CT) for patient selection and treatment planning. The objective was to evaluate the reproducibility of spatial information for small objects on respiratory-gated 4D helical CT using computer-assisted volumetry of lung nodules in a ventilated ex vivo system. Methods and Materials: Five porcine lungs were inflated inside a chest phantom and prepared with 55 artificial nodules (mean diameter, 8.4 mm ± 1.8). The lungs were respirated by a flexible diaphragm and scanned with 40-row detector CT (collimation, 24 x 1.2 mm; pitch, 0.1; rotation time, 1 s; slice thickness, 1.5 mm; increment, 0.8 mm). The 4D-CT scans acquired during respiration (eight per minute) and reconstructed at 0-100% inspiration and equivalent static scans were scored for motion-related artifacts (0 or absent to 3 or relevant). The reproducibility of nodule volumetry (three readers) was assessed using the variation coefficient (VC). Results: The mean volumes from the static and dynamic inspiratory scans were equal (364.9 and 360.8 mm³, respectively, p = 0.24). The static and dynamic end-expiratory volumes were slightly greater (371.9 and 369.7 mm³, respectively, p = 0.019). The VC for volumetry (static) was 3.1%, with no significant difference between 20 apical and 20 caudal nodules (2.6% and 3.5%, p = 0.25). In dynamic scans, the VC was greater (3.9%, p = 0.004; apical and caudal, 2.6% and 4.9%; p = 0.004), with a significant difference between static and dynamic in the 20 caudal nodules (3.5% and 4.9%, p = 0.015). This was consistent with greater motion-related artifacts and image noise at the diaphragm (p < 0.05). The VC for interobserver variability was 0.6%. Conclusion: Residual motion-related artifacts had only minimal influence on volumetry of small solid lesions. This indicates a high reproducibility of

  13. On-line quantile regression in the RKHS (Reproducing Kernel Hilbert Space) for operational probabilistic forecasting of wind power

    International Nuclear Information System (INIS)

    Gallego-Castillo, Cristobal; Bessa, Ricardo; Cavalcante, Laura; Lopez-Garcia, Oscar

    2016-01-01

    Wind power probabilistic forecast is being used as input in several decision-making problems, such as stochastic unit commitment, operating reserve setting and electricity market bidding. This work introduces a new on-line quantile regression model based on the Reproducing Kernel Hilbert Space (RKHS) framework. Its application to the field of wind power forecasting involves a discussion on the choice of the bias term of the quantile models, and the consideration of the operational framework in order to mimic real conditions. Benchmarking against linear and spline quantile regression models was performed for a real case study over an 18-month period. Model parameter selection was based on k-fold cross-validation. Results showed a noticeable improvement in terms of calibration, a key criterion for the wind power industry. Modest improvements in terms of Continuous Ranked Probability Score (CRPS) were also observed for prediction horizons between 6 and 20 h ahead. - Highlights: • New online quantile regression model based on the Reproducing Kernel Hilbert Space. • First application to operational probabilistic wind power forecasting. • Modest improvements of CRPS for prediction horizons between 6 and 20 h ahead. • Noticeable improvements in terms of calibration due to online learning.
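
    The core of an online RKHS quantile regression can be sketched with functional stochastic gradient descent on the pinball (quantile) loss. The sketch below is a generic illustration, not the model proposed in the record above (which, among other things, treats the bias term in a specific way and is designed for operational use); the Gaussian kernel, the learning-rate and regularization values, and the toy wind-power data are all assumptions.

```python
import numpy as np

class OnlineKernelQuantile:
    """Online quantile regression in an RKHS via functional SGD on the pinball loss.

    A generic sketch (Gaussian kernel, growing expansion); practical implementations
    bound the number of kernel centres and handle the bias term more carefully.
    """

    def __init__(self, tau=0.9, eta=0.1, lam=1e-3, gamma=1.0):
        self.tau, self.eta, self.lam, self.gamma = tau, eta, lam, gamma
        self.centres, self.alphas = [], []
        self.bias = 0.0

    def _k(self, x, z):
        return np.exp(-self.gamma * np.sum((x - z) ** 2))

    def predict(self, x):
        return self.bias + sum(a * self._k(x, c)
                               for a, c in zip(self.alphas, self.centres))

    def update(self, x, y):
        r = y - self.predict(x)
        g = self.tau if r > 0 else self.tau - 1.0       # pinball subgradient factor
        self.alphas = [(1.0 - self.eta * self.lam) * a for a in self.alphas]
        self.centres.append(np.asarray(x, float))
        self.alphas.append(self.eta * g)                # new kernel centre at x
        self.bias += self.eta * g                       # unregularised bias term

# Toy stream: estimate the 90th percentile of noisy "power" given "forecast speed".
rng = np.random.default_rng(3)
model = OnlineKernelQuantile(tau=0.9, gamma=0.5)
for _ in range(1000):
    speed = rng.uniform(0, 10)
    power = np.tanh(0.4 * speed) + rng.normal(0, 0.1)
    model.update(np.array([speed]), power)
print("q90 estimate at speed 5:", model.predict(np.array([5.0])))
```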

  14. Cervical vertebrae maturation method morphologic criteria: poor reproducibility.

    Science.gov (United States)

    Nestman, Trenton S; Marshall, Steven D; Qian, Fang; Holton, Nathan; Franciscus, Robert G; Southard, Thomas E

    2011-08-01

    The cervical vertebrae maturation (CVM) method has been advocated as a predictor of peak mandibular growth. A careful review of the literature showed potential methodologic errors that might influence the high reported reproducibility of the CVM method, and we recently established that the reproducibility of the CVM method was poor when these potential errors were eliminated. The purpose of this study was to further investigate the reproducibility of the individual vertebral patterns. In other words, the purpose was to determine which of the individual CVM vertebral patterns could be classified reliably and which could not. Ten practicing orthodontists, trained in the CVM method, evaluated the morphology of cervical vertebrae C2 through C4 from 30 cephalometric radiographs using questions based on the CVM method. The Fleiss kappa statistic was used to assess interobserver agreement when evaluating each cervical vertebrae morphology question for each subject. The Kendall coefficient of concordance was used to assess the level of interobserver agreement when determining a "derived CVM stage" for each subject. Interobserver agreement was high for assessment of the lower borders of C2, C3, and C4 that were either flat or curved in the CVM method, but interobserver agreement was low for assessment of the vertebral bodies of C3 and C4 when they were either trapezoidal, rectangular horizontal, square, or rectangular vertical; this led to the overall poor reproducibility of the CVM method. These findings were reflected in the Fleiss kappa statistic. Furthermore, nearly 30% of the time, individual morphologic criteria could not be combined to generate a final CVM stage because of incompatible responses to the 5 questions. Intraobserver agreement in this study was only 62%, on average, when the inconclusive stagings were excluded as disagreements. Intraobserver agreement was worse (44%) when the inconclusive stagings were included as disagreements. For the group of subjects
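
    The Fleiss kappa statistic used above for interobserver agreement can be computed directly from a subjects x categories table of rater counts. The sketch below shows the standard calculation; the example table (5 radiographs, 10 raters, 4 vertebral-shape categories) is hypothetical and not data from the study.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa from a (subjects x categories) table of rater counts.

    Each row must sum to the same number of raters.
    """
    counts = np.asarray(counts, float)
    n_subjects, _ = counts.shape
    n_raters = counts.sum(axis=1)[0]

    p_j = counts.sum(axis=0) / (n_subjects * n_raters)             # category proportions
    p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()                                             # observed agreement
    p_e = np.sum(p_j ** 2)                                         # chance agreement
    return (p_bar - p_e) / (1.0 - p_e)

# Hypothetical: 5 radiographs, 10 raters, 4 vertebral-body shape categories.
table = np.array([[7, 2, 1, 0],
                  [3, 4, 2, 1],
                  [0, 1, 6, 3],
                  [5, 5, 0, 0],
                  [2, 3, 3, 2]])
print(f"Fleiss kappa = {fleiss_kappa(table):.3f}")
```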

  15. X-ray Emitting GHz-Peaked Spectrum Galaxies: Testing a Dynamical-Radiative Model with Broad-Band Spectra

    International Nuclear Information System (INIS)

    Ostorero, L.; Moderski, R.; Stawarz, L.; Diaferio, A.; Kowalska, I.; Cheung, C.C.; Kataoka, J.; Begelman, M.C.; Wagner, S.J.

    2010-01-01

    In a dynamical-radiative model we recently developed to describe the physics of compact, GHz-Peaked-Spectrum (GPS) sources, the relativistic jets propagate across the inner, kpc-sized region of the host galaxy, while the electron population of the expanding lobes evolves and emits synchrotron and inverse-Compton (IC) radiation. Interstellar-medium gas clouds engulfed by the expanding lobes, and photoionized by the active nucleus, are responsible for the radio spectral turnover through free-free absorption (FFA) of the synchrotron photons. The model provides a description of the evolution of the GPS spectral energy distribution (SED) with the source expansion, predicting significant and complex high-energy emission, from the X-ray to the γ-ray frequency domain. Here, we test this model with the broad-band SEDs of a sample of eleven X-ray emitting GPS galaxies with Compact-Symmetric-Object (CSO) morphology, and show that: (i) the shape of the radio continuum at frequencies lower than the spectral turnover is indeed well accounted for by the FFA mechanism; (ii) the observed X-ray spectra can be interpreted as non-thermal radiation produced via IC scattering of the local radiation fields off the lobe particles, providing a viable alternative to the thermal, accretion-disk dominated scenario. We also show that the relation between the hydrogen column densities derived from the X-ray (N_H) and radio (N_HI) data of the sources is suggestive of a positive correlation, which, if confirmed by future observations, would provide further support to our scenario of high-energy emitting lobes.

  16. Modeling neutralization kinetics of HIV by broadly neutralizing monoclonal antibodies in genital secretions coating the cervicovaginal mucosa.

    Directory of Open Access Journals (Sweden)

    Scott A McKinley

    Full Text Available Eliciting broadly neutralizing antibodies (bnAb) in cervicovaginal mucus (CVM) represents a promising "first line of defense" strategy to reduce vaginal HIV transmission. However, it remains unclear what levels of bnAb must be present in CVM to effectively reduce infection. We approached this complex question by modeling the dynamic tally of bnAb coverage on HIV. This analysis introduces a critical, timescale-dependent competition: to protect, bnAb must accumulate at sufficient stoichiometry to neutralize HIV faster than virions penetrate CVM and reach target cells. We developed a model that incorporates concentrations and diffusivities of HIV and bnAb in semen and CVM, kinetic rates for binding (kon) and unbinding (koff) of select bnAb, and physiologically relevant thicknesses of CVM and semen layers. Comprehensive model simulations lead to robust conclusions about neutralization kinetics in CVM. First, due to the limited time virions in semen need to penetrate CVM, substantially greater bnAb concentrations than in vitro estimates must be present in CVM to neutralize HIV. Second, the model predicts that bnAb with more rapid kon, almost independent of koff, should offer greater neutralization potency in vivo. These findings suggest the fastest arriving virions at target cells present the greatest likelihood of infection. It also implies the marked improvements in in vitro neutralization potency of many recently discovered bnAb may not translate to comparable reduction in the bnAb dose needed to confer protection against initial vaginal infections. Our modeling framework offers a valuable tool for gaining quantitative insights into the dynamics of mucosal immunity against HIV and other infectious diseases.

  17. Modeling neutralization kinetics of HIV by broadly neutralizing monoclonal antibodies in genital secretions coating the cervicovaginal mucosa.

    Science.gov (United States)

    McKinley, Scott A; Chen, Alex; Shi, Feng; Wang, Simi; Mucha, Peter J; Forest, M Gregory; Lai, Samuel K

    2014-01-01

    Eliciting broadly neutralizing antibodies (bnAb) in cervicovaginal mucus (CVM) represents a promising "first line of defense" strategy to reduce vaginal HIV transmission. However, it remains unclear what levels of bnAb must be present in CVM to effectively reduce infection. We approached this complex question by modeling the dynamic tally of bnAb coverage on HIV. This analysis introduces a critical, timescale-dependent competition: to protect, bnAb must accumulate at sufficient stoichiometry to neutralize HIV faster than virions penetrate CVM and reach target cells. We developed a model that incorporates concentrations and diffusivities of HIV and bnAb in semen and CVM, kinetic rates for binding (kon) and unbinding (koff) of select bnAb, and physiologically relevant thicknesses of CVM and semen layers. Comprehensive model simulations lead to robust conclusions about neutralization kinetics in CVM. First, due to the limited time virions in semen need to penetrate CVM, substantially greater bnAb concentrations than in vitro estimates must be present in CVM to neutralize HIV. Second, the model predicts that bnAb with more rapid kon, almost independent of koff, should offer greater neutralization potency in vivo. These findings suggest the fastest arriving virions at target cells present the greatest likelihood of infection. It also implies the marked improvements in in vitro neutralization potency of many recently discovered bnAb may not translate to comparable reduction in the bnAb dose needed to confer protection against initial vaginal infections. Our modeling framework offers a valuable tool for gaining quantitative insights into the dynamics of mucosal immunity against HIV and other infectious diseases.
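
    The core competition described in this model, antibody accumulation on a virion versus the virion's transit across the mucus layer, can be illustrated with a far simpler calculation than the full reaction-diffusion model of the paper. The sketch below only compares a characteristic diffusion time across a CVM layer with the expected time to bind an assumed number of bnAb per virion at rate kon·[Ab]; every parameter value (layer thickness, diffusivity, kon, bnAb concentration, binding stoichiometry) is an illustrative assumption, not a value from the study.

```python
# Minimal sketch (not the authors' model): compare the characteristic time for
# bnAb to accumulate on a virion against the time for the virion to diffuse
# across a cervicovaginal mucus (CVM) layer. All numbers are illustrative.

import numpy as np

# Assumed parameters (placeholders, not values from the paper)
L_cvm = 50e-6            # CVM layer thickness [m]
D_virion = 1e-12         # effective HIV diffusivity in CVM [m^2/s]
k_on = 1e5               # bnAb-Env association rate [1/(M*s)]
ab_conc = 1e-8           # free bnAb concentration in CVM [M]
sites_needed = 10        # bnAb molecules assumed required to neutralize a virion

# Characteristic time for a virion to diffuse across the layer (1-D estimate)
t_diffusion = L_cvm**2 / (2.0 * D_virion)

# Mean time to occupy the required number of Env sites, treating each binding
# event as a Poisson process with rate k_on * [Ab] (unbinding neglected)
per_site_rate = k_on * ab_conc
t_neutralize = sites_needed / per_site_rate

print(f"diffusion time across CVM : {t_diffusion / 60:8.1f} min")
print(f"time to bind {sites_needed} bnAb      : {t_neutralize / 60:8.1f} min")
print("protected (in this crude sense)" if t_neutralize < t_diffusion
      else "virions may outrun neutralization")
```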

  18. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    Science.gov (United States)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first-order equation, which can be solved analytically. Moreover, we endeavored to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from medicinal plants.
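
    Reducing the peroxidation kinetics to a first-order equation, as described above, implies product accumulation of the form C(t) = Cmax*(1 - exp(-k*t)), with the rate constant k recoverable by fitting. The sketch below illustrates such a fit on synthetic data; the functional form, parameter names, and values are assumptions for illustration rather than the authors' published equations.

```python
# Illustrative sketch: fit a first-order kinetic model C(t) = Cmax*(1 - exp(-k*t))
# to synthetic "peroxidation product" data. Form and parameters are assumed for
# illustration; they are not taken from the paper.

import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c_max, k):
    return c_max * (1.0 - np.exp(-k * t))

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 13)                      # minutes
true_c_max, true_k = 4.2, 0.08                  # arbitrary units, 1/min
data = first_order(t, true_c_max, true_k) + rng.normal(0, 0.1, t.size)

popt, pcov = curve_fit(first_order, t, data, p0=[1.0, 0.01])
c_max_hat, k_hat = popt
print(f"fitted Cmax = {c_max_hat:.2f}, k = {k_hat:.3f} 1/min")

# An "inhibition intensity" for an antioxidant could then be expressed as the
# relative reduction of k (or Cmax) compared with a control run.
```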

  19. An assessment of geographical distribution of different plant functional types over North America simulated using the CLASS-CTEM modelling framework

    Science.gov (United States)

    Shrestha, Rudra K.; Arora, Vivek K.; Melton, Joe R.; Sushama, Laxmi

    2017-10-01

    The performance of the competition module of the CLASS-CTEM (Canadian Land Surface Scheme and Canadian Terrestrial Ecosystem Model) modelling framework is assessed at 1° spatial resolution over North America by comparing the simulated geographical distribution of its plant functional types (PFTs) with two observation-based estimates. The model successfully reproduces the broad geographical distribution of trees, grasses, and bare ground, although limitations remain. In particular, compared to the two observation-based estimates, the simulated fractional vegetation coverage is lower in the arid southwest North American region and higher in the Arctic region. The lower-than-observed simulated vegetation coverage in the southwest region is attributed to lack of representation of shrubs in the model and plausible errors in the observation-based data sets. The observation-based data indicate vegetation fractional coverage of more than 60% in this arid region, even though the region receives only 200-300 mm of precipitation annually, and observation-based leaf area index (LAI) values in the region are lower than one. The higher-than-observed vegetation fractional coverage in the Arctic is likely due to the lack of representation of moss and lichen PFTs, and also likely because of inadequate representation of permafrost in the model, as a result of which the C3 grass PFT performs overly well in the region. The model generally reproduces the broad spatial distribution and the total area covered by the two primary tree PFTs (needleleaf evergreen trees, NDL-EVG; and broadleaf cold deciduous trees, BDL-DCD-CLD) reasonably well. The simulated fractional coverage of tree PFTs increases after the 1960s in response to the CO2 fertilization effect and climate warming. Differences between observed and simulated PFT coverages highlight model limitations and suggest that the inclusion of shrubs, and moss and lichen PFTs, and an adequate representation of permafrost will help improve
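
    An assessment of this kind ultimately reduces to scoring, for each plant functional type, how closely the simulated fractional-coverage map matches the observation-based one. The sketch below shows one generic way to do that (area-weighted bias and RMSE on a latitude-longitude grid) using synthetic arrays; it is not the CLASS-CTEM evaluation code, and the grid, weighting, and fields are assumptions.

```python
# Illustrative scoring of simulated vs. observation-based PFT fractional coverage
# on a regular lat-lon grid (area-weighted bias and RMSE). Synthetic arrays stand
# in for the real fields; this is not the CLASS-CTEM evaluation code.

import numpy as np

nlat, nlon = 60, 120                       # roughly 1-degree grid over a domain
lat = np.linspace(30, 89, nlat)
rng = np.random.default_rng(1)

sim = np.clip(rng.random((nlat, nlon)), 0, 1)              # simulated tree fraction
obs = np.clip(sim + rng.normal(0, 0.1, sim.shape), 0, 1)   # "observed" fraction

# Grid-cell area weights proportional to cos(latitude)
w = np.cos(np.deg2rad(lat))[:, None] * np.ones((1, nlon))
w /= w.sum()

bias = np.sum(w * (sim - obs))
rmse = np.sqrt(np.sum(w * (sim - obs) ** 2))
print(f"area-weighted bias = {bias:+.3f}, RMSE = {rmse:.3f} (fractional coverage)")
```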

  20. Analysis of mammalian gene function through broad-based phenotypic screens across a consortium of mouse clinics.

    Science.gov (United States)

    de Angelis, Martin Hrabě; Nicholson, George; Selloum, Mohammed; White, Jacqui; Morgan, Hugh; Ramirez-Solis, Ramiro; Sorg, Tania; Wells, Sara; Fuchs, Helmut; Fray, Martin; Adams, David J; Adams, Niels C; Adler, Thure; Aguilar-Pimentel, Antonio; Ali-Hadji, Dalila; Amann, Gregory; André, Philippe; Atkins, Sarah; Auburtin, Aurelie; Ayadi, Abdel; Becker, Julien; Becker, Lore; Bedu, Elodie; Bekeredjian, Raffi; Birling, Marie-Christine; Blake, Andrew; Bottomley, Joanna; Bowl, Mike; Brault, Véronique; Busch, Dirk H; Bussell, James N; Calzada-Wack, Julia; Cater, Heather; Champy, Marie-France; Charles, Philippe; Chevalier, Claire; Chiani, Francesco; Codner, Gemma F; Combe, Roy; Cox, Roger; Dalloneau, Emilie; Dierich, André; Di Fenza, Armida; Doe, Brendan; Duchon, Arnaud; Eickelberg, Oliver; Esapa, Chris T; El Fertak, Lahcen; Feigel, Tanja; Emelyanova, Irina; Estabel, Jeanne; Favor, Jack; Flenniken, Ann; Gambadoro, Alessia; Garrett, Lilian; Gates, Hilary; Gerdin, Anna-Karin; Gkoutos, George; Greenaway, Simon; Glasl, Lisa; Goetz, Patrice; Da Cruz, Isabelle Goncalves; Götz, Alexander; Graw, Jochen; Guimond, Alain; Hans, Wolfgang; Hicks, Geoff; Hölter, Sabine M; Höfler, Heinz; Hancock, John M; Hoehndorf, Robert; Hough, Tertius; Houghton, Richard; Hurt, Anja; Ivandic, Boris; Jacobs, Hughes; Jacquot, Sylvie; Jones, Nora; Karp, Natasha A; Katus, Hugo A; Kitchen, Sharon; Klein-Rodewald, Tanja; Klingenspor, Martin; Klopstock, Thomas; Lalanne, Valerie; Leblanc, Sophie; Lengger, Christoph; le Marchand, Elise; Ludwig, Tonia; Lux, Aline; McKerlie, Colin; Maier, Holger; Mandel, Jean-Louis; Marschall, Susan; Mark, Manuel; Melvin, David G; Meziane, Hamid; Micklich, Kateryna; Mittelhauser, Christophe; Monassier, Laurent; Moulaert, David; Muller, Stéphanie; Naton, Beatrix; Neff, Frauke; Nolan, Patrick M; Nutter, Lauryl Mj; Ollert, Markus; Pavlovic, Guillaume; Pellegata, Natalia S; Peter, Emilie; Petit-Demoulière, Benoit; Pickard, Amanda; Podrini, Christine; Potter, Paul; Pouilly, Laurent; Puk, Oliver; Richardson, David; Rousseau, Stephane; Quintanilla-Fend, Leticia; Quwailid, Mohamed M; Racz, Ildiko; Rathkolb, Birgit; Riet, Fabrice; Rossant, Janet; Roux, Michel; Rozman, Jan; Ryder, Ed; Salisbury, Jennifer; Santos, Luis; Schäble, Karl-Heinz; Schiller, Evelyn; Schrewe, Anja; Schulz, Holger; Steinkamp, Ralf; Simon, Michelle; Stewart, Michelle; Stöger, Claudia; Stöger, Tobias; Sun, Minxuan; Sunter, David; Teboul, Lydia; Tilly, Isabelle; Tocchini-Valentini, Glauco P; Tost, Monica; Treise, Irina; Vasseur, Laurent; Velot, Emilie; Vogt-Weisenhorn, Daniela; Wagner, Christelle; Walling, Alison; Weber, Bruno; Wendling, Olivia; Westerberg, Henrik; Willershäuser, Monja; Wolf, Eckhard; Wolter, Anne; Wood, Joe; Wurst, Wolfgang; Yildirim, Ali Önder; Zeh, Ramona; Zimmer, Andreas; Zimprich, Annemarie; Holmes, Chris; Steel, Karen P; Herault, Yann; Gailus-Durner, Valérie; Mallon, Ann-Marie; Brown, Steve Dm

    2015-09-01

    The function of the majority of genes in the mouse and human genomes remains unknown. The mouse embryonic stem cell knockout resource provides a basis for the characterization of relationships between genes and phenotypes. The EUMODIC consortium developed and validated robust methodologies for the broad-based phenotyping of knockouts through a pipeline comprising 20 disease-oriented platforms. We developed new statistical methods for pipeline design and data analysis aimed at detecting reproducible phenotypes with high power. We acquired phenotype data from 449 mutant alleles, representing 320 unique genes, of which half had no previous functional annotation. We captured data from over 27,000 mice, finding that 83% of the mutant lines are phenodeviant, with 65% demonstrating pleiotropy. Surprisingly, we found significant differences in phenotype annotation according to zygosity. New phenotypes were uncovered for many genes with previously unknown function, providing a powerful basis for hypothesis generation and further investigation in diverse systems.

  1. Retrospective Correction of Physiological Noise: Impact on Sensitivity, Specificity, and Reproducibility of Resting-State Functional Connectivity in a Reading Network Model.

    Science.gov (United States)

    Krishnamurthy, Venkatagiri; Krishnamurthy, Lisa C; Schwam, Dina M; Ealey, Ashley; Shin, Jaemin; Greenberg, Daphne; Morris, Robin D

    2018-03-01

    It is well accepted that physiological noise (PN) obscures the detection of neural fluctuations in resting-state functional connectivity (rsFC) magnetic resonance imaging. However, a clear consensus for an optimal PN correction (PNC) methodology and how it can impact the rsFC signal characteristics is still lacking. In this study, we probe the impact of three PNC methods: RETROICOR (Glover et al., 2000), ANATICOR (Jo et al., 2010), and RVTMBPM (Bianciardi et al., 2009). Using a reading network model, we systematically explore the effects of PNC optimization on sensitivity, specificity, and reproducibility of rsFC signals. In terms of specificity, ANATICOR was found to be effective in removing local white matter (WM) fluctuations and also resulted in aggressive removal of expected cortical-to-subcortical functional connections. The ability of RETROICOR to remove PN was equivalent to removal of simulated random PN such that it artificially inflated the connection strength, thereby decreasing sensitivity. RVTMBPM maintained specificity and sensitivity by balanced removal of vasodilatory PN and local WM nuisance edges. Another aspect of this work was exploring the effects of PNC on identifying reading group differences. Most PNC methods accounted for between-subject PN variability, resulting in reduced intersession reproducibility. This effect facilitated the detection of the most consistent group differences. RVTMBPM was most effective in detecting significant group differences due to its inherent sensitivity to removing spatially structured and temporally repeating PN arising from dense vasculature. Finally, results suggest that combining all three PNC methods resulted in "overcorrection" by removing signal along with noise.
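
    All three correction strategies compared above ultimately amount to regressing a set of physiological nuisance time courses out of each voxel's signal before functional connectivity is computed. A generic version of that step is sketched below using ordinary least squares on synthetic data; the regressors are placeholders, and the sketch is not an implementation of RETROICOR, ANATICOR, or RVTMBPM.

```python
# Generic nuisance-regression sketch: project physiological regressors out of a
# voxel time series before computing resting-state functional connectivity.
# Regressors here are synthetic placeholders, not RETROICOR/ANATICOR/RVTMBPM.

import numpy as np

rng = np.random.default_rng(2)
n_tr = 300                                   # number of volumes
t = np.arange(n_tr) * 2.0                    # TR = 2 s

cardiac = np.sin(2 * np.pi * 1.1 * t)        # stand-in cardiac regressor
resp = np.sin(2 * np.pi * 0.3 * t)           # stand-in respiratory regressor
neural = rng.normal(0, 1, n_tr)              # "signal of interest"
voxel = neural + 0.8 * cardiac + 0.5 * resp + rng.normal(0, 0.2, n_tr)

# Design matrix: intercept + nuisance regressors
X = np.column_stack([np.ones(n_tr), cardiac, resp])
beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
cleaned = voxel - X @ beta                   # residuals = corrected time series

corr_before = np.corrcoef(voxel, neural)[0, 1]
corr_after = np.corrcoef(cleaned, neural)[0, 1]
print(f"correlation with neural signal: before {corr_before:.2f}, after {corr_after:.2f}")
```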

  2. Microbial community development in a dynamic gut model is reproducible, colon region specific, and selective for Bacteroidetes and Clostridium cluster IX.

    Science.gov (United States)

    Van den Abbeele, Pieter; Grootaert, Charlotte; Marzorati, Massimo; Possemiers, Sam; Verstraete, Willy; Gérard, Philippe; Rabot, Sylvie; Bruneau, Aurélia; El Aidy, Sahar; Derrien, Muriel; Zoetendal, Erwin; Kleerebezem, Michiel; Smidt, Hauke; Van de Wiele, Tom

    2010-08-01

    Dynamic, multicompartment in vitro gastrointestinal simulators are often used to monitor gut microbial dynamics and activity. These reactors need to harbor a microbial community that is stable upon inoculation, colon region specific, and relevant to in vivo conditions. Together with the reproducibility of the colonization process, these criteria are often overlooked when the modulatory properties from different treatments are compared. We therefore investigated the microbial colonization process in two identical simulators of the human intestinal microbial ecosystem (SHIME), simultaneously inoculated with the same human fecal microbiota, using a high-resolution phylogenetic microarray: the human intestinal tract chip (HITChip). Following inoculation of the in vitro colon compartments, microbial community composition reached steady state after 2 weeks, whereas 3 weeks were required to reach functional stability. This dynamic colonization process was reproducible in both SHIME units and resulted in highly diverse microbial communities which were colon region specific, with the proximal regions harboring saccharolytic microbes (e.g., Bacteroides spp. and Eubacterium spp.) and the distal regions harboring mucin-degrading microbes (e.g., Akkermansia spp.). Importantly, the shift from an in vivo to an in vitro environment resulted in an increased Bacteroidetes/Firmicutes ratio, whereas Clostridium cluster IX (propionate producers) was enriched compared to clusters IV and XIVa (butyrate producers). This was supported by proportionally higher in vitro propionate concentrations. In conclusion, high-resolution analysis of in vitro-cultured gut microbiota offers new insight into the microbial colonization process and indicates the importance of digestive parameters that may be crucial in the development of new in vitro models.

  3. Reproducibility of heart rate variability parameters measured in healthy subjects at rest and after a postural change maneuver

    Directory of Open Access Journals (Sweden)

    E.M. Dantas

    2010-10-01

    Full Text Available Heart rate variability (HRV) provides important information about cardiac autonomic modulation. Since it is a noninvasive and inexpensive method, HRV has been used to evaluate several parameters of cardiovascular health. However, the internal reproducibility of this method has been challenged in some studies. Our aim was to determine the intra-individual reproducibility of HRV parameters in short-term recordings obtained in supine and orthostatic positions. Electrocardiographic (ECG) recordings were obtained from 30 healthy subjects (20-49 years, 14 men) using a digital apparatus (sampling rate = 250 Hz). ECG was recorded for 10 min in the supine position and for 10 min in the orthostatic position. The procedure was repeated 2-3 h later. Time and frequency domain analyses were performed. Frequency domain included low-frequency (LF, 0.04-0.15 Hz) and high-frequency (HF, 0.15-0.4 Hz) bands. Power spectral analysis was performed by the autoregressive method and model order was set at 16. Intra-subject agreement was assessed by linear regression analysis, test of difference in variances, and limits of agreement. Most HRV measures (pNN50, RMSSD, LF, HF, and LF/HF ratio) were reproducible independent of body position. Better correlation indexes (r > 0.6) were obtained in the orthostatic position. Bland-Altman plots revealed that most values were inside the agreement limits, indicating concordance between measures. Only SDNN and NNv in the supine position were not reproducible. Our results showed reproducibility of HRV parameters when recorded in the same individual with a short time between two exams. The increased sympathetic activity occurring in the orthostatic position probably facilitates reproducibility of the HRV indexes.
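
    The frequency-domain indexes reported above (LF, HF, and their ratio) are band powers of the RR-interval series. The study used an order-16 autoregressive spectrum; the sketch below instead uses Welch's periodogram on a synthetic RR series, simply to show how the LF and HF powers are integrated over their bands. The resampling rate and the synthetic series are assumptions.

```python
# Sketch: LF and HF band power from an RR-interval series. The abstract uses an
# order-16 autoregressive spectrum; here Welch's method is applied to synthetic
# data only to illustrate how the band powers are obtained.

import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

rng = np.random.default_rng(3)
n_beats = 600
beat_idx = np.arange(n_beats)
rr = (0.8 + 0.03 * np.sin(2 * np.pi * 0.1 * beat_idx)      # slow (LF-like) modulation
          + 0.02 * np.sin(2 * np.pi * 0.25 * beat_idx)     # respiratory (HF-like) modulation
          + rng.normal(0, 0.01, n_beats))                  # RR intervals [s]

# Resample the unevenly spaced RR series onto a uniform 4 Hz grid
beat_times = np.cumsum(rr)
fs = 4.0
t_uniform = np.arange(beat_times[0], beat_times[-1], 1 / fs)
rr_uniform = interp1d(beat_times, rr, kind="cubic")(t_uniform)

f, pxx = welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=256)

def band_power(f, pxx, lo, hi):
    mask = (f >= lo) & (f < hi)
    return pxx[mask].sum() * (f[1] - f[0])

lf = band_power(f, pxx, 0.04, 0.15)
hf = band_power(f, pxx, 0.15, 0.40)
print(f"LF = {lf:.2e} s^2, HF = {hf:.2e} s^2, LF/HF = {lf / hf:.2f}")
```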

  4. Reliability and reproducibility of subaxial cervical injury description system: a standardized nomenclature schema.

    Science.gov (United States)

    Bono, Christopher M; Schoenfeld, Andrew; Gupta, Giri; Harrop, James S; Anderson, Paul; Patel, Alpesh A; Dimar, John; Aarabi, Bizhan; Dailey, Andrew; Vaccaro, Alexander R; Gahr, Ralf; Shaffrey, Christopher; Anderson, David G; Rampersaud, Raj

    2011-08-01

    Radiographic measurement study. To develop a standardized cervical injury nomenclature system to facilitate description, communication, and classification among health care providers. The reliability and reproducibility of this system was then examined. Description of subaxial cervical injuries is critical for treatment decision making and comparing scientific reports of outcomes. Despite a number of available classification systems, surgeons and researchers continue to use descriptive nomenclature, such as "burst" and "teardrop" fractures, to describe injuries. However, there is considerable inconsistency with use of such terms in the literature. Eleven distinct injury types and associated definitions were established for the subaxial cervical spine and subsequently refined by members of the Spine Trauma Study Group. A series of 18 cases of patients with a broad spectrum of subaxial cervical spine injuries was prepared and distributed to surgeon raters. Each rater was provided with the full nomenclature document and asked to select primary and secondary injury types for each case. After receipt of the raters' first round of classifications, the cases were resorted and returned to the raters for a second round of review. Interrater and intrarater reliabilities were calculated as percent agreement and Cohen kappa (κ) values. Intrarater reliability was assessed by comparing a given rater's diagnosis from the first and second rounds. Nineteen surgeons completed the first and second rounds of the study. Overall, the system demonstrated 56.4% interrater agreement and 72.8% intrarater agreement. Interrater κ values demonstrated moderate agreement, while intrarater κ values showed substantial agreement. Analyzed by injury types, only four (burst fractures, lateral mass fractures, flexion teardrop fractures, and anterior distraction injuries) demonstrated greater than 50% interrater agreement. This study demonstrated that, even in ideal circumstances, there is

  5. Validation of the 3D Skin Comet assay using full thickness skin models: Transferability and reproducibility.

    Science.gov (United States)

    Reisinger, Kerstin; Blatz, Veronika; Brinkmann, Joep; Downs, Thomas R; Fischer, Anja; Henkler, Frank; Hoffmann, Sebastian; Krul, Cyrille; Liebsch, Manfred; Luch, Andreas; Pirow, Ralph; Reus, Astrid A; Schulz, Markus; Pfuhler, Stefan

    2018-03-01

    Recently revised OECD Testing Guidelines highlight the importance of considering the first site-of-contact when investigating the genotoxic hazard. Thus far, only in vivo approaches are available to address the dermal route of exposure. The 3D Skin Comet and Reconstructed Skin Micronucleus (RSMN) assays intend to close this gap in the in vitro genotoxicity toolbox by investigating DNA damage after topical application. This represents the most relevant route of exposure for a variety of compounds found in household products, cosmetics, and industrial chemicals. The comet assay methodology is able to detect both chromosomal damage and DNA lesions that may give rise to gene mutations, thereby complementing the RSMN, which detects only chromosomal damage. Here, the comet assay was adapted to two reconstructed full-thickness human skin models: the EpiDerm™ and Phenion® Full-Thickness Skin Models. First, tissue-specific protocols for the isolation of single cells and the general comet assay were transferred to European and US-American laboratories. After establishment of the assay, the protocol was then further optimized with appropriate cytotoxicity measurements and the use of aphidicolin, a DNA repair inhibitor, to improve the assay's sensitivity. In the first phase of an ongoing validation study, eight chemicals were tested in three laboratories each using the Phenion® Full-Thickness Skin Model, informing several validation modules. Ultimately, the 3D Skin Comet assay demonstrated a high predictive capacity and good intra- and inter-laboratory reproducibility, with four laboratories reaching a 100% predictivity and the fifth yielding 70%. The data are intended to demonstrate the use of the 3D Skin Comet assay as a new in vitro tool for following up on positive findings from the standard in vitro genotoxicity test battery for dermally applied chemicals, ultimately helping to drive the regulatory acceptance of the assay. To expand the database, the validation will

  6. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaete; Benedeti, Augusto Cesar Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge, E-mail: fernando@fatesa.edu.br [Faculdade de Tecnologia em Saude (FATESA), Ribeirao Preto, SP (Brazil); Universidade de Fortaleza (UNIFOR), Fortaleza, CE (Brazil). Departmento de Radiologia; Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina. Departmento de Medicina Clinica; Universidade de Sao Paulo (FFCLRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Filosofia, Ciencias e Letras; Hospital Mae de Deus, Porto Alegre, RS (Brazil)

    2017-05-15

    Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility. (author)

  7. Empirical evaluation of cross-site reproducibility in radiomic features for characterizing prostate MRI

    Science.gov (United States)

    Chirra, Prathyush; Leo, Patrick; Yim, Michael; Bloch, B. Nicolas; Rastinehad, Ardeshir R.; Purysko, Andrei; Rosen, Mark; Madabhushi, Anant; Viswanath, Satish

    2018-02-01

    The recent advent of radiomics has enabled the development of prognostic and predictive tools which use routine imaging, but a key question that still remains is how reproducible these features may be across multiple sites and scanners. This is especially relevant in the context of MRI data, where signal intensity values lack tissue-specific, quantitative meaning, as well as being dependent on acquisition parameters (magnetic field strength, image resolution, type of receiver coil). In this paper we present the first empirical study of the reproducibility of 5 different radiomic feature families in a multi-site setting; specifically, for characterizing prostate MRI appearance. Our cohort comprised 147 patient T2w MRI datasets from 4 different sites, all of which were first pre-processed to correct for acquisition-related artifacts such as bias field, differing voxel resolutions, and intensity drift (non-standardness). 406 3D voxel-wise radiomic features were extracted and evaluated in a cross-site setting to determine how reproducible they were within a relatively homogeneous non-tumor tissue region, using 2 different measures of reproducibility: Multivariate Coefficient of Variation and Instability Score. Our results demonstrated that Haralick features were most reproducible between all 4 sites. By comparison, Laws features were among the least reproducible between sites, as well as performing highly variably across their entire parameter space. Similarly, the Gabor feature family demonstrated good cross-site reproducibility, but for certain parameter combinations alone. These trends indicate that despite extensive pre-processing, only a subset of radiomic features and associated parameters may be reproducible enough for use within radiomics-based machine learning classifier schemes.
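
    One of the reproducibility measures named above, the coefficient of variation across sites, can be illustrated directly: for each radiomic feature, pool its per-site summary values and compare their spread with their mean. The sketch below does this for a synthetic feature matrix; it is a generic univariate CV, not the paper's multivariate CV or instability score, whose exact definitions are not given in the abstract.

```python
# Sketch: per-feature cross-site coefficient of variation (CV) for radiomic
# features. The feature matrix is synthetic; this is a generic CV, not the
# paper's instability score.

import numpy as np

rng = np.random.default_rng(4)
n_sites, n_features = 4, 406
# Mean feature value per site (e.g., averaged over a non-tumor reference region)
site_means = np.abs(rng.normal(10, 2, (n_sites, n_features)))
# Inject extra between-site variation in a subset of features
site_means[:, :50] *= rng.uniform(0.5, 1.5, (n_sites, 50))

cv = site_means.std(axis=0, ddof=1) / site_means.mean(axis=0)

threshold = 0.15                      # assumed tolerance for "reproducible"
n_repro = int((cv < threshold).sum())
print(f"{n_repro}/{n_features} features have cross-site CV < {threshold:.2f}")
print(f"median CV = {np.median(cv):.3f}, worst CV = {cv.max():.3f}")
```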

  8. Validation of the BUGJEFF311.BOLIB, BUGENDF70.BOLIB and BUGLE-B7 broad-group libraries on the PCA-Replica (H2O/Fe) neutron shielding benchmark experiment

    OpenAIRE

    Pescarini Massimo; Orsi Roberto; Frisoni Manuela

    2016-01-01

    The PCA-Replica 12/13 (H2O/Fe) neutron shielding benchmark experiment was analysed using the TORT-3.2 3D SN code. PCA-Replica reproduces a PWR ex-core radial geometry with alternate layers of water and steel including a pressure vessel simulator. Three broad-group coupled neutron/photon working cross section libraries in FIDO-ANISN format with the same energy group structure (47 n + 20 γ) and based on different nuclear data were alternatively used: the ENEA BUGJEFF311.BOLIB (JEFF-3.1.1) and U...

  9. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    Science.gov (United States)

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.

  10. Continuous wave power scaling in high power broad area quantum cascade lasers

    Science.gov (United States)

    Suttinger, M.; Leshin, J.; Go, R.; Figueiredo, P.; Shu, H.; Lyakh, A.

    2018-02-01

    Experimental and model results for high power broad area quantum cascade lasers are presented. Continuous wave power scaling from 1.62 W to 2.34 W has been experimentally demonstrated for 3.15 mm-long, high reflection-coated 5.6 μm quantum cascade lasers with a 15-stage active region as the active region width was increased from 10 μm to 20 μm. A semi-empirical model for broad area devices operating in continuous wave mode is presented. The model uses measured pulsed transparency current, injection efficiency, waveguide losses, and differential gain as input parameters. It also takes into account active region self-heating and sub-linearity of the pulsed power vs current laser characteristic. The model predicts that an 11% improvement in maximum CW power and increased wall plug efficiency can be achieved from 3.15 mm × 25 μm devices with 21 stages of the same design but half the doping in the active region. For a 16-stage design with a reduced stage thickness of 300 Å, pulsed roll-over current density of 6 kA/cm2, and InGaAs waveguide layers, an optical power increase of 41% is projected. Finally, the model projects that the power level can be increased to 4.5 W from 3.15 mm × 31 μm devices with the baseline configuration with T0 increased from 140 K for the present design to 250 K.
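
    The flavor of such a semi-empirical continuous-wave model can be conveyed with a much smaller self-consistent loop: evaluate a temperature-dependent pulsed P-I characteristic (threshold rising as exp(ΔT/T0), slope efficiency degrading), compute the dissipated power and the resulting active-region temperature rise through a thermal resistance, and iterate to convergence. The sketch below is only that caricature, with invented parameter values; it is not the authors' model.

```python
# Caricature of a CW power model for a broad-area QCL: iterate active-region
# temperature and a temperature-dependent P-I curve until self-consistent.
# All parameter values are invented for illustration (not the paper's).

import numpy as np

V = 12.0          # operating voltage [V]
eta = 1.2         # slope efficiency at heat-sink temperature [W/A]
I_th0 = 0.5       # threshold current at heat-sink temperature [A]
T0 = 140.0        # characteristic temperature for threshold [K]
T1 = 300.0        # characteristic temperature for slope efficiency [K]
R_th = 6.0        # active-region thermal resistance [K/W]

def cw_power(I, n_iter=200):
    """Optical power at drive current I, with self-heating included."""
    dT = 0.0
    for _ in range(n_iter):                       # fixed-point iteration
        I_th = I_th0 * np.exp(dT / T0)            # threshold rises with heating
        slope = eta * np.exp(-dT / T1)            # slope efficiency degrades
        P_opt = max(slope * (I - I_th), 0.0)
        P_diss = V * I - P_opt                    # electrical power not emitted
        dT = R_th * P_diss                        # active-region temperature rise
    return P_opt

currents = np.linspace(0.5, 3.0, 26)
powers = [cw_power(I) for I in currents]
best_power, best_current = max(zip(powers, currents))
print(f"peak CW power ~ {best_power:.2f} W at {best_current:.2f} A (illustrative numbers)")
```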

  11. Serous tubal intraepithelial carcinoma: diagnostic reproducibility and its implications.

    Science.gov (United States)

    Carlson, Joseph W; Jarboe, Elke A; Kindelberger, David; Nucci, Marisa R; Hirsch, Michelle S; Crum, Christopher P

    2010-07-01

    Serous tubal intraepithelial carcinoma (STIC) is detected in between 5% and 7% of women undergoing risk-reduction salpingo-oophorectomy for mutations in the BRCA1 or 2 genes (BRCA+), and seems to play a role in the pathogenesis of many ovarian and "primary peritoneal" serous carcinomas. The recognition of STIC is germane to the management of BRCA+ women; however, the diagnostic reproducibility of STIC is unknown. Twenty-one cases were selected and classified as STIC or benign, using both hematoxylin and eosin and immunohistochemical stains for p53 and MIB-1. Digital images of 30 hematoxylin and eosin-stained STICs (n=14) or benign tubal epithelium (n=16) were photographed and randomized for blind digital review in a PowerPoint format by 6 experienced gynecologic pathologists and 6 pathology trainees. A generalized kappa statistic for multiple raters was calculated for all groups. For all reviewers, the kappa was 0.333, indicating poor reproducibility; kappa was 0.453 for the experienced gynecologic pathologists (fair-to-good reproducibility), and kappa = 0.253 for the pathology residents (poor reproducibility). In the experienced group, 3 of 14 STICs were diagnosed by all 6 reviewers, and 9 of 14 by a majority of the reviewers. These results show that interobserver concordance in the recognition of STIC in high-quality digital images is at best fair-to-good for even experienced gynecologic pathologists, and a proportion cannot be consistently identified even among experienced observers. In view of these findings, a diagnosis of STIC should be corroborated by a second pathologist, if feasible.
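
    The generalized kappa for multiple raters used above (Fleiss' kappa) can be computed directly from a table of how many raters assigned each category to each case. A minimal implementation is sketched below on made-up ratings (12 raters, 30 images, STIC versus benign); for real analyses a vetted statistics package would normally be preferred over this hand-rolled version.

```python
# Minimal Fleiss' kappa for multiple raters, illustrated on made-up ratings
# (e.g., 12 raters classifying 30 images as STIC vs. benign).

import numpy as np

def fleiss_kappa(counts):
    """counts: (n_subjects, n_categories) array; counts[i, j] = number of raters
    who assigned subject i to category j. All rows must sum to the rater count."""
    counts = np.asarray(counts, dtype=float)
    n_sub, _ = counts.shape
    n_rat = counts[0].sum()
    p_j = counts.sum(axis=0) / (n_sub * n_rat)             # category proportions
    P_i = np.sum(counts * (counts - 1), axis=1) / (n_rat * (n_rat - 1))
    P_bar = P_i.mean()                                      # observed agreement
    P_e = np.sum(p_j ** 2)                                  # chance agreement
    return (P_bar - P_e) / (1 - P_e)

rng = np.random.default_rng(5)
n_subjects, n_raters = 30, 12
# Synthetic ratings: two categories (0 = benign, 1 = STIC), imperfect agreement
truth = rng.integers(0, 2, n_subjects)
p_stic = np.where(truth[:, None] == 1, 0.8, 0.2)
ratings = (rng.random((n_subjects, n_raters)) < p_stic).astype(int)
counts = np.stack([(ratings == 0).sum(axis=1), (ratings == 1).sum(axis=1)], axis=1)

print(f"Fleiss' kappa = {fleiss_kappa(counts):.3f}")
```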

  12. LHC Orbit Correction Reproducibility and Related Machine Protection

    CERN Document Server

    Baer, T; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure an adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the machine in a vulnerable state, a novel software-based interlock system for orbit corrector currents was developed. In this paper, the principle of the new interlock system is described and the reproducibility of the LHC orbit correction is discussed against the background of this system.

  13. Understanding reproducibility of human IVF traits to predict next IVF cycle outcome.

    Science.gov (United States)

    Wu, Bin; Shi, Juanzi; Zhao, Wanqiu; Lu, Suzhen; Silva, Marta; Gelety, Timothy J

    2014-10-01

    Evaluating the failed IVF cycle often provides useful prognostic information. Before undergoing another attempt, patients experiencing an unsuccessful IVF cycle frequently request information about the probability of future success. Here, we introduced the concept of reproducibility and formulae to predict the next IVF cycle outcome. The experimental design was based on the retrospective review of IVF cycle data from 2006 to 2013 in two different IVF centers and on statistical analysis. The reproducibility coefficients (r) of IVF traits, including number of oocytes retrieved, oocyte maturity, fertilization, embryo quality, and pregnancy, were estimated using the intraclass correlation coefficient between the repeated IVF cycle measurements for the same patient by variance component analysis. The formulae were designed to predict the next IVF cycle outcome. The number of oocytes retrieved from patients and their fertilization rate had the highest reproducibility coefficients (r = 0.81 ~ 0.84), which indicated a very close correlation between the first retrieval cycle and subsequent IVF cycles. Oocyte maturity and number of top quality embryos had middle level reproducibility (r = 0.38 ~ 0.76) and pregnancy rate had a relatively lower reproducibility (r = 0.23 ~ 0.27). Based on these parameters, the next outcome for these IVF traits might be accurately predicted by the designed formulae. The introduction of the concept of reproducibility to our human IVF program allows us to predict future IVF cycle outcomes. The traits of oocyte numbers retrieved, oocyte maturity, fertilization, and top quality embryos had higher or middle reproducibility, which provides a basis for accurate prediction of future IVF outcomes. Based on this prediction, physicians may counsel their patients or change patients' stimulation plans, and laboratory embryologists may improve their IVF techniques accordingly.
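
    The reproducibility coefficient described above is, in effect, an intraclass correlation: the share of total variance attributable to between-patient differences, estimated from repeated cycles. The sketch below computes a one-way ANOVA estimate of that coefficient on synthetic data and then uses it in a simple shrinkage-style prediction of the next cycle's value; both the estimator and the prediction form are illustrative assumptions, not the paper's exact formulae.

```python
# Sketch: one-way random-effects ICC from repeated IVF cycles per patient, and a
# shrinkage prediction of a patient's next-cycle value. The estimator and the
# prediction form are illustrative; they are not the paper's exact formulae.

import numpy as np

rng = np.random.default_rng(6)
n_patients, n_cycles = 200, 2
patient_mean = rng.normal(10, 4, n_patients)                    # e.g., oocytes retrieved
data = patient_mean[:, None] + rng.normal(0, 2, (n_patients, n_cycles))

# One-way ANOVA variance components
grand = data.mean()
ms_between = n_cycles * np.sum((data.mean(axis=1) - grand) ** 2) / (n_patients - 1)
ms_within = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n_patients * (n_cycles - 1))
icc = (ms_between - ms_within) / (ms_between + (n_cycles - 1) * ms_within)
print(f"ICC (reproducibility coefficient) = {icc:.2f}")

# Shrinkage-style prediction of the next cycle from one observed cycle:
# predicted = population mean + ICC * (observed - population mean)
observed_first_cycle = data[:, 0]
predicted_second = grand + icc * (observed_first_cycle - grand)
rmse = np.sqrt(np.mean((predicted_second - data[:, 1]) ** 2))
print(f"RMSE of predicted vs. actual second cycle = {rmse:.2f} oocytes")
```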

  14. On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry

    Science.gov (United States)

    Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri

    2018-02-01

    Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducible cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Thus our results demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.

  15. A reproducible accelerated in vitro release testing method for PLGA microspheres.

    Science.gov (United States)

    Shen, Jie; Lee, Kyulim; Choi, Stephanie; Qu, Wen; Wang, Yan; Burgess, Diane J

    2016-02-10

    The objective of the present study was to develop a discriminatory and reproducible accelerated in vitro release method for long-acting PLGA microspheres with inner structure/porosity differences. Risperidone was chosen as a model drug. Qualitatively and quantitatively equivalent PLGA microspheres with different inner structure/porosity were obtained using different manufacturing processes. Physicochemical properties as well as degradation profiles of the prepared microspheres were investigated. Furthermore, in vitro release testing of the prepared risperidone microspheres was performed using the most common in vitro release methods (i.e., sample-and-separate and flow through) for this type of product. The obtained compositionally equivalent risperidone microspheres had similar drug loading but different inner structure/porosity. When microsphere particle size appeared similar, porous risperidone microspheres showed faster microsphere degradation and drug release compared with less porous microspheres. Both in vitro release methods investigated were able to differentiate risperidone microsphere formulations with differences in porosity under real-time (37 °C) and accelerated (45 °C) testing conditions. Notably, only the accelerated USP apparatus 4 method showed good reproducibility for highly porous risperidone microspheres. These results indicated that the accelerated USP apparatus 4 method is an appropriate fast quality control tool for long-acting PLGA microspheres (even with porous structures). Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Reproducibility of airway luminal size in asthma measured by HRCT.

    Science.gov (United States)

    Brown, Robert H; Henderson, Robert J; Sugar, Elizabeth A; Holbrook, Janet T; Wise, Robert A

    2017-10-01

    Brown RH, Henderson RJ, Sugar EA, Holbrook JT, Wise RA, on behalf of the American Lung Association Airways Clinical Research Centers. Reproducibility of airway luminal size in asthma measured by HRCT. J Appl Physiol 123: 876-883, 2017. First published July 13, 2017; doi:10.1152/japplphysiol.00307.2017. High-resolution CT (HRCT) is a well-established imaging technology used to measure lung and airway morphology in vivo. However, there is a surprising lack of studies examining HRCT reproducibility. The CPAP Trial was a multicenter, randomized, three-parallel-arm, sham-controlled 12-wk clinical trial to assess the use of a nocturnal continuous positive airway pressure (CPAP) device on airway reactivity to methacholine. The lack of a treatment effect of CPAP on clinical or HRCT measures provided an opportunity for the current analysis. We assessed the reproducibility of HRCT imaging over 12 wk. Intraclass correlation coefficients (ICCs) were calculated for individual airway segments, individual lung lobes, both lungs, and air trapping. The ICC [95% confidence interval (CI)] for airway luminal size at total lung capacity ranged from 0.95 (0.91, 0.97) to 0.47 (0.27, 0.69). The ICC (95% CI) for airway luminal size at functional residual capacity ranged from 0.91 (0.85, 0.95) to 0.32 (0.11, 0.65). The ICC measurements for airway distensibility index and wall thickness were lower, ranging from poor (0.08) to moderate (0.63) agreement. The ICC for air trapping at functional residual capacity was 0.89 (0.81, 0.94) and varied only modestly by lobe from 0.76 (0.61, 0.87) to 0.95 (0.92, 0.97). In stable well-controlled asthmatic subjects, it is possible to reproducibly image unstimulated airway luminal areas over time, by region, and by size at total lung capacity throughout the lungs. Therefore, any changes in luminal size on repeat CT imaging are more likely due to changes in disease state and less likely due to normal variability. NEW & NOTEWORTHY There is a surprising lack

  17. The nebular spectra of the transitional Type Ia Supernovae 2007on and 2011iv: broad, multiple components indicate aspherical explosion cores

    Science.gov (United States)

    Mazzali, P. A.; Ashall, C.; Pian, E.; Stritzinger, M. D.; Gall, C.; Phillips, M. M.; Höflich, P.; Hsiao, E.

    2018-05-01

    The nebular-epoch spectrum of the rapidly declining, `transitional' Type Ia supernova (SN) 2007on showed double emission peaks, which have been interpreted as indicating that the SN was the result of the direct collision of two white dwarfs. The spectrum can be reproduced using two distinct emission components, one redshifted and one blueshifted. These components are similar in mass but have slightly different degrees of ionization. They recede from one another at a line-of-sight speed larger than the sum of the combined expansion velocities of their emitting cores, thereby acting as two independent nebulae. While this configuration appears to be consistent with the scenario of two white dwarfs colliding, it may also indicate an off-centre delayed detonation explosion of a near-Chandrasekhar-mass white dwarf. In either case, broad emission line widths and a rapidly evolving light curve can be expected for the bolometric luminosity of the SN. This is the case for both SNe 2007on and 2011iv, also a transitional SN Ia that exploded in the same elliptical galaxy, NGC 1404. Although SN 2011iv does not show double-peaked emission line profiles, the width of its emission lines is such that a two-component model yields somewhat better results than a single-component model. Most of the mass ejected is in one component, however, which suggests that SN 2011iv was the result of the off-centre ignition of a Chandrasekhar-mass white dwarf.

  18. Reproducing Electric Field Observations during Magnetic Storms by means of Rigorous 3-D Modelling and Distortion Matrix Co-estimation

    Science.gov (United States)

    Püthe, Christoph; Manoj, Chandrasekharan; Kuvshinov, Alexey

    2015-04-01

    Electric fields induced in the conducting Earth during magnetic storms drive currents in power transmission grids, telecommunication lines or buried pipelines. These geomagnetically induced currents (GIC) can cause severe service disruptions. The prediction of GIC is thus of great importance for public and industry. A key step in the prediction of the hazard to technological systems during magnetic storms is the calculation of the geoelectric field. To address this issue for mid-latitude regions, we developed a method that involves 3-D modelling of induction processes in a heterogeneous Earth and the construction of a model of the magnetospheric source. The latter is described by low-degree spherical harmonics; its temporal evolution is derived from observatory magnetic data. Time series of the electric field can be computed for every location on Earth's surface. The actual electric field however is known to be perturbed by galvanic effects, arising from very local near-surface heterogeneities or topography, which cannot be included in the conductivity model. Galvanic effects are commonly accounted for with a real-valued time-independent distortion matrix, which linearly relates measured and computed electric fields. Using data of various magnetic storms that occurred between 2000 and 2003, we estimated distortion matrices for observatory sites onshore and on the ocean bottom. Strong correlations between modellings and measurements validate our method. The distortion matrix estimates prove to be reliable, as they are accurately reproduced for different magnetic storms. We further show that 3-D modelling is crucial for a correct separation of galvanic and inductive effects and a precise prediction of electric field time series during magnetic storms. Since the required computational resources are negligible, our approach is suitable for a real-time prediction of GIC. For this purpose, a reliable forecast of the source field, e.g. based on data from satellites
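
    Because the distortion matrix is real-valued, time-independent, and linearly relates the modelled horizontal electric field to the measured one, it can be estimated by ordinary least squares over a storm-time window. The sketch below shows that estimation step on synthetic field time series; it is not the authors' code and ignores details such as error weighting across storms.

```python
# Sketch: least-squares estimate of a real, time-independent 2x2 distortion
# matrix D relating measured and modelled horizontal electric fields,
#   E_measured(t) ~= D @ E_modelled(t).
# Synthetic data; not the authors' code.

import numpy as np

rng = np.random.default_rng(7)
n_times = 5000
E_model = rng.normal(0, 10, (2, n_times))           # modelled Ex, Ey [mV/km]

D_true = np.array([[1.4, -0.3],
                   [0.2,  0.7]])                    # "galvanic" distortion
E_meas = D_true @ E_model + rng.normal(0, 1, (2, n_times))

# Solve E_meas.T ~= E_model.T @ D.T in the least-squares sense
D_hat_T, *_ = np.linalg.lstsq(E_model.T, E_meas.T, rcond=None)
D_hat = D_hat_T.T
print("estimated distortion matrix:\n", np.round(D_hat, 3))
```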

  19. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques.

    Directory of Open Access Journals (Sweden)

    Emi Kamimura

    Full Text Available The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility as evaluated by average discrepancies of corresponding 3D data was compared between the two techniques (Wilcoxon signed-rank test). The visual inspection of superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than observed with silicone impressions. Confirmation was forthcoming from statistical analysis revealing significantly smaller average inter-operator reproducibility using a digital impression technique (0.014 ± 0.02 mm) than when using a conventional impression technique (0.023 ± 0.01 mm). The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator.

  20. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques

    Science.gov (United States)

    Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi

    2017-01-01

    Purpose The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Materials and methods Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility as evaluated by average discrepancies of corresponding 3D data was compared between the two techniques (Wilcoxon signed-rank test). Results The visual inspection of superimposed datasets revealed that discrepancies between repeated digital impression were smaller than observed with silicone impression. Confirmation was forthcoming from statistical analysis revealing significantly smaller average inter-operator reproducibility using a digital impression technique (0.014± 0.02 mm) than when using a conventional impression technique (0.023 ± 0.01 mm). Conclusion The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
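
    The "best-fit algorithm (least-squares method)" used to superimpose the two operators' scans is essentially a rigid registration that minimizes the sum of squared distances between corresponding points; with known correspondences this is the classic Kabsch/Procrustes solution. The sketch below shows that step and the resulting mean discrepancy on synthetic point clouds; it is not the PolyWorks implementation, which must also establish correspondences between the scanned surfaces.

```python
# Sketch: least-squares (Kabsch) rigid alignment of two corresponding point
# clouds and the mean residual discrepancy, as a stand-in for the "best-fit
# algorithm" used to compare repeated impressions. Not the PolyWorks pipeline.

import numpy as np

rng = np.random.default_rng(8)
n_pts = 2000
scan_a = rng.normal(0, 5, (n_pts, 3))                    # reference scan [mm]

# Second scan: same surface, arbitrarily rotated/translated, plus ~20 um noise
angle = np.deg2rad(7.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0,              0,             1]])
scan_b = scan_a @ R_true.T + np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.02, (n_pts, 3))

def kabsch_align(P, Q):
    """Rigid transform (R, t) minimizing ||R @ P_i + t - Q_i||^2 for paired points."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # avoid reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

R, t = kabsch_align(scan_b, scan_a)                      # map scan_b onto scan_a
aligned_b = scan_b @ R.T + t
discrepancy = np.linalg.norm(aligned_b - scan_a, axis=1)
print(f"mean discrepancy after best-fit alignment = {discrepancy.mean():.3f} mm")
```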

  1. Validity and reproducibility of a Spanish dietary history.

    Directory of Open Access Journals (Sweden)

    Pilar Guallar-Castillón

    Full Text Available To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E made one year apart. The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62), and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients.

  2. Reproducibility of the serum lipid response to coffee oil in healthy volunteers

    Directory of Open Access Journals (Sweden)

    Katan Martijn B

    2003-10-01

    Full Text Available Abstract Background Humans and animals show a certain consistency in the response of their serum lipids to fat-modified diets. This may indicate a genetic basis underlying this response. Coffee oil might be used as a model substance to investigate which genes determine differences in the serum lipid response. Before carrying out such studies our objective was to investigate to what extent the effect of coffee oil on serum lipid concentrations is reproducible within subjects. Methods The serum lipid response of 32 healthy volunteers was measured twice in separate five-week periods in which coffee oil was administered (69 mg cafestol/day). Results Total cholesterol levels increased by 24% in period 1 (range: 0;52%) and 18% in period 2 (1;48%), LDL cholesterol by 29% (-9;71%) and 20% (-12;57%), triglycerides by 66% (16;175%) and 58% (-13;202%), and HDL cholesterol did not change significantly: the range of the HDL response was -19;25% in period 1 and -20;33% in period 2. The correlation between the two responses was 0.20 (95% CI -0.16, 0.51) for total cholesterol, 0.16 (95% CI -0.20, 0.48) for LDL, 0.67 (95% CI 0.42, 0.83) for HDL, and 0.77 (95% CI 0.56, 0.88) for triglycerides. Conclusions The responses of total and LDL cholesterol to coffee oil were poorly reproducible within subjects. The responses of HDL and triglycerides, however, appeared to be highly reproducible. Therefore, investigating the genetic sources of the variation in the serum-lipid response to coffee oil is more promising for HDL and triglycerides.

  3. Reproducibility and comparative validity of a food frequency questionnaire for Australian adults.

    Science.gov (United States)

    Collins, Clare E; Boggess, May M; Watson, Jane F; Guest, Maya; Duncanson, Kerith; Pezdirc, Kristine; Rollo, Megan; Hutchesson, Melinda J; Burrows, Tracy L

    2014-10-01

    Food frequency questionnaires (FFQ) are used in epidemiological studies to investigate the relationship between diet and disease. There is a need for a valid and reliable adult FFQ with a contemporary food list in Australia. To evaluate the reproducibility and comparative validity of the Australian Eating Survey (AES) FFQ in adults compared to weighed food records (WFRs). Two rounds of AES and three-day WFRs were conducted in 97 adults (31 males; median age and BMI: 44.9 years and 26.2 kg/m2 for males, 41.3 years and 24.0 kg/m2 for females). Reproducibility was assessed over six months using Wilcoxon signed-rank tests, and comparative validity was assessed by intraclass correlation coefficients (ICC) estimated by fitting a mixed effects model for each nutrient to account for age, sex, and BMI to allow estimation of between- and within-person variance. Reproducibility was found to be good for both WFR and FFQ since there were no significant differences between round 1 and 2 administrations. For comparative validity, FFQ ICCs were at least as large as those for WFR. The ICC of the WFR-FFQ difference for total energy intake was 0.6 (95% CI 0.43, 0.77) and the median ICC for all nutrients was 0.47, with all ICCs between 0.15 (%E from saturated fat) and 0.7 (g/day sugars). Compared to WFR the AES FFQ is suitable for reliably estimating the dietary intakes of Australian adults across a wide range of nutrients. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  4. Reproducible positioning in chest X-ray radiography

    International Nuclear Information System (INIS)

    1974-01-01

    A device is described that can be used to ensure reproducibility in the positioning of the patient during X-ray radiography of the thorax. Signals are taken from an electrocardiographic monitor and from a device recording the respiratory cycle. Radiography is performed only when two preselected signals coincide.

  5. A catalyzing phantom for reproducible dynamic conversion of hyperpolarized [1-¹³C]-pyruvate.

    Science.gov (United States)

    Walker, Christopher M; Lee, Jaehyuk; Ramirez, Marc S; Schellingerhout, Dawid; Millward, Steven; Bankson, James A

    2013-01-01

    In vivo real time spectroscopic imaging of hyperpolarized ¹³C labeled metabolites shows substantial promise for the assessment of physiological processes that were previously inaccessible. However, reliable and reproducible methods of measurement are necessary to maximize the effectiveness of imaging biomarkers that may one day guide personalized care for diseases such as cancer. Animal models of human disease serve as poor reference standards due to the complexity, heterogeneity, and transient nature of advancing disease. In this study, we describe the reproducible conversion of hyperpolarized [1-¹³C]-pyruvate to [1-¹³C]-lactate using a novel synthetic enzyme phantom system. The rate of reaction can be controlled and tuned to mimic normal or pathologic conditions of varying degree. Variations observed in the use of this phantom compare favorably against within-group variations observed in recent animal studies. This novel phantom system provides crucial capabilities as a reference standard for the optimization, comparison, and certification of quantitative imaging strategies for hyperpolarized tracers.

  6. A catalyzing phantom for reproducible dynamic conversion of hyperpolarized [1-¹³C]-pyruvate.

    Directory of Open Access Journals (Sweden)

    Christopher M Walker

    Full Text Available In vivo real time spectroscopic imaging of hyperpolarized ¹³C labeled metabolites shows substantial promise for the assessment of physiological processes that were previously inaccessible. However, reliable and reproducible methods of measurement are necessary to maximize the effectiveness of imaging biomarkers that may one day guide personalized care for diseases such as cancer. Animal models of human disease serve as poor reference standards due to the complexity, heterogeneity, and transient nature of advancing disease. In this study, we describe the reproducible conversion of hyperpolarized [1-¹³C]-pyruvate to [1-¹³C]-lactate using a novel synthetic enzyme phantom system. The rate of reaction can be controlled and tuned to mimic normal or pathologic conditions of varying degree. Variations observed in the use of this phantom compare favorably against within-group variations observed in recent animal studies. This novel phantom system provides crucial capabilities as a reference standard for the optimization, comparison, and certification of quantitative imaging strategies for hyperpolarized tracers.

  7. Reproducibility of Manual Platelet Estimation Following Automated Low Platelet Counts

    Directory of Open Access Journals (Sweden)

    Zainab S Al-Hosni

    2016-11-01

    Full Text Available Objectives: Manual platelet estimation is one of the methods used when automated platelet estimates are very low. However, the reproducibility of manual platelet estimation has not been adequately studied. We sought to assess the reproducibility of manual platelet estimation following automated low platelet counts and to evaluate the impact of the level of experience of the person counting on the reproducibility of manual platelet estimates. Methods: In this cross-sectional study, peripheral blood films of patients with platelet counts less than 100 × 10⁹/L were retrieved and given to four raters to perform manual platelet estimation independently using a predefined method (average of platelet counts in 10 fields using a 100× objective, multiplied by 20). Data were analyzed using the intraclass correlation coefficient (ICC) as a method of reproducibility assessment. Results: The ICC across the four raters was 0.840, indicating excellent agreement. The median difference between the two most experienced raters was 0 (range: -64 to 78). The level of platelet estimate by the least-experienced rater predicted the disagreement (p = 0.037). When assessing the difference between pairs of raters, there was no significant difference in the ICC (p = 0.420). Conclusions: The agreement between different raters using manual platelet estimation was excellent. Further confirmation is necessary, with a prospective study using a gold standard method of platelet counts.
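
    The predefined counting rule quoted above (the average platelet count over 10 fields at the 100× objective, multiplied by 20) translates directly into a small helper; the function and the example counts below are illustrative only, not part of the study.

    ```python
    # Sketch of the predefined manual estimation rule: average platelets per
    # 100x oil-immersion field across 10 fields, multiplied by 20, giving an
    # estimate in units of 10^9/L. Example counts are made up.
    def manual_platelet_estimate(field_counts):
        if len(field_counts) != 10:
            raise ValueError("expected counts from exactly 10 fields")
        return (sum(field_counts) / len(field_counts)) * 20

    # Counts of 3-5 platelets per field give roughly 80 x 10^9/L:
    print(manual_platelet_estimate([3, 4, 5, 4, 3, 4, 5, 4, 4, 4]))
    ```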

  8. Reproducibility in the assessment of acute pancreatitis with computed tomography

    International Nuclear Information System (INIS)

    Freire Filho, Edison de Oliveira; Vieira, Renata La Rocca; Yamada, Andre Fukunishi; Shigueoka, David Carlos; Bekhor, Daniel; Freire, Maxime Figueiredo de Oliveira; Ajzen, Sergio; D'Ippolito, Giuseppe

    2007-01-01

    Objective: To evaluate the reproducibility of unenhanced and contrast-enhanced computed tomography in the assessment of patients with acute pancreatitis. Materials and methods: Fifty-one unenhanced and contrast-enhanced abdominal computed tomography studies of patients with acute pancreatitis were blindly reviewed by two radiologists (observers 1 and 2). The morphological index was calculated separately for unenhanced and contrast-enhanced computed tomography, and the disease severity index was established. Intraobserver and interobserver reproducibility of computed tomography was measured by means of the kappa index (κ). Results: Interobserver agreement was κ = 0.666, 0.705, 0.648, 0.547 and 0.631, respectively, for unenhanced and contrast-enhanced morphological index, presence of pancreatic necrosis, pancreatic necrosis extension, and disease severity index. Intraobserver agreement (observers 1 and 2, respectively) was κ = 0.796 and 0.732 for unenhanced morphological index; κ = 0.725 and 0.802 for contrast-enhanced morphological index; κ = 0.674 and 0.849 for presence of pancreatic necrosis; κ = 0.606 and 0.770 for pancreatic necrosis extension; and κ = 0.801 and 0.687 for disease severity index at computed tomography. Conclusion: Computed tomography for determination of the morphological index and disease severity index in the staging of acute pancreatitis is a fairly reproducible method. The absence of contrast enhancement does not affect the reproducibility of the computed tomography morphological index. (author)
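
    The kappa index used above expresses chance-corrected agreement between two readings; for two observers it can be computed with standard tooling, as in the sketch below, which uses scikit-learn and made-up ratings purely for illustration.

    ```python
    # Minimal sketch: Cohen's kappa for inter-observer agreement on a
    # categorical CT score. The ratings are hypothetical.
    from sklearn.metrics import cohen_kappa_score

    observer1 = [2, 4, 4, 6, 2, 8, 4, 6, 2, 4]  # e.g. severity index per study
    observer2 = [2, 4, 6, 6, 2, 8, 4, 4, 2, 4]

    kappa = cohen_kappa_score(observer1, observer2)
    print(f"kappa = {kappa:.3f}")  # ~0.6-0.8 indicates substantial agreement
    ```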

  9. Broad-range PCR: past, present, or future of bacteriology?

    Science.gov (United States)

    Renvoisé, A; Brossier, F; Sougakoff, W; Jarlier, V; Aubry, A

    2013-08-01

    PCR targeting the gene encoding 16S ribosomal RNA (commonly named broad-range PCR or 16S PCR) has been used for 20 years as a polyvalent tool to study prokaryotes. Broad-range PCR was first used as a taxonomic tool, then in clinical microbiology. We describe the use of broad-range PCR in clinical microbiology. The first application was the identification of bacterial strains obtained by culture but whose phenotypic or proteomic identification remained difficult or impossible. This changed bacterial taxonomy and allowed the discovery of many new species. The second application of broad-range PCR in clinical microbiology is the detection of bacterial DNA from clinical samples; we review the clinical settings in which the technique proved useful (such as endocarditis) and those in which it did not (such as characterization of bacteria in ascites in cirrhotic patients). This technique allowed identification of the etiological agents of several diseases, such as Whipple disease. This review is a synthesis of data concerning the applications, assets, and drawbacks of broad-range PCR in clinical microbiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  10. Inter-examiner reproducibility of tests for lumbar motor control

    Directory of Open Access Journals (Sweden)

    Elkjaer Arne

    2011-05-01

    Full Text Available Abstract Background Many studies show a relation between reduced lumbar motor control (LMC) and low back pain (LBP). However, test circumstances vary, and during test performance subjects may change position. In other words, the reliability - i.e. reproducibility and validity - of tests for LMC should be based on quantitative data. This has not been considered before. The aim was to analyse the reproducibility of five different quantitative tests for LMC commonly used in daily clinical practice. Methods The five tests for LMC were: repositioning (RPS), sitting forward lean (SFL), sitting knee extension (SKE), and bent knee fall out (BKFO), all measured in cm, and leg lowering (LL), measured in mm Hg. A total of 40 subjects (14 males, 26 females), 25 with and 15 without LBP, with a mean age of 46.5 years (SD 14.8), were examined independently and in random order by two examiners on the same day. LBP subjects were recruited from three physiotherapy clinics with a connection to the clinic's gym or back-school. Non-LBP subjects were recruited from the clinic staff's acquaintances, and from patients without LBP. Results The means and standard deviations for each of the tests were 0.36 (0.27) cm for RPS, 1.01 (0.62) cm for SFL, 0.40 (0.29) cm for SKE, 1.07 (0.52) cm for BKFO, and 32.9 (7.1) mm Hg for LL. All five tests for LMC were reproducible, with the following ICCs: 0.90 for RPS, 0.96 for SFL, 0.96 for SKE, 0.94 for BKFO, and 0.98 for LL. Bland and Altman plots showed that most of the differences between examiners A and B were less than 0.20 cm. Conclusion These five tests for LMC displayed excellent reproducibility. However, the diagnostic accuracy of these tests needs to be addressed in larger cohorts of subjects, establishing values for the normal population. Also, cut-points between subjects with and without LBP must be determined, taking into account age, level of activity, degree of impairment and participation in sports. Whether reproducibility of these

  11. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    Science.gov (United States)

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Measurement of Trabecular Bone Parameters in Porcine Vertebral Bodies Using Multidetector CT: Evaluation of Reproducibility of 3-Dimensional CT Histomorphometry

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Hwan; Goo, Jin Mo [Dept. of Radiology, Seoul National University Hospital, Seoul National University College of Medicine, Seoul (Korea, Republic of)]; Moon, Kyung Chul [Dept. of Pathology, Seoul National University Hospital, Seoul National University College of Medicine, Seoul (Korea, Republic of)]; An, Sang Bu [Dept. of Radiology, National Cancer Center, Goyang (Korea, Republic of)]; Kim, Kwang Gi [Dept. of Biomedical Engineering, Division of Basic and Applied Sciences, National Cancer Center, Goyang (Korea, Republic of)]

    2011-05-15

    To evaluate the reproducibility of 3-dimensional histomorphometry for the microarchitecture analysis of trabecular bone parameters using multidetector computed tomography (MDCT). Thirty-six specimens from porcine vertebral bodies were imaged five times with a 64-detector-row MDCT system using the same scan protocols. Locations of the specimens were nearly identical across the scans. Three-dimensional structural parameters of trabecular bone were derived from the five data sets using image analysis software. The features measured by the analysis programs were trabecular bone volume, trabecular bone volume/tissue volume, trabecular thickness, trabecular separation, trabecular number, trabecular bone pattern factor, and structural model index. The structural trabecular parameters showed excellent reproducibility through repeated scanning. Intraclass correlation coefficients of all seven structural parameters were in the range of 0.998 to 1.000. Coefficients of variation of the six structural parameters, excluding structural model index, did not exceed 1.6%. The measurement of trabecular structural parameters using multidetector CT and a three-dimensional histomorphometry analysis program was validated and showed excellent reproducibility. This method could be used as a noninvasive and easily available test in a clinical setting.

  13. The general theory of the Quasi-reproducible experiments: How to describe the measured data of complex systems?

    Science.gov (United States)

    Nigmatullin, Raoul R.; Maione, Guido; Lino, Paolo; Saponaro, Fabrizio; Zhang, Wei

    2017-01-01

    In this paper, we suggest a general theory that makes it possible to describe experiments associated with reproducible or quasi-reproducible data reflecting the dynamical and self-similar properties of a wide class of complex systems. By a complex system we mean a system for which a model based on microscopic principles and suppositions about the nature of the matter is absent. Such a microscopic model is usually determined as "the best fit" model. The behavior of the complex system relative to a control variable (time, frequency, wavelength, etc.) can be described in terms of the so-called intermediate model (IM). One can prove that the fitting parameters of the IM are associated with the amplitude-frequency response of the segment of the Prony series. The segment of the Prony series, including the set of decomposition coefficients and the set of exponential functions (with k = 1,2,…,K), is limited by the final mode K. The exponential functions of this decomposition depend on time and are found by the original algorithm described in the paper. This approach serves as a logical continuation of the results obtained earlier in the paper [Nigmatullin RR, W. Zhang and Striccoli D. General theory of experiment containing reproducible data: The reduction to an ideal experiment. Commun Nonlinear Sci Numer Simul, 27, (2015), pp 175-192] for reproducible experiments, and includes the previous results as a partial case. In this paper, we consider a more complex case, when the available data form short samplings or exhibit some instability during the process of measurements. We give justified evidence and conditions proving the validity of this theory for the description of a wide class of complex systems in terms of the reduced set of fitting parameters belonging to the segment of the Prony series. The elimination of uncontrollable factors expressed in the form of the apparatus function is discussed. To illustrate how to apply the theory and take advantage of its
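
    For a fixed set of exponents, fitting the segment of the Prony series described above reduces to a linear least-squares problem for the decomposition coefficients. The sketch below illustrates only that reduced step with synthetic data; it is not the authors' algorithm, which also determines the exponential functions themselves.

    ```python
    # Illustrative sketch: fit the amplitudes of a K-mode Prony segment
    # y(t) ~ sum_k a_k * exp(lambda_k * t) by linear least squares, assuming
    # the exponents lambda_k are already known (synthetic data only).
    import numpy as np

    def fit_prony_amplitudes(t, y, lambdas):
        A = np.exp(np.outer(t, lambdas))   # one exponential column per mode
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coeffs

    t = np.linspace(0.0, 5.0, 200)
    true_signal = 2.0 * np.exp(-0.7 * t) + 0.5 * np.exp(-2.5 * t)
    y = true_signal + 0.01 * np.random.default_rng(0).normal(size=t.size)
    print(fit_prony_amplitudes(t, y, lambdas=np.array([-0.7, -2.5])))
    ```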

  14. Reproducibility of prompts in computer-aided detection (CAD) of breast cancer

    International Nuclear Information System (INIS)

    Taylor, C.G.; Champness, J.; Reddy, M.; Taylor, P.; Potts, H.W.W.; Given-Wilson, R.

    2003-01-01

    AIM: We evaluated the reproducibility of prompts using the R2 ImageChecker M2000 computer-aided detection (CAD) system. MATERIALS AND METHODS: Forty selected two-view mammograms of women with breast cancer were digitized and analysed using the ImageChecker on 10 separate occasions. The mammograms were chosen to provide both straightforward and subtle signs of malignancy. Data analysed included mammographic abnormality, pathology, and whether the cancer was prompted or given an emphasized prompt. RESULTS: Correct prompts were generated on 86 of 100 occasions for screen-detected cancers. Reproducibility was lower in the other categories of more subtle cancers: 21% for cancers previously missed by CAD, a group that contained more grade 1 and small (<10 mm) tumours. Prompts for calcifications were more reproducible than those for masses (76% versus 53%), and these cancers were more likely to have an emphasized prompt. CONCLUSIONS: Probably the most important cause of variability in prompts is shifts in film position between sequential digitizations. Consequently, subtle lesions that are only just above the threshold for display may not be prompted on repeat scanning. However, users of CAD should be aware that even emphasized prompts are not consistently reproducible

  15. Reproducing {sup 137}Cs vertical migration in Spanish soils - Reproducing {sup 137}Cs and {sup 90}Sr vertical migration in Spanish mainland

    Energy Technology Data Exchange (ETDEWEB)

    Olondo, C.; Legarda, F.; Herranz, M.; Idoeta, R. [The University of the Basque Country - UPV/EHU, Nuclear Engineering and Fluid Mechanics Dept. Faculty of Engineering, Alda. Urquijo 48013, Bilbao (Spain)

    2014-07-01

    As a result of a study of caesium and strontium activity migration in Spanish mainland soils, a convective-diffusive migration equation has been obtained that adequately reproduces the movement that an activity deposit would follow in this land. Taking into account the dependence of the apparent convection velocity on rainfall, a new migration parameter has been defined that depends only on soil properties. By means of a least-squares method, fitting the migration equation to experimental activity profiles, the values taken by the migration parameters in the studied soils, characteristic of that area, have been obtained. After that, the mean values of these parameters have been obtained for each of the groups that, depending on soil texture, were defined in the study of the movement of both radionuclides in soils and to which these soils belong. Using these mean values and the obtained equation, the vertical activity profiles that were experimentally determined have been properly reproduced. In order to validate these values, a new sampling programme is being carried out in the north of Spain and, with the information from the new sampling points, it will be verified whether the mean values obtained also reproduce the vertical activity profiles of these new sampling points. (authors)
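
    The abstract does not spell out the equation, but a convective-diffusive migration equation for a surface deposit is commonly written in the one-dimensional form below (quoted as an assumption, for illustration), where C(z,t) is the activity concentration at depth z, D the apparent diffusion coefficient, v the apparent convection velocity and λ the radioactive decay constant:

    ```latex
    \frac{\partial C(z,t)}{\partial t}
      = D\,\frac{\partial^{2} C(z,t)}{\partial z^{2}}
      - v\,\frac{\partial C(z,t)}{\partial z}
      - \lambda\,C(z,t)
    ```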

  16. Novel burn device for rapid, reproducible burn wound generation.

    Science.gov (United States)

    Kim, J Y; Dunham, D M; Supp, D M; Sen, C K; Powell, H M

    2016-03-01

    Scarring following full-thickness burns leads to significant reductions in range of motion and quality of life for burn patients. To effectively study scar development and the efficacy of anti-scarring treatments in a large animal model (female red Duroc pigs), reproducible, uniform, full-thickness burn wounds are needed to reduce the variability in observed results that occurs with differences in burn depth. Prior studies have proposed that the initial temperature of the burner, contact time with skin, thermal capacity of the burner material, and the amount of pressure applied to the skin need to be strictly controlled to ensure reproducibility. The purpose of this study was to develop a new burner that enables temperature and pressure to be digitally controlled and monitored in real time throughout burn wound creation, and to compare it to a standard burn device. A custom burn device was manufactured with an electrically heated burn stylus and a temperature-control feedback loop via an electronic microstat. Pressure monitoring was enabled by incorporation of a digital scale into the device, which measured downward force. The standard device comprised a heat-resistant handle with a long rod connected to the burn stylus, which was heated using a hot plate. To quantify skin surface temperature and internal stylus temperature as a function of contact time, the burners were heated to the target temperature (200 ± 5°C) and pressed into the skin for 40 s to create the thermal injuries. Time to reach target temperature and elapsed time between burns were recorded. In addition, each unit was evaluated for reproducibility within and across three independent users by generating burn wounds at contact times spanning from 5 to 40 s at a constant pressure, and at pressures of 1 or 3 lbs with a constant contact time of 40 s. Biopsies were collected for histological analysis and burn depth quantification using digital image analysis (ImageJ). The custom burn device maintained both its internal

  17. EMISSION SIGNATURES FROM SUB-PARSEC BINARY SUPERMASSIVE BLACK HOLES. I. DIAGNOSTIC POWER OF BROAD EMISSION LINES

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Khai; Bogdanović, Tamara [Center for Relativistic Astrophysics, School of Physics, Georgia Institute of Technology, Atlanta GA 30332 (United States)

    2016-09-10

    Motivated by advances in observational searches for sub-parsec supermassive black hole binaries (SBHBs) made in the past few years, we develop a semi-analytic model to describe spectral emission-line signatures of these systems. The goal of this study is to aid the interpretation of spectroscopic searches for binaries and to help test one of the leading models of binary accretion flows in the literature: SBHB in a circumbinary disk. In this work, we present the methodology and a comparison of the preliminary model with the data. We model SBHB accretion flows as a set of three accretion disks: two mini-disks that are gravitationally bound to the individual black holes and a circumbinary disk. Given a physically motivated parameter space occupied by sub-parsec SBHBs, we calculate a synthetic database of nearly 15 million broad optical emission-line profiles and explore the dependence of the profile shapes on characteristic properties of SBHBs. We find that the modeled profiles show distinct statistical properties as a function of the semimajor axis, mass ratio, eccentricity of the binary, and the degree of alignment of the triple disk system. This suggests that the broad emission-line profiles from SBHB systems can in principle be used to infer the distribution of these parameters and as such merit further investigation. Calculated profiles are more morphologically heterogeneous than the broad emission lines in observed SBHB candidates and we discuss improved treatment of radiative transfer effects, which will allow a direct statistical comparison of the two groups.

  18. EMISSION SIGNATURES FROM SUB-PARSEC BINARY SUPERMASSIVE BLACK HOLES. I. DIAGNOSTIC POWER OF BROAD EMISSION LINES

    International Nuclear Information System (INIS)

    Nguyen, Khai; Bogdanović, Tamara

    2016-01-01

    Motivated by advances in observational searches for sub-parsec supermassive black hole binaries (SBHBs) made in the past few years, we develop a semi-analytic model to describe spectral emission-line signatures of these systems. The goal of this study is to aid the interpretation of spectroscopic searches for binaries and to help test one of the leading models of binary accretion flows in the literature: SBHB in a circumbinary disk. In this work, we present the methodology and a comparison of the preliminary model with the data. We model SBHB accretion flows as a set of three accretion disks: two mini-disks that are gravitationally bound to the individual black holes and a circumbinary disk. Given a physically motivated parameter space occupied by sub-parsec SBHBs, we calculate a synthetic database of nearly 15 million broad optical emission-line profiles and explore the dependence of the profile shapes on characteristic properties of SBHBs. We find that the modeled profiles show distinct statistical properties as a function of the semimajor axis, mass ratio, eccentricity of the binary, and the degree of alignment of the triple disk system. This suggests that the broad emission-line profiles from SBHB systems can in principle be used to infer the distribution of these parameters and as such merit further investigation. Calculated profiles are more morphologically heterogeneous than the broad emission lines in observed SBHB candidates and we discuss improved treatment of radiative transfer effects, which will allow a direct statistical comparison of the two groups.

  19. Reproducibility of a 3-dimensional gyroscope in measuring shoulder anteflexion and abduction

    Directory of Open Access Journals (Sweden)

    Penning Ludo I F

    2012-07-01

    Full Text Available Abstract Background Few studies have investigated the use of a 3-dimensional gyroscope for measuring the range of motion (ROM) in the impaired shoulder. Reproducibility of digital inclinometer and visual estimation is poor. This study aims to investigate the reproducibility of a triaxial gyroscope in measurement of anteflexion, abduction and related rotations in the impaired shoulder. Methods Fifty-eight patients with either subacromial impingement (27) or osteoarthritis of the shoulder (31) participated. Active anteflexion, abduction and related rotations were measured with a triaxial gyroscope according to a test-retest protocol. Severity of shoulder impairment and patient-perceived pain were assessed by the Disability of Arm, Shoulder and Hand score (DASH) and the Visual Analogue Scale (VAS). VAS scores were recorded before and after testing. Results In two of the three hospitals patients with osteoarthritis (n = 31) were measured, in the third hospital patients with subacromial impingement (n = 27). There were significant differences among hospitals for the VAS and DASH scores measured before and after testing. The mean differences between the test and retest means for anteflexion were −6 degrees (affected side) and 9 degrees (contralateral side), and for abduction 15 degrees (affected side) and 10 degrees (contralateral side). Bland & Altman plots showed that the confidence intervals for the mean differences fell within −6 up to 15 degrees; individual test-retest differences could exceed these limits. A simulation according to ‘Generalizability Theory’ produces very good coefficients for anteflexion and related rotation as a comprehensive measure of reproducibility. Optimal reproducibility is achieved with 2 repetitions for anteflexion. Conclusions Measurements were influenced by patient-perceived pain. Differences in VAS and DASH might be explained by different underlying pathology. These differences in shoulder pathology however did not alter
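
    The Bland & Altman comparison mentioned above amounts to the mean test-retest difference (bias) and its 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with made-up angle measurements:

    ```python
    # Sketch of a Bland & Altman test-retest summary: mean bias and 95% limits
    # of agreement from paired measurements (hypothetical data).
    import numpy as np

    def bland_altman(test, retest):
        diff = np.asarray(test, float) - np.asarray(retest, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, bias - half_width, bias + half_width

    anteflexion_t1 = [150, 142, 160, 138, 155]   # degrees, illustrative only
    anteflexion_t2 = [148, 150, 158, 140, 151]
    print(bland_altman(anteflexion_t1, anteflexion_t2))
    ```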

  20. Disclosure, apology, and offer programs: stakeholders' views of barriers to and strategies for broad implementation.

    Science.gov (United States)

    Bell, Sigall K; Smulowitz, Peter B; Woodward, Alan C; Mello, Michelle M; Duva, Anjali Mitter; Boothman, Richard C; Sands, Kenneth

    2012-12-01

    The Disclosure, Apology, and Offer (DA&O) model, a response to patient injuries caused by medical care, is an innovative approach receiving national attention for its early success as an alternative to the existing inherently adversarial, inefficient, and inequitable medical liability system. Examples of DA&O programs, however, are few. Through key informant interviews, we investigated the potential for more widespread implementation of this model by provider organizations and liability insurers, defining barriers to implementation and strategies for overcoming them. Our study focused on Massachusetts, but we also explored themes that are broadly generalizable to other states. We found strong support for the DA&O model among key stakeholders, who cited its benefits for both the liability system and patient safety. The respondents did not perceive any insurmountable barriers to broad implementation, and they identified strategies that could be pursued relatively quickly. Such solutions would permit a range of organizations to implement the model without legislative hurdles. Although more data are needed about the outcomes of DA&O programs, the model holds considerable promise for transforming the current approach to medical liability and patient safety. © 2012 Milbank Memorial Fund.

  1. Reproducibility of liver position using active breathing coordinator for liver cancer radiotherapy

    International Nuclear Information System (INIS)

    Eccles, Cynthia; Brock, Kristy K.; Bissonnette, Jean-Pierre; Hawkins, Maria; Dawson, Laura A.

    2006-01-01

    Purpose: To measure the intrabreath-hold liver motion and the intrafraction and interfraction reproducibility of liver position relative to vertebral bodies using an active breathing coordinator (ABC) in patients with unresectable liver cancer treated with hypofractionated stereotactic body radiation therapy (SBRT). Methods: Tolerability of ABC and organ motion during ABC was assessed using kV fluoroscopy in 34 patients. For patients treated with ABC, repeat breath-hold CT scans in the ABC breath-hold position were acquired at simulation to estimate the volumetric intrafraction reproducibility of the liver relative to the vertebral bodies. In addition, preceding each radiation therapy fraction, with the liver immobilized using ABC, repeat anteroposterior (AP) megavoltage verification images were obtained. Off-line alignments were completed to determine intrafraction reproducibility (from repeat images obtained before one treatment) and interfraction reproducibility (from comparisons of the final image for each fraction with the AP) of diaphragm position relative to vertebral bodies. For each image set, the vertebral bodies were aligned, and the resultant craniocaudal (CC) offset in diaphragm position was measured. Liver position during ABC was also evaluated from kV fluoroscopy acquired at the time of simulation, kV fluoroscopy at the time of treatment, and from MV beam's-eye view movie loops acquired during treatment. Results: Twenty-one of 34 patients were screened to be suitable for ABC. The average free breathing range of these patients was 13 mm (range, 5-1 mm). Fluoroscopy revealed that the average maximal diaphragm motion during ABC breath-hold was 1.4 mm (range, 0-3.4 mm). The MV treatment movie loops confirmed diaphragm stability during treatment. For a measure of intrafraction reproducibility, an analysis of 36 repeat ABC computed tomography (CT) scans in 14 patients was conducted. The average mean difference in the liver surface position was -0.9 mm, -0

  2. The reproducibility of random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    RAPD) profiles of Streptococcus thermophilus strains by using the polymerase chain reaction (PCR). Several factors can cause the amplification of false and non reproducible bands in the RAPD profiles. We tested three primers, OPI-02 MOD, ...

  3. Timbral aspects of reproduced sound in small rooms. II

    DEFF Research Database (Denmark)

    Bech, Søren

    1996-01-01

    A single loudspeaker with frequency-dependent directivity characteristics, positioned in a room of normal size with frequency-dependent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, seventeen individual reflections and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections. The results have confirmed that the first-order floor reflection is likely to contribute individually to the timbre of reproduced noise. However, for a speech signal none of the investigated reflections will contribute individually to the timbre. It is suggested that the threshold of detection is determined by the spectral changes in the dominant frequency range of 500 Hz to 2 kHz. For increases in the level of individual reflections, the most likely

  4. A computational model for histone mark propagation reproduces the distribution of heterochromatin in different human cell types.

    Science.gov (United States)

    Schwämmle, Veit; Jensen, Ole Nørregaard

    2013-01-01

    Chromatin is a highly compact and dynamic nuclear structure that consists of DNA and associated proteins. The main organizational unit is the nucleosome, which consists of a histone octamer with DNA wrapped around it. Histone proteins are implicated in the regulation of eukaryote genes and they carry numerous reversible post-translational modifications that control DNA-protein interactions and the recruitment of chromatin binding proteins. Heterochromatin, the transcriptionally inactive part of the genome, is densely packed and contains histone H3 that is methylated at Lys 9 (H3K9me). The propagation of H3K9me in nucleosomes along the DNA in chromatin is antagonized by methylation of H3 Lysine 4 (H3K4me) and acetylation of several lysines, which are related to euchromatin and active genes. We show that the related histone modifications form antagonistic domains on a coarse scale. These histone marks are assumed to be initiated within distinct nucleation sites in the DNA and to propagate bi-directionally. We propose a simple computer model that simulates the distribution of heterochromatin in human chromosomes. The simulations are in agreement with previously reported experimental observations from two different human cell lines. We reproduced different types of barriers between heterochromatin and euchromatin, providing a unified model for their function. The effects of changes in the nucleation site distribution and in propagation rates were studied. The former occurs mainly with the aim of (de-)activating single genes or gene groups, and the latter has the power to control the transcriptional programs of entire chromosomes. Generally, the regulatory program of gene transcription is controlled by the distribution of nucleation sites along the DNA string.

  5. 77 FR 665 - Endangered and Threatened Wildlife and Plants; Listing Two Distinct Population Segments of Broad...

    Science.gov (United States)

    2012-01-05

    ... threat to the broad-snouted caiman, what regional climate change models are available, and whether they are reliable and credible to use as a step-down model for assessing the effects of climate change on... waterways, including rivers near waterfalls such as Iguazú, and freshwater creeks with rocky bottoms...

  6. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    Science.gov (United States)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and non-stationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm²). This technical configuration was assumed to be conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the cavitation spectrum integration on a logarithmic scale, and the excitation power was automatically corrected. The device generates a stable and reproducible cavitation level over a wide range of cavitation setpoints, from stable cavitation conditions up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of chemical bubble-collapse effects. Hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results showed great variability whatever the excitation power; in contrast, the closed loop provided high reproducibility. This device was implemented for the study of sonodynamic effects. The regulation provides more reproducible results, independent of cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern the internalization of different particles (Quantum Dots), molecules (siRNA) or plasmids (GFP, DsRed) into different
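
    A minimal sketch of the regulation idea described above: derive a cavitation index from each 5 ms hydrophone frame as the mean of the log-scaled spectrum, then nudge the excitation power toward a setpoint. The gain and setpoint values are hypothetical; only the intensity limits are taken from the abstract.

    ```python
    # Toy version of the regulated cavitation loop: cavitation index from a
    # hydrophone frame, plus a proportional correction of the drive power.
    # Gain and setpoint are made up; intensity limits follow the abstract.
    import numpy as np

    def cavitation_index(frame):
        frame = np.asarray(frame, dtype=float)
        spectrum = np.abs(np.fft.rfft(frame * np.hanning(frame.size)))
        return float(np.mean(20.0 * np.log10(spectrum + 1e-12)))  # dB-scale mean

    def update_power(power, index, setpoint, gain=0.01,
                     p_min=0.08, p_max=1.09):        # W/cm^2 range from the abstract
        power += gain * (setpoint - index)           # simple proportional correction
        return float(np.clip(power, p_min, p_max))

    # For each 5 ms frame from the hydrophone:
    # idx = cavitation_index(frame)
    # power = update_power(power, idx, setpoint=-40.0)
    ```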

  7. Reproducibility of Computer-Aided Detection Marks in Digital Mammography

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Kim, Sun Mi; Im, Jung Gi; Cha, Joo Hee

    2007-01-01

    To evaluate the performance and reproducibility of a computer-aided detection (CAD) system in mediolateral oblique (MLO) digital mammograms taken serially, without release of breast compression. A CAD system was applied preoperatively to full-field digital mammograms of two MLO views taken without release of breast compression in 82 patients (age range: 33-83 years; mean age: 49 years) with previously diagnosed breast cancers. The total number of visible lesion components in the 82 patients was 101: 66 masses and 35 microcalcifications. We analyzed the sensitivity and reproducibility of the CAD marks. The sensitivity of the CAD system for the first MLO views was 71% (47/66) for masses and 80% (28/35) for microcalcifications. The sensitivity of the CAD system for the second MLO views was 68% (45/66) for masses and 17% (6/35) for microcalcifications. In 84 ipsilateral serial MLO image sets (two patients had bilateral cancers), identical images, regardless of the existence of CAD marks, were obtained for 35% (29/84), and identical images with CAD marks were obtained for 29% (23/78). For contralateral MLO images, identical images regardless of the existence of CAD marks were obtained for 65% (52/80), and identical images with CAD marks were obtained for 28% (11/39). The reproducibility of CAD marks for true positive masses in serial MLO views was 84% (42/50), and that for true positive microcalcifications was 0% (0/34). The CAD system in digital mammograms showed a high sensitivity for detecting masses and microcalcifications. However, the reproducibility of microcalcification marks was very low in MLO views taken serially without release of breast compression. Minute positional changes and patient movement can alter the images and have a significant effect on the algorithm utilized by the CAD for detecting microcalcifications

  8. Reproducibility of heart rate variability, blood pressure variability and baroreceptor sensitivity during rest and head-up tilt

    DEFF Research Database (Denmark)

    Højgaard, Michael V; Agner, Erik; Kanters, Jørgen K

    2005-01-01

    OBJECTIVE: Previous studies have indicated moderate-to-poor reproducibility of heart rate variability (HRV), but the reproducibility of blood pressure variability (BPV) and spectral measures of baroreceptor sensitivity (BRS) is not well established. METHODS: We measured normal-to-normal heart beat … pressures were extracted for the assessment of day-to-day and short-term reproducibility. Power spectrum analysis (Fourier) and transfer function analysis were performed. Reproducibility was assessed using the coefficient of variation (CV). The reproducibility of the mean RR interval, mean systolic, diastolic and mean blood pressure was good (CV …); reproducibility of the spectral parameters of HRV (CV range 18-36%) and BPV (16-44%) was lower, and reproducibility of BRS was moderate (14-20%). CONCLUSION: Spectral estimates of BRS had only moderate reproducibility, although

  9. Design of 2.5 GHz broad bandwidth microwave bandpass filter at operating frequency of 10 GHz using HFSS

    Science.gov (United States)

    Jasim, S. E.; Jusoh, M. A.; Mahmud, S. N. S.; Zamani, A. H.

    2018-04-01

    Development of low-loss, small-size, broad-bandwidth microwave bandpass filters operating at higher frequencies is an active area of research. This paper presents a new route for designing and simulating a microwave bandpass filter using finite element modelling, realizing a broad-bandwidth, low-loss, small-dimension microwave bandpass filter operating at a frequency of 10 GHz using the return loss method. The filter circuit was designed using Computer-Aided Design (CAD) in the Ansoft HFSS software, with a four parallel coupled-line model and small dimensions (10 × 10 mm²) on a LaAlO3 substrate. The response of the microwave filter circuit showed a high return loss of -50 dB at an operating frequency of 10.4 GHz and a broad bandwidth of 2.5 GHz from 9.5 to 12 GHz. The results indicate that filter design and simulation using HFSS is reliable and has the potential to be transferred from laboratory experiments to industry.
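
    The reported figures imply a very small reflection coefficient at the match and a fractional bandwidth of roughly 23%; the generic conversions are sketched below (they are not part of the HFSS design flow itself).

    ```python
    # Quick conversions implied by the reported figures: return loss to
    # reflection coefficient, and fractional bandwidth of the 9.5-12 GHz band.
    def reflection_coefficient(return_loss_db):
        return 10 ** (-abs(return_loss_db) / 20.0)

    def fractional_bandwidth(f_low_ghz, f_high_ghz):
        f_center = 0.5 * (f_low_ghz + f_high_ghz)
        return (f_high_ghz - f_low_ghz) / f_center

    print(reflection_coefficient(50))             # ~0.003 at the -50 dB match
    print(fractional_bandwidth(9.5, 12.0) * 100)  # ~23% fractional bandwidth
    ```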

  10. Stable, precise, and reproducible patterning of bicoid and hunchback molecules in the early Drosophila embryo.

    Directory of Open Access Journals (Sweden)

    Yurie Okabe-Oho

    2009-08-01

    Full Text Available Precise patterning of morphogen molecules and their accurate reading out are of key importance in embryonic development. Recent experiments have visualized distributions of proteins in developing embryos and shown that the gradient of concentration of Bicoid morphogen in Drosophila embryos is established rapidly after fertilization and remains stable through syncytial mitoses. This stable Bicoid gradient is read out in a precise way to distribute Hunchback with small fluctuations in each embryo and in a reproducible way, with small embryo-to-embryo fluctuation. The mechanisms of such stable, precise, and reproducible patterning through noisy cellular processes, however, still remain mysterious. To address these issues, here we develop the one- and three-dimensional stochastic models of the early Drosophila embryo. The simulated results show that the fluctuation in expression of the hunchback gene is dominated by the random arrival of Bicoid at the hunchback enhancer. Slow diffusion of Hunchback protein, however, averages out this intense fluctuation, leading to the precise patterning of distribution of Hunchback without loss of sharpness of the boundary of its distribution. The coordinated rates of diffusion and transport of input Bicoid and output Hunchback play decisive roles in suppressing fluctuations arising from the dynamical structure change in embryos and those arising from the random diffusion of molecules, and give rise to the stable, precise, and reproducible patterning of Bicoid and Hunchback distributions.

  11. Reproducibility Test for Thermoluminescence Dosimeter (TLD) Using TLD Radpro

    International Nuclear Information System (INIS)

    Nur Khairunisa Zahidi; Ahmad Bazlie Abdul Kadir; Faizal Azrin Abdul Razalim

    2016-01-01

    Thermoluminescence dosimeters (TLDs) are one type of dosimeter often used as a substitute for the film badge. Like a film badge, a TLD is worn for a period of time and then must be processed to determine the dose received. This study tested the reproducibility of TLDs using the Radpro reader. The study aimed to determine the dose obtained by TLD-100 chips when irradiated with a Co-60 gamma source and to test the effectiveness of the TLD Radpro reader as a machine for analysing TLDs. Ten TLD-100 chips were irradiated using an Eldorado machine with a Co-60 source at a distance of 5 meters from the source with a 2 mSv dose exposure. After the irradiation process, the TLD-100 chips were read using the TLD Radpro reader. These steps were repeated nine times to obtain the reproducibility coefficient, r_i. The dose readings obtained from the experiment were almost equivalent to the actual dose. Results show that the average value obtained for the reproducibility coefficient, r_i, is 6.39%, which is less than 10%. In conclusion, the dose obtained from the experiment is considered accurate because its value was almost equivalent to the actual dose, and the TLD Radpro was verified as a good reader for analysing TLDs. (author)
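
    The abstract does not define the reproducibility coefficient r_i explicitly; one common choice, assumed here only for illustration, is the coefficient of variation of the repeated dose readings expressed as a percentage. The readings below are made up.

    ```python
    # Sketch under an assumption: treat r_i as the coefficient of variation (%)
    # of repeated dose readings of a chip. The exact definition is not given
    # in the abstract, and the readings here are invented.
    import statistics

    def reproducibility_coefficient(readings_msv):
        return 100.0 * statistics.stdev(readings_msv) / statistics.mean(readings_msv)

    readings = [1.95, 2.05, 2.10, 1.90, 2.02, 1.98, 2.08, 1.93, 2.04, 1.99]
    print(f"r_i = {reproducibility_coefficient(readings):.2f} %")  # acceptance: < 10 %
    ```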

  12. Repeatability and Reproducibility of Fibre-Based Nanogenerator Synthesized by Electrospinning Machine

    International Nuclear Information System (INIS)

    Suyitno; Huda, Sholiehul; Arifin, Zainal; Hadi, Syamsul; Lambang, Raymundus Lullus

    2014-01-01

    Zinc oxide fibre-based nanogenerators, synthesized easily with an electrospinning machine, are promising for harvesting electricity from mechanical energy. However, repeatability and reproducibility are two major factors that need to be investigated to minimize product failure and to determine the feasibility of mass production of nanogenerators. The green zinc oxide fibres were produced with an electrospinning machine from a zinc acetate and polyvinyl alcohol solution at a flow rate of 4 μL/min, followed by sintering at a temperature of 550°C with a heating rate of 240°C/h. Each of the 10 nanogenerators was tested by three trained operators with three repetitions at a compressive load of 0.5 kg. The nanogenerators revealed maximum output voltages ranging from 203 to 217 mV. The combined repeatability and reproducibility value of the nanogenerators was approximately 24.29%, showing that the nanogenerators were still acceptable for mass production. The relatively low reproducibility was mainly due to the operators, so the checklist needs to be made easier and simpler for all the variables affecting the quality of the fibres. Reducing the repeatability and reproducibility value is an interesting subject for further study, for example by creating a rotating collector so that the thickness and orientation of the fibres can be better controlled
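
    A crossed study of this kind (10 devices × 3 operators × 3 repetitions) is usually summarized as a percentage gauge repeatability and reproducibility (%GR&R). The sketch below is a simplified variance-components version that ignores the operator-by-part interaction and the corrections of a formal ANOVA gauge study; the voltages are simulated, not the authors' data.

    ```python
    # Simplified gauge R&R sketch for a crossed study (parts x operators x
    # repeats). Interaction terms and small-sample corrections are omitted,
    # so treat the number it prints as an illustration only.
    import numpy as np

    def percent_grr(data):
        # data shape: (n_parts, n_operators, n_repeats), e.g. output voltage in mV
        repeatability = np.mean(np.var(data, axis=2, ddof=1))      # within cells
        reproducibility = np.var(data.mean(axis=(0, 2)), ddof=1)   # between operators
        part_to_part = np.var(data.mean(axis=(1, 2)), ddof=1)      # between devices
        total = repeatability + reproducibility + part_to_part
        return 100.0 * np.sqrt((repeatability + reproducibility) / total)

    rng = np.random.default_rng(1)
    part = rng.normal(210, 10, size=(10, 1, 1))    # device-to-device spread
    operator = rng.normal(0, 2, size=(1, 3, 1))    # systematic operator offsets
    noise = rng.normal(0, 3, size=(10, 3, 3))      # repetition noise
    print(f"%GR&R ~ {percent_grr(part + operator + noise):.1f} %")
    ```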

  13. Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study

    Directory of Open Access Journals (Sweden)

    van de Water Tanja

    2017-01-01

    Full Text Available Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) and for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time discriminating between elite and non-elite players (F = 6.650, p < 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with a national ranking for elite (p = 0.70, p … badminton-specific reaction time, nor both components of inhibitory control (p > 0.05). In conclusion, reproducibility and validity of inhibitory control assessment were not confirmed; however, the BRIT appears to be a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players’ performance.

  14. Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study.

    Science.gov (United States)

    van de Water, Tanja; Huijgen, Barbara; Faber, Irene; Elferink-Gemser, Marije

    2017-01-01

    Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) and for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time discriminating between elite and non-elite players (F = 6.650, p < 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with a national ranking for elite (p = 0.70, p … badminton-specific reaction time, nor both components of inhibitory control (p > 0.05). In conclusion, reproducibility and validity of inhibitory control assessment were not confirmed; however, the BRIT appears to be a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players' performance.

  15. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as geo-enabled OGC Web Processing Service (WPS) processes. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and
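
    The classification step described above (a self-organizing map whose twelve nodes correspond to the twelve archetypes) can be illustrated with the third-party MiniSom package. The original GLUES analysis is an R script published through WPS4R, so the snippet below is only a stand-in trained on random data.

    ```python
    # Tiny self-organizing-map illustration using MiniSom (pip install minisom).
    # Random numbers stand in for the 30 normalized land-use indicators.
    import numpy as np
    from minisom import MiniSom

    rng = np.random.default_rng(42)
    indicators = rng.random((1000, 30))          # one row per grid cell

    som = MiniSom(x=3, y=4, input_len=30, sigma=1.0, learning_rate=0.5,
                  random_seed=42)                # 3 x 4 = 12 nodes ~ 12 archetypes
    som.random_weights_init(indicators)
    som.train_random(indicators, num_iteration=5000)

    # Each cell is assigned the archetype of its best-matching SOM node.
    archetypes = [som.winner(row) for row in indicators]
    print(archetypes[:5])
    ```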

  16. Can cancer researchers accurately judge whether preclinical reports will reproduce?

    Directory of Open Access Journals (Sweden)

    Daniel Benjamin

    2017-06-01

    Full Text Available There is vigorous debate about the reproducibility of research findings in cancer biology. Whether scientists can accurately assess which experiments will reproduce original findings is important to determining the pace at which science self-corrects. We collected forecasts from basic and preclinical cancer researchers on the first 6 replication studies conducted by the Reproducibility Project: Cancer Biology (RP:CB) to assess the accuracy of expert judgments on specific replication outcomes. On average, researchers forecasted a 75% probability of replicating the statistical significance and a 50% probability of replicating the effect size, yet none of these studies successfully replicated on either criterion (for the 5 studies with results reported). Accuracy was related to expertise: experts with higher h-indices were more accurate, whereas experts with more topic-specific expertise were less accurate. Our findings suggest that experts, especially those with specialized knowledge, were overconfident about the RP:CB replicating individual experiments within published reports; researcher optimism likely reflects a combination of overestimating the validity of original studies and underestimating the difficulties of repeating their methodologies.

  17. Intercenter reproducibility of binary typing for Staphylococcus aureus

    NARCIS (Netherlands)

    van Leeuwen, Willem B.; Snoeijers, Sandor; van der Werken-Libregts, Christel; Tuip, Anita; van der Zee, Anneke; Egberink, Diane; de Proost, Monique; Bik, Elisabeth; Lunter, Bjorn; Kluytmans, Jan; Gits, Etty; van Duyn, Inge; Heck, Max; van der Zwaluw, Kim; Wannet, Wim; Noordhoek, Gerda T.; Mulder, Sije; Renders, Nicole; Boers, Miranda; Zaat, Sebastiaan; van der Riet, Daniëlle; Kooistra, Mirjam; Talens, Adriaan; Dijkshoorn, Lenie; van der Reyden, Tanny; Veenendaal, Dick; Bakker, Nancy; Cookson, Barry; Lynch, Alisson; Witte, Wolfgang; Cuny, Christa; Blanc, Dominique; Vernez, Isabelle; Hryniewicz, Waleria; Fiett, Janusz; Struelens, Marc; Deplano, Ariane; Landegent, Jim; Verbrugh, Henri A.; van Belkum, Alex

    2002-01-01

    The reproducibility of the binary typing (BT) protocol developed for epidemiological typing of Staphylococcus aureus was analyzed in a biphasic multicenter study. In a Dutch multicenter pilot study, 10 genetically unique isolates of methicillin-resistant S. aureus (MRSA) were characterized by the BT

  18. Intra-observer reproducibility and diagnostic performance of breast shear-wave elastography in Asian women.

    Science.gov (United States)

    Park, Hye Young; Han, Kyung Hwa; Yoon, Jung Hyun; Moon, Hee Jung; Kim, Min Jung; Kim, Eun-Kyung

    2014-06-01

    Our aim was to evaluate the intra-observer reproducibility of shear-wave elastography (SWE) in Asian women. Sixty-four breast masses (24 malignant, 40 benign) were examined with SWE in 53 consecutive Asian women (mean age, 44.9 years). Two SWE images were obtained for each of the lesions. Intra-observer reproducibility was assessed by intra-class correlation coefficients (ICC). We also evaluated various clinicoradiologic factors that can influence reproducibility in SWE. The ICC of intra-observer reproducibility was 0.789. In the clinicoradiologic factor evaluation, masses surrounded by mixed fatty and glandular tissue (ICC: 0.619) showed lower intra-observer reproducibility than lesions surrounded by glandular tissue alone (ICC: 0.937; p …). The intra-observer reproducibility of breast SWE was excellent in Asian women. However, it may decrease when the breast tissue forms a heterogeneous background. Therefore, SWE should be performed carefully in these cases. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  19. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...

  20. Transition questions in clinical practice - validity and reproducibility

    DEFF Research Database (Denmark)

    Lauridsen, Henrik Hein

    2008-01-01

    Transition questions in clinical practice - validity and reproducibility Lauridsen HH1, Manniche C3, Grunnet-Nilsson N1, Hartvigsen J1,2 1 Clinical Locomotion Science, Institute of Sports Science and Clinical Biomechanics, University of Southern Denmark, Odense, Denmark. e-mail: hlauridsen@health.sdu.dk 2 Nordic Institute of Chiropractic and Clinical Biomechanics, Part of Clinical Locomotion Science, Odense, Denmark 3 Backcenter Funen, Part of Clinical Locomotion Science, Ringe, Denmark Abstract Understanding a change score is indispensable for interpretation of results from clinical studies … are reproducible in patients with low back pain and/or leg pain. Despite critique of several biases, our results have reinforced the construct validity of TQs as an outcome measure, since only 1 hypothesis was rejected. On the basis of our findings we have outlined a proposal for a standardised use of transition

  1. Statistical measure of ensemble non reproducibility and correction to Bell's inequality

    International Nuclear Information System (INIS)

    Khrennikov, A.

    2000-01-01

    In this work, the proof of Bell's inequality is analysed, and it is demonstrated that this inequality is related to one particular model of probability theory, namely the Kolmogorov measure-theoretic axiomatics of 1933. A (numerical) statistical correction to Bell's inequality was found. Such an additional term ε_φ on the right-hand side of Bell's inequality can be considered as a probability invariant of a quantum state φ. This is a measure of the non-reproducibility of hidden variables in different runs of experiments. Experiments to verify Bell's inequality can be considered as experiments to estimate the constant ε_φ. It seems that Bell's inequality cannot be used as a crucial argument to deny local realism
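
    In the CHSH form of Bell's inequality that is commonly tested in such experiments (quoted here as an illustration rather than verbatim from the paper), the proposed correction amounts to adding the invariant ε_φ to the classical bound:

    ```latex
    \left| E(a,b) + E(a,b') + E(a',b) - E(a',b') \right| \;\le\; 2 + \varepsilon_{\phi}
    ```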

  2. How well can DFT reproduce key interactions in Ziegler-Natta systems?

    KAUST Repository

    Correa, Andrea

    2013-08-08

    The performance of density functional theory in reproducing some of the main interactions occurring in MgCl2-supported Ziegler-Natta catalytic systems is assessed. Eight model systems, representative of key interactions occurring in Ziegler-Natta catalysts, are selected. Fifteen density functionals are tested in combination with two different basis sets, namely TZVP and cc-pVTZ. As a general result, we found that the best performance is achieved by the PBEh1PBE hybrid generalized gradient approximation (GGA) functional, although the cheaper PBEh GGA functional also gives rather good results. The failure of the popular B3LYP and BP86 functionals is noticeable. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Evaluation of CMIP5 Ability to Reproduce 20th Century Regional Trends in Surface Air Temperature and Precipitation over CONUS

    Science.gov (United States)

    Lee, J.; Waliser, D. E.; Lee, H.; Loikith, P. C.; Kunkel, K.

    2017-12-01

    Monitoring temporal changes in key climate variables, such as surface air temperature and precipitation, is an integral part of the ongoing efforts of the United States National Climate Assessment (NCA). Climate models participating in CMIP5 provide future trends for four different emissions scenarios. In order to have confidence in the future projections of surface air temperature and precipitation, it is crucial to evaluate the ability of CMIP5 models to reproduce observed trends for three different time periods (1895-1939, 1940-1979, and 1980-2005). Towards this goal, trends in surface air temperature and precipitation obtained from the NOAA nClimGrid 5 km gridded station observation-based product are compared during all three time periods to the 206 CMIP5 historical simulations from 48 unique GCMs and their multi-model ensemble (MME) for NCA-defined climate regions during summer (JJA) and winter (DJF). This evaluation quantitatively examines the biases of the simulated trends of spatially averaged temperature and precipitation in the NCA climate regions. The CMIP5 MME reproduces historical surface air temperature trends for JJA for all time periods and all regions, except the Northern Great Plains from 1895-1939 and the Southeast during 1980-2005. Likewise, for DJF, the MME reproduces historical surface air temperature trends across all time periods over all regions except the Southeast from 1895-1939 and the Midwest during 1940-1979. The Regional Climate Model Evaluation System (RCMES), an analysis tool which supports the NCA by providing access to data and tools for regional climate model validation, facilitates the comparisons between the models and observations. The RCMES Toolkit is designed to assist in the analysis of climate variables and in the evaluation of climate projections that support decision-making. This tool is used in conjunction with the above analysis, and results will be presented to demonstrate its capability to
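
    For each region and season, the comparison described above reduces to fitting a least-squares linear trend to the spatially averaged series from the model and from the observations and differencing the two. A minimal sketch with synthetic annual means (the series here are made up, not nClimGrid or CMIP5 output):

        import numpy as np

        def decadal_trend(years, values):
            """Least-squares linear trend, expressed per decade."""
            slope_per_year, _intercept = np.polyfit(years, values, deg=1)
            return 10.0 * slope_per_year

        years = np.arange(1980, 2006)
        obs_tas = 14.0 + 0.020 * (years - 1980) + np.random.default_rng(0).normal(0, 0.15, years.size)
        mod_tas = 14.1 + 0.015 * (years - 1980) + np.random.default_rng(1).normal(0, 0.15, years.size)

        bias = decadal_trend(years, mod_tas) - decadal_trend(years, obs_tas)
        print(f"trend bias (model - obs): {bias:+.3f} degC per decade")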

  4. Preparation of a reproducible long-acting formulation of risperidone-loaded PLGA microspheres using microfluidic method.

    Science.gov (United States)

    Jafarifar, Elham; Hajialyani, Marziyeh; Akbari, Mona; Rahimi, Masoud; Shokoohinia, Yalda; Fattahi, Ali

    2017-09-01

    The aim of the present study is to prepare risperidone-loaded poly(lactic-co-glycolic acid) (PLGA) microspheres within a microfluidic system and to achieve a formulation with uniform size and a monotonic, reproducible release profile. In comparison to a batch method, T-junction and serpentine chips were utilized, and an optimization study was carried out at different processing parameters (e.g., PLGA and surfactant concentration and the flow-rate ratio of outer to inner phase). Computational fluid dynamics (CFD) modeling was performed, and loading and release studies were carried out. The CFD simulation indicates that increasing the flow rate of the aqueous phase decreases the droplet size, while the change in microsphere size did not follow a specific pattern in the experimental results. The most uniform microspheres and the narrowest standard deviation (66.79 μm ± 3.32) were achieved using the T-junction chip, 1% polyvinyl alcohol, 1% PLGA, and a flow-rate ratio of 20. The microfluidic-assisted microspheres were more uniform, with a narrower size distribution. The release of risperidone from microspheres produced by the microfluidic method was more reproducible and closer to a zero-order kinetic model. The formulation with a 2:1 drug-to-polymer ratio showed the most favorable release profile, in which 41.85% release could be achieved during 24 days.
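
    The statement that the release was "closer to a zero-order kinetic model" can be made concrete by fitting the cumulative release to R(t) = k0·t and checking the goodness of fit. The sketch below uses illustrative numbers, not the experimental release data:

        import numpy as np
        from scipy.optimize import curve_fit

        def zero_order(t, k0):
            """Cumulative % released under zero-order kinetics."""
            return k0 * t

        days = np.array([1, 3, 6, 10, 14, 18, 21, 24], dtype=float)
        released = np.array([2.1, 5.8, 11.0, 17.9, 24.6, 31.2, 36.4, 41.9])  # illustrative %

        (k0,), _cov = curve_fit(zero_order, days, released)
        residuals = released - zero_order(days, k0)
        r_squared = 1 - np.sum(residuals**2) / np.sum((released - released.mean())**2)
        print(f"k0 = {k0:.2f} %/day, R^2 = {r_squared:.3f}")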

  5. Trouble with diffusion: Reassessing hillslope erosion laws with a particle-based model

    Science.gov (United States)

    Tucker, Gregory E.; Bradley, D. Nathan

    2010-03-01

    Many geomorphic systems involve a broad distribution of grain motion length scales, ranging from a few particle diameters to the length of an entire hillslope or stream. Studies of analogous physical systems have revealed that such broad motion distributions can have a significant impact on macroscale dynamics and can violate the assumptions behind standard, local gradient flux laws. Here, a simple particle-based model of sediment transport on a hillslope is used to study the relationship between grain motion statistics and macroscopic landform evolution. Surface grains are dislodged by random disturbance events with probabilities and distances that depend on local microtopography. Despite its simplicity, the particle model reproduces a surprisingly broad range of slope forms, including asymmetric degrading scarps and cinder cone profiles. At low slope angles the dynamics are diffusion like, with a short-range, thin-tailed hop length distribution, a parabolic, convex upward equilibrium slope form, and a linear relationship between transport rate and gradient. As slope angle steepens, the characteristic grain motion length scale begins to approach the length of the slope, leading to planar equilibrium forms that show a strongly nonlinear correlation between transport rate and gradient. These high-probability, long-distance motions violate the locality assumption embedded in many common gradient-based geomorphic transport laws. The example of a degrading scarp illustrates the potential for grain motion dynamics to vary in space and time as topography evolves. This characteristic renders models based on independent, stationary statistics inapplicable. An accompanying analytical framework based on treating grain motion as a survival process is briefly outlined.
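
    The flavor of such a particle-based scheme can be conveyed with a deliberately stripped-down 1-D sketch in which randomly disturbed surface grains hop downslope a distance drawn from a slope-dependent distribution. This is an illustration of the general idea only, with made-up rules, not the authors' model:

        import numpy as np

        rng = np.random.default_rng(42)
        n_cells, n_events = 100, 200_000
        height = np.linspace(50.0, 0.0, n_cells)   # initial ramp of grain-column heights

        for _ in range(n_events):
            i = rng.integers(0, n_cells - 1)                        # site hit by a disturbance
            local_slope = max(height[i] - height[i + 1], 0.0)
            if rng.random() < local_slope / (1.0 + local_slope):    # steeper sites dislodge grains more often
                hop = int(rng.geometric(p=1.0 / (1.0 + local_slope)))  # hop length >= 1, longer on steep slopes
                j = min(i + hop, n_cells - 1)
                height[i] -= 1.0                                    # move one grain downslope
                height[j] += 1.0

        print("final relief:", round(height.max() - height.min(), 2))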

  6. Chimeric Hemagglutinin Constructs Induce Broad Protection against Influenza B Virus Challenge in the Mouse Model

    OpenAIRE

    Ermler, Megan E.; Kirkpatrick, Ericka; Sun, Weina; Hai, Rong; Amanat, Fatima; Chromikova, Veronika; Palese, Peter; Krammer, Florian

    2017-01-01

    Seasonal influenza virus epidemics represent a significant public health burden. Approximately 25% of all influenza virus infections are caused by type B viruses, and these infections can be severe, especially in children. Current influenza virus vaccines are an effective prophylaxis against infection but are impacted by rapid antigenic drift, which can lead to mismatches between vaccine strains and circulating strains. Here, we describe a broadly protective vaccine candidate based on chimeri...

  7. A Broad 22 Micron Emission Feature in the Carina Nebula H ii Region.

    Science.gov (United States)

    Chan; Onaka

    2000-04-10

    We report the detection of a broad 22 µm emission feature in the Carina Nebula H ii region by the Infrared Space Observatory (ISO) short-wavelength spectrometer. The feature shape is similar to that of the 22 µm emission feature of newly synthesized dust observed in the Cassiopeia A supernova remnant. This finding suggests that both of the features are arising from the same carrier and that supernovae are probably the dominant production sources of this new interstellar grain. A similar broad emission dust feature is also found in the spectra of two starburst galaxies from the ISO archival data. This new dust grain could be an abundant component of interstellar grains and can be used to trace the supernova rate or star formation rate in external galaxies. The existence of the broad 22 µm emission feature complicates the dust model for starburst galaxies and must be taken into account correctly in the derivation of dust color temperature. Mg protosilicate has been suggested as the carrier of the 22 µm emission dust feature observed in Cassiopeia A. The present results provide useful information in studies on the chemical composition and emission mechanism of the carrier.

  8. Failed Radiatively Accelerated Dusty Outflow Model of the Broad Line Region in Active Galactic Nuclei. I. Analytical Solution

    Energy Technology Data Exchange (ETDEWEB)

    Czerny, B.; Panda, S.; Wildy, C.; Sniegowska, M. [Center for Theoretical Physics, Polish Academy of Sciences, Al. Lotników 32/46, 02-668 Warsaw (Poland); Li, Yan-Rong; Wang, J.-M. [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Hryniewicz, K.; Sredzinska, J. [Copernicus Astronomical Center, Polish Academy of Sciences, Bartycka 18, 00-716 Warsaw (Poland); Karas, V., E-mail: bcz@cft.edu.pl [Astronomical Institute, Academy of Sciences, Bocni II 1401, CZ-141 00 Prague (Czech Republic)

    2017-09-10

    The physical origin of the broad line region in active galactic nuclei is still unclear despite many years of observational studies. The reason is that the region is unresolved, and the reverberation mapping results imply a complex velocity field. We adopt a theory-motivated approach to identify the principal mechanism responsible for this complex phenomenon. We consider the possibility that the role of dust is essential. We assume that the local radiation pressure acting on the dust in the accretion disk atmosphere launches the outflow of material, but higher above the disk the irradiation from the central parts causes dust evaporation and a subsequent fallback. This failed radiatively accelerated dusty outflow is expected to represent the material forming low ionization lines. In this paper we formulate simple analytical equations to describe the cloud motion, including the evaporation phase. The model is fully described just by the basic parameters of black hole mass, accretion rate, black hole spin, and viewing angle. We study how the generic spectral line profiles correspond to these dynamics. We show that the virial factor calculated from our model strongly depends on the black hole mass in the case of enhanced dust opacity, and thus correlates with the line width. This could explain why the virial factor measured in galaxies with pseudobulges differs from that obtained from objects with classical bulges, although the trend predicted by the current version of the model is opposite to the observed trend.

  9. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low energy mass spectrometry. Although uncertainties for accelerator measured isotope ratios as low as 1% have been obtained under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics and generally the reproducibility is unknown since the samples are only measured once

  10. Reproducibility of the results in ultrasonic testing

    International Nuclear Information System (INIS)

    Chalaye, M.; Launay, J.P.; Thomas, A.

    1980-12-01

    This memorandum reports on the conclusions of the tests carried out in order to evaluate the reproducibility of ultrasonic tests made on welded joints. FRAMATOME has started a study to assess the dispersion of results afforded by the test line and to characterize its behaviour. The tests covered sensors and ultrasonic generators said to be identical to each other (same commercial batch). [fr]

  11. Psychometric Evaluation of the Brachial Assessment Tool Part 1: Reproducibility.

    Science.gov (United States)

    Hill, Bridget; Williams, Gavin; Olver, John; Ferris, Scott; Bialocerkowski, Andrea

    2018-04-01

    To evaluate reproducibility (reliability and agreement) of the Brachial Assessment Tool (BrAT), a new patient-reported outcome measure for adults with traumatic brachial plexus injury (BPI). Prospective repeated-measures design. Outpatient clinics. Adults with confirmed traumatic BPI (N=43; age range, 19-82y). People with BPI completed the 31-item, 4-response BrAT twice, 2 weeks apart. Results for the 3 subscales and summed score were compared at time 1 and time 2 to determine reliability, including systematic differences using paired t tests, test-retest reliability using intraclass correlation coefficient model 1,1 (ICC(1,1)), and internal consistency using Cronbach α. Agreement parameters included the standard error of measurement, minimal detectable change, and limits of agreement. BrAT. Test-retest reliability was excellent (ICC(1,1) = 0.90-0.97). Internal consistency was high (Cronbach α = 0.90-0.98). Measurement error was relatively low (standard error of measurement range, 3.1-8.8). A change of >4 for subscale 1, >6 for subscale 2, >4 for subscale 3, and >10 for the summed score is indicative of change over and above measurement error. Limits of agreement ranged from ±4.4 (subscale 3) to ±11.61 (summed score). These findings support the use of the BrAT as a reproducible patient-reported outcome measure for adults with traumatic BPI, with evidence of appropriate reliability and agreement for both individual and group comparisons. Further psychometric testing is required to establish the construct validity and responsiveness of the BrAT. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
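
    The agreement parameters reported above follow from the test-retest ICC and the between-subject spread via the standard formulas SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM. A minimal sketch with illustrative inputs (the SD value is assumed, not taken from the paper):

        import math

        def sem_and_mdc95(sd_scores, icc):
            """Standard error of measurement and 95% minimal detectable change."""
            sem = sd_scores * math.sqrt(1.0 - icc)
            mdc95 = 1.96 * math.sqrt(2.0) * sem
            return sem, mdc95

        sem, mdc95 = sem_and_mdc95(sd_scores=14.0, icc=0.95)   # illustrative values
        print(f"SEM = {sem:.1f} points, MDC95 = {mdc95:.1f} points")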

  12. Reproducibility of 3D kinematics and surface electromyography measurements of mastication.

    Science.gov (United States)

    Remijn, Lianne; Groen, Brenda E; Speyer, Renée; van Limbeek, Jacques; Nijhuis-van der Sanden, Maria W G

    2016-03-01

    The aim of this study was to determine the measurement reproducibility of a procedure for evaluating the mastication process and to estimate the smallest detectable differences of 3D kinematic and surface electromyography (sEMG) variables. Kinematics of mandible movements and sEMG activity of the masticatory muscles were obtained over two sessions with four conditions: two food textures (biscuit and bread) of two sizes (small and large). Twelve healthy adults (mean age 29.1 years) completed the study. The second to the fifth chewing cycles of 5 bites were used for the analyses. The reproducibility per outcome variable was calculated with an intraclass correlation coefficient (ICC), and a Bland-Altman analysis was applied to determine the standard error of measurement, relative error of measurement, and smallest detectable differences of all variables. ICCs ranged from 0.71 to 0.98 for all outcome variables. The outcome variables consisted of four bite and fourteen chewing-cycle variables. The relative standard error of measurement of the bite variables was up to 17.3% for 'time-to-swallow', 'time-to-transport' and 'number of chewing cycles', but ranged from 31.5% to 57.0% for 'change of chewing side'. The relative standard error of measurement ranged from 4.1% to 24.7% for chewing-cycle variables and was smaller for kinematic variables than for sEMG variables. In general, 3D kinematics and sEMG are reproducible techniques for assessing the mastication process. The duration of the chewing cycle and the frequency of chewing were the most reproducible measurements. Change of chewing side could not be reproduced. The published measurement error and smallest detectable differences will aid the interpretation of the results of future clinical studies using the same study variables. Copyright © 2015 Elsevier Inc. All rights reserved.
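
    A Bland-Altman analysis of the kind used here reduces to the mean between-session difference (bias) and its 95% limits of agreement. A short sketch on made-up paired chewing-cycle durations:

        import numpy as np

        def bland_altman(session1, session2):
            """Bias and 95% limits of agreement for paired measurements."""
            d = np.asarray(session1, dtype=float) - np.asarray(session2, dtype=float)
            bias = d.mean()
            half_width = 1.96 * d.std(ddof=1)
            return bias, bias - half_width, bias + half_width

        s1 = [0.78, 0.81, 0.75, 0.83, 0.79, 0.86, 0.74, 0.80]   # session 1 durations (s), illustrative
        s2 = [0.80, 0.79, 0.77, 0.84, 0.76, 0.88, 0.73, 0.82]   # session 2 durations (s), illustrative
        bias, lower, upper = bland_altman(s1, s2)
        print(f"bias = {bias:+.3f} s, 95% LoA = [{lower:.3f}, {upper:.3f}] s")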

  13. Broad-band hard X-ray reflectors

    DEFF Research Database (Denmark)

    Joensen, K.D.; Gorenstein, P.; Hoghoj, P.

    1997-01-01

    Interest in optics for hard X-ray broad-band applications is growing. In this paper, we compare the hard X-ray (20-100 keV) reflectivity, obtained with an energy-dispersive reflectometer, of a standard commercial gold thin film with that of a 600-bilayer W/Si X-ray supermirror. The reflectivity ... of the multilayer is found to agree extraordinarily well with theory (assuming an interface roughness of 4.5 Angstrom), while the agreement for the gold film is poorer. The overall performance of the supermirror is superior to that of gold, extending the band of reflection at least a factor of 2.8 beyond ... that of the gold. Various other design options are discussed, and we conclude that continued interest in the X-ray supermirror for broad-band hard X-ray applications is warranted...

  14. Analyzing extreme sea levels for broad-scale impact and adaptation studies

    Science.gov (United States)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Dangendorf, S.; Hinkel, J.; Slangen, A.

    2017-12-01

    Coastal impact and adaptation assessments require detailed knowledge on extreme sea levels (ESL), because increasing damage due to extreme events is one of the major consequences of sea-level rise (SLR) and climate change. Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future SLR; different scenarios were developed with process-based or semi-empirical models and used for coastal impact studies at various temporal and spatial scales to guide coastal management and adaptation efforts. Uncertainties in future SLR are typically accounted for by analyzing the impacts associated with a range of scenarios and model ensembles. ESL distributions are then displaced vertically according to the SLR scenarios under the inherent assumption that we have perfect knowledge on the statistics of extremes. However, there is still a limited understanding of present-day ESL which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) numerical models that are used to generate long time series of storm surge water levels, and (2) statistical models used for determining present-day ESL exceedance probabilities. There is no universally accepted approach to obtain such values for broad-scale flood risk assessments and while substantial research has explored SLR uncertainties, we quantify, for the first time globally, key uncertainties in ESL estimates. We find that contemporary ESL uncertainties exceed those from SLR projections and, assuming that we meet the Paris agreement, the projected SLR itself by the end of the century. Our results highlight the necessity to further improve our understanding of uncertainties in ESL estimates through (1) continued improvement of numerical and statistical models to simulate and analyze coastal water levels and (2) exploit the rich observational database and continue data archeology to obtain longer time series and remove model bias
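
    The abstract does not commit to a particular statistical model, but one common way to turn tide-gauge records into ESL exceedance probabilities is a generalized extreme value (GEV) fit to annual maxima, from which return levels follow. The sketch below uses synthetic maxima purely for illustration:

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(7)
        annual_maxima = 1.6 + 0.25 * rng.gumbel(size=60)   # hypothetical annual-maximum water levels (m)

        shape, loc, scale = genextreme.fit(annual_maxima)
        for period in (10, 50, 100):
            level = genextreme.isf(1.0 / period, shape, loc=loc, scale=scale)
            print(f"{period:>3d}-year return level: {level:.2f} m")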

  15. Lattice Boltzmann simulations of the permeability and capillary adsorption of cement model microstructures

    Energy Technology Data Exchange (ETDEWEB)

    Zalzale, M. [Laboratory of Construction Materials, Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); McDonald, P.J., E-mail: p.mcdonald@surrey.ac.uk [Department of Physics, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom)

    2012-12-15

    The lattice Boltzmann method is used to investigate the permeability of microstructures of cement pastes generated using the numerical models CEMHYD3D (Bentz, 1997) and µIC (Bishnoi and Scrivener, 2009). Results are reported as a function of paste water-to-cement ratio and degree of hydration. The permeability decreases with increasing hydration and decreasing water-to-cement ratio in agreement with experiment. However the permeability is larger than the experimental data recorded using beam bending methods (Vichit-Vadakan and Scherer, 2002). Notwithstanding, the lattice Boltzmann results compare favourably with alternate numerical methods of permeability calculation for cement model microstructures. In addition, we show early results for the liquid/vapour capillary adsorption and desorption isotherms in the same model µIC structures. The broad features of the experimental capillary porosity isotherm are reproduced, although further work is required to adequately parameterise the model.
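
    Once a lattice Boltzmann (or any pore-scale) run has produced a steady flow through the segmented microstructure, the intrinsic permeability follows from Darcy's law, k = μ⟨u⟩L/ΔP. The sketch below shows only that final step, with placeholder values rather than CEMHYD3D or µIC output:

        def darcy_permeability(mean_velocity, viscosity, length, pressure_drop):
            """Intrinsic permeability k = mu * <u> * L / dP, in m^2 for SI inputs."""
            return viscosity * mean_velocity * length / pressure_drop

        # Placeholder numbers standing in for a converged pore-scale simulation
        k = darcy_permeability(
            mean_velocity=2.0e-6,   # superficial velocity through the sample, m/s
            viscosity=1.0e-3,       # water at room temperature, Pa*s
            length=100e-6,          # sample length along the flow direction, m
            pressure_drop=1.0e3,    # applied pressure difference, Pa
        )
        print(f"k = {k:.2e} m^2")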

  16. Echo Particle Image Velocimetry for Estimation of Carotid Artery Wall Shear Stress: Repeatability, Reproducibility and Comparison with Phase-Contrast Magnetic Resonance Imaging.

    Science.gov (United States)

    Gurung, Arati; Gates, Phillip E; Mazzaro, Luciano; Fulford, Jonathan; Zhang, Fuxing; Barker, Alex J; Hertzberg, Jean; Aizawa, Kunihiko; Strain, William D; Elyas, Salim; Shore, Angela C; Shandas, Robin

    2017-08-01

    Measurement of hemodynamic wall shear stress (WSS) is important in investigating the role of WSS in the initiation and progression of atherosclerosis. Echo particle image velocimetry (echo PIV) is a novel ultrasound-based technique for measuring WSS in vivo that has previously been validated in vitro using the standard optical PIV technique. We evaluated the repeatability and reproducibility of echo PIV for measuring WSS in the human common carotid artery. We measured WSS in 28 healthy participants (18 males and 10 females, mean age: 56 ± 12 y). Echo PIV was highly repeatable, with an intra-observer variability of 1.0 ± 0.1 dyn/cm² for peak systolic (maximum), 0.9 dyn/cm² for mean and 0.5 dyn/cm² for end-diastolic (minimum) WSS measurements. Likewise, echo PIV was reproducible, with a low inter-observer variability (max: 2.0 ± 0.2 dyn/cm², mean: 1.3 ± 0.1 dyn/cm², end-diastolic: 0.7 dyn/cm²) and a more variable inter-scan (test-retest) variability (max: 7.1 ± 2.3 dyn/cm², mean: 2.9 ± 0.4 dyn/cm², min: 1.5 ± 0.1 dyn/cm²). We compared echo PIV with the reference method, phase-contrast magnetic resonance imaging (PC-MRI); echo PIV-based WSS measurements agreed qualitatively with PC-MRI measurements (r = 0.89). Measured values (echo PIV vs. PC-MRI) were: WSS at peak systole, 21 ± 7.0 dyn/cm² vs. 15 ± 5.0 dyn/cm²; time-averaged WSS, 8.9 ± 3.0 dyn/cm² vs. 7.1 ± 3.0 dyn/cm². For the first time, we report that echo PIV can measure WSS with good repeatability and reproducibility in adult humans with a broad age range. Echo PIV is feasible in humans and offers an easy-to-use, ultrasound-based, quantitative technique for measuring WSS in vivo with good repeatability and reproducibility. Copyright © 2017. Published by Elsevier Inc.

  17. The Harm Done to Reproducibility by the Culture of Null Hypothesis Significance Testing.

    Science.gov (United States)

    Lash, Timothy L

    2017-09-15

    In the last few years, stakeholders in the scientific community have raised alarms about a perceived lack of reproducibility of scientific results. In reaction, guidelines for journals have been promulgated and grant applicants have been asked to address the rigor and reproducibility of their proposed projects. Neither solution addresses a primary culprit, which is the culture of null hypothesis significance testing that dominates statistical analysis and inference. In an innovative research enterprise, selection of results for further evaluation based on null hypothesis significance testing is doomed to yield a low proportion of reproducible results and a high proportion of effects that are initially overestimated. In addition, the culture of null hypothesis significance testing discourages quantitative adjustments to account for systematic errors and quantitative incorporation of prior information. These strategies would otherwise improve reproducibility and have not been previously proposed in the widely cited literature on this topic. Without discarding the culture of null hypothesis significance testing and implementing these alternative methods for statistical analysis and inference, all other strategies for improving reproducibility will yield marginal gains at best. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Test Automation Process Improvement: A Case Study of BroadSoft

    OpenAIRE

    Gummadi, Jalendar

    2016-01-01

    This master's thesis research concerns improvement of the test automation process at BroadSoft Finland, as a case study. A test automation project was recently started at BroadSoft, but it is not yet properly integrated into the existing process. The project is about converting manual test cases into automated test cases. The aim of this thesis is to study the existing BroadSoft test process and different test automation frameworks. In this thesis, different test automation processes are studied ...

  19. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    Science.gov (United States)

    Attada, Raju; Kumar, Prashant; Dasari, Hari Prasad

    2018-04-01

    Assessment of land surface models (LSMs) for monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), Noah, and Noah with multi-parameterization (Noah-MP) LSMs are chosen based on their degree of complexity, ranging from a simple slab model to multi-parameterization options, coupled with the Weather Research and Forecasting (WRF) model. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to the different LSMs is discussed. Our results reveal that the monsoon features are reproduced by the WRF model with all LSMs, but with some regional discrepancies. The model simulations with the selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and the dry zone over the southern tip of India. An unrealistic precipitation pattern over the equatorial western Indian Ocean is, however, simulated in the WRF-LSM-based experiments. The spatial and temporal distributions of top 2-m soil characteristics (soil temperature and soil moisture) are well represented in the RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with the RUC, Noah, and Noah-MP LSMs significantly improved the skill for 2-m temperature and moisture compared to the TDS (chosen as a baseline) LSM-based experiments. Furthermore, the simulations with the Noah, RUC, and Noah-MP LSMs exhibit minimum error in thermodynamic fields. In the case of surface wind speed, the TDS LSM performed better than the other LSM experiments. A significant improvement is also noticeable in simulating rainfall by the WRF model with the Noah, RUC, and Noah-MP LSMs over the TDS LSM. Thus, this study emphasizes the importance of choosing/improving LSMs for simulating the ISM phenomena in

  1. Assessment of Land Surface Models in a High-Resolution Atmospheric Model during Indian Summer Monsoon

    KAUST Repository

    Attada, Raju; Kumar, Prashant; Dasari, Hari Prasad

    2018-01-01

    Assessment of land surface models (LSMs) for monsoon studies over the Indian summer monsoon (ISM) region is essential. In this study, we evaluate the skill of LSMs at 10 km spatial resolution in simulating the 2010 monsoon season. The thermal diffusion scheme (TDS), rapid update cycle (RUC), Noah, and Noah with multi-parameterization (Noah-MP) LSMs are chosen based on their degree of complexity, ranging from a simple slab model to multi-parameterization options, coupled with the Weather Research and Forecasting (WRF) model. Model results are compared with the available in situ observations and reanalysis fields. The sensitivity of monsoon elements, surface characteristics, and vertical structures to the different LSMs is discussed. Our results reveal that the monsoon features are reproduced by the WRF model with all LSMs, but with some regional discrepancies. The model simulations with the selected LSMs are able to reproduce the broad rainfall patterns, orography-induced rainfall over the Himalayan region, and the dry zone over the southern tip of India. An unrealistic precipitation pattern over the equatorial western Indian Ocean is, however, simulated in the WRF-LSM-based experiments. The spatial and temporal distributions of top 2-m soil characteristics (soil temperature and soil moisture) are well represented in the RUC and Noah-MP LSM-based experiments during the ISM. Results show that the WRF simulations with the RUC, Noah, and Noah-MP LSMs significantly improved the skill for 2-m temperature and moisture compared to the TDS (chosen as a baseline) LSM-based experiments. Furthermore, the simulations with the Noah, RUC, and Noah-MP LSMs exhibit minimum error in thermodynamic fields. In the case of surface wind speed, the TDS LSM performed better than the other LSM experiments. A significant improvement is also noticeable in simulating rainfall by the WRF model with the Noah, RUC, and Noah-MP LSMs over the TDS LSM. Thus, this study emphasizes the importance of choosing/improving LSMs for simulating the ISM phenomena

  2. Modeling the Downstream Processing of Monoclonal Antibodies Reveals Cost Advantages for Continuous Methods for a Broad Range of Manufacturing Scales.

    Science.gov (United States)

    Hummel, Jonathan; Pagkaliwangan, Mark; Gjoka, Xhorxhi; Davidovits, Terence; Stock, Rick; Ransohoff, Thomas; Gantier, Rene; Schofield, Mark

    2018-01-17

    The biopharmaceutical industry is evolving in response to changing market conditions, including increasing competition and growing pressures to reduce costs. Single-use (SU) technologies and continuous bioprocessing have attracted attention as potential facilitators of cost-optimized manufacturing for monoclonal antibodies. While disposable bioprocessing has been adopted at many scales of manufacturing, continuous bioprocessing has yet to reach the same level of implementation. In this study, the cost of goods of Pall Life Science's integrated, continuous bioprocessing (ICB) platform is modeled, along with that of purification processes in stainless-steel and SU batch formats. All three models include costs associated with downstream processing only. Evaluation of the models across a broad range of clinical and commercial scenarios reveals that the cost savings gained by switching from stainless-steel to SU batch processing are often amplified by continuous operation. The continuous platform exhibits the lowest cost of goods across 78% of all scenarios modeled here, with the SU batch process having the lowest costs in the rest of the cases. The relative savings demonstrated by the continuous process are greatest at the highest feed titers and volumes. These findings indicate that existing and imminent continuous technologies and equipment can become key enablers for more cost-effective manufacturing of biopharmaceuticals. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Fabrication of reproducible, integration-compatible hybrid molecular/si electronics.

    Science.gov (United States)

    Yu, Xi; Lovrinčić, Robert; Kraynis, Olga; Man, Gabriel; Ely, Tal; Zohar, Arava; Toledano, Tal; Cahen, David; Vilan, Ayelet

    2014-12-29

    Reproducible molecular junctions can be integrated within standard CMOS technology. Metal-molecule-semiconductor junctions are fabricated by direct Si-C binding of hexadecane or methyl-styrene onto oxide-free H-Si(111) surfaces, with the lateral size of the junctions defined by an etched SiO2 well and with evaporated Pb as the top contact. The current density, J, is highly reproducible, with a standard deviation in log(J) of 0.2 over a junction diameter change from 3 to 100 μm. Reproducibility over such a large range indicates that transport is truly across the molecules and does not result from artifacts like edge effects or defects in the molecular monolayer. Device fabrication is tested for two n-Si doping levels. With highly doped Si, transport is dominated by tunneling and reveals sharp conductance onsets at room temperature. Using the temperature dependence of current across medium-doped n-Si, the molecular tunneling barrier can be separated from the Si-Schottky one, which is 0.47 eV, in agreement with the molecule-modified surface dipole and quite different from the bare Si-H junction. This indicates that Pb evaporation does not cause significant chemical changes to the molecules. The ability to manufacture reliable devices constitutes important progress toward possible future hybrid Si-based molecular electronics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Convergence of macrostates under reproducible processes

    International Nuclear Information System (INIS)

    Rau, Jochen

    2010-01-01

    I show that whenever a system undergoes a reproducible macroscopic process the mutual distinguishability of macrostates, as measured by their relative entropy, diminishes. This extends the second law which regards only ordinary entropies, and hence only the distinguishability between macrostates and one specific reference state (equidistribution). The new result holds regardless of whether the process is linear or nonlinear. Its proof hinges on the monotonicity of quantum relative entropy under arbitrary coarse grainings, even those that cannot be represented by trace-preserving completely positive maps.
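
    For orientation, the familiar special case being generalized here is the monotonicity of quantum relative entropy under coarse grainings that are trace-preserving completely positive maps; the abstract's claim is that the decrease persists even beyond that class. A LaTeX sketch of the standard statement:

        % Quantum relative entropy of macrostates \rho and \sigma
        S(\rho \,\|\, \sigma) = \operatorname{Tr}\!\left[\rho\,(\ln\rho - \ln\sigma)\right]
        % Monotonicity under a trace-preserving completely positive coarse graining \Phi
        S\!\left(\Phi(\rho)\,\|\,\Phi(\sigma)\right) \;\le\; S(\rho\,\|\,\sigma)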

  5. Antiviral Therapy by HIV-1 Broadly Neutralizing and Inhibitory Antibodies

    Directory of Open Access Journals (Sweden)

    Zhiqing Zhang

    2016-11-01

    Full Text Available Human immunodeficiency virus type 1 (HIV-1) infection causes acquired immune deficiency syndrome (AIDS), a global epidemic for more than three decades. HIV-1 replication is primarily controlled through antiretroviral therapy (ART), but this treatment does not cure HIV-1 infection. Furthermore, there is increasing viral resistance to ART, and there are side effects associated with long-term therapy. Consequently, there is a need for alternative candidates for HIV-1 prevention and therapy. Recent advances have discovered multiple broadly neutralizing antibodies against HIV-1. In this review, we describe the key epitopes on the HIV-1 Env protein and the corresponding broadly neutralizing antibodies, and discuss the ongoing clinical trials of broadly neutralizing and inhibitory antibody therapy as well as antibody combinations, bispecific antibodies, and methods that improve therapeutic efficacy by combining broadly neutralizing antibodies (bNAbs) with latency reversing agents. Compared with ART, HIV-1 therapeutics that incorporate these broadly neutralizing and inhibitory antibodies offer the advantage of decreasing virus load and clearing infected cells, which is a promising prospect in HIV-1 prevention and treatment.

  6. Asialoerythropoietin is a nonerythropoietic cytokine with broad neuroprotective activity in vivo

    DEFF Research Database (Denmark)

    Erbayraktar, Serhat; Grasso, Giovanni; Sfacteria, Alessandra

    2003-01-01

    Erythropoietin (EPO) is a tissue-protective cytokine preventing vascular spasm, apoptosis, and inflammatory responses. Although best known for its role in hematopoietic lineages, EPO also affects other tissues, including those of the nervous system. Enthusiasm for recombinant human erythropoietin ... importantly, asialoEPO exhibits a broad spectrum of neuroprotective activities, as demonstrated in models of cerebral ischemia, spinal cord compression, and sciatic nerve crush. These data suggest that nonerythropoietic variants of rhEPO can cross the blood-brain barrier and provide neuroprotection...

  7. Using the Properties of Broad Absorption Line Quasars to Illuminate Quasar Structure

    Science.gov (United States)

    Yong, Suk Yee; King, Anthea L.; Webster, Rachel L.; Bate, Nicholas F.; O'Dowd, Matthew J.; Labrie, Kathleen

    2018-06-01

    A key to understanding quasar unification paradigms is the emission properties of broad absorption line quasars (BALQs). The fact that only a small fraction of quasar spectra exhibit deep absorption troughs blueward of the broad permitted emission lines provides a crucial clue to the structure of quasar emitting regions. To learn whether it is possible to discriminate between the BALQ and non-BALQ populations given the observed spectral properties of a quasar, we employ two approaches: one based on statistical methods and the other supervised machine learning classification, applied to quasar samples from the Sloan Digital Sky Survey. The features explored include continuum and emission line properties, in particular the absolute magnitude, redshift, spectral index, line width, asymmetry, strength, and relative velocity offsets of high-ionisation C IV λ1549 and low-ionisation Mg II λ2798 lines. We consider a complete population of quasars, and assume that the statistical distributions of properties represent all angles where the quasar is viewed without obscuration. The distributions of the BALQ and non-BALQ sample properties show few significant differences. None of the observed continuum and emission line features are capable of differentiating between the two samples. Most published narrow disk-wind models are inconsistent with these observations, and an alternative disk-wind model is proposed. The key feature of the proposed model is a disk-wind filling a wide opening angle with multiple radial streams of dense clumps.
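
    The supervised-classification arm of the comparison can be pictured with a generic scikit-learn pipeline. The feature list mirrors the properties named above, but the classifier choice, the mock data, and the labels are illustrative assumptions rather than the paper's actual setup; with uninformative features the expected ROC AUC is about 0.5, echoing the finding that the two populations are hard to separate:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)
        n = 2000
        # Columns: absolute magnitude, redshift, spectral index, C IV width, C IV asymmetry, C IV-Mg II velocity offset
        X = rng.normal(size=(n, 6))
        y = rng.integers(0, 2, size=n)   # 1 = BALQ, 0 = non-BALQ (random labels, for illustration only)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
        auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
        print(f"ROC AUC = {auc:.2f}")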

  8. Aveiro method in reproducing kernel Hilbert spaces under complete dictionary

    Science.gov (United States)

    Mai, Weixiong; Qian, Tao

    2017-12-01

    Aveiro Method is a sparse representation method in reproducing kernel Hilbert spaces (RKHS) that gives orthogonal projections in linear combinations of reproducing kernels over uniqueness sets. It suffers, however, from the need to determine uniqueness sets in the underlying RKHS. In fact, in general spaces, uniqueness sets are not easy to identify, let alone the convergence-speed aspect of the Aveiro Method. To avoid those difficulties we propose a new Aveiro Method based on a dictionary and the matching pursuit idea. In fact, we do more: the new Aveiro Method is related to the recently proposed Pre-Orthogonal Greedy Algorithm (P-OGA) and involves completion of a given dictionary. The new method is called Aveiro Method Under Complete Dictionary (AMUCD). The complete dictionary consists of all directional derivatives of the underlying reproducing kernels. We show that, under the boundary vanishing condition, which holds for the classical Hardy and Paley-Wiener spaces, the complete dictionary enables an efficient expansion of any given element in the Hilbert space. The proposed method reveals new and advanced aspects of both the Aveiro Method and the greedy algorithm.

  9. Reproducibility and validity of the Shanghai Women's Health Study physical activity questionnaire.

    Science.gov (United States)

    Matthews, Charles E; Shu, Xiao-Ou; Yang, Gong; Jin, Fan; Ainsworth, Barbara E; Liu, Dake; Gao, Yu-Tang; Zheng, Wei

    2003-12-01

    In this investigation, the authors evaluated the reproducibility and validity of the Shanghai Women's Health Study (SWHS) physical activity questionnaire (PAQ), which was administered in a cohort study of approximately 75,000 Chinese women aged 40-70 years. Reproducibility (2-year test-retest) was evaluated using kappa statistics and intraclass correlation coefficients (ICCs). Validity was evaluated by comparing Spearman correlations (r) for the SWHS PAQ with two criterion measures administered over a period of 12 months: four 7-day physical activity logs and up to 28 7-day PAQs. Women were recruited from the SWHS cohort (n = 200). Results indicated that the reproducibility of adolescent and adult exercise participation (kappa = 0.85 and kappa = 0.64, respectively) and years of adolescent exercise and adult exercise energy expenditure (ICC = 0.83 and ICC = 0.70, respectively) was reasonable. Reproducibility values for adult lifestyle activities were lower (ICC = 0.14-0.54). Significant correlations between the PAQ and criterion measures of adult exercise were observed for the first PAQ administration (physical activity log, r = 0.50; 7-day PAQ, r = 0.62) and the second PAQ administration (physical activity log, r = 0.74; 7-day PAQ, r = 0.80). Significant correlations between PAQ lifestyle activities and the 7-day PAQ were also noted (r = 0.33-0.88). These data indicate that the SWHS PAQ is a reproducible and valid measure of exercise behaviors and that it demonstrates utility in stratifying women by levels of important lifestyle activities (e.g., housework, walking, cycling).

  10. Broad- versus Narrow-Spectrum Oral Antibiotic Transition and Outcomes in Health Care-associated Pneumonia.

    Science.gov (United States)

    Buckel, Whitney R; Stenehjem, Edward; Sorensen, Jeff; Dean, Nathan; Webb, Brandon

    2017-02-01

    Guidelines recommend a switch from intravenous to oral antibiotics once patients who are hospitalized with pneumonia achieve clinical stability. However, little evidence guides the selection of an oral antibiotic for patients with health care-associated pneumonia, especially where no microbiological diagnosis is made. To compare outcomes between patients who were transitioned to broad- versus narrow-spectrum oral antibiotics after initially receiving broad-spectrum intravenous antibiotic coverage. We performed a secondary analysis of an existing database of adults with community-onset pneumonia admitted to seven Utah hospitals. We identified 220 inpatients with microbiology-negative health care-associated pneumonia from 2010 to 2012. After excluding inpatient deaths and treatment failures, 173 patients remained in which broad-spectrum intravenous antibiotics were transitioned to an oral regimen. We classified oral regimens as broad-spectrum (fluoroquinolone) versus narrow-spectrum (usually a β-lactam). We compared demographic and clinical characteristics between groups. Using a multivariable regression model, we adjusted outcomes by severity (electronically calculated CURB-65), comorbidity (Charlson Index), time to clinical stability, and length of intravenous therapy. Age, severity, comorbidity, length of intravenous therapy, and clinical response were similar between the two groups. Observed 30-day readmission (11.9 vs. 21.4%; P = 0.26) and 30-day all-cause mortality (2.3 vs. 5.3%; P = 0.68) were also similar between the narrow and broad oral antibiotic groups. In multivariable analysis, we found no statistically significant differences for adjusted odds of 30-day readmission (adjusted odds ratio, 0.56; 95% confidence interval, 0.06-5.2; P = 0.61) or 30-day all-cause mortality (adjusted odds ratio, 0.55; 95% confidence interval, 0.19-1.6; P = 0.26) between narrow and broad oral antibiotic groups. On the basis of analysis of a limited number of patients

  11. Formation of broad Balmer wings in symbiotic stars

    International Nuclear Information System (INIS)

    Chang, Seok-Jun; Heo, Jeong-Eun; Hong, Chae-Lin; Lee, Hee-Won

    2016-01-01

    Symbiotic stars are binary systems composed of a hot white dwarf and a mass-losing giant. In addition to many prominent emission lines, symbiotic stars exhibit Raman-scattered O VI features at 6825 and 7088 Å. Another notable feature present in the spectra of many symbiotics is the broad wings around Balmer lines. Astrophysical mechanisms that can produce broad wings include Thomson scattering by free electrons and Raman scattering of Lyβ and higher series by neutral hydrogen. In this poster presentation we produce broad wings around Hα and Hβ adopting a Monte Carlo technique in order to make a quantitative comparison of these two mechanisms. Thomson wings are characterized by an exponential cutoff set by the thermal width, whereas the Raman wings depend on the column density and the continuum shape in the far-UV region. A brief discussion is provided. (paper)

  12. Credibility, Replicability, and Reproducibility in Simulation for Biomedicine and Clinical Applications in Neuroscience

    Science.gov (United States)

    Mulugeta, Lealem; Drach, Andrew; Erdemir, Ahmet; Hunt, C. A.; Horner, Marc; Ku, Joy P.; Myers Jr., Jerry G.; Vadigepalli, Rajanikanth; Lytton, William W.

    2018-01-01

    Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations. PMID:29713272

  13. Credibility, Replicability, and Reproducibility in Simulation for Biomedicine and Clinical Applications in Neuroscience

    Directory of Open Access Journals (Sweden)

    Lealem Mulugeta

    2018-04-01

    Full Text Available Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations.

  14. Parameter optimization for reproducible cardiac 1H-MR spectroscopy at 3 Tesla.

    Science.gov (United States)

    de Heer, Paul; Bizino, Maurice B; Lamb, Hildo J; Webb, Andrew G

    2016-11-01

    To optimize data acquisition parameters in cardiac proton MR spectroscopy, and to evaluate the intra- and intersession variability in myocardial triglyceride content. Data acquisition parameters at 3 Tesla (T) were optimized and reproducibility measured using, in total, 49 healthy subjects. The signal-to-noise-ratio (SNR) and the variance in metabolite amplitude between averages were measured for: (i) global versus local power optimization; (ii) static magnetic field (B0) shimming performed during free-breathing or within breathholds; (iii) post R-wave peak measurement times between 50 and 900 ms; (iv) without respiratory compensation, with breathholds and with navigator triggering; and (v) frequency selective excitation, Chemical Shift Selective (CHESS) and Multiply Optimized Insensitive Suppression Train (MOIST) water suppression techniques. Using the optimized parameters intra- and intersession myocardial triglyceride content reproducibility was measured. Two cardiac proton spectra were acquired with the same parameters and compared (intrasession reproducibility) after which the subject was removed from the scanner and placed back in the scanner and a third spectrum was acquired which was compared with the first measurement (intersession reproducibility). Local power optimization increased SNR on average by 22% compared with global power optimization (P = 0.0002). The average linewidth was not significantly different for pencil beam B0 shimming using free-breathing or breathholds (19.1 Hz versus 17.5 Hz; P = 0.15). The highest signal stability occurred at a cardiac trigger delay around 240 ms. The mean amplitude variation was significantly lower for breathholds versus free-breathing (P = 0.03) and for navigator triggering versus free-breathing (P = 0.03) as well as for navigator triggering versus breathhold (P = 0.02). The mean residual water signal using CHESS (1.1%, P = 0.01) or MOIST (0.7%, P = 0.01) water suppression was significantly lower than using

  15. 33 CFR 117.921 - Broad River.

    Science.gov (United States)

    2010-07-01

    ... OPERATION REGULATIONS Specific Requirements South Carolina § 117.921 Broad River. (a) The draw of the S170 bridge, mile 14.0 near Beaufort, shall open on signal if at least 24 hours notice is given. (b) The draw...

  16. THE SIZE, STRUCTURE, AND IONIZATION OF THE BROAD-LINE REGION IN NGC 3227

    International Nuclear Information System (INIS)

    Devereux, Nick

    2013-01-01

    Hubble Space Telescope spectroscopy of the Seyfert 1.5 galaxy, NGC 3227, confirms previous reports that the broad Hα emission line flux is time variable, decreasing by a modest ∼11% between 1999 and 2000 in response to a corresponding ∼37% decrease in the underlying continuum. Modeling the gas distribution responsible for the broad Hα, Hβ, and Hγ emission lines favors a spherically symmetric inflow as opposed to a thin disk. Adopting a central black hole mass of 7.6 × 10⁶ M☉, determined from prior reverberation mapping, leads to the following dimensions for the size of the region emitting the broad Hα line: an outer radius ∼90 lt-days and an inner radius ∼3 lt-days. Thus, the previously determined reverberation size for the broad-line region (BLR) consistently coincides with the inner radius of a much larger volume of ionized gas. However, the perceived size of the BLR is an illusion, a consequence of the fact that the emitting region is ionization bounded at the outer radius and diminished by Doppler broadening at the inner radius. The actual dimensions of the inflow remain to be determined. Nevertheless, the steady-state mass inflow rate is estimated to be ∼10⁻² M☉ yr⁻¹, which is sufficient to explain the X-ray luminosity of the active galactic nucleus (AGN) in terms of radiatively inefficient accretion. Collectively, the results challenge many preconceived notions concerning the nature of BLRs in AGNs.

  17. An assessment of geographical distribution of different plant functional types over North America simulated using the CLASS–CTEM modelling framework

    Directory of Open Access Journals (Sweden)

    R. K. Shrestha

    2017-10-01

    Full Text Available The performance of the competition module of the CLASS–CTEM (Canadian Land Surface Scheme and Canadian Terrestrial Ecosystem Model) modelling framework is assessed at 1° spatial resolution over North America by comparing the simulated geographical distribution of its plant functional types (PFTs) with two observation-based estimates. The model successfully reproduces the broad geographical distribution of trees, grasses and bare ground although limitations remain. In particular, compared to the two observation-based estimates, the simulated fractional vegetation coverage is lower in the arid southwest North American region and higher in the Arctic region. The lower-than-observed simulated vegetation coverage in the southwest region is attributed to lack of representation of shrubs in the model and plausible errors in the observation-based data sets. The observation-based data indicate vegetation fractional coverage of more than 60 % in this arid region, despite only 200–300 mm of precipitation that the region receives annually, and observation-based leaf area index (LAI) values in the region are lower than one. The higher-than-observed vegetation fractional coverage in the Arctic is likely due to the lack of representation of moss and lichen PFTs and also likely because of inadequate representation of permafrost in the model as a result of which the C3 grass PFT performs overly well in the region. The model generally reproduces the broad spatial distribution and the total area covered by the two primary tree PFTs (needleleaf evergreen trees, NDL-EVG, and broadleaf cold deciduous trees, BDL-DCD-CLD) reasonably well. The simulated fractional coverage of tree PFTs increases after the 1960s in response to the CO2 fertilization effect and climate warming. Differences between observed and simulated PFT coverages highlight model limitations and suggest that the inclusion of shrubs, and moss and lichen PFTs, and an adequate representation of

  18. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M; Woo, B; Kim, J [Seoul National University, Seoul (Korea, Republic of); Jamshidi, N; Kuo, M [UCLA School of Medicine, Los Angeles, CA (United States)

    2015-06-15

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used to segment contrast enhancement, necrosis and edema regions by two independent observers. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate the reproducibility. Results: Inter-observer correlations and coefficient of variation of imaging features with the deformable model ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, and the grow cut method ranged from 0.799 to 0.976 and 3.5% to 26.6%, respectively. Coefficient of variation for especially important features which were previously reported as predictive of patient survival were: 3.4% with deformable model and 7.4% with grow cut method for the proportion of contrast enhanced tumor region; 5.5% with deformable model and 25.7% with grow cut method for the proportion of necrosis; and 2.1% with deformable model and 4.4% with grow cut method for edge sharpness of tumor on CE-T1W1. Conclusion: Comparison of two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric Brain MRI.
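
    As a side note on the two reproducibility statistics quoted above, the inter-observer correlation and the coefficient of variation (COV) can be computed directly from paired feature measurements. The sketch below only illustrates those formulas and is not the authors' analysis code; the observer values and variable names are hypothetical.

      import numpy as np
      from scipy import stats

      def reproducibility_stats(obs1, obs2):
          """Pearson correlation and coefficient of variation (%) for one imaging
          feature measured on the same cases by two independent observers."""
          obs1, obs2 = np.asarray(obs1, float), np.asarray(obs2, float)
          r, _ = stats.pearsonr(obs1, obs2)
          # Within-case SD estimated from paired differences, relative to the overall mean.
          within_sd = np.sqrt(np.mean((obs1 - obs2) ** 2) / 2.0)
          cov = 100.0 * within_sd / np.mean(np.concatenate([obs1, obs2]))
          return r, cov

      # Hypothetical proportions of contrast-enhanced tumor for 10 GBM cases,
      # segmented by two observers with the same semi-automatic tool.
      o1 = [0.42, 0.37, 0.55, 0.61, 0.29, 0.48, 0.33, 0.52, 0.40, 0.45]
      o2 = [0.44, 0.36, 0.53, 0.63, 0.30, 0.47, 0.35, 0.50, 0.41, 0.47]
      r, cov = reproducibility_stats(o1, o2)
      print(f"inter-observer r = {r:.3f}, COV = {cov:.1f}%")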

  19. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    International Nuclear Information System (INIS)

    Lee, M; Woo, B; Kim, J; Jamshidi, N; Kuo, M

    2015-01-01

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used to segment contrast enhancement, necrosis and edema regions by two independent observers. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate the reproducibility. Results: Inter-observer correlations and coefficient of variation of imaging features with the deformable model ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, and the grow cut method ranged from 0.799 to 0.976 and 3.5% to 26.6%, respectively. Coefficient of variation for especially important features which were previously reported as predictive of patient survival were: 3.4% with deformable model and 7.4% with grow cut method for the proportion of contrast enhanced tumor region; 5.5% with deformable model and 25.7% with grow cut method for the proportion of necrosis; and 2.1% with deformable model and 4.4% with grow cut method for edge sharpness of tumor on CE-T1W1. Conclusion: Comparison of two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric Brain MRI

  20. Intravoxel incoherent motion diffusion imaging of the liver: Optimal b-value subsampling and impact on parameter precision and reproducibility

    International Nuclear Information System (INIS)

    Dyvorne, Hadrien; Jajamovich, Guido; Kakite, Suguru; Kuehn, Bernd; Taouli, Bachir

    2014-01-01

    Highlights: • We assess the precision and reproducibility of liver IVIM diffusion parameters. • Liver IVIM DWI can be performed with 4 b-values with good parameter precision. • Liver IVIM DWI can be performed with 4 b-values with good parameter reproducibility. - Abstract: Purpose: To increase diffusion sampling efficiency in intravoxel incoherent motion (IVIM) diffusion-weighted imaging (DWI) of the liver by reducing the number of diffusion weightings (b-values). Materials and methods: In this IRB-approved, HIPAA-compliant prospective study, 53 subjects (M/F 38/15, mean age 52 ± 13 y) underwent IVIM DWI at 1.5 T using 16 b-values (0–800 s/mm²), with 14 subjects having repeat exams to assess IVIM parameter reproducibility. A biexponential diffusion model was used to quantify IVIM hepatic parameters (PF: perfusion fraction, D: true diffusion and D*: pseudo diffusion). All possible subsets of the 16 b-values were probed, with the number of b-values ranging from 4 to 15, and corresponding parameters were quantified for each subset. For each b-value subset, global parameter estimation error was computed against the parameters obtained with all 16 b-values and the subsets providing the lowest error were selected. Interscan estimation error was also evaluated between repeat exams to assess reproducibility of the IVIM technique in the liver. The optimal b-values distribution was selected such that the number of b-values was minimal while keeping parameter estimation error below interscan reproducibility error. Results: As the number of b-values decreased, the estimation error increased for all parameters, reflecting decreased precision of IVIM metrics. Using an optimal set of 4 b-values (0, 15, 150 and 800 s/mm²), the errors were 6.5, 22.8 and 66.1% for D, PF and D* respectively. These values lie within the range of test–retest reproducibility for the corresponding parameters, with errors of 12.0, 32.3 and 193.8% for D, PF and D* respectively. Conclusion: A set
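
    For orientation, the biexponential IVIM model referred to above has the standard form S(b)/S0 = PF·exp(-b·D*) + (1 - PF)·exp(-b·D). A minimal least-squares fit over the reduced 4 b-value scheme could look like the sketch below; the data are synthetic and the code is not the authors' pipeline, only an illustration under plausible liver parameter values.

      import numpy as np
      from scipy.optimize import curve_fit

      def ivim(b, s0, pf, d_star, d):
          """Biexponential IVIM signal: pseudo-diffusion (D*) plus true diffusion (D)."""
          return s0 * (pf * np.exp(-b * d_star) + (1.0 - pf) * np.exp(-b * d))

      # Reduced 4 b-value scheme discussed above (s/mm^2) and a synthetic noisy signal.
      b = np.array([0.0, 15.0, 150.0, 800.0])
      truth = dict(s0=1.0, pf=0.25, d_star=0.05, d=0.0011)   # plausible, not measured, values
      rng = np.random.default_rng(0)
      signal = ivim(b, **truth) * (1.0 + 0.01 * rng.standard_normal(b.size))

      p0 = [1.0, 0.2, 0.03, 0.001]                        # initial guess
      bounds = ([0, 0, 0.005, 1e-4], [2, 1, 0.5, 0.01])   # keeps D* well above D
      popt, _ = curve_fit(ivim, b, signal, p0=p0, bounds=bounds)
      print(dict(zip(["S0", "PF", "D*", "D"], np.round(popt, 4))))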

  1. Quark/gluon jet discrimination: a reproducible analysis using R

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The power to discriminate between light-quark jets and gluon jets would have a huge impact on many searches for new physics at CERN and beyond. This talk will present a walk-through of the development of a prototype machine learning classifier for differentiating between quark and gluon jets at experiments like those at the Large Hadron Collider at CERN. A new fast feature selection method that combines information theory and graph analytics will be outlined. This method has found new variables that promise significant improvements in discrimination power. The prototype jet tagger is simple, interpretable, parsimonious, and computationally extremely cheap, and therefore might be suitable for use in trigger systems for real-time data processing. Nested stratified k-fold cross validation was used to generate robust estimates of model performance. The data analysis was performed entirely in the R statistical programming language, and is fully reproducible. The entire analysis workflow is data-driven, automated a...

  2. DETERMINING QUASAR BLACK HOLE MASS FUNCTIONS FROM THEIR BROAD EMISSION LINES: APPLICATION TO THE BRIGHT QUASAR SURVEY

    International Nuclear Information System (INIS)

    Kelly, Brandon C.; Fan Xiaohui; Vestergaard, Marianne

    2009-01-01

    We describe a Bayesian approach to estimating quasar black hole mass functions (BHMF) using the broad emission lines to estimate black hole mass. We show how using the broad-line mass estimates in combination with statistical techniques developed for luminosity function estimation (e.g., the 1/V_a correction) leads to statistically biased results. We derive the likelihood function for the BHMF based on the broad-line mass estimates, and derive the posterior distribution for the BHMF, given the observed data. We develop our statistical approach for a flexible model where the BHMF is modeled as a mixture of Gaussian functions. Statistical inference is performed using Markov chain Monte Carlo (MCMC) methods, and we describe a Metropolis-Hastings algorithm to perform the MCMC. The MCMC simulates random draws from the probability distribution of the BHMF parameters, given the data, and we use a simulated data set to show how these random draws may be used to estimate the probability distribution for the BHMF. In addition, we show how the MCMC output may be used to estimate the probability distribution of any quantities derived from the BHMF, such as the peak in the space density of quasars. Our method has the advantage that it is able to constrain the BHMF even beyond the survey detection limits at the adopted confidence level, accounts for measurement errors and the intrinsic uncertainty in broad-line mass estimates, and provides a natural way of estimating the probability distribution of any quantities derived from the BHMF. We conclude by using our method to estimate the local active BHMF using the z < 0.5 [...] M_BH ≳ 10⁸ M_sun. Our analysis implies that at a given M_BH, z < 0.5 broad-line quasars have a typical Eddington ratio of ∼0.4 and a dispersion in Eddington ratio of ≲0.5 dex.
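
    For readers unfamiliar with the machinery described above, the fragment below sketches the generic structure of a random-walk Metropolis-Hastings sampler for the parameters of a simple one-dimensional Gaussian-mixture density. It is emphatically not the authors' likelihood, which also folds in selection effects and the errors of the broad-line mass estimates; the data, starting values and step size here are all assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical "observed" log black-hole masses drawn from a two-component mixture.
      data = np.concatenate([rng.normal(8.0, 0.3, 300), rng.normal(8.8, 0.4, 150)])

      def log_likelihood(theta, x):
          """Mixture of two Gaussians; theta = (weight, mu1, sigma1, mu2, sigma2).
          Flat priors are implicit, apart from the validity checks below."""
          w, m1, s1, m2, s2 = theta
          if not (0.0 < w < 1.0 and s1 > 0.0 and s2 > 0.0):
              return -np.inf
          pdf = (w * np.exp(-0.5 * ((x - m1) / s1) ** 2) / s1
                 + (1.0 - w) * np.exp(-0.5 * ((x - m2) / s2) ** 2) / s2)
          return np.sum(np.log(pdf / np.sqrt(2.0 * np.pi)))

      def metropolis(x, n_steps=20000, step=0.05):
          theta = np.array([0.5, 8.0, 0.5, 9.0, 0.5])      # arbitrary starting point
          logp = log_likelihood(theta, x)
          chain = []
          for _ in range(n_steps):
              proposal = theta + step * rng.standard_normal(theta.size)
              logp_prop = log_likelihood(proposal, x)
              if np.log(rng.random()) < logp_prop - logp:  # Metropolis accept/reject
                  theta, logp = proposal, logp_prop
              chain.append(theta.copy())
          return np.array(chain)

      chain = metropolis(data)
      print("posterior mean mixture weight:", chain[5000:, 0].mean())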

  3. Mode-Locking in Broad-Area Semiconductor Lasers Enhanced by Picosecond-Pulse Injection

    OpenAIRE

    Kaiser, J; Fischer, I; Elsasser, W; Gehrig, E; Hess, O

    2004-01-01

    We present combined experimental and theoretical investigations of the picosecond emission dynamics of broad-area semiconductor lasers (BALs). We enhance the weak longitudinal self-mode-locking that is inherent to BALs by injecting a single optical 50-ps pulse, which triggers the output of a distinct regular train of 13-ps pulses. Modeling based on multimode Maxwell-Bloch equations illustrates how the dynamic interaction of the injected pulse with the internal laser field efficiently couples ...

  4. Evidence for a Broad Autism Phenotype

    NARCIS (Netherlands)

    K. de Groot (Kristel); J.W. van Strien (Jan)

    2017-01-01

    textabstractThe broad autism phenotype implies the existence of a continuum ranging from individuals displaying almost no autistic traits to severely impaired diagnosed individuals. Recent studies have linked this variation in autistic traits to several domains of functioning. However, studies

  5. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    side the limits of a consulting room.5. Reproducibility of ... examination, intraocular pressure and corneal thickness ... All OCT measurements were taken between 2 and 5 pm ..... CAS-OCT, Slit-lamp OCT, RTVue-100) have shown ICC.

  6. Reproducibility of studies on text mining for citation screening in systematic reviews: Evaluation and checklist.

    Science.gov (United States)

    Olorisade, Babatunde Kazeem; Brereton, Pearl; Andras, Peter

    2017-09-01

    Independent validation of published scientific results through study replication is a pre-condition for accepting the validity of such results. In computational research, full replication is often unrealistic for independent results validation; therefore, study reproduction has been justified as the minimum acceptable standard to evaluate the validity of scientific claims. The application of text mining techniques to citation screening in the context of systematic literature reviews is a relatively young and growing computational field with high relevance for software engineering, medical research and other fields. However, there is little work so far on reproduction studies in the field. In this paper, we investigate the reproducibility of studies in this area based on information contained in published articles and we propose reporting guidelines that could improve reproducibility. The study was approached in two ways. Initially we attempted to reproduce results from six studies, which were based on the same raw dataset. Then, based on this experience, we identified steps considered essential to successful reproduction of text mining experiments and characterized them to measure how reproducible a study is given the information provided on these steps. 33 articles were systematically assessed for reproducibility using this approach. Our work revealed that it is currently difficult if not impossible to independently reproduce the results published in any of the studies investigated. The lack of information about the datasets used limits reproducibility of about 80% of the studies assessed. Also, information about the machine learning algorithms is inadequate in about 27% of the papers. On the plus side, the third party software tools used are mostly free and available. The reproducibility potential of most of the studies can be significantly improved if more attention is paid to information provided on the datasets used, how they were partitioned and utilized, and

  7. Reproducibility of myelin content-based human habenula segmentation at 3 Tesla.

    Science.gov (United States)

    Kim, Joo-Won; Naidich, Thomas P; Joseph, Joshmi; Nair, Divya; Glasser, Matthew F; O'halloran, Rafael; Doucet, Gaelle E; Lee, Won Hee; Krinsky, Hannah; Paulino, Alejandro; Glahn, David C; Anticevic, Alan; Frangou, Sophia; Xu, Junqian

    2018-03-26

    In vivo morphological study of the human habenula, a pair of small epithalamic nuclei adjacent to the dorsomedial thalamus, has recently gained significant interest for its role in reward and aversion processing. However, segmenting the habenula from in vivo magnetic resonance imaging (MRI) is challenging due to the habenula's small size and low anatomical contrast. Although manual and semi-automated habenula segmentation methods have been reported, the test-retest reproducibility of the segmented habenula volume and the consistency of the boundaries of habenula segmentation have not been investigated. In this study, we evaluated the intra- and inter-site reproducibility of in vivo human habenula segmentation from 3T MRI (0.7-0.8 mm isotropic resolution) using our previously proposed semi-automated myelin contrast-based method and its fully-automated version, as well as a previously published manual geometry-based method. The habenula segmentation using our semi-automated method showed consistent boundary definition (high Dice coefficient, low mean distance, and moderate Hausdorff distance) and reproducible volume measurement (low coefficient of variation). Furthermore, the habenula boundary in our semi-automated segmentation from 3T MRI agreed well with that in the manual segmentation from 7T MRI (0.5 mm isotropic resolution) of the same subjects. Overall, our proposed semi-automated habenula segmentation showed reliable and reproducible habenula localization, while its fully-automated version offers an efficient way for large sample analysis. © 2018 Wiley Periodicals, Inc.
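
    For reference, the boundary-overlap and volume-reproducibility metrics mentioned above (Dice coefficient, coefficient of variation of the segmented volume) reduce to a few lines of array arithmetic. The sketch below uses small hypothetical binary masks and is not the authors' segmentation or evaluation code.

      import numpy as np

      def dice(mask_a, mask_b):
          """Dice coefficient between two binary segmentation masks."""
          a, b = mask_a.astype(bool), mask_b.astype(bool)
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      def volume_cv(volumes):
          """Coefficient of variation (%) of repeated volume measurements."""
          v = np.asarray(volumes, float)
          return 100.0 * v.std(ddof=1) / v.mean()

      # Hypothetical test-retest masks on a small grid (True = habenula voxel).
      rng = np.random.default_rng(0)
      scan1 = rng.random((40, 40, 20)) > 0.97
      scan2 = scan1.copy()
      flips = rng.random(scan2.shape) > 0.995      # simulate small boundary disagreements
      scan2[flips] = ~scan2[flips]

      print(f"Dice = {dice(scan1, scan2):.3f}")
      print(f"volume CV = {volume_cv([scan1.sum(), scan2.sum()]):.1f}%")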

  8. Modelling impacts of performance on the probability of reproducing, and thereby on productive lifespan, allow prediction of lifetime efficiency in dairy cows.

    Science.gov (United States)

    Phuong, H N; Blavy, P; Martin, O; Schmidely, P; Friggens, N C

    2016-01-01

    Reproductive success is a key component of lifetime efficiency, which is the ratio of energy in milk (MJ) to energy intake (MJ) over the lifespan of cows. At the animal level, breeding and feeding management can substantially impact milk yield, body condition and energy balance of cows, which are known as major contributors to reproductive failure in dairy cattle. This study extended an existing lifetime performance model to incorporate the impacts that performance changes due to changing breeding and feeding strategies have on the probability of reproducing and thereby on the productive lifespan, and thus allow the prediction of a cow's lifetime efficiency. The model is dynamic and stochastic, with an individual cow being the unit modelled and one day being the unit of time. To evaluate the model, data from a French study including Holstein and Normande cows fed high-concentrate diets and data from a Scottish study including Holstein cows selected for high and average genetic merit for fat plus protein that were fed high- v. low-concentrate diets were used. Generally, the model consistently simulated productive and reproductive performance of various genotypes of cows across feeding systems. In the French data, the model adequately simulated the reproductive performance of Holsteins but significantly under-predicted that of Normande cows. In the Scottish data, conception to first service was comparably simulated, whereas interval traits were slightly under-predicted. Selection for greater milk production impaired the reproductive performance and lifespan but not lifetime efficiency. The definition of lifetime efficiency used in this model did not include associated costs or herd-level effects. Further work should include such economic indicators to allow more accurate simulation of lifetime profitability in different production scenarios.

  9. ZIF-8 Membranes with Improved Reproducibility Fabricated from Sputter-Coated ZnO/Alumina Supports

    KAUST Repository

    Yu, Jian; Pan, Yichang; Wang, Chongqing; Lai, Zhiping

    2015-01-01

    for reproducible fabrication of high-quality membranes. In this study, high-quality ZIF-8 membranes were prepared through hydrothermal synthesis under the partial self-conversion of sputter-coated ZnO layer on porous α-alumina supports. The reproducibility

  10. REPRODUCING THE CORRELATIONS OF TYPE C LOW-FREQUENCY QUASI-PERIODIC OSCILLATION PARAMETERS IN XTE J1550–564 WITH A SPIRAL STRUCTURE

    Energy Technology Data Exchange (ETDEWEB)

    Varniere, Peggy [APC, AstroParticule et Cosmologie, Universite Paris Diderot, CNRS/IN2P3, CEA/Irfu, Observatoire de Paris, Sorbonne Paris Cité, 10, rue Alice Domon et Lonie Duquet, F-75205 Paris Cedex 13 (France); Vincent, Frederic H., E-mail: varniere@apc.univ-paris7.fr [Observatoire de Paris/LESIA, 5, place Jules Janssen, F-92195 Meudon Cedex (France)

    2017-01-10

    While it has been observed that the parameters intrinsic to the type C low-frequency quasi-periodic oscillations are related in a nonlinear manner among themselves, there has been, up to now, no model to explain or reproduce how the frequency, the FWHM, and the rms amplitude of the type C low-frequency quasi-periodic oscillations behave with respect to one another. Here we are using a simple toy model representing the emission from a standard disk and a spiral such as that caused by the accretion–ejection instability to reproduce the overall observed behavior and shed some light on its origin. This allows us to prove the ability of such a spiral structure to be at the origin of flux modulation over more than an order of magnitude in frequency.

  11. Automated analysis of phantom images for the evaluation of long-term reproducibility in digital mammography

    International Nuclear Information System (INIS)

    Gennaro, G; Ferro, F; Contento, G; Fornasin, F; Di Maggio, C

    2007-01-01

    The performance of an automatic software package was evaluated with phantom images acquired by a full-field digital mammography unit. After the validation, the software was used, together with a Leeds TORMAS test object, to model the image acquisition process. Process modelling results were used to evaluate the sensitivity of the method in detecting changes of exposure parameters from routine image quality measurements in digital mammography, which is the ultimate purpose of long-term reproducibility tests. Image quality indices measured by the software included the mean pixel value and standard deviation of circular details and surrounding background, contrast-to-noise ratio and relative contrast; detail counts were also collected. The validation procedure demonstrated that the software localizes the phantom details correctly and the difference between automatic and manual measurements was within a few grey levels. Quantitative analysis showed sufficient sensitivity to relate fluctuations in exposure parameters (kVp or mAs) to variations in image quality indices. In comparison, detail counts were found less sensitive in detecting image quality changes, even when limitations due to observer subjectivity were overcome by automatic analysis. In conclusion, long-term reproducibility tests provided by the Leeds TORMAS phantom with quantitative analysis of multiple IQ indices have been demonstrated to be effective in predicting causes of deviation from standard operating conditions and can be used to monitor stability in full-field digital mammography
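
    The image-quality indices listed above (mean pixel value and standard deviation of detail and background, contrast-to-noise ratio, relative contrast) are plain ROI statistics. A minimal sketch, assuming the circular detail and background ROIs have already been located, might look as follows; the synthetic image and ROI positions are invented for illustration.

      import numpy as np

      def roi_stats(image, center, radius):
          """Mean and SD of pixel values inside a circular ROI."""
          yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
          mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
          values = image[mask]
          return values.mean(), values.std(ddof=1)

      def iq_indices(image, detail_center, background_center, radius=10):
          m_d, _ = roi_stats(image, detail_center, radius)
          m_b, s_b = roi_stats(image, background_center, radius)
          return {"detail_mean": m_d,
                  "background_mean": m_b,
                  "background_sd": s_b,
                  "cnr": abs(m_d - m_b) / s_b,           # contrast-to-noise ratio
                  "relative_contrast": (m_d - m_b) / m_b}

      # Synthetic phantom image: uniform noisy background plus one brighter detail.
      rng = np.random.default_rng(2)
      img = rng.normal(1000.0, 12.0, (256, 256))
      yy, xx = np.ogrid[:256, :256]
      img[(yy - 80) ** 2 + (xx - 80) ** 2 <= 100] += 60.0

      print(iq_indices(img, detail_center=(80, 80), background_center=(180, 180)))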

  12. The reproducibility of single photon absorptiometry in a clinical setting

    International Nuclear Information System (INIS)

    Valkema, R.; Blokland, J.A.K.; Pauwels, E.K.J.; Papapoulos, S.E.; Bijvoet, O.L.M.

    1989-01-01

    The reproducibility of single photon absorptiometry (SPA) results for detection of changes in bone mineral content (BMC) was evaluated in a clinical setting. During a period of 18 months with 4 different sources, the calibration scans of an aluminium standard had a variation of less than 1% unless the activity of the ¹²⁵I source was low. The calibration procedure was performed weekly and this was sufficient to correct for drift of the system. The short term reproducibility in patients was assessed with 119 duplicate measurements made in direct succession. The best reproducibility (CV=1.35%) was found for fat corrected BMC results expressed in g/cm, obtained at the site proximal to the 8 mm space between the radius and ulna. Analysis of all SPA scans made during 1 year (487 scans) showed a failure of the automatic procedure to detect the space of 8 mm between the forearm bones in 19 scans (3.9%). A space adjacent to the ulnar styloid was taken as the site for the first scan in these examinations. This problem may be recognized and corrected relatively easily. A significant correlation was found between BMC at the lower arm and BMC of the lumbar spine assessed with dual photon absorptiometry. However, the error of estimation of proximal BMC (SEE=20%) and distal BMC (SEE=19.4%) made these measurements of little value to predict BMC at the lumbar spine in individuals. The short term reproducibility in patients combined with long term stability of the equipment in our clinical setting showed that SPA is a reliable technique to assess changes in bone mass at the lower arm of 4% between 2 measurements with a confidence level of 95%. (orig.)

  13. Reproducibility of Mammography Units, Film Processing and Quality Imaging

    International Nuclear Information System (INIS)

    Gaona, Enrique

    2003-01-01

    The purpose of this study was to carry out an exploratory survey of the problems of quality control in mammography and processor units as a diagnosis of the current situation of mammography facilities. Measurements of reproducibility, optical density, optical difference and gamma index are included. Breast cancer is the most frequently diagnosed cancer and is the second leading cause of cancer death among women in the Mexican Republic. Mammography is a radiographic examination specially designed for detecting breast pathology. We found that the problems of reproducibility of AEC are smaller than the problems of processor units because almost all processors fall outside of the acceptable variation limits and they can affect the mammography image quality and the dose to the breast. Only four mammography units agree with the minimum score established by ACR and FDA for the phantom image

  14. A database for reproducible manipulation research: CapriDB – Capture, Print, Innovate

    Directory of Open Access Journals (Sweden)

    Florian T. Pokorny

    2017-04-01

    Full Text Available We present a novel approach and database which combines the inexpensive generation of 3D object models via monocular or RGB-D camera images with 3D printing and a state of the art object tracking algorithm. Unlike recent efforts towards the creation of 3D object databases for robotics, our approach does not require expensive and controlled 3D scanning setups and aims to enable anyone with a camera to scan, print and track complex objects for manipulation research. The proposed approach results in detailed textured mesh models whose 3D printed replicas provide close approximations of the originals. A key motivation for utilizing 3D printed objects is the ability to precisely control and vary object properties such as the size, material properties and mass distribution in the 3D printing process to obtain reproducible conditions for robotic manipulation research. We present CapriDB – an extensible database resulting from this approach containing initially 40 textured and 3D printable mesh models together with tracking features to facilitate the adoption of the proposed approach.

  15. Production process reproducibility and product quality consistency of transient gene expression in HEK293 cells with anti-PD1 antibody as the model protein.

    Science.gov (United States)

    Ding, Kai; Han, Lei; Zong, Huifang; Chen, Junsheng; Zhang, Baohong; Zhu, Jianwei

    2017-03-01

    Demonstration of reproducibility and consistency of process and product quality is one of the most crucial issues in using transient gene expression (TGE) technology for biopharmaceutical development. In this study, we challenged the production consistency of TGE by expressing nine batches of recombinant IgG antibody in human embryonic kidney 293 cells to evaluate reproducibility including viable cell density, viability, apoptotic status, and antibody yield in cell culture supernatant. Product quality including isoelectric point, binding affinity, secondary structure, and thermal stability was assessed as well. In addition, major glycan forms of antibody from different batches of production were compared to demonstrate glycosylation consistency. Glycan compositions of the antibody harvested at different time periods were also measured to illustrate N-glycan distribution over the culture time. From the results, it has been demonstrated that different TGE batches are reproducible from lot to lot in overall cell growth, product yield, and product qualities including isoelectric point, binding affinity, secondary structure, and thermal stability. Furthermore, major N-glycan compositions are consistent among different TGE batches and conserved during cell culture time.

  16. Reproducibility of thoracic kyphosis measurements in patients with adolescent idiopathic scoliosis.

    Science.gov (United States)

    Ohrt-Nissen, Søren; Cheung, Jason Pui Yin; Hallager, Dennis Winge; Gehrchen, Martin; Kwan, Kenny; Dahl, Benny; Cheung, Kenneth M C; Samartzis, Dino

    2017-01-01

    Current surgical treatment for adolescent idiopathic scoliosis (AIS) involves correction in both the coronal and sagittal plane, and thorough assessment of these parameters is essential for evaluation of surgical results. However, various definitions of thoracic kyphosis (TK) have been proposed, and the intra- and inter-rater reproducibility of these measures has not been determined. As such, the purpose of the current study was to determine the intra- and inter-rater reproducibility of several TK measurements used in the assessment of AIS. Twenty patients (90% females) surgically treated for AIS with alternate-level pedicle screw fixation were included in the study. Three raters independently evaluated pre- and postoperative standing lateral plain radiographs. For each radiograph, several definitions of TK were measured as well as L1-S1 and nonfixed lumbar lordosis. All variables were measured twice 14 days apart, and a mixed effects model was used to determine the repeatability coefficient (RC), which is a measure of the agreement between repeated measurements. Also, the intra- and inter-rater intra-class correlation coefficient (ICC) was determined as a measure of reliability. Preoperative median Cobb angle was 58° (range 41°-86°), and median surgical curve correction was 68% (range 49-87%). Overall intra-rater RC was highest for T2-T12 and nonfixed TK (11°) and lowest for T4-T12 and T5-T12 (8°). Inter-rater RC was highest for T1-T12, T1-nonfixed, and nonfixed TK (13°) and lowest for T5-T12 (9°). Agreement varied substantially between pre- and postoperative radiographs. Inter-rater ICC was highest for T4-T12 (0.92; 95% CI 0.88-0.95) and T5-T12 (0.92; 95% CI 0.88-0.95) and lowest for T1-nonfixed (0.80; 95% CI 0.72-0.88). Considerable variation for all TK measurements was noted. Intra- and inter-rater reproducibility was best for T4-T12 and T5-T12. Future studies should consider adopting a relevant minimum difference as a limit for true change in TK.
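
    As a pointer for readers, the repeatability coefficient (RC) reported above is conventionally 1.96·√2 (about 2.77) times the within-rater standard deviation of repeated measurements. The study estimates it from a mixed effects model; the sketch below shows only the simpler moment-based version computed from duplicate readings, with hypothetical kyphosis angles rather than the study data.

      import numpy as np

      def repeatability_coefficient(first, second):
          """RC = 1.96 * sqrt(2) * within-subject SD, estimated from paired repeats."""
          d = np.asarray(first, float) - np.asarray(second, float)
          within_sd = np.sqrt(np.mean(d ** 2) / 2.0)
          return 1.96 * np.sqrt(2.0) * within_sd

      # Hypothetical T4-T12 kyphosis angles (degrees) measured twice by the same rater.
      reading_1 = [22, 35, 28, 41, 19, 30, 25, 38, 27, 33]
      reading_2 = [24, 33, 30, 39, 21, 29, 27, 36, 26, 35]
      print(f"RC = {repeatability_coefficient(reading_1, reading_2):.1f} degrees")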

  17. Reproducibility of CT bone dosimetry: Operator versus automated ROI definition

    International Nuclear Information System (INIS)

    Louis, O.; Luypaert, R.; Osteaux, M.; Kalender, W.

    1988-01-01

    Intrasubject reproducibility with repeated determination of vertebral mineral density from a given set of CT images was investigated. The region of interest (ROI) in 10 patient scans was selected by four independent operators either manually or with an automated procedure separating cortical and spongeous bone, the operators being requested to interact in ROI selection. The mean intrasubject variation was found to be much lower with the automated process (0.3 to 0.6%) than with the conventional method (2.5 to 5.2%). In a second study, 10 patients were examined twice to determine the reproducibility of CT slice selection by the operator. The errors were of the same order of magnitude as in ROI selection. (orig.)

  18. Reproducible analyses of microbial food for advanced life support systems

    Science.gov (United States)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  19. SN 2009bb: A PECULIAR BROAD-LINED TYPE Ic SUPERNOVA ,

    International Nuclear Information System (INIS)

    Pignata, Giuliano; Stritzinger, Maximilian; Phillips, M. M.; Morrell, Nidia; Boldt, Luis; Campillay, Abdo; Contreras, Carlos; Gonzalez, Sergio; Krzeminski, Wojtek; Roth, Miguel; Salgado, Francisco; Soderberg, Alicia; Mazzali, Paolo; Anderson, J. P.; Folatelli, Gaston; Foerster, Francisco; Hamuy, Mario; Maza, Jose; Levesque, Emily M.; Rest, Armin

    2011-01-01

    Ultraviolet, optical, and near-infrared photometry and optical spectroscopy of the broad-lined Type Ic supernova (SN) 2009bb are presented, following the flux evolution from -10 to +285 days past B-band maximum. Thanks to the very early discovery, it is possible to place tight constraints on the SN explosion epoch. The expansion velocities measured from near maximum spectra are found to be only slightly smaller than those measured from spectra of the prototype broad-lined SN 1998bw associated with GRB 980425. Fitting an analytical model to the pseudobolometric light curve of SN 2009bb suggests that 4.1 ± 1.9 M_sun of material was ejected with 0.22 ± 0.06 M_sun of it being ⁵⁶Ni. The resulting kinetic energy is 1.8 ± 0.7 × 10⁵² erg. This, together with an absolute peak magnitude of M_B = -18.36 ± 0.44, places SN 2009bb on the energetic and luminous end of the broad-lined Type Ic (SN Ic) sequence. Detection of helium in the early time optical spectra accompanied with strong radio emission and high metallicity of its environment makes SN 2009bb a peculiar object. Similar to the case for gamma-ray bursts (GRBs), we find that the bulk explosion parameters of SN 2009bb cannot account for the copious energy coupled to relativistic ejecta, and conclude that another energy reservoir (a central engine) is required to power the radio emission. Nevertheless, the analysis of the SN 2009bb nebular spectrum suggests that the failed GRB detection is not imputable to a large angle between the line-of-sight and the GRB beamed radiation. Therefore, if a GRB was produced during the SN 2009bb explosion, it was below the threshold of the current generation of γ-ray instruments.

  20. The Reproducibility of Nuclear Morphometric Measurements in Invasive Breast Carcinoma

    Directory of Open Access Journals (Sweden)

    Pauliina Kronqvist

    1997-01-01

    Full Text Available The intraobserver and interobserver reproducibility of computerized nuclear morphometry was determined in repeated measurements of 212 samples of invasive breast cancer. The influence of biological variation and the selection of the measurement area was also tested. Morphometrically determined mean nuclear profile area (Pearson's r 0.89, grading efficiency (GE) 0.95) and standard deviation (SD) of nuclear profile area (Pearson's r 0.84, GE 0.89) showed high reproducibility. In this respect, nuclear morphometry equals other established methods of quantitative pathology and exceeds the results of subjective grading of nuclear atypia in invasive breast cancer. A training period of eight days was sufficient to produce clear improvement in the consistency of nuclear morphometry results. By estimating the sources of variation it could be shown that the variation associated with the measurement procedure itself is small. Instead, sample-associated variation is responsible for the majority of variation in the measurements (82.9% in mean nuclear profile area and 65.9% in SD of nuclear profile area). This study points out that when standardized methods are applied, computerized morphometry is a reproducible and reliable method of assessing nuclear atypia in invasive breast cancer. For further improvement, special emphasis should be placed on the sampling rules for selecting the microscope fields and measurement areas.

  1. Response of a comprehensive climate model to a broad range of external forcings: relevance for deep ocean ventilation and the development of late Cenozoic ice ages

    Science.gov (United States)

    Galbraith, Eric; de Lavergne, Casimir

    2018-03-01

    Over the past few million years, the Earth descended from the relatively warm and stable climate of the Pliocene into the increasingly dramatic ice age cycles of the Pleistocene. The influences of orbital forcing and atmospheric CO2 on land-based ice sheets have long been considered as the key drivers of the ice ages, but less attention has been paid to their direct influences on the circulation of the deep ocean. Here we provide a broad view on the influences of CO2, orbital forcing and ice sheet size according to a comprehensive Earth system model, by integrating the model to equilibrium under 40 different combinations of the three external forcings. We find that the volume contribution of Antarctic (AABW) vs. North Atlantic (NADW) waters to the deep ocean varies widely among the simulations, and can be predicted from the difference between the surface densities at AABW and NADW deep water formation sites. Minima of both the AABW-NADW density difference and the AABW volume occur near interglacial CO2 (270-400 ppm). At low CO2, abundant formation and northward export of sea ice in the Southern Ocean contributes to very salty and dense Antarctic waters that dominate the global deep ocean. Furthermore, when the Earth is cold, low obliquity (i.e. a reduced tilt of Earth's rotational axis) enhances the Antarctic water volume by expanding sea ice further. At high CO2, AABW dominance is favoured due to relatively warm subpolar North Atlantic waters, with more dependence on precession. Meanwhile, a large Laurentide ice sheet steers atmospheric circulation so as to strengthen the Atlantic Meridional Overturning Circulation, but cools the Southern Ocean remotely, enhancing Antarctic sea ice export and leading to very salty and expanded AABW. Together, these results suggest that a 'sweet spot' of low CO2, low obliquity and relatively small ice sheets would have poised the AMOC for interruption, promoting Dansgaard-Oeschger-type abrupt change. The deep ocean temperature and

  2. Reproducibility of the Pleth Variability Index in premature infants

    NARCIS (Netherlands)

    Den Boogert, W.J. (Wilhelmina J.); H.A. van Elteren (Hugo); T.G. Goos (Tom); I.K.M. Reiss (Irwin); R.C.J. de Jonge (Rogier); V.J. van den Berg (Victor J.)

    2017-01-01

    textabstractThe aim was to assess the reproducibility of the Pleth Variability Index (PVI), developed for non-invasive monitoring of peripheral perfusion, in preterm neonates below 32 weeks of gestational age. Three PVI measurements were consecutively performed in stable, comfortable preterm

  3. Hippocampal volume change measurement: quantitative assessment of the reproducibility of expert manual outlining and the automated methods FreeSurfer and FIRST.

    Science.gov (United States)

    Mulder, Emma R; de Jong, Remko A; Knol, Dirk L; van Schijndel, Ronald A; Cover, Keith S; Visser, Pieter J; Barkhof, Frederik; Vrenken, Hugo

    2014-05-15

    To measure hippocampal volume change in Alzheimer's disease (AD) or mild cognitive impairment (MCI), expert manual delineation is often used because of its supposed accuracy. It has been suggested that expert outlining yields poorer reproducibility as compared to automated methods, but this has not been investigated. To determine the reproducibilities of expert manual outlining and two common automated methods for measuring hippocampal atrophy rates in healthy aging, MCI and AD. From the Alzheimer's Disease Neuroimaging Initiative (ADNI), 80 subjects were selected: 20 patients with AD, 40 patients with mild cognitive impairment (MCI) and 20 healthy controls (HCs). Left and right hippocampal volume change between baseline and month-12 visit was assessed by using expert manual delineation, and by the automated software packages FreeSurfer (longitudinal processing stream) and FIRST. To assess reproducibility of the measured hippocampal volume change, both back-to-back (BTB) MPRAGE scans available for each visit were analyzed. Hippocampal volume change was expressed in μL, and as a percentage of baseline volume. Reproducibility of the 1-year hippocampal volume change was estimated from the BTB measurements by using a linear mixed model to calculate the limits of agreement (LoA) of each method, reflecting its measurement uncertainty. Using the delta method, approximate p-values were calculated for the pairwise comparisons between methods. Statistical analyses were performed both with inclusion and exclusion of visibly incorrect segmentations. Visibly incorrect automated segmentation in either one or both scans of a longitudinal scan pair occurred in 7.5% of the hippocampi for FreeSurfer and in 6.9% of the hippocampi for FIRST. After excluding these failed cases, reproducibility analysis for 1-year percentage volume change yielded LoA of ±7.2% for FreeSurfer, ±9.7% for expert manual delineation, and ±10.0% for FIRST. Methods ranked the same for reproducibility of 1
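
    To make the agreement metric above concrete, limits of agreement (LoA) can be illustrated with a basic Bland-Altman calculation on back-to-back (BTB) measurements; the study itself derives them from a linear mixed model, so the sketch below, with hypothetical percentage volume changes, is only a simplified stand-in.

      import numpy as np

      def limits_of_agreement(x1, x2):
          """Bland-Altman bias and 95% limits of agreement for paired measurements."""
          d = np.asarray(x1, float) - np.asarray(x2, float)
          bias = d.mean()
          half_width = 1.96 * d.std(ddof=1)
          return bias, (bias - half_width, bias + half_width)

      # Hypothetical 1-year hippocampal volume change (%) from back-to-back scan pairs.
      btb_a = [-3.1, -0.8, -4.5, -2.0, -1.2, -5.3, -0.5, -2.8]
      btb_b = [-2.5, -1.4, -3.9, -2.6, -0.7, -4.8, -1.1, -2.2]
      bias, loa = limits_of_agreement(btb_a, btb_b)
      print(f"bias = {bias:.2f}%, LoA = ({loa[0]:.2f}%, {loa[1]:.2f}%)")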

  4. Investigating the structural origin of trpzip2 temperature dependent unfolding fluorescence line shape based on a Markov state model simulation.

    Science.gov (United States)

    Song, Jian; Gao, Fang; Cui, Raymond Z; Shuang, Feng; Liang, Wanzhen; Huang, Xuhui; Zhuang, Wei

    2012-10-25

    Vibrationally resolved fluorescence spectra of the β-hairpin trpzip2 peptide at two temperatures as well as during a T-jump unfolding process are simulated on the basis of a combination of Markov state models and quantum chemistry schemes. The broad asymmetric spectral line shape feature is reproduced by considering the exciton-phonon couplings. The temperature dependent red shift observed in the experiment has been attributed to the state population changes of specific chromophores. Through further theoretical study, it is found that both the environment's electric field and the chromophores' geometry distortions are responsible for tryptophan fluorescence shift.

  5. Knockout and fragmentation reactions using a broad range of tin isotopes

    Science.gov (United States)

    Rodríguez-Sánchez, J. L.; Benlliure, J.; Bertulani, C. A.; Vargas, J.; Ayyad, Y.; Alvarez-Pol, H.; Atkinson, J.; Aumann, T.; Beceiro-Novo, S.; Boretzky, K.; Caamaño, M.; Casarejos, E.; Cortina-Gil, D.; Díaz-Cortes, J.; Fernández, P. Díaz; Estrade, A.; Geissel, H.; Kelić-Heil, A.; Litvinov, Yu. A.; Mostazo, M.; Paradela, C.; Pérez-Loureiro, D.; Pietri, S.; Prochazka, A.; Takechi, M.; Weick, H.; Winfield, J. S.

    2017-09-01

    Production cross sections of residual nuclei obtained by knockout and fragmentation reactions of different tin isotopes accelerated at 1 A GeV have been measured with the fragment separator (FRS) at GSI, Darmstadt. The new measurements are used to investigate the neutron-excess dependence of the neutron- and proton-knockout cross sections. These cross sections are compared to Glauber model calculations coupled to a nuclear de-excitation code in order to investigate the role of the remnant excitations. This benchmarking shows an overestimation of the cross sections for the removal of deeply bound nucleons. A phenomenological increase in the excitation energy induced in the remnants produced in these cases allows us to reproduce the measured cross sections.

  6. Influence of radiation dose and iterative reconstruction algorithms for measurement accuracy and reproducibility of pulmonary nodule volumetry: A phantom study.

    Science.gov (United States)

    Kim, Hyungjin; Park, Chang Min; Song, Yong Sub; Lee, Sang Min; Goo, Jin Mo

    2014-05-01

    To evaluate the influence of radiation dose settings and reconstruction algorithms on the measurement accuracy and reproducibility of semi-automated pulmonary nodule volumetry. CT scans were performed on a chest phantom containing various nodules (10 and 12 mm; +100, -630 and -800 HU) at 120 kVp with tube current-time settings of 10, 20, 50, and 100 mAs. Each CT was reconstructed using filtered back projection (FBP), iDose(4) and iterative model reconstruction (IMR). Semi-automated volumetry was performed by two radiologists using commercial volumetry software for nodules at each CT dataset. Noise, contrast-to-noise ratio and signal-to-noise ratio of CT images were also obtained. The absolute percentage measurement errors and differences were then calculated for volume and mass. The influence of radiation dose and reconstruction algorithm on measurement accuracy, reproducibility and objective image quality metrics was analyzed using generalized estimating equations. Measurement accuracy and reproducibility of nodule volume and mass were not significantly associated with CT radiation dose settings or reconstruction algorithms (p>0.05). Objective image quality metrics of CT images were superior with IMR than with FBP or iDose(4) at all radiation dose settings (p [...]). Volumetry can be applied to low- or ultralow-dose chest CT with use of a novel iterative reconstruction algorithm without losing measurement accuracy and reproducibility. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Reproducibility of contrast-enhanced transrectal ultrasound of the prostate

    NARCIS (Netherlands)

    Sedelaar, J. P.; Goossen, T. E.; Wijkstra, H.; de la Rosette, J. J.

    2001-01-01

    Transrectal three-dimensional (3-D) contrast-enhanced power Doppler ultrasound (US) is a novel technique for studying possible prostate malignancy. Before studies can be performed to investigate the clinical validity of the technique, reproducibility of the contrast US studies must be proven.

  8. A Novel Approach to Calculation of Reproducing Kernel on Infinite Interval and Applications to Boundary Value Problems

    Directory of Open Access Journals (Sweden)

    Jing Niu

    2013-01-01

    reproducing kernel on an infinite interval is obtained concisely in polynomial form for the first time. Furthermore, as a particularly effective application of this method, we give an explicit representation formula for the calculation of the reproducing kernel in a reproducing kernel space with boundary value conditions.

  9. Inter-Scan Reproducibility of Carotid Plaque Volume Measurements by 3-D Ultrasound

    DEFF Research Database (Denmark)

    Sandholt, Benjamin V; Collet-Billon, Antoine; Entrekin, Robert

    2018-01-01

    (PPV) measure centered on MPT. Total plaque volume (TPV), PPV from a 10-mm segment and MPT were measured using dedicated semi-automated software on 38 plaques from 26 patients. Inter-scan reproducibility was assessed using the t-test, Bland-Altman plots and Pearson's correlation coefficient.... There was a mean difference of 0.01 mm in MPT (limits of agreement: -0.45 to 0.42 mm, Pearson's correlation coefficient: 0.96). Both volume measurements exhibited high reproducibility, with PPV being superior (limits of agreement: -35.3 mm³ to 33.5 mm³, Pearson's correlation coefficient: 0.96) to TPV (limits of agreement: -88.2 to 61.5 mm³, Pearson's correlation coefficient: 0.91). The good reproducibility revealed by the present results encourages future studies on establishing plaque quantification as part of cardiovascular risk assessment and for follow-up of disease progression over time....

  10. Evaluation of the Repeatability and the Reproducibility of AL-Scan Measurements Obtained by Residents

    Directory of Open Access Journals (Sweden)

    Mehmet Kola

    2014-01-01

    Full Text Available Purpose. To assess the repeatability and reproducibility of ocular biometry and intraocular lens (IOL) power measurements obtained by ophthalmology residents using an AL-Scan device, a novel optical biometer. Methods. Two ophthalmology residents were instructed regarding the AL-Scan device. Both performed ocular biometry and IOL power measurements using AL-Scan, three times on each of 128 eyes, independently of one another. Corneal keratometry readings, horizontal iris width, central corneal thickness, anterior chamber depth, pupil size, and axial length values measured by both residents were recorded together with IOL power values calculated on the basis of four different IOL calculation formulas (SRK/T, Holladay, and HofferQ). Repeatability and reproducibility of the measurements obtained were analyzed using the intraclass correlation coefficient (ICC). Results. Repeatability (ICC, 0.872-0.999 for resident 1 versus 0.905-0.999 for resident 2) and reproducibility (ICC, 0.916-0.999) were high for all biometric measurements. Repeatability (ICC, 0.981-0.983 for resident 1 versus 0.995-0.996 for resident 2) and reproducibility were also high for all IOL power measurements (ICC, 0.996 for all). Conclusions. The AL-Scan device exhibits good repeatability and reproducibility in all biometric measurements and IOL power calculations, independent of the operator concerned.
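
    For completeness, a single-measurement intraclass correlation of the kind quoted above can be estimated from a subjects-by-repeats table with a standard two-way ANOVA decomposition. The sketch below computes an ICC(3,1)-style consistency estimate on hypothetical axial-length readings; it is not the study's statistical code, and the exact ICC variant used there is not stated in the abstract.

      import numpy as np

      def icc_3_1(data):
          """ICC(3,1): two-way mixed effects, consistency, single measurement.
          data is an (n subjects) x (k repeats) array."""
          data = np.asarray(data, float)
          n, k = data.shape
          grand = data.mean()
          ss_subjects = k * np.sum((data.mean(axis=1) - grand) ** 2)
          ss_repeats = n * np.sum((data.mean(axis=0) - grand) ** 2)
          ss_error = np.sum((data - grand) ** 2) - ss_subjects - ss_repeats
          ms_subjects = ss_subjects / (n - 1)
          ms_error = ss_error / ((n - 1) * (k - 1))
          return (ms_subjects - ms_error) / (ms_subjects + (k - 1) * ms_error)

      # Hypothetical axial lengths (mm) measured three times in five eyes by one resident.
      readings = np.array([
          [23.41, 23.43, 23.40],
          [24.10, 24.12, 24.09],
          [22.87, 22.85, 22.88],
          [25.02, 25.05, 25.03],
          [23.76, 23.74, 23.77],
      ])
      print(f"ICC(3,1) = {icc_3_1(readings):.3f}")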

  11. In utero diffusion tensor imaging of the fetal brain: A reproducibility study.

    Science.gov (United States)

    Jakab, András; Tuura, Ruth; Kellenberger, Christian; Scheer, Ianina

    2017-01-01

    Our purpose was to evaluate the within-subject reproducibility of in utero diffusion tensor imaging (DTI) metrics and the visibility of major white matter structures. Images for 30 fetuses (20-33 postmenstrual weeks, normal neurodevelopment: 6 cases, cerebral pathology: 24 cases) were acquired on 1.5 T or 3.0 T MRI. DTI with 15 diffusion-weighting directions was repeated three times for each case, TR/TE: 2200/63 ms, voxel size: 1 × 1 mm, slice thickness: 3-5 mm, b-factor: 700 s/mm². Reproducibility was evaluated from structure detectability, variability of DTI measures using the coefficient of variation (CV), image correlation and structural similarity across repeated scans for six selected structures. The effect of age, scanner type and presence of pathology was determined using the Wilcoxon rank sum test. White matter structures were detectable in the following percentage of fetuses in at least two of the three repeated scans: corpus callosum genu 76%, splenium 64%, internal capsule, posterior limb 60%, brainstem fibers 40% and temporooccipital association pathways 60%. The mean CV of DTI metrics ranged between 3% and 14.6% and we measured higher reproducibility in fetuses with normal brain development. Head motion was negatively correlated with reproducibility; this effect was partially ameliorated by a motion-correction algorithm using image registration. Structures on 3.0 T had higher variability both with and without motion correction. Fetal DTI is reproducible for projection and commissural bundles during mid-gestation; however, in 16-30% of the cases, data were corrupted by artifacts, resulting in impaired detection of white matter structures. To achieve robust results for the quantitative analysis of diffusivity and anisotropy values, fetal-specific image processing is recommended and repeated DTI is needed to ensure the detectability of fiber pathways.

  12. Data-based Non-Markovian Model Inference

    Science.gov (United States)

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close

  13. Reproducibility analysis of the stability and treatment of vertebral metastatic lesions

    Directory of Open Access Journals (Sweden)

    Raphael de Rezende Pratali

    2014-09-01

    Full Text Available OBJECTIVES: To investigate the reproducibility among spine surgeons in defining the treatment of vertebral metastatic lesions, taking into account the mechanical stability of injuries. METHODS: Twenty cases of isolated vertebral metastatic lesions were presented to ten experts. They were then asked for their opinion on the stability of the lesion, as well as their treatment option. RESULTS: The interobserver Kappa coefficients obtained for the stability analysis and for the treatment decision were poor (0.334 and 0.248, respectively). CONCLUSIONS: Poor interobserver reproducibility was observed in deciding the treatment of vertebral metastatic lesions when considering the stability of the lesions.

  14. Spatial modeling of agricultural land use change at global scale

    Science.gov (United States)

    Meiyappan, P.; Dalton, M.; O'Neill, B. C.; Jain, A. K.

    2014-11-01

    Long-term modeling of agricultural land use is central in global scale assessments of climate change, food security, biodiversity, and climate adaptation and mitigation policies. We present a global-scale dynamic land use allocation model and show that it can reproduce the broad spatial features of the past 100 years of evolution of cropland and pastureland patterns. The modeling approach integrates economic theory, observed land use history, and data on both socioeconomic and biophysical determinants of land use change, and estimates relationships using long-term historical data, thereby making it suitable for long-term projections. The underlying economic motivation is maximization of expected profits by hypothesized landowners within each grid cell. The model predicts fractional land use for cropland and pastureland within each grid cell based on socioeconomic and biophysical driving factors that change with time. The model explicitly incorporates the following key features: (1) land use competition, (2) spatial heterogeneity in the nature of driving factors across geographic regions, (3) spatial heterogeneity in the relative importance of driving factors and previous land use patterns in determining land use allocation, and (4) spatial and temporal autocorrelation in land use patterns. We show that land use allocation approaches based solely on previous land use history (but disregarding the impact of driving factors), or those accounting for both land use history and driving factors by mechanistically fitting models for the spatial processes of land use change do not reproduce well long-term historical land use patterns. With an example application to the terrestrial carbon cycle, we show that such inaccuracies in land use allocation can translate into significant implications for global environmental assessments. The modeling approach and its evaluation provide an example that can be useful to the land use, Integrated Assessment, and the Earth system modeling

  15. Self-consistent Bulge/Disk/Halo Galaxy Dynamical Modeling Using Integral Field Kinematics

    Science.gov (United States)

    Taranu, D. S.; Obreschkow, D.; Dubinski, J. J.; Fogarty, L. M. R.; van de Sande, J.; Catinella, B.; Cortese, L.; Moffett, A.; Robotham, A. S. G.; Allen, J. T.; Bland-Hawthorn, J.; Bryant, J. J.; Colless, M.; Croom, S. M.; D'Eugenio, F.; Davies, R. L.; Drinkwater, M. J.; Driver, S. P.; Goodwin, M.; Konstantopoulos, I. S.; Lawrence, J. S.; López-Sánchez, Á. R.; Lorente, N. P. F.; Medling, A. M.; Mould, J. R.; Owers, M. S.; Power, C.; Richards, S. N.; Tonini, C.

    2017-11-01

    We introduce a method for modeling disk galaxies designed to take full advantage of data from integral field spectroscopy (IFS). The method fits equilibrium models to simultaneously reproduce the surface brightness, rotation, and velocity dispersion profiles of a galaxy. The models are fully self-consistent 6D distribution functions for a galaxy with a Sérsic profile stellar bulge, exponential disk, and parametric dark-matter halo, generated by an updated version of GalactICS. By creating realistic flux-weighted maps of the kinematic moments (flux, mean velocity, and dispersion), we simultaneously fit photometric and spectroscopic data using both maximum-likelihood and Bayesian (MCMC) techniques. We apply the method to a GAMA spiral galaxy (G79635) with kinematics from the SAMI Galaxy Survey and deep g- and r-band photometry from the VST-KiDS survey, comparing parameter constraints with those from traditional 2D bulge-disk decomposition. Our method returns broadly consistent results for shared parameters while constraining the mass-to-light ratios of stellar components and reproducing the H I-inferred circular velocity well beyond the limits of the SAMI data. Although the method is tailored for fitting integral field kinematic data, it can use other dynamical constraints like central fiber dispersions and H I circular velocities, and is well-suited for modeling galaxies with a combination of deep imaging and H I and/or optical spectra (resolved or otherwise). Our implementation (MagRite) is computationally efficient and can generate well-resolved models and kinematic maps in under a minute on modern processors.

  16. Validity and reproducibility of a food frequency questionnaire for adults of São Paulo, Brazil.

    Science.gov (United States)

    Selem, Soraya Sant'Ana de Castro; Carvalho, Aline Martins de; Verly-Junior, Eliseu; Carlos, Jackeline Venâncio; Teixeira, Juliana Araujo; Marchioni, Dirce Maria Lobo; Fisberg, Regina Mara

    2014-12-01

    To assess the validity and reproducibility of a food frequency questionnaire developed to estimate the food consumption of adults in São Paulo, Brazil, in a population-based study. A sample of individuals aged above 20 years, of both genders, living in São Paulo, was used for the validation study (n = 77) and the reproducibility study (n = 74) of the food frequency questionnaire. To verify validity and reproducibility for energy and 19 nutrients, two food frequency questionnaires (60 items) and three 24-hour dietary recalls (24HR, the reference method) were applied. Validity was assessed by Spearman correlation coefficients (crude and de-attenuated) and weighted kappa, and reproducibility by intraclass correlation coefficients and weighted kappa. In the validity analyses, de-attenuated correlation coefficients ranged from 0.21 (carbohydrate) to 0.74 (energy), and the weighted kappa exceeded 0.40 for 30% of the nutrients. Polyunsaturated fat and folate showed neither significant correlation nor significant weighted kappa. In the reproducibility analyses, correlation coefficients ranged from 0.36 (polyunsaturated fat) to 0.69 (calcium), and the weighted kappa exceeded 0.40 for 80% of the nutrients. The food frequency questionnaire analyzed has good validity and reproducibility for estimating the food consumption of adults in São Paulo compared to the reference method, so it is an appropriate instrument for use in epidemiological studies on similar populations. Estimates of polyunsaturated fat and folate should be interpreted with caution.
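    The validity statistics named above can be sketched as follows: a crude Spearman correlation between FFQ and reference intakes, a Willett-style de-attenuation for within-person day-to-day variation, and a quadratic-weighted kappa on tertile agreement. The simulated intakes and error magnitudes are purely illustrative, not the study data.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
n, n_recalls = 77, 3

# Hypothetical intakes: three 24-hour recalls (reference) and one FFQ estimate.
truth = rng.gamma(shape=4.0, scale=50.0, size=n)
recalls = truth[:, None] + rng.normal(0, 40.0, size=(n, n_recalls))
ffq = truth + rng.normal(0, 60.0, size=n)

ref = recalls.mean(axis=1)
r_crude, _ = spearmanr(ffq, ref)

# De-attenuation for within-person day-to-day variation (Willett-style):
# r_true ~= r_obs * sqrt(1 + lambda / k), lambda = s2_within / s2_between.
s2_within = recalls.var(axis=1, ddof=1).mean()
s2_between = ref.var(ddof=1) - s2_within / n_recalls
r_deatt = r_crude * np.sqrt(1.0 + (s2_within / s2_between) / n_recalls)

# Quadratic-weighted kappa on tertile agreement.
tertiles = lambda x: np.digitize(x, np.quantile(x, [1/3, 2/3]))
kappa = cohen_kappa_score(tertiles(ffq), tertiles(ref), weights="quadratic")

print(f"crude r={r_crude:.2f}, de-attenuated r={r_deatt:.2f}, kappa={kappa:.2f}")
```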

  17. Repeatability and reproducibility of decisions by latent fingerprint examiners.

    Directory of Open Access Journals (Sweden)

    Bradford T Ulery

    Full Text Available The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as "difficult" than for "easy" or "moderate" comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs, such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases.
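    The repeatability figures quoted above are, at bottom, percent-agreement calculations between a first decision and a retest decision. A minimal sketch with simulated decisions (the counts and change rate are hypothetical) is shown below.

```python
import numpy as np
import pandas as pd

# Hypothetical re-test data: one row per (examiner, image pair), with the
# decision from the first round and from the retest several months later.
rng = np.random.default_rng(42)
decisions = ["individualization", "exclusion", "inconclusive"]
first = rng.choice(decisions, size=500, p=[0.45, 0.35, 0.20])
flip = rng.random(500) < 0.10                    # ~10% of decisions change
second = np.where(flip, rng.choice(decisions, size=500), first)

df = pd.DataFrame({"first": first, "second": second})
df["same"] = df["first"] == df["second"]

# Overall repeatability and repeatability by original decision category.
overall = df["same"].mean()
per_category = df.groupby("first")["same"].mean()
print(f"overall repeatability: {overall:.1%}")
print(per_category.round(3))
```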

  18. Running an open experiment: transparency and reproducibility in soil and ecosystem science

    Science.gov (United States)

    Bond-Lamberty, Ben; Peyton Smith, A.; Bailey, Vanessa

    2016-08-01

    Researchers in soil and ecosystem science, and almost every other field, are being pushed—by funders, journals, governments, and their peers—to increase the transparency and reproducibility of their work. A key part of this effort is a move towards open data as a way to fight post-publication data loss, improve data and code quality, enable powerful meta- and cross-disciplinary analyses, and increase trust in, and the efficiency of, publicly-funded research. Many scientists, however, lack experience in, and may be unsure of the benefits of, making their data and fully-reproducible analyses publicly available. Here we describe a recent ‘open experiment’, in which we documented every aspect of a soil incubation online, making all raw data, scripts, diagnostics, final analyses, and manuscripts available in real time. We found that using tools such as version control, issue tracking, and open-source statistical software improved data integrity, accelerated our team’s communication and productivity, and ensured transparency. There are many avenues to improve scientific reproducibility and data availability, of which this is only one example, and it is not an approach suited to every experiment or situation. Nonetheless, we encourage the communities in our respective fields to consider its advantages, and to lead rather than follow with respect to scientific reproducibility, transparency, and data availability.

  19. Reproducing the organic matter model of anthropogenic dark earth of Amazonia and testing the ecotoxicity of functionalized charcoal compounds

    Directory of Open Access Journals (Sweden)

    Carolina Rodrigues Linhares

    2012-05-01

    Full Text Available The objective of this work was to obtain organic compounds similar to the ones found in the organic matter of anthropogenic dark earth of Amazonia (ADE) using a chemical functionalization procedure on activated charcoal, as well as to determine their ecotoxicity. Based on the study of the organic matter from ADE, an organic model was proposed and an attempt to reproduce it was described. Activated charcoal was oxidized with sodium hypochlorite at different concentrations. Nuclear magnetic resonance was performed to verify whether the spectra of the obtained products were similar to those of humic acids from ADE. The similarity between spectra indicated that the obtained products were polycondensed aromatic structures with carboxyl groups: a soil amendment that can contribute to soil fertility and to its sustainable use. An ecotoxicological test with Daphnia similis was performed on the more soluble fraction (fulvic acids) of the produced soil amendment. Aryl chloride was formed during the synthesis of the organic compounds from activated charcoal functionalization and was partially removed through a purification process. However, it is probable that some aryl chloride remained in the final product, since the ecotoxicological test indicated that the chemically functionalized soil amendment is moderately toxic.

  20. SU-E-T-676: Reproducibility and Consistency of Two SunNuclear 3D Scanning Tanks

    International Nuclear Information System (INIS)

    Hessler, J; DiCostanzo, D; Grzetic, S; Ayan, A; Gupta, N; Woollard, J

    2015-01-01

    Purpose: To determine if two Sun Nuclear 3D Scanning (SNC 3DS) tanks collect reproducible and consistent data and to test the precision of the SNC Dosimetry auto-setup. Methods: Percent depth doses (PDDs) and profiles were collected on two SNC 3DS tanks with a Varian TrueBeam linear accelerator. The SNC Dosimetry auto-setup application was used with CC13 ionization chambers. After auto-setup, the collimator light field was checked manually against the position of the chamber. Comparing repeated measurements with tank 1 allowed evaluation of SNC 3DS auto-setup and tank reproducibility. Comparing measured data between tanks 1 and 2 allowed evaluation of consistency between tanks. Results: Preliminary results showed reproducibility of the depth of maximum dose (Dmax) of 0.38 mm for a 10cmx10cm field and 0.67 mm for 30cmx30cm on a single tank. PDD values at 5 cm, 10 cm, and 20 cm depths were reproducible within 0.26%. Consistency of Dmax between tanks was 0.17 mm for a 10cmx10cm field and 0.44 mm for 30cmx30cm. PDD values at 5 cm, 10 cm, and 20 cm were consistent within 0.06%. Profiles showed reproducibility in field width within 0.4 mm for a 10cmx10cm field and 0.7 mm for a 30cmx30cm field. Profiles showed consistency in field width within 0.2 mm for 10cmx10cm and 30cmx30cm field sizes. Penumbra width was reproducible and consistent to under 0.5 mm, except for the 30cmx30cm field size at 30 cm depth, where the reproducibility was 2.2 mm and the consistency was 2.6 mm. Conclusion: In conclusion, the SNC 3DS tank shows good reproducibility in measured data. Since the tank-to-tank variation in measured data is within the uncertainty of repeated single-tank measurements, the tanks also perform consistently.
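    The reproducibility and consistency comparisons described above reduce to extracting Dmax and PDD values at reference depths from repeated scans and differencing them. The sketch below uses analytic stand-in curves rather than measured CC13 data; the depths, field size, and curve shapes are assumptions for illustration.

```python
import numpy as np

def d_max(depth_mm, pdd):
    """Depth of maximum dose from a sampled PDD curve (argmax on the grid;
    a parabolic refinement around the peak could be added)."""
    return depth_mm[np.argmax(pdd)]

def pdd_at(depth_mm, pdd, d):
    """PDD value (%) interpolated at depth d (mm)."""
    return np.interp(d, depth_mm, pdd)

# Two hypothetical scans of the same 10x10 cm beam on a 1 mm depth grid.
depth = np.arange(0.0, 300.0, 1.0)
raw_a = (1 - np.exp(-depth / 5.0)) * np.exp(-depth / 180.0)
raw_b = (1 - np.exp(-depth / 5.2)) * np.exp(-depth / 181.0)
scan_a = 100.0 * raw_a / raw_a.max()
scan_b = 100.0 * raw_b / raw_b.max()

print("Dmax difference (mm):", abs(d_max(depth, scan_a) - d_max(depth, scan_b)))
for d in (50.0, 100.0, 200.0):
    diff = pdd_at(depth, scan_a, d) - pdd_at(depth, scan_b, d)
    print(f"PDD difference at {d:.0f} mm depth: {diff:+.2f} %")
```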

  1. High Reproducibility of ELISPOT Counts from Nine Different Laboratories

    DEFF Research Database (Denmark)

    Sundararaman, Srividya; Karulin, Alexey Y; Ansari, Tameem

    2015-01-01

    The primary goal of immune monitoring with ELISPOT is to measure the number of T cells, specific for any antigen, accurately and reproducibly between different laboratories. In ELISPOT assays, antigen-specific T cells secrete cytokines, forming spots of different sizes on a membrane with variable...

  2. Reproducibility of the Pleth Variability Index in premature infants

    NARCIS (Netherlands)

    Den Boogert, Wilhelmina J.; Van Elteren, Hugo A.; Goos, T.G.; Reiss, Irwin K.M.; De Jonge, Rogier C.J.; van Den Berg, Victor J.

    2017-01-01

    The aim was to assess the reproducibility of the Pleth Variability Index (PVI), developed for non-invasive monitoring of peripheral perfusion, in preterm neonates below 32 weeks of gestational age. Three PVI measurements were consecutively performed in stable, comfortable preterm neonates in the

  3. Broad spectrum antibiotic enrofloxacin modulates contact sensitivity through gut microbiota in a murine model.

    Science.gov (United States)

    Strzępa, Anna; Majewska-Szczepanik, Monika; Lobo, Francis M; Wen, Li; Szczepanik, Marian

    2017-07-01

    Medical advances in the field of infection therapy have led to an increasing use of antibiotics, which, apart from eliminating pathogens, also partially eliminate naturally existing commensal bacteria. It has become increasingly clear that less exposure to microbiota early in life may contribute to the observed rise in "immune-mediated" diseases, including autoimmunity and allergy. We sought to test whether changing the gut microbiota with the broad spectrum antibiotic enrofloxacin would modulate contact sensitivity (CS) in mice. Natural gut microbiota were modified by oral treatment with enrofloxacin prior to sensitization with trinitrophenyl chloride, followed by CS testing. Finally, adoptive cell transfers were performed to characterize the regulatory cells that are induced by microbiota modification. Oral treatment with enrofloxacin suppresses CS and production of anti-trinitrophenyl chloride IgG1 antibodies. Adoptive transfer experiments show that antibiotic administration favors induction of regulatory cells that suppress CS. Flow cytometry and adoptive transfer of purified cells show that antibiotic-induced suppression of CS is mediated by TCRαβ+CD4+CD25+FoxP3+ Treg, CD19+B220+CD5+IL-10+, IL-10+ Tr1, and IL-10+ TCRγδ+ cells. Treatment with the antibiotic induces dysbiosis characterized by an increased proportion of Clostridium coccoides (cluster XIVa), C. coccoides-Eubacterium rectale (cluster XIVab), Bacteroidetes, and Bifidobacterium spp., but decreased segmented filamentous bacteria. Transfer of antibiotic-modified gut microbiota inhibits CS, but this response can be restored through oral transfer of control gut bacteria to antibiotic-treated animals. Oral treatment with a broad spectrum antibiotic modifies gut microbiota composition and promotes an anti-inflammatory response, suggesting that manipulation of gut microbiota can be a powerful tool to modulate the course of CS.

  4. A bistable model of cell polarity.

    Directory of Open Access Journals (Sweden)

    Matteo Semplice

    Full Text Available Ultrasensitivity, as described by Goldbeter and Koshland, has long been considered a way to realize bistable switches in biological systems. It is not as well recognized that when ultrasensitivity and reinforcing feedback loops are present in a spatially distributed system such as the cell plasma membrane, they may induce bistability and spatial separation of the system into distinct signaling phases. Here we suggest that bistability of ultrasensitive signaling pathways in a diffusive environment provides a basic mechanism to realize cell membrane polarity. Cell membrane polarization is a fundamental process implicated in several basic biological phenomena, such as differentiation, proliferation, migration and morphogenesis of unicellular and multicellular organisms. We describe a simple, solvable model of cell membrane polarization based on the coupling of membrane diffusion with bistable enzymatic dynamics. The model can reproduce a broad range of symmetry-breaking events, such as those observed in eukaryotic directional sensing, the apico-basal polarization of epithelium cells, the polarization of budding and mating yeast, and the formation of Ras nanoclusters in several cell types.
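    A rough numerical caricature of this mechanism, in the spirit of wave-pinning models, is sketched below: a slowly diffusing active membrane form coupled to a fast-diffusing inactive form, with ultrasensitive (Hill) activation kinetics. The equations, parameters, and initial condition are illustrative choices, not the solvable model analyzed in the paper.

```python
import numpy as np

# u = active (membrane-bound) form, v = inactive (fast-diffusing) form.
# Activation of u is ultrasensitive (Hill coefficient 2) with positive
# feedback, so the local kinetics are bistable for a range of v.
nx, L = 200, 10.0
dx, dt = L / nx, 1e-4
Du, Dv = 0.1, 10.0                          # slow membrane vs. fast bulk diffusion
k0, gamma, K, delta = 0.05, 1.0, 1.0, 1.0   # illustrative rate constants

x = np.linspace(0.0, L, nx)
u = 0.2 + 1.8 * (x < 1.0)                   # small activated patch at one end
v = np.full(nx, 2.0)

def lap(f):
    """1-D Laplacian with no-flux (reflective) boundaries."""
    g = np.pad(f, 1, mode="edge")
    return (g[:-2] + g[2:] - 2.0 * f) / dx**2

for _ in range(100_000):                    # explicit Euler up to t = 10
    reaction = v * (k0 + gamma * u**2 / (K**2 + u**2)) - delta * u
    u += dt * (Du * lap(u) + reaction)
    v += dt * (Dv * lap(v) - reaction)

# The domain should separate into a high-u and a low-u signaling phase.
print("u at the two ends:", round(float(u[0]), 3), round(float(u[-1]), 3))
```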

  5. Reproducibility in Natural Language Processing: A Case Study of Two R Libraries for Mining PubMed/MEDLINE

    Science.gov (United States)

    Cohen, K. Bretonnel; Xia, Jingbo; Roeder, Christophe; Hunter, Lawrence E.

    2018-01-01

    There is currently a crisis in science related to highly publicized failures to reproduce large numbers of published studies. The current work proposes, by way of case studies, a methodology for moving the study of reproducibility in computational work to a full stage beyond that of earlier work. Specifically, it presents a case study in attempting to reproduce the reports of two R libraries for doing text mining of the PubMed/MEDLINE repository of scientific publications. The main findings are that a rational paradigm for reproduction of natural language processing papers can be established; the advertised functionality was difficult, but not impossible, to reproduce; and reproducibility studies can produce additional insights into the functioning of the published system. Additionally, the work on reproducibility led to the production of novel user-centered documentation that has been accessed 260 times since its publication, an average of once a day per library. PMID:29568821

  6. Education and Broad Concepts of Agency

    Science.gov (United States)

    Winch, Christopher

    2014-01-01

    Drawing on recent debates about the relationship between propositional and practical knowledge, this article is concerned with broad concepts of agency. Specifically, it is concerned with agency that involves the forming and putting into effect of intentions over relatively extended periods, particularly in work contexts (called, for want of a…

  7. 33 CFR 110.27 - Lynn Harbor in Broad Sound, Mass.

    Science.gov (United States)

    2010-07-01

    Title 33, Navigation and Navigable Waters (revised as of 2010-07-01). Coast Guard, Department of Homeland Security; Anchorages; Anchorage Regulations; Special Anchorage Areas; § 110.27 Lynn Harbor in Broad Sound, Mass. North of...

  8. A study on reproducibility of three-dimensional measurement for an evaluation of craniofacial morphology

    International Nuclear Information System (INIS)

    Nagai, Yoshihiro; Nishiyama, Hideyoshi; Nihara, Jun; Tanaka, Ray; Yamaki, Masaki; Hayashi, Takafumi; Saito, Isao

    2013-01-01

    Materials including facial and oral photographs, frontal and lateral cephalograms, dental casts and CT are essential for orthodontic diagnosis with orthognathic surgery. Although three-dimensional analysis has become prevalent in diagnosing patients with dentofacial deformity, little information is available on the definition and reproducibility of the measurement points used when conducting a three-dimensional analysis with CT. This study was therefore designed to evaluate the reproducibility of three-dimensional landmarks defined on multiplanar reconstruction (MPR) images. Presurgical CT data from seven orthognathic patients (four females and three males) were selected. Two orthodontists independently identified 44 defined landmarks twice on the MPR images, with the Frankfurt horizontal plane (FH plane) as the reference plane, using the DICOM viewer Exavision Lite (Ziosoft, Tokyo). The significance of intra-examiner and inter-examiner errors was assessed using ANOVA, and the reproducibility of landmarks was evaluated by the standard deviation (SD) of the measurement error. While no significant differences were found in intra-examiner measurement values, significant differences were identified in inter-examiner measurement values at 39 of 132 coordinates (10, 15, and 14 in the X-, Y-, and Z-coordinates, respectively). Reproducibility of the ramus posterior point (Ar), Gonion (Go) and the greater palatine foramen was particularly poor. However, the reproducibility of the adopted landmarks was considered sufficient for the analysis of maxillofacial morphology, since their SDs were small compared to the voxel size. When the FH plane is set as the reference plane, more reproducible measurement landmarks may be selected without the influence of changes in head posture. (author)

  9. Reproducibility and discriminability of brain patterns of semantic categories enhanced by congruent audiovisual stimuli.

    Directory of Open Access Journals (Sweden)

    Yuanqing Li

    Full Text Available One of the central questions in cognitive neuroscience is the precise neural representation, or brain pattern, associated with a semantic category. In this study, we explored the influence of audiovisual stimuli on the brain patterns of concepts or semantic categories through a functional magnetic resonance imaging (fMRI) experiment. We used a pattern search method to extract brain patterns corresponding to two semantic categories: "old people" and "young people." These brain patterns were elicited by semantically congruent audiovisual, semantically incongruent audiovisual, unimodal visual, and unimodal auditory stimuli belonging to the two semantic categories. We calculated the reproducibility index, which measures the similarity of the patterns within the same category. We also decoded the semantic categories from these brain patterns. The decoding accuracy reflects the discriminability of the brain patterns between the two categories. The results showed that both the reproducibility index of brain patterns and the decoding accuracy were significantly higher for semantically congruent audiovisual stimuli than for unimodal visual and unimodal auditory stimuli, while the semantically incongruent stimuli did not elicit brain patterns with a significantly higher reproducibility index or decoding accuracy. Thus, the semantically congruent audiovisual stimuli enhanced the within-class reproducibility and between-class discriminability of brain patterns, facilitating the neural representation of semantic categories or concepts. Furthermore, we analyzed the brain activity in the superior temporal sulcus and middle temporal gyrus (STS/MTG). The strength of the fMRI signal and the reproducibility index were enhanced by the semantically congruent audiovisual stimuli. Our results support the use of the reproducibility index as a potential tool to supplement the fMRI signal amplitude for evaluating multimodal integration.
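    The two quantities the study relies on, a within-category reproducibility index and a between-category decoding accuracy, can be sketched as a mean pairwise pattern correlation and a cross-validated linear classifier, respectively. The simulated voxel patterns below are illustrative; this is not the authors' pattern search pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 40, 200

# Simulated single-trial activity patterns for the two semantic categories
# ("old people" / "young people"); all numbers are purely illustrative.
proto_old = rng.normal(0, 1, n_voxels)
proto_young = rng.normal(0, 1, n_voxels)
X_old = proto_old + rng.normal(0, 1.5, (n_trials, n_voxels))
X_young = proto_young + rng.normal(0, 1.5, (n_trials, n_voxels))

def reproducibility_index(patterns):
    """Mean pairwise Pearson correlation of the patterns within one category."""
    c = np.corrcoef(patterns)
    return c[np.triu_indices_from(c, k=1)].mean()

# Between-category discriminability: cross-validated decoding accuracy.
X = np.vstack([X_old, X_young])
y = np.array([0] * n_trials + [1] * n_trials)
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()

print(f"reproducibility (old): {reproducibility_index(X_old):.3f}")
print(f"reproducibility (young): {reproducibility_index(X_young):.3f}")
print(f"decoding accuracy: {acc:.3f}")
```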

  10. Rate turnover in mechano-catalytic coupling: A model and its microscopic origin

    Energy Technology Data Exchange (ETDEWEB)

    Roy, Mahua; Grazioli, Gianmarc; Andricioaei, Ioan, E-mail: andricio@uci.edu [Department of Chemistry, University of California, Irvine, California 92697 (United States)

    2015-07-28

    A novel aspect in the area of mechano-chemistry concerns the effect of external forces on enzyme activity, i.e., the existence of mechano-catalytic coupling. Recent experiments on enzyme-catalyzed disulphide bond reduction in proteins under the effect of a force applied on the termini of the protein substrate reveal an unexpected biphasic force dependence for the bond cleavage rate. Here, using atomistic molecular dynamics simulations combined with Smoluchowski theory, we propose a model for this behavior. For a broad range of forces and systems, the model reproduces the experimentally observed rates by solving a reaction-diffusion equation for a “protein coordinate” diffusing in a force-dependent effective potential. The atomistic simulations are used to compute, from first principles, the parameters of the model via a quasiharmonic analysis. Additionally, the simulations are also used to provide details about the microscopic degrees of freedom that are important for the underlying mechano-catalysis.
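    The rate calculation described above can be illustrated with the standard mean-first-passage-time expression for one-dimensional Smoluchowski dynamics in a force-tilted potential. The double-well shape, diffusion coefficient, and force values below are placeholders rather than the potential of mean force extracted from the simulations, so this sketch only shows the machinery, not the biphasic behavior itself.

```python
import numpy as np

kT = 4.11          # thermal energy at ~300 K, in pN*nm
D = 1.0e5          # diffusion coefficient of the coordinate, nm^2/s (assumed)

def potential(x, force):
    """Illustrative tilted double-well potential (pN*nm); x in nm, force in pN.
    This is a stand-in shape, not the paper's fitted effective potential."""
    return 20.0 * (x**2 - 1.0) ** 2 - force * x

def escape_rate(force, a=-1.5, b=1.5, n=4000):
    """k(F) = 1/MFPT for 1-D Smoluchowski dynamics from a reflecting boundary
    at a to an absorbing boundary at b:
        tau = (1/D) * int_a^b dy exp(+U(y)/kT) int_a^y dz exp(-U(z)/kT)."""
    x = np.linspace(a, b, n)
    dx = x[1] - x[0]
    up = np.exp(potential(x, force) / kT)
    down = np.exp(-potential(x, force) / kT)
    inner = np.cumsum(down) * dx            # running integral of exp(-U/kT)
    tau = np.sum(up * inner) * dx / D
    return 1.0 / tau

for f in (0.0, 5.0, 10.0, 20.0):
    print(f"F = {f:5.1f} pN  ->  k = {escape_rate(f):.3e} 1/s")
```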

  11. Rate turnover in mechano-catalytic coupling: A model and its microscopic origin

    International Nuclear Information System (INIS)

    Roy, Mahua; Grazioli, Gianmarc; Andricioaei, Ioan

    2015-01-01

    A novel aspect in the area of mechano-chemistry concerns the effect of external forces on enzyme activity, i.e., the existence of mechano-catalytic coupling. Recent experiments on enzyme-catalyzed disulphide bond reduction in proteins under the effect of a force applied on the termini of the protein substrate reveal an unexpected biphasic force dependence for the bond cleavage rate. Here, using atomistic molecular dynamics simulations combined with Smoluchowski theory, we propose a model for this behavior. For a broad range of forces and systems, the model reproduces the experimentally observed rates by solving a reaction-diffusion equation for a “protein coordinate” diffusing in a force-dependent effective potential. The atomistic simulations are used to compute, from first principles, the parameters of the model via a quasiharmonic analysis. Additionally, the simulations are also used to provide details about the microscopic degrees of freedom that are important for the underlying mechano-catalysis

  12. SeqBox: RNAseq/ChIPseq reproducible analysis on a consumer game computer.

    Science.gov (United States)

    Beccuti, Marco; Cordero, Francesca; Arigoni, Maddalena; Panero, Riccardo; Amparore, Elvio G; Donatelli, Susanna; Calogero, Raffaele A

    2018-03-01

    Short-read sequencing technology has been used for more than a decade now. However, the analysis of RNAseq and ChIPseq data is still computationally demanding, and simple access to raw data does not guarantee reproducibility of results between laboratories. To address these two aspects, we developed SeqBox, a cheap, efficient and reproducible RNAseq/ChIPseq hardware/software solution based on the NUC6I7KYK mini-PC (an Intel consumer game computer with a fast processor and a high-performance SSD disk) and the Docker container platform. In SeqBox the analysis of RNAseq and ChIPseq data is supported by a friendly GUI, which gives scientists with or without scripting experience access to fast and reproducible analysis. Docker container images, the docker4seq package and the GUI are available at http://www.bioinformatica.unito.it/reproducibile.bioinformatics.html. Contact: beccuti@di.unito.it. Supplementary data are available at Bioinformatics online.

  13. Evaluating the reproducibility of environmental radioactivity monitoring data through replicate sample analysis

    International Nuclear Information System (INIS)

    Lindeken, C.L.; White, J.H.; Silver, W.J.

    1978-01-01

    At the Lawrence Livermore Laboratory, about 10% of the sampling effort in the environmental monitoring program represents replicate sample collection. Replication of field samples was initiated as part of the quality assurance program for environmental monitoring to determine the reproducibility of environmental measurements. In the laboratory these replicates are processed along with routine samples. As all components of variance are included in the analysis of such field samples, comparison of the analytical data from replicate analyses provides a basis for estimating the overall reproducibility of the measurements. The replication study indicates that the reproducibility of environmental radioactivity monitoring data is subject to considerably more variability than is indicated by the accompanying counting errors. The data are also compared with analyses of duplicate aliquots from a well-mixed sample or with duplicate aliquots of samples with known radionuclide content. These comparisons show that most of the variability is associated with the collection and preparation of the sample rather than with the analytical procedures.
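    The comparison described above amounts to partitioning the observed variability between sampling (collection and preparation) and analysis, using field replicates and laboratory duplicates. A minimal sketch with simulated, paired measurements is given below; the relative error magnitudes are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sites = 30

# Simulated activity concentrations (arbitrary units): field replicates carry
# sampling + analytical error; lab duplicates of a well-mixed sample carry
# only analytical error.  All magnitudes are illustrative assumptions.
true_site = rng.lognormal(mean=1.0, sigma=0.3, size=n_sites)
sampling_sd, analytical_sd = 0.20, 0.05        # relative SDs (assumed)

def replicate_pair(base, rel_sd):
    """Return an (n, 2) array of paired measurements with relative SD rel_sd."""
    return base[:, None] * (1 + rng.normal(0, rel_sd, size=(base.size, 2)))

field = replicate_pair(true_site, np.hypot(sampling_sd, analytical_sd))
lab = replicate_pair(true_site, analytical_sd)

def rel_sd_from_pairs(pairs):
    """Relative SD estimated from paired replicates: sd(relative difference)/sqrt(2)."""
    diff = pairs[:, 0] - pairs[:, 1]
    return np.std(diff / pairs.mean(axis=1), ddof=1) / np.sqrt(2)

total = rel_sd_from_pairs(field)
analytical = rel_sd_from_pairs(lab)
sampling = np.sqrt(max(total**2 - analytical**2, 0.0))
print(f"total {total:.3f}, analytical {analytical:.3f}, sampling {sampling:.3f}")
```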

  14. The broad line region of AGN: Kinematics and physics

    Directory of Open Access Journals (Sweden)

    Popović L.Č.

    2006-01-01

    Full Text Available In this paper a discussion of the kinematics and physics of the Broad Line Region (BLR) is given. The possible physical conditions in the BLR and problems in the determination of the physical parameters (electron temperature and density) are considered. Moreover, we analyse the geometry of the BLR and the probability that (at least a fraction of) the radiation in the Broad Emission Lines (BELs) originates from a relativistic accretion disk.

  15. A how to guide to reproducible research

    OpenAIRE

    Whitaker, Kirstie

    2018-01-01

    This talk will discuss the perceived and actual barriers experienced by researchers attempting to do reproducible research, and give practical guidance on how they can be overcome. It will include suggestions on how to make your code and data available and usable for others (including a strong suggestion to document both clearly so you don't have to reply to lots of email questions from future users). Specifically it will include a brief guide to version control, collaboration and disseminati...

  16. Population Structure in the Model Grass Brachypodium distachyon Is Highly Correlated with Flowering Differences across Broad Geographic Areas

    Directory of Open Access Journals (Sweden)

    Ludmila Tyler

    2016-07-01

    Full Text Available The small, annual grass Brachypodium distachyon (L.) Beauv., a close relative of wheat (Triticum aestivum L.) and barley (Hordeum vulgare L.), is a powerful model system for cereals and bioenergy grasses. Genome-wide association studies (GWAS) of natural variation can elucidate the genetic basis of complex traits but have so far been limited in B. distachyon by the lack of large numbers of well-characterized and sufficiently diverse accessions. Here, we report on genotyping-by-sequencing (GBS) of 84 B. distachyon, seven B. stacei, and three B. hybridum accessions with diverse geographic origins including Albania, Armenia, Georgia, Italy, Spain, and Turkey. Over 90,000 high-quality single-nucleotide polymorphisms (SNPs) distributed across the Bd21 reference genome were identified. Our results confirm the hybrid nature of the B. hybridum genome, which appears as a mosaic of B. distachyon-like and B. stacei-like sequences. Analysis of more than 50,000 SNPs for the B. distachyon accessions revealed three distinct, genetically defined populations. Surprisingly, these genomic profiles are associated with differences in flowering time rather than with broad geographic origin. High levels of differentiation in loci associated with floral development support the differences in flowering phenology between populations. Genome-wide association studies combining genotypic and phenotypic data also suggest the presence of one or more photoperiodism, circadian clock, and vernalization genes in loci associated with flowering-time variation within populations. Our characterization elucidates genes underlying population differences, expands the germplasm resources available for Brachypodium, and illustrates the feasibility and limitations of GWAS in this model grass.
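    The population-structure analysis described above can be sketched as a principal component analysis of a SNP genotype matrix, clustering in PC space, and a test of whether the genetically defined clusters differ in flowering time. The simulated genotypes, divergence level, and flowering-time means below are illustrative, not the GBS data.

```python
import numpy as np
from scipy.stats import f_oneway
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_per_pop, n_snps = 28, 2000

# Simulate three diverged populations (0/1/2 genotype calls) with different
# mean flowering times (days); all numbers are illustrative assumptions.
base_freqs = rng.uniform(0.05, 0.95, n_snps)
genotypes, flowering = [], []
for fl_mean in (25.0, 40.0, 60.0):
    p = np.clip(base_freqs + rng.normal(0, 0.15, n_snps), 0.01, 0.99)
    genotypes.append(rng.binomial(2, p, size=(n_per_pop, n_snps)))
    flowering.append(rng.normal(fl_mean, 4.0, n_per_pop))
X = np.vstack(genotypes).astype(float)
flowering = np.concatenate(flowering)

# Population structure from the SNP matrix, then clusters in PC space.
pcs = PCA(n_components=2).fit_transform(X - X.mean(axis=0))
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)

# Do the genetically defined clusters differ in flowering time?
groups = [flowering[clusters == k] for k in range(3)]
stat, pval = f_oneway(*groups)
print(f"flowering time vs. cluster: F = {stat:.1f}, p = {pval:.2e}")
```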

  17. Enhancing reproducibility in scientific computing: Metrics and registry for Singularity containers

    Science.gov (United States)

    Prybol, Cameron J.; Kurtzer, Gregory M.

    2017-01-01

    Here we present Singularity Hub, a framework to build and deploy Singularity containers for mobility of compute, and the singularity-python software with novel metrics for assessing reproducibility of such containers. Singularity containers make it possible for scientists and developers to package reproducible software, and Singularity Hub adds automation to this workflow by building, capturing metadata for, visualizing, and serving containers programmatically. Our novel metrics, based on custom filters of content hashes of container contents, allow for comparison of an entire container, including operating system, custom software, and metadata. First we will review Singularity Hub’s primary use cases and how the infrastructure has been designed to support modern, common workflows. Next, we conduct three analyses to demonstrate build consistency, reproducibility metric and performance and interpretability, and potential for discovery. This is the first effort to demonstrate a rigorous assessment of measurable similarity between containers and operating systems. We provide these capabilities within Singularity Hub, as well as the source software singularity-python that provides the underlying functionality. Singularity Hub is available at https://singularity-hub.org, and we are excited to provide it as an openly available platform for building, and deploying scientific containers. PMID:29186161
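    The content-hash comparison idea can be sketched as hashing every file in two unpacked container trees, filtering out volatile paths, and scoring the overlap of (path, hash) pairs. This is a conceptual illustration only, not the singularity-python implementation; the skip_prefixes filter, the Jaccard-style score, and the example paths are assumptions.

```python
import hashlib
from pathlib import Path

def content_hashes(root, skip_prefixes=("var/", "tmp/")):
    """Map relative file path -> SHA-256 of contents, skipping volatile paths
    (a stand-in for the 'custom filter' idea; the prefixes are assumptions)."""
    hashes = {}
    root = Path(root)
    for path in root.rglob("*"):
        rel = path.relative_to(root).as_posix()
        if path.is_file() and not rel.startswith(skip_prefixes):
            hashes[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return hashes

def similarity(tree_a, tree_b):
    """Jaccard-style score over (path, content-hash) pairs of two trees."""
    a = set(content_hashes(tree_a).items())
    b = set(content_hashes(tree_b).items())
    return len(a & b) / len(a | b) if a | b else 1.0

# Usage (hypothetical paths to two unpacked container root filesystems):
# print(similarity("/tmp/container_a_rootfs", "/tmp/container_b_rootfs"))
```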

  18. Reproducibility of Ultrasound and Magnetic Resonance Imaging Measurements of Tendon Size

    International Nuclear Information System (INIS)

    Brushoej, C.; Henriksen, B.M.; Albrecht-Beste, E.; Hoelmich, P.; Larsen, K.; Bachmann Nielsen, M.

    2006-01-01

    Purpose: To investigate the intra- and inter-tester reproducibility of measurements of the Achilles tendon, tibialis anterior tendon, and tibialis posterior tendon in football players using ultrasound (US) and magnetic resonance imaging (MRI). Material and Methods: Eleven asymptomatic football players were examined. Using a standardized US scanning protocol, the tendons were examined by two observers with US for thickness, width, and cross-sectional area. One observer conducted the procedure twice. The subjects also underwent an MRI examination, and the assessment of tendon size was conducted twice by two observers. Results: The best reproducibility, judged by the coefficient of variation (CV) and 95% confidence interval, was determined for the Achilles tendon on both US and MRI. The variability of US measurements of the tibialis anterior and tibialis posterior tendons was less than that of MRI. In 12 out of 18 measurements there were systematic differences between observers, as judged by a one-sided F-test. Conclusion: The reproducibility of measurements of the three tendons was limited. Precaution should be taken when looking for minor quantitative changes, i.e., training-induced hypertrophy, and when doing so, the Achilles tendon should be used.

  19. Enhancing reproducibility in scientific computing: Metrics and registry for Singularity containers.

    Directory of Open Access Journals (Sweden)

    Vanessa V Sochat

    Full Text Available Here we present Singularity Hub, a framework to build and deploy Singularity containers for mobility of compute, and the singularity-python software with novel metrics for assessing reproducibility of such containers. Singularity containers make it possible for scientists and developers to package reproducible software, and Singularity Hub adds automation to this workflow by building, capturing metadata for, visualizing, and serving containers programmatically. Our novel metrics, based on custom filters of content hashes of container contents, allow for comparison of an entire container, including operating system, custom software, and metadata. First we will review Singularity Hub's primary use cases and how the infrastructure has been designed to support modern, common workflows. Next, we conduct three analyses to demonstrate build consistency, reproducibility metric and performance and interpretability, and potential for discovery. This is the first effort to demonstrate a rigorous assessment of measurable similarity between containers and operating systems. We provide these capabilities within Singularity Hub, as well as the source software singularity-python that provides the underlying functionality. Singularity Hub is available at https://singularity-hub.org, and we are excited to provide it as an openly available platform for building, and deploying scientific containers.

  20. Shareholder, stakeholder-owner or broad stakeholder maximization

    DEFF Research Database (Denmark)

    Mygind, Niels

    2004-01-01

    With reference to the discussion about shareholder versus stakeholder maximization it is argued that the normal type of maximization is in fact stakeholder-owner maximization. This means maximization of the sum of the value of the shares and stakeholder benefits belonging to the dominating...... including the shareholders of a company. Although it may be the ultimate goal for Corporate Social Responsibility to achieve this kind of maximization, broad stakeholder maximization is quite difficult to give a precise definition. There is no one-dimensional measure to add different stakeholder benefits...... not traded on the market, and therefore there is no possibility for practical application. Broad stakeholder maximization instead in practical applications becomes satisfying certain stakeholder demands, so that the practical application will be stakeholder-owner maximization under constraints defined