WorldWideScience

Sample records for model reproduces broad

  1. The diverse broad-band light-curves of Swift GRBs reproduced with the cannonball model

    CERN Document Server

    Dado, Shlomo; De Rújula, A

    2009-01-01

    Two radiation mechanisms, inverse Compton scattering (ICS) and synchrotron radiation (SR), suffice within the cannonball (CB) model of long gamma ray bursts (LGRBs) and X-ray flashes (XRFs) to provide a very simple and accurate description of their observed prompt emission and afterglows. Simple as they are, the two mechanisms and the burst environment generate the rich structure of the light curves at all frequencies and times. This is demonstrated for 33 selected Swift LGRBs and XRFs, which are well sampled from early time until late time and well represent the entire diversity of the broad-band light curves of Swift LGRBs and XRFs. Their prompt gamma-ray and X-ray emission is dominated by ICS of glory light. During their fast decline phase, ICS is taken over by SR, which dominates their broad-band afterglow. The pulse shape and spectral evolution of the gamma-ray peaks and the early-time X-ray flares, and even the delayed optical 'humps' in XRFs, are correctly predicted. The canonical and non-canonical X-ra...

  2. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators, (2) shared computational resources, (3) declarative model descriptors, ontologies, and standardized annotations, and (4) model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  3. Reproducing the hierarchy of disorder for Morpho-inspired, broad-angle color reflection

    DEFF Research Database (Denmark)

    Song, Bokwang; Johansen, Villads Egede; Sigmund, Ole

    2017-01-01

    on the positional disorder among the identical, multilayered ridges as the critical factor for producing angular independent color. Realizing such positional disorder of identical nanostructures is difficult, which in turn has limited experimental verification of different physical mechanisms that have been proposed. In this paper, we suggest an alternative model of inter-structural disorder that can achieve the same broad-angle color reflection, and is applicable to wafer-scale fabrication using conventional thin film technologies. Fabrication of a thin film that produces pure, stable blue across a viewing angle of more than 120° is demonstrated, together with a robust, conformal color coating.

  4. A reproducible canine model of esophageal varices.

    Science.gov (United States)

    Jensen, D M; Machicado, G A; Tapia, J I; Kauffman, G; Franco, P; Beilin, D

    1983-03-01

    One of the most promising nonoperative techniques for control of variceal hemorrhage is sclerosis via the fiberoptic endoscope. Many questions remain, however, about sclerosing agents, guidelines for effective use, and limitations of endoscopic techniques. A reproducible large animal model of esophageal varices would facilitate the critical evaluation of techniques for variceal hemostasis or sclerosis. Our purpose was to develop a large animal model of esophageal varices. Studies in pigs and dogs are described which led to the development of a reproducible canine model of esophageal varices. For the final model, mongrel dogs had laparotomy, side-to-side portacaval shunt, inferior vena cava ligation, placement of an ameroid constrictor around the portal vein, and liver biopsy. The mean (± SE) pre- and postshunt portal pressure increased significantly from 12 ± 0.4 to 23 ± 1 cm saline. Weekly endoscopies were performed to grade the varix size. Two-thirds of animals developed medium or large sized esophageal varices after the first operation. Three to six weeks later, a second laparotomy with complete ligation of the portal vein and liver biopsy were performed in animals with varices (one-third of the animals). All dogs developed esophageal varices and abdominal wall collateral veins of variable size 3-6 wk after the first operation. After the second operation, the varices became larger. Shunting of blood through esophageal varices via splenic and gastric veins was demonstrated by angiography. Sequential liver biopsies were normal. There was no morbidity or mortality. Ascites, encephalopathy, or spontaneous variceal bleeding did not occur. We have documented the lack of size change and the persistence of medium to large esophageal varices and abdominal collateral veins in all animals followed for more than 6 mo. Variceal bleeding could be induced by venipuncture for testing endoscopic hemostatic and sclerosis methods. We suggest other potential uses of this

  5. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in the microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using the commercial software STAR-CCM+, both with constant and randomly varying injection rates. Stochastic simulations are able to capture the variability in the experiments associated with the varying pump injection rate.
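
    The fluctuating boundary condition described above can be sketched as a nominal injection rate with random multiplicative noise; the rates and fluctuation level below are hypothetical, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def injection_rate_series(q_nominal, rel_fluctuation, n_steps):
    """Time series of injection rates with multiplicative Gaussian noise
    (a stand-in for syringe-pump fluctuations)."""
    return q_nominal * rng.normal(loc=1.0, scale=rel_fluctuation, size=n_steps)

# Six nominally identical experiments, each with its own fluctuation history.
runs = [injection_rate_series(1.0e-9, 0.05, 1000) for _ in range(6)]  # m^3/s
means = [float(r.mean()) for r in runs]
spread = max(means) - min(means)  # run-to-run variability from the pump alone
```

    Even with identical geometry, each realization sees a slightly different injection history, which is the mechanism the stochastic simulations use to reproduce experiment-to-experiment scatter.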

  6. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

    Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  7. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
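
    The five classes of digital objects enumerated above lend themselves to a small machine-readable manifest. The sketch below is illustrative only; the file names, repository URL, and dependency pins are hypothetical, and the structure is not HydroShare's actual resource metadata schema.

```python
import json

# Hypothetical manifest covering the five tracked digital objects:
# raw data, processing scripts, model inputs, model results, model code.
manifest = {
    "raw_data": ["gauge_flows_raw.csv"],
    "processing_scripts": ["clean_gauge_data.py"],
    "model_inputs": ["forcing.nc", "parameters.json"],
    "model_results": ["streamflow_simulated.nc"],
    "model_code": {
        "repository": "https://example.org/summa-study.git",  # placeholder
        "dependencies": ["summa==3.0", "netCDF4==1.6"],       # illustrative pins
    },
}
manifest_json = json.dumps(manifest, indent=2)  # shareable alongside the resource
```

    Publishing such a manifest with the resource lets a third party verify that all five object classes are present before attempting to replicate a run.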

  8. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project: the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model, we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
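
    The checksums-on-output idea above can be sketched in a few lines: record a digest of each experiment output at commit time, then re-verify after code or input-data updates to detect answer changes. The diagnostic strings below are made up for illustration.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Digest used to detect any bitwise change in an experiment output."""
    return hashlib.sha256(data).hexdigest()

baseline = sha256_of(b"sst_mean=18.304211\n")       # recorded with the experiment
rerun_ok = sha256_of(b"sst_mean=18.304211\n")       # bitwise-identical rerun
rerun_changed = sha256_of(b"sst_mean=18.304209\n")  # solution drifted after an update

solutions_match = (baseline == rerun_ok)        # reproducible rerun
solution_drifted = (baseline != rerun_changed)  # flags the change for investigation
```

    Versioning the digests with the run-time configuration, as the abstract describes, turns "did the answers change?" into a cheap, automatable comparison.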

  9. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    Science.gov (United States)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed aimed at reproducing various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes for peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers. These latter were reproduced in the models by silicone. The sand forming the models has been previously mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The characteristic components of magnetization isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms
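
    The rotation estimate described above reduces, in its simplest form, to the angular difference between a sample's characteristic declination and the known pre-deformation (applied-field) declination. The sketch below uses that textbook convention with made-up angles, not the study's measured data.

```python
def vertical_axis_rotation(sample_declination, reference_declination):
    """Vertical-axis rotation in degrees, wrapped to (-180, 180].
    Positive = clockwise when viewed from above (declination convention)."""
    rotation = (sample_declination - reference_declination) % 360.0
    if rotation > 180.0:
        rotation -= 360.0
    return rotation

cw = vertical_axis_rotation(30.0, 350.0)   # 40° clockwise rotation
ccw = vertical_axis_rotation(350.0, 30.0)  # 40° counterclockwise rotation
```

    Wrapping across the 0°/360° boundary matters: without it, a sample at 30° against a reference at 350° would wrongly read as a 320° rotation.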

  10. Can global chemistry-climate models reproduce air quality extremes?

    Science.gov (United States)

    Schnell, J.; Prather, M. J.; Holmes, C. D.

    2013-12-01

    We identify and characterize extreme ozone pollution episodes over the USA and EU through a novel analysis of ten years (2000-2010) of surface ozone measurements. An optimal interpolation scheme is developed to create grid-cell averaged values of surface ozone that can be compared with gridded model simulations. It also allows a comparison of two non-coincident observational networks in the EU. The scheme incorporates techniques borrowed from inverse distance weighting and Kriging. It uses all representative observational site data while still recognizing the heterogeneity of surface ozone. Individual, grid-cell level events are identified as exceedances of a historical percentile (the 10 worst days in a year, i.e., the 97.3rd percentile). A clustering algorithm is then used to construct the ozone episodes from the individual events. We then test the skill of the high-resolution (100 km) two-year (2005-2006) hindcast from the UCI global chemistry transport model in reproducing the events/episodes identified in the observations using the same identification criteria. Although the UCI CTM has substantial biases in surface ozone, we find that it has considerable skill in reproducing both individual grid-cell level extreme events and their connectedness in space and time, with an overall skill of 24% (32%) for the US (EU). The grid-cell level extreme ozone events in both the observations and the UCI CTM are found to occur mostly (~75%) in coherent, multi-day, connected episodes covering areas greater than 1000 × 1000 km. In addition, the UCI CTM has greater skill in reproducing these larger episodes. We conclude that even at relatively coarse resolution, global chemistry-climate models can be used to project major synoptic pollution episodes driven by large-scale climate and chemistry changes even with their known biases.
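
    The event definition above (a grid cell's 10 worst days per year, i.e., values above its own 97.3rd percentile) can be sketched directly; the synthetic ozone values below are illustrative, not observational data.

```python
import numpy as np

rng = np.random.default_rng(42)
daily_ozone = rng.gamma(shape=9.0, scale=5.0, size=365)  # synthetic daily ppb values

# Per-cell threshold: the 97.3rd percentile of that cell's own history,
# so roughly 10 days per year exceed it.
threshold = np.percentile(daily_ozone, 97.3)
extreme_days = np.flatnonzero(daily_ozone > threshold)  # indices of flagged days
```

    Defining extremes relative to each cell's own climatology (rather than a fixed ppb cutoff) is what makes the comparison robust to the model's mean-state ozone biases.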

  11. From alginate impressions to digital virtual models: accuracy and reproducibility.

    Science.gov (United States)

    Dalstra, Michel; Melsen, Birte

    2009-03-01

    To compare the accuracy and reproducibility of measurements performed on digital virtual models with those taken on plaster casts from models poured immediately after the impression was taken, the 'gold standard', and from plaster models poured following a 3-5 day shipping procedure of the alginate impression. Direct comparison of two measuring techniques. The study was conducted at the Department of Orthodontics, School of Dentistry, University of Aarhus, Denmark in 2006/2007. Twelve randomly selected orthodontic graduate students with informed consent. Three sets of alginate impressions were taken from the participants within 1 hour. Plaster models were poured immediately from two of the sets, while the third set was kept in transit in the mail for 3-5 days. Upon return a plaster model was poured as well. Finally digital models were made from the plaster models. A number of measurements were performed on the plaster casts with a digital calliper and on the corresponding digital models using the virtual measuring tool of the accompanying software. Afterwards these measurements were compared statistically. No statistical differences were found between the three sets of plaster models. The intra- and inter-observer variability are smaller for the measurements performed on the digital models. Sending alginate impressions by mail does not affect the quality and accuracy of plaster casts poured from them afterwards. Virtual measurements performed on digital models display less variability than the corresponding measurements performed with a calliper on the actual models.

  12. A reproducible oral microcosm biofilm model for testing dental materials.

    Science.gov (United States)

    Rudney, J D; Chen, R; Lenton, P; Li, J; Li, Y; Jones, R S; Reilly, C; Fok, A S; Aparicio, C

    2012-12-01

    Most studies of biofilm effects on dental materials use single-species biofilms, or consortia. Microcosm biofilms grown directly from saliva or plaque are much more diverse, but difficult to characterize. We used the Human Oral Microbial Identification Microarray (HOMIM) to validate a reproducible oral microcosm model. Saliva and dental plaque were collected from adults and children. Hydroxyapatite and dental composite discs were inoculated with either saliva or plaque, and microcosm biofilms were grown in a CDC biofilm reactor. In later experiments, the reactor was pulsed with sucrose. DNA from inoculums and microcosms was analysed by HOMIM for 272 species. Microcosms included about 60% of species from the original inoculum. Biofilms grown on hydroxyapatite and composites were extremely similar. Sucrose pulsing decreased diversity and pH, but increased the abundance of Streptococcus and Veillonella. Biofilms from the same donor, grown at different times, clustered together. This model produced reproducible microcosm biofilms that were representative of the oral microbiota. Sucrose induced changes associated with dental caries. This is the first use of HOMIM to validate an oral microcosm model that can be used to study the effects of complex biofilms on dental materials. © 2012 The Society for Applied Microbiology.
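
    The "about 60% of species from the original inoculum" comparison above is, at its core, a set-overlap calculation. The sketch below uses a handful of made-up taxon names; the real HOMIM panel probes 272 species.

```python
def retained_fraction(inoculum_species, microcosm_species):
    """Share of inoculum species detected again in the grown microcosm biofilm."""
    inoculum = set(inoculum_species)
    return len(inoculum & set(microcosm_species)) / len(inoculum)

inoculum = {"S. mutans", "S. sanguinis", "V. parvula", "F. nucleatum", "P. gingivalis"}
microcosm = {"S. mutans", "V. parvula", "F. nucleatum"}
fraction = retained_fraction(inoculum, microcosm)  # 3 of 5 species retained
```

    Computed per donor and per growth run, this fraction is one simple way to quantify how representative a microcosm remains of its inoculum.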

  13. Optimizing technology investments: a broad mission model approach

    Science.gov (United States)

    Shishko, R.

    2003-01-01

    A long-standing problem in NASA is how to allocate scarce technology development resources across advanced technologies in order to best support a large set of future potential missions. Within NASA, two orthogonal paradigms have received attention in recent years: the real-options approach and the broad mission model approach. This paper focuses on the latter.

  14. Can a coupled meteorology–chemistry model reproduce the ...

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  15. The substorm cycle as reproduced by global MHD models

    Science.gov (United States)

    Gordeev, E.; Sergeev, V.; Tsyganenko, N.; Kuznetsova, M.; Rastätter, L.; Raeder, J.; Tóth, G.; Lyon, J.; Merkin, V.; Wiltberger, M.

    2017-01-01

    Recently, Gordeev et al. (2015) suggested a method to test global MHD models against statistical empirical data. They showed that four community-available global MHD models supported by the Community Coordinated Modeling Center (CCMC) produce a reasonable agreement with reality for those key parameters (the magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale equilibria in the outer magnetosphere. Based on the same set of simulation runs, here we investigate how the models reproduce the global loading-unloading cycle. We found that in terms of global magnetic flux transport, the three examined CCMC models display systematically different responses to an idealized 2 h north then 2 h south interplanetary magnetic field (IMF) Bz variation. The LFM model shows depressed return convection and a high loading rate during the growth phase as well as enhanced return convection and a high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. The two other models exhibit drastically different behavior. In the BATS-R-US model the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the Open GGCM a weak plasma sheet convection has comparable intensities during both the growth phase and the following slow unloading phase. We also demonstrate a potential technical problem in the publicly available simulations, related to postprocessing interpolation, which could affect the accuracy of magnetic field tracing and of other related procedures.

  16. A reproducible brain tumour model established from human glioblastoma biopsies

    Directory of Open Access Journals (Sweden)

    Li Xingang

    2009-12-01

    Background: Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. Methods: In this work, we collected data on growth kinetics from 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. Results: The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours, while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. Conclusions: In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  17. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves the consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  18. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    osteosarcoma model was shown to be feasible: the take rate was high, surgical mortality was negligible and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement and may be recommended for monitoring tumor growth in this model.

  19. Modeling of optical wireless scattering communication channels over broad spectra.

    Science.gov (United States)

    Liu, Weihao; Zou, Difan; Xu, Zhengyuan

    2015-03-01

    The air molecules and suspended aerosols help to build non-line-of-sight (NLOS) optical scattering communication links using carriers from near infrared to visible light and ultraviolet bands. This paper proposes channel models over such broad spectra. Wavelength dependent Rayleigh and Mie scattering and absorption coefficients of particles are analytically obtained first. They are applied to the ray tracing based Monte Carlo method, which models the photon scattering angle from the scatterer and propagation distance between two consecutive scatterers. Communication link path loss is studied under different operation conditions, including visibility, particle density, wavelength, and communication range. It is observed that optimum communication performances exist across the wavelength under specific atmospheric conditions. Infrared, visible light and ultraviolet bands show their respective features as conditions vary.
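
    A minimal Monte Carlo sketch related to the propagation-distance sampling above: free paths through a homogeneous medium with extinction coefficient k_e are drawn from an exponential distribution, and the fraction of photons whose path exceeds the link length recovers Beer-Lambert transmission. The coefficients below are illustrative, not fitted atmospheric values, and this omits the paper's angular scattering treatment.

```python
import math
import random

random.seed(1)
k_e = 0.5e-3    # extinction coefficient, 1/m (illustrative)
d = 1000.0      # link length, m
n_photons = 100_000

# A photon traverses the link without interaction if its sampled free path
# (exponential with rate k_e) exceeds the link length d.
survived = sum(1 for _ in range(n_photons)
               if random.expovariate(k_e) > d)
mc_transmission = survived / n_photons
analytic = math.exp(-k_e * d)  # Beer-Lambert prediction
```

    The full channel model additionally samples a scattering angle at each interaction (Rayleigh or Mie, depending on particle size) and traces the photon toward the receiver; this sketch only checks the path-length sampling against its closed-form limit.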

  20. How Modeling Standards, Software, and Initiatives Support Reproducibility in Systems Biology and Systems Medicine.

    Science.gov (United States)

    Waltemath, Dagmar; Wolkenhauer, Olaf

    2016-10-01

    Only reproducible results are of significance to science. The lack of suitable standards and appropriate support of standards in software tools has led to numerous publications with irreproducible results. Our objectives are to identify the key challenges of reproducible research and to highlight existing solutions. In this paper, we summarize problems concerning reproducibility in systems biology and systems medicine. We focus on initiatives, standards, and software tools that aim to improve the reproducibility of simulation studies. The long-term success of systems biology and systems medicine depends on trustworthy models and simulations. This requires openness to ensure reusability and transparency to enable reproducibility of results in these fields.

  1. Self-Consistent Dynamical Model of the Broad Line Region

    International Nuclear Information System (INIS)

    Czerny, Bozena; Li, Yan-Rong; Sredzinska, Justyna; Hryniewicz, Krzysztof; Panda, Swayam; Wildy, Conor; Karas, Vladimir

    2017-01-01

    We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.

  2. Self-Consistent Dynamical Model of the Broad Line Region

    Energy Technology Data Exchange (ETDEWEB)

    Czerny, Bozena [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Li, Yan-Rong [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, Beijing (China); Sredzinska, Justyna; Hryniewicz, Krzysztof [Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Panda, Swayam [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Wildy, Conor [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Karas, Vladimir, E-mail: bcz@cft.edu.pl [Astronomical Institute, Czech Academy of Sciences, Prague (Czech Republic)

    2017-06-22

    We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.

  3. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    Science.gov (United States)

    Turner, M. A.

    2015-12-01

    Because of a lack of centralized planning and of widely adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. With global warming placing water resources under increasing stress, reproducible hydrological modeling will become increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods, along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way, where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR project, WC-WAVE, comprising researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of

  4. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    Science.gov (United States)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first-order equation, which can be solved analytically. Moreover, we endeavor to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from medicinal plants.
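    The reduction described above, collapsing the kinetic system to a single first-order equation with a closed-form solution, can be illustrated with a minimal sketch. The rate constants, the plateau value, and the representation of antioxidant inhibition as a reduced effective rate are hypothetical, chosen only to show the form of the solution:

```python
import math

def peroxidation_product(t, p_inf, k, p0=0.0):
    """Closed-form solution of the first-order equation dP/dt = k*(P_inf - P):
    accumulation of peroxidation product P toward the plateau P_inf."""
    return p_inf - (p_inf - p0) * math.exp(-k * t)

k_control = 0.12     # hypothetical rate constant without antioxidant, 1/min
k_inhibited = 0.04   # hypothetical rate constant with antioxidant, 1/min
plateau = 10.0       # hypothetical plateau level, arbitrary units
for t in (0, 10, 30, 60):
    print(t,
          round(peroxidation_product(t, plateau, k_control), 2),
          round(peroxidation_product(t, plateau, k_inhibited), 2))
```

    Fitting k with and without an added extract then gives a single-number measure of inhibition intensity.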

  5. Using a 1-D model to reproduce diurnal SST signals

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob L.

    2014-01-01

    of measurement. A generally preferred approach to bridge the gap between in situ and remotely obtained measurements is through modelling of the upper ocean temperature. This ESA supported study focuses on the implementation of the 1 dimensional General Ocean Turbulence Model (GOTM), in order to resolve...... an additional parametrisation for the total outgoing long-wave radiation and a 9-band parametrisation for the light extinction. New parametrisations for the stability functions, associated with vertical mixing, have been included. GOTM is tested using experimental data from the Woods Hole Oceanographic...

  6. Reproducible Infection Model for Clostridium perfringens in Broiler Chickens

    DEFF Research Database (Denmark)

    Pedersen, Karl; Friis-Holm, Lotte Bjerrum; Heuer, Ole Eske

    2008-01-01

    Experiments were carried out to establish an infection and disease model for Clostridium perfringens in broiler chickens. Previous experiments had failed to induce disease and only a transient colonization with challenge strains had been obtained. In the present study, two series of experiments...

  7. COMBINE archive and OMEX format: One file to share all information to reproduce a modeling project

    NARCIS (Netherlands)

    Bergmann, Frank T.; Olivier, Brett G.; Soiland-Reyes, Stian

    2014-01-01

    Background: With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models,

  8. Particle acceleration model for the broad-band baseline spectrum of the Crab nebula

    Science.gov (United States)

    Fraschetti, F.; Pohl, M.

    2017-11-01

    We develop a simple one-zone model of the steady-state Crab nebula spectrum encompassing both the radio/soft X-ray and the GeV/multi-TeV observations. By solving the transport equation for GeV-TeV electrons injected at the wind termination shock as a log-parabola momentum distribution and evolved via energy losses, we determine analytically the resulting differential energy spectrum of photons. We find an impressive agreement with the observed spectrum of synchrotron emission, and the synchrotron self-Compton component reproduces the previously unexplained broad 200-GeV peak that matches the Fermi/Large Area Telescope (LAT) data beyond 1 GeV with the Major Atmospheric Gamma Imaging Cherenkov (MAGIC) data. We determine the parameters of the single log-parabola electron injection distribution, in contrast with multiple broken power-law electron spectra proposed in the literature. The resulting photon differential spectrum provides a natural interpretation of the deviation from power law customarily fitted with empirical multiple broken power laws. Our model can be applied to the radio-to-multi-TeV spectrum of a variety of astrophysical outflows, including pulsar wind nebulae and supernova remnants, as well as to interplanetary shocks.
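    The log-parabola injection spectrum named in the abstract has a standard parametric form in which the local spectral index steepens logarithmically with energy, which is why one curved function can replace piecewise broken power laws. A minimal sketch of that form; the parameter values are illustrative, not the authors' fit:

```python
import math

def log_parabola(E, N0, alpha, beta, E0=1.0):
    """Log-parabola spectrum: dN/dE = N0 * (E/E0)**-(alpha + beta*log10(E/E0))."""
    x = E / E0
    return N0 * x ** -(alpha + beta * math.log10(x))

def local_index(E, alpha, beta, E0=1.0, h=1e-5):
    """Numerical local spectral index -d(ln N)/d(ln E); analytically it equals
    alpha + 2*beta*log10(E/E0), so the spectrum steepens with energy."""
    lo, hi = E * (1.0 - h), E * (1.0 + h)
    return -((math.log(log_parabola(hi, 1.0, alpha, beta, E0))
              - math.log(log_parabola(lo, 1.0, alpha, beta, E0)))
             / (math.log(hi) - math.log(lo)))

alpha, beta = 2.2, 0.1   # hypothetical injection parameters
print(round(local_index(1.0, alpha, beta), 3))     # index at E0
print(round(local_index(100.0, alpha, beta), 3))   # steeper two decades above E0
```

    Because the index grows linearly in log10(E), a single log-parabola mimics what empirical multiple broken power laws approximate piecewise.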

  9. Voxel-level reproducibility assessment of modality independent elastography in a pre-clinical murine model

    Science.gov (United States)

    Flint, Katelyn M.; Weis, Jared A.; Yankeelov, Thomas E.; Miga, Michael I.

    2015-03-01

    Changes in tissue mechanical properties, measured non-invasively by elastography methods, have been shown to be an important diagnostic tool, particularly for cancer. Tissue elasticity information, tracked over the course of therapy, may be an important prognostic indicator of tumor response to treatment. While many elastography techniques exist, this work reports on the use of a novel form of elastography that uses image texture to reconstruct elastic property distributions in tissue (i.e., a modality independent elastography (MIE) method) within the context of a pre-clinical breast cancer system [1,2]. The elasticity results have previously shown good correlation with independent mechanical testing [1]. Furthermore, MIE has been successfully utilized to localize and characterize lesions in both phantom experiments and simulation experiments with clinical data [2,3]. However, the reproducibility of this method has not been characterized in previous work. The goal of this study is to evaluate voxel-level reproducibility of MIE in a pre-clinical model of breast cancer. Bland-Altman analysis of co-registered repeat MIE scans in this preliminary study showed a reproducibility index of 24.7% (scaled to a percent of maximum stiffness) at the voxel level. As opposed to many reports in the magnetic resonance elastography (MRE) literature that speak to reproducibility measures of the bulk organ, these results establish MIE reproducibility at the voxel level; i.e., the reproducibility of locally-defined mechanical property measurements throughout the tumor volume.
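    The voxel-level Bland-Altman analysis referred to above amounts to computing paired differences between co-registered repeat scans and scaling their limits of agreement to a percent of maximum stiffness. A sketch with synthetic stiffness data standing in for real MIE reconstructions; the noise level and stiffness range are assumptions:

```python
import numpy as np

def bland_altman_ri(scan1, scan2, max_stiffness):
    """Bland-Altman summary for repeat voxel-wise measurements: the bias
    (mean difference) and a reproducibility index defined here as the
    1.96*SD limit of agreement scaled to percent of maximum stiffness."""
    d = np.asarray(scan1, float) - np.asarray(scan2, float)
    bias = d.mean()
    loa = 1.96 * d.std(ddof=1)
    return bias, 100.0 * loa / max_stiffness

rng = np.random.default_rng(0)
truth = rng.uniform(1.0, 8.0, 5000)                # hypothetical voxel stiffness
scan1 = truth + rng.normal(0.0, 0.5, truth.size)   # repeat scan 1
scan2 = truth + rng.normal(0.0, 0.5, truth.size)   # repeat scan 2
bias, ri = bland_altman_ri(scan1, scan2, max_stiffness=8.0)
print(round(bias, 3), round(ri, 1))
```

    The index grows with the per-scan noise: for two scans with equal, independent voxel noise the SD of the differences is sqrt(2) times the single-scan noise.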

  10. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    Science.gov (United States)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, industry and AM users frequently ask how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper aims to investigate and evaluate the repeatability and reproducibility of the FDM process through a systematic approach to answer this frequently asked question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in the printed parts of the FDM process. After running the simulation and analyzing the data, the FDM process capability is evaluated, which helps industry better understand the performance of FDM technology.
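    A gage R&R study partitions observed variance into repeatability (equipment variation) and reproducibility (operator variation) via a crossed ANOVA. The following sketch shows the variance-component bookkeeping on synthetic data with known operator biases; the study sizes and bias values are hypothetical:

```python
import numpy as np

def gage_rr(data):
    """ANOVA gage R&R for a crossed study. data[p, o, r] holds the r-th repeat
    measurement of part p by operator o. Returns the repeatability (equipment)
    and reproducibility (operator + interaction) variance components."""
    p, o, r = data.shape
    grand = data.mean()
    part_m = data.mean(axis=(1, 2))
    op_m = data.mean(axis=(0, 2))
    po_m = data.mean(axis=2)
    ss_part = o * r * ((part_m - grand) ** 2).sum()
    ss_op = p * r * ((op_m - grand) ** 2).sum()
    ss_po = r * ((po_m - grand) ** 2).sum() - ss_part - ss_op
    ss_err = ((data - grand) ** 2).sum() - ss_part - ss_op - ss_po
    ms_op = ss_op / (o - 1)
    ms_po = ss_po / ((p - 1) * (o - 1))
    ms_err = ss_err / (p * o * (r - 1))
    repeatability = ms_err
    reproducibility = (max((ms_op - ms_po) / (p * r), 0.0) +
                       max((ms_po - ms_err) / r, 0.0))
    return repeatability, reproducibility

# synthetic check: 10 parts, 3 operators with fixed biases, 4 noise-free repeats
parts = np.arange(10, dtype=float)[:, None, None]
op_bias = np.array([-0.3, 0.0, 0.3])[None, :, None]
data = np.broadcast_to(parts + op_bias, (10, 3, 4)).copy()
rep, repro = gage_rr(data)
print(rep, repro)
```

    With zero replicate noise the repeatability term vanishes and the reproducibility estimate recovers the sample variance of the injected operator biases (0.09 here).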

  11. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  12. Reproducibility of the coil positioning in Nb$_3$Sn magnet models through magnetic measurements

    CERN Document Server

    Borgnolutti, F; Ferracin, P; Kashikhin, V V; Sabbi, G; Velev, G; Todesco, E; Zlobin, A V

    2009-01-01

    The random part of the integral field harmonics in a series of superconducting magnets has been used in the past to identify the reproducibility of the coil positioning. Using a magnetic model and a Monte Carlo approach, coil blocks are randomly moved, and the amplitude that best fits the magnetic measurements is interpreted as the reproducibility of the coil position. Previous values for r.m.s. coil displacements for Nb-Ti magnets range from 0.05 to 0.01 mm. In this paper, we use this approach to estimate the reproducibility of the coil position for Nb3Sn short models that have been built in the framework of the FNAL core program (HFDA dipoles) and of the LARP program (TQ quadrupoles). Our analysis shows that the Nb3Sn models manufactured in the past years correspond to r.m.s. coil displacements of at least 5 times what is found for the series production of a mature Nb-Ti technology. On the other hand, the variability of the field harmonics along the magnet axis shows that Nb3Sn magnets have already reached va...

  13. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    International Nuclear Information System (INIS)

    Tian, Jun-fang; Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao

    2012-01-01

    Velocity effect and critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown has been explained. Various congested patterns induced by the on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and of free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► Velocity effect is added into the average space gap cellular automaton model. ► The physics of traffic breakdown has been explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.
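    The average space gap model belongs to the family of stochastic traffic cellular automata. As a generic illustration of the update cycle such models share (acceleration, gap constraint, random slowdown, movement), here is a minimal Nagel-Schreckenberg sketch. This is not the authors' model, and all parameters (ring length, density, slowdown probability) are illustrative:

```python
import random

def nasch_step(pos, vel, L, vmax=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg cellular automaton on a
    ring of L cells. pos: sorted cell indices of the cars; vel: their speeds."""
    n = len(pos)
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i] - 1) % L   # empty cells ahead
        v = min(vel[i] + 1, vmax)                   # accelerate
        v = min(v, gap)                             # avoid collision
        if v > 0 and rng.random() < p_slow:         # random slowdown
            v -= 1
        new_vel.append(v)
    new_pos = [(pos[i] + new_vel[i]) % L for i in range(n)]
    order = sorted(range(n), key=lambda i: new_pos[i])   # keep ring order
    return [new_pos[i] for i in order], [new_vel[i] for i in order]

random.seed(2)
L, n = 100, 30
pos, vel = list(range(0, 2 * n, 2)), [0] * n
for _ in range(200):
    pos, vel = nasch_step(pos, vel, L)
flow = sum(vel) / L     # flow = density * mean speed
print(round(flow, 2))
```

    Three-phase models such as the one in the abstract refine exactly these acceleration and slowdown rules, e.g. making the time delay in acceleration at a jam front explicit.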

  14. NRFixer: Sentiment Based Model for Predicting the Fixability of Non-Reproducible Bugs

    Directory of Open Access Journals (Sweden)

    Anjali Goyal

    2017-08-01

    Software maintenance is an essential step in the software development life cycle. Nowadays, software companies spend approximately 45% of total cost on maintenance activities. Large software projects maintain bug repositories to collect, organize and resolve bug reports. Sometimes it is difficult to reproduce the reported bug with the information present in a bug report, and thus the bug is marked with resolution non-reproducible (NR). When NR bugs are reconsidered, a few of them might get fixed (NR-to-fix), leaving the others with the same resolution (NR). To analyse the behaviour of developers towards NR-to-fix and NR bugs, sentiment analysis of NR bug report textual contents has been conducted. The sentiment analysis of bug reports shows that NR bugs' sentiments incline towards more negativity than those of reproducible bugs. Also, there is a noticeable opinion drift in the sentiments of NR-to-fix bug reports. Observations drawn from this analysis were an inspiration to develop a model that can judge the fixability of NR bugs. Thus a framework, NRFixer, which predicts the probability of NR bug fixation, is proposed. NRFixer was evaluated with two dimensions. The first dimension considers meta-fields of bug reports (model-1), and the other dimension additionally incorporates the sentiments of developers (model-2) for prediction. Both models were compared using various machine learning classifiers (Zero-R, naive Bayes, J48, random tree and random forest). The bug reports of the Firefox and Eclipse projects were used to test NRFixer. In the Firefox and Eclipse projects, the J48 and naive Bayes classifiers achieve the best prediction accuracy, respectively. It was observed that the inclusion of sentiments in the prediction model yields a rise in prediction accuracy ranging from 2 to 5% for various classifiers.

  15. Reproducing Sea-Ice Deformation Distributions With Viscous-Plastic Sea-Ice Models

    Science.gov (United States)

    Bouchat, A.; Tremblay, B.

    2016-02-01

    High resolution sea-ice dynamic models offer the potential to discriminate between sea-ice rheologies based on their ability to reproduce the satellite-derived deformation fields. Recent studies have shown that sea-ice viscous-plastic (VP) models do not reproduce the observed statistical properties of the strain rate distributions of the RADARSAT Geophysical Processor System (RGPS) deformation fields [1][2]. We use the elliptical VP rheology and compute the probability density functions (PDFs) of the simulated strain rate invariants (divergence and maximum shear) and compare them against the deformations obtained with the 3-day gridded products from RGPS. We find that the large shear deformations are well reproduced by the elliptical VP model, and the deformations do not follow a Gaussian distribution, as reported in Girard et al. [1][2]. On the other hand, we do find an overestimation of the shear in the range of mid-magnitude deformations in all of our VP simulations, tested with different spatial resolutions and numerical parameters. Runs with no internal stress (free drift) or with constant viscosity coefficients (Newtonian fluid) also show this overestimation. We trace this discrepancy back to the elliptical yield curve aspect ratio (e = 2) having too little shear strength, hence not offering enough resistance to the shear inherent in the wind forcing associated with synoptic weather systems. Experiments in which we simply increase the shear resistance of the ice by modifying the ellipse ratio confirm the need for a rheology with increased shear strength. [1] Girard et al. (2009), Evaluation of high-resolution sea ice models [...], Journal of Geophysical Research, 114. [2] Girard et al. (2011), A new modeling framework for sea-ice mechanics [...], Annals of Glaciology, 57, 123-132.

  16. Evaluation of fecal mRNA reproducibility via a marginal transformed mixture modeling approach

    Directory of Open Access Journals (Sweden)

    Davidson Laurie A

    2010-01-01

    Background: Developing and evaluating new technology that enables researchers to recover gene-expression levels of colonic cells from fecal samples could be key to a non-invasive screening tool for early detection of colon cancer. The current study, to the best of our knowledge, is the first to investigate and report the reproducibility of fecal microarray data. Using the intraclass correlation coefficient (ICC) as a measure of reproducibility and the preliminary analysis of fecal and mucosal data, we assessed the reliability of mixture density estimation and the reproducibility of fecal microarray data. Using Monte Carlo-based methods, we explored whether ICC values should be modeled as a beta-mixture or transformed first and fitted with a normal-mixture. We used outcomes from bootstrapped goodness-of-fit tests to determine which approach is less sensitive toward potential violation of distributional assumptions. Results: The graphical examination of both the distributions of ICC and probit-transformed ICC (PT-ICC) clearly shows that there are two components in the distributions. For ICC measurements, which are between 0 and 1, the practice in the literature has been to assume that the data points are from a beta-mixture distribution. Nevertheless, in our study we show that the use of a normal-mixture modeling approach on PT-ICC could provide superior performance. Conclusions: When modeling ICC values of gene expression levels, using a mixture of normals in the probit-transformed (PT) scale is less sensitive toward model mis-specification than using a mixture of betas. We show that a biased conclusion could be made if we follow the traditional approach and model the two sets of ICC values using the mixture of betas directly. The problematic estimation arises from the sensitivity of beta-mixtures toward model mis-specification, particularly when there are observations in the neighborhood of the boundary points, 0 or 1.
    Since beta-mixture modeling
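    The comparison drawn above, fitting a normal mixture to probit-transformed ICC values instead of a beta mixture to the raw values, can be sketched with a two-component EM fit. The cluster locations, spreads and sample sizes below are synthetic assumptions used only to demonstrate the transform-then-fit workflow:

```python
import math
import random

def probit(p):
    """Inverse standard-normal CDF by bisection on the erf-based CDF (stdlib only)."""
    lo, hi = -8.0, 8.0
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if 0.5 * (1.0 + math.erf(mid / math.sqrt(2.0))) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def em_two_normals(x, iters=200):
    """EM for a two-component univariate normal mixture."""
    x = sorted(x)
    n = len(x)
    mu = [x[n // 4], x[3 * n // 4]]      # quartile initialization
    sd = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        resp = []
        for xi in x:
            d = [w[k] / sd[k] * math.exp(-0.5 * ((xi - mu[k]) / sd[k]) ** 2)
                 for k in range(2)]
            s = d[0] + d[1] + 1e-300     # guard against underflow
            resp.append([d[0] / s, d[1] / s])
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / n
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var = sum(r[k] * (xi - mu[k]) ** 2 for r, xi in zip(resp, x)) / nk
            sd[k] = max(math.sqrt(var), 1e-3)
    return w, mu, sd

random.seed(3)
clip = lambda v: min(max(v, 1e-4), 1.0 - 1e-4)
# hypothetical ICC values: a low- and a high-reproducibility component
icc = ([clip(random.gauss(0.2, 0.08)) for _ in range(300)] +
       [clip(random.gauss(0.8, 0.08)) for _ in range(300)])
pt_icc = [probit(v) for v in icc]
w, mu, sd = em_two_normals(pt_icc)
print(sorted(round(m, 2) for m in mu))
```

    Working in the probit scale moves observations away from the 0/1 boundary, which is exactly where the abstract reports beta-mixture estimation becoming unstable.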

  17. Reproducing the optical properties of fine desert dust aerosols using ensembles of simple model particles

    International Nuclear Information System (INIS)

    Kahnert, Michael

    2004-01-01

    Single scattering optical properties are calculated for a proxy of fine dust aerosols at a wavelength of 0.55 μm. Spherical and spheroidal model particles are employed to fit the aerosol optical properties and to retrieve information about the physical parameters characterising the aerosols. It is found that spherical particles are capable of reproducing the scalar optical properties and the forward peak of the phase function of the dust aerosols. The effective size parameter of the aerosol ensemble is retrieved with high accuracy by using spherical model particles. Significant improvements are achieved by using spheroidal model particles. The aerosol phase function and the other diagonal elements of the Stokes scattering matrix can be fitted with high accuracy, whereas the off-diagonal elements are poorly reproduced. More elongated prolate and more flattened oblate spheroids contribute disproportionately strongly to the optimised shape distribution of the model particles and appear to be particularly useful for achieving a good fit of the scattering matrix. However, the clear discrepancies between the shape distribution of the aerosols and the shape distribution of the spheroidal model particles suggest that the possibilities of extracting shape information from optical observations are rather limited

  18. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    Directory of Open Access Journals (Sweden)

    Marit Kramski

    The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bio-terroristic potential. Moreover, zoonotic infections with Cowpox (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after onset of symptoms, even when very low infectious viral doses of 5x10^2 pfu were applied intranasally. Infectious virus was demonstrated in blood, saliva and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation, mimicking the natural route of smallpox infection, led to reproducible infection. In vivo titration resulted in an MID50 (minimal monkey infectious dose 50%) of 8.3x10^2 pfu of calpox virus, which is approximately 10,000-fold lower than the MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs.
Furthermore, this model can help study mechanisms of OPV pathogenesis.

  19. Reproducing tailing in breakthrough curves: Are statistical models equally representative and predictive?

    Science.gov (United States)

    Pedretti, Daniele; Bianchi, Marco

    2018-03-01

    Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models both for the empirical fitting of these curves and for the prediction of transport using upscaling models based on best-fitted estimated parameters (e.g. the power law slope or exponent). The predictive capacity of power-law-based upscaling models can however be questioned due to the difficulty of linking model parameters with the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implication of statistical subsampling, which often renders power laws indistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulty of reconciling fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (1.5 < αPL < 4), whereas a power law with cutoff (PLCO) results in a nearly constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated to the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple mechanistic upscaling model based on the PLCO formulation is able to predict the ensemble of BTCs from the stochastic transport simulations without the need of any fitted parameters. The model embeds the constant αCO = 1 and relies on a stratified description of the transport mechanisms to estimate λ. The PL fails to
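    The PLCO form discussed above, c(t) proportional to t^-α exp(-λt), is linear in its parameters after taking logarithms, so a late-time tail can be fitted by ordinary least squares. A sketch on synthetic noiseless data; the generating parameter values are illustrative, not taken from the paper:

```python
import numpy as np

def fit_plco(t, c):
    """Fit c(t) = A * t**-alpha * exp(-lam * t) by ordinary least squares on
    log c = log A - alpha*log t - lam*t (linear in the three parameters)."""
    X = np.column_stack([np.ones_like(t), -np.log(t), -t])
    coef, *_ = np.linalg.lstsq(X, np.log(c), rcond=None)
    log_a, alpha, lam = coef
    return np.exp(log_a), alpha, lam

t = np.logspace(0.0, 3.0, 200)                 # late-time window, arbitrary units
A_true, alpha_true, lam_true = 2.0, 1.0, 2e-3  # hypothetical tail parameters
c = A_true * t ** -alpha_true * np.exp(-lam_true * t)
A, alpha, lam = fit_plco(t, c)
print(round(A, 3), round(alpha, 3), round(lam, 5))
```

    On noiseless data the regression recovers the generating parameters; applied to real BTC tails it returns the best-fitting (A, α, λ), with λ controlling the persistence of the tailing as described above.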

  20. [Reproducibility and repeatability of the determination of occlusal plane on digital dental models].

    Science.gov (United States)

    Qin, Yi-fei; Xu, Tian-min

    2015-06-18

    To assess the repeatability (intraobserver comparison) and reproducibility (interobserver comparison) of two different methods for establishing the occlusal plane on digital dental models. With Angle's classification as a stratification factor, 48 cases were randomly extracted from 806 that had integrated clinical data and had received orthodontic treatment from July 2004 to August 2008 in the Department of Orthodontics, Peking University School and Hospital of Stomatology. Post-treatment plaster casts of the 48 cases were scanned with a Roland LPX-1200 3D laser scanner to generate geometry data as research subjects. In a locally developed software package, one observer repeated the procedure 5 times at intervals of at least one week, localizing prescriptive landmarks on each digital model to establish a group of functional occlusal planes and a group of anatomic occlusal planes, while 6 observers established two other groups of functional and anatomic occlusal planes independently. Standard deviations of the dihedral angles of each group on each model were calculated and compared between the related groups. The models with the five largest standard deviations in each group were studied to explore possible factors that might influence the identification of the landmarks on the digital models. A significant difference in intraobserver variability between the functional occlusal plane and the anatomic occlusal plane was not detected (P>0.1), while a difference in interobserver variability was detected; the interobserver variability of the functional occlusal plane was 0.2° smaller than that of the anatomic occlusal plane. The functional occlusal plane's intraobserver and interobserver variability did not differ significantly (P>0.1), while the anatomic occlusal plane's intraobserver variability was significantly smaller than its interobserver variability. Both occlusal planes are suitable as a reference plane with equal repeatability.
    When several observers measure a large number of digital models, the functional occlusal plane is more reproducible than the anatomic occlusal plane.

  1. Reproducibility, reliability and validity of measurements obtained from Cecile3 digital models

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Watanabe-Kanno

    2009-09-01

    The aim of this study was to determine the reproducibility, reliability and validity of measurements in digital models compared to plaster models. Fifteen pairs of plaster models were obtained from orthodontic patients with permanent dentition before treatment. These were digitized to be evaluated with the program Cécile3 v2.554.2 beta. Two examiners measured, three times each, the mesiodistal width of all the teeth present, the intercanine, interpremolar and intermolar distances, and the overjet and overbite. The plaster models were measured using a digital vernier caliper. Student's t-test for paired samples and the intraclass correlation coefficient (ICC) were used for statistical analysis. The ICCs of the digital models were 0.84 ± 0.15 (intra-examiner) and 0.80 ± 0.19 (inter-examiner). The average mean difference of the digital models was 0.23 ± 0.14 and 0.24 ± 0.11 for each examiner, respectively. When the two types of measurements were compared, the values obtained from the digital models were lower than those obtained from the plaster models (p < 0.05), although the differences were considered clinically insignificant (differences < 0.1 mm). The Cécile digital models are a clinically acceptable alternative for use in Orthodontics.
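    The intraclass correlation coefficient used in studies like this one can be computed from a two-way ANOVA decomposition. A sketch of ICC(A,1) (two-way random effects, absolute agreement, single measurement) on synthetic two-examiner data; the tooth widths and measurement error level are hypothetical:

```python
import numpy as np

def icc_a1(m):
    """ICC(A,1): two-way random-effects model, absolute agreement, single
    measurement (McGraw & Wong form). m[i, j] = measurement of case i by rater j."""
    n, k = m.shape
    grand = m.mean()
    ms_rows = k * ((m.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((m.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = m - m.mean(axis=1, keepdims=True) - m.mean(axis=0, keepdims=True) + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k / n * (ms_cols - ms_err))

rng = np.random.default_rng(7)
truth = rng.normal(8.0, 1.0, 30)                          # hypothetical tooth widths, mm
widths = truth[:, None] + rng.normal(0.0, 0.1, (30, 2))   # two examiners, 0.1 mm error
print(round(icc_a1(widths), 3))
```

    With measurement error small relative to between-case variance, the ICC approaches 1, which is the regime the 0.84 and 0.80 values above sit near.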

  2. Reproducing the nonlinear dynamic behavior of a structured beam with a generalized continuum model

    Science.gov (United States)

    Vila, J.; Fernández-Sáez, J.; Zaera, R.

    2018-04-01

    In this paper we study the coupled axial-transverse nonlinear vibrations of a kind of one-dimensional structured solid by application of the so-called Inertia Gradient Nonlinear continuum model. To show the accuracy of this axiomatic model, previously proposed by the authors, its predictions are compared with numerical results from a previously defined finite discrete chain of lumped masses and springs, for several numbers of particles. A continualization of the discrete model equations based on Taylor series allowed us to set equivalent values of the mechanical properties in both the discrete and the axiomatic continuum models. Contrary to the classical continuum model, the inertia gradient nonlinear continuum model used herein is able to capture scale effects, which arise for modes in which the wavelength is comparable to the characteristic distance of the structured solid. The main conclusion of the work is that the proposed generalized continuum model captures the scale effects in both the linear and nonlinear regimes, adequately reproducing the behavior of the 1D nonlinear discrete model.

  3. Assessment of the reliability of reproducing two-dimensional resistivity models using an image processing technique.

    Science.gov (United States)

    Ishola, Kehinde S; Nawawi, Mohd Nm; Abdullah, Khiruddin; Sabri, Ali Idriss Aboubakar; Adiat, Kola Abdulnafiu

    2014-01-01

    This study attempts to combine the geophysical images obtained from three commonly used electrode configurations using an image processing technique, in order to assess their capability to reproduce two-dimensional (2-D) resistivity models. All the inverse resistivity models were processed using the PCI Geomatica software package, commonly used for remote sensing data sets. Preprocessing of the 2-D inverse models was carried out to facilitate further processing and statistical analyses. Four raster layers were created: three were used for the input images and the fourth for the output of the combined images. The data sets were merged using a basic statistical approach. The interpreted results show that all images resolved and reconstructed the essential features of the models. The accuracy of the images for the four geologic models was assessed using four criteria: the mean absolute error, the mean percentage absolute error, the resistivity values of the reconstructed blocks, and their displacements from the true models. Generally, the blocks of the images from the maximum approach give the smallest estimated errors. The displacement of the reconstructed blocks from the true blocks is also the smallest, and the reconstructed resistivities of the blocks are closer to those of the true blocks than with any other combination used. It is thus corroborated that combining inverse resistivity models yields more reliable and detailed information about the geologic models than using individual data sets.
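The two error criteria named above are straightforward to compute; a minimal sketch with assumed block resistivities (not the study's values):

```python
# Illustrative block resistivities (ohm-m); not the study's values.
true_res = [100.0, 250.0, 50.0, 400.0]   # true 2-D model blocks
recon_res = [110.0, 230.0, 55.0, 380.0]  # blocks reconstructed from combined images

n = len(true_res)
# Mean absolute error and mean percentage absolute error over the blocks:
mae = sum(abs(t - r) for t, r in zip(true_res, recon_res)) / n
mpae = 100.0 * sum(abs(t - r) / t for t, r in zip(true_res, recon_res)) / n

print(f"mean absolute error: {mae:.2f} ohm-m")
print(f"mean percentage absolute error: {mpae:.2f} %")
```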

  4. The substorm loading-unloading cycle as reproduced by community-available global MHD magnetospheric models

    Science.gov (United States)

    Gordeev, Evgeny; Sergeev, Victor; Tsyganenko, Nikolay; Kuznetsova, Maria; Rastaetter, Lutz; Raeder, Joachim; Toth, Gabor; Lyon, John; Merkin, Vyacheslav; Wiltberger, Michael

    2017-04-01

    In this study we investigate how well three community-available global MHD models, supported by the Community Coordinated Modeling Center (CCMC, NASA), reproduce global magnetospheric dynamics, including the loading-unloading substorm cycle. We found that, in terms of global magnetic flux transport, the CCMC models display systematically different responses to an idealized 2-hour north then 2-hour south IMF Bz variation. The LFM model shows depressed return convection in the tail plasma sheet and a high rate of magnetic flux loading into the lobes during the growth phase, as well as enhanced return convection and a high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. The BATS-R-US and OpenGGCM models exhibit drastically different behavior. In the BATS-R-US model, the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the OpenGGCM, a weak plasma sheet convection has comparable intensity during both the growth phase and the following slow unloading phase. Our study shows that different CCMC models, under the same solar wind conditions (north-to-south IMF variation), produce essentially different solutions in terms of global magnetospheric convection.

  5. Mouse Models of Diet-Induced Nonalcoholic Steatohepatitis Reproduce the Heterogeneity of the Human Disease

    Science.gov (United States)

    Machado, Mariana Verdelho; Michelotti, Gregory Alexander; Xie, Guanhua; de Almeida, Thiago Pereira; Boursier, Jerome; Bohnic, Brittany; Guy, Cynthia D.; Diehl, Anna Mae

    2015-01-01

    Background and aims Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely used mouse models: methionine-choline deficient (MCD) diet and Western diet. Methods Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. Results The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Conclusion Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of an MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH. PMID:26017539

  6. Mouse models of diet-induced nonalcoholic steatohepatitis reproduce the heterogeneity of the human disease.

    Directory of Open Access Journals (Sweden)

    Mariana Verdelho Machado

    Full Text Available Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely used mouse models: methionine-choline deficient (MCD) diet and Western diet. Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of an MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH.

  7. Circuit modeling of the electrical impedance: II. Normal subjects and system reproducibility

    International Nuclear Information System (INIS)

    Shiffman, C A; Rutkove, S B

    2013-01-01

    Part I of this series showed that the five-element circuit model accurately mimics impedances measured using multi-frequency electrical impedance myography (MFEIM), focusing on changes brought on by disease. This paper addresses two requirements which must be met if the method is to qualify for clinical use. First, the extracted parameters must be reproducible over long time periods such as those involved in the treatment of muscular disease, and second, differences amongst normal subjects should be attributable to known differences in the properties of healthy muscle. It applies the method to five muscle groups in 62 healthy subjects, closely following the procedure used earlier for the diseased subjects. Test–retest comparisons show that parameters are reproducible at levels from 6 to 16% (depending on the parameter) over time spans of up to 267 days, levels far below the changes occurring in serious disease. Also, variations with age, gender and muscle location are found to be consistent with established expectations for healthy muscle tissue. We conclude that the combination of MFEIM measurements and five-element circuit analysis genuinely reflects properties of muscle and is reliable enough to recommend its use in following neuromuscular disease. (paper)
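As a rough illustration of what a multi-frequency circuit-model description involves, the sketch below evaluates the impedance of an assumed five-element network (a series resistance plus two parallel R-C branches in series); the paper's actual topology and all parameter values here are assumptions, not taken from the study:

```python
import math

def z_parallel_rc(r, c, f):
    """Complex impedance of a resistor in parallel with a capacitor at f Hz."""
    zc = 1.0 / (2j * math.pi * f * c)
    return (r * zc) / (r + zc)

def z_model(f, rs=50.0, r1=200.0, c1=10e-9, r2=500.0, c2=100e-9):
    """Total impedance of the assumed five-element network at f Hz."""
    return rs + z_parallel_rc(r1, c1, f) + z_parallel_rc(r2, c2, f)

# Multi-frequency sweep, loosely in the spirit of MFEIM:
for f in (1e3, 1e4, 1e5, 1e6):
    z = z_model(f)
    phase = math.degrees(math.atan2(z.imag, z.real))
    print(f"{f:9.0f} Hz  |Z| = {abs(z):7.1f} ohm  phase = {phase:6.1f} deg")
```

Fitting the five parameters to measured spectra, and tracking them across visits, is what the test-retest comparison above quantifies.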

  8. Reproducibility analysis of measurements with a mechanical semiautomatic eye model for evaluation of intraocular lenses

    Science.gov (United States)

    Rank, Elisabet; Traxler, Lukas; Bayer, Natascha; Reutterer, Bernd; Lux, Kirsten; Drauschke, Andreas

    2014-03-01

    Mechanical eye models are used to validate ex vivo the optical quality of intraocular lenses (IOLs). The quality measurements and test instructions for IOLs are defined in ISO 11979-2. However, it has been noted in the literature that these test instructions can lead to inaccurate measurements for some modern IOL designs. The reproducibility of the alignment and measurement processes is presented, performed with a semiautomatic mechanical ex vivo eye model based on the optical properties published by Liou and Brennan, at a scale of 1:1. The cornea, the iris aperture and the IOL itself are separately changeable within the eye model. The adjustment of the IOL can be manipulated by automatic decentration and tilt of the IOL with reference to the optical axis of the whole system, which is defined by the line connecting the central point of the artificial cornea and the iris aperture. With the presented measurement setup, two quality criteria are measurable: the modulation transfer function (MTF) and the Strehl ratio. First, the reproducibility of the alignment process defining the initial conditions of lateral position and tilt with reference to the optical axis of the system is investigated. Furthermore, different IOL holders are tested with regard to stable holding of the IOL. The measurement is performed by a before-after comparison of the lens position using a typical decentration and tilt tolerance-analysis path. The modulation transfer function MTF and Strehl ratio S before and after this tolerance analysis are compared, and requirements for lens holder construction are deduced from the presented results.

  9. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    Science.gov (United States)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate, as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by comparison with the MIRA dataset.

  10. Why are models unable to reproduce multi-decadal trends in lower tropospheric baseline ozone levels?

    Science.gov (United States)

    Hu, L.; Liu, J.; Mickley, L. J.; Strahan, S. E.; Steenrod, S.

    2017-12-01

    Assessments of tropospheric ozone radiative forcing rely on accurate model simulations. Parrish et al. (2014) found that three chemistry-climate models (CCMs) overestimate present-day O3 mixing ratios and capture only 50% of the observed O3 increase over the last five decades at 12 baseline sites in the northern mid-latitudes, indicating large uncertainties in our understanding of the ozone trends and their implications for radiative forcing. Here we present comparisons of outputs from two chemical transport models (CTMs) - GEOS-Chem and the Global Modeling Initiative model - with O3 observations from the same sites and from the global ozonesonde network. Both CTMs are driven by reanalysis meteorological data (MERRA or MERRA2) and are thus expected to differ in atmospheric transport processes from the freely running CCMs. We test whether recent model developments leading to more active ozone chemistry affect the computed ozone sensitivity to perturbations in emissions. Preliminary results suggest that these CTMs can reproduce present-day ozone levels but fail to capture the multi-decadal trend since 1980. Both models yield widespread overpredictions of free tropospheric ozone in the 1980s. Sensitivity studies in GEOS-Chem suggest that the model estimate of natural background ozone is too high. We discuss factors that contribute to the variability and trends of tropospheric ozone over the last 30 years, with a focus on intermodel differences in spatial resolution and in the representation of stratospheric chemistry, stratosphere-troposphere exchange, halogen chemistry, and biogenic VOC emissions and chemistry. We also discuss uncertainty in the historical emission inventories used in models, and how these affect the simulated ozone trends.

  11. Constraining the geometry and kinematics of the quasar broad emission line region using gravitational microlensing. I. Models and simulations

    Science.gov (United States)

    Braibant, L.; Hutsemékers, D.; Sluse, D.; Goosmann, R.

    2017-11-01

    Recent studies have shown that line profile distortions are commonly observed in gravitationally lensed quasar spectra. Often attributed to microlensing differential magnification, line profile distortions can provide information on the geometry and kinematics of the broad emission line region (BLR) in quasars. We investigate the effect of gravitational microlensing on quasar broad emission line profiles and their underlying continuum, combining the emission from simple representative BLR models with generic microlensing magnification maps. Specifically, we considered Keplerian disk, polar, and equatorial wind BLR models of various sizes. The effect of microlensing has been quantified with four observables: μBLR, the total magnification of the broad emission line; μcont, the magnification of the underlying continuum; as well as red/blue, RBI and wings/core, WCI, indices that characterize the line profile distortions. The simulations showed that distortions of line profiles, such as those recently observed in lensed quasars, can indeed be reproduced and attributed to the differential effect of microlensing on spatially separated regions of the BLR. While the magnification of the emission line μBLR sets an upper limit on the BLR size and, similarly, the magnification of the continuum μcont sets an upper limit on the size of the continuum source, the line profile distortions mainly depend on the BLR geometry and kinematics. We thus built (WCI,RBI) diagrams that can serve as diagnostic diagrams to discriminate between the various BLR models on the basis of quantitative measurements. It appears that a strong microlensing effect puts important constraints on the size of the BLR and on its distance to the high-magnification caustic. In that case, BLR models with different geometries and kinematics are more prone to produce distinctive line profile distortions for a limited number of caustic configurations, which facilitates their discrimination. When the microlensing

  12. A stable and reproducible human blood-brain barrier model derived from hematopoietic stem cells.

    Directory of Open Access Journals (Sweden)

    Romeo Cecchelli

    Full Text Available The human blood brain barrier (BBB) is a selective barrier formed by human brain endothelial cells (hBECs), which is important to ensure adequate neuronal function and protect the central nervous system (CNS) from disease. The development of human in vitro BBB models is thus of utmost importance for drug discovery programs related to CNS diseases. Here, we describe a method to generate a human BBB model using cord blood-derived hematopoietic stem cells. The cells were initially differentiated into ECs, followed by the induction of BBB properties by co-culture with pericytes. The brain-like endothelial cells (BLECs) express tight junctions and transporters typically observed in brain endothelium and maintain expression of most in vivo BBB properties for at least 20 days. The model is very reproducible, since it can be generated from stem cells isolated from different donors and in different laboratories, and could be used to predict the CNS distribution of compounds in humans. Finally, we provide evidence that the Wnt/β-catenin signaling pathway mediates in part the BBB-inductive properties of pericytes.

  13. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    Science.gov (United States)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting-agent hypothesis instead of the widely used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time scales. We also find that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturb the financial system.
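The return-interval statistic analyzed above can be illustrated on synthetic data: record the times at which volatility exceeds a threshold, difference them, and normalize by the mean interval. The series and threshold below are arbitrary stand-ins, not NYSE/FOREX data:

```python
import random

# Synthetic volatility proxy (|Gaussian noise|); not market data.
random.seed(7)
volatility = [abs(random.gauss(0.0, 1.0)) for _ in range(1000)]
threshold = 1.5  # arbitrary threshold defining a volatility "event"

# Return intervals: waiting times between successive threshold exceedances.
exceed_times = [t for t, v in enumerate(volatility) if v > threshold]
intervals = [b - a for a, b in zip(exceed_times, exceed_times[1:])]

# Scaling analyses work with intervals normalized by their mean:
mean_tau = sum(intervals) / len(intervals)
scaled = [tau / mean_tau for tau in intervals]
print(f"{len(intervals)} intervals, mean = {mean_tau:.1f} steps")
```

For real series, the distribution of the normalized intervals and its (approximate) collapse across thresholds and assets is what the model above is asked to reproduce.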

  14. How well do CMIP5 Climate Models Reproduce the Hydrologic Cycle of the Colorado River Basin?

    Science.gov (United States)

    Gautam, J.; Mascaro, G.

    2017-12-01

    The Colorado River, which is the primary source of water for nearly 40 million people in the arid Southwestern states of the United States, has been experiencing an extended drought since 2000, which has led to a significant reduction in water supply. As water demands increase, one of the major challenges for water management in the region has been the quantification of uncertainties associated with streamflow predictions in the Colorado River Basin (CRB) under potential future climate change. Hence, testing the reliability of model predictions in the CRB is critical in addressing this challenge. In this study, we evaluated the performance of 17 General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and 4 Regional Climate Models (RCMs) in reproducing the statistical properties of the hydrologic cycle in the CRB. We evaluated the water balance components at four nested sub-basins, along with the inter-annual and intra-annual changes of precipitation (P), evaporation (E), runoff (R) and temperature (T) from 1979 to 2005. Most of the models captured the net water balance fairly well in the most upstream basin but simulated a weak hydrological cycle in the evaporation channel at the downstream locations. The simulated monthly variability of P had different patterns, with correlation coefficients ranging from -0.6 to 0.8 depending on the sub-basin, and with models from the same parent institution clustering together. Apart from the most upstream sub-basin, where the models were mainly characterized by a negative seasonal bias in SON (of up to -50%), most of them had a positive bias in all seasons (of up to +260%) in the other three sub-basins. The models, however, captured the monthly variability of T well at all sites, with small inter-model variability and a relatively similar range of bias (-7 °C to +5 °C) across all seasons. The Mann-Kendall test was applied to the annual P and T time series, where the majority of the models
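The Mann-Kendall test mentioned above is distribution-free: it counts the signs of all pairwise differences in a time series. A minimal sketch on an invented annual series (not CRB model output):

```python
import math

def mann_kendall_s(series):
    """Mann-Kendall S statistic: signed count of increasing vs. decreasing pairs."""
    s = 0
    n = len(series)
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    return s

# Invented annual-mean temperature series (deg C), purely illustrative.
annual_t = [14.1, 14.3, 14.2, 14.5, 14.4, 14.7, 14.6, 14.9]

s = mann_kendall_s(annual_t)
n = len(annual_t)
var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance approximation, no ties
z = (s - 1) / math.sqrt(var_s) if s > 0 else (s + 1) / math.sqrt(var_s) if s < 0 else 0.0
print(f"S = {s}, Z = {z:.2f}  (|Z| > 1.96 suggests a significant trend at the 5% level)")
```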

  15. Synaptic augmentation in a cortical circuit model reproduces serial dependence in visual working memory.

    Directory of Open Access Journals (Sweden)

    Daniel P Bliss

    Full Text Available Recent work has established that visual working memory is subject to serial dependence: current information in memory blends with that from the recent past as a function of their similarity. This tuned temporal smoothing likely promotes the stability of memory in the face of noise and occlusion. Serial dependence accumulates over several seconds in memory and deteriorates with increased separation between trials. While this phenomenon has been extensively characterized in behavior, its neural mechanism is unknown. In the present study, we investigate the circuit-level origins of serial dependence in a biophysical model of cortex. We explore two distinct kinds of mechanisms: stable persistent activity during the memory delay period and dynamic "activity-silent" synaptic plasticity. We find that networks endowed with both strong reverberation to support persistent activity and dynamic synapses can closely reproduce behavioral serial dependence. Specifically, elevated activity drives synaptic augmentation, which biases activity on the subsequent trial, giving rise to a spatiotemporally tuned shift in the population response. Our hybrid neural model is a theoretical advance beyond abstract mathematical characterizations, offers testable hypotheses for physiological research, and demonstrates the power of biological insights to provide a quantitative explanation of human behavior.

  16. Fast bootstrapping and permutation testing for assessing reproducibility and interpretability of multivariate fMRI decoding models.

    Directory of Open Access Journals (Sweden)

    Bryan R Conroy

    Full Text Available Multivariate decoding models are increasingly being applied to functional magnetic resonance imaging (fMRI) data to interpret the distributed neural activity in the human brain. These models are typically formulated to optimize an objective function that maximizes decoding accuracy. For decoding models trained on full-brain data, this can result in multiple models that yield the same classification accuracy, though some may be more reproducible than others; i.e., small changes to the training set may result in very different voxels being selected. This issue of reproducibility can be partially controlled by regularizing the decoding model. Regularization, along with the cross-validation used to estimate decoding accuracy, typically requires retraining many (often on the order of thousands) of related decoding models. In this paper we describe an approach that uses a combination of bootstrapping and permutation testing to construct both a measure of cross-validated prediction accuracy and model reproducibility of the learned brain maps. This requires re-training our classification method on many re-sampled versions of the fMRI data. Given the size of fMRI datasets, this is normally a time-consuming process. Our approach leverages an algorithm called fast simultaneous training of generalized linear models (FaSTGLZ) to create a family of classifiers in the space of accuracy vs. reproducibility. The convex hull of this family of classifiers can be used to identify a subset of Pareto optimal classifiers, with a single optimal classifier selectable based on the relative cost of accuracy vs. reproducibility. We demonstrate our approach using full-brain analysis of elastic-net classifiers trained to discriminate stimulus type in an auditory and visual oddball event-related fMRI design.
Our approach and results argue for a computational approach to fMRI decoding models in which the value of the interpretation of the decoding model ultimately depends upon optimizing a
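The Pareto-optimal selection described above can be sketched independently of the FaSTGLZ machinery; the candidate classifier scores below are hypothetical, not results from the study:

```python
# Hypothetical (accuracy, reproducibility) scores for candidate classifiers
# obtained under different regularization settings; not real fMRI results.
candidates = [
    (0.90, 0.55),
    (0.88, 0.70),
    (0.85, 0.80),
    (0.84, 0.60),  # dominated: (0.88, 0.70) is better on both criteria
    (0.80, 0.85),
]

def pareto_front(points):
    """Keep points not dominated (>= on both criteria, > on at least one)."""
    front = []
    for p in points:
        dominated = any(
            q != p and q[0] >= p[0] and q[1] >= p[1] and (q[0] > p[0] or q[1] > p[1])
            for q in points
        )
        if not dominated:
            front.append(p)
    return sorted(front)

print(pareto_front(candidates))
# A single classifier is then picked from this front according to the
# relative cost assigned to accuracy vs. reproducibility.
```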

  17. Integrating social model principles into broad-based treatment: results of a program evaluation.

    Science.gov (United States)

    Polcin, Douglas L; Prindle, Suzi D; Bostrom, Alan

    2002-11-01

    Although traditional social model recovery programs appear to be decreasing, some aspects of social model recovery continue to exert a strong influence in broad-based, integrated programs. This article describes a four-week program that integrates licensed therapists, certified counselors, psychiatric consultation, and social model recovery principles into a broad-based treatment approach. The Social Model Philosophy Scale revealed a low overall rating on social model philosophy. However, social model principles that were heavily stressed included practicing 12-step recovery, the importance of getting a 12-step sponsor, staff-client interactions outside a formal office, employing staff who are in recovery, and emphasizing a goal of abstinence. Three- and six-month follow-up revealed significant improvement in alcohol and drug use, heavy alcohol use, satisfaction with family relationships, 12-step involvement, illegal behaviors, arrests, unsafe sex, self-esteem, use of medical resources, and health status. Findings provide a rationale for larger, multi-site studies that assess the effectiveness of social model characteristics using multivariate techniques.

  18. Rainfall variability and extremes over southern Africa: Assessment of a climate model to reproduce daily extremes

    Science.gov (United States)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the ability of a state-of-the-art climate model to simulate climate at daily timescales is assessed using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be placed in its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa.
Secondly, the ability of the model to reproduce daily rainfall extremes will be assessed.

  19. A comprehensive model of catastrophic optical-damage in broad-area laser diodes

    Science.gov (United States)

    Chin, A. K.; Bertaska, R. K.; Jaspan, M. A.; Flusberg, A. M.; Swartz, S. D.; Knapczyk, M. T.; Petr, R.; Smilanski, I.; Jacob, J. H.

    2009-02-01

    The present model of the formation and propagation of catastrophic optical damage (COD), a random failure mode in laser diodes, was formulated in 1974 and has remained substantially unchanged. We extend the model of COD phenomena, based on analytical studies involving EBIC (electron-beam induced current), STEM (scanning transmission-electron microscopy) and sophisticated optical measurements. We have determined that a ring-cavity mode, whose presence has not been previously reported, significantly contributes to COD initiation and propagation in broad-area laser diodes.

  20. Cross-species analysis of gene expression in non-model mammals: reproducibility of hybridization on high density oligonucleotide microarrays

    Directory of Open Access Journals (Sweden)

    Pita-Thomas Wolfgang

    2007-04-01

    Full Text Available Abstract Background Gene expression profiles of non-model mammals may provide valuable data for biomedical and evolutionary studies. However, due to the lack of sequence information for other species, DNA microarrays are currently restricted to humans and a few model species. This limitation may be overcome by using arrays developed for a given species to analyse gene expression in a related one, an approach known as "cross-species analysis". In spite of its potential usefulness, the accuracy and reproducibility of the gene expression measures obtained in this way are still open to doubt. The present study examines whether or not hybridization values from cross-species analyses are as reproducible as those from same-species analyses when using Affymetrix oligonucleotide microarrays. Results The reproducibility of the probe data obtained by hybridizing deer, Old-World primate, and human RNA samples to the Affymetrix human GeneChip® U133 Plus 2.0 was compared. The results show that cross-species hybridization affected neither the distribution of hybridization reproducibility among different categories, nor the reproducibility values of the individual probes. Our analyses also show that 0.5% of the probes analysed in the U133 Plus 2.0 GeneChip are significantly associated with un-reproducible hybridizations. Such probes (called un-reproducible probe sequences in the text) do not increase in number in cross-species analyses. Conclusion Our study demonstrates that cross-species analyses do not significantly affect the hybridization reproducibility of GeneChips, at least within the range of mammal species analysed here. The differences in reproducibility between same-species and cross-species analyses observed in previous studies were probably caused by the analytical methods used to calculate the gene expression measures. Together with previous observations on the accuracy of GeneChips for cross-species analysis, our analyses demonstrate that cross

  1. Model for a reproducible curriculum infrastructure to provide international nurse anesthesia continuing education.

    Science.gov (United States)

    Collins, Shawn Bryant

    2011-12-01

    There are no set standards for nurse anesthesia education in developing countries, yet one of the keys to the standards in global professional practice is competency assurance for individuals. Nurse anesthetists in developing countries have difficulty obtaining educational materials. These difficulties include, but are not limited to, financial constraints, lack of anesthesia textbooks, and distance from educational sites. There is increasing evidence that the application of knowledge in developing countries is failing. One reason is that many anesthetists in developing countries are trained for considerably less than acceptable time periods and are often supervised by poorly trained practitioners, who then pass on less-than-desirable practice skills, thus exacerbating difficulties. Sustainability of development can come only through anesthetists who are both well trained and able to pass on their training to others. The international nurse anesthesia continuing education project was developed in response to the difficulty that nurse anesthetists in developing countries face in accessing continuing education. The purpose of this project was to develop a nonprofit, volunteer-based model for providing nurse anesthesia continuing education that can be reproduced and used in any developing country.

  2. Power scaling and experimentally fitted model for broad area quantum cascade lasers in continuous wave operation

    Science.gov (United States)

    Suttinger, Matthew; Go, Rowel; Figueiredo, Pedro; Todi, Ankesh; Shu, Hong; Leshin, Jason; Lyakh, Arkadiy

    2018-01-01

    Experimental and model results for 15-stage broad area quantum cascade lasers (QCLs) are presented. Continuous wave (CW) power scaling from 1.62 to 2.34 W has been experimentally demonstrated for 3.15-mm long, high-reflection-coated QCLs with the active region width increased from 10 to 20 μm. A semiempirical model for broad area devices operating in CW mode is presented. The model uses the measured pulsed transparency current, injection efficiency, waveguide losses, and differential gain as input parameters. It also takes into account active region self-heating and the sublinearity of the pulsed power-versus-current laser characteristic. The model predicts that an 11% improvement in maximum CW power and increased wall-plug efficiency can be achieved from 3.15 mm × 25 μm devices with 21 stages of the same design, but half the doping in the active region. For a 16-stage design with a reduced stage thickness of 300 Å, a pulsed rollover current density of 6 kA/cm2, and InGaAs waveguide layers, an optical power increase of 41% is projected. Finally, the model projects that the power level can be increased to ~4.5 W for 3.15 mm × 31 μm devices with the baseline configuration, with T0 increased from 140 K for the present design to 250 K.

  3. Composite model to reproduce the mechanical behaviour of methane hydrate bearing soils

    Science.gov (United States)

    De la Fuente, Maria

    2016-04-01

    Methane hydrate bearing sediments (MHBS) are naturally occurring materials containing different components in the pores that may undergo phase changes under relatively small temperature and pressure variations, for conditions typically prevailing a few hundred meters below sea level. Their modelling needs to account for the heat and mass balance equations of the different components, and several strategies already exist to combine them (e.g., Rutqvist & Moridis, 2009; Sánchez et al., 2014). These equations have to be completed by restrictions and constitutive laws reproducing the phenomenology of heat and fluid flows, phase change conditions and the mechanical response. While the formulation of the non-mechanical laws generally includes explicitly the mass fraction of methane in each phase, which allows for a natural update of parameters during phase changes, mechanical laws are, in most cases, stated for the whole solid skeleton (Uchida et al., 2012; Soga et al., 2006). In this paper, a mechanical model is proposed to cope with the response of MHBS. It is based on a composite approach that allows defining the thermo-hydro-mechanical response of the mineral skeleton and the solid hydrates independently. The global stress-strain-temperature response of the solid phase (grains + hydrate) is then obtained by combining both responses according to an energy principle, following the work of Pinyol et al. (2007). In this way, dissociation of MH can be assessed on the basis of the stress state and temperature prevailing locally within the hydrate component. Besides, its structuring effect is naturally accounted for by the model according to the patterns of MH inclusions within soil pores. This paper describes the fundamental hypotheses behind the model and its formulation. Its performance is assessed by comparison with laboratory data presented in the literature. An analysis of the MHBS response to several stress-temperature paths representing potential field cases is finally presented.

  4. Can Computational Sediment Transport Models Reproduce the Observed Variability of Channel Networks in Modern Deltas?

    Science.gov (United States)

    Nesvold, E.; Mukerji, T.

    2017-12-01

    River deltas display complex channel networks that can be characterized through the framework of graph theory, as shown by Tejedor et al. (2015). Deltaic patterns may also be useful in a Bayesian approach to uncertainty quantification of the subsurface, but this requires a prior distribution of the networks of ancient deltas. By considering subaerial deltas, one can at least obtain a snapshot in time of the channel network spectrum across deltas. In this study, the directed graph structure is semi-automatically extracted from satellite imagery using techniques from statistical processing and machine learning. Once the network is labeled with vertices and edges, spatial trends and width and sinuosity distributions can also be found easily. Since imagery is inherently 2D, computational sediment transport models can serve as a link between 2D network structure and 3D depositional elements; the numerous empirical rules and parameters built into such models make it necessary to validate the output with field data. For this purpose we have used a set of 110 modern deltas, with average water discharge ranging from 10 to 200,000 m3/s, as a benchmark for natural variability. Both graph-theoretic and more general distributions are established. A key question is whether it is possible to reproduce this deltaic network spectrum with computational models. Delft3D was used to solve the shallow water equations coupled with sediment transport. The experimental setup was relatively simple: incoming channelized flow onto a tilted plane, with varying wave and tidal energy, sediment types and grain size distributions, river discharge and a few other input parameters. Each realization was run until a delta had fully developed: between 50 and 500 years (with a morphology acceleration factor). It is shown that input parameters should not be sampled independently from the natural ranges, since this may result in deltaic output that falls well outside the natural spectrum. Since we are

  5. Conceptual model suitability for reproducing preferential flow paths in waste rock piles

    Science.gov (United States)

    Broda, S.; Blessent, D.; Aubertin, M.

    2012-12-01

    Waste rocks are typically deposited on mining sites, forming waste rock piles (WRPs). Acid mine drainage (AMD) or contaminated neutral drainage (CND), with metal leaching from the sulphidic minerals, adversely impacts soil and water composition on and beyond the mining sites. The deposition method and the highly heterogeneous hydrogeological and geochemical properties of waste rock have a major impact on water and oxygen movement and on the pore water pressure distribution in the WRP, controlling AMD/CND production. However, the prediction and interpretation of water distribution in WRPs is a challenging problem, and many attempted numerical investigations of short- and long-term forecasts were found unreliable. Various forms of unsaturated localized preferential flow processes have been identified, for instance flow in macropores and fractures, and heterogeneity-driven and gravity-driven unstable flow, with local hydraulic conductivities reaching several dozen meters per day. Such phenomena have been entirely neglected in numerical WRP modelling and are unattainable with the classical equivalent porous medium conceptual approach typically used in this field. An additional complicating circumstance is that the location of macropores and fractures is unknown a priori. In this study, modeling techniques originally designed for massive fractured rock aquifers are applied. The properties of the waste rock material used in this modelling study, found at the Tio mine at Havre-Saint-Pierre, Québec (Canada), were retrieved from laboratory permeability and water retention tests. These column tests were reproduced with the numerical 3D fully-integrated surface/subsurface flow model HydroGeoSphere, where material heterogeneity is represented by means of i) the dual continuum approach, ii) discrete fractures, and iii) a stochastic facies distribution framework using TPROGS. Comparisons with measured pore water pressures, tracer concentrations and exiting water volumes allowed defining limits and

  6. A novel, comprehensive, and reproducible porcine model for determining the timing of bruises in forensic pathology

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2016-01-01

    in order to identify gross and histological parameters that may be useful in determining the age of a bruise. Methods The mechanical device was able to apply a single reproducible stroke with a plastic tube that was equivalent to being struck by a man. In each of 10 anesthetized pigs, four strokes...

  7. Endovascular Broad-Neck Aneurysm Creation in a Porcine Model Using a Vascular Plug

    Energy Technology Data Exchange (ETDEWEB)

    Muehlenbruch, Georg, E-mail: gmuehlenbruch@ukaachen.de; Nikoubashman, Omid; Steffen, Bjoern; Dadak, Mete [RWTH Aachen University, Department of Diagnostic and Interventional Neuroradiology, University Hospital (Germany); Palmowski, Moritz [RWTH Aachen University, Department of Nuclear Medicine, University Hospital (Germany); Wiesmann, Martin [RWTH Aachen University, Department of Diagnostic and Interventional Neuroradiology, University Hospital (Germany)

    2013-02-15

    Ruptured cerebral arterial aneurysms require prompt treatment by either surgical clipping or endovascular coiling. Training for these sophisticated endovascular procedures is essential and ideally performed in animals before their use in humans. Simulators and established animal models have shown drawbacks with respect to degree of reality, size of the animal model and aneurysm, or time and effort needed for aneurysm creation. We therefore aimed to establish a realistic and readily available aneurysm model. Five anticoagulated domestic pigs underwent endovascular intervention through right femoral access. A total of 12 broad-neck aneurysms were created in the carotid, subclavian, and renal arteries using the Amplatzer vascular plug. With dedicated vessel selection, cubic, tubular, and side-branch aneurysms could be created. Three of the 12 implanted occluders, two of them implanted over a side branch of the main vessel, did not induce complete vessel occlusion. However, all aneurysms remained free of intraluminal thrombus formation and were available for embolization training during a surveillance period of 6 h. Two aneurysms underwent successful exemplary treatment: one was stent-assisted, and one was performed with conventional endovascular coil embolization. The new porcine aneurysm model proved to be a straightforward approach that offers a wide range of training and scientific applications that might help further improve endovascular coil embolization therapy in patients with cerebral aneurysms.

  8. Endovascular Broad-Neck Aneurysm Creation in a Porcine Model Using a Vascular Plug

    International Nuclear Information System (INIS)

    Mühlenbruch, Georg; Nikoubashman, Omid; Steffen, Björn; Dadak, Mete; Palmowski, Moritz; Wiesmann, Martin

    2013-01-01

    Ruptured cerebral arterial aneurysms require prompt treatment by either surgical clipping or endovascular coiling. Training for these sophisticated endovascular procedures is essential and ideally performed in animals before their use in humans. Simulators and established animal models have shown drawbacks with respect to degree of reality, size of the animal model and aneurysm, or time and effort needed for aneurysm creation. We therefore aimed to establish a realistic and readily available aneurysm model. Five anticoagulated domestic pigs underwent endovascular intervention through right femoral access. A total of 12 broad-neck aneurysms were created in the carotid, subclavian, and renal arteries using the Amplatzer vascular plug. With dedicated vessel selection, cubic, tubular, and side-branch aneurysms could be created. Three of the 12 implanted occluders, two of them implanted over a side branch of the main vessel, did not induce complete vessel occlusion. However, all aneurysms remained free of intraluminal thrombus formation and were available for embolization training during a surveillance period of 6 h. Two aneurysms underwent successful exemplary treatment: one was stent-assisted, and one was performed with conventional endovascular coil embolization. The new porcine aneurysm model proved to be a straightforward approach that offers a wide range of training and scientific applications that might help further improve endovascular coil embolization therapy in patients with cerebral aneurysms.

  9. Testing Accretion Disk Wind Models of Broad Absorption Line Quasars with SDSS Spectra

    Science.gov (United States)

    Lindgren, Sean; Gabel, Jack

    2017-06-01

    We present an investigation of a large sample of broad absorption line (BAL) quasars (QSOs) from the Sloan Digital Sky Survey (SDSS) Data Release 5 (DR5). Properties of the BALs, such as absorption equivalent width, outflow velocities, and depth of the BAL, are obtained from the analysis by Gibson et al. We perform correlation analysis on these data to test the predictions made by the radiation-driven accretion disk streamline model of Murray and Chiang. We find that the CIV BAL maximum velocity and the continuum luminosity are correlated, consistent with radiation-driven models. The mean minimum velocity of CIV is lower in low-ionization BALs (LoBALs) than in high-ionization BALs (HiBALs), suggesting an orientation effect consistent with the Murray and Chiang model. Finally, we find that HiBALs greatly outnumber LoBALs in the general BAL population, supporting the prediction of the Murray and Chiang model that HiBALs have a greater global covering factor than LoBALs.

  10. Pharmacokinetic Modelling to Predict FVIII:C Response to Desmopressin and Its Reproducibility in Nonsevere Haemophilia A Patients.

    Science.gov (United States)

    Schütte, Lisette M; van Hest, Reinier M; Stoof, Sara C M; Leebeek, Frank W G; Cnossen, Marjon H; Kruip, Marieke J H A; Mathôt, Ron A A

    2018-04-01

    Nonsevere haemophilia A (HA) patients can be treated with desmopressin. Response of factor VIII activity (FVIII:C) differs between patients and is difficult to predict. Our aims were to describe the FVIII:C response after desmopressin and its reproducibility by population pharmacokinetic (PK) modelling. Retrospective data of 128 nonsevere HA patients (age 7-75 years) receiving an intravenous or intranasal dose of desmopressin were used. PK modelling of FVIII:C was performed by nonlinear mixed-effect modelling. Reproducibility of the FVIII:C response was defined as less than 25% difference in peak FVIII:C between administrations. A total of 623 FVIII:C measurements from 142 desmopressin administrations were available; 14 patients had received two administrations on different occasions. The FVIII:C time profile was best described by a two-compartment model with first-order absorption and elimination. Interindividual variability of the estimated baseline FVIII:C, central volume of distribution and clearance was 37, 43 and 50%, respectively. The most recently measured FVIII:C (FVIII-recent) was significantly associated with the FVIII:C response to desmopressin ( p  C increase of 0.47 IU/mL (median, interquartile range: 0.32-0.65 IU/mL, n = 142). The FVIII:C response was reproducible in 6 out of 14 patients receiving two desmopressin administrations. The FVIII:C response to desmopressin in nonsevere HA patients was adequately described by a population PK model. Large variability in the FVIII:C response was observed, which could only partially be explained by FVIII-recent. The FVIII:C response was not reproducible in a small subset of patients. Therefore, monitoring FVIII:C around surgeries or bleeding might be considered. Research is needed to study this further. Schattauer Stuttgart.
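The two-compartment structure with first-order absorption and elimination named in the record can be sketched as a simple numerical simulation. This is a generic illustration, not the authors' population PK model; the rate constants, dose, and volume below are invented for demonstration.

```python
def simulate_2cmt(dose, ka, ke, k12, k21, v_central, dt=0.01, t_end=24.0):
    """Euler integration of a two-compartment PK model with first-order
    absorption (depot -> central) and first-order elimination from the
    central compartment. Returns (times, central concentrations)."""
    a_depot, a_c, a_p = dose, 0.0, 0.0  # amounts in depot/central/peripheral
    times, conc = [], []
    t = 0.0
    while t <= t_end:
        times.append(t)
        conc.append(a_c / v_central)
        da_depot = -ka * a_depot                             # absorption
        da_c = ka * a_depot - (ke + k12) * a_c + k21 * a_p   # central balance
        da_p = k12 * a_c - k21 * a_p                         # peripheral exchange
        a_depot += da_depot * dt
        a_c += da_c * dt
        a_p += da_p * dt
        t += dt
    return times, conc
```

With illustrative parameters the concentration rises to a peak and then declines, the qualitative profile the record describes for FVIII:C after desmopressin.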

  11. Humanized Immunoglobulin Mice: Models for HIV Vaccine Testing and Studying the Broadly Neutralizing Antibody Problem.

    Science.gov (United States)

    Verkoczy, Laurent

    2017-01-01

    A vaccine that can effectively prevent HIV-1 transmission remains paramount to ending the HIV pandemic, but to do so, will likely need to induce broadly neutralizing antibody (bnAb) responses. A major technical hurdle toward achieving this goal has been a shortage of animal models with the ability to systematically pinpoint roadblocks to bnAb induction and to rank vaccine strategies based on their ability to stimulate bnAb development. Over the past 6 years, immunoglobulin (Ig) knock-in (KI) technology has been leveraged to express bnAbs in mice, an approach that has enabled elucidation of various B-cell tolerance mechanisms limiting bnAb production and evaluation of strategies to circumvent such processes. From these studies, in conjunction with the wealth of information recently obtained regarding the evolutionary pathways and paratopes/epitopes of multiple bnAbs, it has become clear that the very features of bnAbs desired for their function will be problematic to elicit by traditional vaccine paradigms, necessitating more iterative testing of new vaccine concepts. To meet this need, novel bnAb KI models have now been engineered to express either inferred prerearranged V(D)J exons (or unrearranged germline V, D, or J segments that can be assembled into functional rearranged V(D)J exons) encoding predecessors of mature bnAbs. One encouraging approach that has materialized from studies using such newer models is sequential administration of immunogens designed to bind progressively more mature bnAb predecessors. In this review, insights into the regulation and induction of bnAbs based on the use of KI models will be discussed, as will new Ig KI approaches for higher-throughput production and/or altering expression of bnAbs in vivo, so as to further enable vaccine-guided bnAb induction studies. © 2017 Elsevier Inc. All rights reserved.

  12. Evapotranspiration modelled from stands of three broad-leaved tropical trees in Costa Rica

    Science.gov (United States)

    Bigelow, Seth

    2001-10-01

    To examine the impact of tree species on the water cycle in a wet tropical region, annual evapotranspiration (ET) was estimated in Costa Rican plantations of three native, broad-leaved tree species that contrasted strongly in leaf size, leaf area and phenology. Evapotranspiration was estimated using the Penman-Monteith equation for transpiration from the dry canopy, the equilibrium equation for evaporation from the understory and a modified Rutter model of interception for evaporation of water from the canopy when wetted by rainfall. Canopy conductance was estimated from stomatal conductance, leaf area and leaf boundary-layer conductance; canopy storage capacity and filling rate were estimated from throughfall measurements. Micrometeorological instruments were mounted on a scaffolding tower. Mean stomatal conductance, which ranged from 0.1 to 0.7 mol m-2 s-1, was similar to boundary-layer conductance, 0.1 to 0.5 mol m-2 s-1, indicating decoupling of stomata from atmospheric conditions. Mean canopy conductance varied from 0.6 to 0.7 mol m-2 s-1 in the 1994 wet season, then dropped to 0.3-0.4 mol m-2 s-1 in stands of the two deciduous species, Cordia and Cedrela, as a result of reduced leaf area during the dry season. Despite increased understory evaporation, dry-season ET from these stands was only 78-81% of ET in stands of the evergreen species, Hyeronima. Maximum canopy water depth varied from 0.2 to 2.2 mm, causing modelled interception to vary from 6% to 25% of annual ET. Higher dry-season transpiration rates along with high rates of evaporation of intercepted rainfall in all seasons led to 14% higher annual ET in Hyeronima stands (1509 mm) than in stands of the species with lowest ET,
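The Penman-Monteith calculation for dry-canopy transpiration can be sketched with a small function in its textbook form. This is not the study's implementation, and the meteorological inputs in the example are illustrative assumptions, not measurements from the tower.

```python
def penman_monteith(rn, g, vpd, delta, gamma, ga, gc, rho_a=1.2, cp=1010.0):
    """Latent heat flux (W m-2) from the Penman-Monteith equation.

    rn, g   : net radiation and soil heat flux (W m-2)
    vpd     : vapour pressure deficit (Pa)
    delta   : slope of the saturation vapour pressure curve (Pa K-1)
    gamma   : psychrometric constant (Pa K-1)
    ga, gc  : aerodynamic and canopy conductance (m s-1)
    rho_a   : air density (kg m-3); cp: specific heat of air (J kg-1 K-1)
    """
    return (delta * (rn - g) + rho_a * cp * vpd * ga) / (
        delta + gamma * (1.0 + ga / gc))
```

As the abstract notes, when ga is similar in magnitude to gc the canopy is partly decoupled from the atmosphere, so the radiation term dominates the result.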

  13. A novel porcine model of ataxia telangiectasia reproduces neurological features and motor deficits of human disease.

    Science.gov (United States)

    Beraldi, Rosanna; Chan, Chun-Hung; Rogers, Christopher S; Kovács, Attila D; Meyerholz, David K; Trantzas, Constantin; Lambertz, Allyn M; Darbro, Benjamin W; Weber, Krystal L; White, Katherine A M; Rheeden, Richard V; Kruer, Michael C; Dacken, Brian A; Wang, Xiao-Jun; Davis, Bryan T; Rohret, Judy A; Struzynski, Jason T; Rohret, Frank A; Weimer, Jill M; Pearce, David A

    2015-11-15

    Ataxia telangiectasia (AT) is a progressive multisystem disorder caused by mutations in the AT-mutated (ATM) gene. AT is a neurodegenerative disease primarily characterized by cerebellar degeneration in children leading to motor impairment. The disease progresses with other clinical manifestations including oculocutaneous telangiectasia, immune disorders, increased susceptibly to cancer and respiratory infections. Although genetic investigations and physiological models have established the linkage of ATM with AT onset, the mechanisms linking ATM to neurodegeneration remain undetermined, hindering therapeutic development. Several murine models of AT have been successfully generated showing some of the clinical manifestations of the disease, however they do not fully recapitulate the hallmark neurological phenotype, thus highlighting the need for a more suitable animal model. We engineered a novel porcine model of AT to better phenocopy the disease and bridge the gap between human and current animal models. The initial characterization of AT pigs revealed early cerebellar lesions including loss of Purkinje cells (PCs) and altered cytoarchitecture suggesting a developmental etiology for AT and could advocate for early therapies for AT patients. In addition, similar to patients, AT pigs show growth retardation and develop motor deficit phenotypes. By using the porcine system to model human AT, we established the first animal model showing PC loss and motor features of the human disease. The novel AT pig provides new opportunities to unmask functions and roles of ATM in AT disease and in physiological conditions. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. A novel porcine model of ataxia telangiectasia reproduces neurological features and motor deficits of human disease

    Science.gov (United States)

    Beraldi, Rosanna; Chan, Chun-Hung; Rogers, Christopher S.; Kovács, Attila D.; Meyerholz, David K.; Trantzas, Constantin; Lambertz, Allyn M.; Darbro, Benjamin W.; Weber, Krystal L.; White, Katherine A.M.; Rheeden, Richard V.; Kruer, Michael C.; Dacken, Brian A.; Wang, Xiao-Jun; Davis, Bryan T.; Rohret, Judy A.; Struzynski, Jason T.; Rohret, Frank A.; Weimer, Jill M.; Pearce, David A.

    2015-01-01

    Ataxia telangiectasia (AT) is a progressive multisystem disorder caused by mutations in the AT-mutated (ATM) gene. AT is a neurodegenerative disease primarily characterized by cerebellar degeneration in children leading to motor impairment. The disease progresses with other clinical manifestations including oculocutaneous telangiectasia, immune disorders, increased susceptibly to cancer and respiratory infections. Although genetic investigations and physiological models have established the linkage of ATM with AT onset, the mechanisms linking ATM to neurodegeneration remain undetermined, hindering therapeutic development. Several murine models of AT have been successfully generated showing some of the clinical manifestations of the disease, however they do not fully recapitulate the hallmark neurological phenotype, thus highlighting the need for a more suitable animal model. We engineered a novel porcine model of AT to better phenocopy the disease and bridge the gap between human and current animal models. The initial characterization of AT pigs revealed early cerebellar lesions including loss of Purkinje cells (PCs) and altered cytoarchitecture suggesting a developmental etiology for AT and could advocate for early therapies for AT patients. In addition, similar to patients, AT pigs show growth retardation and develop motor deficit phenotypes. By using the porcine system to model human AT, we established the first animal model showing PC loss and motor features of the human disease. The novel AT pig provides new opportunities to unmask functions and roles of ATM in AT disease and in physiological conditions. PMID:26374845

  15. Validation of EURO-CORDEX regional climate models in reproducing the variability of precipitation extremes in Romania

    Science.gov (United States)

    Dumitrescu, Alexandru; Busuioc, Aristita

    2016-04-01

    EURO-CORDEX is the European branch of the international CORDEX initiative that aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of the individual models in reproducing the variability of precipitation extremes in Romania. Here, an ensemble of three EURO-CORDEX regional climate models (RCMs) (scenario RCP4.5) is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. Compared to previous studies, in which RCM validation for the Romanian climate has mainly been made on the mean state and at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to obtain a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and observed grid-point values has been made by calculating three extreme precipitation indices, recommended by the Expert Team on Climate Change Detection Indices (ETCCDI), for the 1976-2005 period: R10MM, the annual count of days when precipitation ≥ 10 mm; RX5DAY, the annual maximum 5-day precipitation; and R95P, the fraction of annual total precipitation due to daily precipitation above the 95th percentile. The RCMs' capability to reproduce the mean state of these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions).
This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian
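The three ETCCDI indices named in the record have direct definitions that can be sketched for a single year of daily precipitation (values in mm). The short example series and the 95th-percentile threshold passed in are invented; in practice the threshold is computed from wet days of a base period.

```python
def r10mm(daily):
    """R10MM: annual count of days with precipitation >= 10 mm."""
    return sum(1 for p in daily if p >= 10.0)

def rx5day(daily):
    """RX5DAY: annual maximum precipitation over any 5 consecutive days."""
    return max(sum(daily[i:i + 5]) for i in range(len(daily) - 4))

def r95p_fraction(daily, p95):
    """R95P as a fraction: share of annual total precipitation falling on
    days whose precipitation exceeds the 95th-percentile threshold p95."""
    total = sum(daily)
    extreme = sum(p for p in daily if p > p95)
    return extreme / total if total else 0.0
```

These per-grid-point values are what the study compares between the RCM fields and the CLIMHYDEX observational grid.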

  16. [NDVI difference rate recognition model of deciduous broad-leaved forest based on HJ-CCD remote sensing data].

    Science.gov (United States)

    Wang, Yan; Tian, Qing-Jiu; Huang, Yan; Wei, Hong-Wei

    2013-04-01

    The present paper takes Chuzhou in Anhui Province as the research area and deciduous broad-leaved forest as the research object. A recognition model for deciduous broad-leaved forest was constructed using the NDVI difference rate between the leaf-expansion and the flowering and fruit-bearing phenophases, and the model was applied to HJ-CCD remote sensing images acquired on April 1, 2012 and May 4, 2012. The spatial distribution map of deciduous broad-leaved forest was extracted effectively, and the extraction results were verified and evaluated. The result shows the validity of the NDVI difference rate extraction method proposed in this paper and also verifies the applicability of HJ-CCD data for vegetation classification and recognition.
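The NDVI difference rate idea can be sketched per pixel: compute NDVI from the red and near-infrared bands on each date, take the relative change between dates, and threshold it. The band reflectances and the classification interval below are hypothetical, not the paper's calibrated values.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def ndvi_difference_rate(ndvi_t1, ndvi_t2):
    """Relative NDVI change between two acquisition dates, e.g. leaf
    expansion (t1) vs. flowering and fruit-bearing (t2)."""
    return (ndvi_t2 - ndvi_t1) / ndvi_t1

def is_deciduous_broadleaf(rate, low, high):
    """Label a pixel deciduous broad-leaved forest if its NDVI difference
    rate falls inside an empirically chosen interval [low, high]."""
    return low <= rate <= high
```

The interval exploits the phenological signal: deciduous broad-leaved canopies change NDVI between the two dates far more than evergreen vegetation or non-vegetated surfaces.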

  17. Using a 1-D model to reproduce the diurnal variability of SST

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob L.; Donlon, Craig J.

    2017-01-01

    preferred approach to bridge the gap between in situ and remotely sensed measurements and obtain diurnal warming estimates at large spatial scales is modeling of the upper ocean temperature. This study uses the one-dimensional General Ocean Turbulence Model (GOTM) to resolve diurnal signals identified from...... forcing fields and is able to resolve daily SST variability seen both from satellite and in situ measurements. As such, and due to its low computational cost, it is proposed as a candidate model for diurnal variability estimates....

  18. World soil property estimates for broad-scale modelling (WISE30sec)

    NARCIS (Netherlands)

    Batjes, N.H.

    2015-01-01

    This study presents soil property estimates for the world for application at a broad scale. The GIS dataset was compiled using traditional mapping approaches. It comprises a soil-geographical and a soil-attribute component. The former was derived from a GIS overlay of the Harmonised World

  19. Chimeric Hemagglutinin Constructs Induce Broad Protection against Influenza B Virus Challenge in the Mouse Model.

    Science.gov (United States)

    Ermler, Megan E; Kirkpatrick, Ericka; Sun, Weina; Hai, Rong; Amanat, Fatima; Chromikova, Veronika; Palese, Peter; Krammer, Florian

    2017-06-15

    Seasonal influenza virus epidemics represent a significant public health burden. Approximately 25% of all influenza virus infections are caused by type B viruses, and these infections can be severe, especially in children. Current influenza virus vaccines are an effective prophylaxis against infection but are impacted by rapid antigenic drift, which can lead to mismatches between vaccine strains and circulating strains. Here, we describe a broadly protective vaccine candidate based on chimeric hemagglutinins, consisting of globular head domains from exotic influenza A viruses and stalk domains from influenza B viruses. Sequential vaccination with these constructs in mice leads to the induction of broadly reactive antibodies that bind to the conserved stalk domain of influenza B virus hemagglutinin. Vaccinated mice are protected from lethal challenge with diverse influenza B viruses. Results from serum transfer experiments and antibody-dependent cell-mediated cytotoxicity (ADCC) assays indicate that this protection is antibody mediated and based on Fc effector functions. The present data suggest that chimeric hemagglutinin-based vaccination is a viable strategy to broadly protect against influenza B virus infection. IMPORTANCE While current influenza virus vaccines are effective, they are affected by mismatches between vaccine strains and circulating strains. Furthermore, the antiviral drug oseltamivir is less effective for treating influenza B virus infections than for treating influenza A virus infections. A vaccine that induces broad and long-lasting protection against influenza B viruses is therefore urgently needed. Copyright © 2017 American Society for Microbiology.

  20. Energy and nutrient deposition and excretion in the reproducing sow: model development and evaluation

    DEFF Research Database (Denmark)

    Hansen, A V; Strathe, A B; Theil, Peter Kappel

    2014-01-01

    Air and nutrient emissions from swine operations raise environmental concerns. During the reproduction phase, sows consume and excrete large quantities of nutrients. The objective of this study was to develop a mathematical model to describe energy and nutrient partitioning and predict manure excretion and composition and methane emissions on a daily basis. The model was structured to contain gestation and lactation modules, which can be run separately or sequentially, with outputs from the gestation module used as inputs to the lactation module. In the gestating module, energy and protein […] was related to predictions of body fat and protein loss from the lactation model. Nitrogen intake, urine N, fecal N, and milk N were predicted with RMSPE as a percentage of the observed mean of 9.7, 17.9, 10.0, and 7.7%, respectively. The model provided a framework, but more refinements and improvements in accuracy […]
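
The RMSPE statistic quoted in this record (root mean square prediction error expressed as a percentage of the observed mean) is straightforward to compute. A minimal sketch; the function name and the toy nitrogen-balance values are illustrative, not data from the paper:

```python
import math

def rmspe_pct(observed, predicted):
    """RMSPE as a percentage of the observed mean.
    Hypothetical helper, not code from the paper."""
    n = len(observed)
    rmspe = math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)
    return 100.0 * rmspe / (sum(observed) / n)

# Toy nitrogen intake data (g/d), purely illustrative
obs = [52.0, 48.0, 55.0, 50.0]
pred = [50.0, 49.5, 53.0, 52.5]
print(round(rmspe_pct(obs, pred), 1))
```

A value near 10%, as reported for nitrogen intake and fecal N above, indicates that typical prediction errors are about a tenth of the observed mean.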

  1. Do on/off time series models reproduce emerging stock market comovements?

    OpenAIRE

    Mohamed el hédi Arouri; Fredj Jawadi

    2011-01-01

    Using nonlinear modeling tools, this study investigates the comovements between the Mexican and world stock markets over the last three decades. While previous works only highlight some evidence of comovements, our paper aims to specify the different time-varying links and mechanisms characterizing the Mexican stock market through a comparison of two nonlinear error correction models (NECMs). Our findings point to strong evidence of time-varying and nonlinear mean-reversion and lin...

  2. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    Science.gov (United States)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a pdf, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future landuse or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a-priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  3. "High-precision, reconstructed 3D model" of skull scanned by conebeam CT: Reproducibility verified using CAD/CAM data.

    Science.gov (United States)

    Katsumura, Seiko; Sato, Keita; Ikawa, Tomoko; Yamamura, Keiko; Ando, Eriko; Shigeta, Yuko; Ogawa, Takumi

    2016-01-01

    Computed tomography (CT) scanning has recently been introduced into forensic medicine and dentistry. However, the presence of metal restorations in the dentition can adversely affect the quality of three-dimensional reconstruction from CT scans. In this study, we aimed to evaluate the reproducibility of a "high-precision, reconstructed 3D model" obtained from a conebeam CT scan of dentition, a method that might be particularly helpful in forensic medicine. We took conebeam CT and helical CT images of three dry skulls marked with 47 measuring points; reconstructed three-dimensional images; and measured the distances between the points in the 3D images with a computer-aided design/computer-aided manufacturing (CAD/CAM) marker. We found that in comparison with the helical CT, conebeam CT is capable of reproducing measurements closer to those obtained from the actual samples. In conclusion, our study indicated that the image-reproduction from a conebeam CT scan was more accurate than that from a helical CT scan. Furthermore, the "high-precision reconstructed 3D model" facilitates reliable visualization of full-sized oral and maxillofacial regions in both helical and conebeam CT scans. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  4. A novel porcine model of ataxia telangiectasia reproduces neurological features and motor deficits of human disease

    OpenAIRE

    Beraldi, Rosanna; Chan, Chun-Hung; Rogers, Christopher S.; Kovács, Attila D.; Meyerholz, David K.; Trantzas, Constantin; Lambertz, Allyn M.; Darbro, Benjamin W.; Weber, Krystal L.; White, Katherine A.M.; Rheeden, Richard V.; Kruer, Michael C.; Dacken, Brian A.; Wang, Xiao-Jun; Davis, Bryan T.

    2015-01-01

    Ataxia telangiectasia (AT) is a progressive multisystem disorder caused by mutations in the AT-mutated (ATM) gene. AT is a neurodegenerative disease primarily characterized by cerebellar degeneration in children leading to motor impairment. The disease progresses with other clinical manifestations including oculocutaneous telangiectasia, immune disorders, increased susceptibly to cancer and respiratory infections. Although genetic investigations and physiological models have established the l...

  5. Establishing a Reproducible Hypertrophic Scar following Thermal Injury: A Porcine Model

    Directory of Open Access Journals (Sweden)

    Scott J. Rapp, MD

    2015-02-01

    Conclusions: Deep partial-thickness thermal injury to the back of domestic swine produces an immature hypertrophic scar by 10 weeks following burn with thickness appearing to coincide with the location along the dorsal axis. With minimal pig to pig variation, we describe our technique to provide a testable immature scar model.

  6. Reproducibility of a novel model of murine asthma-like pulmonary inflammation.

    Science.gov (United States)

    McKinley, L; Kim, J; Bolgos, G L; Siddiqui, J; Remick, D G

    2004-05-01

    Sensitization to cockroach allergens (CRA) has been implicated as a major cause of asthma, especially among inner-city populations. Endotoxin from Gram-negative bacteria has also been investigated for its role in attenuating or exacerbating the asthmatic response. We have created a novel model utilizing house dust extract (HDE) containing high levels of both CRA and endotoxin to induce pulmonary inflammation (PI) and airway hyperresponsiveness (AHR). A potential drawback of this model is that the HDE is in limited supply, and newly prepared HDE will not contain the exact components of the HDE used to define our model system. The present study involved testing HDEs collected from various homes for their ability to cause PI and AHR. Dust collected from five homes was extracted in phosphate-buffered saline overnight. The levels in the HDE supernatants varied from 7.1 to 49.5 mg/ml for CRA and from 1.7 to 6 µg/ml for endotoxin. Following immunization and two pulmonary exposures to HDE, all five HDEs induced AHR, PI, and plasma IgE levels substantially higher than those of normal mice. This study shows that HDE containing high levels of cockroach allergens and endotoxin collected from different sources can induce an asthma-like response in our murine model.

  7. A computational model incorporating neural stem cell dynamics reproduces glioma incidence across the lifespan in the human population.

    Directory of Open Access Journals (Sweden)

    Roman Bauer

    Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way and do not directly relate to empirical findings. To narrow this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically determined estimates of neural stem cell number, cell division rate, mutation rate, and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert differential susceptibility throughout the population. Overall, our model supports the hypothesis that glioma is caused by randomly occurring oncogenic mutations within the neural stem cell population. Based on this model, we assess the influence of the (experimentally indicated) decrease in the number of neural stem cells and increase in cell division rate during aging. Our model provides multiple testable predictions, and suggests that different temporal sequences of oncogenic mutations can lead to tumorigenesis. Finally, we conclude that four or five oncogenic mutations are sufficient for the formation of glioma.
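
The hypothesis in this record, that glioma arises once a neural stem cell accumulates a handful of oncogenic mutations, can be sketched as a toy stochastic simulation. All parameters below (pool size, division rate, per-division mutation probability, number of hits) are illustrative placeholders, not the calibrated values of the published model:

```python
import random

random.seed(1)

N_CELLS = 2000           # neural stem cell pool (illustrative)
DIVISIONS_PER_YEAR = 10  # per-cell division rate (illustrative)
P_MUT = 5e-3             # oncogenic mutation chance per division (illustrative)
K_HITS = 4               # mutations needed for tumorigenesis
MAX_AGE = 100

def age_at_transformation():
    """Age (years) at which one cell accumulates K_HITS mutations,
    or None if it never does within MAX_AGE years."""
    hits = 0
    for year in range(MAX_AGE):
        for _ in range(DIVISIONS_PER_YEAR):
            if random.random() < P_MUT:
                hits += 1
                if hits == K_HITS:
                    return year
    return None

ages = [a for a in (age_at_transformation() for _ in range(N_CELLS)) if a is not None]
print(len(ages), "cells transformed; median age:", sorted(ages)[len(ages) // 2])
```

Because several independent rare hits must accumulate, the age distribution of transformation is skewed toward late life, qualitatively matching the late peak incidence described above.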

  8. [Renaissance of training in general surgery in Cambodia: a unique experience or reproducible model].

    Science.gov (United States)

    Dumurgier, C; Baulieux, J

    2005-01-01

    Is the new surgical training program at the University of Phom-Penh, Cambodia a unique experience or can it serve as a model for developing countries? This report describes the encouraging first results of this didactic and hands-on surgical program. Based on their findings the authors recommend not only continuing the program in Phom-Penh but also proposing slightly modified versions to new medical universities not currently offering specialization in surgery.

  9. Evaluation of Nitinol staples for the Lapidus arthrodesis in a reproducible biomechanical model

    Directory of Open Access Journals (Sweden)

    Nicholas Alexander Russell

    2015-12-01

    While the Lapidus procedure is a widely accepted technique for treatment of hallux valgus, the optimal fixation method to maintain joint stability remains controversial. The purpose of this study was to evaluate the biomechanical properties of new shape memory alloy staples arranged in different configurations in a repeatable first tarsometatarsal arthrodesis model. Ten sawbones models of the whole foot (n = 5 per group) were reconstructed using a single dorsal staple or two staples in a delta configuration. Each construct was mechanically tested in dorsal four-point bending, medial four-point bending, dorsal three-point bending, and plantar cantilever bending with the staples activated at 37°C. The peak load, stiffness, and plantar gapping were determined for each test. Pressure sensors were used to measure the contact force and area of the joint footprint in each group. There was a significant (p < 0.05) increase in peak load in the two-staple constructs compared to the single-staple constructs for all testing modalities. Stiffness also increased significantly in all tests except dorsal four-point bending. Pressure sensor readings showed a significantly higher contact force at time zero and contact area following loading in the two-staple constructs (p < 0.05). Both groups completely recovered any plantar gapping following unloading and restored their initial contact footprint. The biomechanical integrity and repeatability of the models was demonstrated with no construct failures due to hardware or model breakdown. Shape memory alloy staples provide fixation with the ability to dynamically apply and maintain compression across a simulated arthrodesis following a range of loading conditions.

  10. Can lagrangian models reproduce the migration time of European eel obtained from otolith analysis?

    Science.gov (United States)

    Rodríguez-Díaz, L.; Gómez-Gesteira, M.

    2017-12-01

    European eel can be found in the Bay of Biscay after a long migration across the Atlantic. The duration of migration, which takes place at the larval stage, is of primary importance to understanding eel ecology and, hence, its survival. This duration is still a controversial matter, since estimates range from 7 months to > 4 years depending on the method used. The minimum migration duration estimated from our lagrangian model is similar to the duration obtained from the microstructure of eel otoliths, which is typically on the order of 7-9 months. The lagrangian model proved to be sensitive to different conditions such as spatial and temporal resolution, release depth, release area, and initial distribution. In general, migration was faster when the release depth was decreased and the model resolution increased. On average, the fastest migration was obtained when only advective horizontal movement was considered, although in some cases even faster migration was obtained when locally oriented random migration was taken into account.
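
The core of such a lagrangian estimate, releasing virtual particles into a current and timing their crossing, can be sketched in a few lines. The current speed, fluctuation amplitude, and distance below are round illustrative numbers, not the model's actual forcing fields:

```python
import random

random.seed(0)

U = 0.3           # mean current speed along the track, m/s (illustrative)
SIGMA = 0.1       # random velocity fluctuation, m/s (illustrative)
DT = 86400.0      # one-day time step, s
DISTANCE = 6.0e6  # migration distance, m (trans-Atlantic scale, illustrative)

def migration_days():
    """Days for one particle to drift DISTANCE under advection
    plus a gaussian velocity fluctuation (capped for safety)."""
    x, t = 0.0, 0
    while x < DISTANCE and t < 5000:
        u = U + random.gauss(0.0, SIGMA)  # advection + fluctuation
        x += u * DT
        t += 1
    return t

times = [migration_days() for _ in range(200)]
print(round(sum(times) / len(times)), "days mean crossing time")
```

With these round numbers the mean crossing time comes out around 230 days, i.e. of the same order as the 7-9 month otolith-based estimate quoted in the record.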

  11. Chimeric Hemagglutinin Constructs Induce Broad Protection against Influenza B Virus Challenge in the Mouse Model

    OpenAIRE

    Ermler, Megan E.; Kirkpatrick, Ericka; Sun, Weina; Hai, Rong; Amanat, Fatima; Chromikova, Veronika; Palese, Peter; Krammer, Florian

    2017-01-01

    Seasonal influenza virus epidemics represent a significant public health burden. Approximately 25% of all influenza virus infections are caused by type B viruses, and these infections can be severe, especially in children. Current influenza virus vaccines are an effective prophylaxis against infection but are impacted by rapid antigenic drift, which can lead to mismatches between vaccine strains and circulating strains. Here, we describe a broadly protective vaccine candidate based on chimeri...

  12. Acute multi-sgRNA knockdown of KEOPS complex genes reproduces the microcephaly phenotype of the stable knockout zebrafish model.

    Directory of Open Access Journals (Sweden)

    Tilman Jobst-Schwan

    Until recently, morpholino oligonucleotides were widely employed in zebrafish as an acute and efficient loss-of-function assay. However, off-target effects and reproducibility issues when compared to stable knockout lines have compromised their further use. Here we employed an acute CRISPR/Cas approach using multiple single guide RNAs (sgRNAs) simultaneously targeting different positions in two exemplar genes (osgep or tprkb) to increase the likelihood of generating mutations on both alleles in the injected F0 generation, and thereby to achieve an effect similar to morpholinos but with the reproducibility of stable lines. This multi-sgRNA approach resulted in median likelihoods for at least one mutation on each allele of >99% and sgRNA-specific insertion/deletion profiles as revealed by deep sequencing. Immunoblot showed a significant reduction in the Osgep and Tprkb proteins. For both genes, the acute multi-sgRNA knockout recapitulated the microcephaly phenotype and reduction in survival that we observed previously in stable knockout lines, though the phenotype was milder in the acute multi-sgRNA knockout. Finally, we quantify the degree of mutagenesis by deep sequencing and provide a mathematical model to quantify the chance of a biallelic loss-of-function mutation. Our findings can be generalized to acute and stable CRISPR/Cas targeting for any zebrafish gene of interest.
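
The record mentions a mathematical model for the chance of a biallelic loss-of-function mutation. A simple independence-based version (an assumption on our part, not necessarily the authors' exact formulation) is that with n guides, each mutating a given allele with probability e, both alleles carry at least one mutation with probability (1 - (1 - e)^n)^2:

```python
def p_biallelic(n_guides, e_per_guide):
    """Chance both alleles carry >= 1 mutation, assuming each guide
    acts independently on each allele (illustrative assumption)."""
    p_allele = 1.0 - (1.0 - e_per_guide) ** n_guides
    return p_allele ** 2

# Illustrative per-guide efficiency of 70%, not measured data:
for n in (1, 2, 4):
    print(n, "guides:", round(p_biallelic(n, 0.70), 4))
```

The per-allele term saturates quickly, which is why stacking several guides against the same gene pushes the biallelic probability toward the >99% medians reported above.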

  13. Realizing the Living Paper using the ProvONE Model for Reproducible Research

    Science.gov (United States)

    Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.

    2015-12-01

    Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on the prior work of colleagues. The open-source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software giving proper credit, with less repetition, and with confidence in the relationship to the original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first-class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise.

  14. A discrete particle model reproducing collective dynamics of a bee swarm.

    Science.gov (United States)

    Bernardi, Sara; Colombi, Annachiara; Scianna, Marco

    2018-02-01

    In this article, we present a microscopic discrete mathematical model describing collective dynamics of a bee swarm. More specifically, each bee is set to move according to individual strategies and social interactions, the former involving the desire to reach a target destination, the latter accounting for repulsive/attractive stimuli and for alignment processes. The insects tend in fact to remain sufficiently close to the rest of the population, while avoiding collisions, and they are able to track and synchronize their movement to the flight of a given set of neighbors within their visual field. The resulting collective behavior of the bee cloud therefore emerges from non-local short/long-range interactions. Differently from similar approaches present in the literature, we here test different alignment mechanisms (i.e., based either on an Euclidean or on a topological neighborhood metric), which have an impact also on the other social components characterizing insect behavior. A series of numerical realizations then shows the phenomenology of the swarm (in terms of pattern configuration, collective productive movement, and flight synchronization) in different regions of the space of free model parameters (i.e., strength of attractive/repulsive forces, extension of the interaction regions). In this respect, constraints in the possible variations of such coefficients are here given both by reasonable empirical observations and by analytical results on some stability characteristics of the defined pairwise interaction kernels, which have to assure a realistic crystalline configuration of the swarm. An analysis of the effect of unconscious random fluctuations of bee dynamics is also provided. Copyright © 2018 Elsevier Ltd. All rights reserved.
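
The individual/social rules described in this record (repulsion at short range, attraction at longer range, alignment with neighbours in the visual field) can be sketched as a pairwise velocity update. Function names, radii, and weights below are illustrative, not the paper's calibrated parameters:

```python
import math

R_REP, R_ATT = 1.0, 5.0          # repulsion / attraction radii (illustrative)
W_REP, W_ATT, W_ALI = 2.0, 0.5, 0.3  # interaction weights (illustrative)

def social_velocity(me_pos, me_vel, neighbours):
    """neighbours: list of ((x, y), (vx, vy)) tuples.
    Returns the social velocity contribution for one bee."""
    vx = vy = 0.0
    for (px, py), (ux, uy) in neighbours:
        dx, dy = px - me_pos[0], py - me_pos[1]
        d = math.hypot(dx, dy) or 1e-9
        if d < R_REP:                   # too close: move away
            vx -= W_REP * dx / d
            vy -= W_REP * dy / d
        elif d < R_ATT:                 # comfortable range: drift closer
            vx += W_ATT * dx / d
            vy += W_ATT * dy / d
        vx += W_ALI * (ux - me_vel[0])  # align with neighbour's flight
        vy += W_ALI * (uy - me_vel[1])
    return vx, vy

# One stationary bee with a single neighbour 3 units to the east, flying east:
print(social_velocity((0, 0), (0, 0), [((3.0, 0.0), (1.0, 0.0))]))
```

At distance 3 the neighbour falls in the attraction band, so the update combines a pull toward it with alignment to its eastward flight; swarm-level patterns emerge when this update is applied to every bee at every time step.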

  15. Sprague-Dawley rats are a sustainable and reproducible animal model for induction and study of oral submucous fibrosis

    Directory of Open Access Journals (Sweden)

    Shilpa Maria

    2015-01-01

    Background: Oral submucous fibrosis (OSF) is a chronic debilitating disease predominantly affecting the oral cavity and oropharynx. Characteristic histological traits of OSF include epithelial atrophy, inflammation, and a generalized submucosal fibrosis. Several studies and epidemiological surveys provide substantial evidence that areca nut is the main etiological factor for OSF. Patients' hesitance to undergo biopsy, together with clinicians' increasing reluctance to take biopsies in cases of OSF, has prompted researchers to develop animal models to study the disease process. Materials and Methods: The present study evaluates the efficacy, sustainability, and reproducibility of using Sprague-Dawley (SD) rats as a possible model for the induction and progression of OSF. The buccal mucosa of SD rats was injected with areca nut and pan masala solutions on alternate days over a period of 48 weeks. The control group was treated with saline. The influence of areca nut and pan masala on the oral epithelium and connective tissue was evaluated by light microscopy. Results: OSF-like lesions were seen in both the areca nut and pan masala treated groups. The histological changes observed included atrophic epithelium, partial or complete loss of rete ridges, juxta-epithelial hyalinization, inflammation, and subepithelial accumulation of dense bundles of collagen fibers. Conclusions: Histopathological changes in SD rats following treatment with areca nut and pan masala solutions bear a close resemblance to those seen in humans with OSF. SD rats appear to be a cheap, efficient, sustainable, and reproducible model for the induction and development of OSF.

  16. Comparative analysis of 5 lung cancer natural history and screening models that reproduce outcomes of the NLST and PLCO trials.

    Science.gov (United States)

    Meza, Rafael; ten Haaf, Kevin; Kong, Chung Yin; Erdogan, Ayca; Black, William C; Tammemagi, Martin C; Choi, Sung Eun; Jeon, Jihyoun; Han, Summer S; Munshi, Vidit; van Rosmalen, Joost; Pinsky, Paul; McMahon, Pamela M; de Koning, Harry J; Feuer, Eric J; Hazelton, William D; Plevritis, Sylvia K

    2014-06-01

    The National Lung Screening Trial (NLST) demonstrated that low-dose computed tomography screening is an effective way of reducing lung cancer (LC) mortality. However, optimal screening strategies have not been determined to date and it is uncertain whether lighter smokers than those examined in the NLST may also benefit from screening. To address these questions, it is necessary to first develop LC natural history models that can reproduce NLST outcomes and simulate screening programs at the population level. Five independent LC screening models were developed using common inputs and calibration targets derived from the NLST and the Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO). Imputation of missing information regarding smoking, histology, and stage of disease for a small percentage of individuals and diagnosed LCs in both trials was performed. Models were calibrated to LC incidence, mortality, or both outcomes simultaneously. Initially, all models were calibrated to the NLST and validated against PLCO. Models were found to validate well against individuals in PLCO who would have been eligible for the NLST. However, all models required further calibration to PLCO to adequately capture LC outcomes in PLCO never-smokers and light smokers. Final versions of all models produced incidence and mortality outcomes in the presence and absence of screening that were consistent with both trials. The authors developed 5 distinct LC screening simulation models based on the evidence in the NLST and PLCO. The results of their analyses demonstrated that the NLST and PLCO have produced consistent results. The resulting models can be important tools to generate additional evidence to determine the effectiveness of lung cancer screening strategies using low-dose computed tomography. © 2014 American Cancer Society.

  17. Broad spectrum antibiotic enrofloxacin modulates contact sensitivity through gut microbiota in a murine model.

    Science.gov (United States)

    Strzępa, Anna; Majewska-Szczepanik, Monika; Lobo, Francis M; Wen, Li; Szczepanik, Marian

    2017-07-01

    Medical advances in the field of infection therapy have led to an increasing use of antibiotics, which, apart from eliminating pathogens, also partially eliminate naturally existing commensal bacteria. It has become increasingly clear that less exposure to microbiota early in life may contribute to the observed rise in "immune-mediated" diseases, including autoimmunity and allergy. We sought to test whether changing the gut microbiota with the broad-spectrum antibiotic enrofloxacin would modulate contact sensitivity (CS) in mice. Natural gut microbiota were modified by oral treatment with enrofloxacin prior to sensitization with trinitrophenyl chloride, followed by CS testing. Finally, adoptive cell transfers were performed to characterize the regulatory cells that are induced by microbiota modification. Oral treatment with enrofloxacin suppresses CS and production of anti-trinitrophenyl chloride IgG1 antibodies. Adoptive transfer experiments show that antibiotic administration favors induction of regulatory cells that suppress CS. Flow cytometry and adoptive transfer of purified cells show that antibiotic-induced suppression of CS is mediated by TCRαβ+ CD4+ CD25+ FoxP3+ Treg cells, CD19+ B220+ CD5+ IL-10+ cells, IL-10+ Tr1 cells, and IL-10+ TCRγδ+ cells. Treatment with the antibiotic induces dysbiosis characterized by an increased proportion of Clostridium coccoides (cluster XIVa), C. coccoides-Eubacterium rectale (cluster XIVab), Bacteroidetes, and Bifidobacterium spp., but decreased segmented filamentous bacteria. Transfer of antibiotic-modified gut microbiota inhibits CS, but this response can be restored through oral transfer of control gut bacteria to antibiotic-treated animals. Oral treatment with a broad-spectrum antibiotic modifies gut microbiota composition and promotes an anti-inflammatory response, suggesting that manipulation of the gut microbiota can be a powerful tool to modulate the course of CS. Copyright © 2017 American Academy of Allergy, Asthma & Immunology.

  18. Can the CMIP5 models reproduce interannual to interdecadal southern African summer rainfall variability and their teleconnections?

    Science.gov (United States)

    Dieppois, Bastien; Pohl, Benjamin; Crétat, Julien; Keenlyside, Noel; New, Mark

    2017-04-01

    This study examines for the first time the ability of 28 global climate models from the Coupled Model Intercomparison Project 5 (CMIP5) to reproduce southern African summer rainfall variability and their teleconnections with large-scale modes of climate variability across the dominant timescales. In observations, summer southern African rainfall exhibits three significant timescales of variability over the twentieth century: interdecadal (15-28 years), quasi-decadal (8-13 years), and interannual (2-8 years). Most of CMIP5 simulations underestimate southern African summer rainfall variability at these three timescales, and this bias is proportionally stronger from high- to low-frequency. The inter-model spread is as important as the spread between the ensemble members of a given model, which suggests a strong influence of internal climate variability, and/or large model uncertainties. The underestimated amplitude of rainfall variability for each timescale are linked to unrealistic spatial distributions of these fluctuations over the subcontinent in most CMIP5 models. This is, at least partially, due to a poor representation of the tropical/subtropical teleconnections, which are known to favour wet conditions over southern African rainfall in the observations. Most CMIP5 realisations (85%) fail at simulating sea-surface temperature (SST) anomalies related to a negative Pacific Decadal Oscillation during wetter conditions at the interdecadal timescale. At the quasi-decadal timescale, only one-third of simulations display a negative Interdecadal Pacific Oscillation during wetter conditions, but these SST anomalies are anomalously shifted westward and poleward when compared to observed anomalies. Similar biases in simulating La Niña SST anomalies are identified in more than 50% of CMIP5 simulations at the interannual timescale. These biases in Pacific SST anomalies result in important shifts in the Walker circulation. This impacts southern Africa rainfall variability

  19. A methodology for model-based greenhouse design: Part 1, a greenhouse climate model for a broad range of designs and climates

    NARCIS (Netherlands)

    Vanthoor, B.H.E.; Stanghellini, C.; Henten, van E.J.; Visser, de P.H.B.

    2011-01-01

    With the aim of developing a model-based method to design greenhouses for a broad range of climatic and economic conditions, a greenhouse climate model has been developed and validated. This model describes the effects of the outdoor climate and greenhouse design on the indoor greenhouse climate.

  20. Can CFMIP2 models reproduce the leading modes of cloud vertical structure in the CALIPSO-GOCCP observations?

    Science.gov (United States)

    Wang, Fang; Yang, Song

    2018-02-01

    Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP), i.e. the tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds at middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclonic regions and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models of phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to the CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by the CFMIP2 models. The effects of the CVS modes on relative cloud radiative forcing (RSCRF/RLCRF; RSCRF being calculated at the surface and RLCRF at the top of the atmosphere) are studied with a principal component regression method. Results show that the CFMIP2 models tend to overestimate (underestimate, or simulate with the opposite sign) the RSCRF/RLCRF radiative effects (REs) of ECCM (THCM and SACM) per unit global mean OPC compared to observations. These RE biases may be attributed to two factors: one is the underestimation (overestimation) of low/middle clouds (high clouds), i.e. stronger (weaker) REs per unit low/middle (high) cloud, in the simulated global mean cloud profiles; the other is eigenvector biases in the CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to the improvement of CVS, especially cloud parameterization associated with particular physical processes (e.g.
downwelling regimes with the Hadley circulation, extratropical storm tracks and others), which
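The EOF/PC decomposition described in this record can be sketched in a few lines. This is an illustrative stand-in, not the authors' analysis: the "profiles" below are synthetic cloud-fraction columns with two imposed vertical modes (loosely analogous to a high-cloud vs. low-cloud contrast), and the decomposition recovers them via SVD of the anomaly matrix.

```python
import numpy as np

# Hypothetical illustration: extract leading vertical-structure modes from
# cloud-fraction profiles (grid cells x vertical levels) via PCA/EOF analysis.
rng = np.random.default_rng(0)
n_cells, n_levels = 500, 40

# Synthetic profiles: two imposed vertical modes plus noise (stand-ins for
# structures such as the high-cloud vs. boundary-layer-cloud contrast).
z = np.linspace(0.0, 1.0, n_levels)
mode_high = np.exp(-((z - 0.8) / 0.1) ** 2)   # upper-tropospheric cloud
mode_low = np.exp(-((z - 0.2) / 0.1) ** 2)    # boundary-layer cloud
amps = rng.normal(size=(n_cells, 2))
profiles = amps[:, [0]] * mode_high + amps[:, [1]] * mode_low
profiles += 0.05 * rng.normal(size=(n_cells, n_levels))

# EOF analysis: remove the mean profile, then SVD of the anomaly matrix.
anom = profiles - profiles.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)

eofs = vt[:2]           # leading vertical-structure modes (eigenvectors)
pcs = u[:, :2] * s[:2]  # corresponding principal components

print("variance explained by first two modes: %.2f" % explained[:2].sum())
```

The same eigenvector/PC split is what the record's model-observation comparison operates on: eigenvector biases show up in `eofs`, amplitude biases in the (original) PCs.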

  1. Reproducing Electric Field Observations during Magnetic Storms by means of Rigorous 3-D Modelling and Distortion Matrix Co-estimation

    Science.gov (United States)

    Püthe, Christoph; Manoj, Chandrasekharan; Kuvshinov, Alexey

    2015-04-01

    Electric fields induced in the conducting Earth during magnetic storms drive currents in power transmission grids, telecommunication lines and buried pipelines. These geomagnetically induced currents (GIC) can cause severe service disruptions. The prediction of GIC is thus of great importance for the public and for industry. A key step in the prediction of the hazard to technological systems during magnetic storms is the calculation of the geoelectric field. To address this issue for mid-latitude regions, we developed a method that involves 3-D modelling of induction processes in a heterogeneous Earth and the construction of a model of the magnetospheric source. The latter is described by low-degree spherical harmonics; its temporal evolution is derived from observatory magnetic data. Time series of the electric field can be computed for every location on Earth's surface. The actual electric field, however, is known to be perturbed by galvanic effects, arising from very local near-surface heterogeneities or topography, which cannot be included in the conductivity model. Galvanic effects are commonly accounted for with a real-valued, time-independent distortion matrix, which linearly relates measured and computed electric fields. Using data from various magnetic storms that occurred between 2000 and 2003, we estimated distortion matrices for observatory sites onshore and on the ocean bottom. Strong correlations between modelled and measured fields validate our method. The distortion matrix estimates prove to be reliable, as they are accurately reproduced for different magnetic storms. We further show that 3-D modelling is crucial for a correct separation of galvanic and inductive effects and a precise prediction of electric field time series during magnetic storms. Since the required computational resources are negligible, our approach is suitable for a real-time prediction of GIC. For this purpose, a reliable forecast of the source field, e.g. based on data from satellites
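The distortion-matrix estimation step can be sketched as an ordinary least-squares problem. This is a minimal illustration with synthetic numbers, not the authors' code: a real 2x2 matrix D linking modelled and measured horizontal electric fields, E_meas(t) = D E_comp(t), is recovered from a storm-length time series.

```python
import numpy as np

# Illustrative sketch: estimate a real, time-independent 2x2 distortion
# matrix D from time series of computed and measured horizontal E-fields.
rng = np.random.default_rng(1)
n_times = 2000

D_true = np.array([[1.3, -0.4],
                   [0.2,  0.9]])          # hypothetical galvanic distortion
E_comp = rng.normal(size=(2, n_times))    # modelled Ex, Ey (arbitrary units)
E_meas = D_true @ E_comp + 0.01 * rng.normal(size=(2, n_times))

# lstsq solves E_comp.T @ D.T ~ E_meas.T, i.e. one regression per component.
D_est, *_ = np.linalg.lstsq(E_comp.T, E_meas.T, rcond=None)
D_est = D_est.T

print(np.round(D_est, 2))
```

Because D is time-independent, repeating the fit on data from a different storm should return nearly the same matrix, which is the consistency check the record describes.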

  2. QSAR model reproducibility and applicability: a case study of rate constants of hydroxyl radical reaction models applied to polybrominated diphenyl ethers and (benzo-)triazoles.

    Science.gov (United States)

    Roy, Partha Pratim; Kovarich, Simona; Gramatica, Paola

    2011-08-01

    The crucial importance of the three central OECD principles for quantitative structure-activity relationship (QSAR) model validation is highlighted in a case study of tropospheric degradation of volatile organic compounds (VOCs) by OH, applied to two CADASTER chemical classes (PBDEs and (benzo-)triazoles). The application of any QSAR model to chemicals without experimental data largely depends on model reproducibility by the user. The reproducibility of an unambiguous algorithm (OECD Principle 2) is guaranteed by redeveloping MLR models based on both an updated version of the DRAGON software for molecular descriptor calculation and some freely available online descriptors. The Genetic Algorithm has confirmed its ability to always select the most informative descriptors, independently of the input pool of variables. The ability of the GA-selected descriptors to model chemicals not used in model development is verified by three different splittings (random by response, K-ANN and K-means clustering), thus ensuring the external predictivity of the new models, independently of the training/prediction set composition (OECD Principle 5). The relevance of checking the structural applicability domain becomes very evident on comparing the predictions for CADASTER chemicals, using the new models proposed herein, with those obtained by EPI Suite. Copyright © 2011 Wiley Periodicals, Inc.
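The external-validation idea behind OECD Principle 5 can be illustrated compactly. The sketch below is hypothetical (synthetic descriptors and response, one arbitrary split), not the CADASTER models: an MLR model is fit on a training set and its external predictivity checked on held-out chemicals.

```python
import numpy as np

# Minimal sketch of external validation: fit MLR on a training split and
# compute an external Q2 on held-out chemicals. All data are synthetic.
rng = np.random.default_rng(2)
n_chem, n_desc = 60, 3

X = rng.normal(size=(n_chem, n_desc))              # stand-in GA-selected descriptors
coef_true = np.array([1.0, -0.5, 0.3])
y = X @ coef_true + 0.1 * rng.normal(size=n_chem)  # stand-in response, e.g. log k(OH)

train, test = np.arange(0, 45), np.arange(45, 60)  # one possible split
A = np.column_stack([np.ones(len(train)), X[train]])
beta, *_ = np.linalg.lstsq(A, y[train], rcond=None)

y_pred = np.column_stack([np.ones(len(test)), X[test]]) @ beta
ss_res = np.sum((y[test] - y_pred) ** 2)
ss_tot = np.sum((y[test] - y[train].mean()) ** 2)
q2_ext = 1.0 - ss_res / ss_tot
print("external Q2: %.3f" % q2_ext)
```

Repeating this with different splitting strategies (as the record does with random, K-ANN and K-means splits) tests that predictivity does not hinge on one particular training/prediction composition.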

  3. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle’s Capacity to Generate Force

    Science.gov (United States)

    Call, Jarrod A.; Lowe, Dawn A.

    2018-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration, an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allow researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury. PMID:27492161

  4. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle's Capacity to Generate Force.

    Science.gov (United States)

    Call, Jarrod A; Lowe, Dawn A

    2016-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration, an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allow researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury.

  5. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    Directory of Open Access Journals (Sweden)

    Sonal eGoswami

    2012-06-01

    Full Text Available Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., ones that did or did not require the hippocampus) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze. Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats, but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma, PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  6. Reproducing the organic matter model of anthropogenic dark earth of Amazonia and testing the ecotoxicity of functionalized charcoal compounds

    Directory of Open Access Journals (Sweden)

    Carolina Rodrigues Linhares

    2012-05-01

    Full Text Available The objective of this work was to obtain organic compounds similar to the ones found in the organic matter of anthropogenic dark earth of Amazonia (ADE) using a chemical functionalization procedure on activated charcoal, as well as to determine their ecotoxicity. Based on the study of the organic matter from ADE, an organic model was proposed and an attempt to reproduce it was described. Activated charcoal was oxidized with the use of sodium hypochlorite at different concentrations. Nuclear magnetic resonance was performed to verify whether the spectra of the obtained products were similar to the ones of humic acids from ADE. The similarity between spectra indicated that the obtained products were polycondensed aromatic structures with carboxyl groups: a soil amendment that can contribute to soil fertility and to its sustainable use. An ecotoxicological test with Daphnia similis was performed on the more soluble fraction (fulvic acids) of the produced soil amendment. Aryl chloride was formed during the synthesis of the organic compounds from activated charcoal functionalization and partially removed through a purification process. However, it is probable that some aryl chloride remained in the final product, since the ecotoxicological test indicated that the chemically functionalized soil amendment is moderately toxic.

  7. Assessing the status and trend of bat populations across broad geographic regions with dynamic distribution models

    Science.gov (United States)

    Rodhouse, Thomas J.; Ormsbee, Patricia C.; Irvine, Kathryn M.; Vierling, Lee A.; Szewczak, Joseph M.; Vierling, Kerri T.

    2012-01-01

    Bats face unprecedented threats from habitat loss, climate change, disease, and wind power development, and populations of many species are in decline. A better ability to quantify bat population status and trend is urgently needed in order to develop effective conservation strategies. We used a Bayesian autoregressive approach to develop dynamic distribution models for Myotis lucifugus, the little brown bat, across a large portion of northwestern USA, using a four-year detection history matrix obtained from a regional monitoring program. This widespread and abundant species has experienced precipitous local population declines in northeastern USA resulting from the novel disease white-nose syndrome, and is facing likely range-wide declines. Our models were temporally dynamic and accounted for imperfect detection. Drawing on species–energy theory, we included measures of net primary productivity (NPP) and forest cover in models, predicting that M. lucifugus occurrence probabilities would covary positively along those gradients.
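The record's point about accounting for imperfect detection can be made with a toy simulation. This is an illustrative sketch with invented occupancy and detection rates, not the Bayesian autoregressive model used in the study: it shows why naive occupancy computed from a detection-history matrix is biased low whenever per-visit detection probability is below one.

```python
import numpy as np

# Toy illustration of imperfect detection: naive occupancy from a
# detection-history matrix underestimates true occupancy when p < 1.
rng = np.random.default_rng(3)
n_sites, n_visits = 1000, 4
psi_true, p_detect = 0.6, 0.4      # hypothetical occupancy / detection rates

occupied = rng.random(n_sites) < psi_true
detections = (rng.random((n_sites, n_visits)) < p_detect) & occupied[:, None]

naive = np.mean(detections.any(axis=1))
# P(detected at least once | occupied) = 1 - (1 - p)^J for J visits
corrected = naive / (1.0 - (1.0 - p_detect) ** n_visits)
print("naive %.2f corrected %.2f" % (naive, corrected))
```

A full occupancy model estimates p and occupancy jointly (and, in the dynamic case, colonization/extinction between years), but the bias this sketch exposes is the reason detection must be modeled at all.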

  8. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  9. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models

  10. Models for Broad Area Event Identification and Yield Estimation: Multiple Coda Types

    Science.gov (United States)

    2011-09-01

    microearthquakes accompanying hydraulic fracturing in granitic rock, Bull. Seism. Soc. Am., 81, 553-575, 1991. Fisk, M. and S. R. Taylor, (2007...146882, pp. 13. Yang, X., T. Lay, X.-B. Xie, and M. S. Thorne (2007). Geometric spreading of Pn and Sn in a spherical Earth model, Bull. Seism. Soc

  11. Coupled RipCAS-DFLOW (CoRD) Software and Data Management System for Reproducible Floodplain Vegetation Succession Modeling

    Science.gov (United States)

    Turner, M. A.; Miller, S.; Gregory, A.; Cadol, D. D.; Stone, M. C.; Sheneman, L.

    2016-12-01

    We present the Coupled RipCAS-DFLOW (CoRD) modeling system created to encapsulate the workflow to analyze the effects of stream flooding on vegetation succession. CoRD provides an intuitive command-line and web interface to run DFLOW and RipCAS in succession over many years automatically, which is a challenge because, for our application, DFLOW must be run on a supercomputing cluster via the PBS job scheduler. RipCAS is a vegetation succession model, and DFLOW is a 2D open channel flow model. Data adaptors have been developed to seamlessly convert DFLOW outputs into RipCAS inputs, and vice versa. CoRD provides automated statistical analysis and visualization, plus automatic syncing of input and output files and model run metadata to the hydrological data management system HydroShare using its excellent Python REST client. This combination of technologies and data management techniques allows the results to be shared with collaborators and eventually published. Perhaps most importantly, it allows results to be easily reproduced via either the command-line or web user interface. This system is a result of collaboration between software developers and hydrologists participating in the Western Consortium for Watershed Analysis, Visualization, and Exploration (WC-WAVE). Because of the computing-intensive nature of this particular workflow, including automating job submission/monitoring and data adaptors, software engineering expertise is required. However, the hydrologists provide the software developers with a purpose and ensure a useful, intuitive tool is developed. Our hydrologists contribute software, too: RipCAS was developed from scratch by hydrologists on the team as a specialized, open-source version of the Computer Aided Simulation Model for Instream Flow and Riparia (CASiMiR) vegetation model; our hydrologists running DFLOW provided numerous examples and help with the supercomputing system. This project is written in Python, a popular language in the
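The coupling pattern described here, alternating two models each year with data adaptors in between and persisting per-year state for reproducibility, can be sketched schematically. All function names, quantities, and file formats below are hypothetical stand-ins, not the actual CoRD/RipCAS/DFLOW interfaces.

```python
import json
import pathlib
import tempfile

# Schematic CoRD-style coupling loop with hypothetical stand-in models.
def run_dflow(veg_roughness):
    # stand-in for submitting DFLOW to a cluster and reading its output
    return {"shear_stress": [0.5 * r for r in veg_roughness]}

def run_ripcas(shear_stress):
    # stand-in for the vegetation-succession update
    return [0.1 + 0.2 * s for s in shear_stress]

workdir = pathlib.Path(tempfile.mkdtemp())
roughness = [0.3, 0.4, 0.5]
for year in range(3):
    hydro = run_dflow(roughness)                    # adaptor: vegetation -> hydraulics
    roughness = run_ripcas(hydro["shear_stress"])   # adaptor: hydraulics -> vegetation
    # persist per-year state so the run can be archived and shared
    (workdir / f"year_{year}.json").write_text(json.dumps(
        {"roughness": roughness, "shear_stress": hydro["shear_stress"]}))

print("final roughness:", [round(r, 3) for r in roughness])
```

The archived per-year files play the role that HydroShare syncing plays in CoRD: anyone with the inputs and metadata can replay or inspect the succession run.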

  12. Current models broadly neglect specific needs of biodiversity conservation in protected areas under climate change

    Directory of Open Access Journals (Sweden)

    Moloney Kirk A

    2011-05-01

    Full Text Available Abstract Background Protected areas are the most common and important instrument for the conservation of biological diversity and are called for under the United Nations' Convention on Biological Diversity. Growing human population densities, intensified land-use, invasive species and increasing habitat fragmentation threaten ecosystems worldwide and protected areas are often the only refuge for endangered species. Climate change is posing an additional threat that may also impact ecosystems currently under protection. Therefore, it is of crucial importance to include the potential impact of climate change when designing future nature conservation strategies and implementing protected area management. This approach would go beyond reactive crisis management and, by necessity, would include anticipatory risk assessments. One avenue for doing so is being provided by simulation models that take advantage of the increase in computing capacity and performance that has occurred over the last two decades. Here we review the literature to determine the state-of-the-art in modeling terrestrial protected areas under climate change, with the aim of evaluating and detecting trends and gaps in the current approaches being employed, as well as to provide a useful overview and guidelines for future research. Results Most studies apply statistical, bioclimatic envelope models and focus primarily on plant species as compared to other taxa. Very few studies utilize a mechanistic, process-based approach and none examine biotic interactions like predation and competition. Important factors like land-use, habitat fragmentation, invasion and dispersal are rarely incorporated, restricting the informative value of the resulting predictions considerably. 
Conclusion The general impression that emerges is that biodiversity conservation in protected areas could benefit from the application of modern modeling approaches to a greater extent than is currently reflected in the

  13. Beam-based model of broad-band impedance of the Diamond Light Source

    Science.gov (United States)

    Smaluk, Victor; Martin, Ian; Fielder, Richard; Bartolini, Riccardo

    2015-06-01

    In an electron storage ring, the interaction between a single-bunch beam and the vacuum chamber impedance affects the beam parameters, which can be measured rather precisely, so we can develop beam-based numerical models of the longitudinal and transverse impedances. To get the model parameters at the Diamond Light Source (DLS), a set of measured data has been used, including the current-dependent shifts of the betatron tunes and synchronous phase, the chromatic damping rates, and bunch lengthening. A MATLAB code for multiparticle tracking has been developed. The tracking results and analytical estimations are quite consistent with the measured data. Since Diamond has the shortest natural bunch length among all light sources in standard operation, these studies of collective effects with short bunches are relevant to many facilities, including the next generation of light sources.
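One of the beam-based measurements listed here, the current-dependent betatron tune shift, reduces to a linear fit whose slope constrains the effective transverse impedance. The numbers below are invented for illustration, not Diamond measurements.

```python
import numpy as np

# Schematic fit of coherent betatron tune vs. single-bunch current; the
# slope dQ/dI is proportional to the effective transverse impedance.
rng = np.random.default_rng(4)

current_mA = np.linspace(0.1, 1.0, 10)
slope_true = -4.0e-4                 # tune shift per mA (hypothetical)
tune = 0.22 + slope_true * current_mA + 2e-6 * rng.normal(size=current_mA.size)

slope_fit, tune0 = np.polyfit(current_mA, tune, 1)
print("dQ/dI = %.2e per mA, zero-current tune = %.4f" % (slope_fit, tune0))
```

In a beam-based impedance model, several such fitted slopes (tune shifts, synchronous-phase shift, damping rates, bunch lengthening) jointly constrain the model parameters, which are then cross-checked against multiparticle tracking.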

  14. Beam-based model of broad-band impedance of the Diamond Light Source

    Directory of Open Access Journals (Sweden)

    Victor Smaluk

    2015-06-01

    Full Text Available In an electron storage ring, the interaction between a single-bunch beam and the vacuum chamber impedance affects the beam parameters, which can be measured rather precisely, so we can develop beam-based numerical models of the longitudinal and transverse impedances. To get the model parameters at the Diamond Light Source (DLS), a set of measured data has been used, including the current-dependent shifts of the betatron tunes and synchronous phase, the chromatic damping rates, and bunch lengthening. A MATLAB code for multiparticle tracking has been developed. The tracking results and analytical estimations are quite consistent with the measured data. Since Diamond has the shortest natural bunch length among all light sources in standard operation, these studies of collective effects with short bunches are relevant to many facilities, including the next generation of light sources.

  15. Ecosystem Services Provided by Agricultural Land as Modeled by Broad Scale Geospatial Analysis

    Science.gov (United States)

    Kokkinidis, Ioannis

    Agricultural ecosystems provide multiple services including food and fiber provision, nutrient cycling, soil retention and water regulation. Objectives of the study were to identify and quantify a selection of ecosystem services provided by agricultural land, using existing geospatial tools and preferably free and open source data, such as the Virginia Land Use Evaluation System (VALUES), the North Carolina Realistic Yield Expectations (RYE) database, and the land cover datasets NLCD and CDL. Furthermore I sought to model tradeoffs between provisioning and other services. First I assessed the accuracy of agricultural land in NLCD and CDL over a four county area in eastern Virginia using cadastral parcels. I uncovered issues concerning the definition of agricultural land. The area and location of agriculture saw little change in the 19 years studied. Furthermore all datasets have significant errors of omission (11.3 to 95.1%) and commission (0 to 71.3%). Location of agriculture was used with spatial crop yield databases I created and combined with models I adapted to calculate baseline values for plant biomass, nutrient composition and requirements, land suitability for and potential production of biofuels and the economic impact of agriculture for the four counties. The study area was then broadened to cover 97 counties in eastern Virginia and North Carolina, investigating the potential for increased regional grain production through intensification and extensification of agriculture. Predicted yield from geospatial crop models was compared with produced yield from the NASS Survey of Agriculture. Area of most crops in CDL was similar to that in the Survey of Agriculture, but a yield gap is present for most years, partially due to weather, thus indicating potential for yield increase through intensification. Using simple criteria I quantified the potential to extend agriculture in high yield land in other uses and modeled the changes in erosion and runoff should

  16. Multi-epitope Models Explain How Pre-existing Antibodies Affect the Generation of Broadly Protective Responses to Influenza.

    Directory of Open Access Journals (Sweden)

    Veronika I Zarnitsyna

    2016-06-01

    Full Text Available The development of next-generation influenza vaccines that elicit strain-transcendent immunity against both seasonal and pandemic viruses is a key public health goal. Targeting the evolutionarily conserved epitopes on the stem of influenza's major surface molecule, hemagglutinin (HA), is an appealing prospect, and novel vaccine formulations show promising results in animal model systems. However, studies in humans indicate that natural infection and vaccination result in limited boosting of antibodies to the stem of HA, and the level of stem-specific antibody elicited is insufficient to provide broad strain-transcendent immunity. Here, we use mathematical models of the humoral immune response to explore how pre-existing immunity affects the ability of vaccines to boost antibodies to the head and stem of HA in humans, and, in particular, how it leads to the apparent lack of boosting of broadly cross-reactive antibodies to the stem epitopes. We consider hypotheses where binding of antibody to an epitope: (i) results in more rapid clearance of the antigen; (ii) leads to the formation of antigen-antibody complexes which inhibit B cell activation through an Fcγ receptor-mediated mechanism; and (iii) masks the epitope and prevents the stimulation and proliferation of specific B cells. We find only epitope masking, but not the former two mechanisms, to be key in recapitulating the patterns in the data. We discuss the ramifications of our findings for the development of vaccines against both seasonal and pandemic influenza.
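The epitope-masking hypothesis (iii) can be caricatured in a few lines of simulation. This is a highly simplified sketch with invented rates and units, not the paper's model: pre-existing antibody covers a fraction of epitope sites at equilibrium, and only unmasked antigen stimulates the matching B-cell clone, so expansion is strongly suppressed at high pre-existing titre.

```python
# Toy dynamics of the epitope-masking hypothesis; all rates are hypothetical.
def b_cell_expansion(titre, steps=400, dt=0.05):
    antigen, b_cells = 1.0, 0.01
    k_on, k_stim, k_decay = 5.0, 2.0, 0.3
    masked = titre * k_on / (1.0 + titre * k_on)  # equilibrium site coverage
    for _ in range(steps):
        free = antigen * (1.0 - masked)           # unmasked antigen only
        b_cells += dt * k_stim * free * b_cells   # stimulation-driven growth
        antigen -= dt * k_decay * antigen         # antigen clearance
    return b_cells

b_no_masking = b_cell_expansion(titre=0.0)   # naive response
b_masked = b_cell_expansion(titre=2.0)       # high pre-existing titre
print("expansion without vs with masking: %.3f vs %.3f" % (b_no_masking, b_masked))
```

The qualitative outcome, strong boosting without masking and weak boosting with it, mirrors the limited boosting of stem-specific antibodies the record describes; distinguishing masking from the clearance and Fcγ-inhibition hypotheses requires the full multi-epitope model.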

  17. Broad line regions in Seyfert-1 galaxies

    International Nuclear Information System (INIS)

    Groningen, E. van.

    1984-01-01

    To reproduce the observed emission profiles of Seyfert galaxies, rotation in an accretion disk has been proposed. In this thesis, the profiles emitted by such an accretion disk are investigated. Detailed comparison with the observed profiles shows that a considerable fraction can be fitted with a power-law function, as predicted by the model. The author analyzes a series of high quality spectra of Seyfert galaxies, obtained with the 2.5m telescope at Las Campanas. He presents detailed analyses of two objects: Mkn335 and Akn120. In both cases, strong evidence is presented for the presence of two separate broad line zones. These zones are identified with an accretion disk and an outflowing wind. The disk contains gas with very high densities and emits predominantly the lower ionization lines. He reports on the discovery of very broad wings beneath the strong forbidden line at 5007 Å. (Auth.)

  18. Field Validation of Habitat Suitability Models for Vulnerable Marine Ecosystems in the South Pacific Ocean: Implications for the use of Broad-scale Models in Fisheries Management

    Science.gov (United States)

    Anderson, O. F.; Guinotte, J. M.; Clark, M. R.; Rowden, A. A.; Mormede, S.; Davies, A. J.; Bowden, D.

    2016-02-01

    Spatial management of vulnerable marine ecosystems requires accurate knowledge of their distribution. Predictive habitat suitability modelling, using species presence data and a suite of environmental predictor variables, has emerged as a useful tool for inferring distributions outside of known areas. However, validation of model predictions is typically performed with non-independent data. In this study, we describe the results of habitat suitability models constructed for four deep-sea reef-forming coral species across a large region of the South Pacific Ocean using MaxEnt and Boosted Regression Tree modelling approaches. In order to validate model predictions we conducted a photographic survey on a set of seamounts in an un-sampled area east of New Zealand. The likelihood of habitat suitable for reef forming corals on these seamounts was predicted to be variable, but very high in some regions, particularly where levels of aragonite saturation, dissolved oxygen, and particulate organic carbon were optimal. However, the observed frequency of coral occurrence in analyses of survey photographic data was much lower than expected, and patterns of observed versus predicted coral distribution were not highly correlated. The poor performance of these broad-scale models is attributed to lack of recorded species absences to inform the models, low precision of global bathymetry models, and lack of data on the geomorphology and substrate of the seamounts at scales appropriate to the modelled taxa. This demonstrates the need to use caution when interpreting and applying broad-scale, presence-only model results for fisheries management and conservation planning in data poor areas of the deep sea. Future improvements in the predictive performance of broad-scale models will rely on the continued advancement in modelling of environmental predictor variables, refinements in modelling approaches to deal with missing or biased inputs, and incorporation of true absence data.
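The validation step described here, testing predicted habitat suitability against independently surveyed presence/absence, is often summarized with a rank-based AUC. The sketch below uses synthetic data (an invented weak suitability-occurrence relationship, echoing the poor correlation the study found), not the actual MaxEnt/BRT outputs or survey records.

```python
import numpy as np

# Sketch: compare predicted suitability with independent survey
# presence/absence via AUC (Mann-Whitney pairwise form). Data are synthetic.
rng = np.random.default_rng(5)
n_sites = 300

suitability = rng.random(n_sites)                  # model output in [0, 1]
# occurrence tracks suitability only weakly (hypothetical relationship)
presence = rng.random(n_sites) < 0.1 + 0.2 * suitability

def auc(score, label):
    # P(score at a random presence site > score at a random absence site)
    pos, neg = score[label], score[~label]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

a = auc(suitability, presence)
print("validation AUC: %.2f" % a)
```

An AUC near 0.5 indicates predictions little better than chance for the surveyed sites; crucially, this check only works when the survey data are truly independent of the data used to fit the model.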

  19. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise-reproducible computation is possible, whether computational research in DOE can improve its publication process, and whether reproducible results can be achieved apart from the peer-review process.

  20. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination, or reproducibility of results is very poor. On the other hand, a defect can be detected on each of several subsequent examinations, for higher reliability, and still have poor reproducibility of results.

  1. From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics.

    Directory of Open Access Journals (Sweden)

    Alejandra González-Beltrán

    Full Text Available Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed. The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which the ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler. Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of the ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings. The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of an errata. SOAPdenovo2 scripts, data, and results are available through the GigaScience Database: http://dx.doi.org/10.5524/100044; the workflows are available from GigaGalaxy: http://galaxy.cbiit.cuhk.edu.hk; and the representations using the ISA, NP, and RO models are available through the SOAPdenovo2 case study website http://isa-tools.github.io/soapdenovo2

  2. Failed Radiatively Accelerated Dusty Outflow Model of the Broad Line Region in Active Galactic Nuclei. I. Analytical Solution

    Energy Technology Data Exchange (ETDEWEB)

    Czerny, B.; Panda, S.; Wildy, C.; Sniegowska, M. [Center for Theoretical Physics, Polish Academy of Sciences, Al. Lotników 32/46, 02-668 Warsaw (Poland); Li, Yan-Rong; Wang, J.-M. [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Road, Beijing 100049 (China); Hryniewicz, K.; Sredzinska, J. [Copernicus Astronomical Center, Polish Academy of Sciences, Bartycka 18, 00-716 Warsaw (Poland); Karas, V., E-mail: bcz@cft.edu.pl [Astronomical Institute, Academy of Sciences, Bocni II 1401, CZ-141 00 Prague (Czech Republic)

    2017-09-10

    The physical origin of the broad line region in active galactic nuclei is still unclear despite many years of observational studies. The reason is that the region is unresolved, and the reverberation mapping results imply a complex velocity field. We adopt a theory-motivated approach to identify the principal mechanism responsible for this complex phenomenon. We consider the possibility that the role of dust is essential. We assume that the local radiation pressure acting on the dust in the accretion disk atmosphere launches the outflow of material, but higher above the disk the irradiation from the central parts causes dust evaporation and a subsequent fallback. This failed radiatively accelerated dusty outflow is expected to represent the material forming low ionization lines. In this paper we formulate simple analytical equations to describe the cloud motion, including the evaporation phase. The model is fully described by just the basic parameters of black hole mass, accretion rate, black hole spin, and viewing angle. We study how generic spectral line profiles correspond to these dynamics. We show that the virial factor calculated from our model strongly depends on the black hole mass in the case of enhanced dust opacity, and thus it then correlates with the line width. This could explain why the virial factor measured in galaxies with pseudobulges differs from that obtained from objects with classical bulges, although the trend predicted by the current version of the model is opposite to the observed trend.

  3. 3D-modeling of the spine using EOS imaging system: Inter-reader reproducibility and reliability.

    Science.gov (United States)

    Rehm, Johannes; Germann, Thomas; Akbar, Michael; Pepke, Wojciech; Kauczor, Hans-Ulrich; Weber, Marc-André; Spira, Daniel

    2017-01-01

    To retrospectively assess the inter-reader reproducibility and reliability of EOS 3D full spine reconstructions in patients with adolescent idiopathic scoliosis (AIS). 73 patients with a mean age of 17 years and moderate AIS (median Cobb angle 18.2°) obtained low-dose standing biplanar radiographs with EOS. Two independent readers performed 3D reconstructions of the spine with the "full-spine" method, adjusting the bone contour of every thoracic and lumbar vertebra (Th1-L5). Inter-reader reproducibility was assessed regarding rotation of every single vertebra in the coronal (i.e. frontal), sagittal (i.e. lateral), and axial planes, T1/T12 kyphosis, T4/T12 kyphosis, L1/L5 lordosis, L1/S1 lordosis and pelvic parameters. Radiation exposure, scan time and 3D reconstruction time were recorded. The intraclass correlation coefficient (ICC) ranged between 0.83 and 0.98 for frontal vertebral rotation, between 0.94 and 0.99 for lateral vertebral rotation and between 0.51 and 0.88 for axial vertebral rotation. ICC was 0.92 for T1/T12 kyphosis, 0.95 for T4/T12 kyphosis, 0.90 for L1/L5 lordosis, 0.85 for L1/S1 lordosis, 0.97 for pelvic incidence, 0.96 for sacral slope, 0.98 for sagittal pelvic tilt and 0.94 for lateral pelvic tilt. The mean time for reconstruction was 14.9 minutes (reader 1: 14.6 minutes, reader 2: 15.2 minutes). 3D angle measurement of vertebral rotation proved to be reliable and was performed in an acceptable reconstruction time. Inter-reader reproducibility of axial rotation was limited to some degree in the upper and middle thoracic spine due to the obtuse angulation of the pedicles and the spinous processes in the frontal view, which somewhat complicates their delineation.
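    The inter-reader agreement figures quoted above are intraclass correlations; the standard two-way random-effects, single-measures form, ICC(2,1) (Shrout-Fleiss), can be sketched in pure Python. The rotation angles below are invented for illustration, not the study's data:

    ```python
    # Two-way random-effects, single-measures intraclass correlation, ICC(2,1).
    # Pure-Python sketch; the rotation angles below are hypothetical.
    from statistics import mean

    def icc_2_1(scores):
        """scores: one row per subject, one column per reader."""
        n = len(scores)           # subjects
        k = len(scores[0])        # readers
        grand = mean(v for row in scores for v in row)
        row_means = [mean(row) for row in scores]
        col_means = [mean(col) for col in zip(*scores)]
        # Two-way ANOVA decomposition of the total sum of squares
        ss_rows = k * sum((m - grand) ** 2 for m in row_means)
        ss_cols = n * sum((m - grand) ** 2 for m in col_means)
        ss_total = sum((v - grand) ** 2 for row in scores for v in row)
        ss_err = ss_total - ss_rows - ss_cols
        msr = ss_rows / (n - 1)                 # between-subjects mean square
        msc = ss_cols / (k - 1)                 # between-readers mean square
        mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Hypothetical axial-rotation readings (degrees) from two readers:
    rotations = [[12.1, 11.8], [18.4, 18.9], [25.0, 24.2], [9.7, 10.1], [31.5, 30.8]]
    print(f"ICC(2,1) = {icc_2_1(rotations):.3f}")
    ```

    An ICC near 1 indicates near-perfect agreement between readers; values in the 0.5-0.9 range, as reported for axial rotation, indicate moderate to good agreement.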

  4. Population Structure in the Model Grass Brachypodium distachyon Is Highly Correlated with Flowering Differences across Broad Geographic Areas

    Directory of Open Access Journals (Sweden)

    Ludmila Tyler

    2016-07-01

    Full Text Available The small, annual grass Brachypodium distachyon (L.) Beauv., a close relative of wheat (Triticum aestivum L.) and barley (Hordeum vulgare L.), is a powerful model system for cereals and bioenergy grasses. Genome-wide association studies (GWAS) of natural variation can elucidate the genetic basis of complex traits but have so far been limited in B. distachyon by the lack of large numbers of well-characterized and sufficiently diverse accessions. Here, we report on genotyping-by-sequencing (GBS) of 84 B. distachyon, seven B. stacei, and three B. hybridum accessions with diverse geographic origins including Albania, Armenia, Georgia, Italy, Spain, and Turkey. Over 90,000 high-quality single-nucleotide polymorphisms (SNPs) distributed across the Bd21 reference genome were identified. Our results confirm the hybrid nature of the B. hybridum genome, which appears as a mosaic of B. distachyon-like and B. stacei-like sequences. Analysis of more than 50,000 SNPs for the B. distachyon accessions revealed three distinct, genetically defined populations. Surprisingly, these genomic profiles are associated with differences in flowering time rather than with broad geographic origin. High levels of differentiation in loci associated with floral development support the differences in flowering phenology between populations. Genome-wide association studies combining genotypic and phenotypic data also suggest the presence of one or more photoperiodism, circadian clock, and vernalization genes in loci associated with flowering time variation within populations. Our characterization elucidates genes underlying population differences, expands the germplasm resources available for B. distachyon, and illustrates the feasibility and limitations of GWAS in this model grass.

  5. Dynamic contrast-enhanced computed tomography in metastatic nasopharyngeal carcinoma: reproducibility analysis and observer variability of the distributed parameter model.

    Science.gov (United States)

    Ng, Quan-Sing; Thng, Choon Hua; Lim, Wan Teck; Hartono, Septian; Thian, Yee Liang; Lee, Puor Sherng; Tan, Daniel Shao-Weng; Tan, Eng Huat; Koh, Tong San

    2012-01-01

    To determine the reproducibility and observer variability of distributed parameter analysis of dynamic contrast-enhanced computed tomography (DCE-CT) data in metastatic nasopharyngeal carcinoma, and to compare two approaches of region-of-interest (ROI) analyses. Following ethical approval and informed consent, 17 patients with nasopharyngeal carcinoma underwent paired DCE-CT examinations on a 64-detector scanner, measuring tumor blood flow (F, mL/100 mL/min), permeability surface area product (PS, mL/100 mL/min), fractional intravascular blood volume (v1, mL/100 mL), and fractional extracellular-extravascular volume (v2, mL/100 mL). Tumor parameters were derived by fitting (i) the ROI-averaged concentration-time curve, and (ii) the median value of parameters from voxel-level concentration-time curves. Measurement reproducibility and inter- and intraobserver variability were estimated using Bland-Altman statistics. Mean F, PS, v1, and v2 are 44.9, 20.4, 7.1, and 34.1 for ROI analysis, and 49.0, 18.7, 6.7, and 34.0 for voxel analysis, respectively. Within-subject coefficients of variation are 38.8%, 49.5%, 54.2%, and 35.9% for ROI analysis, and 15.0%, 35.1%, 33.0%, and 21.0% for voxel analysis, respectively. Repeatability coefficients are 48.2, 28.0, 10.7, and 33.9 for ROI analysis, and 20.3, 18.2, 6.1 and 19.8 for voxel analysis, respectively. Intra- and interobserver correlation coefficients ranged from 0.94 to 0.97 and 0.90 to 0.95 for voxel analysis, and 0.73 to 0.87 and 0.72 to 0.94 for ROI analysis, respectively. Measurements of F and v2 appear more reproducible than PS and v1. Voxel-level analysis improves both reproducibility and observer variability compared with ROI-averaged analysis and may retain information about tumor spatial heterogeneity.
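    The Bland-Altman test-retest statistics used above follow a standard recipe: a within-subject SD from the paired differences, a within-subject coefficient of variation (wCV), and a 95% repeatability coefficient (2.77 × wSD). A minimal sketch, with illustrative blood-flow values rather than the study's data:

    ```python
    # Test-retest (Bland-Altman) reproducibility statistics for paired scans.
    # Illustrative values only; each subject contributes one pair of scans.
    from math import sqrt
    from statistics import mean

    def test_retest_stats(scan1, scan2):
        diffs = [a - b for a, b in zip(scan1, scan2)]
        # within-subject SD: each pair contributes d^2 / 2 (one df per subject)
        wsd = sqrt(sum(d * d for d in diffs) / (2 * len(diffs)))
        wcv = wsd / mean(scan1 + scan2)   # report as a percentage: 100 * wcv
        rc = 2.77 * wsd                   # 95% repeatability coefficient
        return wsd, wcv, rc

    flow_scan1 = [44.0, 52.3, 39.8, 61.2, 47.5]   # e.g. F, mL/100 mL/min
    flow_scan2 = [41.5, 55.0, 43.1, 58.9, 50.2]
    wsd, wcv, rc = test_retest_stats(flow_scan1, flow_scan2)
    print(f"wSD={wsd:.2f}, wCV={100 * wcv:.1f}%, RC={rc:.2f}")
    ```

    The repeatability coefficient is the change that must be exceeded between two scans (95% confidence) to represent a real biological or treatment effect rather than measurement noise.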

  6. Measurement of cerebral blood flow by intravenous xenon-133 technique and a mobile system. Reproducibility using the Obrist model compared to total curve analysis

    DEFF Research Database (Denmark)

    Schroeder, T; Holstein, P; Lassen, N A

    1986-01-01

    The recent development of a mobile 10-detector unit using the i.v. Xenon-133 technique has made it possible to perform repeated bedside measurements of cerebral blood flow (CBF). Test-retest studies were carried out in 38 atherosclerotic subjects in order to evaluate the reproducibility of CBF level and hemispheric asymmetry; asymmetry was considerably more reproducible than CBF level. Using a single detector instead of five regional values averaged as the hemispheric flow increased the standard deviation of CBF level by 10-20%, while the variation in asymmetry was doubled. In optimal measuring conditions the two models revealed no significant differences, but in low-flow situations the artifact model yielded significantly more stable results. The present apparatus, equipped with 3-5 detectors covering each hemisphere, offers the opportunity of performing serial CBF measurements in situations not otherwise feasible.

  7. Qualification of a Plant Disease Simulation Model: Performance of the LATEBLIGHT Model Across a Broad Range of Environments.

    Science.gov (United States)

    Andrade-Piedra, Jorge L; Forbes, Gregory A; Shtienberg, Dani; Grünwald, Niklaus J; Chacón, María G; Taipe, Marco V; Hijmans, Robert J; Fry, William E

    2005-12-01

    ABSTRACT The concept of model qualification, i.e., discovering the domain over which a validated model may be properly used, was illustrated with LATEBLIGHT, a mathematical model that simulates the effect of weather, host growth and resistance, and fungicide use on asexual development and growth of Phytophthora infestans on potato foliage. Late blight epidemics from Ecuador, Mexico, Israel, and the United States involving 13 potato cultivars (32 epidemics in total) were compared with model predictions using graphical and statistical tests. Fungicides were not applied in any of the epidemics. For the simulations, a host resistance level was assigned to each cultivar based on general categories reported by local investigators. For eight cultivars, the model predictions fit the observed data. For four cultivars, the model predictions overestimated disease, likely due to inaccurate estimates of host resistance. Model predictions were inconsistent for one cultivar and for one location. It was concluded that the domain of applicability of LATEBLIGHT can be extended from the range of conditions in Peru for which it has been previously validated to those observed in this study. A sensitivity analysis showed that, within the range of values observed empirically, LATEBLIGHT is more sensitive to changes in variables related to initial inoculum and to weather than to changes in variables relating to host resistance.

  8. Modeling vegetation heights from high resolution stereo aerial photography: an application for broad-scale rangeland monitoring

    Science.gov (United States)

    Gillan, Jeffrey K.; Karl, Jason W.; Duniway, Michael; Elaksher, Ahmed

    2014-01-01

    Vertical vegetation structure in rangeland ecosystems can be a valuable indicator for assessing rangeland health and monitoring riparian areas, post-fire recovery, available forage for livestock, and wildlife habitat. Federal land management agencies are directed to monitor and manage rangelands at landscape scales, but traditional field methods for measuring vegetation heights are often too costly and time consuming to apply at these broad scales. Most emerging remote sensing techniques capable of measuring surface and vegetation height (e.g., LiDAR or synthetic aperture radar) are often too expensive and require specialized sensors. An alternative remote sensing approach that is potentially more practical for managers is to measure vegetation heights from digital stereo aerial photographs. As aerial photography is already commonly used for rangeland monitoring, acquiring it in stereo enables three-dimensional modeling and estimation of vegetation height. The purpose of this study was to test the feasibility and accuracy of estimating shrub heights from high-resolution (HR, 3-cm ground sampling distance) digital stereo-pair aerial images. Overlapping HR imagery was taken in March 2009 near Lake Mead, Nevada and 5-cm resolution digital surface models (DSMs) were created by photogrammetric methods (aerial triangulation, digital image matching) for twenty-six test plots. We compared the heights of individual shrubs and plot averages derived from the DSMs to field measurements. We found strong positive correlations between field and image measurements for several metrics. Individual shrub heights tended to be underestimated in the imagery; however, accuracy was higher for dense, compact shrubs compared with shrubs with thin branches. Plot averages of shrub height from DSMs were also strongly correlated to field measurements but consistently underestimated. Grasses and forbs were generally too small to be detected with the resolution of the DSMs. Estimates of

  10. Reproducibility study of [{sup 18}F]FPP(RGD){sub 2} uptake in murine models of human tumor xenografts

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Edwin; Liu, Shuangdong; Chin, Frederick; Cheng, Zhen [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Gowrishankar, Gayatri; Yaghoubi, Shahriar [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Wedgeworth, James Patrick [Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Berndorff, Dietmar; Gekeler, Volker [Bayer Schering Pharma AG, Global Drug Discovery, Berlin (Germany); Gambhir, Sanjiv S. [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Canary Center at Stanford for Cancer Early Detection, Nuclear Medicine, Departments of Radiology and Bioengineering, Molecular Imaging Program at Stanford, Stanford, CA (United States)

    2011-04-15

    An {sup 18}F-labeled PEGylated arginine-glycine-aspartic acid (RGD) dimer [{sup 18}F]FPP(RGD){sub 2} has been used to image tumor {alpha}{sub v}{beta}{sub 3} integrin levels in preclinical and clinical studies. Serial positron emission tomography (PET) studies may be useful for monitoring antiangiogenic therapy response or for drug screening; however, the reproducibility of serial scans has not been determined for this PET probe. The purpose of this study was to determine the reproducibility of the integrin {alpha}{sub v}{beta}{sub 3}-targeted PET probe, [{sup 18}F ]FPP(RGD){sub 2} using small animal PET. Human HCT116 colon cancer xenografts were implanted into nude mice (n = 12) in the breast and scapular region and grown to mean diameters of 5-15 mm for approximately 2.5 weeks. A 3-min acquisition was performed on a small animal PET scanner approximately 1 h after administration of [{sup 18}F]FPP(RGD){sub 2} (1.9-3.8 MBq, 50-100 {mu}Ci) via the tail vein. A second small animal PET scan was performed approximately 6 h later after reinjection of the probe to assess for reproducibility. Images were analyzed by drawing an ellipsoidal region of interest (ROI) around the tumor xenograft activity. Percentage injected dose per gram (%ID/g) values were calculated from the mean or maximum activity in the ROIs. Coefficients of variation and differences in %ID/g values between studies from the same day were calculated to determine the reproducibility. The coefficient of variation (mean {+-}SD) for %ID{sub mean}/g and %ID{sub max}/g values between [{sup 18}F]FPP(RGD){sub 2} small animal PET scans performed 6 h apart on the same day were 11.1 {+-} 7.6% and 10.4 {+-} 9.3%, respectively. The corresponding differences in %ID{sub mean}/g and %ID{sub max}/g values between scans were -0.025 {+-} 0.067 and -0.039 {+-} 0.426. Immunofluorescence studies revealed a direct relationship between extent of {alpha}{sub {nu}}{beta}{sub 3} integrin expression in tumors and tumor vasculature
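    The %ID/g metric and the between-scan coefficient of variation used above as the reproducibility measure can be computed straightforwardly; the activity values and injected doses below are hypothetical, and decay correction is omitted for simplicity:

    ```python
    # Percentage injected dose per gram (%ID/g) and the coefficient of
    # variation between a pair of same-day PET scans. Values are invented.
    from statistics import mean, stdev

    def percent_id_per_gram(roi_kbq_per_g, injected_dose_mbq):
        """%ID/g = (ROI activity concentration / injected dose) * 100."""
        return 100.0 * (roi_kbq_per_g / 1000.0) / injected_dose_mbq

    def coefficient_of_variation(pair):
        return stdev(pair) / mean(pair)

    scan1 = percent_id_per_gram(roi_kbq_per_g=95.0, injected_dose_mbq=2.5)
    scan2 = percent_id_per_gram(roi_kbq_per_g=88.0, injected_dose_mbq=2.4)
    cov = coefficient_of_variation([scan1, scan2])
    print(f"scan1={scan1:.2f} %ID/g, scan2={scan2:.2f} %ID/g, CoV={100 * cov:.1f}%")
    ```

    A mean between-scan CoV near 10%, as reported, implies that changes in uptake larger than roughly twice that are likely to reflect real biological change rather than scan-to-scan noise.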

  11. Improving Students' Understanding of Molecular Structure through Broad-Based Use of Computer Models in the Undergraduate Organic Chemistry Lecture

    Science.gov (United States)

    Springer, Michael T.

    2014-01-01

    Several articles suggest how to incorporate computer models into the organic chemistry laboratory, but relatively few papers discuss how to incorporate these models broadly into the organic chemistry lecture. Previous research has suggested that "manipulating" physical or computer models enhances student understanding; this study…

  12. [Reproducing and evaluating a rabbit model of multiple organ dysfunction syndrome after cardiopulmonary resuscitation resulted from asphyxia].

    Science.gov (United States)

    Zhang, Dong; Li, Nan; Chen, Ying; Wang, Yu-shan

    2013-02-01

    To evaluate the reproduction of a model of post resuscitation multiple organ dysfunction syndrome (PR-MODS) after cardiac arrest (CA) in rabbit, in order to provide new methods for post-CA treatment. Thirty-five rabbits were randomly divided into three groups, the sham group (n=5), the 7-minute asphyxia group (n=15), and the 8-minute asphyxia group (n=15). The asphyxia CA model was reproduced with tracheal occlusion. After cardiopulmonary resuscitation (CPR), the rate of return of spontaneous circulation (ROSC), the mortality at different time points and the incidence of systemic inflammatory response syndrome (SIRS) were observed in two asphyxia groups. Creatine kinase isoenzyme (CK-MB), alanine aminotransferase (ALT), creatinine (Cr), glucose (Glu) and arterial partial pressure of oxygen (PaO2) levels in blood were measured in the two asphyxia groups before CPR and 12, 24 and 48 hours after ROSC. The survived rabbits were euthanized at 48 hours after ROSC, and heart, brain, lung, kidney, liver, and intestine were harvested for pathological examination using light microscope. PR-MODS after CA was defined based on the function of main organs and their pathological changes. (1) The incidence of ROSC was 100.0% in 7-minute asphyxia group and 86.7% in 8-minute asphyxia group respectively (P>0.05). The 6-hour mortality in 8-minute asphyxia group was significantly higher than that in 7-minute asphyxia group (46.7% vs. 6.7%, P<0.05). (2) There was a variety of organ dysfunctions in survived rabbits after ROSC, including chemosis, respiratory distress, hypotension, abdominal distension, weakened or absent bowel peristalsis and oliguria. (3) There was no SIRS or associated changes in major organ function in the sham group. SIRS was observed at 12 - 24 hours after ROSC in the two asphyxia groups. CK-MB was increased significantly at 12 hours after ROSC compared with that before asphyxia (7-minute asphyxia group: 786.88±211.84 U/L vs. 468.20±149.45 U/L, 8

  13. THE LICK AGN MONITORING PROJECT 2011: DYNAMICAL MODELING OF THE BROAD-LINE REGION IN Mrk 50

    Energy Technology Data Exchange (ETDEWEB)

    Pancoast, Anna; Brewer, Brendon J.; Treu, Tommaso; Bennert, Vardha N.; Sand, David J. [Department of Physics, University of California, Santa Barbara, CA 93106 (United States); Barth, Aaron J.; Cooper, Michael C. [Department of Physics and Astronomy, 4129 Frederick Reines Hall, University of California, Irvine, CA 92697-4575 (United States); Canalizo, Gabriela [Department of Physics and Astronomy, University of California, Riverside, CA 92521 (United States); Filippenko, Alexei V.; Li, Weidong; Cenko, S. Bradley; Clubb, Kelsey I. [Department of Astronomy, University of California, Berkeley, CA 94720-3411 (United States); Gates, Elinor L. [Lick Observatory, P.O. Box 85, Mount Hamilton, CA 95140 (United States); Greene, Jenny E. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Malkan, Matthew A. [Department of Physics and Astronomy, University of California, Los Angeles, CA 90095-1547 (United States); Stern, Daniel; Assef, Roberto J. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Boulevard, Pasadena, CA 91109 (United States); Woo, Jong-Hak [Astronomy Program, Department of Physics and Astronomy, Seoul National University, Seoul 151-742 (Korea, Republic of); Bae, Hyun-Jin [Department of Astronomy and Center for Galaxy Evolution Research, Yonsei University, Seoul 120-749 (Korea, Republic of); Buehler, Tabitha, E-mail: pancoast@physics.ucsb.edu [Department of Physics and Astronomy, N283 ESC, Brigham Young University, Provo, UT 84602-4360 (United States); and others

    2012-07-20

    We present dynamical modeling of the broad-line region (BLR) in the Seyfert 1 galaxy Mrk 50 using reverberation mapping data taken as part of the Lick AGN Monitoring Project (LAMP) 2011. We model the reverberation mapping data directly, constraining the geometry and kinematics of the BLR, as well as deriving a black hole mass estimate that does not depend on a normalizing factor or virial coefficient. We find that the geometry of the BLR in Mrk 50 is a nearly face-on thick disk, with a mean radius of 9.6{sup +1.2}{sub -0.9} light days, a width of the BLR of 6.9{sup +1.2}{sub -1.1} light days, and a disk opening angle of 25 {+-} 10 deg above the plane. We also constrain the inclination angle to be 9{sup +7}{sub -5} deg, close to face-on. Finally, the black hole mass of Mrk 50 is inferred to be log{sub 10}(M{sub BH}/M{sub Sun }) = 7.57{sup +0.44}{sub -0.27}. By comparison to the virial black hole mass estimate from traditional reverberation mapping analysis, we find the normalizing constant (virial coefficient) to be log{sub 10} f = 0.78{sup +0.44}{sub -0.27}, consistent with the commonly adopted mean value of 0.74 based on aligning the M{sub BH}-{sigma}* relation for active galactic nuclei and quiescent galaxies. While our dynamical model includes the possibility of a net inflow or outflow in the BLR, we cannot distinguish between these two scenarios.
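    For context, the virial coefficient f quoted above relates the dynamically modeled black hole mass to the traditional reverberation-mapping estimate (in standard notation, with ΔV the broad-line width and R_BLR the BLR radius):

    ```latex
    % Traditional reverberation-mapping (virial) mass estimate:
    M_{\mathrm{BH}} = f \, \frac{\Delta V^{2}\, R_{\mathrm{BLR}}}{G}
    % Comparing the dynamically modeled mass with the measured virial
    % product then yields the virial coefficient:
    \log_{10} f = \log_{10} M_{\mathrm{BH}}
                - \log_{10}\!\left(\frac{\Delta V^{2}\, R_{\mathrm{BLR}}}{G}\right)
    ```

    Because the dynamical model constrains M_BH without assuming f, this comparison is what allows the paper to measure f directly for Mrk 50.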

  14. X-Ray Emitting GHz-Peaked Spectrum Galaxies: Testing a Dynamical-Radiative Model with Broad-Band Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Ostorero, L.; /Turin U. /INFN, Turin; Moderski, R.; /Warsaw, Copernicus Astron. Ctr. /KIPAC, Menlo Park; Stawarz, L.; /KIPAC, Menlo Park /Jagiellonian U., Astron. Observ.; Diaferio, A.; /Turin U. /INFN, Turin; Kowalska, I.; /Warsaw U. Observ.; Cheung, C.C.; /NASA, Goddard /Naval Research Lab, Wash., D.C.; Kataoka, J.; /Waseda U., RISE; Begelman, M.C.; /JILA, Boulder; Wagner, S.J.; /Heidelberg Observ.

    2010-06-07

    In a dynamical-radiative model we recently developed to describe the physics of compact, GHz-Peaked-Spectrum (GPS) sources, the relativistic jets propagate across the inner, kpc-sized region of the host galaxy, while the electron population of the expanding lobes evolves and emits synchrotron and inverse-Compton (IC) radiation. Interstellar-medium gas clouds engulfed by the expanding lobes, and photoionized by the active nucleus, are responsible for the radio spectral turnover through free-free absorption (FFA) of the synchrotron photons. The model provides a description of the evolution of the GPS spectral energy distribution (SED) with the source expansion, predicting significant and complex high-energy emission, from the X-ray to the {gamma}-ray frequency domain. Here, we test this model with the broad-band SEDs of a sample of eleven X-ray emitting GPS galaxies with Compact-Symmetric-Object (CSO) morphology, and show that: (i) the shape of the radio continuum at frequencies lower than the spectral turnover is indeed well accounted for by the FFA mechanism; (ii) the observed X-ray spectra can be interpreted as non-thermal radiation produced via IC scattering of the local radiation fields off the lobe particles, providing a viable alternative to the thermal, accretion-disk dominated scenario. We also show that the relation between the hydrogen column densities derived from the X-ray (N{sub H}) and radio (N{sub HI}) data of the sources is suggestive of a positive correlation, which, if confirmed by future observations, would provide further support to our scenario of high-energy emitting lobes.

  16. Development and reproducibility evaluation of a Monte Carlo-based standard LINAC model for quality assurance of multi-institutional clinical trials.

    Science.gov (United States)

    Usmani, Muhammad Nauman; Takegawa, Hideki; Takashina, Masaaki; Numasaki, Hodaka; Suga, Masaki; Anetai, Yusuke; Kurosu, Keita; Koizumi, Masahiko; Teshima, Teruki

    2014-11-01

    Technical developments in radiotherapy (RT) have created a need for systematic quality assurance (QA) to ensure that clinical institutions deliver prescribed radiation doses consistent with the requirements of clinical protocols. For QA, an ideal dose verification system should be independent of the treatment-planning system (TPS). This paper describes the development and reproducibility evaluation of a Monte Carlo (MC)-based standard LINAC model as a preliminary requirement for independent verification of dose distributions. The BEAMnrc MC code is used for characterization of the 6-, 10- and 15-MV photon beams for a wide range of field sizes. The modeling of the LINAC head components is based on the specifications provided by the manufacturer. MC dose distributions are tuned to match Varian Golden Beam Data (GBD). For reproducibility evaluation, calculated beam data is compared with beam data measured at individual institutions. For all energies and field sizes, the MC and GBD agreed to within 1.0% for percentage depth doses (PDDs), 1.5% for beam profiles and 1.2% for total scatter factors (Scps.). Reproducibility evaluation showed that the maximum average local differences were 1.3% and 2.5% for PDDs and beam profiles, respectively. MC and institutions' mean Scps agreed to within 2.0%. An MC-based standard LINAC model developed to independently verify dose distributions for QA of multi-institutional clinical trials and routine clinical practice has proven to be highly accurate and reproducible and can thus help ensure that prescribed doses delivered are consistent with the requirements of clinical protocols. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
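    The agreement criteria quoted above amount to pointwise local percent differences between the Monte Carlo-calculated and measured beam data. A minimal sketch for a percentage-depth-dose (PDD) comparison, with invented curve values at matched depths:

    ```python
    # Local percent difference between an MC-calculated PDD curve and
    # institution-measured beam data. Curve values are hypothetical.
    def max_local_difference(calc, meas):
        """Largest local percent difference |calc - meas| / meas * 100."""
        return max(abs(c - m) / m * 100.0 for c, m in zip(calc, meas))

    pdd_mc       = [100.0, 96.8, 86.5, 77.2, 68.9]   # MC, % of dose maximum
    pdd_measured = [100.0, 97.1, 86.0, 77.9, 68.5]   # measured, same depths

    diff = max_local_difference(pdd_mc, pdd_measured)
    print(f"max local difference = {diff:.2f}%")
    ```

    In a QA workflow, the computed maximum would be checked against the protocol tolerance (e.g. the ~1.0-1.5% figures quoted for PDDs and profiles) before the institutional beam model is accepted.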

  17. Hexadecylphosphocholine (miltefosine) has broad-spectrum fungicidal activity and is efficacious in a mouse model of cryptococcosis.

    Science.gov (United States)

    Widmer, Fred; Wright, Lesley C; Obando, Daniel; Handke, Rosemary; Ganendren, Ranjini; Ellis, David H; Sorrell, Tania C

    2006-02-01

    The alkyl phosphocholine drug miltefosine is structurally similar to natural substrates of the fungal virulence determinant phospholipase B1 (PLB1), which is a potential drug target. We determined the MICs of miltefosine against key fungal pathogens, correlated antifungal activity with inhibition of the PLB1 activities (PLB, lysophospholipase [LPL], and lysophospholipase-transacylase [LPTA]), and investigated its efficacy in a mouse model of disseminated cryptococcosis. Miltefosine inhibited secreted cryptococcal LPTA activity by 35% at the subhemolytic concentration of 25 microM (10.2 microg/ml) and was inactive against mammalian pancreatic phospholipase A2 (PLA2). At 250 microM, cytosolic PLB, LPL, and LPTA activities were inhibited by 25%, 51%, and 77%, respectively. The MICs at which 90% of isolates were inhibited (MIC90s) against Candida albicans, Candida glabrata, Candida krusei, Cryptococcus neoformans, Cryptococcus gattii, Aspergillus fumigatus, Fusarium solani, Scedosporium prolificans, and Scedosporium apiospermum were 2 to 4 microg/ml. The MICs of miltefosine against Candida tropicalis (n = 8) were 2 to 4 microg/ml, those against Aspergillus terreus and Candida parapsilosis were 8 microg/ml (MIC90), and those against Aspergillus flavus (n = 8) were 2 to 16 microg/ml. Miltefosine was fungicidal for C. neoformans, with rates of killing of 2 log units within 4 h at 7.0 microM (2.8 microg/ml). Miltefosine given orally to mice on days 1 to 5 after intravenous infection with C. neoformans delayed the development of illness and mortality and significantly reduced the brain cryptococcal burden. We conclude that miltefosine has broad-spectrum antifungal activity and is active in vivo in a mouse model of disseminated cryptococcosis. The relatively small inhibitory effect on PLB1 enzyme activities at concentrations exceeding the MIC by 2 to 20 times suggests that PLB1 inhibition is not the only mechanism of the antifungal effect.

  18. Pangea breakup and northward drift of the Indian subcontinent reproduced by a numerical model of mantle convection.

    Science.gov (United States)

    Yoshida, Masaki; Hamano, Yozo

    2015-02-12

    Since around 200 Ma, the most notable event in the process of the breakup of Pangea has been the high speed (up to 20 cm yr(-1)) of the northward drift of the Indian subcontinent. Our numerical simulations of 3-D spherical mantle convection approximately reproduced the process of continental drift from the breakup of Pangea at 200 Ma to the present-day continental distribution. These simulations revealed that a major factor in the northward drift of the Indian subcontinent was the large-scale cold mantle downwelling that developed spontaneously in the North Tethys Ocean, attributed to the overall shape of Pangea. The strong lateral mantle flow caused by the high-temperature anomaly beneath Pangea, due to the thermal insulation effect, enhanced the acceleration of the Indian subcontinent during the early stage of the Pangea breakup. The large-scale hot upwelling plumes from the lower mantle, initially located under Africa, might have contributed to the formation of the large-scale cold mantle downwelling in the North Tethys Ocean.

  19. Three-dimensional surgical modelling with an open-source software protocol: study of precision and reproducibility in mandibular reconstruction with the fibula free flap.

    Science.gov (United States)

    Ganry, L; Quilichini, J; Bandini, C M; Leyder, P; Hersant, B; Meningaud, J P

    2017-08-01

    Very few surgical teams currently use totally independent and free solutions to perform three-dimensional (3D) surgical modelling for osseous free flaps in reconstructive surgery. This study assessed the precision and technical reproducibility of a 3D surgical modelling protocol using free open-source software in mandibular reconstruction with fibula free flaps and surgical guides. Precision was assessed through comparison of the 3D surgical guide to the sterilized 3D-printed guide, with accuracy determined at the millimetre level. Reproducibility was assessed in three surgical cases by volumetric comparison at the millimetre level. For the 3D surgical modelling, a difference of less than 0.1 mm was observed, with almost no deformations. The precision of the free flap modelling was between 0.1 mm and 0.4 mm, and the average precision of the complete reconstructed mandible was less than 1 mm. The open-source software protocol demonstrated high accuracy without complications. However, the precision of the surgical case depends on the surgeon's 3D surgical modelling. Therefore, surgeons need training in the use of this protocol before applying it to surgical cases; this constitutes a limitation. Further studies should address the transfer of expertise. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  20. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Full Text Available Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  1. Reproducibility in Seismic Imaging

    Directory of Open Access Journals (Sweden)

    González-Verdejo O.

    2012-04-01

    Full Text Available Within the field of exploration seismology, there is interest at the national level in integrating reproducibility into applied, educational, and research activities related to seismic processing and imaging. This reproducibility implies the description and organization of the elements involved in numerical experiments, so that a researcher, teacher, or student can study, verify, repeat, and modify them independently. In this work, we document and adapt reproducibility in seismic processing and imaging to spread this concept and its benefits, and to encourage the use of open source software in this area within our academic and professional environment. We present an enhanced seismic imaging example, of interest in both academic and professional environments, using Mexican seismic data. As a result of this research, we show that it is possible to assimilate, adapt, and transfer technology at low cost, using open source software and following a reproducible research scheme.

  2. Response of a comprehensive climate model to a broad range of external forcings: relevance for deep ocean ventilation and the development of late Cenozoic ice ages

    Science.gov (United States)

    Galbraith, Eric; de Lavergne, Casimir

    2018-03-01

    Over the past few million years, the Earth descended from the relatively warm and stable climate of the Pliocene into the increasingly dramatic ice age cycles of the Pleistocene. The influences of orbital forcing and atmospheric CO2 on land-based ice sheets have long been considered as the key drivers of the ice ages, but less attention has been paid to their direct influences on the circulation of the deep ocean. Here we provide a broad view on the influences of CO2, orbital forcing and ice sheet size according to a comprehensive Earth system model, by integrating the model to equilibrium under 40 different combinations of the three external forcings. We find that the volume contribution of Antarctic (AABW) vs. North Atlantic (NADW) waters to the deep ocean varies widely among the simulations, and can be predicted from the difference between the surface densities at AABW and NADW deep water formation sites. Minima of both the AABW-NADW density difference and the AABW volume occur near interglacial CO2 (270-400 ppm). At low CO2, abundant formation and northward export of sea ice in the Southern Ocean contributes to very salty and dense Antarctic waters that dominate the global deep ocean. Furthermore, when the Earth is cold, low obliquity (i.e. a reduced tilt of Earth's rotational axis) enhances the Antarctic water volume by expanding sea ice further. At high CO2, AABW dominance is favoured due to relatively warm subpolar North Atlantic waters, with more dependence on precession. Meanwhile, a large Laurentide ice sheet steers atmospheric circulation so as to strengthen the Atlantic Meridional Overturning Circulation (AMOC), but cools the Southern Ocean remotely, enhancing Antarctic sea ice export and leading to very salty and expanded AABW. Together, these results suggest that a `sweet spot' of low CO2, low obliquity and relatively small ice sheets would have poised the AMOC for interruption, promoting Dansgaard-Oeschger-type abrupt change. The deep ocean temperature and

  3. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Science.gov (United States)

    Nada, Rania M; Maal, Thomas J J; Breuning, K Hero; Bergé, Stefaan J; Mostafa, Yehya A; Kuijpers-Jagtman, Anne Marie

    2011-02-09

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre- and post-treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the two models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged from 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered an accurate and reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.
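
The mean absolute distance reported above can be sketched as a nearest-neighbour computation between two registered surface point sets. This is an illustrative reconstruction, not the authors' code: the point arrays, millimetre units, and the brute-force neighbour search are all assumptions.

```python
import numpy as np

def mean_absolute_distance(model_a, model_b):
    """Mean absolute surface distance between two registered 3D models.

    model_a, model_b: (N, 3) and (M, 3) arrays of surface points, assumed
    already superimposed (e.g. by voxel based registration). For each point
    in model_a, take the distance to its nearest neighbour in model_b, then
    average (brute force; fine for modest point counts).
    """
    diffs = model_a[:, None, :] - model_b[None, :, :]   # (N, M, 3) pairwise offsets
    dists = np.linalg.norm(diffs, axis=2)               # (N, M) Euclidean distances
    return dists.min(axis=1).mean()

# Toy check: two identical point sets, one shifted by 0.3 mm along x.
pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
shifted = pts + np.array([0.3, 0.0, 0.0])
print(round(mean_absolute_distance(pts, shifted), 3))
```

For the large point clouds of a real CBCT surface, a k-d tree (e.g. `scipy.spatial.cKDTree`) would replace the quadratic brute-force search.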

  4. Assessment of a numerical model to reproduce event‐scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Measures, R.; Hicks, D. M.; Brasington, J.

    2016-01-01

    Numerical morphological modeling of braided rivers, using a physics‐based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth‐averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high‐flow event. Evaluation of model performance primarily focused upon using high‐resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach‐scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers. PMID:27708477

  5. Skills of General Circulation and Earth System Models in reproducing streamflow to the ocean: the case of Congo river

    Science.gov (United States)

    Santini, M.; Caporaso, L.

    2017-12-01

    Despite the importance of water resources in the context of climate change, it remains difficult to correctly simulate the freshwater cycle over land via General Circulation and Earth System Models (GCMs and ESMs). Existing efforts from the Climate Model Intercomparison Project 5 (CMIP5) were mainly devoted to the validation of atmospheric variables such as temperature and precipitation, with little attention to discharge. Here we investigate the present-day performance of GCMs and ESMs participating in CMIP5 in simulating the discharge of the Congo River to the sea, thanks to: i) the long-term availability of discharge data for the Kinshasa hydrological station, representative of more than 95% of the water flowing in the whole catchment; and ii) the river's still low degree of human intervention, which enables comparison with the (mostly) natural streamflow simulated within CMIP5. Our findings suggest that most models overestimate the streamflow in terms of seasonal cycle, especially in late winter and spring, while overestimation and variability across models are lower in late summer. Weighted ensemble means, based on the simulations' performance under several metrics, are also calculated, showing some improvement of the results. Although simulated inter-monthly and inter-annual percent anomalies do not appear significantly different from those in observed data, when translated into well-consolidated indicators of drought attributes (frequency, magnitude, timing, duration), usually adopted for more immediate communication to stakeholders and decision makers, such anomalies can be misleading. These inconsistencies produce incorrect assessments for water management planning and infrastructure (e.g. dams or irrigated areas), especially if models are used instead of measurements, as in ungauged basins or basins with insufficient data, as well as when relying on models for future estimates without a preliminary quantification of model biases.
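
The inter-monthly percent anomalies mentioned above can be computed as deviations from the long-term annual cycle of discharge. A minimal sketch, assuming monthly discharge arranged as a years-by-months array; the data and names are illustrative, not from the study.

```python
import numpy as np

def percent_anomalies(monthly_flow):
    """Per-month percent anomalies relative to the long-term monthly mean.

    monthly_flow: (n_years, 12) array of discharge (e.g. m^3/s).
    Returns an array of the same shape with (value - clim) / clim * 100,
    where clim is the mean annual cycle over all years.
    """
    clim = monthly_flow.mean(axis=0)          # 12-month climatology
    return (monthly_flow - clim) / clim * 100.0

# Toy example: 3 years of a fake 12-month cycle with one anomalously dry January.
cycle = np.arange(12) + 10.0                  # invented annual cycle
flows = np.tile(cycle, (3, 1))
flows[2, 0] *= 0.5                            # 50% deficit in year 3, month 1
anom = percent_anomalies(flows)
print(round(anom[2, 0], 1))
```

The same anomaly series, computed once from observations and once from a model, is what gets translated into drought frequency, magnitude, timing, and duration indicators.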

  6. Assessment of a numerical model to reproduce event-scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Williams, R. D.; Measures, R.; Hicks, D. M.; Brasington, J.

    2016-08-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers.
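
A DEM-of-Difference sediment budget of the kind used here for model evaluation can be sketched as follows. The grids, cell size, and minimum level of detection are illustrative assumptions, not values from the study.

```python
import numpy as np

def dod_budget(dem_pre, dem_post, cell_area, min_detect=0.1):
    """Reach-scale sediment budget from a DEM of Difference (DoD).

    dem_pre, dem_post: 2-D elevation grids (m) on the same cells.
    cell_area: grid cell area (m^2).
    min_detect: minimum level of detection (m); smaller elevation changes
    are treated as survey noise and ignored (threshold is illustrative).
    Returns (erosion_volume, deposition_volume) in m^3, both positive.
    """
    dod = dem_post - dem_pre
    dod[np.abs(dod) < min_detect] = 0.0       # mask sub-detection change
    deposition = dod[dod > 0].sum() * cell_area
    erosion = -dod[dod < 0].sum() * cell_area
    return erosion, deposition

# Toy 2x2 reach, 1 m^2 cells: 0.5 m of scour in one cell, 0.3 m of fill in another.
pre = np.zeros((2, 2))
post = np.array([[-0.5, 0.3], [0.05, 0.0]])   # 0.05 m is below detection
ero, dep = dod_budget(pre, post, cell_area=1.0)
print(ero, dep)
```

Running the same budget on observed and predicted DoDs gives the paired volumes whose ratio is compared against the factor-of-two accuracy of the transport formula.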

  7. Repeatability and Reproducibility of Corneal Biometric Measurements Using the Visante Omni and a Rabbit Experimental Model of Post-Surgical Corneal Ectasia

    Science.gov (United States)

    Liu, Yu-Chi; Konstantopoulos, Aris; Riau, Andri K.; Bhayani, Raj; Lwin, Nyein C.; Teo, Ericia Pei Wen; Yam, Gary Hin Fai; Mehta, Jodhbir S.

    2015-01-01

    Purpose: To investigate the repeatability and reproducibility of the Visante Omni topographer in obtaining topography measurements of rabbit corneas and to develop a post-surgical model of corneal ectasia. Methods: Eight rabbits were used to study repeatability and reproducibility by assessing the intra- and interobserver bias and limits of agreement. Another nine rabbits, which underwent laser in situ keratomileusis (LASIK) of different diopters (D), were used for the development of the ectasia model. All eyes were examined with the Visante Omni, and corneal ultrastructure was evaluated with transmission electron microscopy (TEM). Results: There was no significant intra- or interobserver difference for mean steep and flat keratometry (K) values of simulated K, anterior, and posterior elevation measurements. Eyes that underwent −5 D LASIK had a significant increase in mean amplitude of astigmatism and posterior surface elevation with time (significant P for trend), producing corneal ectasia that was gradual in development and simulated the human condition. Translational Relevance: The results provide the foundations for the future evaluation of novel treatment modalities for post-surgical ectasia and keratoconus. PMID:25938004
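
Intra- and interobserver bias and limits of agreement, as assessed above, are conventionally computed Bland-Altman style from paired readings. A minimal sketch with invented toy data, not the study's measurements:

```python
import numpy as np

def limits_of_agreement(reading_1, reading_2):
    """Bland-Altman bias and 95% limits of agreement for paired readings.

    reading_1, reading_2: repeated keratometry readings (e.g. diopters)
    of the same eyes by two observers (or two sessions).
    Returns (bias, lower_loa, upper_loa) where the limits are
    bias +/- 1.96 * SD of the paired differences.
    """
    diffs = np.asarray(reading_1, float) - np.asarray(reading_2, float)
    bias = diffs.mean()
    sd = diffs.std(ddof=1)                    # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Toy data: two observers reading the same five corneas.
obs1 = [42.1, 43.0, 41.8, 44.2, 42.5]
obs2 = [42.0, 43.2, 41.9, 44.0, 42.4]
bias, lo, hi = limits_of_agreement(obs1, obs2)
print(round(bias, 3))
```

A bias near zero with narrow limits indicates good interobserver agreement; wide limits flag poor reproducibility even when the mean bias is small.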

  8. A Reliable and Reproducible Model for Assessing the Effect of Different Concentrations of α-Solanine on Rat Bone Marrow Mesenchymal Stem Cells

    Directory of Open Access Journals (Sweden)

    Adriana Ordóñez-Vásquez

    2017-01-01

    Full Text Available Alpha-solanine (α-solanine) is a glycoalkaloid present in potato (Solanum tuberosum). It has been of particular interest because of its toxicity and potential teratogenic effects that include abnormalities of the central nervous system, such as exencephaly, encephalocele, and anophthalmia. Various types of cell culture have been used as experimental models to determine the effect of α-solanine on cell physiology. The morphological changes in mesenchymal stem cells upon exposure to α-solanine have not been established. This study aimed to describe a reliable and reproducible model for assessing the structural changes induced by exposure of mouse bone marrow mesenchymal stem cells (MSCs) to different concentrations of α-solanine for 24 h. The results demonstrate that nonlethal concentrations of α-solanine (2–6 μM) changed the morphology of the cells, including an increase in the number of nucleoli, suggesting elevated protein synthesis, and the formation of spicules. In addition, treatment with α-solanine reduced the number of adherent cells and the formation of colonies in culture. Immunophenotypic characterization and staining of MSCs are proposed as a reproducible method that allows description of cells exposed to the glycoalkaloid α-solanine.

  9. Attempting to train a digital human model to reproduce human subject reach capabilities in an ejection seat aircraft

    NARCIS (Netherlands)

    Zehner, G.F.; Hudson, J.A.; Oudenhuijzen, A.

    2006-01-01

    From 1997 through 2002, the Air Force Research Lab and TNO Defence, Security and Safety (Business Unit Human Factors) were involved in a series of tests to quantify the accuracy of five Human Modeling Systems (HMSs) in determining accommodation limits of ejection seat aircraft. The results of these

  10. A sensitive and reproducible in vivo imaging mouse model for evaluation of drugs against late-stage human African trypanosomiasis.

    Science.gov (United States)

    Burrell-Saward, Hollie; Rodgers, Jean; Bradley, Barbara; Croft, Simon L; Ward, Theresa H

    2015-02-01

    To optimize the Trypanosoma brucei brucei GVR35 VSL-2 bioluminescent strain as an innovative drug evaluation model for late-stage human African trypanosomiasis. An IVIS® Lumina II imaging system was used to detect bioluminescent T. b. brucei GVR35 parasites in mice to evaluate parasite localization and disease progression. Drug treatment was assessed using qualitative bioluminescence imaging and real-time quantitative PCR (qPCR). We have shown that drug dose-response can be evaluated using bioluminescence imaging and confirmed quantification of tissue parasite load using qPCR. The model was also able to detect drug relapse earlier than the traditional blood film detection and even in the absence of any detectable peripheral parasites. We have developed and optimized a new, efficient method to evaluate novel anti-trypanosomal drugs in vivo and reduce the current 180 day drug relapse experiment to a 90 day model. The non-invasive in vivo imaging model reduces the time required to assess preclinical efficacy of new anti-trypanosomal drugs. © The Author 2014. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  11. Isokinetic eccentric exercise as a model to induce and reproduce pathophysiological alterations related to delayed onset muscle soreness

    DEFF Research Database (Denmark)

    Lund, Henrik; Vestergaard-Poulsen, P; Kanstrup, I.L.

    1998-01-01

    Physiological alterations following unaccustomed eccentric exercise in an isokinetic dynamometer of the right m. quadriceps until exhaustion were studied, in order to create a model in which the physiological responses to physiotherapy could be measured. In experiment I (exp. I), seven selected p...

  12. Developing a Collection of Composable Data Translation Software Units to Improve Efficiency and Reproducibility in Ecohydrologic Modeling Workflows

    Science.gov (United States)

    Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.

    2017-12-01

    Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings such as precipitation, temperature, humidity, wind speed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane but critical step in the application workflows. Translation steps can introduce errors, misrepresent data, slow execution, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the ParFlow integrated hydrologic model to demonstrate the impact of translation tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community-approved collection of data transformation components. The components should be self-contained, composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement. Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of
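
The design pattern described, small self-contained translation units composed into a single streaming process, can be sketched as follows. The units, field names, and record format are hypothetical illustrations, not part of the authors' toolkit.

```python
# Each translation unit is a small, self-contained, testable function that
# streams over records; composing them into one pass avoids intermediate
# files and keeps data movement low.

def kelvin_to_celsius(rec):
    """Unit-conversion component (field name 't2m' is illustrative)."""
    rec = dict(rec)                           # avoid mutating the input record
    rec["t2m"] = rec["t2m"] - 273.15
    return rec

def subset_bbox(lat_min, lat_max):
    """Spatial-subsetting component; returning None drops the record."""
    def unit(rec):
        return rec if lat_min <= rec["lat"] <= lat_max else None
    return unit

def compose(*units):
    """Chain translation units into one streaming pipeline."""
    def pipeline(records):
        for rec in records:
            for unit in units:
                rec = unit(rec)
                if rec is None:               # record filtered out
                    break
            else:
                yield rec
    return pipeline

# Toy forcing records standing in for a WRF-derived dataset.
forcing = [{"lat": 43.6, "t2m": 300.15}, {"lat": 10.0, "t2m": 295.0}]
translate = compose(subset_bbox(40.0, 49.0), kelvin_to_celsius)
print([round(r["t2m"], 2) for r in translate(forcing)])
```

Because each unit is a pure function over one record, it can be unit-tested in isolation and the composition parallelized over record batches without changing the units themselves.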

  13. Synchronized mammalian cell culture: part II--population ensemble modeling and analysis for development of reproducible processes.

    Science.gov (United States)

    Jandt, Uwe; Barradas, Oscar Platas; Pörtner, Ralf; Zeng, An-Ping

    2015-01-01

    The consideration of inherent population inhomogeneities of mammalian cell cultures becomes increasingly important for systems biology studies and for developing more stable and efficient processes. However, variations of cellular properties belonging to different sub-populations and their potential effects on cellular physiology and kinetics of culture productivity under bioproduction conditions have not yet been a major focus of research. Culture heterogeneity is strongly determined by the advance of the cell cycle. The assignment of cell-cycle specific cellular variations to large-scale process conditions can be optimally determined based on the combination of (partially) synchronized cultivation under otherwise physiological conditions and subsequent population-resolved model adaptation. The first step has been achieved using the physical selection method of countercurrent flow centrifugal elutriation, recently established in our group for different mammalian cell lines, as presented in Part I of this paper series. In this second part, we demonstrate the successful adaptation and application of a cell-cycle dependent population balance ensemble model to describe and understand synchronized bioreactor cultivations performed with two model mammalian cell lines, AGE1.HNAAT and CHO-K1. Numerical adaptation of the model to experimental data allows for detection of phase-specific parameters and for determination of significant variations between different phases and different cell lines. It shows that special care must be taken with regard to the sampling frequency in such oscillating cultures to minimize phase-shift (jitter) artifacts. Based on predictions of the long-term oscillation behavior of a culture depending on its start conditions, optimal elutriation setup trade-offs between high cell yields and high synchronization efficiency are proposed. © 2014 American Institute of Chemical Engineers.

  14. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Roč. 539, č. 7628 (2016), s. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords : reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  15. Minimum Information about a Cardiac Electrophysiology Experiment (MICEE): Standardised Reporting for Model Reproducibility, Interoperability, and Data Sharing

    Science.gov (United States)

    Quinn, TA; Granite, S; Allessie, MA; Antzelevitch, C; Bollensdorff, C; Bub, G; Burton, RAB; Cerbai, E; Chen, PS; Delmar, M; DiFrancesco, D; Earm, YE; Efimov, IR; Egger, M; Entcheva, E; Fink, M; Fischmeister, R; Franz, MR; Garny, A; Giles, WR; Hannes, T; Harding, SE; Hunter, PJ; Iribe, G; Jalife, J; Johnson, CR; Kass, RS; Kodama, I; Koren, G; Lord, P; Markhasin, VS; Matsuoka, S; McCulloch, AD; Mirams, GR; Morley, GE; Nattel, S; Noble, D; Olesen, SP; Panfilov, AV; Trayanova, NA; Ravens, U; Richard, S; Rosenbaum, DS; Rudy, Y; Sachs, F; Sachse, FB; Saint, DA; Schotten, U; Solovyova, O; Taggart, P; Tung, L; Varró, A; Volders, PG; Wang, K; Weiss, JN; Wettwer, E; White, E; Wilders, R; Winslow, RL; Kohl, P

    2011-01-01

    Cardiac experimental electrophysiology is in need of a well-defined Minimum Information Standard for recording, annotating, and reporting experimental data. As a step toward establishing this, we present a draft standard, called Minimum Information about a Cardiac Electrophysiology Experiment (MICEE). The ultimate goal is to develop a useful tool for cardiac electrophysiologists which facilitates and improves dissemination of the minimum information necessary for reproduction of cardiac electrophysiology research, allowing for easier comparison and utilisation of findings by others. It is hoped that this will enhance the integration of individual results into experimental, computational, and conceptual models. In its present form, this draft is intended for assessment and development by the research community. We invite the reader to join this effort, and, if deemed productive, implement the Minimum Information about a Cardiac Electrophysiology Experiment standard in their own work. PMID:21745496

  16. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    Science.gov (United States)

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies to consistently quantify the performance of three commercial intracranial stents, and contribute to reinforcing confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and

  17. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Full Text Available Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre- and post-treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the two models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged from 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered an accurate and reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.

  18. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude readings for both artificial and natural reflectors was studied for several combinations of instrument/search unit, all being of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels (95% confidence interval) has to be taken into account. This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study [fr
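
Assuming roughly normal scatter, a 95% margin of error like the one quoted corresponds to about 1.96 sample standard deviations of repeated readings. A sketch with invented readings, not the study's data:

```python
import statistics

def margin_95(readings_db):
    """95% margin of error (dB) for repeated amplitude readings of the
    same reflector across instrument/search-unit combinations.

    Assumes approximately normal scatter: margin = 1.96 * sample SD.
    """
    return 1.96 * statistics.stdev(readings_db)

# Illustrative readings (dB) of one artificial reflector with
# different instrument/search-unit combinations of the same type.
readings = [18.0, 21.0, 17.5, 20.0, 19.5, 22.0, 18.5, 20.5]
print(round(margin_95(readings), 1))
```

Comparing this margin computed per reflector location (central area vs. back surface) reproduces the kind of breakdown the study reports.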

  19. Retrospective Correction of Physiological Noise: Impact on Sensitivity, Specificity, and Reproducibility of Resting-State Functional Connectivity in a Reading Network Model.

    Science.gov (United States)

    Krishnamurthy, Venkatagiri; Krishnamurthy, Lisa C; Schwam, Dina M; Ealey, Ashley; Shin, Jaemin; Greenberg, Daphne; Morris, Robin D

    2018-03-01

    It is well accepted that physiological noise (PN) obscures the detection of neural fluctuations in resting-state functional connectivity (rsFC) magnetic resonance imaging. However, a clear consensus is still lacking on an optimal PN correction (PNC) methodology and on how it impacts rsFC signal characteristics. In this study, we probe the impact of three PNC methods: RETROICOR (Glover et al., 2000), ANATICOR (Jo et al., 2010), and RVTMBPM (Bianciardi et al., 2009). Using a reading network model, we systematically explore the effects of PNC optimization on sensitivity, specificity, and reproducibility of rsFC signals. In terms of specificity, ANATICOR was found to be effective in removing local white matter (WM) fluctuations but also aggressively removed expected cortical-to-subcortical functional connections. The ability of RETROICOR to remove PN was equivalent to removal of simulated random PN, such that it artificially inflated connection strength, thereby decreasing sensitivity. RVTMBPM maintained specificity and sensitivity through balanced removal of vasodilatory PN and local WM nuisance edges. Another aspect of this work was exploring the effects of PNC on identifying reading group differences. Most PNC methods accounted for between-subject PN variability, resulting in reduced intersession reproducibility; this effect facilitated the detection of the most consistent group differences. RVTMBPM was most effective in detecting significant group differences due to its inherent sensitivity to removing spatially structured and temporally repeating PN arising from dense vasculature. Finally, the results suggest that combining all three PNC methods resulted in "overcorrection", removing signal along with noise.
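    The three PNC methods compared above share a common computational core: nuisance time courses are regressed out of each voxel's signal by least squares (the methods differ in which regressors they build). A minimal sketch of that shared step, with an illustrative aliased respiratory oscillation standing in for the method-specific regressor sets:

```python
import numpy as np

def regress_out(signal, nuisance_regressors):
    """Project nuisance time courses out of a voxel time series by ordinary
    least squares: residual = signal - X @ beta, with an intercept column."""
    x = np.column_stack([np.ones(len(signal))] + list(nuisance_regressors))
    beta, *_ = np.linalg.lstsq(x, signal, rcond=None)
    return signal - x @ beta

# Toy resting-state series: 300 volumes at TR = 2 s contaminated by an
# (illustrative) respiratory oscillation at 0.3 Hz, aliased by the sampling.
rng = np.random.default_rng(1)
t = np.arange(300) * 2.0
resp = np.sin(2 * np.pi * 0.3 * t)
observed = 0.5 * rng.standard_normal(300) + 3.0 * resp
cleaned = regress_out(observed, [resp])
print(abs(np.corrcoef(cleaned, resp)[0, 1]) < 1e-8)  # → True: residual is orthogonal
```

    The residual is exactly orthogonal to every regressor (including the intercept), which is why the cleaned series carries no trace of the modelled noise; the practical differences between PNC methods lie entirely in how faithfully the regressors capture the true physiological fluctuations.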

  20. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. 
We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined, and then replayed automatically from the saved parameters.
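    The bridging design described above, one core library of palaeomagnetic functions behind both the interactive and the batch front end, can be sketched as follows. The function name and behaviour are illustrative only, not PuffinPlot's actual (Java) API.

```python
import math

# One shared core routine backs both front ends: the GUI calls it as the
# user refines a selection; a batch script calls the same function so the
# analysis can be replayed from the raw data without any clicks.
def declination_mean(declinations_deg):
    """Circular mean of palaeomagnetic declinations, in degrees [0, 360)."""
    s = sum(math.sin(math.radians(d)) for d in declinations_deg)
    c = sum(math.cos(math.radians(d)) for d in declinations_deg)
    return math.degrees(math.atan2(s, c)) % 360

# Batch-style invocation on declinations straddling north.
print(round(declination_mean([350.0, 10.0, 5.0]), 1))
```

    Keeping all numerical logic in such a core library is what makes the two interfaces guaranteed to agree, and what makes a GUI-refined analysis scriptable afterwards.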

  1. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely), the code and data that produce published results are not regularly made available, and even when they are, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. (2016) [1], we argue that a cultural change is required in the computational hydrological community in order to advance, and make more robust, the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, progress that is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU-funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While computational hydrology is our focus, the principles and infrastructure discussed here apply across the computational geosciences.
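    The "well-documented workflow" standard in point (2) can be sketched as a runner that stores, next to every result, the full parameter set and a hash of the input data needed to re-run the analysis. All names and the toy analysis below are hypothetical illustrations, not part of the Switch-On platform.

```python
import hashlib
import json
import pathlib

def run_documented(analysis, params, data_path, out_dir="results"):
    """Run an analysis and write, next to its output, everything needed to
    re-run it: the parameter set and a SHA-256 hash of the input data."""
    data = pathlib.Path(data_path).read_bytes()
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    result = analysis(data, **params)
    provenance = {"params": params,
                  "data_sha256": hashlib.sha256(data).hexdigest()}
    (out / "provenance.json").write_text(json.dumps(provenance, indent=2))
    (out / "result.json").write_text(json.dumps(result))
    return result

# Toy "model": count bytes exceeding a threshold in the input file.
def exceedance_count(data, threshold):
    return {"count": sum(b > threshold for b in data)}

pathlib.Path("discharge.dat").write_bytes(bytes(range(256)))
print(run_documented(exceedance_count, {"threshold": 200}, "discharge.dat"))
# → {'count': 55}
```

    Recording the data hash alongside the parameters is what lets a later reader confirm they are re-running the analysis on exactly the same inputs.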

  2. Opening Reproducible Research

    Science.gov (United States)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing that makes research papers available to the public free of charge; it also refers to a trend in science towards doing research more openly and transparently. When science transforms to open access, we mean not only access to papers, research data being collected, or data being generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are carried out entirely computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well to reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns about the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, including the lack of standard procedures for publishing such information and the lack of benefits after publishing it. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) addresses the main aspects of open access by improving the exchange of, facilitating productive access to, and simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to reproduce the original analysis but also simplifies reuse of its data and procedures in new research.

  3. The 2010 Broad Prize

    Science.gov (United States)

    Education Digest: Essential Readings Condensed for Quick Review, 2011

    2011-01-01

    A new data analysis, based on data collected as part of The Broad Prize process, provides insights into which large urban school districts in the United States are doing the best job of educating traditionally disadvantaged groups: African-American, Hispanic, and low-income students. Since 2002, The Eli and Edythe Broad Foundation has awarded The Broad Prize.

  4. Inhibition of basophil activation by histamine: a sensitive and reproducible model for the study of the biological activity of high dilutions.

    Science.gov (United States)

    Sainte-Laudy, J; Belon, Ph

    2009-10-01

    (another human basophil activation marker). Results were expressed as the mean fluorescence intensity of the CD203c-positive population (MFI-CD203c) and as an activation index calculated by an algorithm. For the mouse basophil model, histamine was measured spectrofluorimetrically. The main result obtained over 28 years of work was the demonstration of a reproducible inhibition of human basophil activation by high dilutions of histamine, with the effect peaking in the range of 15-17CH. The effect was not significant when histamine was replaced by histidine (a histamine precursor) or when cimetidine (a histamine H2 receptor antagonist) was added to the incubation medium. These results were confirmed by flow cytometry. Using the latter technique, we also showed that 4-Methyl histamine (an H2 agonist) induced a similar effect, in contrast to 1-Methyl histamine, an inactive histamine metabolite. Using the mouse model, we showed that histamine high dilutions, in the same range of dilutions, inhibited histamine release. Successively, using different models to study human and murine basophil activation, we demonstrated that high dilutions of histamine, in the range of 15-17CH, induce a reproducible biological effect. This phenomenon has been confirmed by a multi-center study using the HBDT model and by at least three independent laboratories using flow cytometry. The specificity of the observed effect was confirmed, versus water controls at the same dilution level, by the absence of biological activity of inactive compounds such as histidine and 1-Methyl histamine, and by the reversibility of the effect in the presence of a histamine H2 receptor antagonist.

  5. Exploring the origin of broad-band emissions of Mrk 501 with a two-zone model

    Science.gov (United States)

    Lei, Maichang; Yang, Chuyuan; Wang, Jiancheng; Yang, Xiaolin

    2018-04-01

    We propose a two-zone synchrotron self-Compton (SSC) model, comprising an inner, spherical gamma-ray emitting region and a conical radio emitting region located in the extended jet, to alleviate the long-standing "bulk Lorentz factor crisis" in blazars. In this model, the spectral energy distributions (SEDs) of blazars are produced by the gamma-ray emitting region inverse Compton scattering both its own synchrotron photons and the ambient photons from the radio emitting region. Applying the model to Mrk 501 and fitting the radio data, we find that the radio emitting region has a comoving length of ˜0.15 pc and is located at sub-parsec scale from the central engine; the flux from Compton scattering of the ambient photons is so low that it can safely be neglected. The characteristic hard gamma-ray spectrum can be explained by the superposition of two SSC processes, and the model can approximately reproduce the very high energy (VHE) data. Insights into the spectral shape and the inter-band correlations in the flaring state provide a diagnostic both for the bulk Lorentz factor of the radio emitting region, for which lower and upper limits of 8 and 15 are preferred, and for the two-zone SSC model itself. In addition, our two-zone SSC model shows that the gamma-ray emitting region produces flares on timescales of merely a few hours, whereas long outbursts more likely originate from the extended radio emitting region.

  6. Slip model and Synthetic Broad-band Strong Motions for the 2015 Mw 8.3 Illapel (Chile) Earthquake.

    Science.gov (United States)

    Aguirre, P.; Fortuno, C.; de la Llera, J. C.

    2017-12-01

    The Mw 8.3 earthquake that occurred on September 16th, 2015 west of Illapel, Chile, ruptured a 200 km section of the plate boundary between 29º S and 33º S. SAR data acquired by the Sentinel 1A satellite were used to obtain the interferogram of the earthquake and, from it, the component of the surface displacement field in the line of sight of the satellite. Based on this interferogram, the corresponding coseismic slip distribution was determined for different plausible finite-fault geometries. The model that best fits the data is one whose rupture surface is consistent with the Slab 1.0 model, with a constant strike angle of 4º and a variable dip angle ranging from 2.7º near the trench to 24.3º down dip. Using this geometry, the maximum slip obtained is 7.52 m and the corresponding seismic moment is 3.78·10²¹ N·m, equivalent to a moment magnitude Mw 8.3. Calculation of the Coulomb failure stress change induced by this slip distribution evidences a strong correlation between regions where stress is increased as a consequence of the earthquake and the occurrence of the most relevant aftershocks, providing a consistency check for the inversion procedure and its results. The finite-fault model for the Illapel earthquake is used to test a hybrid methodology for the generation of synthetic ground motions that combines a deterministic calculation of the low-frequency content with stochastic modelling of the high-frequency signal. Strong ground motions are estimated at the locations of seismic stations that recorded the Illapel earthquake. The simulations include the effect of local soil conditions, modelled empirically from H/V ratios obtained from a large database of historical seismic records. Comparison of observed and synthetic records based on the 5%-damped response spectra yields satisfactory results for locations where the site response function is more robustly estimated.

  7. Hexadecylphosphocholine (Miltefosine) Has Broad-Spectrum Fungicidal Activity and Is Efficacious in a Mouse Model of Cryptococcosis

    OpenAIRE

    Widmer, Fred; Wright, Lesley C.; Obando, Daniel; Handke, Rosemary; Ganendren, Ranjini; Ellis, David H.; Sorrell, Tania C.

    2006-01-01

    The alkyl phosphocholine drug miltefosine is structurally similar to natural substrates of the fungal virulence determinant phospholipase B1 (PLB1), which is a potential drug target. We determined the MICs of miltefosine against key fungal pathogens, correlated antifungal activity with inhibition of the PLB1 activities (PLB, lysophospholipase [LPL], and lysophospholipase-transacylase [LPTA]), and investigated its efficacy in a mouse model of disseminated cryptococcosis. Miltefosine inhibited ...

  8. Modeling tidal freshwater marsh sustainability in the Sacramento-San Joaquin Delta under a broad suite of potential future scenarios

    Science.gov (United States)

    Swanson, Kathleen M.; Drexler, Judith Z.; Fuller, Christopher C.; Schoellhamer, David H.

    2015-01-01

    In this paper, we report on the adaptation and application of a one-dimensional marsh surface elevation model, the Wetland Accretion Rate Model of Ecosystem Resilience (WARMER), to explore the conditions that lead to sustainable tidal freshwater marshes in the Sacramento–San Joaquin Delta. We defined marsh accretion parameters to encapsulate the range of observed values over historic and modern time-scales, based on measurements from four marshes in high- and low-energy fluvial environments, as well as possible future trends in sediment supply and mean sea level. A sensitivity analysis of 450 simulations was conducted encompassing a range of porosity values, initial elevations, organic and inorganic matter accumulation rates, and sea-level rise (SLR) rates. For the range of inputs considered, the magnitude of SLR over the next century was the primary driver of marsh surface elevation change; sediment supply was the secondary control. More than 84% of the scenarios resulted in sustainable marshes with 88 cm of SLR by 2100, but only 32% and 11% of the scenarios resulted in surviving marshes when SLR was increased to 133 cm and 179 cm, respectively. Marshes situated in high-energy zones were marginally more resilient than those in low-energy zones because of their higher inorganic sediment supply. Overall, the results from this modeling exercise suggest that marshes at the upstream reaches of the Delta—where SLR may be attenuated—and high-energy marshes along major channels with high inorganic sediment accumulation rates will be more resilient to global SLR in excess of 88 cm over the next century than their downstream and low-energy counterparts. However, considerable uncertainties exist in the projected rates of sea-level rise and sediment availability, and more research is needed to constrain these future conditions.
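    The scenario outcomes above follow from a simple balance: a marsh survives when cumulative accretion keeps pace with cumulative sea-level rise. The toy sketch below is not WARMER's actual formulation (which also models compaction, porosity, and depth-dependent accumulation), but it reproduces the qualitative pattern for the three SLR-by-2100 scenarios, with illustrative accretion rates.

```python
def marsh_elevation_2100(e0_cm, organic_cm_yr, inorganic_cm_yr, slr_total_cm,
                         years=85):
    """Toy one-dimensional elevation balance (illustrative only): annual
    accretion versus a linear SLR ramp reaching `slr_total_cm` by 2100.
    Returns final marsh elevation relative to mean sea level (cm)."""
    elevation, sea_level = e0_cm, 0.0
    for _ in range(years):
        elevation += organic_cm_yr + inorganic_cm_yr   # annual accretion
        sea_level += slr_total_cm / years              # linear SLR ramp
    return elevation - sea_level

# Illustrative marsh: 20 cm initial elevation, 1.0 cm/yr total accretion,
# under the study's three SLR-by-2100 scenarios (88, 133, 179 cm).
for slr in (88.0, 133.0, 179.0):
    rel = marsh_elevation_2100(20.0, 0.3, 0.7, slr)
    print(f"SLR {slr:.0f} cm -> {rel:+.0f} cm "
          f"({'survives' if rel > 0 else 'drowns'})")
```

    With these illustrative rates the marsh keeps pace under 88 cm of SLR but drowns under 133 cm and 179 cm, mirroring the study's scenario results.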

  9. Computational modelling of the cerebral cortical microvasculature: effect of x-ray microbeams versus broad beam irradiation

    Science.gov (United States)

    Merrem, A.; Bartzsch, S.; Laissue, J.; Oelfke, U.

    2017-05-01

    Microbeam Radiation Therapy is an innovative pre-clinical strategy which uses arrays of parallel, tens-of-micrometres-wide kilovoltage photon beams to treat tumours. These x-ray beams are typically generated at a synchrotron source. It has been shown that these beam geometries allow exceptional sparing of normal tissue from radiation damage while still being effective in tumour ablation. A definitive biological explanation for this enhanced therapeutic ratio has still not been found; some experimental data support an important role of the vasculature. In this work, the effect of microbeams on a normal microvascular network of the cerebral cortex was assessed in computer simulations and compared to the effect of homogeneous, seamless exposures at equal energy absorption. The anatomy of a cerebral microvascular network and the inflicted radiation damage were simulated to closely mimic experimental data using a novel probabilistic model of radiation damage to blood vessels. It was found that the spatial dose fractionation by microbeam arrays significantly decreased the vascular damage: the higher the peak-to-valley dose ratio, the more pronounced the sparing effect. Simulations of the radiation damage as a function of morphological parameters of the vascular network demonstrated that the distribution of blood vessel radii is a key parameter determining both the overall radiation damage to the vasculature and the dose-dependent differential effect of microbeam irradiation.

  10. Universal or Specific? A Modeling-Based Comparison of Broad-Spectrum Influenza Vaccines against Conventional, Strain-Matched Vaccines.

    Directory of Open Access Journals (Sweden)

    Rahul Subramanian

    2016-12-01

    Despite the availability of vaccines, influenza remains a major public health challenge. A key reason is the virus's capacity for immune escape: ongoing evolution allows the continual circulation of seasonal influenza, while novel influenza viruses invade the human population to cause a pandemic every few decades. Current vaccines have to be updated continually to keep pace with this antigenic change, but emerging 'universal' vaccines, targeting more conserved components of the influenza virus, offer the potential to act across all influenza A strains and subtypes. Influenza vaccination programmes around the world are steadily increasing in their population coverage. In future, how might intensive, routine immunization with novel vaccines compare against similar mass programmes utilizing conventional vaccines? Specifically, how might novel and conventional vaccines compare in terms of cumulative incidence and rates of antigenic evolution of seasonal influenza? What are their potential implications for the impact of pandemic emergence? Here we present a new mathematical model, capturing both transmission dynamics and antigenic evolution of influenza in a simple framework, to explore these questions. We find that, even when matched by per-dose efficacy, universal vaccines could dampen population-level transmission over several seasons to a greater extent than conventional vaccines. Moreover, by lowering opportunities for cross-protective immunity in the population, conventional vaccines could allow the increased spread of a novel pandemic strain. Conversely, universal vaccines could mitigate both seasonal and pandemic spread. However, where it is not possible to maintain annual, intensive vaccination coverage, the duration and breadth of immunity raised by universal vaccines are critical determinants of their performance relative to conventional vaccines. In future, conventional and novel vaccines are likely to play complementary roles in the control of both seasonal and pandemic influenza.

  11. A right to reproduce?

    Science.gov (United States)

    Emson, H E

    1992-10-31

    Conscious control of the environment by Homo sapiens has brought almost total release from the ecological controls that limit the population of all other species. After a mere 10,000 years, humans have brought the planet close to collapse, and all the debate in the world seems unlikely to save it. A combination of uncontrolled breeding and rapacity is propelling us down the slippery slope first envisioned by Malthus, dragging the rest of the planet along. Only the conscious, and most likely voluntary, reimposition of controls on breeding will reduce the overgrowth of humans, and we have far to go in that direction. According to the United Nations Universal Declaration of Human Rights (1948, articles 16[I] and 16[III]), "Men and women of full age without any limitation due to race, nationality or religion have the right to marry and to found a family ... the family is the natural and fundamental group unit of society." The rhetoric of rights without the balancing of responsibilities is wrong in health care, and even more wrong in the context of world population. The mind-set of dominance over and exploitation of the rest of creation has meant human reluctance to admit participation in a system in which every part is interdependent. We must balance the right to reproduce with its responsible use, valuing interdependence, understanding, and respect, with a duty not to unbalance, damage, or destroy. It is long overdue that we discard every statement of right that is unmatched by the equivalent duty and responsibility.

  12. Optimized broad-histogram simulations for strong first-order phase transitions: droplet transitions in the large-Q Potts model

    International Nuclear Information System (INIS)

    Bauer, Bela; Troyer, Matthias; Gull, Emanuel; Trebst, Simon; Huse, David A

    2010-01-01

    The numerical simulation of strongly first-order phase transitions has remained a notoriously difficult problem even for classical systems due to the exponentially suppressed (thermal) equilibration in the vicinity of such a transition. In the absence of efficient update techniques, a common approach for improving equilibration in Monte Carlo simulations is broadening the sampled statistical ensemble beyond the bimodal distribution of the canonical ensemble. Here we show how a recently developed feedback algorithm can systematically optimize such broad-histogram ensembles and significantly speed up equilibration in comparison with other extended ensemble techniques such as flat-histogram, multicanonical and Wang–Landau sampling. We simulate, as a prototypical example of a strong first-order transition, the two-dimensional Potts model with up to Q = 250 different states in large systems. The optimized histogram develops a distinct multi-peak structure, thereby resolving entropic barriers and their associated phase transitions in the phase coexistence region—such as droplet nucleation and annihilation, and droplet–strip transitions for systems with periodic boundary conditions. We characterize the efficiency of the optimized histogram sampling by measuring round-trip times τ(N, Q) across the phase transition for samples comprised of N spins. While we find power-law scaling of τ versus N for small Q ∼ 2, we observe a crossover to exponential scaling for larger Q. These results demonstrate that, despite the ensemble optimization, broad-histogram simulations cannot fully eliminate the supercritical slowing down at strongly first-order transitions.
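    For context, the canonical-ensemble baseline that these broad-histogram methods modify is ordinary Metropolis sampling of the Q-state Potts model, H = -Σ_<ij> δ(s_i, s_j). A minimal sketch (lattice size, temperature, and sweep count are illustrative; at large Q this sampler is exactly what gets stuck at the transition):

```python
import math
import random

def potts_energy_per_spin(L=16, Q=10, beta=1.0, sweeps=100, seed=2):
    """Minimal single-spin-flip Metropolis sampler for the 2-D Q-state Potts
    model on an L x L periodic lattice; returns the energy per spin."""
    rng = random.Random(seed)
    spins = [[rng.randrange(Q) for _ in range(L)] for _ in range(L)]

    def site_energy(x, y, s):
        nbrs = ((x + 1) % L, y), ((x - 1) % L, y), (x, (y + 1) % L), (x, (y - 1) % L)
        return -sum(s == spins[nx][ny] for nx, ny in nbrs)

    for _ in range(sweeps * L * L):
        x, y = rng.randrange(L), rng.randrange(L)
        new = rng.randrange(Q)
        d_e = site_energy(x, y, new) - site_energy(x, y, spins[x][y])
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            spins[x][y] = new

    total = sum(site_energy(x, y, spins[x][y]) for x in range(L) for y in range(L))
    return total / 2 / (L * L)   # each bond is counted twice in the sum

print(potts_energy_per_spin())
```

    Broad-histogram methods keep this spin-update machinery but replace the canonical acceptance weight exp(-βE) with an ensemble weight chosen (here, optimized by feedback) to flatten or reshape the sampled energy histogram.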

  13. Ethical review of biobank research: Should RECs review each release of material from biobanks operating under an already-approved broad consent and data protection model?

    Science.gov (United States)

    Strech, Daniel

    2015-10-01

    The use of broad consent in biobank research has implications for the procedures of ethics review. This paper describes these implications and makes a recommendation for how to deal with them. Two steps in the ethics review of biobank research can be distinguished. In a first step, a research ethics committee (REC) reviews a biobank's framework regarding oversight procedures (e.g. broad consent form and data protection model). A second step then reviews specific projects that require the release of particular biomaterial and/or data. This paper argues that only a few research-related risks remain for the second step of ethical review and that a self-regulated body such as a biobank internal access committee would suffice (in principle) to address these risks. The reduction of REC involvement in biobank research proposed here has three aims: (i) to conserve time and money, (ii) to allow RECs to focus on higher-risk areas, and (iii) to promote professional self-regulation. Assuming that the public understands that neither REC involvement nor competent access committees can guarantee 100% protection against misuse of data, the proposed reduction of REC involvement could also enhance the public perception of biobank research as an ethically-sensitive enterprise that can be sufficiently controlled through competent self-regulation. In order to compensate for reduced REC involvement and to maintain public trust, biobanks should implement safeguards such as public information on approved projects. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  14. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

    Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations: they do not capture 3-D data, they are time-consuming, and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software method featuring an easy-to-use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability we deployed four independent researchers to analyse tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%); in contrast, using Osteolytica, variability was minimal (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from age- and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses the entirety of the bone volume (as opposed to selected 2-D images), it is faster, and it markedly reduces inter-user variability.
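    An inter-user variability figure like the ±percentages above summarises the spread when several users analyse the same dataset. The sketch below uses one plausible definition (half the range, relative to the mean) and hypothetical measurements; the abstract does not state the paper's exact definition.

```python
import statistics

def inter_user_variability_pct(measurements):
    """Spread (+/- %) among measurements of the same dataset by several
    researchers: half the range, expressed relative to the mean. This
    definition is an assumption for illustration."""
    half_range = (max(measurements) - min(measurements)) / 2
    return 100.0 * half_range / statistics.fmean(measurements)

# Hypothetical lesion-area measurements (mm^2) of one dataset by four users.
manual_imagej = [1.10, 0.82, 1.05, 0.74]
automated     = [0.931, 0.929, 0.930, 0.932]
print(round(inter_user_variability_pct(manual_imagej), 1),
      round(inter_user_variability_pct(automated), 1))
# → 19.4 0.2
```

    The contrast between the two illustrative spreads mirrors the manual-versus-automated difference reported in the study.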

  15. Efficacy of Oral E1210, a New Broad-Spectrum Antifungal with a Novel Mechanism of Action, in Murine Models of Candidiasis, Aspergillosis, and Fusariosis▿

    Science.gov (United States)

    Hata, Katsura; Horii, Takaaki; Miyazaki, Mamiko; Watanabe, Nao-aki; Okubo, Miyuki; Sonoda, Jiro; Nakamoto, Kazutaka; Tanaka, Keigo; Shirotori, Syuji; Murai, Norio; Inoue, Satoshi; Matsukura, Masayuki; Abe, Shinya; Yoshimatsu, Kentaro; Asada, Makoto

    2011-01-01

    E1210 is a first-in-class, broad-spectrum antifungal with a novel mechanism of action—inhibition of fungal glycosylphosphatidylinositol biosynthesis. In this study, the efficacies of E1210 and reference antifungals were evaluated in murine models of oropharyngeal and disseminated candidiasis, pulmonary aspergillosis, and disseminated fusariosis. Oral E1210 demonstrated dose-dependent efficacy in infections caused by Candida species, Aspergillus spp., and Fusarium solani. In the treatment of oropharyngeal candidiasis, E1210 and fluconazole each caused a significantly greater reduction in the number of oral CFU than the control treatment. In the disseminated candidiasis model, mice treated with E1210, fluconazole, caspofungin, or liposomal amphotericin B showed significantly higher survival rates than the control mice; E1210 was also effective against disseminated candidiasis caused by azole-resistant Candida albicans or Candida tropicalis. A 24-h delay in treatment onset minimally affected the efficacy outcome of E1210 in the treatment of disseminated candidiasis. In the Aspergillus flavus pulmonary aspergillosis model, mice treated with E1210, voriconazole, or caspofungin showed significantly higher survival rates than the control mice. Oral E1210 was thus effective in murine models of oropharyngeal and disseminated candidiasis, pulmonary aspergillosis, and disseminated fusariosis. These data suggest that further studies to determine E1210's potential for the treatment of disseminated fungal infections are indicated. PMID:21788462

  17. Efficacy of the broad-spectrum antiviral compound BCX4430 against Zika virus in cell culture and in a mouse model.

    Science.gov (United States)

    Julander, Justin G; Siddharthan, Venkatraman; Evans, Joe; Taylor, Ray; Tolbert, Kelsey; Apuli, Chad; Stewart, Jason; Collins, Preston; Gebre, Makda; Neilson, Skot; Van Wettere, Arnaud; Lee, Young-Min; Sheridan, William P; Morrey, John D; Babu, Y S

    2017-01-01

    Zika virus (ZIKV) is currently undergoing pandemic emergence. While disease is typically subclinical, severe neurologic manifestations in fetuses and newborns after congenital infection underscore an urgent need for antiviral interventions. The adenosine analog BCX4430 has broad-spectrum activity against a wide range of RNA viruses, including potent in vivo activity against yellow fever, Marburg and Ebola viruses. We tested this compound against African and Asian lineage ZIKV in cytopathic effect inhibition and virus yield reduction assays in various cell lines. To further evaluate the efficacy in a relevant animal model, we developed a mouse model of severe ZIKV infection, which recapitulates various human disease manifestations including peripheral virus replication, conjunctivitis, encephalitis and myelitis. Time-course quantification of viral RNA accumulation demonstrated robust viral replication in several relevant tissues, including high and persistent viral loads observed in the brain and testis. The presence of viral RNA in various tissues was confirmed by an infectious culture assay as well as immunohistochemical staining of tissue sections. Treatment of ZIKV-infected mice with BCX4430 significantly improved outcome even when treatment was initiated during the peak of viremia. The demonstration of potent activity of BCX4430 against ZIKV in a lethal mouse model warrants its continued clinical development. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Optimized broad-histogram simulations for strong first-order phase transitions: droplet transitions in the large-Q Potts model

    Science.gov (United States)

    Bauer, Bela; Gull, Emanuel; Trebst, Simon; Troyer, Matthias; Huse, David A.

    2010-01-01

    The numerical simulation of strongly first-order phase transitions has remained a notoriously difficult problem even for classical systems due to the exponentially suppressed (thermal) equilibration in the vicinity of such a transition. In the absence of efficient update techniques, a common approach for improving equilibration in Monte Carlo simulations is broadening the sampled statistical ensemble beyond the bimodal distribution of the canonical ensemble. Here we show how a recently developed feedback algorithm can systematically optimize such broad-histogram ensembles and significantly speed up equilibration in comparison with other extended ensemble techniques such as flat-histogram, multicanonical and Wang-Landau sampling. We simulate, as a prototypical example of a strong first-order transition, the two-dimensional Potts model with up to Q = 250 different states in large systems. The optimized histogram develops a distinct multi-peak structure, thereby resolving entropic barriers and their associated phase transitions in the phase coexistence region—such as droplet nucleation and annihilation, and droplet-strip transitions for systems with periodic boundary conditions. We characterize the efficiency of the optimized histogram sampling by measuring round-trip times τ(N, Q) across the phase transition for samples comprising N spins. While we find power-law scaling of τ versus N for small Q ≲ 50 and N ≲ 40², we observe a crossover to exponential scaling for larger Q. These results demonstrate that despite the ensemble optimization, broad-histogram simulations cannot fully eliminate the supercritical slowing down at strongly first-order transitions.
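
    The flat-histogram baseline the abstract compares against can be illustrated with a minimal Wang-Landau sketch for a tiny Q-state Potts model. This is not the authors' optimized-ensemble feedback algorithm, and the toy parameters (a 4×4 lattice, Q = 8, fixed flatness threshold and modification schedule) are illustrative assumptions far below the paper's system sizes:

    ```python
    import math
    import random

    random.seed(1)

    # Toy lattice so the sketch runs quickly; the paper uses Q up to 250.
    L, Q = 4, 8
    N = L * L

    def equal_neighbors(s, i, j, val):
        """Count nearest neighbors of site (i, j) equal to val (periodic)."""
        return ((s[(i + 1) % L][j] == val) + (s[(i - 1) % L][j] == val) +
                (s[i][(j + 1) % L] == val) + (s[i][(j - 1) % L] == val))

    def total_energy(s):
        """Potts energy: -1 per equal nearest-neighbor pair."""
        return -sum((s[i][j] == s[(i + 1) % L][j]) + (s[i][j] == s[i][(j + 1) % L])
                    for i in range(L) for j in range(L))

    s = [[random.randrange(Q) for _ in range(L)] for _ in range(L)]
    E = total_energy(s)
    ln_g = {}   # running estimate of ln(density of states)
    hist = {}   # visit histogram since the last refinement
    f = 1.0     # ln-modification factor, halved whenever the histogram is flat

    rounds = 0
    while f > 0.01 and rounds < 30:
        rounds += 1
        for _ in range(20000):
            i, j = random.randrange(L), random.randrange(L)
            old, new = s[i][j], random.randrange(Q)
            # Local energy change from recoloring a single site:
            dE = -(equal_neighbors(s, i, j, new) - equal_neighbors(s, i, j, old))
            E_new = E + dE
            # Accept with probability min(1, g(E)/g(E_new)): flat sampling in E.
            d = ln_g.get(E, 0.0) - ln_g.get(E_new, 0.0)
            if d >= 0 or random.random() < math.exp(d):
                s[i][j], E = new, E_new
            ln_g[E] = ln_g.get(E, 0.0) + f
            hist[E] = hist.get(E, 0) + 1
        counts = list(hist.values())
        if min(counts) > 0.8 * (sum(counts) / len(counts)):
            f *= 0.5   # histogram flat enough: refine the estimate
            hist = {}

    print(f"visited {len(ln_g)} energy levels, final f = {f:.4f}")
    ```

    The round-trip times τ(N, Q) studied in the paper would be measured on top of such a walker, by timing excursions between the lowest and highest sampled energies.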

  19. Production process reproducibility and product quality consistency of transient gene expression in HEK293 cells with anti-PD1 antibody as the model protein.

    Science.gov (United States)

    Ding, Kai; Han, Lei; Zong, Huifang; Chen, Junsheng; Zhang, Baohong; Zhu, Jianwei

    2017-03-01

    Demonstration of reproducibility and consistency of process and product quality is one of the most crucial issues in using transient gene expression (TGE) technology for biopharmaceutical development. In this study, we challenged the production consistency of TGE by expressing nine batches of recombinant IgG antibody in human embryonic kidney 293 cells to evaluate reproducibility including viable cell density, viability, apoptotic status, and antibody yield in cell culture supernatant. Product quality including isoelectric point, binding affinity, secondary structure, and thermal stability was assessed as well. In addition, major glycan forms of antibody from different batches of production were compared to demonstrate glycosylation consistency. Glycan compositions of the antibody harvested at different time periods were also measured to illustrate N-glycan distribution over the culture time. From the results, it has been demonstrated that different TGE batches are reproducible from lot to lot in overall cell growth, product yield, and product qualities including isoelectric point, binding affinity, secondary structure, and thermal stability. Furthermore, major N-glycan compositions are consistent among different TGE batches and conserved during cell culture time.

  20. A comparison of simple and realistic eye models for calculation of fluence to dose conversion coefficients in a broad parallel beam incident of protons

    International Nuclear Information System (INIS)

    Sakhaee, Mahmoud; Vejdani-Noghreiyan, Alireza; Ebrahimi-Khankook, Atiyeh

    2015-01-01

    Radiation-induced cataract has been demonstrated among people who are exposed to ionizing radiation. To evaluate the deterministic effects of ionizing radiation on the eye lens, several papers dealing with the eye lens dose have been published. ICRP Publication 103 states that the lens of the eye may be more radiosensitive than previously considered. Detailed investigation of the response of the lens showed that there are strong differences in sensitivity to ionizing radiation exposure with respect to cataract induction among the tissues of the lens of the eye. This motivated several groups to look deeper into the issue of the dose to a sensitive cell population within the lens, especially for radiations with low energy penetrability that have steep dose gradients inside the lens. Two sophisticated mathematical models of the eye including the inner structure have been designed for accurate dose estimation in recent years. This study focuses on the calculation of the absorbed doses of different parts of the eye using the stylized models located in the UF-ORNL phantom and comparison with the data calculated with the reference computational phantom in a broad parallel beam of incident protons with energies between 20 MeV and 10 GeV. The obtained results indicate that the total lens absorbed doses of the reference phantom agree well with those of the more sensitive regions of the stylized models. However, the total eye absorbed doses of these models differ greatly from each other at lower energies. - Highlights: • The validation of reference data for the eye was studied for proton exposures. • Two realistic mathematical models of the eye were imported into the UF-ORNL phantom. • Fluence to dose conversion coefficients were calculated for different eye sections. • Obtained results were compared with those assessed by the ICRP adult male phantom

  1. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  2. The anomalous tides near Broad Sound

    Science.gov (United States)

    Middleton, Jason H.; Buchwald, V. T.; Huthnance, John M.

    Observations of tidal current and height, in conjunction with theoretical mathematical models are used to investigate the propagation of the tide near Broad Sound, a narrowing estuary situated on a wide section of continental shelf toward the southern end of the Great Barrier Reef. The observations indicate that the dense offshore reefs severely inhibit tidal flow, with the result that tides flood toward Broad Sound from the north and from the south, along the main lagoon. There is a local magnification of the semi-diurnal tides within Broad Sound itself. Models of flow across reefs confirm the effectiveness of dense, shallow, and broad reefs in acting as a barrier to the tide. The diffraction of tides through large gaps in the reef is modelled using conformal mapping techniques and with the inclusion of energy leakage, the diffraction model predicts magnification of the semi-diurnal tidal heights by a factor of about 4 and a phase lag of 3 h on the shelf near Broad Sound, these values being consistent with observation. The observed convergence of the tide close to, and within Broad Sound itself is consistent with the proximity of the semi-diurnal tidal period to the natural period for flow in Broad Sound, considered as a narrowing estuary. This results in further amplification, by an additional factor of about 1.5, so that the tides in Broad Sound are increased by a factor of between 5 and 6, altogether, compared with those elsewhere on the east Australian coast.

  3. Global, broad, or specific cognitive differences? Using a MIMIC model to examine differences in CHC abilities in children with learning disabilities.

    Science.gov (United States)

    Niileksela, Christopher R; Reynolds, Matthew R

    2014-01-01

    This study was designed to better understand the relations between learning disabilities and different levels of latent cognitive abilities, including general intelligence (g), broad cognitive abilities, and specific abilities based on the Cattell-Horn-Carroll theory of intelligence (CHC theory). Data from the Differential Ability Scales-Second Edition (DAS-II) were used to create a multiple-indicator, multiple-cause (MIMIC) model to examine the latent mean differences in cognitive abilities between children with and without learning disabilities in reading (LD reading), math (LD math), and reading and writing (LD reading and writing). Statistically significant differences were found in the g factor between the norm group and the LD groups. After controlling for differences in g, the LD reading and LD reading and writing groups showed relatively lower latent processing speed, and the LD math group showed relatively higher latent comprehension-knowledge. There were also some differences in some specific cognitive abilities, including lower scores in spatial relations and numerical facility for the LD math group, and lower scores in visual memory for the LD reading and writing group. These specific mean differences were above and beyond any differences in the latent cognitive factor means.

  4. Bacteriophage ΦSA012 Has a Broad Host Range against Staphylococcus aureus and Effective Lytic Capacity in a Mouse Mastitis Model

    Directory of Open Access Journals (Sweden)

    Hidetomo Iwano

    2018-01-01

    Bovine mastitis is an inflammation of the mammary gland caused by bacterial infection in dairy cattle. It is the most costly disease in the dairy industry because of the high use of antibiotics. Staphylococcus aureus is one of the major causative agents of bovine mastitis and a source of antimicrobial resistance. Therefore, new strategies to control bacterial infection are required in the dairy industry. One potential strategy is bacteriophage (phage) therapy. In the present study, we examined the host range of the previously isolated S. aureus phages ΦSA012 and ΦSA039 against S. aureus strains isolated from mastitic cows. These phages could kill all S. aureus (93 strains from 40 genotypes) and methicillin-resistant S. aureus (six strains from six genotypes) strains tested. Using a mouse mastitis model, we demonstrated that ΦSA012 reduced proliferation of S. aureus and inflammation in the mammary gland. Furthermore, intravenous or intraperitoneal phage administration reduced proliferation of S. aureus in the mammary glands. These results suggest that the broad host range phage ΦSA012 is a potential antibacterial agent for dairy production medicine.

  5. Enhancing reproducibility: Failures from Reproducibility Initiatives underline core challenges.

    Science.gov (United States)

    Mullane, Kevin; Williams, Michael

    2017-08-15

    Efforts to address reproducibility concerns in biomedical research include: initiatives to improve journal publication standards and peer review; increased attention to publishing methodological details that enable experiments to be reconstructed; guidelines on standards for study design, implementation, analysis and execution; meta-analyses of multiple studies within a field to synthesize a common conclusion; and the formation of consortia to adopt uniform protocols and internally reproduce data. Another approach to addressing reproducibility is Reproducibility Initiatives (RIs): well-intended, high-profile, systematically peer-vetted initiatives intended to replace the traditional process of scientific self-correction. Outcomes from the RIs reported to date have questioned the usefulness of this approach, particularly when the RI outcome differs from other independent self-correction studies that have reproduced the original finding. As a failed RI attempt is a single outcome distinct from the original study, it cannot provide any definitive conclusions, necessitating additional studies that the RI approach has neither the ability nor the intent of conducting, making it a questionable replacement for self-correction. A failed RI attempt also has the potential to damage the reputation of the author of the original finding. Reproduction is frequently confused with replication, an issue that is more than semantic, with the former denoting "similarity" and the latter an "exact copy" - an impossible outcome in research because of known and unknown technical, environmental and motivational differences between the original and reproduction studies. To date, the RI framework has negatively impacted efforts to improve reproducibility, confounding attempts to determine whether a research finding is real. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Modeling biophysical properties of broad-leaved stands in the hyrcanian forests of Iran using fused airborne laser scanner data and ultraCam-D images

    Science.gov (United States)

    Mohammadi, Jahangir; Shataee, Shaban; Namiranian, Manochehr; Næsset, Erik

    2017-09-01

    Inventories of mixed broad-leaved forests of Iran mainly rely on terrestrial measurements. Due to rapid changes and disturbances and great complexity of the silvicultural systems of these multilayer forests, frequent repetition of conventional ground-based plot surveys is often cost prohibitive. Airborne laser scanning (ALS) and multispectral data offer an alternative or supplement to conventional inventories in the Hyrcanian forests of Iran. In this study, the capability of a combination of ALS and UltraCam-D data to model stand volume, tree density, and basal area using the random forest (RF) algorithm was evaluated. Systematic sampling was applied to collect field plot data on a 150 m × 200 m sampling grid within a 1100 ha study area located at 36°38′-36°42′N and 54°24′-54°25′E. A total of 308 circular plots (0.1 ha) were measured for calculation of stand volume, tree density, and basal area per hectare. For each plot, a set of variables was extracted from both ALS and multispectral data. The RF algorithm was used for modeling of the biophysical properties using ALS and UltraCam-D data separately and combined. The results showed that combining the ALS data and UltraCam-D images provided a slight increase in prediction accuracy compared to separate modeling. The RMSE as a percentage of the mean, the mean difference between observed and predicted values, and the standard deviation of the differences using a combination of ALS data and UltraCam-D images in an independent validation at 0.1-ha plot level were 31.7%, 1.1%, and 84 m3 ha-1 for stand volume; 27.2%, 0.86%, and 6.5 m2 ha-1 for basal area; and 35.8%, -4.6%, and 77.9 n ha-1 for tree density, respectively. Based on the results, we conclude that fusion of ALS and UltraCam-D data may be useful for modeling of stand volume, basal area, and tree density and thus gain insights into structural characteristics in the complex Hyrcanian forests.
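
    The three validation figures quoted per attribute (RMSE as a percentage of the mean, mean difference, and standard deviation of the differences) can be sketched as a small function; the sample observed/predicted values below are hypothetical, not the study's data:

    ```python
    import math
    from statistics import mean, stdev

    def validation_stats(observed, predicted):
        """RMSE as a percentage of the observed mean, mean difference
        (observed - predicted), and sample standard deviation of the
        differences, as reported per attribute in the abstract."""
        diffs = [o - p for o, p in zip(observed, predicted)]
        rmse = math.sqrt(mean(d * d for d in diffs))
        return {
            "rmse_pct": 100.0 * rmse / mean(observed),
            "mean_diff": mean(diffs),
            "sd_diff": stdev(diffs),
        }

    # Hypothetical plot-level stand volumes in m3/ha:
    obs = [250.0, 310.0, 180.0, 420.0, 290.0]
    pred = [230.0, 330.0, 200.0, 390.0, 300.0]
    print(validation_stats(obs, pred))
    ```

    Applied to each attribute (stand volume, basal area, tree density) over the independent validation plots, this reproduces the style of figures listed above.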

  7. A method to isolate bacterial communities and characterize ecosystems from food products: Validation and utilization in a reproducible chicken meat model.

    Science.gov (United States)

    Rouger, Amélie; Remenant, Benoit; Prévost, Hervé; Zagorec, Monique

    2017-04-17

    Influenced by production and storage processes and by seasonal changes, the diversity of meat product microbiotas can be highly variable. Because microbiotas influence meat quality and safety, characterizing and understanding their dynamics during processing and storage is important for proposing innovative and efficient storage conditions. Challenge tests are usually performed using meat from the same batch, inoculated at high levels with one or a few strains. Such experiments do not reflect the true microbial situation, and the global ecosystem is not taken into account. Our purpose was to constitute live stocks of chicken meat microbiotas to create standard and reproducible ecosystems. We searched for the best method to collect contaminating bacterial communities from chicken cuts to store as frozen aliquots. We tested several methods to extract DNA of these stored communities for subsequent PCR amplification. We determined the best moment to collect bacteria in sufficient amounts during the product shelf life. Results showed that the rinsing method associated with the use of the Mobio DNA extraction kit was the most reliable method to collect bacteria and obtain DNA for subsequent PCR amplification. Then, 23 different chicken meat microbiotas were collected using this procedure. Microbiota aliquots were stored at -80°C without important loss of viability. Their characterization by cultural methods confirmed the large variability (richness and abundance) of bacterial communities present on chicken cuts. Four of these bacterial communities were used to estimate their ability to regrow on meat matrices. Challenge tests performed on sterile matrices showed that these microbiotas were successfully inoculated and could overgrow the natural microbiota of chicken meat. They can therefore be used for performing reproducible challenge tests mimicking a true meat ecosystem and enabling the possibility to test the influence of various processing or storage conditions on complex meat

  8. The Revised Child Anxiety and Depression Scale-Short Version: scale reduction via exploratory bifactor modeling of the broad anxiety factor.

    Science.gov (United States)

    Ebesutani, Chad; Reise, Steven P; Chorpita, Bruce F; Ale, Chelsea; Regan, Jennifer; Young, John; Higa-McMillan, Charmaine; Weisz, John R

    2012-12-01

    Using a school-based (N = 1,060) and clinic-referred (N = 303) youth sample, the authors developed a 25-item shortened version of the Revised Child Anxiety and Depression Scale (RCADS) using Schmid-Leiman exploratory bifactor analysis to reduce client burden and administration time and thus improve the transportability characteristics of this youth anxiety and depression measure. Results revealed that all anxiety items primarily reflected a single "broad anxiety" dimension, which informed the development of a reduced 15-item Anxiety Total scale. Although specific DSM-oriented anxiety subscales were not included in this version, the items comprising the Anxiety Total scale were evenly pulled from the 5 anxiety-related content domains from the original RCADS. The resultant 15-item Anxiety Total scale evidenced significant correspondence with anxiety diagnostic groups based on structured clinical interviews. The scores from the 10-item Depression Total scale (retained from the original version) were also associated with acceptable reliability in the clinic-referred and school-based samples (α = .80 and .79, respectively); this is in contrast to the alternate 5-item shortened RCADS Depression Total scale previously developed by Muris, Meesters, and Schouten (2002), which evidenced depression scores of unacceptable reliability (α = .63). The shortened RCADS developed in the present study thus balances efficiency, breadth, and scale score reliability in a way that is potentially useful for repeated measurement in clinical settings as well as wide-scale screenings that assess anxiety and depressive problems. These future applications are discussed, as are recommendations for continued use of exploratory bifactor modeling in scale development.
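
    The reliability coefficients quoted for the Depression Total scales (α = .80, .79, and the unacceptable .63) are Cronbach's alpha, which can be sketched directly from its formula; the item responses below are hypothetical, not RCADS data:

    ```python
    from statistics import variance

    def cronbach_alpha(item_scores):
        """Cronbach's alpha for internal-consistency reliability.
        item_scores is a list of k lists, one per item, each holding
        that item's scores across respondents."""
        k = len(item_scores)
        totals = [sum(scores) for scores in zip(*item_scores)]
        return (k / (k - 1)) * (1 - sum(variance(it) for it in item_scores)
                                / variance(totals))

    # Hypothetical responses from 4 respondents to 3 items:
    items = [[0, 1, 2, 3], [1, 1, 2, 3], [0, 2, 2, 3]]
    print(round(cronbach_alpha(items), 3))
    ```

    Alpha rises toward 1.0 as items covary strongly, which is why the 5-item shortened Depression scale of Muris et al. (α = .63) fell below conventional thresholds while the retained 10-item scale did not.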

  9. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  10. European cold winter 2009-2010: How unusual in the instrumental record and how reproducible in the ARPEGE-Climat model?

    Science.gov (United States)

    Ouzeau, G.; Cattiaux, J.; Douville, H.; Ribes, A.; Saint-Martin, D.

    2011-06-01

    Boreal winter 2009-2010 made headlines for cold anomalies in many countries of the northern mid-latitudes. Northern Europe was severely hit by this harsh winter in line with a record persistence of the negative phase of the North Atlantic Oscillation (NAO). In the present study, we first provide a wider perspective on how unusual this winter was by using the recent 20th Century Reanalysis. A weather regime analysis shows that the frequency of the negative NAO was unprecedented since winter 1939-1940, which is then used as a dynamical analog of winter 2009-2010 to demonstrate that the latter might have been much colder without the background global warming observed during the twentieth century. We then use an original nudging technique in ensembles of global atmospheric simulations driven by observed sea surface temperature (SST) and radiative forcings to highlight the relevance of the stratosphere for understanding if not predicting such anomalous winter seasons. Our results demonstrate that an improved representation of the lower stratosphere is necessary to reproduce not only the seasonal mean negative NAO signal, but also its intraseasonal distribution and the corresponding increased probability of cold waves over northern Europe.

  11. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0....

  12. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0....

  13. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  14. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, on molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in the investigation and reporting of behavioral phenotypes.

  15. Global, Broad, or Specific Cognitive Differences? Using a MIMIC Model to Examine Differences in CHC Abilities in Children with Learning Disabilities

    Science.gov (United States)

    Niileksela, Christopher R.; Reynolds, Matthew R.

    2014-01-01

    This study was designed to better understand the relations between learning disabilities and different levels of latent cognitive abilities, including general intelligence (g), broad cognitive abilities, and specific abilities based on the Cattell-Horn-Carroll theory of intelligence (CHC theory). Data from the "Differential Ability…

  16. Evaluation of Land Surface Models in Reproducing Satellite-Derived LAI over the High-Latitude Northern Hemisphere. Part I: Uncoupled DGVMs

    Directory of Open Access Journals (Sweden)

    Ning Zeng

    2013-10-01

    Leaf Area Index (LAI) represents the total surface area of leaves above a unit area of ground and is a key variable in any vegetation model, as well as in climate models. New high-resolution LAI satellite data are now available covering a period of several decades. This provides a unique opportunity to validate LAI estimates from multiple vegetation models. The objective of this paper is to compare new, satellite-derived LAI measurements with modeled output for the Northern Hemisphere. We compare monthly LAI output from eight land surface models from the TRENDY compendium with satellite data from an Artificial Neural Network (ANN) from the latest version (third generation) of GIMMS AVHRR NDVI data over the period 1986–2005. Our results show that all the models overestimate the mean LAI, particularly over the boreal forest. We also find that seven out of the eight models overestimate the length of the active vegetation-growing season, mostly due to a late dormancy as a result of a late summer phenology. Finally, we find that the models report a much larger positive trend in LAI over this period than the satellite observations suggest, which translates into a higher trend in the growing season length. These results highlight the need to incorporate a larger number of more accurate plant functional types in all models and, in particular, to improve the phenology of deciduous trees.
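
    Two of the diagnostics compared in this evaluation, growing-season length and the LAI trend, can be sketched as simple functions. The threshold value and the monthly LAI series below are hypothetical illustrations, not the study's definitions or data:

    ```python
    def growing_season_length(monthly_lai, threshold=1.0):
        """Months with LAI above a (hypothetical) activity threshold,
        a crude proxy for the active growing season."""
        return sum(1 for v in monthly_lai if v > threshold)

    def trend(values):
        """Ordinary least-squares slope per time step of a series,
        e.g., annual mean LAI."""
        n = len(values)
        xm = (n - 1) / 2
        ym = sum(values) / n
        num = sum((i - xm) * (y - ym) for i, y in enumerate(values))
        den = sum((i - xm) ** 2 for i in range(n))
        return num / den

    # Hypothetical monthly LAI climatologies (Jan-Dec):
    satellite = [0.3, 0.3, 0.5, 1.2, 2.5, 3.4, 3.6, 3.0, 1.8, 0.8, 0.4, 0.3]
    model = [0.4, 0.5, 0.9, 1.6, 2.9, 3.9, 4.1, 3.6, 2.6, 1.5, 0.7, 0.5]
    print(growing_season_length(satellite), growing_season_length(model))
    ```

    With these made-up series the model's season is longer than the satellite's, mirroring the kind of late-dormancy bias the abstract reports; the same slope function applied to annual means would quantify the trend discrepancy.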

  17. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes … from the quantified attributes predict overall preference well. The findings allow for some generalizations within musical program genres regarding the perception of and preference for certain spatial reproduction modes, but for limited generalizations across selections from different musical genres.

  18. The Revised Child Anxiety and Depression Scale-Short Version: Scale reduction via exploratory bifactor modeling of the broad anxiety factor.

    OpenAIRE

    Ebesutani, Chad; Reise, Steven P.; Chorpita, Bruce F.; Ale, Chelsea; Regan, Jennifer; Young, John; Higa-McMillan, Charmaine; Weisz, John R

    2012-01-01

    Using a school-based (N = 1,060) and clinic-referred (N = 303) youth sample, the authors developed a 25-item shortened version of the Revised Child Anxiety and Depression Scale (RCADS) using Schmid-Leiman exploratory bifactor analysis to reduce client burden and administration time and thus improve the transportability characteristics of this youth anxiety and depression measure. Results revealed that all anxiety items primarily reflected a single “broad anxiety” dimension, which informed the...

  19. Broad band exciplex dye lasers

    International Nuclear Information System (INIS)

    Dienes, A.; Shank, C.V.; Trozzolo, A.M.

    1975-01-01

    The disclosure is concerned with exciplex dye lasers, i.e., lasers in which the emitting species is a complex formed only from a constituent in an electronically excited state. Noting that an exciplex laser, favorable from the standpoint of broad tunability, results from a broad shift in the peak emission wavelength for the exciplex relative to the unreacted species, a desirable class resulting in such broad shift is described. Preferred classes of laser media utilizing specified resonant molecules are set forth. (auth)

  20. How well do environmental archives of atmospheric mercury deposition in the Arctic reproduce rates and trends depicted by atmospheric models and measurements?

    Science.gov (United States)

    Goodsite, M E; Outridge, P M; Christensen, J H; Dastoor, A; Muir, D; Travnikov, O; Wilson, S

    2013-05-01

    This review compares the reconstruction of atmospheric Hg deposition rates and historical trends over recent decades in the Arctic, inferred from Hg profiles in natural archives such as lake and marine sediments, peat bogs and glacial firn (permanent snowpack), against those predicted by three state-of-the-art atmospheric models based on global Hg emission inventories from 1990 onwards. Model veracity was first tested against atmospheric Hg measurements. Most of the natural archive and atmospheric data came from the Canadian-Greenland sectors of the Arctic, whereas spatial coverage was poor in other regions. In general, for the Canadian-Greenland Arctic, models provided good agreement with atmospheric gaseous elemental Hg (GEM) concentrations and trends measured instrumentally. However, there are few instrumented deposition data with which to test the model estimates of Hg deposition, and these data suggest models over-estimated deposition fluxes under Arctic conditions. Reconstructed GEM data from glacial firn on Greenland Summit showed the best agreement with the known decline in global Hg emissions after about 1980, and were corroborated by archived aerosol filter data from Resolute, Nunavut. The relatively stable or slowly declining firn and model GEM trends after 1990 were also corroborated by real-time instrument measurements at Alert, Nunavut, after 1995. However, Hg fluxes and trends in northern Canadian lake sediments and a southern Greenland peat bog did not exhibit good agreement with model predictions of atmospheric deposition since 1990, the Greenland firn GEM record, direct GEM measurements, or trends in global emissions since 1980. Various explanations are proposed to account for these discrepancies between atmosphere and archives, including problems with the accuracy of archive chronologies, climate-driven changes in Hg transfer rates from air to catchments, waters and subsequently into sediments, and post-depositional diagenesis in peat bogs

  1. Giant Broad Line Regions in Dwarf Seyferts

    Indian Academy of Sciences (India)

    one can determine the inner and outer radii of the BLRs by modeling the ... The physics behind the production of broad emission lines in active galactic nuclei ... shapes. The inner radius of the volume defines the full velocity width at zero intensity of the model emission line, and the outer radius of the volume defines the ...

  2. Chow-Liu trees are sufficient predictive models for reproducing key features of functional networks of periictal EEG time-series.

    Science.gov (United States)

    Steimer, Andreas; Zubler, Frédéric; Schindler, Kaspar

    2015-09-01

    Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20-30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks however are purely descriptive models and thus are conceptually unable to predict fundamental features of iEEG time-series, e.g., in the context of therapeutical brain stimulation. In this paper we present some first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time-series. More specifically, we learn distinct graphical models (so called Chow-Liu (CL) trees) as models for the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks, based on thresholding of the absolute value Pearson correlation coefficient (CC) matrix. Using various measures, the thus obtained networks are then compared to those which were derived in the classical way from the empirical CC-matrix. In the high threshold limit we find (a) an excellent agreement between the two networks and (b) key features of periictal networks as they have previously been reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation, by setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL-trees as simple, spatially predictive models for periictal iEEG data. Moreover, we suggest straightforward generalizations of the CL-approach for modeling also the temporal features of iEEG signals. Copyright © 2015 Elsevier Inc. All rights
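A Chow-Liu tree is the maximum-weight spanning tree of the pairwise mutual-information graph; for Gaussian signals MI is a monotone function of the Pearson correlation, which is why thresholding the CC matrix and the CL tree agree in the high-threshold limit. A minimal sketch on surrogate data (not the iEEG recordings of the paper):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_edges(signals):
    """Chow-Liu tree over channels: maximum-weight spanning tree on pairwise
    mutual information. For Gaussian signals, MI = -0.5 * ln(1 - rho^2)."""
    rho = np.corrcoef(signals)                                  # channels x channels
    mi = -0.5 * np.log(1.0 - np.clip(rho ** 2, 0.0, 1.0 - 1e-12))
    np.fill_diagonal(mi, 0.0)
    # minimum spanning tree of -MI == maximum spanning tree of MI
    tree = minimum_spanning_tree(-mi).tocoo()
    return sorted(zip(tree.row.tolist(), tree.col.tolist()))

# Toy surrogate: channels 1 and 2 are both driven by channel 0 (hypothetical data)
rng = np.random.default_rng(1)
base = rng.normal(size=1000)
x = np.vstack([base,
               0.9 * base + 0.4 * rng.normal(size=1000),
               0.8 * base + 0.6 * rng.normal(size=1000)])
print(chow_liu_edges(x))   # a tree with 2 edges linking the 3 channels
```

On this surrogate the tree correctly attaches both driven channels to the common driver, mirroring how the CL tree preserves the largest correlations while discarding the weaker, indirect one.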

  3. Theoretical Modeling and Computer Simulations for the Origins and Evolution of Reproducing Molecular Systems and Complex Systems with Many Interactive Parts

    Science.gov (United States)

    Liang, Shoudan

    2000-01-01

Our research effort has produced nine publications in peer-reviewed journals listed at the end of this report. The work reported here is in the following areas: (1) genetic network modeling; (2) autocatalytic model of pre-biotic evolution; (3) theoretical and computational studies of strongly correlated electron systems; (4) reducing thermal oscillations in the atomic force microscope; (5) transcription termination mechanism in prokaryotic cells; and (6) the low glutamine usage in thermophiles obtained by studying completely sequenced genomes. We discuss the main accomplishments of these publications.

  4. Adjustments in the Almod 3W2 code models for reproducing the net load trip test in Angra I nuclear power plant

    International Nuclear Information System (INIS)

    Camargo, C.T.M.; Madeira, A.A.; Pontedeiro, A.C.; Dominguez, L.

    1986-09-01

The recorded traces obtained from the net load trip test in Angra I NPP yielded the opportunity to make fine adjustments in the ALMOD 3W2 code models. The changes are described and the results are compared against real plant data. (Author) [pt

  5. An International Ki67 Reproducibility Study

    Science.gov (United States)

    2013-01-01

Background In breast cancer, immunohistochemical assessment of proliferation using the marker Ki67 has potential use in both research and clinical management. However, lack of consistency across laboratories has limited Ki67’s value. A working group was assembled to devise a strategy to harmonize Ki67 analysis and increase scoring concordance. Toward that goal, we conducted a Ki67 reproducibility study. Methods Eight laboratories received 100 breast cancer cases arranged into 1-mm core tissue microarrays—one set stained by the participating laboratory and one set stained by the central laboratory, both using antibody MIB-1. Each laboratory scored Ki67 as percentage of positively stained invasive tumor cells using its own method. Six laboratories repeated scoring of 50 locally stained cases on 3 different days. Sources of variation were analyzed using random effects models with log2-transformed measurements. Reproducibility was quantified by the intraclass correlation coefficient (ICC), and approximate two-sided 95% confidence intervals (CIs) for the true intraclass correlation coefficients in these experiments were provided. Results Intralaboratory reproducibility was high (ICC = 0.94; 95% CI = 0.93 to 0.97). Interlaboratory reproducibility was only moderate (central staining: ICC = 0.71, 95% CI = 0.47 to 0.78; local staining: ICC = 0.59, 95% CI = 0.37 to 0.68). The geometric mean of Ki67 values for each laboratory across the 100 cases ranged from 7.1% to 23.9% with central staining and from 6.1% to 30.1% with local staining. Factors contributing to interlaboratory discordance included tumor region selection, counting method, and subjective assessment of staining positivity. Formal counting methods gave more consistent results than visual estimation. Conclusions Substantial variability in Ki67 scoring was observed among some of the world’s most experienced laboratories. Ki67 values and cutoffs for clinical decision-making cannot be transferred between laboratories without
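The reproducibility statistic in this record, the intraclass correlation coefficient, can be illustrated with its simplest one-way random-effects form, ICC(1,1) = (MSB − MSW) / (MSB + (k − 1)·MSW); the paper's actual analysis fitted fuller random-effects models to log2-transformed scores. A sketch on hypothetical lab scores:

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1,1). scores: cases x raters (here, labs)."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    case_means = scores.mean(axis=1)
    msb = k * ((case_means - grand) ** 2).sum() / (n - 1)               # between cases
    msw = ((scores - case_means[:, None]) ** 2).sum() / (n * (k - 1))   # within cases
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical log2 Ki67 scores: 20 cases x 3 labs (illustrative, not study data)
rng = np.random.default_rng(2)
true = rng.uniform(2, 5, size=(20, 1))                  # case-level "true" values
agree = true + rng.normal(0, 0.1, size=(20, 3))         # labs mostly concordant
noisy = true + rng.normal(0, 2.0, size=(20, 3))         # labs highly discordant

print(round(icc_oneway(agree), 2), round(icc_oneway(noisy), 2))
```

The concordant panel yields an ICC near one, the discordant one a much lower value, which is the sense in which the study's ICC of 0.59–0.71 indicates only moderate interlaboratory agreement.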

  6. Systematic Methodology for Reproducible Optimizing Batch Operation

    DEFF Research Database (Denmark)

    Bonné, Dennis; Jørgensen, Sten Bay

    2006-01-01

This contribution presents a systematic methodology for rapid acquirement of discrete-time state space model representations of batch processes based on their historical operation data. These state space models are parsimoniously parameterized as a set of local, interdependent models. The present contribution furthermore presents how the asymptotic convergence of Iterative Learning Control is combined with the closed-loop performance of Model Predictive Control to form a robust and asymptotically stable optimal controller for ensuring reliable and reproducible operation of batch processes. This controller may also be used for optimizing control. The modeling and control performance is demonstrated on a fed-batch protein cultivation example. The presented methodologies lend themselves directly for application as Process Analytical Technologies (PAT).

  7. Simulation of the hydrodynamic conditions of the eye to better reproduce the drug release from hydrogel contact lenses: experiments and modeling.

    Science.gov (United States)

    Pimenta, A F R; Valente, A; Pereira, J M C; Pereira, J C F; Filipe, H P; Mata, J L G; Colaço, R; Saramago, B; Serro, A P

    2016-12-01

Currently, most in vitro drug release studies for ophthalmic applications are carried out in static sink conditions. Although this procedure is simple and useful for comparative studies, it does not adequately describe the drug release kinetics in the eye, considering the small tear volume and flow rates found in vivo. In this work, a microfluidic cell was designed and used to mimic the continuous, volumetric flow rate of tear fluid and its low volume. The suitable operation of the cell, in terms of uniformity and symmetry of flux, was verified using a numerical model based on the Navier-Stokes and continuity equations. The release profile of a model system (a hydroxyethyl methacrylate-based hydrogel (HEMA/PVP) for soft contact lenses (SCLs) loaded with diclofenac) obtained with the microfluidic cell was compared with that obtained in static conditions, showing that the kinetics of release in dynamic conditions is slower. The application of the numerical model demonstrated that the designed cell can be used to simulate the drug release in the whole range of the human eye tear film volume and made it possible to estimate the drug concentration in the volume of liquid in direct contact with the hydrogel. The knowledge of this concentration, which is significantly different from that measured in the experimental tests during the first hours of release, is critical to predict the toxicity of the drug release system and its in vivo efficacy. In conclusion, the use of the microfluidic cell in conjunction with the numerical model should be a valuable tool to design and optimize new therapeutic drug-loaded SCLs.

  8. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low energy mass spectrometry. Although uncertainties for accelerator measured isotope ratios as low as 1% have been obtained under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics and generally the reproducibility is unknown since the samples are only measured once

  9. Integrating Hot and Cool Intelligences: Thinking Broadly about Broad Abilities

    Directory of Open Access Journals (Sweden)

    W. Joel Schneider

    2016-01-01

Although results from factor-analytic studies of the broad, second-stratum abilities of human intelligence have been fairly consistent for decades, the list of broad abilities is far from complete, much less understood. We propose criteria by which the list of broad abilities could be amended and envision alternatives for how our understanding of the hot intelligences (abilities involving emotionally-salient information) and cool intelligences (abilities involving perceptual processing and logical reasoning) might be integrated into a coherent theoretical framework.

  10. Immortalized keratinocytes derived from patients with epidermolytic ichthyosis reproduce the disease phenotype: a useful in vitro model for testing new treatments.

    Science.gov (United States)

    Chamcheu, J C; Pihl-Lundin, I; Mouyobo, C E; Gester, T; Virtanen, M; Moustakas, A; Navsaria, H; Vahlquist, A; Törmä, H

    2011-02-01

    Epidermolytic ichthyosis (EI) is a skin fragility disorder caused by mutations in genes encoding suprabasal keratins 1 and 10. While the aetiology of EI is known, model systems are needed for pathophysiological studies and development of novel therapies. To generate immortalized keratinocyte lines from patients with EI for studies of EI cell pathology and the effects of chemical chaperones as putative therapies. We derived keratinocytes from three patients with EI and one healthy control and established immortalized keratinocytes using human papillomavirus 16-E6/E7. Growth and differentiation characteristics, ability to regenerate organotypic epidermis, keratin expression, formation of cytoskeletal aggregates, and responses to heat shock and chemical chaperones were assessed. The cell lines EH11 (K1_p.Val176_Lys197del), EH21 (K10_p.156Arg>Gly), EH31 (K10_p.Leu161_Asp162del) and NKc21 (wild-type) currently exceed 160 population doublings and differentiate when exposed to calcium. At resting state, keratin aggregates were detected in 9% of calcium-differentiated EH31 cells, but not in any other cell line. Heat stress further increased this proportion to 30% and also induced aggregates in 3% of EH11 cultures. Treatment with trimethylamine N-oxide and 4-phenylbutyrate (4-PBA) reduced the fraction of aggregate-containing cells and affected the mRNA expression of keratins 1 and 10 while 4-PBA also modified heat shock protein 70 (HSP70) expression. Furthermore, in situ proximity ligation assay suggested a colocalization between HSP70 and keratins 1 and 10. Reconstituted epidermis from EI cells cornified but EH21 and EH31 cells produced suprabasal cytolysis, closely resembling the in vivo phenotype. These immortalized cell lines represent a useful model for studying EI biology and novel therapies. © 2011 The Authors. BJD © 2011 British Association of Dermatologists.

  11. A Mouse Model That Reproduces the Developmental Pathways and Site Specificity of the Cancers Associated With the Human BRCA1 Mutation Carrier State

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2015-10-01

Predisposition to breast and extrauterine Müllerian carcinomas in BRCA1 mutation carriers is due to a combination of cell-autonomous consequences of BRCA1 inactivation on cell cycle homeostasis superimposed on cell-nonautonomous hormonal factors magnified by the effects of BRCA1 mutations on hormonal changes associated with the menstrual cycle. We used the Müllerian inhibiting substance type 2 receptor (Mis2r) promoter and a truncated form of the Follicle stimulating hormone receptor (Fshr) promoter to introduce conditional knockouts of Brca1 and p53 not only in mouse mammary and Müllerian epithelia, but also in organs that control the estrous cycle. Sixty percent of the double mutant mice developed invasive Müllerian and mammary carcinomas. Mice carrying heterozygous mutations in Brca1 and p53 also developed invasive tumors, albeit at a lesser (30%) rate, in which the wild type alleles were no longer present due to loss of heterozygosity. While mice carrying heterozygous mutations in both genes developed mammary tumors, none of the mice carrying only a heterozygous p53 mutation developed such tumors (P < 0.0001), attesting to a role for Brca1 mutations in tumor development. This mouse model is attractive to investigate cell-nonautonomous mechanisms associated with cancer predisposition in BRCA1 mutation carriers and to investigate the merit of chemo-preventive drugs targeting such mechanisms.

  12. Conserved synthetic peptides from the hemagglutinin of influenza viruses induce broad humoral and T-cell responses in a pig model.

    Directory of Open Access Journals (Sweden)

    Júlia Vergara-Alert

Outbreaks involving either H5N1 or H1N1 influenza viruses (IV) have recently become an increasing threat to cause potential pandemics. Pigs have an important role in this aspect. As reflected in the 2009 human H1N1 pandemic, they may act as a vehicle for mixing and generating new assortments of viruses potentially pathogenic to animals and humans. Lack of universal vaccines against the highly variable influenza virus forces scientists to continuously design vaccines à la carte, which is an expensive and risky practice overall when dealing with virulent strains. Therefore, we focused our efforts on developing a broadly protective influenza vaccine based on the Informational Spectrum Method (ISM). This theoretical prediction allows the selection of highly conserved peptide sequences from within the hemagglutinin subunit 1 protein (HA1) from either H5 or H1 viruses which are located in the flanking region of the HA binding site and with the potential to elicit broader immune responses than conventional vaccines. Confirming the theoretical predictions, immunization of conventional farm pigs with the synthetic peptides induced humoral responses in every single pig. The fact that the induced antibodies were able to recognize in vitro heterologous influenza viruses such as the pandemic H1N1 virus (pH1N1), two swine influenza field isolates (SwH1N1 and SwH3N2) and a H5N1 highly pathogenic avian virus confirms the broad recognition of the antibodies induced. Unexpectedly, all pigs also showed T-cell responses that not only recognized the specific peptides, but also the pH1N1 virus. Finally, a partial effect on the kinetics of virus clearance was observed after the intranasal infection with the pH1N1 virus, setting forth the groundwork for the design of peptide-based vaccines against influenza viruses. Further insights into the understanding of the mechanisms involved in the protection afforded will be necessary to optimize future vaccine formulations.

  13. Evaluation of the agonist PET radioligand [¹¹C]GR103545 to image kappa opioid receptor in humans: kinetic model selection, test-retest reproducibility and receptor occupancy by the antagonist PF-04455242.

    Science.gov (United States)

    Naganawa, Mika; Jacobsen, Leslie K; Zheng, Ming-Qiang; Lin, Shu-Fei; Banerjee, Anindita; Byon, Wonkyung; Weinzimmer, David; Tomasi, Giampaolo; Nabulsi, Nabeel; Grimwood, Sarah; Badura, Lori L; Carson, Richard E; McCarthy, Timothy J; Huang, Yiyun

    2014-10-01

Kappa opioid receptors (KOR) are implicated in several brain disorders. In this report, a first-in-human positron emission tomography (PET) study was conducted with the potent and selective KOR agonist tracer, [(11)C]GR103545, to determine an appropriate kinetic model for analysis of PET imaging data and assess the test-retest reproducibility of model-derived binding parameters. The non-displaceable distribution volume (V(ND)) was estimated from a blocking study with naltrexone. In addition, KOR occupancy of PF-04455242, a selective KOR antagonist that is active in preclinical models of depression, was also investigated. For determination of a kinetic model and evaluation of test-retest reproducibility, 11 subjects were scanned twice with [(11)C]GR103545. Seven subjects were scanned before and 75 min after oral administration of naltrexone (150 mg). For the KOR occupancy study, six subjects were scanned at baseline and at 1.5 h and 8 h after an oral dose of PF-04455242 (15 mg, n=1 and 30 mg, n=5). Metabolite-corrected arterial input functions were measured and all scans were 150 min in duration. Regional time-activity curves (TACs) were analyzed with 1- and 2-tissue compartment models (1TC and 2TC) and the multilinear analysis (MA1) method to derive regional volume of distribution (V(T)). Relative test-retest variability (TRV), absolute test-retest variability (aTRV) and the intra-class correlation coefficient (ICC) were calculated to assess test-retest reproducibility of regional VT. Occupancy plots were computed for blocking studies to estimate occupancy and V(ND). The half maximal inhibitory concentration (IC50) of PF-04455242 was determined from occupancies and drug concentrations in plasma. [(11)C]GR103545 in vivo K(D) was also estimated. Regional TACs were well described by the 2TC model and MA1. However, 2TC VT was sometimes estimated with high standard error. Thus MA1 was the model of choice. Test-retest variability was ~15%, depending on the outcome measure. The blocking
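The occupancy plots mentioned in this record are commonly computed as a Lassen plot: regress ΔVT = VT,baseline − VT,blocked against VT,baseline across regions; the slope is the receptor occupancy and the x-intercept is V(ND). A sketch with hypothetical regional VT values (not the study's data):

```python
import numpy as np

def lassen_occupancy(vt_base, vt_block):
    """Occupancy (Lassen) plot: regress (VT_base - VT_block) on VT_base across
    regions. Slope = occupancy; x-intercept = non-displaceable volume V_ND."""
    vt_base = np.asarray(vt_base, dtype=float)
    delta = vt_base - np.asarray(vt_block, dtype=float)
    occ, intercept = np.polyfit(vt_base, delta, 1)
    v_nd = -intercept / occ
    return occ, v_nd

# Hypothetical regional VT values (mL/cm^3) before and after a blocking dose
vt_base = np.array([25.0, 18.0, 14.0, 10.0, 6.0])
true_occ, true_vnd = 0.6, 3.0
vt_block = vt_base - true_occ * (vt_base - true_vnd)

occ, v_nd = lassen_occupancy(vt_base, vt_block)
print(f"occupancy = {occ:.2f}, V_ND = {v_nd:.1f}")   # recovers 0.60 and 3.0
```

With occupancies measured at several plasma concentrations, the IC50 mentioned in the abstract follows from fitting occupancy = C / (C + IC50).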

  14. A Model of Compound Heterozygous, Loss-of-Function Alleles Is Broadly Consistent with Observations from Complex-Disease GWAS Datasets.

    Directory of Open Access Journals (Sweden)

    Jaleal S Sanjak

    2017-01-01

The genetic component of complex disease risk in humans remains largely unexplained. A corollary is that the allelic spectrum of genetic variants contributing to complex disease risk is unknown. Theoretical models that relate population genetic processes to the maintenance of genetic variation for quantitative traits may suggest profitable avenues for future experimental design. Here we use forward simulation to model a genomic region evolving under a balance between recurrent deleterious mutation and Gaussian stabilizing selection. We consider multiple genetic and demographic models, and several different methods for identifying genomic regions harboring variants associated with complex disease risk. We demonstrate that the model of gene action, relating genotype to phenotype, has a qualitative effect on several relevant aspects of the population genetic architecture of a complex trait. In particular, the genetic model impacts genetic variance component partitioning across the allele frequency spectrum and the power of statistical tests. Models with partial recessivity closely match the minor allele frequency distribution of significant hits from empirical genome-wide association studies without requiring homozygous effect sizes to be small. We highlight a particular gene-based model of incomplete recessivity that is appealing from first principles. Under that model, deleterious mutations in a genomic region partially fail to complement one another. This model of gene-based recessivity predicts the empirically observed inconsistency between twin- and SNP-based estimates of dominance heritability. Furthermore, this model predicts considerable levels of unexplained variance associated with intralocus epistasis. Our results suggest a need for improved statistical tools for region-based genetic association and heritability estimation.
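The fitness model named in this abstract, Gaussian stabilizing selection, and the role of partial recessivity can be made concrete: a heterozygote with dominance coefficient h is displaced from the optimum by h·e rather than e, so its fitness cost is much smaller than the homozygote's. A sketch (parameter values are illustrative, not those of the paper's simulations):

```python
import math

def fitness(z, vs=1.0, optimum=0.0):
    """Gaussian stabilizing selection on trait value z:
    w(z) = exp(-(z - optimum)^2 / (2 * VS))."""
    return math.exp(-((z - optimum) ** 2) / (2.0 * vs))

# Partial recessivity (h < 0.5): heterozygote displacement is h*e for a
# mutation of homozygous effect e, so its fitness cost is disproportionately small.
e, h, vs = 1.0, 0.25, 1.0
w_het = fitness(h * e, vs)
w_hom = fitness(e, vs)
print(f"het {w_het:.3f}  hom {w_hom:.3f}")
```

This asymmetry lets deleterious variants with large homozygous effects drift to the intermediate frequencies seen among empirical GWAS hits, which is the intuition behind the abstract's main result.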

  15. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking and the lack of open, transparent and fair benchmark sets present another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  16. Estimating Poromechanical and Hydraulic Properties of Fractured Media Aquifers Using a Model of the Aquifer at Ploemeur France: Broad Applications and Future Uses

    Science.gov (United States)

    Wilson, M. W.; Burbey, T. J.

    2017-12-01

Aquifers in fractured crystalline bedrock underlie over half of the earth's surface and are vital civil and economic resources, particularly in places where ample, safe surface water is not available, with fractured-media aquifers providing large percentages of water for municipal, industrial, and agricultural use in many regions of the world. Determining sustainable quantities of extraction is of paramount importance to the continuing viability of these important resources and the communities they serve. The fractured and faulted crystalline-rock aquifer system supporting the community of Ploemeur, France has been providing one million cubic meters of water annually, resulting in a modest long-term drawdown of about 15 m. To understand the sources and mechanisms of recharge that support this aquifer system, a three-dimensional ABAQUS model was developed using known geologic, water-level and geodetic (tiltmeter and GPS) data to simulate the natural aquifer system, which is dominated by a permeable sub-vertical fault and an intersecting semi-horizontal contact zone. The model is used to constrain the poromechanical properties of the fault and contact zones relative to the host crystalline rocks and overlying saprolite by taking advantage of the tilt and seasonal GPS responses caused by municipal pumping, along with water-level data for the area. A chief goal in this modeling effort is to assess the sources of recharge to this aquifer system, which is atypically productive for a crystalline-rock setting. Preliminary results suggest that the source of water supplying this community is a combination of rapid localized recharge through the saprolite and fault zone and recharge along the contact zone, both from the north (older water) and where it is exposed to the south (younger water).
The modeling effort also shows the importance of combining GPS and surface tiltmeter data with water-level measurements for constraining the properties of this complex aquifer system and

  17. Broad Bandwidth or High Fidelity? Evidence from the Structure of Genetic and Environmental Effects on the Facets of the Five Factor Model

    Science.gov (United States)

    Briley, Daniel A.; Tucker-Drob, Elliot M.

    2017-01-01

    The Five Factor Model (FFM) of personality is well-established at the phenotypic level, but much less is known about the coherence of the genetic and environmental influences within each personality domain. Univariate behavioral genetic analyses have consistently found the influence of additive genes and nonshared environment on multiple personality facets, but the extent to which genetic and environmental influences on specific facets reflect more general influences on higher order factors is less clear. We applied a multivariate quantitative-genetic approach to scores on the CPI-Big Five facets for 490 monozygotic and 317 dizygotic twins who took part in the National Merit Twin Study. Our results revealed a complex genetic structure for facets composing all five factors, with both domain-general and facet-specific genetic and environmental influences. Models that required common genetic and environmental influences on each facet to occur by way of effects on a higher order trait did not fit as well as models allowing for common genetic and environmental effects to act directly on the facets for three of the Big Five domains. These results add to the growing body of literature indicating that important variation in personality occurs at the facet level which may be overshadowed by aggregating to the trait level. Research at the facet level, rather than the factor level, is likely to have pragmatic advantages in future research on the genetics of personality. PMID:22695681
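The univariate starting point for twin analyses like the one described above is Falconer's decomposition of trait variance from MZ and DZ twin correlations; the paper itself fits multivariate models at the facet level, but the simple formulas show where additive-genetic and environmental estimates come from. A sketch:

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's univariate ACE decomposition from twin correlations.
    Returns (a2, c2, e2): additive genetic, shared-environment, and
    nonshared-environment (plus error) shares of trait variance."""
    a2 = 2 * (r_mz - r_dz)   # MZ twins share ~2x the additive variance of DZ twins
    c2 = 2 * r_dz - r_mz     # whatever MZ similarity is not genetic
    e2 = 1 - r_mz            # whatever makes even MZ twins differ
    return a2, c2, e2

# Illustrative correlations (not the National Merit Twin Study estimates)
print(falconer_ace(0.50, 0.25))   # -> (0.5, 0.0, 0.5)
```

A pattern of r_MZ roughly twice r_DZ, as here, is what yields the "additive genes plus nonshared environment" conclusion the abstract cites for personality facets.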

  18. Estimation of strong ground motion in broad-frequency band based on a seismic source scaling model and an empirical Green's function technique

    Directory of Open Access Journals (Sweden)

    K. Kamae

    1994-06-01

We introduce a generalized method for simulating strong ground motion from large earthquakes by summing subevent records to follow the ω-squared law. The original idea of the method is based on a constant stress parameter between the target event and the subevent. It is applicable to a case where both events have a different stress drop after some manipulation. However, the simulation for a very large earthquake from a small event with this method inevitably has some deficiencies of spectral amplitudes in the intermediate frequency range, deviating from the ω-squared model, although the high and low frequency motions match the scaling. We improve the simulation algorithm so as not to produce spectral sags, introducing a self-similar distribution of subfaults with different sizes in the fault plane, the so-called fractal composite faulting model. We show successful simulations for intermediate-sized earthquakes (MJMA = 5.0, 6.0 and 6.1), the large aftershocks of the 1983 Akita-Oki earthquake, using the records of smaller aftershocks (MJMA = 3.9 and 5.0) as an empirical Green's function. Further, we attempted to estimate strong ground motion for the 1946 Nankai earthquake with Mw 8.2, using the records of a MJMA 5.1 earthquake occurring near the source region of the mainshock. We found that strong ground motions simulated for the fractal composite faulting model with two asperities radiating significantly high frequency motions matched well the observed data, such as the near-field displacement record, the source spectrum estimated from the teleseismic record, and the seismic intensity distribution during the 1946 Nankai earthquake.
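Under the similarity (constant stress drop) assumption behind empirical Green's function summation, the scale factor between target and small event is set by the seismic moment ratio: N subfaults along each fault dimension with N = (M0,large / M0,small)^(1/3), and N³ subevents in total. A sketch with hypothetical moments (the division scheme here is the generic scaling relation, not the paper's fractal subfault distribution):

```python
def egf_scaling(m0_large, m0_small, stress_ratio=1.0):
    """Similarity scaling between a target event and a small empirical
    Green's function event: N^3 = (M0_large / M0_small) / stress_ratio,
    where stress_ratio is the stress-drop ratio between the two events."""
    return ((m0_large / m0_small) / stress_ratio) ** (1.0 / 3.0)

# Hypothetical moments (N*m) of the order of an Mw 8.2 mainshock and an
# MJMA ~5.1 small event -- illustrative values only
m0_large = 2.5e21
m0_small = 5.6e16
n = egf_scaling(m0_large, m0_small)
print(f"N ~ {n:.1f} per fault dimension, ~{n**3:.0f} subevents in total")
```

The intermediate-frequency spectral sag discussed in the abstract arises because a single subfault size concentrates corner frequencies at one scale; distributing subfault sizes self-similarly fills in the spectrum between the two corner frequencies.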

  19. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...

  20. Predicting land use change on a broad area: Dyna-CLUE model application to the Litorale Domizio-Agro Aversano (Campania, South Italy)

    Directory of Open Access Journals (Sweden)

    Stefania Pindozzi

    2017-06-01

    Full Text Available The long-standing awareness of the environmental impact of land-use change (LUC) has led the scientific community to develop tools able to predict its extent and to evaluate its effects on the environment, with the aim of supporting policy makers in their planning activities. This paper proposes an implementation of the Dyna-CLUE (Dynamic Conversion of Land Use and its Effects) model applied to the Litorale Domizio-Agro Aversano, an area of the Campania region which needs interventions for environmental remediation. Future land-use changes were simulated in two different scenarios developed under alternative strategies of land management: scenario 1 is a simple projection of the recent LUC trend, while scenario 2 hypothesises the introduction of no-food crops, such as poplar (Populus nigra L.) and giant reed (Arundo donax L.), in addition to a less impactful urban sprawl, which is one of the main issues in the study area. The overall duration of the simulations was 13 years, subdivided into yearly time steps. The CORINE land cover map of 2006 was used as the baseline for land-use change detection in the study area. Competition between different land-use types is taken into account by setting the conversion elasticity, a parameter ranging from 0 to 1, according to their capital investment level. Location suitability for each land-use type is based on a logit model. Since no actual land use already exists for the alternative crops investigated in scenario 2, a suitability map realised through a spatial multicriteria decision analysis was used as a proxy for their land-use pattern. The comparison of the land use in 2012 and scenario 1, evaluated through the application of Kappa statistics, showed a general tendency toward expansion of built-up areas, with an increase of about 2400 ha (1.5% of the total surface) at the expense of agricultural land and land covered by natural vegetation. 
The comparison of the land use in 2012 and scenario 2 showed a less significant spread of built
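The Kappa map-agreement statistic used for these comparisons can be sketched with a minimal implementation; the land-use labels below are hypothetical toy data, not CORINE classes.

```python
# Minimal Cohen's kappa for comparing two categorical land-use maps, in the
# spirit of the 2012-vs-scenario comparison. The flattened rasters below are
# hypothetical (U = urban, A = agricultural, V = natural vegetation).
from collections import Counter

def cohens_kappa(map_a, map_b):
    assert len(map_a) == len(map_b)
    n = len(map_a)
    po = sum(a == b for a, b in zip(map_a, map_b)) / n    # observed agreement
    ca, cb = Counter(map_a), Counter(map_b)
    pe = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe)

a = list("UUAAVVAVAV")  # hypothetical baseline map
b = list("UUAAVAAVAV")  # hypothetical simulated map (one cell differs)
print(cohens_kappa(a, b))
```

Kappa discounts the agreement expected by chance, which is why it is preferred over raw per-cell accuracy when land-use classes are very unevenly distributed.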

  1. The STAGGER-grid: A grid of 3D stellar atmosphere models. V. Synthetic stellar spectra and broad-band photometry

    Science.gov (United States)

    Chiavassa, A.; Casagrande, L.; Collet, R.; Magic, Z.; Bigot, L.; Thévenin, F.; Asplund, M.

    2018-03-01

    Context. The surface structures and dynamics of cool stars are characterised by the presence of convective motions and turbulent flows which shape the emergent spectrum. Aims: We used realistic three-dimensional (3D) radiative hydrodynamical simulations from the STAGGER-grid to calculate synthetic spectra with the radiative transfer code OPTIM3D for stars with different stellar parameters to predict photometric colours and convective velocity shifts. Methods: We calculated spectra from 1000 to 200 000 Å with a constant resolving power of λ/Δλ = 20 000 and from 8470 to 8710 Å (Gaia Radial Velocity Spectrometer - RVS - spectral range) with a constant resolving power of λ/Δλ = 300 000. Results: We used the synthetic spectra to compute theoretical colours in the Johnson-Cousins UBV (RI)C, SDSS, 2MASS, Gaia, SkyMapper, Strömgren, and HST-WFC3 systems. Our synthetic magnitudes are compared with those obtained using 1D hydrostatic models. We showed that 1D versus 3D differences are limited to a small percent except for the narrow filters that span the optical and UV region of the spectrum. In addition, we derived the effect of the convective velocity fields on selected Fe I lines. We computed the overall convective shift of the 3D simulations with respect to the reference 1D hydrostatic models, finding line shifts of between -0.235 and +0.361 km s-1. We showed a net correlation of the convective shifts with the effective temperature: lower effective temperatures denote redshifts and higher effective temperatures denote blueshifts. We conclude that the extraction of accurate radial velocities from RVS spectra needs an appropriate wavelength correction for convective shifts. Conclusions: The use of realistic 3D hydrodynamical stellar atmosphere simulations has a small but significant impact on the predicted photometry compared with classical 1D hydrostatic models for late-type stars. We make all the spectra publicly available for the community through the POLLUX database.
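To give a sense of scale, a convective velocity shift translates to a wavelength shift via Δλ = λv/c. The Fe I 6173 Å line chosen below is an arbitrary illustration; only the velocity range comes from the abstract.

```python
# Back-of-envelope: a convective line shift in km/s maps to a wavelength shift
# via delta_lambda = lambda * v / c. The line choice (Fe I 6173 A) is a
# hypothetical illustration; the velocities are the quoted extremes.

C_KM_S = 299_792.458  # speed of light, km/s

def doppler_shift_angstrom(lambda_a: float, v_km_s: float) -> float:
    """Wavelength shift (A) of a line at lambda_a for radial velocity v."""
    return lambda_a * v_km_s / C_KM_S

for v in (-0.235, 0.361):  # quoted range of 3D-vs-1D convective shifts
    print(f"{v:+.3f} km/s -> {doppler_shift_angstrom(6173.0, v) * 1000:+.2f} mA")
```

Shifts of a few milli-angstroms are far below the RVS resolution element, which is why they matter as a systematic bias on radial velocities rather than as a resolvable feature.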

  2. Microsatellite diversity and broad scale geographic structure in a model legume: building a set of nested core collection for studying naturally occurring variation in Medicago truncatula

    DEFF Research Database (Denmark)

    Ronfort, Joelle; Bataillon, Thomas; Santoni, Sylvain

    2006-01-01

    Background: Exploiting genetic diversity requires previous knowledge of the extent and structure of the variation occurring in a species. Such knowledge can in turn be used to build a core-collection, i.e. a subset of accessions that aims... at representing the genetic diversity of this species with a minimum of repetitiveness. We investigate the patterns of genetic diversity and population structure in a collection of 346 inbred lines representing the breadth of naturally occurring diversity in the legume plant model Medicago truncatula using 13... Conclusion: The stratification inferred is discussed considering potential historical events like expansion, refuge history and admixture between neighbouring groups. Information on the allelic richness and the inferred population structure is used to build a nested core-collection. The set...

  3. Vaccines directed against microorganisms or their products present during biofilm lifestyle: can we make a translation as a broad biological model to tuberculosis?

    Directory of Open Access Journals (Sweden)

    Mario Alberto eFlores-Valdez

    2016-01-01

    Full Text Available Tuberculosis (TB) remains a global public health problem. In recent years, experimental evidence has started to accumulate suggesting the relevance of in vitro pellicle production (a type of biofilm formed at the air-liquid interface) as a phenotype mimicking aspects found in M. tuberculosis-complex bacteria during in vivo infection. There are still opportunities for better diagnostic tools, therapeutic molecules, and new vaccine candidates to assist TB control programs worldwide, particularly in less developed nations. Regarding vaccines, despite the availability of a live, attenuated strain (M. bovis BCG) for almost a century, its variable efficacy and lack of protection against pulmonary and latent disease have prompted basic and applied research leading to preclinical and clinical evaluation of up to 15 new candidates. In this work, I present examples of vaccines based on whole cells grown as biofilms, or on specific proteins expressed under such conditions, and the effect they have shown in relevant animal models or directly in the natural host. I also discuss why it might be worthwhile to explore these approaches for constructing and developing new vaccine candidates and testing their efficacy against TB.

  4. STRONG RESPONSE OF THE VERY BROAD Hβ EMISSION LINE IN THE LUMINOUS RADIO-QUIET QUASAR PG 1416-129

    International Nuclear Information System (INIS)

    Wang, J.; Li, Y.

    2011-01-01

    We report new spectroscopic observations performed in 2010 and 2011 for the luminous radio-quiet quasar PG 1416-129. Our new spectra are of high quality, cover both the Hβ and Hα regions, and show negligible line-profile variation within a timescale of one year. The two spectra allow us to study the variability of the Balmer line profile by comparison with spectra taken 10 and 20 years ago. By decomposing the broad Balmer emission lines into two Gaussian profiles, our spectral analysis suggests a strong response to the continuum level for the very broad component, and significant variations in both bulk blueshift velocity/FWHM and flux for the broad component. The new observations additionally indicate flat Balmer decrements (i.e., too strong Hβ emission) at the line wings, which is hard to reproduce using recent optically thin models. With these observations we argue that a separate inner optically thin emission-line region might not be necessary in this object to reproduce the observed line profiles.

  5. Maternal administration of solithromycin, a new, potent, broad-spectrum fluoroketolide antibiotic, achieves fetal and intra-amniotic antimicrobial protection in a pregnant sheep model.

    Science.gov (United States)

    Keelan, Jeffrey A; Kemp, Matthew W; Payne, Matthew S; Johnson, David; Stock, Sarah J; Saito, Masatoshi; Fernandes, Prabhavathi; Newnham, John P

    2014-01-01

    Solithromycin (CEM-101) is a new antibiotic that is highly potent against Ureaplasma and Mycoplasma spp. and active against many other antibiotic-resistant organisms. We have explored the maternal-amniotic-fetal pharmacokinetics of CEM-101 in a pregnant sheep model to assess its potential for treating intrauterine and antenatal infection. Chronically catheterized pregnant ewes (n = 6 or 7) received either a single maternal intravenous (i.v.) infusion of CEM-101 (10 mg/kg of body weight), a single intra-amniotic (i.a.) injection (1.4 mg/kg of estimated fetal weight), or a combined i.v. and i.a. dose. Maternal plasma (MP), fetal plasma (FP), and amniotic fluid (AF) samples were taken via catheter at intervals of 0 to 72 h postadministration, and concentrations of solithromycin and its bioactive polar metabolites (N-acetyl [NAc]-CEM-101 and CEM-214) were determined. Following maternal i.v. infusion, peak CEM-101 concentrations in MP, FP, and AF were 1,073, 353, and 214 ng/ml, respectively, representing a maternal-to-fetal plasma transfer efficiency of 34%. A single maternal dose resulted in effective concentrations (>30 ng/ml) in MP, FP, and AF sustained for >12 h. NAc-CEM-101 and CEM-214 exhibited delayed accumulation and clearance in FP and AF, resulting in an additive antimicrobial effect (>48 h). Intra-amniotic solithromycin injection resulted in elevated (∼50 μg/ml) and sustained CEM-101 concentrations in AF and significant levels in FP, although the efficiency of amniotic-to-fetal transfer was low (∼1.5%). Combined i.v. and i.a. administration resulted in primarily additive concentrations of CEM-101 in all three compartments. Our findings suggest that CEM-101 may provide, for the first time, an effective antimicrobial approach for the prevention and treatment of intrauterine infection and early prevention of preterm birth.
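The quoted maternal-to-fetal transfer can be roughly reproduced from the peak concentrations given in the abstract; note that the paper's 34% figure is presumably exposure (AUC) based, so the simple peak-concentration ratio below is only an approximate proxy.

```python
# Crude illustration of maternal-to-fetal transfer using the peak plasma
# concentrations quoted in the abstract (maternal i.v. arm). The reported 34%
# is presumably AUC-based; the peak ratio is a rougher proxy.

def transfer_efficiency_pct(peak_fetal: float, peak_maternal: float) -> float:
    return 100.0 * peak_fetal / peak_maternal

mp, fp, af = 1073.0, 353.0, 214.0  # ng/ml: maternal plasma, fetal plasma, amniotic fluid
print(round(transfer_efficiency_pct(fp, mp), 1))  # ~33, close to the reported 34%
```

Both peaks sit well above the >30 ng/ml effective concentration the abstract cites, which is the pharmacological point of the 34% transfer figure.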

  6. A Mixed-Methods Trial of Broad Band Noise and Nature Sounds for Tinnitus Therapy: Group and Individual Responses Modeled under the Adaptation Level Theory of Tinnitus

    Science.gov (United States)

    Durai, Mithila; Searchfield, Grant D.

    2017-01-01

    Objectives: A randomized cross-over trial in 18 participants tested the hypothesis that nature sounds, with unpredictable temporal characteristics and high valence, would yield greater improvement in tinnitus than constant, emotionally neutral broadband noise. Study Design: The primary outcome measure was the Tinnitus Functional Index (TFI). Secondary measures were: loudness and annoyance ratings, loudness level matches, minimum masking levels, positive and negative emotionality, attention reaction and discrimination time, anxiety, depression and stress. Each sound was administered using MP3 players with earbuds for 8 continuous weeks, with a 3-week wash-out period before crossing over to the other treatment sound. Measurements were undertaken for each arm at sound fitting and at 4 and 8 weeks after administration. Qualitative interviews were conducted at each of these appointments. Results: From a baseline TFI score of 41.3, sound therapy resulted in a mean TFI score of 35.6 at 8 weeks; broadband noise produced a significantly greater reduction after 8 weeks of sound therapy use (8.2 points) than nature sounds (3.2 points). The positive effect of sound on tinnitus was supported by secondary outcome measures of tinnitus, emotion, attention, and psychological state, but not by the interviews. The tinnitus loudness level match was higher for broadband noise at 8 weeks, while there was little change in loudness level matches for nature sounds. There was no change in minimum masking levels following sound therapy administration. Self-reported preference for one sound over another did not correlate with changes in tinnitus. Conclusions: Modeled under an adaptation level theory framework of tinnitus perception, the results indicate that the introduction of broadband noise shifts internal adaptation level weighting away from the tinnitus signal, reducing tinnitus magnitude. 
Nature sounds may modify the affective components of tinnitus via a secondary, residual pathway, but this appears to be less important

  7. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  8. Examination of reproducibility in microbiological degradation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy...

  9. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or ...

  10. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Providing comprehensive coverage of how the norm of reproducibility is implemented and reflected in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  11. Strain Gauges Based on CVD Graphene Layers and Exfoliated Graphene Nanoplatelets with Enhanced Reproducibility and Scalability for Large Quantities.

    Science.gov (United States)

    Yokaribas, Volkan; Wagner, Stefan; Schneider, Daniel S; Friebertshäuser, Philipp; Lemme, Max C; Fritzen, Claus-Peter

    2017-12-18

    The two-dimensional material graphene promises a broad variety of sensing activities. Owing to its low weight and high versatility, the sensor density on a structure can be increased significantly, which can improve reliability and reduce fluctuation in damage-detection strategies such as structural health monitoring (SHM). Moreover, it lays the basis of structure-sensor fusion towards self-sensing structures. Strain gauges are extensively used sensors in scientific and industrial applications. In this work, sensing in small strain fields (from -0.1% up to 0.1%) with regard to the structural dynamics of a mechanical structure is presented, with sensitivities comparable to bulk materials, by measuring the inherent piezoresistive effect of graphene grown by chemical vapor deposition (CVD) with a very high aspect ratio of approximately 4.86 × 10⁸. It is demonstrated that increasing the number of CVD graphene layers plays a key role in reproducible strain-gauge application, since defects in individual layers may become less important in the current path. This may lead to a more stable response and thus to lower scatter. Further results demonstrate the piezoresistive effect in a network consisting of liquid-exfoliated graphene nanoplatelets (GNP), which results in even higher strain sensitivity and reproducibility. A model-assisted approach provides the main parameters for finding an optimum of sensitivity and reproducibility of GNP films. The fabricated GNP strain gauges show a minimal deviation in the piezoresistive effect, with a gauge factor (GF) of approximately 5.6, and exhibit linear electromechanical behaviour up to 1% strain. Spray deposition is used to develop a low-cost and scalable manufacturing process for GNP strain gauges. In this context, the challenge of reproducible and reliable manufacturing and operation must be overcome. The developed sensors address the significant importance of reproducible sensor performance and open
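The piezoresistive relation that defines a gauge factor can be sketched directly; GF of about 5.6 is quoted for the GNP films, and the strain value is taken from the tested small-strain range.

```python
# The defining strain-gauge relation: GF = (dR/R) / strain, so the measurable
# relative resistance change is GF * strain. GF ~ 5.6 is the quoted value for
# the GNP films; the 0.1% strain is from the tested small-strain range.

def delta_r_over_r(gauge_factor: float, strain: float) -> float:
    """Relative resistance change for a given gauge factor and strain."""
    return gauge_factor * strain

gf = 5.6
strain = 0.001  # 0.1% strain
print(f"relative resistance change: {delta_r_over_r(gf, strain) * 100:.2f}%")  # 0.56%
```

For comparison, metallic foil gauges typically have GF near 2, so a GF of 5.6 at the same strain yields roughly 2.8 times the resistance signal.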

  12. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life-science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.
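The headline figure is simple arithmetic on two inputs: total US preclinical spending and the estimated irreproducibility rate. The spending total below is back-calculated from the abstract's own numbers, not an independent figure.

```python
# The US$28B/year estimate follows from total preclinical spending times the
# irreproducibility rate. The US$56B spending total below is back-calculated
# from the abstract (28 / 0.5), purely for illustration.

def irreproducible_spend(total_spend_b: float, irreproducibility_rate: float) -> float:
    """Annual irreproducible spend in US$B."""
    return total_spend_b * irreproducibility_rate

total_us_preclinical_b = 56.0  # implied, hypothetical back-calculation
print(irreproducible_spend(total_us_preclinical_b, 0.5))  # 28.0 (US$B/year)
```

Framing the estimate this way makes its sensitivity obvious: the dollar figure scales linearly with both the assumed spending base and the assumed rate, each of which carries substantial uncertainty.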

  13. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  14. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  15. Broad-Application Test Reactor

    International Nuclear Information System (INIS)

    Motloch, C.G.

    1992-05-01

    This report describes a new, safe, and operationally efficient DOE reactor for nuclear research and testing, proposed for the early to mid-21st century. Dubbed the Broad-Application Test Reactor (BATR), the proposed facility incorporates a multiple-application, multiple-mission design to support DOE programs such as naval reactors and space power and propulsion, as well as research in the medical, science, isotope, and electronics arenas. DOE research reactors are aging, and implementing major replacement projects requires long lead times. Primary design drivers include safety, low risk, minimum operating cost, mission flexibility, waste minimization, and long life. Scientists and engineers at the Idaho National Engineering Laboratory are evaluating possible fuel forms, structural materials, reactor geometries, coolants, and moderators.

  16. Broad-band beam buncher

    International Nuclear Information System (INIS)

    Goldberg, D.A.; Flood, W.S.; Arthur, A.A.; Voelker, F.

    1986-01-01

    This patent describes a broad-band beam buncher. The beam buncher consists of: a housing adapted to be evacuated; an electron gun in the housing for producing a beam of electrons; buncher means in the housing forming a buncher cavity which has an entrance opening for receiving the electron beam and an exit opening through which the electron beam passes out of the buncher cavity; a drift tube electrode in the buncher cavity, disposed between the entrance opening and the exit opening with first and second gaps between the drift tube electrode and the entrance and exit openings, the drift tube electrode having a first drift space through which the electron beam passes in traveling between the entrance and exit openings; modulating means for supplying an ultrahigh-frequency modulating signal to the drift tube electrode for producing velocity modulation of the electrons in the electron beam as the electrons pass through the buncher cavity and the drift tube electrode between the entrance opening and the exit opening; and drift space means in the housing forming a second drift space for receiving the velocity-modulated electron beam from the exit opening, the velocity-modulated electron beam being bunched as it passes along the second drift space, the drift space means having a discharge opening through which the electron beam is discharged from the second drift space after being bunched therein. The modulating means contains a signal source for producing an ultrahigh-frequency signal, a transmission line connected between the signal source and the drift tube electrode, and terminating means connected to the drift tube electrode for terminating the transmission line in approximately its characteristic impedance to afford a broad response band with minimum variations therein.

  17. Broad resonances and beta-decay

    DEFF Research Database (Denmark)

    Riisager, K.; Fynbo, H. O. U.; Hyldegaard, S.

    2015-01-01

    Beta-decay into broad resonances gives a distorted lineshape in the observed energy spectrum. Part of the distortion arises from the phase space factor, but we show that the beta-decay matrix element may also contribute. Based on a schematic model for p-wave continuum neutron states it is argued... that beta-decay directly to the continuum should be considered as a possible contributing mechanism in many decays close to the driplines. The signatures in R-matrix fits for such decays directly to continuum states are discussed and illustrated through an analysis of the beta-decay of $^8$B into $2...

  18. Transparent, reproducible and reusable research in pharmacoepidemiology

    NARCIS (Netherlands)

    Gardarsdottir, Helga; Sauer, Brian C.; Liang, Huifang; Ryan, Patrick; Klungel, Olaf; Reynolds, Robert

    2012-01-01

    Background: Epidemiological research has been criticized as being unreliable. Scientific evidence is strengthened when the study procedures of important findings are transparent, open for review, and easily reproduced by different investigators and in various settings. Studies often have common

  19. Thou Shalt Be Reproducible! A Technology Perspective

    Science.gov (United States)

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments are shown and explained that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486

  20. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  1. Modeling the Broad-Band Emission from the Gamma-Ray Emitting Narrow-Line Seyfert-1 Galaxies 1H 0323+342 and B2 0954+25A

    Energy Technology Data Exchange (ETDEWEB)

    Arrieta-Lobo, Maialen; Boisson, Catherine; Zech, Andreas, E-mail: maialen.arrieta@obspm.fr [Laboratoire Univers et Theories, Observatoire de Paris, CNRS, Université Paris-Diderot, PSL Research University, Meudon (France)

    2017-12-08

    Prior to the Fermi-LAT era, only two classes of Active Galactic Nuclei (AGN) were thought to harbor relativistic jets that radiate up to gamma-ray energies: blazars and radio galaxies. The detection of variable gamma-ray emission from Narrow Line Seyfert 1 (NLSy1) galaxies has put them in the spotlight as a new class of gamma-ray emitting AGN. In this respect, gamma-ray emitting NLSy1s seem to be situated between blazars (dominated by non-thermal emission) and Seyferts (accretion-disc dominated). In this work, we model the Spectral Energy Distribution (SED) of two gamma-loud NLSy1s, 1H 0323+342 and B2 0954+25A, during quiescent and flaring episodes via a multi-component radiative model that features a relativistic jet and external photon fields from the torus, disc, corona and Broad Line Region (BLR). We find that the interpretation of the high-energy emission of jetted NLSy1s requires taking into account Inverse Compton emission from particles in the relativistic jet that interact with external photon fields. Minimal changes are applied to the model parameters to transition from average to flaring states. In this scenario, the observed variability is explained mainly by means of changes in the jet density and Doppler factor.

  2. Data Science Innovations That Streamline Development, Documentation, Reproducibility, and Dissemination of Models in Computational Thermodynamics: An Application of Image Processing Techniques for Rapid Computation, Parameterization and Modeling of Phase Diagrams

    Science.gov (United States)

    Ghiorso, M. S.

    2014-12-01

    Computational thermodynamics (CT) represents a collection of numerical techniques that are used to calculate quantitative results from thermodynamic theory. In the Earth sciences, CT is most often applied to estimate the equilibrium properties of solutions, to calculate phase equilibria from models of the thermodynamic properties of materials, and to approximate irreversible reaction pathways by modeling these as a series of local equilibrium steps. The thermodynamic models that underlie CT calculations relate the energy of a phase to temperature, pressure and composition. These relationships are not intuitive and they are seldom well constrained by experimental data; often, intuition must be applied to generate a robust model that satisfies the expectations of use. As a consequence of this situation, the models and databases that support CT applications in geochemistry and petrology are tedious to maintain as new data and observations arise. What is required to make the process more streamlined and responsive is a computational framework that permits the rapid generation of observable outcomes from the underlying data/model collections and, importantly, the ability to update and re-parameterize the constitutive models through direct manipulation of those outcomes. CT procedures that take models/data to the experiential reference frame of phase equilibria involve function minimization, gradient evaluation, the calculation of implicit lines, curves and surfaces, contour extraction, and other related geometrical measures. All these procedures are the mainstay of image-processing analysis. Since the commercial escalation of video game technology, open source image processing libraries have emerged (e.g., VTK) that permit real-time manipulation and analysis of images. These tools find immediate application in CT calculations of phase equilibria by permitting rapid calculation and real-time feedback between model outcomes and the underlying model parameters.

  3. Broad-band near-field ground motion simulations in 3-dimensional scattering media

    KAUST Repository

    Imperatori, W.

    2012-12-06

    The heterogeneous nature of Earth's crust is manifested in the scattering of propagating seismic waves. In recent years, different techniques have been developed to include such phenomena in broad-band ground-motion calculations, considering scattering as either a semi-stochastic or a purely stochastic process. In this study, we simulate broad-band (0-10 Hz) ground motions with a 3-D finite-difference wave propagation solver using several 3-D media characterized by von Karman correlation functions with different correlation lengths and standard deviation values. Our goal is to investigate scattering characteristics and their influence on the seismic wavefield at short and intermediate distances from the source in terms of ground-motion parameters. We also examine scattering phenomena related to the loss of radiation pattern and the directivity breakdown. We first simulate broad-band ground motions for a point source characterized by a classic ω² spectrum model. Fault finiteness is then introduced by means of a Haskell-type source model presenting both sub-shear and super-shear rupture speed. Results indicate that scattering plays an important role in ground motion even at short distances from the source, where source effects are thought to be dominant. In particular, peak ground-motion parameters can be affected even at relatively low frequencies, implying that earthquake ground-motion simulations should include scattering also for peak ground velocity (PGV) calculations. At the same time, we find a gradual loss of the source signature in the 2-5 Hz frequency range, together with a distortion of the Mach cones in the case of super-shear rupture. For more complex source models and a truly heterogeneous Earth, these effects may occur even at lower frequencies. Our simulations suggest that von Karman correlation functions with correlation length between several hundred metres and a few kilometres, Hurst exponent around 0.3 and standard deviation in the 5-10 per cent
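A 1-D random-medium realization with a von Karman spectrum can be sketched with a standard FFT recipe; the 1-D PSD exponent -(H + 1/2) and all parameter values are textbook choices for illustration, not taken from the study (which uses full 3-D media).

```python
# Sketch: generate a 1-D random-medium realization with a von Karman power
# spectrum (correlation length a, Hurst exponent H, standard deviation sigma).
# The 1-D PSD shape (1 + k^2 a^2)^-(H + 1/2) and the random-phase FFT recipe
# are generic textbook choices, not the paper's 3-D implementation.
import numpy as np

def von_karman_1d(n, dx, a=2000.0, hurst=0.3, sigma=0.05, seed=0):
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=dx) * 2 * np.pi          # angular wavenumber (rad/m)
    psd = (1.0 + (k * a) ** 2) ** (-(hurst + 0.5))    # 1-D von Karman shape
    phase = rng.uniform(0, 2 * np.pi, k.size)         # random phases
    spec = np.sqrt(psd) * np.exp(1j * phase)
    spec[0] = 0.0                                     # enforce zero mean
    field = np.fft.irfft(spec, n=n)
    return sigma * field / field.std()                # normalize to target sigma

# Hypothetical profile: 100 m sampling, a = 2 km, H = 0.3, sigma = 5%
m = von_karman_1d(n=4096, dx=100.0)
print(round(float(m.std()), 4))  # 0.05 by construction
```

The Hurst exponent controls the roll-off at high wavenumbers (roughness), while the correlation length sets the scale below which the spectrum flattens, the two knobs varied across the simulated media in the abstract.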

  4. Reproducibility of pacing profiles in elite swimmers.

    Science.gov (United States)

    Skorski, Sabrina; Faude, Oliver; Caviezel, Seraina; Meyer, Tim

    2014-03-01

    To analyze the reproducibility of pacing in elite swimmers during competitions and to compare heats and finals within 1 event. Finals and heats of 158 male swimmers (age 22.8 ± 2.9 y) from 29 nations were analyzed in 2 competitions (downloaded from swimrankings.net). Of these, 134 were listed in the world's top 50 in 2010; the remaining 24 were finalists of the Pan Pacific Games or European Championships. Both competitions included in the analysis (7.7 ± 5.4 wk apart) had to be at least at the level of national championships. Standard error of measurement, expressed as a percentage of the subject's mean score (CV) with 90% confidence limits (CL), was calculated for each 50-m split time and for total times. In addition, mixed general modeling was used to determine standard deviations between and within swimmers. CV for total time in finals ranged between 0.8% and 1.3% (CL 0.6-2.2%). Regarding split times, the 200-m freestyle showed consistent pacing over all split times (CV 0.9-1.6%). During butterfly, backstroke, and 400-m freestyle, CVs were low in the first 3 and 7 sections, respectively (CV 0.9-1.7%), with greater variability in the last section (1.9-2.2%). In breaststroke, values were higher in all sections (CV 1.2-2.3%). Within-subject SDs for changes between laps were between 0.9% and 2.6% in all finals. Split-time variability for finals and heats ranged between 0.9% and 2.5% (CL 0.3-4.9%). Pacing profiles are consistent between different competitions. Variability of pacing seems to be a result of within-subject variation rather than of different competitions.
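
    For illustration only (not the authors' analysis code): the variability statistic in the record above, the standard error of measurement expressed as a percentage of the mean (CV), can be computed from paired trials as the typical error sd(diffs)/√2. The helper name and the sample race times are invented.

```python
import numpy as np

def typical_error_cv(trial1, trial2):
    """Within-subject standard error of measurement from two trials,
    expressed as a percentage of the grand mean (a common CV estimate)."""
    trial1 = np.asarray(trial1, dtype=float)
    trial2 = np.asarray(trial2, dtype=float)
    diffs = trial2 - trial1
    sem = diffs.std(ddof=1) / np.sqrt(2.0)   # typical error
    return 100.0 * sem / np.concatenate([trial1, trial2]).mean()

# hypothetical race times (s) for the same swimmers in two competitions
comp_a = [112.4, 110.9, 113.8, 111.5]
comp_b = [112.9, 110.2, 114.1, 112.3]
cv = typical_error_cv(comp_a, comp_b)   # about 0.41% for these sample times
```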

  5. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  6. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    We could synthesize Ag nanocubes highly reproducibly by conducting the polyol synthesis with an HCl etchant under dark conditions, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth could be promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was carried out under light conditions, owing to the photoreduction of AgCl to Ag.

  7. MICROLENSING OF QUASAR BROAD EMISSION LINES: CONSTRAINTS ON BROAD LINE REGION SIZE

    Energy Technology Data Exchange (ETDEWEB)

    Guerras, E.; Mediavilla, E. [Instituto de Astrofisica de Canarias, Via Lactea S/N, La Laguna E-38200, Tenerife (Spain); Jimenez-Vicente, J. [Departamento de Fisica Teorica y del Cosmos, Universidad de Granada, Campus de Fuentenueva, E-18071 Granada (Spain); Kochanek, C. S. [Department of Astronomy and the Center for Cosmology and Astroparticle Physics, The Ohio State University, 4055 McPherson Lab, 140 West 18th Avenue, Columbus, OH 43221 (United States); Munoz, J. A. [Departamento de Astronomia y Astrofisica, Universidad de Valencia, E-46100 Burjassot, Valencia (Spain); Falco, E. [Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Motta, V. [Departamento de Fisica y Astronomia, Universidad de Valparaiso, Avda. Gran Bretana 1111, Valparaiso (Chile)

    2013-02-20

    We measure the differential microlensing of the broad emission lines between 18 quasar image pairs in 16 gravitational lenses. We find that the broad emission lines are in general weakly microlensed. The results show, at a modest level of confidence (1.8σ), that high-ionization lines such as C IV are more strongly microlensed than low-ionization lines such as Hβ, indicating that the high-ionization line emission regions are more compact. If we statistically model the distribution of microlensing magnifications, we obtain estimates for the broad-line region size of r_s = 24^{+22}_{-15} and r_s = 55^{+150}_{-35} lt-day (90% confidence) for the high- and low-ionization lines, respectively. When the samples are divided into higher- and lower-luminosity quasars, we find that the line emission regions of more luminous quasars are larger, with a slope consistent with the expected scaling from photoionization models. Our estimates also agree well with the results from local reverberation mapping studies.

  8. Experimental challenges to reproduce seismic fault motion

    Science.gov (United States)

    Shimamoto, T.

    2011-12-01

    This presentation briefly reviews scientific and technical developments in the study of intermediate- to high-velocity frictional properties of faults and summarizes the remaining technical challenges in reproducing the nucleation-to-growth processes of large earthquakes in the laboratory. Nearly 10 high-velocity or low-to-high-velocity friction apparatuses have been built worldwide in the last several years, and it is now possible to produce slip rates ranging from sub-plate velocities to seismic slip rates in a single machine. Despite the spread of high-velocity friction studies, reproducing seismic fault motion at the high P and T conditions covering the entire seismogenic zone is still a big challenge. Previous studies focused on (1) frictional melting, (2) thermal pressurization, and (3) high-velocity gouge behavior without frictional melting. The frictional melting process was solved as a Stefan problem, with very good agreement with experimental results. Thermal pressurization has been solved theoretically based on measured transport properties and has been successfully included in the modeling of earthquake generation. High-velocity gouge experiments in the last several years have revealed that a wide variety of gouges exhibit dramatic weakening at high velocities (e.g., Di Toro et al., 2011, Nature). Most gouge experiments were done under dry conditions, partly to separate gouge friction from the involvement of thermal pressurization. However, recent studies demonstrated that dehydration or degassing due to mineral decomposition can occur during seismic fault motion. Those results not only provided a new way of looking at natural fault zones in search of geological evidence of seismic fault motion, but also indicated that thermal pressurization and gouge weakening can occur simultaneously even in initially dry gouge. Thus experiments with controlled pore pressure are needed. I have struggled to make a pressure vessel for wet high-velocity experiments in the last several years. A technical

  9. Novel Clostridium difficile Anti-Toxin (TcdA and TcdB) Humanized Monoclonal Antibodies Demonstrate In Vitro Neutralization across a Broad Spectrum of Clinical Strains and In Vivo Potency in a Hamster Spore Challenge Model.

    Directory of Open Access Journals (Sweden)

    Hongyu Qiu

    Full Text Available Clostridium difficile (C. difficile) infection (CDI) is the main cause of nosocomial antibiotic-associated colitis and increased incidence of community-associated diarrhea in industrialized countries. At present, the primary treatment of CDI is antibiotic administration, which is effective but often associated with recurrence, especially in the elderly. Pathogenic strains produce enterotoxin, toxin A (TcdA), and cytotoxin, toxin B (TcdB), which are necessary for C. difficile induced diarrhea and gut pathological changes. Administration of anti-toxin antibodies provides an alternative approach to treat CDI, and has shown promising results in preclinical and clinical studies. In the current study, several humanized anti-TcdA and anti-TcdB monoclonal antibodies were generated and their protective potency was characterized in a hamster infection model. The humanized anti-TcdA (CANmAbA4) and anti-TcdB (CANmAbB4 and CANmAbB1) antibodies showed broad-spectrum in vitro neutralization of toxins from clinical strains and neutralization in a mouse toxin challenge model. Moreover, co-administration of a humanized antibody cocktail (CANmAbA4 and CANmAbB4) provided a high level of protection in a dose-dependent manner (85% versus 57% survival at day 22 for the 50 mg/kg and 20 mg/kg doses, respectively) in a hamster gastrointestinal (GI) infection model. This study describes the protective effects conferred by novel neutralizing anti-toxin monoclonal antibodies against C. difficile toxins and their potential as therapeutic agents in treating CDI.

  10. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; 
Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. Nathan; Pitts, Michael C.; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. 
Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric-Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  11. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  12. REPRODUCIBILITY OF CHILDHOOD RESPIRATORY SYMPTOM QUESTIONS

    NARCIS (Netherlands)

    BRUNEKREEF, B; GROOT, B; RIJCKEN, B; HOEK, G; STEENBEKKERS, A; DEBOER, A

    The reproducibility of answers to childhood respiratory symptom questions was investigated by administering two childhood respiratory symptom questionnaires twice, with a one month interval, to the same population of Dutch school children. The questionnaires were completed by the parents of 410

  13. Reply to the comment of S. Rayne on "QSAR model reproducibility and applicability: A case study of rate constants of hydroxyl radical reaction models applied to polybrominated diphenyl ethers and (benzo-)triazoles".

    Science.gov (United States)

    Gramatica, Paola; Kovarich, Simona; Roy, Partha Pratim

    2013-07-30

    We appreciate the interest of Dr. Rayne on our article and we completely agree that the dataset of (benzo-)triazoles, which were screened by the hydroxyl radical reaction quantitative structure-activity relationship (QSAR) model, was not only composed of benzo-triazoles but also included some simpler triazoles (without the condensed benzene ring), such as the chemicals listed by Dr. Rayne, as well as some related heterocycles (also few not aromatic). We want to clarify that in this article (as well as in other articles in which the same dataset was screened), for conciseness, the abbreviations (B)TAZs and BTAZs were used as general (and certainly too simplified) notations meaning an extended dataset of benzo-triazoles, triazoles, and related compounds. Copyright © 2013 Wiley Periodicals, Inc.

  14. ITK: enabling reproducible research and open science

    Science.gov (United States)

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitated its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow resulting from the distributed peer code review system, was high (0.46). PMID:24600387

  15. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Full Text Available Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitated its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow resulting from the distributed peer code review system, was high (0.46).

  16. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    Full Text Available This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to it twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys, who completed the test-retest study. The DEE derived from the questionnaire was compared with laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between mean TEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16, it was 0.89 (0.87), 0.76 (0.78), and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake.
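
    For illustration only (not the authors' code): the intra-class correlation coefficient used in records like the one above can be sketched in its simplest form, the one-way random-effects ICC(1,1) computed from ANOVA mean squares. The function name and the test-retest data are invented.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_trials) array:
    (MSB - MSW) / (MSB + (k - 1) * MSW), a basic test-retest reliability index."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    subj_means = x.mean(axis=1)
    msb = k * ((subj_means - grand) ** 2).sum() / (n - 1)         # between-subject MS
    msw = ((x - subj_means[:, None]) ** 2).sum() / (n * (k - 1))  # within-subject MS
    return (msb - msw) / (msb + (k - 1) * msw)

# hypothetical questionnaire scores for 5 children, administered twice
scores = [[200, 202], [180, 179], [220, 223], [190, 188], [210, 212]]
icc = icc_oneway(scores)   # close to 1 because retest scores track the first trial
```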

  17. Sharing meanings: developing interoperable semantic technologies to enhance reproducibility in earth and environmental science research

    Science.gov (United States)

    Schildhauer, M.

    2015-12-01

    Earth and environmental scientists are familiar with the entities, processes, and theories germane to their field of study, and comfortable collecting and analyzing data in their area of interest. Yet, while there appears to be consistency and agreement as to the scientific "terms" used to describe features in their data and analyses, aside from a few fundamental physical characteristics, such as mass or velocity, there can be broad tolerances, if not considerable ambiguity, in how many earth science "terms" map to the underlying "concepts" that they actually represent. This ambiguity in meanings, or "semantics", creates major problems for scientific reproducibility. It greatly impedes the ability to replicate results, by making it difficult to determine the specifics of the intended meanings of terms such as deforestation or carbon flux, as to scope, composition, magnitude, etc. In addition, semantic ambiguity complicates assemblage of comparable data for reproducing results, due to ambiguous or idiosyncratic labels for measurements, such as percent cover of forest, where the term "forest" is undefined; or where a reported output of "total carbon-emissions" might include just CO2 emissions but not methane emissions. In this talk, we describe how the NSF-funded DataONE repository for earth and environmental science data (http://dataone.org) is using W3C-standard languages (RDF/OWL) to build an ontology for clarifying concepts embodied in heterogeneous data and model outputs. With an initial focus on carbon cycling concepts, using terrestrial biospheric model outputs and LTER productivity data, we describe how we are achieving interoperability with "semantic vocabularies" (or ontologies) from aligned earth and life science domains, including OBO-foundry ontologies such as ENVO and BCO; the ISO/OGC O&M; and the NSF Earthcube GeoLink project. Our talk will also discuss best practices that may be helpful for other groups interested in constructing their own

  18. Accuracy, reproducibility, and time efficiency of dental measurements using different technologies.

    Science.gov (United States)

    Grünheid, Thorsten; Patel, Nishant; De Felippe, Nanci L; Wey, Andrew; Gaillard, Philippe R; Larson, Brent E

    2014-02-01

    Historically, orthodontists have taken dental measurements on plaster models. Technological advances now allow orthodontists to take these measurements on digital models. In this study, we aimed to assess the accuracy, reproducibility, and time efficiency of dental measurements taken on 3 types of digital models. emodels (GeoDigm, Falcon Heights, Minn), SureSmile models (OraMetrix, Richardson, Tex), and AnatoModels (Anatomage, San Jose, Calif) were made for 30 patients. Mesiodistal tooth-width measurements taken on these digital models were timed and compared with those on the corresponding plaster models, which were used as the gold standard. Accuracy and reproducibility were assessed using the Bland-Altman method. Differences in time efficiency were tested for statistical significance with 1-way analysis of variance. Measurements on SureSmile models were the most accurate, followed by those on emodels and AnatoModels. Measurements taken on SureSmile models were also the most reproducible. Measurements taken on SureSmile models and emodels were significantly faster than those taken on AnatoModels and plaster models. Tooth-width measurements on digital models can be as accurate as, and might be more reproducible and significantly faster than, those taken on plaster models. Of the models studied, the SureSmile models provided the best combination of accuracy, reproducibility, and time efficiency of measurement. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
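
    As a brief sketch (not the authors' code): the Bland-Altman method used in the record above summarizes agreement between two measurement techniques by the mean difference (bias) and the 95% limits of agreement, bias ± 1.96 × SD of the differences. The helper name and tooth-width data are hypothetical.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement between two
    paired measurement methods (e.g. digital vs. plaster models)."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diffs = a - b
    bias = diffs.mean()
    sd = diffs.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical mesiodistal tooth widths (mm): digital model vs. plaster gold standard
digital = [8.1, 7.9, 9.4, 6.8, 7.2]
plaster = [8.0, 8.0, 9.2, 6.9, 7.1]
bias, lo, hi = bland_altman(digital, plaster)
```

In a full analysis the differences are also plotted against the pairwise means to check for proportional bias.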

  19. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer....... The test combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin irritant substance or product is included in each test as a positive control...... high reproducibility of the test. Further, the intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated 2x, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility....

  20. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver kappa values for all three methods were good (0.51-0.74). Interobserver kappa varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  1. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver kappa values for all three methods were good (0.51-0.74). Interobserver kappa varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests

  2. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys fabricated by selective laser melting: Al-12Si, Cu-10Sn, and 316L (face-centered cubic structure), and CoCrMo and commercially pure Ti (CP-Ti) (hexagonal close-packed structure). The room-temperature tensile properties of the Al-12Si samples show good consistency in results within the experimental errors. Similarly reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  3. DETECTION OF EXTREMELY BROAD WATER EMISSION FROM THE MOLECULAR CLOUD INTERACTING SUPERNOVA REMNANT G349.7+0.2

    Energy Technology Data Exchange (ETDEWEB)

    Rho, J. [SETI Institute, 189 N. Bernardo Avenue, Mountain View, CA 94043 (United States); Hewitt, J. W. [CRESST/University of Maryland, Baltimore County, Baltimore, MD 21250 (United States); Boogert, A. [SOFIA Science Center, NASA Ames Research Center, MS 232-11, Moffett Field, CA 94035 (United States); Kaufman, M. [Department of Physics and Astronomy, San Jose State University, San Jose, CA 95192-0106 (United States); Gusdorf, A., E-mail: jrho@seti.org, E-mail: john.w.hewitt@nasa.gov, E-mail: aboogert@sofia.usra.edu, E-mail: michael.kaufman@sjsu.edu, E-mail: antoine.gusdorf@lra.ens.fr [LERMA, UMR 8112 du CNRS, Observatoire de Paris, École Normale Supérieure, 24 rue Lhomond, F-75231 Paris Cedex 05 (France)

    2015-10-10

    We performed Herschel HIFI, PACS, and SPIRE observations toward the molecular cloud interacting supernova remnant G349.7+0.2. An extremely broad emission line was detected at 557 GHz from the ground state transition 1{sub 10}-1{sub 01} of ortho-water. This water line can be separated into three velocity components with widths of 144, 27, and 4 km s{sup −1}. The 144 km s{sup −1} component is the broadest water line detected to date in the literature. This extremely broad line width shows the importance of probing shock dynamics. PACS observations revealed three additional ortho-water lines, as well as numerous high-J carbon monoxide (CO) lines. No para-water lines were detected. The extremely broad water line is indicative of a high velocity shock, which is supported by the observed CO rotational diagram that was reproduced with a J-shock model with a density of 10{sup 4} cm{sup −3} and a shock velocity of 80 km s{sup −1}. Two far-infrared fine-structure lines, the [O i] line at 145 μm and the [C ii] line at 157 μm, are also consistent with the high velocity J-shock model. The extremely broad water line could simply arise from short-lived molecules that have not been destroyed in high velocity J-shocks; however, it may arise from a more complicated geometry, such as high-velocity water bullets or a shell expanding at high velocity. We estimate the CO and H{sub 2}O densities, column densities, and temperatures by comparison with RADEX and detailed shock models.

  4. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  5. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
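The histogram equalization underlying the IDL enhancements can be illustrated in miniature. The sketch below is a generic textbook version of global histogram equalization on a flat list of grayscale values, not a reproduction of the IDL routines: it maps each pixel through the cumulative histogram so a low-contrast image is stretched to the full dynamic range.

```python
def equalize_histogram(pixels, levels=256):
    """Global histogram equalization for a flat list of grayscale values."""
    n = len(pixels)
    # Histogram of pixel values
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Cumulative distribution function
    cdf = []
    total = 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Remap so that the output histogram is approximately flat
    scale = (levels - 1) / (n - cdf_min) if n > cdf_min else 1
    return [round((cdf[p] - cdf_min) * scale) for p in pixels]

# A low-contrast "image": values clustered in 100..103
img = [100, 100, 101, 101, 102, 102, 103, 103]
print(equalize_histogram(img))  # → [0, 0, 85, 85, 170, 170, 255, 255]
```

This deterministic remapping is what makes the enhancement fully reproducible: the same input always yields the same output, unlike manually tuned interactive adjustments.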

  6. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    Full Text Available The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together, these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.
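The escape mechanism described above can be sketched with a one-dimensional toy model (not the authors' network): a leaky activity variable relaxes toward a stable equilibrium, and Gaussian noise occasionally drives it past a locomotion threshold, producing irregular "bouts". All parameter values below are illustrative assumptions.

```python
import random

def simulate_bouts(steps=20000, noise=1.0, threshold=0.5, seed=1):
    """Leaky neural activity x relaxes to a stable equilibrium at 0;
    ongoing fluctuations occasionally push it past a locomotion threshold,
    triggering a bout. Returns the number of bouts."""
    rng = random.Random(seed)
    x, dt, tau = 0.0, 0.01, 0.1
    bouts, walking = 0, False
    for _ in range(steps):
        # Euler-Maruyama step: relaxation toward 0 plus Gaussian noise
        x += (-x / tau) * dt + noise * rng.gauss(0, 1) * dt ** 0.5
        if not walking and x > threshold:
            bouts += 1
            walking = True
        elif walking and x < 0:
            walking = False
    return bouts

# Without fluctuations the activity sits at its equilibrium and no bouts occur;
# with fluctuations, threshold crossings happen spontaneously.
print(simulate_bouts(noise=0.0), simulate_bouts(noise=1.0))
```

The key qualitative point matches the abstract: remove the fluctuations and the deterministic dynamics never leave the stable equilibrium, so no behavior is initiated.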

  7. Broad Prize: Do the Successes Spread?

    Science.gov (United States)

    Samuels, Christina A.

    2011-01-01

    When the Broad Prize for Urban Education was created in 2002, billionaire philanthropist Eli Broad said he hoped the awards, in addition to rewarding high-performing school districts, would foster healthy competition; boost the prestige of urban education, long viewed as dysfunctional; and showcase best practices. Over the 10 years the prize has…

  8. Broad Academy's Growing Reach Draws Scrutiny

    Science.gov (United States)

    Samuels, Christina A.

    2011-01-01

    Billionaire businessman Eli Broad, one of the country's most active philanthropists, founded the Broad Superintendents Academy in 2002 with an extraordinarily optimistic goal: Find leaders from both inside and outside education, train them, and have them occupying the superintendencies in a third of the 75 largest school districts--in just two…

  9. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  10. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  11. Queer nuclear families? Reproducing and transgressing heteronormativity.

    Science.gov (United States)

    Folgerø, Tor

    2008-01-01

    During the past decade the public debate on gay and lesbian adoptive rights has been extensive in the Norwegian media. The debate illustrates how women and men planning to raise children in homosexual family constellations challenge prevailing cultural norms and existing concepts of kinship and family. The article discusses how lesbian mothers and gay fathers understand and redefine their own family practices. An essential point in this article is the fundamental ambiguity in these families' accounts of themselves: how they simultaneously transgress and reproduce heteronormative assumptions about childhood, fatherhood, motherhood, family and kinship.

  12. Results of von Neumann analyses for reproducing kernel semi-discretizations

    Energy Technology Data Exchange (ETDEWEB)

    Voth, T.E.; Christon, M.A.

    1998-06-01

    The Reproducing Kernel Particle Method (RKPM) has many attractive properties that make it ideal for treating a broad class of physical problems. RKPM may be implemented in a mesh-full or a mesh-free manner and provides the ability to tune the method, via the selection of a dilation parameter and window function, in order to achieve the requisite numerical performance. RKPM also provides a framework for performing hierarchical computations making it an ideal candidate for simulating multi-scale problems. Although RKPM has many appealing attributes, the method is quite new and its numerical performance is still being quantified with respect to more traditional discretization methods. In order to assess the numerical performance of RKPM, detailed studies of RKPM on a series of model partial differential equations have been undertaken. The results of von Neumann analyses for RKPM semi-discretizations of one and two-dimensional, first and second-order wave equations are presented in the form of phase and group errors. Excellent dispersion characteristics are found for the consistent mass matrix with the proper choice of dilation parameter. In contrast, the influence of row-sum lumping the mass matrix is shown to introduce severe lagging phase errors. A higher-order mass matrix improves the dispersion characteristics relative to the lumped mass matrix but delivers severe lagging phase errors relative to the fully integrated, consistent mass matrix.
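The lagging phase error introduced by row-sum lumping can be illustrated with the classical semi-discrete dispersion relations for linear finite elements applied to the 1D advection equation; these are standard textbook results, not the RKPM analysis itself. The numerical-to-exact phase-speed ratio is sin(kh)/kh for the lumped mass matrix and 3 sin(kh)/(kh (2 + cos(kh))) for the consistent one.

```python
import math

def phase_speed_ratio(kh, lumped):
    """Semi-discrete numerical phase speed divided by the exact speed for the
    1D advection equation discretized with linear finite elements:
      lumped mass:     c*/c = sin(kh) / kh
      consistent mass: c*/c = 3 sin(kh) / (kh * (2 + cos(kh)))
    kh is the nondimensional wavenumber (grid spacing h)."""
    if lumped:
        return math.sin(kh) / kh
    return 3 * math.sin(kh) / (kh * (2 + math.cos(kh)))

# The lumped mass matrix lags (ratio < 1) far more than the consistent one
for kh in (0.1, 0.5, 1.0):
    print(kh, round(phase_speed_ratio(kh, True), 4),
          round(phase_speed_ratio(kh, False), 4))
```

Running this shows the qualitative behavior reported in the abstract: the consistent mass matrix stays within a fraction of a percent of the exact phase speed over this range, while lumping introduces a noticeably lagging phase error that grows with kh.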

  13. BASTet: Shareable and Reproducible Analysis and Visualization of Mass Spectrometry Imaging Data via OpenMSI.

    Science.gov (United States)

    Rubel, Oliver; Bowen, Benjamin P

    2018-01-01

    Mass spectrometry imaging (MSI) is a transformative imaging method that supports the untargeted, quantitative measurement of the chemical composition and spatial heterogeneity of complex samples with broad applications in life sciences, bioenergy, and health. While MSI data can be routinely collected, its broad application is currently limited by the lack of easily accessible analysis methods that can process data of the size, volume, diversity, and complexity generated by MSI experiments. The development and application of cutting-edge analytical methods is a core driver in MSI research for new scientific discoveries, medical diagnostics, and commercial innovation. However, the lack of means to share, apply, and reproduce analyses hinders the broad application, validation, and use of novel MSI analysis methods. To address this central challenge, we introduce the Berkeley Analysis and Storage Toolkit (BASTet), a novel framework for shareable and reproducible data analysis that supports standardized data and analysis interfaces, integrated data storage, data provenance, workflow management, and a broad set of integrated tools. Based on BASTet, we describe the extension of the OpenMSI mass spectrometry imaging science gateway to enable web-based sharing, reuse, analysis, and visualization of data analyses and derived data products. We demonstrate the application of BASTet and OpenMSI in practice to identify and compare characteristic substructures in the mouse brain based on their chemical composition measured via MSI.

  14. Efficient and reproducible identification of mismatch repair deficient colon cancer

    DEFF Research Database (Denmark)

    Joost, Patrick; Bendahl, Pär-Ola; Halvarsson, Britta

    2013-01-01

    BACKGROUND: The identification of mismatch-repair (MMR) defective colon cancer is clinically relevant for diagnostic, prognostic and potentially also for treatment predictive purposes. Preselection of tumors for MMR analysis can be obtained with predictive models, which need to demonstrate ease...... of application and favorable reproducibility. METHODS: We validated the MMR index for the identification of prognostically favorable MMR deficient colon cancers and compared performance to 5 other prediction models. In total, 474 colon cancers diagnosed ≥ age 50 were evaluated with correlation between...... and efficiently identifies MMR defective colon cancers with high sensitivity and specificity. The model shows stable performance with low inter-observer variability and favorable performance when compared to other MMR predictive models....

  15. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces......, has been simulated using an electroacoustic setup. The model included the direct sound, 17 individual reflections, and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections using eight subjects for noise...

  16. Mathematical Development: The Role of Broad Cognitive Processes

    Science.gov (United States)

    Calderón-Tena, Carlos O.

    2016-01-01

    This study investigated the role of broad cognitive processes in the development of mathematics skills among children and adolescents. Four hundred and forty-seven students (age mean [M] = 10.23 years, 73% boys and 27% girls) from an elementary school district in the US southwest participated. Structural equation modelling tests indicated that…

  17. Broad spectrum microarray for fingerprint-based bacterial species identification

    Directory of Open Access Journals (Sweden)

    Frey Jürg E

    2010-02-01

    Full Text Available Background Microarrays are powerful tools for DNA-based molecular diagnostics and identification of pathogens. Most target a limited range of organisms and are based on only one or a very few genes for specific identification. Such microarrays are limited to organisms for which specific probes are available, and often have difficulty discriminating closely related taxa. We have developed an alternative broad-spectrum microarray that employs hybridisation fingerprints generated by high-density anonymous markers distributed over the entire genome for identification based on comparison to a reference database. Results A high-density microarray carrying 95,000 unique 13-mer probes was designed. Optimized methods were developed to deliver reproducible hybridisation patterns that enabled confident discrimination of bacteria at the species, subspecies, and strain levels. High correlation coefficients were achieved between replicates. A sub-selection of 12,071 probes, determined by ANOVA and class prediction analysis, enabled the discrimination of all samples in our panel. Mismatch probe hybridisation was observed but was found to have no effect on the discriminatory capacity of our system. Conclusions These results indicate the potential of our genome chip for reliable identification of a wide range of bacterial taxa at the subspecies level without laborious prior sequencing and probe design. With its high resolution capacity, our proof-of-principle chip demonstrates great potential as a tool for molecular diagnostics of broad taxonomic groups.

  18. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    Science.gov (United States)

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.
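One of the replication criteria quoted above, whether the original effect lies inside the replication's 95% confidence interval, can be sketched for correlation effect sizes using the Fisher z transformation. The numbers below are hypothetical, chosen to echo the finding that replication effects were roughly half the original magnitude.

```python
import math

def original_in_replication_ci(r_orig, r_rep, n_rep):
    """Check whether an original correlation falls inside the replication's
    95% CI, computed on the Fisher z scale (one of several criteria used in
    reproducibility projects)."""
    z_orig = math.atanh(r_orig)          # Fisher z of the original effect
    z_rep = math.atanh(r_rep)            # Fisher z of the replication effect
    se = 1 / math.sqrt(n_rep - 3)        # standard error of z for n_rep samples
    half_width = 1.96 * se               # 95% CI half-width
    return z_rep - half_width <= z_orig <= z_rep + half_width

# Hypothetical: original r = .40, replication r = .20 with n = 100
print(original_in_replication_ci(0.40, 0.20, 100))  # → False
```

With a halved effect size and n = 100, the original effect just misses the replication interval, illustrating how a "successful" replication by this criterion depends on both the effect shrinkage and the replication's sample size.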

  19. The NANOGrav Observing Program: Automation and Reproducibility

    Science.gov (United States)

    Brazier, Adam; Cordes, James; Demorest, Paul; Dolch, Timothy; Ferdman, Robert; Garver-Daniels, Nathaniel; Hawkins, Steven; Lam, Michael Timothy; Lazio, T. Joseph W.

    2018-01-01

    The NANOGrav Observing Program is a decades-long search for gravitational waves using pulsar timing which relies, for its sensitivity, on large data sets from observations of many pulsars. These are constructed through an intensive, long-term observing campaign. The nature of the program requires automation in the transfer and archiving of the large volume of raw telescope data, the calibration of those data, and making these resulting data products—required for diagnostic and data exploration purposes—available to NANOGrav members. Reproducibility of results is a key goal in this project, and essential to its success; it requires treating the software itself as a data product of the research, while ensuring easy access by, and collaboration between, members of NANOGrav, the International Pulsar Timing Array consortium (of which NANOGrav is a key member), as well as the wider astronomy community and the public.

  20. Uniform and reproducible stirring in a microbioreactor

    DEFF Research Database (Denmark)

    Bolic, Andrijana; Eliasson Lantz, Anna; Rottwitt, Karsten

    At present, research in bioprocess science and engineering increasingly requires fast and accurate analytical data (rapid testing) that can be used for investigation of the interaction between bioprocess operation conditions and the performance of the bioprocess. Miniaturization is certainly...... microbioreactor application. In order to address some of these questions, we are currently investigating and developing a microbioreactor platform with a reactor volume up to 1ml, as we believe that this volume is of interest to many industrial applications. It is widely known that stirring plays a very important...... role in achieving successful cultivations by promoting uniform process conditions and – for aerobic cultivations – a high oxygen transfer rate. In this contribution, the development of a suitable, reliable and reproducible stirrer in a microbioreactor for batch and continuous cultivation of S...

  1. Is Grannum grading of the placenta reproducible?

    Science.gov (United States)

    Moran, Mary; Ryan, John; Brennan, Patrick C.; Higgins, Mary; McAuliffe, Fionnuala M.

    2009-02-01

    Current ultrasound assessment of placental calcification relies on Grannum grading. The aim of this study was to assess if this method is reproducible by measuring inter- and intra-observer variation in grading placental images, under strictly controlled viewing conditions. Thirty placental images were acquired and digitally saved. Five experienced sonographers independently graded the images on two separate occasions. In order to eliminate any technological factors which could affect data reliability and consistency all observers reviewed images at the same time. To optimise viewing conditions ambient lighting was maintained between 25-40 lux, with monitors calibrated to the GSDF standard to ensure consistent brightness and contrast. Kappa (κ) analysis of the grades assigned was used to measure inter- and intra-observer reliability. Intra-observer agreement had a moderate mean κ-value of 0.55, with individual comparisons ranging from 0.30 to 0.86. Two images saved from the same patient, during the same scan, were each graded as I, II and III by the same observer. A mean κ-value of 0.30 (range from 0.13 to 0.55) indicated fair inter-observer agreement over the two occasions and only one image was graded consistently the same by all five observers. The study findings confirmed the lack of reproducibility associated with Grannum grading of the placenta despite optimal viewing conditions and highlight the need for new methods of assessing placental health in order to improve neonatal outcomes. Alternative methods for quantifying placental calcification such as a software based technique and 3D ultrasound assessment need to be explored.

  2. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    Science.gov (United States)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions according to the skill scores defined by Perkins et al. in 2013 for evaluation of AR4 climate models. We also investigate techniques for improving bias correction of values in the tails of the distributions. These techniques include binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized pareto distribution.
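Bias correction by distribution mapping can be illustrated with plain empirical quantile mapping, a simpler cousin of the KDDM method used in the study: each model value is replaced by the observed value at the same quantile of its distribution. The toy data below are illustrative.

```python
def quantile_map(model_vals, obs_vals, new_vals):
    """Empirical quantile mapping: map each new model value to the observed
    value at the same empirical quantile (a simplified stand-in for kernel
    density distribution mapping)."""
    m = sorted(model_vals)
    o = sorted(obs_vals)
    out = []
    for v in new_vals:
        # Empirical quantile of v within the model distribution
        rank = sum(x <= v for x in m)
        q = (rank - 0.5) / len(m) if rank else 0.0
        q = min(max(q, 0.0), 1.0)
        # Inverse empirical CDF of the observations at quantile q
        idx = min(int(q * len(o)), len(o) - 1)
        out.append(o[idx])
    return out

# A model that runs 2 degrees too warm relative to observations
model = [12, 14, 16, 18, 20]
obs = [10, 12, 14, 16, 18]
print(quantile_map(model, obs, [16, 20]))  # → [14, 18]
```

Because the mapping is built quantile-by-quantile rather than as a single offset, it can correct biases that differ between the center and the tails of the distribution, which is exactly where the extreme-event skill examined in the study is decided.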

  3. Response to Comment on "Estimating the reproducibility of psychological science"

    NARCIS (Netherlands)

    Anderson, Christopher J; Bahník, Štěpán; Barnett-Cowan, Michael; Bosco, Frank A; Chandler, Jesse; Chartier, Christopher R; Cheung, Felix; Christopherson, Cody D; Cordes, Andreas; Cremata, Edward J; Della Penna, Nicolas; Estel, Vivien; Fedor, Anna; Fitneva, Stanka A; Frank, Michael C; Grange, James A; Hartshorne, Joshua K; Hasselman, Fred; Henninger, Felix; van der Hulst, Marije; Jonas, Kai J; Lai, Calvin K; Levitan, Carmel A; Miller, Jeremy K; Moore, Katherine S; Meixner, Johannes M; Munafò, Marcus R; Neijenhuijs, Koen I; Nilsonne, Gustav; Nosek, Brian A; Plessow, Franziska; Prenoveau, Jason M; Ricker, Ashley A; Schmidt, Kathleen; Spies, Jeffrey R; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B; van Aert, Robbie C M; van Assen, Marcel A L M; Vanpaemel, Wolf; Vianello, Michelangelo; Voracek, Martin; Zuni, Kellylynn

    2016-01-01

    Gilbert et al. conclude that evidence from the Open Science Collaboration's Reproducibility Project: Psychology indicates high reproducibility, given the study methodology. Their very optimistic assessment is limited by statistical misconceptions and by causal inferences from selectively

  4. Broad-Band Spectroscopy of Hercules X-1 with Suzaku

    Science.gov (United States)

    Asami, Fumi; Enoto, Teruaki; Iwakiri, Wataru; Yamada, Shin'ya; Tamagawa, Toru; Mihara, Tatehiro; Nagase, Fumiaki

    2014-01-01

    Hercules X-1 was observed with Suzaku in the main-on state from 2005 to 2010. The 0.4-100 keV wide-band spectra obtained in four observations showed a broad hump around 4-9 keV in addition to narrow Fe lines at 6.4 and 6.7 keV. The hump was seen in all four observations regardless of the selection of the continuum models. Thus it is considered a stable and intrinsic spectral feature in Her X-1. The broad hump lacked a sharp structure like an absorption edge. Thus it was represented by two different spectral models: an ionized partial covering or an additional broad line at 6.5 keV. The former required a persistently existing ionized absorber, whose origin was unclear. In the latter case, the Gaussian fitting of the 6.5-keV line needs a large width of sigma = 1.0-1.5 keV and a large equivalent width of 400-900 eV. If the broad line originates from Fe fluorescence of accreting matter, its large width may be explained by the Doppler broadening in the accretion flow. However, the large equivalent width may be inconsistent with a simple accretion geometry.
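The quoted equivalent width follows directly from the Gaussian fit parameters: a Gaussian line of peak amplitude A and width sigma has integrated flux A·sigma·sqrt(2π), and dividing by the local continuum level gives the equivalent width. A sketch with hypothetical numbers chosen to land in the reported range (the actual fit values are not reproduced in the abstract):

```python
import math

def equivalent_width(amplitude, sigma, continuum):
    """Equivalent width of a Gaussian emission line: integrated line flux
    (amplitude * sigma * sqrt(2*pi)) divided by the local continuum level.
    Units of sigma carry through to the result (here: keV)."""
    return amplitude * sigma * math.sqrt(2 * math.pi) / continuum

# Hypothetical fit: a broad sigma = 1.2 keV line whose peak reaches 20%
# of the continuum level; result converted from keV to eV
print(round(1000 * equivalent_width(0.2, 1.2, 1.0)))  # → 602
```

This makes the abstract's point quantitative: even a line that rises only modestly above the continuum acquires a large equivalent width once its Gaussian width is of order 1 keV.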

  5. Reliability: on the reproducibility of assessment data.

    Science.gov (United States)

    Downing, Steven M

    2004-09-01

    All assessment data, like other scientific experimental data, must be reproducible in order to be meaningfully interpreted. The purpose of this paper is to discuss applications of reliability to the most common assessment methods in medical education. Typical methods of estimating reliability are discussed intuitively and non-mathematically. Reliability refers to the consistency of assessment outcomes. The exact type of consistency of greatest interest depends on the type of assessment, its purpose and the consequential use of the data. Written tests of cognitive achievement look to internal test consistency, using estimation methods derived from the test-retest design. Rater-based assessment data, such as ratings of clinical performance on the wards, require interrater consistency or agreement. Objective structured clinical examinations, simulated patient examinations and other performance-type assessments generally require generalisability theory analysis to account for various sources of measurement error in complex designs and to estimate the consistency of the generalisations to a universe or domain of skills. Reliability is a major source of validity evidence for assessments. Low reliability indicates that large variations in scores can be expected upon retesting. Inconsistent assessment scores are difficult or impossible to interpret meaningfully and thus reduce validity evidence. Reliability coefficients allow the quantification and estimation of the random errors of measurement in assessments, such that overall assessment can be improved.
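For written tests, the internal consistency mentioned above is commonly estimated with Cronbach's alpha, which compares the summed item variances to the variance of total scores. A minimal sketch on hypothetical item scores (the data below are illustrative, not from any real assessment):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha: internal-consistency reliability estimate, where
    item_scores[i][j] is the score of examinee j on item i."""
    k = len(item_scores)               # number of items
    n = len(item_scores[0])            # number of examinees

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Sum of per-item variances vs. variance of examinees' total scores
    item_var = sum(variance(item) for item in item_scores)
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    total_var = variance(totals)
    return k / (k - 1) * (1 - item_var / total_var)

# Three hypothetical items answered by five examinees
items = [
    [1, 2, 3, 4, 5],
    [2, 2, 3, 5, 5],
    [1, 3, 3, 4, 4],
]
print(round(cronbach_alpha(items), 2))  # → 0.95
```

When items rank examinees consistently, as in this toy example, total-score variance dwarfs the summed item variances and alpha approaches 1; inconsistent items drive it toward 0, signaling the large retest score variations the passage warns about.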

  6. Environment and industrial economy: Challenge of reproducibility

    International Nuclear Information System (INIS)

    Rullani, E.

    1992-01-01

    Historically and methodologically counterposed until now, the environmentalist and the economic approaches to environmental problems need to be integrated in a new approach that considers, on one side, the relevance of ecological equilibria for economic systems and, on the other side, the economic dimension (in terms of investments and transformations in the production system) of any attempt to achieve a better environment. In order to achieve this integration, both approaches are compelled to give up some cultural habits that have characterized them and have contributed to over-emphasizing the opposition between them. The article shows that both approaches can converge into a new one, in which the environment is no longer only a holistic, non-negotiable natural external limit to human activity (as in the environmentalist approach), nor simply a scarce and exhaustible resource (as economics tends to consider it); the environment should instead become part of the reproducibility sphere or, in other words, be regarded as part of the output that the economic system provides. This new approach, thanks to scientific and technological advances, is becoming possible for an increasing class of environmental problems. To this end, an evolution is required that can convert environmental goals into investment and technological innovation goals, and communicate to firms the value society assigns to environmental resources. This value, the author suggests, should correspond to the reproduction cost. Various examples of this new approach are analyzed and discussed.

  7. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
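    The Dice coefficients quoted above measure voxel-wise overlap between two binary segmentations; a minimal sketch with made-up masks:

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice overlap between two binary label masks (1 = structure present)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two hypothetical subcortical masks from runs on different operating systems
m1 = np.array([0, 1, 1, 1, 0, 1])
m2 = np.array([0, 1, 1, 0, 0, 1])
print(dice(m1, m2))  # 2*3 / (4+3) ≈ 0.857
```

    A value of 1 means the two operating systems produced identical classifications; the 0.59 reported above corresponds to substantial disagreement.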

  8. Reproducibility of the water drinking test.

    Science.gov (United States)

    Muñoz, C R; Macias, J H; Hartleben, C

    2015-11-01

    To investigate the reproducibility of the water drinking test in determining intraocular pressure (IOP) peaks and fluctuation. It has been suggested that there is limited agreement between the water drinking test and the diurnal tension curve. This may be because it has only been compared with a 10-hour modified diurnal tension curve, missing 70% of IOP peaks that occur during the night. This was a prospective, analytical and comparative study assessing the correlation, agreement, sensitivity and specificity of the water drinking test. The correlation between the water drinking test and the diurnal tension curve was significant and strong (r=0.93, 95% confidence interval 0.79-0.96, p<.01). A moderate agreement was observed between these measurements (pc=0.93, 95% confidence interval 0.87-0.95, p<.01). The agreement was within ±2 mmHg in 89% of the tests. Our study found a moderate agreement between the water drinking test and the diurnal tension curve, in contrast with the poor agreement found in other studies, possibly due to the absence of nocturnal IOP peaks. These findings suggest that the water drinking test could be used to determine IOP peaks, as well as baseline IOP. Copyright © 2014 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.
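    The pc statistic reported above is presumably Lin's concordance correlation coefficient, which measures agreement with the 45-degree line rather than mere linear association; a short sketch with invented paired IOP readings:

```python
import numpy as np

def concordance_cc(x, y):
    """Lin's concordance correlation coefficient: penalizes shifts in
    location and scale that plain Pearson correlation would ignore."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical paired IOP peaks (mmHg): water drinking test vs diurnal curve
wdt = [18, 21, 24, 19, 26, 22, 20, 25]
dtc = [17, 22, 23, 20, 27, 21, 20, 24]
print(round(concordance_cc(wdt, dtc), 3))  # 0.943
```

    Unlike r, the coefficient drops if one method reads systematically higher than the other even when the two track each other perfectly.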

  9. Are classifications of proximal radius fractures reproducible?

    Directory of Open Access Journals (Sweden)

    dos Santos João BG

    2009-10-01

    Background: Fractures of the proximal radius need to be classified in an appropriate and reproducible manner. The aim of this study was to assess the reliability of the three most widely used classification systems. Methods: Elbow radiographs of patients with proximal radius fractures were classified according to the Mason, Morrey, and Arbeitsgemeinschaft für Osteosynthesefragen/Association for the Study of Internal Fixation (AO/ASIF) classifications by four observers with different levels of experience with this subject, to assess intra- and inter-observer agreement. Each observer analyzed the images on three different occasions on a computer, with the numerical sequence randomly altered. Results: Intra-observer agreement of the Mason and Morrey classifications was satisfactory (κ = 0.582 and 0.554, respectively), while the AO/ASIF classification had poor intra-observer agreement (κ = 0.483). Inter-observer agreement was higher for the Mason (κ = 0.429-0.560) and Morrey (κ = 0.319-0.487) classifications than for the AO/ASIF classification (κ = 0.250-0.478), which showed poor reliability. Conclusion: Inter- and intra-observer agreement of the Mason and Morrey classifications showed overall satisfactory reliability compared with the AO/ASIF system. The Mason classification is the most reliable system.
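    The κ values above are chance-corrected agreement statistics; for a pair of raters, Cohen's kappa can be sketched as follows (the ratings are invented):

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two raters' categorical labels."""
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n      # raw agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(r1) | set(r2)
    p_exp = sum(c1[c] / n * c2[c] / n for c in cats)     # agreement by chance
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical Mason types assigned by two observers to 10 radiographs
obs1 = [1, 1, 2, 2, 3, 3, 1, 2, 3, 1]
obs2 = [1, 1, 2, 3, 3, 3, 1, 2, 2, 1]
print(round(cohen_kappa(obs1, obs2), 3))  # 0.697
```

    Raw agreement here is 80%, but kappa discounts the agreement two raters would reach by guessing with the same marginal frequencies, which is why it sits well below 0.8.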

  10. Ratio-scaling of listener preference of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian

    2005-01-01

    In the present study the Bradley-Terry-Luce (BTL) model was employed to investigate the unidimensionality of preference judgments made by 40 listeners on multichannel reproduced sound, a non-trivial assumption in the case of complex spatial sounds. Short musical excerpts played back in eight reproduction modes (mono, stereo and various multichannel formats) served as stimuli. On each trial, the task of the subjects was to choose the format they preferred, proceeding through all possible pairs of the eight reproduction modes. This experiment was replicated with four types of programme material (pop and classical music). As a main result, the BTL model was found to predict the choice frequencies well. This implies that listeners were able to integrate the complex nature of the sounds into a unidimensional preference judgment. It further implies the existence of a preference scale on which the reproduction modes …
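    The BTL fit described above can be sketched numerically: worth parameters are estimated from pairwise choice counts. The win matrix below is invented (three reproduction modes rather than eight), and the fit uses a simple minorization-maximization (MM) iteration.

```python
import numpy as np

def bradley_terry(wins: np.ndarray, iters: int = 200) -> np.ndarray:
    """MM estimate of Bradley-Terry worths from a pairwise win-count matrix.
    wins[i, j] = number of times item i was preferred over item j."""
    n = wins.shape[0]
    p = np.ones(n)
    comparisons = wins + wins.T            # n_ij: total comparisons of i vs j
    for _ in range(iters):
        denom = np.array([
            sum(comparisons[i, j] / (p[i] + p[j]) for j in range(n) if j != i)
            for i in range(n)
        ])
        p = wins.sum(axis=1) / denom       # standard MM update
        p /= p.sum()                       # normalize to a preference scale
    return p

# Hypothetical choice counts for three reproduction modes (mono, stereo, 5.0)
wins = np.array([[0, 2, 1],
                 [8, 0, 4],
                 [9, 6, 0]], dtype=float)
scale = bradley_terry(wins)
print(scale.argsort()[::-1])  # ordering from most to least preferred
```

    If such a model predicts the observed choice frequencies well, as the study reports, the preferences are consistent with a single underlying scale.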

  11. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    Directory of Open Access Journals (Sweden)

    Eiji Watanabe

    2018-03-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times, and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures in which people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by predictive coding theory is one of the bases of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  12. Giant Broad Line Regions in Dwarf Seyferts

    Indian Academy of Sciences (India)

    2016-01-27

    Jan 27, 2016 ... High angular resolution spectroscopy obtained with the Hubble Space Telescope (HST) has revealed a remarkable population of galaxies hosting dwarf Seyfert nuclei with an unusually large broad-line region (BLR). These objects are remarkable for two reasons. Firstly, the size of the BLR can, in some ...

  13. Education and Broad Concepts of Agency

    Science.gov (United States)

    Winch, Christopher

    2014-01-01

    Drawing on recent debates about the relationship between propositional and practical knowledge, this article is concerned with broad concepts of agency. Specifically, it is concerned with agency that involves the forming and putting into effect of intentions over relatively extended periods, particularly in work contexts (called, for want of a…

  14. Analysis of mammalian gene function through broad-based phenotypic screens across a consortium of mouse clinics

    DEFF Research Database (Denmark)

    de Angelis, Martin Hrabě; Nicholson, George; Selloum, Mohammed

    2015-01-01

    for the broad-based phenotyping of knockouts through a pipeline comprising 20 disease-oriented platforms. We developed new statistical methods for pipeline design and data analysis aimed at detecting reproducible phenotypes with high power. We acquired phenotype data from 449 mutant alleles, representing 320...

  15. Fatigue failure of materials under broad band random vibrations

    Science.gov (United States)

    Huang, T. C.; Lanz, R. W.

    1971-01-01

    The fatigue life of materials under the multifactor influence of broad-band random excitations has been investigated. Parameters that affect the fatigue life are postulated to be peak stress, variance of stress, and the natural frequency of the system. Experimental data were processed by a hybrid computer. Based on the experimental results and regression analysis, a best-fitting predictive model has been found. All values of the experimental fatigue lives lie within the 95% confidence intervals of the predicting equation.

  16. GeoTrust Hub: A Platform For Sharing And Reproducing Geoscience Applications

    Science.gov (United States)

    Malik, T.; Tarboton, D. G.; Goodall, J. L.; Choi, E.; Bhatt, A.; Peckham, S. D.; Foster, I.; Ton That, D. H.; Essawy, B.; Yuan, Z.; Dash, P. K.; Fils, G.; Gan, T.; Fadugba, O. I.; Saxena, A.; Valentic, T. A.

    2017-12-01

    Recent requirements of scholarly communication emphasize the reproducibility of scientific claims. Text-based research papers are considered poor mediums to establish reproducibility. Papers must be accompanied by "research objects", aggregations of digital artifacts that together with the paper provide an authoritative record of a piece of research. We will present GeoTrust Hub (http://geotrusthub.org), a platform for creating, sharing, and reproducing reusable research objects. GeoTrust Hub provides tools for scientists to create 'geounits', reusable research objects. Geounits are self-contained, annotated, and versioned containers that describe and package computational experiments in an efficient and light-weight manner. Geounits can be shared on public repositories such as HydroShare and FigShare and, using their respective APIs, reproduced on provisioned clouds. The latter feature enables science applications to have a lifetime beyond sharing, wherein they can be independently verified and trust be established as they are repeatedly reused. Through research use cases from several geoscience laboratories across the United States, we will demonstrate how the tools provided by GeoTrust Hub, along with HydroShare as its public repository for geounits, are advancing the state of reproducible research in the geosciences. For each use case, we will address different computational reproducibility requirements. Our first use case will be an example of setup reproducibility, which enables a scientist to set up and reproduce an output from a model with complex configuration and development environments. Our second use case will be an example of algorithm/data reproducibility, wherein a shared data science model/dataset can be substituted with an alternate one to verify model output results. Finally, we will show an example of interactive reproducibility, in which an experiment is dependent on specific versions of data to produce the result. Toward this we will use software and data …

  17. Semiautomated, Reproducible Batch Processing of Soy

    Science.gov (United States)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to manually transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: Users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (about 16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  18. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. 
Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
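    The claim that precision is limited primarily by {sup 14}C counting statistics follows from Poisson counting, where the relative uncertainty of N counts is 1/√N; a one-line sketch (the function name is ours):

```python
import math

def counts_for_precision(rel_precision: float) -> int:
    """Poisson counting: the relative uncertainty of N counts is 1/sqrt(N),
    so a target relative precision requires N = 1/precision^2 counts."""
    return math.ceil(1.0 / rel_precision ** 2)

print(counts_for_precision(0.005))  # 0.5% precision -> 40000 counts
```

    Reaching the quoted 0.5% precision therefore requires on the order of 40,000 radioisotope atoms to be counted, before any accelerator or ion-source instabilities are considered.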

  19. Quark/gluon jet discrimination: a reproducible analysis using R

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The power to discriminate between light-quark jets and gluon jets would have a huge impact on many searches for new physics at CERN and beyond. This talk will present a walk-through of the development of a prototype machine learning classifier for differentiating between quark and gluon jets at experiments like those at the Large Hadron Collider at CERN. A new fast feature selection method that combines information theory and graph analytics will be outlined. This method has found new variables that promise significant improvements in discrimination power. The prototype jet tagger is simple, interpretable, parsimonious, and computationally extremely cheap, and therefore might be suitable for use in trigger systems for real-time data processing. Nested stratified k-fold cross validation was used to generate robust estimates of model performance. The data analysis was performed entirely in the R statistical programming language, and is fully reproducible. The entire analysis workflow is data-driven, automated a...

  20. Fourier evaluation of broad Moessbauer spectra

    International Nuclear Information System (INIS)

    Vincze, I.

    1981-01-01

    It is shown by Fourier analysis of broad Moessbauer spectra that the even part of the distribution of the dominant hyperfine interaction (hyperfine field or quadrupole splitting) can be obtained directly, without using least-squares fitting procedures. The odd part of this distribution, correlated with other hyperfine parameters (e.g. isomer shift), can also be directly determined. Examples for amorphous magnetic and paramagnetic iron-based alloys are presented. (author)

  1. Eight-band k·p modeling of InAs/InGaAsSb type-II W-design quantum well structures for interband cascade lasers emitting in a broad range of mid infrared

    Energy Technology Data Exchange (ETDEWEB)

    Ryczko, K.; Sęk, G.; Misiewicz, J. [Institute of Physics, Wrocław University of Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wrocław (Poland)

    2013-12-14

    Band structure properties of type-II W-design AlSb/InAs/GaIn(As)Sb/InAs/AlSb quantum wells have been investigated theoretically, in a systematic manner, with respect to their use in the active region of interband cascade lasers emitting over a broad mid-infrared range, from below 3 to beyond 10 μm. An eight-band k·p approach has been utilized to calculate the electronic subbands. The fundamental optical transition energy and the corresponding oscillator strength have been determined as a function of the thickness of the InAs and GaIn(As)Sb layers and the composition of the latter. Active structures on two types of relevant substrates, GaSb and InAs, which introduce slightly different strain conditions, have been considered. Additionally, the effect of an external electric field has been taken into account to simulate the conditions occurring in operational devices. The results show that introducing arsenic as a fourth element into the valence-band well of the type-II W-design system, and then altering its composition, can efficiently enhance the transition oscillator strength and additionally allows the emission wavelength to be increased, which makes this solution promising for improved-performance, long-wavelength interband cascade lasers.

  2. Reproducing an extreme flood with uncertain post-event information

    Directory of Open Access Journals (Sweden)

    D. Fuentes-Andino

    2017-07-01

    Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood of Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that it is possible to estimate, in a trustworthy way considering large data uncertainties, this extreme 1998 flood discharge and the extent of the inundations that followed, using a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum-Cunge-Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peak, and more than 90% of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain; identifying these locations is useful for improving the model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak, and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa despite the absence of hydrometric gauging during the event. The method proposed here can be part of a Bayesian framework in which more events …
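    The GLUE procedure used in the flood study can be illustrated with a toy sketch. Everything below is invented for illustration (the stand-in "model" is a simple exponential recession, not TOPMODEL or LISFLOOD-FP); GLUE itself supplies only the pattern of Monte Carlo sampling, an informal likelihood, behavioral thresholding, and prediction bounds.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(theta, t):
    """Stand-in for the rainfall-runoff model: a simple exponential recession."""
    q0, k = theta
    return q0 * np.exp(-k * t)

# Synthetic 'post-event' observations with measurement noise
t = np.linspace(0.0, 10.0, 21)
observed = toy_model((100.0, 0.3), t) + rng.normal(0.0, 2.0, t.size)

# GLUE: Monte Carlo samples from broad uniform priors, an informal
# likelihood measure, and retention of the 'behavioral' sets only
samples = rng.uniform([50.0, 0.05], [150.0, 0.6], size=(5000, 2))
sims = np.array([toy_model(th, t) for th in samples])
likelihood = 1.0 / np.mean((sims - observed) ** 2, axis=1)
behavioral = likelihood > np.quantile(likelihood, 0.95)  # keep best 5% of runs

# Prediction bounds from the behavioral simulations (GLUE normally weights
# these by likelihood; plain quantiles are used here to keep the sketch short)
lower = np.quantile(sims[behavioral], 0.05, axis=0)
upper = np.quantile(sims[behavioral], 0.95, axis=0)
print("observations inside bounds:", np.mean((observed >= lower) & (observed <= upper)))
```

    The "more than 90% of the observed high-water marks within the uncertainty bounds" result above is exactly this kind of coverage check, applied to real post-event data.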

  3. On the Inclusion Relation of Reproducing Kernel Hilbert Spaces

    OpenAIRE

    Zhang, Haizhang; Zhao, Liang

    2011-01-01

    To help understand various reproducing kernels used in applied sciences, we investigate the inclusion relation of two reproducing kernel Hilbert spaces. Characterizations in terms of feature maps of the corresponding reproducing kernels are established. A full table of inclusion relations among widely-used translation invariant kernels is given. Concrete examples for Hilbert-Schmidt kernels are presented as well. We also discuss the preservation of such a relation under various operations of ...

  4. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature-three each from the domains of perception/action, memory, and language, respectively-and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as testing situation and prior recent experience with the experiment to yield highly robust effects.

  5. Layout for Assessing Dynamic Posture: Development, Validation, and Reproducibility.

    Science.gov (United States)

    Noll, Matias; Candotti, Cláudia Tarragô; da Rosa, Bruna Nichele; Sedrez, Juliana Adami; Vieira, Adriane; Loss, Jefferson Fagundes

    2016-01-01

    To determine the psychometric properties of the layout for assessing dynamic posture (LADy). The study was divided into 2 phases: (1) development of the instrument and (2) determination of validity and reproducibility. The LADy was designed to evaluate the position adopted in 9 dynamic postures. The results confirmed the validity and reproducibility of the instrument. From a total of 51 criteria assessing 9 postures, 1 was rejected. The reproducibility for each of the criteria was classified as moderate to excellent. The LADy constitutes a valid and reproducible instrument for the evaluation of dynamic postures in children 11 to 17 years old. It is low cost and applicable in the school environment.

  6. DISCOVERY OF BROAD MOLECULAR LINES AND OF SHOCKED MOLECULAR HYDROGEN FROM THE SUPERNOVA REMNANT G357.7+0.3: HHSMT, APEX, SPITZER , AND SOFIA OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Rho, J. [SETI Institute, 189 N. Bernardo Ave., Mountain View, CA 94043 (United States); Hewitt, J. W. [CRESST/University of Maryland, Baltimore County, Baltimore, MD 21250 and NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States); Bieging, J. [Steward Observatory, The University of Arizona, Tucson AZ 85721 (United States); Reach, W. T. [Universities Space Research Association, SOFIA Science Center, NASA Ames Research Center, MS 232, Moffett Field, CA 94034 (United States); Andersen, M. [Gemini Observatory, Casilla 603, La Serena (Chile); Güsten, R., E-mail: jrho@seti.org, E-mail: john.w.hewitt@unf.edu, E-mail: jbieging@as.arizona.edu, E-mail: wreach@sofia.usra.edu, E-mail: manderse@gemini.edu, E-mail: guesten@mpifr-bonn.mpg.de [Max Planck Institut für Radioastronomie, Auf dem Hugel 69, D-53121 Bonn (Germany)

    2017-01-01

    We report a discovery of shocked gas from the supernova remnant (SNR) G357.7+0.3. Our millimeter and submillimeter observations reveal broad molecular lines of CO(2-1), CO(3-2), CO(4-3), {sup 13}CO (2-1), {sup 13}CO (3-2), HCO{sup +}, and HCN using the Heinrich Hertz Submillimeter Telescope, the Arizona 12 m Telescope, APEX, and the MOPRA Telescope. The widths of the broad lines are 15-30 km s{sup −1}, and the detection of such broad lines is unambiguous, dynamical evidence that the SNR G357.7+0.3 is interacting with molecular clouds. The broad lines appear in extended regions (>4.5′ × 5′). We also present the detection of shocked H{sub 2} emission in the mid-infrared, but lacking ionic lines, using Spitzer/IRS observations mapping a few-arcminute area. The H{sub 2} excitation diagram shows a best fit with a two-temperature local thermal equilibrium model with temperatures of ∼200 and 660 K. We observed [C ii] at 158 μm and high-J CO(11-10) with the German Receiver for Astronomy at Terahertz Frequencies (GREAT) on the Stratospheric Observatory for Infrared Astronomy. The GREAT spectrum of [C ii], a 3σ detection, shows a broad line profile with a width of 15.7 km s{sup −1}, similar to those of the broad CO molecular lines. The line width of [C ii] implies that ionic lines can come from a low-velocity C-shock. Comparison of the H{sub 2} emission with shock models shows that a combination of two C-shock models is favored over a combination of C- and J-shocks or a single shock. We estimate the CO density, column density, and temperature using a RADEX model. The best-fit model, with n(H{sub 2}) = 1.7 × 10{sup 4} cm{sup −3}, N(CO) = 5.6 × 10{sup 16} cm{sup −2}, and T = 75 K, can reproduce the observed millimeter CO brightnesses.

  7. Flow characteristics at trapezoidal broad-crested side weir

    Directory of Open Access Journals (Sweden)

    Říha Jaromír

    2015-06-01

    Broad-crested side weirs have been the subject of numerous hydraulic studies; however, the flow field at the weir crest and in front of the weir in the approach channel has still not been fully described. Likewise, the discharge coefficient of broad-crested side weirs, whether slightly inclined towards the stream or lateral, has yet to be clearly determined. Experimental research was carried out to describe the flow characteristics at low Froude numbers in the approach flow channel for various combinations of in- and overflow discharges. Three side weir types with different oblique angles were studied. Their flow characteristics and discharge coefficients were analyzed and assessed based on the results obtained from extensive measurements performed on a hydraulic model. An empirical relation between the angle of side weir obliqueness, the Froude numbers in the up- and downstream channels, and the coefficient of obliqueness was derived.

  8. Prediction of lung tumour position based on spirometry and on abdominal displacement: Accuracy and reproducibility

    International Nuclear Information System (INIS)

    Hoisak, Jeremy D.P.; Sixel, Katharina E.; Tirona, Romeo; Cheung, Patrick C.F.; Pignol, Jean-Philippe

    2006-01-01

    Background and purpose: A simulation investigating the accuracy and reproducibility of a tumour motion prediction model over clinical time frames is presented. The model is formed from surrogate and tumour motion measurements, and used to predict the future position of the tumour from surrogate measurements alone. Patients and methods: Data were acquired from five non-small cell lung cancer patients, on 3 days. Measurements of respiratory volume by spirometry and abdominal displacement by a real-time position tracking system were acquired simultaneously with X-ray fluoroscopy measurements of superior-inferior tumour displacement. A model of tumour motion was established and used to predict future tumour position, based on surrogate input data. The calculated position was compared against true tumour motion as seen on fluoroscopy. Three different imaging strategies, pre-treatment, pre-fraction and intrafractional imaging, were employed in establishing the fitting parameters of the prediction model. The impact of each imaging strategy upon accuracy and reproducibility was quantified. Results: When establishing the predictive model using pre-treatment imaging, four of five patients exhibited poor interfractional reproducibility for either surrogate in subsequent sessions. Simulating the formulation of the predictive model prior to each fraction resulted in improved interfractional reproducibility. The accuracy of the prediction model was only improved in one of five patients when intrafractional imaging was used. Conclusions: Employing a prediction model established from measurements acquired at planning resulted in localization errors. Pre-fractional imaging improved the accuracy and reproducibility of the prediction model. Intrafractional imaging was of less value, suggesting that the accuracy limit of a surrogate-based prediction model is reached with once-daily imaging
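    A surrogate-based prediction model of the kind evaluated above can be sketched as a first-order (linear) fit of tumour position against the surrogate signal; the displacement numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical training data: abdominal displacement (mm) from the external
# marker and simultaneous tumour SI position (mm) from fluoroscopy
surrogate = np.array([0.0, 2.1, 4.0, 5.9, 8.1, 10.0])
tumour_si = np.array([0.0, 3.2, 6.1, 8.8, 12.2, 15.1])

# Fit the prediction model by least squares
slope, intercept = np.polyfit(surrogate, tumour_si, 1)

def predict_tumour(s):
    """Predict tumour SI position from the surrogate signal alone."""
    return slope * s + intercept

residual = tumour_si - predict_tumour(surrogate)
print("max fit error (mm):", round(np.abs(residual).max(), 2))
```

    The study's finding is essentially that the fitted slope and intercept drift between fractions, so re-fitting such a model from pre-fraction imaging keeps the residuals small, while a model fitted once at planning does not.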

  9. Stereographic measurement of orbital volume, a digital reproducible evaluation method.

    Science.gov (United States)

    Mottini, Matthias; Wolf, Christian A; Seyed Jafari, S Morteza; Katsoulis, Konstantinos; Schaller, Benoît

    2017-10-01

    To date, no standardised, reproducible method for measuring orbital volume has been available. This study therefore investigated the accuracy of a new measurement method that delineates the boundaries of the orbital cavity three-dimensionally (3D). To calculate orbital volume from axial CT slice images, segmentation of the orbital cavity and the bony skull was performed using Amira 3D Analysis Software. The files were then imported into the Blender program. The stereographic skull model was aligned on the Frankfurt horizontal plane and superposed according to defined anatomical reference points. The anterior sectional plane ran through the most posterior section of the lacrimal fossa and the farthest dorsal point of the anterior latero-orbital margin, positioned perpendicular to the Frankfurt horizontal plane. The volume of each orbital cavity was then determined automatically by the Blender program. Ten consecutive subjects (5 female, 5 male) with a mean age of 50.3±21.3 years were analysed. The first investigator reported a mean orbital volume of 20.24±1.01 cm3 in the first and 20.25±1.03 cm3 in the second evaluation, and the intraclass correlation coefficient (ICC) showed excellent intrarater agreement (ICC=0.997). The second investigator measured a mean orbital volume of 20.20±1.08 cm3, with excellent inter-rater agreement (ICC=0.994). This method provides a standardised and reproducible 3D approach to the measurement of orbital volume. Published by the BMJ Publishing Group Limited.
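    The agreement statistic reported above can be computed directly from paired measurements. Below is a minimal sketch of a single-measure, absolute-agreement ICC (the Shrout-Fleiss ICC(2,1) form) on synthetic orbital-volume data; the numbers are invented for illustration, and the study's own ICC variant may differ.

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random, absolute-agreement, single-measure ICC(2,1).
    ratings: array of shape (n_subjects, n_raters)."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)                       # per subject
    col_means = ratings.mean(axis=0)                       # per rater
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    sse = np.sum((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Synthetic repeated orbital-volume measurements (cm^3), two evaluations each:
vols = np.array([[19.1, 19.2], [20.4, 20.4], [21.0, 21.1],
                 [18.7, 18.6], [20.9, 21.0]])
icc = icc_2_1(vols)
print(f"ICC = {icc:.3f}")
```

    Close repeat measurements on subjects that genuinely differ yield an ICC near 1, matching the "excellent agreement" interpretation used in the abstract.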

  10. Broad-Spectrum Drugs Against Viral Agents

    Directory of Open Access Journals (Sweden)

    Jonathan P. Wong

    2008-09-01

    Full Text Available Development of antivirals has focused primarily on vaccines and on treatments for specific viral agents. Although effective, these approaches may be limited in situations where the etiologic agent is unknown or when the target virus has undergone mutation, recombination or reassortment. Augmentation of the innate immune response may be an effective alternative for disease amelioration. Nonspecific, broad-spectrum immune responses can be induced by double-stranded RNAs (dsRNAs) such as poly(ICLC), or by oligonucleotides (ODNs) containing unmethylated deoxycytidyl-deoxyguanosinyl (CpG) motifs. These may offer protection against various bacterial and viral pathogens regardless of their genetic makeup, zoonotic origin or drug resistance.

  11. The broad utility of Trizact diamond tile

    Science.gov (United States)

    Gagliardi, John I.; Romero, Vincent D.; Sventek, Bruce; Zu, Lijun

    2017-10-01

    Sample finishing data from a broad range of materials (glasses, sapphire, silicon carbide, silicon, zirconium oxide, lithium tantalate, and flooring materials) are shown to be effectively processed with Trizact™ Diamond Tile (TDT). These data should give the reader an understanding of what to expect when using TDT on hard-to-grind or brittle materials. The keys to maintaining effective TDT pad wear rates, and therefore cost-effective and stable processes, are described as managing (1) the proper lubricant flow rate for glasses and silicon-type materials and (2) the conditioning-particle concentration for harder-to-grind materials.

  12. Organ-on-a-Chip Technology for Reproducing Multiorgan Physiology.

    Science.gov (United States)

    Lee, Seung Hwan; Sung, Jong Hwan

    2018-01-01

    In the drug development process, the accurate prediction of drug efficacy and toxicity is important in order to reduce the cost, labor, and effort involved. For this purpose, conventional 2D cell culture models are used in the early phase of drug development. However, the differences between in vitro and in vivo systems have caused the failure of drugs in the later phases of the drug-development process. Therefore, there is a need for a novel in vitro model system that can provide accurate information for evaluating drug efficacy and toxicity through a closer recapitulation of the in vivo system. Recently, the idea of using microtechnology to mimic the microscale tissue environment has become widespread, leading to the development of the "organ-on-a-chip." These systems have been further developed to realize multiorgan models that mimic interactions between multiple organs. These advancements are ongoing and are aimed at ultimately developing "body-on-a-chip" or "human-on-a-chip" devices for predicting the response of the whole body. This review summarizes recently developed organ-on-a-chip technologies and their applications for reproducing multiorgan functions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Repeatability and reproducibility of Population Viability Analysis (PVA) and the implications for threatened species management

    Directory of Open Access Journals (Sweden)

    Clare Morrison

    2016-08-01

    Full Text Available Conservation triage focuses on prioritizing species, populations or habitats based on urgency, biodiversity benefits, recovery potential, and cost. Population Viability Analysis (PVA) is frequently used in population-focused conservation prioritizations. The critical nature of many of these management decisions requires that PVA models be repeatable and reproducible if they are to rank species and/or populations reliably and quantitatively. This paper assessed the repeatability and reproducibility of a subset of previously published PVA models. We attempted to rerun baseline models from 90 publicly available PVA studies published between 2000 and 2012 using the two most common PVA modelling software programs, VORTEX and RAMAS-GIS. Forty percent (n = 36) failed, 50% (n = 45) were both repeatable and reproducible, and 10% (n = 9) had missing baseline models. Repeatability was not linked to taxa, IUCN category, PVA program version used, year published or the quality of the publication outlet, suggesting that the problem is systemic within the discipline. Complete and systematic presentation of PVA parameters and results is needed to ensure that the scientific input into conservation planning is both robust and reliable, thereby increasing the chances of making decisions that are both beneficial and defensible. The implications for conservation triage may be far-reaching if population viability models cannot be reproduced with confidence, thus undermining their intended value.

  14. Homotopy deform method for reproducing kernel space for ...

    Indian Academy of Sciences (India)

    In this paper, the combination of homotopy deform method (HDM) and simplified reproducing kernel method (SRKM) is introduced for solving the boundary value problems (BVPs) of nonlinear differential equations. The solution methodology is based on Adomian decomposition and reproducing kernel method (RKM).

  15. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  16. Completely reproducible description of digital sound data with cellular automata

    International Nuclear Information System (INIS)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-01-01

    A novel method for the compressive and completely reproducible description of digital sound data by means of the rule dynamics of cellular automata (CA) is proposed. The digital data of spoken words and music recorded in the standard compact disc format are reproduced completely by this method, using only two rules in a one-dimensional CA, without loss of information.

  17. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature (three each from the domains of perception/action, memory, and language) and found that they are highly reproducible. Not only can

  18. Extreme Variability in a Broad Absorption Line Quasar

    Energy Technology Data Exchange (ETDEWEB)

    Stern, Daniel; Jun, Hyunsung D. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Mail Stop 169-221, Pasadena, CA 91109 (United States); Graham, Matthew J.; Djorgovski, S. G.; Donalek, Ciro; Drake, Andrew J.; Mahabal, Ashish A.; Steidel, Charles C. [California Institute of Technology, 1200 E. California Boulevard, Pasadena, CA 91125 (United States); Arav, Nahum; Chamberlain, Carter [Department of Physics, Virginia Tech, Blacksburg, VA 24061 (United States); Barth, Aaron J. [Department of Physics and Astronomy, 4129 Frederick Reines Hall, University of California, Irvine, CA 92697 (United States); Glikman, Eilat, E-mail: daniel.k.stern@jpl.nasa.gov [Department of Physics, Middlebury College, Middlebury, VT 05753 (United States)

    2017-04-20

    CRTS J084133.15+200525.8 is an optically bright quasar at z = 2.345 that has shown extreme spectral variability over the past decade. Photometrically, the source had a visual magnitude of V ∼ 17.3 between 2002 and 2008. Then, over the following five years, the source slowly brightened by approximately one magnitude, to V ∼ 16.2. Only ∼1 in 10,000 quasars show such extreme variability, as quantified by the extreme parameters derived for this quasar assuming a damped random walk model. A combination of archival and newly acquired spectra reveal the source to be an iron low-ionization broad absorption line quasar with extreme changes in its absorption spectrum. Some absorption features completely disappear over the 9 years of optical spectra, while other features remain essentially unchanged. We report the first definitive redshift for this source, based on the detection of broad H α in a Keck/MOSFIRE spectrum. Absorption systems separated by several 1000 km s{sup −1} in velocity show coordinated weakening in the depths of their troughs as the continuum flux increases. We interpret the broad absorption line variability to be due to changes in photoionization, rather than due to motion of material along our line of sight. This source highlights one sort of rare transition object that astronomy will now be finding through dedicated time-domain surveys.

  19. Broad spectrum antiangiogenic treatment for ocular neovascular diseases.

    Directory of Open Access Journals (Sweden)

    Ofra Benny

    2010-09-01

    Full Text Available Pathological neovascularization is a hallmark of late-stage neovascular (wet) age-related macular degeneration (AMD) and the leading cause of blindness in people over the age of 50 in the western world. Treatments focus on suppression of choroidal neovascularization (CNV), while currently approved therapies are limited to inhibiting vascular endothelial growth factor (VEGF) exclusively. However, this treatment does not address the underlying cause of AMD, and the loss of VEGF's neuroprotective effect is a potential side effect. Therapy that targets the key processes in AMD (pathological neovascularization, vessel leakage and inflammation) could bring a major shift in the approach to disease treatment and prevention. In this study we demonstrate the efficacy of such broad-spectrum antiangiogenic therapy in a mouse model of AMD. Lodamin, a polymeric formulation of TNP-470, is a potent broad-spectrum antiangiogenic drug. Lodamin significantly reduced key processes involved in AMD progression, as demonstrated in mice and rats. Its suppressive effects on angiogenesis, vascular leakage and inflammation were studied in a wide array of assays including a Matrigel assay, delayed-type hypersensitivity (DTH), a Miles assay, laser-induced CNV and a corneal micropocket assay. Lodamin significantly suppressed the secretion of various pro-inflammatory cytokines in the CNV lesion, including monocyte chemotactic protein-1 (MCP-1/Ccl2). Importantly, Lodamin was found to regress established CNV lesions, unlike soluble fms-like tyrosine kinase-1 (sFlt-1). The drug was found to be safe in mice, with little toxicity, as demonstrated by electroretinography (ERG) assessing retinal function and by histology. Lodamin was identified as a first-in-class, broad-spectrum antiangiogenic drug that can be administered orally or locally to treat corneal and retinal neovascularization.
Several unique properties make Lodamin especially beneficial for ophthalmic

  20. Spectrophotometry of six broad absorption line QSOs

    Science.gov (United States)

    Junkkarinen, Vesa T.; Burbidge, E. Margaret; Smith, Harding E.

    1987-01-01

    Spectrophotometric observations of six broad absorption-line QSOs (BALQSOs) are presented. The continua and emission lines are compared with those in the spectra of QSOs without BALs. A statistically significant difference is found in the emission-line intensity ratio for (N V 1240-A)/(C IV 1549-A). The median value of (N V)/(C IV) for the BALQSOs is two to three times the median for QSOs without BALs. The absorption features of the BALQSOs are described, and the column densities and limits on the ionization structure of the BAL region are discussed. If the dominant ionization mechanism is photoionization, then it is likely that either the ionizing spectrum is steep or the abundances are considerably different from solar. Collisional ionization may be a significant factor, but it cannot totally dominate the ionization rate.

  1. Study on broad beam heavy ion CT

    International Nuclear Information System (INIS)

    Ohno, Yumiko; Kohno, Toshiyuki; Sasaki, Hitomi; Nanbu, S.; Kanai, Tatsuaki

    2003-01-01

    To perform heavy ion radiotherapy more precisely, it is important to know the distribution of electron density in the human body, which is closely related to the range of charged particles. From a heavy ion CT image, we can directly obtain the 2-D distribution of electron density in a sample. For this purpose, we have developed a broad-beam heavy ion CT system. The electron density was obtained using several kinds of solution targets. The dependence of the spatial resolution on target size and beam species was also estimated, using cylindrical targets of 40, 60 and 80 mm in diameter, each with a hole of 10 mm in diameter at its center. (author)

  2. Ceftaroline: a new broad-spectrum cephalosporin.

    Science.gov (United States)

    Lim, Lauren; Sutton, Elizabeth; Brown, Jack

    2011-03-15

    The pharmacokinetics, pharmacodynamics, clinical efficacy, and safety of ceftaroline are reviewed. Ceftaroline, a new broad-spectrum antibiotic, is approved for the treatment of complicated skin and skin structure infections (cSSSIs) and community-acquired pneumonia (CAP). This β-lactam antibiotic has extended activity against gram-positive organisms and is also active against common gram-negative organisms. The drug's spectrum of activity includes both methicillin-resistant Staphylococcus aureus and multidrug-resistant Streptococcus pneumoniae. However, its activity against extended-spectrum β-lactamase-producing bacteria is limited; these bacteria, particularly those that express AmpC β-lactamase, greatly reduce the activity of ceftaroline. The prodrug (ceftaroline fosamil) is rapidly converted to the active form (ceftaroline) in plasma. The drug exhibits linear pharmacokinetics, and its efficacy is pharmacodynamically best correlated with the percentage of time that free drug concentrations remain above the minimum inhibitory concentration. Ceftaroline's safety profile is similar to that of other cephalosporins, with minimal adverse drug reactions, most of which are considered mild. Available pharmacokinetic, animal, and clinical studies have found that ceftaroline has reasonable efficacy and tolerability but have also revealed that dosing regimen modifications may be needed in patients with moderate-to-severe renal impairment. The recommended dosage of ceftaroline for the treatment of cSSSIs and CAP is 600 mg infused intravenously over 60 minutes every 12 hours. The recommended duration of therapy is 5-14 days for cSSSIs and 5-7 days for CAP. Additional Phase III studies are currently underway. Ceftaroline is a new broad-spectrum cephalosporin indicated for the treatment of cSSSIs and CAP caused by susceptible gram-positive and gram-negative organisms.

  3. Energy determines broad pattern of plant distribution in Western Himalaya.

    Science.gov (United States)

    Panda, Rajendra M; Behera, Mukunda Dev; Roy, Partha S; Biradar, Chandrashekhar

    2017-12-01

    Several factors describe the broad pattern of diversity in plant species distribution. We explore these determinants of species richness in the Western Himalayas by relating high-resolution species data available for the area to energy, water, physiography and anthropogenic disturbance. The floral data comprise 1279 species from 1178 spatial locations and 738 sample plots of a national database. We evaluated their correlation with eight environmental variables, selected on the basis of correlation coefficients and principal component loadings, using both linear (structural equation model) and nonlinear (generalised additive model) techniques. There were 645 genera and 176 families, including 815 herbs, 213 shrubs, 190 trees, and 61 lianas. The nonlinear model explained a maximum deviance of 67.4% and showed the dominant contribution of climate to species richness, with a 59% share. Energy variables (potential evapotranspiration and temperature seasonality) explained the deviance better than water variables (aridity index and precipitation of the driest quarter) did. Temperature seasonality had the maximum impact on species richness. The structural equation model confirmed the results of the nonlinear model, but less efficiently. The mutual influences of the climatic variables were found to affect the predictions of the model significantly. To our knowledge, the 67.4% deviance found in the species richness pattern is one of the highest values reported in mountain studies. Broadly, climate, described by water-energy dynamics, provides the best explanation for the species richness pattern. Both modelling approaches supported the same conclusion: energy is the best predictor of species richness. The dry and cold conditions of the region account for the dominant contribution of energy to species richness.

  4. Plant STAND P-loop NTPases: a current perspective of genome distribution, evolution, and function : Plant STAND P-loop NTPases: genomic organization, evolution, and molecular mechanism models contribute broadly to plant pathogen defense.

    Science.gov (United States)

    Arya, Preeti; Acharya, Vishal

    2018-02-01

    STAND P-loop NTPases are a common weapon used by plants and other organisms from all three kingdoms of life to defend themselves against pathogen invasion. The purpose of this study is to comprehensively review the latest findings on plant STAND P-loop NTPases related to their genomic distribution, evolution, and mechanism of action. Earlier, plant STAND P-loop NTPases were thought to comprise only NBS-LRRs/AP-ATPase/NB-ARC ATPases. However, recent findings suggest that the genomes of early green plants comprised two types of STAND P-loop NTPases: (1) mammalian-type NACHT NTPases and (2) NBS-LRRs. Moreover, the YchF subfamily (unconventional G proteins and members of the P-loop NTPases) has been reported to be exceptionally involved in biotic stress (in the case of Oryza sativa), and is thereby a novel member of the STAND P-loop NTPases in green plants. Lineage-specific expansion and genome duplication events are responsible for the abundance of plant STAND P-loop NTPases; a "moderate tandem and low segmental duplication" trajectory is followed in the majority of plant species, with few exceptions (equal contribution of tandem and segmental duplication). Over the past decades, systematic research into NBS-LRR function has supported the direct recognition of pathogens or pathogen effectors via the latest proposed models, the 'integrated decoy' and 'sensor domains' models. Here, we integrate recently published findings with the previous literature on the genomic distribution, evolution, and proposed functional molecular mechanisms of plant STAND P-loop NTPases.

  5. In-vitro accuracy and reproducibility evaluation of probing depth measurements of selected periodontal probes

    Directory of Open Access Journals (Sweden)

    K.N. Al Shayeb

    2014-01-01

    Conclusion: Depth measurements with the Chapple UB-CF-15 probe were more accurate and reproducible compared to measurements with the Vivacare TPS and Williams 14 W probes. This in vitro model may be useful for intra-examiner calibration or clinician training prior to the clinical evaluation of patients or in longitudinal studies involving periodontal evaluation.

  6. A Low-Cost Anthropometric Walking Robot for Reproducing Gait Lab Data

    Directory of Open Access Journals (Sweden)

    Rogério Eduardo da Silva Santana

    2008-01-01

    Full Text Available Human gait analysis is one of the resources that may be used in the study and treatment of pathologies of the locomotive system. This paper deals with the modelling and control aspects of the design, construction and testing of a biped walking robot conceived to reproduce, to a limited extent, the human gait. The robot's dimensions were chosen to guarantee anthropomorphic proportions and thus to help health professionals in gait studies. The robot was assembled from low-cost components and can reproduce, in an assisted way, real gait patterns generated from data previously acquired in gait laboratories. Simulated and experimental results are presented to demonstrate the ability of the biped robot to reproduce normal and pathological human gait.

  7. The NorWeST Summer Stream Temperature Model and Scenarios for the Western U.S.: A Crowd-Sourced Database and New Geospatial Tools Foster a User Community and Predict Broad Climate Warming of Rivers and Streams

    Science.gov (United States)

    Isaak, Daniel J.; Wenger, Seth J.; Peterson, Erin E.; Ver Hoef, Jay M.; Nagel, David E.; Luce, Charles H.; Hostetler, Steven W.; Dunham, Jason B.; Roper, Brett B.; Wollrab, Sherry P.; Chandler, Gwynne L.; Horan, Dona L.; Parkes-Payne, Sharon

    2017-11-01

    Thermal regimes are fundamental determinants of aquatic ecosystems, which makes description and prediction of temperatures critical during a period of rapid global change. The advent of inexpensive temperature sensors dramatically increased monitoring in recent decades, and although most monitoring is done by individuals for agency-specific purposes, collectively these efforts constitute a massive distributed sensing array that generates an untapped wealth of data. Using the framework provided by the National Hydrography Dataset, we organized temperature records from dozens of agencies in the western U.S. to create the NorWeST database that hosts >220,000,000 temperature recordings from >22,700 stream and river sites. Spatial-stream-network models were fit to a subset of those data that described mean August water temperatures (AugTw) during 63,641 monitoring site-years to develop accurate temperature models (r2 = 0.91; RMSPE = 1.10°C; MAPE = 0.72°C), assess covariate effects, and make predictions at 1 km intervals to create summer climate scenarios. AugTw averaged 14.2°C (SD = 4.0°C) during the baseline period of 1993-2011 in 343,000 km of western perennial streams but trend reconstructions also indicated warming had occurred at the rate of 0.17°C/decade (SD = 0.067°C/decade) during the 40 year period of 1976-2015. Future scenarios suggest continued warming, although variation will occur within and among river networks due to differences in local climate forcing and stream responsiveness. NorWeST scenarios and data are available online in user-friendly digital formats and are widely used to coordinate monitoring efforts among agencies, for new research, and for conservation planning.

  8. REPRODUCIBLE DRUG REPURPOSING: WHEN SIMILARITY DOES NOT SUFFICE.

    Science.gov (United States)

    Guney, Emre

    2017-01-01

    Repurposing existing drugs for new uses has attracted considerable attention over the past years. To identify potential candidates that could be repositioned for a new indication, many studies make use of chemical, target, and side effect similarity between drugs to train classifiers. Despite promising prediction accuracies of these supervised computational models, their use in practice, such as for rare diseases, is hindered by the assumption that there are already known and similar drugs for a given condition of interest. In this study, using publicly available data sets, we question the prediction accuracies of supervised approaches based on drug similarity when the drugs in the training and the test set are completely disjoint. We first build a Python platform to generate reproducible similarity-based drug repurposing models. Next, we show that, while a simple chemical, target, and side effect similarity based machine learning method can achieve good performance on the benchmark data set, the prediction performance drops sharply when the drugs in the folds of the cross-validation do not overlap and the similarity information within the training and test sets is used independently. These intriguing results suggest revisiting the assumptions underlying the validation scenarios of similarity-based methods and underline the need for unsupervised approaches to identify novel drug uses inside the unexplored pharmacological space. We make the digital notebook containing the Python code to replicate our analysis, which involves the drug repurposing platform based on machine learning models and the proposed disjoint cross fold generation method, freely available at github.com/emreg00/repurpose.
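    The disjoint cross-validation idea described in this abstract (no drug may appear in both the training and test sides of a pairwise similarity model) can be sketched as follows. This generic grouping utility is an illustration under assumed data shapes, not the code from the github.com/emreg00/repurpose repository.

```python
import itertools
import random

def disjoint_folds(drugs, n_folds=3, seed=0):
    """Split drugs into folds, then build train/test splits of drug PAIRS
    such that no drug in a test pair ever appears in a training pair."""
    rng = random.Random(seed)
    drugs = list(drugs)
    rng.shuffle(drugs)
    folds = [drugs[i::n_folds] for i in range(n_folds)]
    for test_drugs in folds:
        train_drugs = [d for d in drugs if d not in test_drugs]
        train = list(itertools.combinations(train_drugs, 2))  # pairs of known drugs
        test = list(itertools.combinations(test_drugs, 2))    # pairs of held-out drugs
        yield train, test

# The key property: training and test sets share no drugs at all.
drugs = [f"drug{i}" for i in range(9)]
for train, test in disjoint_folds(drugs):
    train_set = {d for pair in train for d in pair}
    test_set = {d for pair in test for d in pair}
    assert train_set.isdisjoint(test_set)
```

    In the ordinary (overlapping) setting, a pair in the test fold can share a drug with many training pairs, which leaks similarity information; the disjoint construction above removes that leakage, which is what causes the sharp performance drop the study reports.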

  9. The reproducibility of random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    RAPD) profiles of Streptococcus thermophilus strains by using the polymerase chain reaction (PCR). Several factors can cause the amplification of false and non reproducible bands in the RAPD profiles. We tested three primers, OPI-02 MOD, ...

  10. Homotopy deform method for reproducing kernel space for ...

    Indian Academy of Sciences (India)

    2016-09-23

    Sep 23, 2016 ... Nonlinear differential equations; the homotopy deform method; the simplified reproducing kernel ... an equivalent integro differential equation. ... an algorithm for solving nonlinear multipoint BVPs by combining homotopy perturbation and variational iteration methods. Most recently, Duan and Rach [12].

  11. Homotopy deform method for reproducing kernel space for ...

    Indian Academy of Sciences (India)

    2016-09-23

    Homotopy deform method for reproducing kernel space for nonlinear boundary value problems. MIN-QIANG XU and YING-ZHEN LIN. School of Science, Zhuhai Campus, Beijing Institute of Technology, ...

  12. Transition questions in clinical practice - validity and reproducibility

    DEFF Research Database (Denmark)

    Lauridsen, Henrik Hein

    2008-01-01

    Transition questions in clinical practice - validity and reproducibility. Lauridsen HH1, Manniche C3, Grunnet-Nilsson N1, Hartvigsen J1,2. 1 Clinical Locomotion Science, Institute of Sports Science and Clinical Biomechanics, University of Southern Denmark, Odense, Denmark. e-mail: hlauridsen....... One way to determine the relevance of change scores is through the use of transition questions (TQs) that assess patients' retrospective perception of treatment effect. However, little is known about the validity and reproducibility of TQs. The objectives of this study were to explore aspects of construct validity and reproducibility of a TQ and make proposals for standardised use. One hundred and ninety-one patients with low back pain and/or leg pain were followed over an 8-week period, receiving 3 disability and 2 pain questionnaires together with a 7-point TQ. Reproducibility was determined using

  13. Case Studies and Challenges in Reproducibility in the Computational Sciences

    OpenAIRE

    Arabas, Sylwester; Bareford, Michael R.; de Silva, Lakshitha R.; Gent, Ian P.; Gorman, Benjamin M.; Hajiarabderkani, Masih; Henderson, Tristan; Hutton, Luke; Konovalov, Alexander; Kotthoff, Lars; McCreesh, Ciaran; Nacenta, Miguel A.; Paul, Ruma R.; Petrie, Karen E. J.; Razaq, Abdul

    2014-01-01

    This paper investigates the reproducibility of computational science research and identifies key challenges facing the community today. It is the result of the First Summer School on Experimental Methodology in Computational Science Research (https://blogs.cs.st-andrews.ac.uk/emcsr2014/). First, we consider how to reproduce experiments that involve human subjects, and in particular how to deal with different ethics requirements at different institutions. Second, we look at whether parallel an...

  14. Broad-Band Activatable White-Opsin.

    Directory of Open Access Journals (Sweden)

    Subrata Batabyal

Full Text Available Currently, the use of optogenetic sensitization of retinal cells combined with activation/inhibition has the potential to be an alternative to retinal implants that would require electrodes inside every single neuron for high visual resolution. However, clinical translation of optogenetic activation for restoration of vision suffers from the drawback that the narrow spectral sensitivity of an opsin requires active stimulation by a blue laser or a light emitting diode with much higher intensities than ambient light. In order to allow an ambient light-based stimulation paradigm, we report the development of a 'white-opsin' that has broad spectral excitability in the visible spectrum. Cells sensitized with white-opsin showed an order of magnitude higher excitability under white light than under narrow-band light components alone. Further, cells sensitized with white-opsin produced a photocurrent that was five times higher than Channelrhodopsin-2 under similar photo-excitation conditions. The use of fast white-opsin may allow opsin-sensitized neurons in a degenerated retina to exhibit a higher sensitivity to ambient white light. This property, therefore, significantly lowers the activation threshold, in contrast to conventional approaches that use narrow-band opsins and intense narrow-band light to activate cellular stimulation.

  15. Novel application of a tissue-engineered collagen-based three-dimensional bio-implant in a large tendon defect model: a broad-based study with high value in translational medicine.

    Science.gov (United States)

    Meimandi-Parizi, Abdolhamid; Oryan, Ahmad; Moshiri, Ali; Silver, Ian A

    2013-08-01

This study was designed to investigate the effectiveness of a novel tissue-engineered three-dimensional collagen implant on healing of a large tendon-defect model, in vivo. Forty rabbits were divided into two equal groups: treated and control. A 2 cm full-thickness gap was created in the left Achilles tendons of all the rabbits. To maintain the gap at the desired length (2 cm), a Kessler suture was anchored within the proximal and distal ends of the remaining tendon. In the treated group a collagen implant was inserted in the gap while in the control group the gap was left unfilled. At weekly intervals the animals were examined clinically and their Achilles tendons tested bioelectrically. The hematological parameters and the serum Platelet-Derived Growth Factor of the animals were analyzed at 60 days post injury (DPI) immediately prior to euthanasia. Their injured (left) and normal contralateral Achilles tendons were harvested and examined at gross morphologic level before being subjected to biomechanical testing, and biophysical and biochemical analysis. The treated animals showed superior weight-bearing and greater physical activity than their controls. New dense tendinous tissue with a transverse diameter comparable to that of intact tendons filled the defect area of the treated tendons and had entirely replaced the collagen implant, at 60 DPI. In control lesions the defect was filled with loose areolar connective tissue similar to subcutaneous fascia. Treatment significantly improved the electrical resistance, dry matter, hydroxyproline content, water uptake and water delivery characteristics of the healing tissue, as well as maximum load, yield load, maximum stress, yield stress and modulus of elasticity of the injured treated tendons compared to those of the control tendons (P < 0.05). Use of this three-dimensional collagen implant improved the healing of large tendon defects in rabbits. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  17. Spiking Neural Network With Distributed Plasticity Reproduces Cerebellar Learning in Eye Blink Conditioning Paradigms.

    Science.gov (United States)

    Antonietti, Alberto; Casellato, Claudia; Garrido, Jesús A; Luque, Niceto R; Naveros, Francisco; Ros, Eduardo; D' Angelo, Egidio; Pedrocchi, Alessandra

    2016-01-01

In this study, we defined a realistic cerebellar model through the use of artificial spiking neural networks, testing it in computational simulations that reproduce associative motor tasks in multiple sessions of acquisition and extinction. Using evolutionary algorithms, we tuned the cerebellar microcircuit to find the near-optimal plasticity mechanism parameters that best reproduced human-like behavior in eye blink classical conditioning, one of the most extensively studied paradigms related to the cerebellum. We used two models: one with only the cortical plasticity and another including two additional plasticity sites at nuclear level. First, both spiking cerebellar models were able to reproduce real human behaviors well, in terms of both "timing" and "amplitude", expressing rapid acquisition, stable late acquisition, rapid extinction, and faster reacquisition of an associative motor task. Even though the model with only the cortical plasticity site showed good learning capabilities, the model with distributed plasticity produced faster and more stable acquisition of conditioned responses in the reacquisition phase. This behavior is explained by the effect of the nuclear plasticities, which have slow dynamics and can express memory consolidation and saving. We showed how the spiking dynamics of multiple interactive neural mechanisms implicitly drive multiple essential components of complex learning processes. This study presents a very advanced computational model, developed jointly by biomedical engineers, computer scientists, and neuroscientists. Given its realistic features, the proposed model can provide confirmations and suggestions about neurophysiological and pathological hypotheses and can be used in challenging clinical applications.
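
Models of this kind are built from populations of simple spiking units. As a minimal illustration of the sort of neuron model involved, here is a toy leaky integrate-and-fire simulation; all parameters are illustrative and not taken from the study.

```python
# Toy leaky integrate-and-fire (LIF) neuron, Euler-integrated.
# A minimal stdlib sketch of the kind of spiking unit such cerebellar
# models are built from; parameters are illustrative, not the paper's.

def simulate_lif(i_input=1.5, tau_ms=20.0, v_thresh=1.0, v_reset=0.0,
                 dt_ms=1.0, steps=200):
    """Return spike times (ms) for a constant input current."""
    v = 0.0
    spikes = []
    for step in range(steps):
        # Membrane potential relaxes toward the input-driven steady state.
        v += dt_ms * (-v + i_input) / tau_ms
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt_ms)
            v = v_reset            # hard reset after each spike
    return spikes

spike_times = simulate_lif()
print(len(spike_times), "spikes in 200 ms")
```

With a steady-state potential of 1.5 against a threshold of 1.0, the unit fires regularly (roughly every 22 ms here); plasticity rules in a full model would then adjust synaptic weights based on such spike trains.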

  18. Arctic Change Information for a Broad Audience

    Science.gov (United States)

    Soreide, N. N.; Overland, J. E.; Calder, J.

    2002-12-01

Demonstrable environmental changes have occurred in the Arctic over the past three decades. NOAA's Arctic Theme Page is a rich resource web site focused on high latitude studies and the Arctic, with links to widely distributed data and information focused on the Arctic. Included is a collection of essays on relevant topics by experts in Arctic research. The website has proven useful to a wide audience, including scientists, students, teachers, decision makers and the general public, as indicated through recognition by USA Today, Science magazine, etc. (http://www.arctic.noaa.gov) Working jointly with NSF and the University of Washington's Polar Science Center as part of the Study of Environmental Arctic Change (SEARCH) program, NOAA has developed a website for access to pan-Arctic time series spanning diverse data types including climate indices, atmospheric, oceanic, sea ice, terrestrial, biological and fisheries. Modest analysis functions and more detailed analysis results are provided. (http://www.unaami.noaa.gov/). This paper will describe development of an Arctic Change Detection status website to provide a direct and comprehensive view of previous and ongoing change in the Arctic for a broad climate community. For example, composite metrics are developed using principal component analysis based on 86 multivariate pan-Arctic time series for seven data types. Two of these metrics can be interpreted as a regime change/trend component and an interdecadal component. Changes can also be visually observed through tracking of 28 separate biophysical indicators. Results will be presented in the form of a web site with relevant, easily understood, value-added knowledge backed by peer review from Arctic scientists and scientific journals.
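
The composite metrics described here are essentially leading principal components of standardized indicator time series. A stdlib-only sketch of that construction, with power iteration standing in for a full PCA routine; the three indicator series below are invented, not the 86 real ones.

```python
# Composite "change" metric = first principal component of several
# standardized time series. Pure-stdlib power iteration; toy data.
import math

def standardize(series):
    n = len(series)
    mean = sum(series) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return [(x - mean) / sd for x in series]

def first_pc(rows):
    """rows: standardized series (variables x time). Returns the leading
    eigenvector of their covariance matrix via power iteration."""
    k, n = len(rows), len(rows[0])
    cov = [[sum(rows[i][t] * rows[j][t] for t in range(n)) / n
            for j in range(k)] for i in range(k)]
    v = [1.0] * k
    for _ in range(200):
        w = [sum(cov[i][j] * v[j] for j in range(k)) for i in range(k)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Three toy "pan-Arctic" indicators sharing one common trend.
t = list(range(10))
series = [standardize([0.5 * x + 1 for x in t]),
          standardize([0.4 * x - 2 for x in t]),
          standardize([-0.6 * x + 3 for x in t])]  # anti-correlated indicator
loadings = first_pc(series)
composite = [sum(l * s[i] for l, s in zip(loadings, series))
             for i in range(len(t))]
```

Because the third indicator is anti-correlated, its loading gets the opposite sign, and the composite recovers the shared trend, which is the sense in which a single metric can summarize many indicators.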

  19. Broadly sampled multigene trees of eukaryotes

    Directory of Open Access Journals (Sweden)

    Logsdon John M

    2008-01-01

Full Text Available Abstract Background Our understanding of the eukaryotic tree of life and the tremendous diversity of microbial eukaryotes is in flux as additional genes and diverse taxa are sampled for molecular analyses. Despite instability in many analyses, there is an increasing trend to classify eukaryotic diversity into six major supergroups: the 'Amoebozoa', 'Chromalveolata', 'Excavata', 'Opisthokonta', 'Plantae', and 'Rhizaria'. Previous molecular analyses have often suffered from either a broad taxon sampling using only single-gene data or have used multigene data with a limited sample of taxa. This study has two major aims: (1) to place taxa represented by 72 sequences, 61 of which have not been characterized previously, onto a well-sampled multigene genealogy, and (2) to evaluate the support for the six putative supergroups using two taxon-rich data sets and a variety of phylogenetic approaches. Results The inferred trees reveal strong support for many clades that also have defining ultrastructural or molecular characters. In contrast, we find limited to no support for most of the putative supergroups, as only the 'Opisthokonta' receive strong support in our analyses. The supergroup 'Amoebozoa' has only moderate support, whereas the 'Chromalveolata', 'Excavata', 'Plantae', and 'Rhizaria' receive very limited or no support. Conclusion Our analytical approach substantiates the power of increased taxon sampling in placing diverse eukaryotic lineages within well-supported clades. At the same time, this study indicates that the six-supergroup hypothesis of higher-level eukaryotic classification is likely premature. The use of a taxon-rich data set with 105 lineages, which still includes only a small fraction of the diversity of microbial eukaryotes, fails to resolve deeper phylogenetic relationships and reveals no support for four of the six proposed supergroups. Our analyses provide a point of departure for future taxon- and gene-rich analyses of the

  20. Broad ion beam serial section tomography.

    Science.gov (United States)

    Winiarski, B; Gholinia, A; Mingard, K; Gee, M; Thompson, G E; Withers, P J

    2017-01-01

Here we examine the potential of serial Broad Ion Beam (BIB) Ar+ ion polishing as an advanced serial section tomography (SST) technique for destructive 3D material characterisation, collecting data from volumes with lateral dimensions significantly greater than 100 µm and potentially over millimetre-sized areas. Further, the associated low level of damage introduced makes BIB milling very well suited to 3D EBSD acquisition with very high indexing rates. Block face serial sectioning data registration schemes usually assume that the data comprise a series of parallel, planar slices. We quantify the variations in slice thickness and parallelity which can arise when using BIB systems, comparing Gatan PECS and Ilion BIB systems for large-volume serial sectioning and 3D-EBSD data acquisition. As a test case we obtain 3D morphologies and grain orientations for both phases of a WC-11 wt.% Co hardmetal. In our case we have carried out the data acquisition through manual transfer of the sample between SEM and BIB, which is a very slow process (1-2 slices per day); however, forthcoming automated procedures will markedly speed up the process. We show that, irrespective of the sectioning method, raw large-area 2D-EBSD maps are affected by distortions and artefacts which affect 3D-EBSD, such that quantitative analyses and visualisation can give misleading and erroneous results. Addressing and correcting these issues will offer real benefits when large-area (millimetre-sized) automated serial section BIB is developed. Copyright © 2016. Published by Elsevier B.V.

  1. Broad-line Type Ic supernova SN 2014ad

    Science.gov (United States)

    Sahu, D. K.; Anupama, G. C.; Chakradhari, N. K.; Srivastav, S.; Tanaka, Masaomi; Maeda, Keiichi; Nomoto, Ken'ichi

    2018-04-01

We present optical and ultraviolet photometry and low-resolution optical spectroscopy of the broad-line Type Ic supernova SN 2014ad in the galaxy PGC 37625 (Mrk 1309), covering the evolution of the supernova during -5 to +87 d with respect to the date of maximum in the B band. A late-phase spectrum obtained at +340 d is also presented. With an absolute V-band magnitude at peak of M_V = -18.86 ± 0.23 mag, SN 2014ad is fainter than supernovae associated with gamma ray bursts (GRBs), and brighter than most of the normal and broad-line Type Ic supernovae without an associated GRB. The spectral evolution indicates that the expansion velocity of the ejecta, as measured using the Si II line, is as high as ˜33 500 km s⁻¹ around maximum, while during the post-maximum phase it settles at ˜15 000 km s⁻¹. The expansion velocity of SN 2014ad is higher than that of all other well-observed broad-line Type Ic supernovae except for the GRB-associated SN 2010bh. The explosion parameters, determined by applying Arnett's analytical light-curve model to the observed bolometric light-curve, indicate that it was an energetic explosion with a kinetic energy of ˜(1 ± 0.3) × 10⁵² erg and a total ejected mass of ˜(3.3 ± 0.8) M⊙, and that ˜0.24 M⊙ of ⁵⁶Ni was synthesized in the explosion. The metallicity of the host galaxy near the supernova region is estimated to be ˜0.5 Z⊙.
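
The quoted explosion parameters can be sanity-checked against each other via E_k ≈ ½ M_ej v², using the post-maximum photospheric velocity. This is a rough order-of-magnitude sketch, not the Arnett fit actually used in the paper.

```python
# Back-of-the-envelope consistency check: kinetic energy from the quoted
# ejected mass and post-maximum velocity, E_k ~ (1/2) M_ej v^2.
M_SUN_G = 1.989e33            # solar mass in grams
m_ej = 3.3 * M_SUN_G          # ejected mass, ~3.3 solar masses
v = 15_000 * 1e5              # ~15 000 km/s converted to cm/s
e_kin = 0.5 * m_ej * v ** 2   # kinetic energy in erg
print(f"E_k ~ {e_kin:.1e} erg")
```

This lands within a factor of ~2 of the quoted ˜1 × 10⁵² erg, as expected for so crude an estimate (the true velocity distribution of the ejecta is not a single number).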

  2. Broadly protective influenza vaccines: Redirecting the antibody response through adjuvation

    NARCIS (Netherlands)

    Cox, F.

    2016-01-01

    Influenza virus infections are responsible for significant morbidity worldwide and current vaccines have limited coverage, therefore it remains a high priority to develop broadly protective vaccines. With the discovery of broadly neutralizing antibodies (bnAbs) against influenza these vaccines

  3. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description.

    Science.gov (United States)

    Sahoo, Satya S; Valdez, Joshua; Rueschman, Michael

    2016-01-01

Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled "Rigor and Reproducibility" for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data and has long been used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of the Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing details of research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by the World Wide Web Consortium (W3C), to represent both: (a) data provenance, and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project.

  4. Validation, automatic generation and use of broad phonetic transcriptions

    NARCIS (Netherlands)

    Bael, Cristophe Patrick Jan Van

    2007-01-01

    Broad phonetic transcriptions represent the pronunciation of words as strings of characters from specifically designed symbol sets. In everyday life, broad phonetic transcriptions are often used as aids to pronounce (foreign) words. In addition, broad phonetic transcriptions are often used for

  5. Using prediction markets to estimate the reproducibility of scientific research.

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
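
The claim that a "statistically significant" finding needs confirmation in a well-powered replication follows from a simple Bayesian calculation. A sketch using the paper's median prior of 9% together with conventional, assumed values of alpha = 0.05 and power = 0.8:

```python
# P(hypothesis true | significant result), from the prior, the
# false-positive rate alpha, and statistical power (Bayes' rule).

def posterior_true(prior, alpha=0.05, power=0.8):
    """Probability the hypothesis is true given a significant result."""
    true_pos = power * prior           # true hypotheses that reach p < alpha
    false_pos = alpha * (1 - prior)    # false hypotheses that reach p < alpha
    return true_pos / (true_pos + false_pos)

p1 = posterior_true(0.09)   # after one significant original study
p2 = posterior_true(p1)     # after a significant well-powered replication
print(f"after original: {p1:.2f}, after replication: {p2:.2f}")
```

With a 9% prior, one significant result only raises the probability to about 0.6; a second significant, well-powered replication pushes it above 0.95, which is the quantitative sense of the paper's conclusion.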

  6. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

The aim of this study was to measure the validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings, indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the C-FFQ was compared to itself and showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.
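
The agreement statistics reported here (correlations and kappa over quintile rankings) are standard. As an illustration, a minimal stdlib Cohen's kappa for two administrations of the same questionnaire; the ratings below are invented, not study data.

```python
# Cohen's kappa: chance-corrected agreement between two sets of
# categorical ratings of the same items.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each rating's marginal probabilities.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two administrations of the questionnaire, quintile categories 1-5.
first  = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
second = [1, 1, 2, 3, 3, 3, 4, 5, 5, 5]
print(round(cohens_kappa(first, second), 2))  # → 0.75
```

A kappa of 0.65, as reported for the C-FFQ, is conventionally read as substantial agreement.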

  8. Reproducibility2020: Progress and priorities [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Leonard P. Freedman

    2017-05-01

Full Text Available The preclinical research process is a cycle of idea generation, experimentation, and reporting of results. The biomedical research community relies on the reproducibility of published discoveries to create new lines of research and to translate research findings into therapeutic applications. Since 2012, when scientists from Amgen reported that they were able to reproduce only 6 of 53 “landmark” preclinical studies, the biomedical research community has been discussing the scale of the reproducibility problem and developing initiatives to address critical challenges. The Global Biological Standards Institute (GBSI) released the “Case for Standards” in 2013, one of the first comprehensive reports to address the rising concern of irreproducible biomedical research. Further attention was drawn to issues that limit scientific self-correction, including reporting and publication bias, underpowered studies, lack of open access to methods and data, and lack of clearly defined standards and guidelines in areas such as reagent validation. To evaluate the progress made towards reproducibility since 2013, GBSI identified and examined initiatives designed to advance quality and reproducibility. Through this process, we identified key roles for funders, journals, researchers and other stakeholders and recommended actions for future progress. This paper describes our findings and conclusions.

  9. Reproducibility of bracket positioning in the indirect bonding technique.

    Science.gov (United States)

    Nichols, Dale A; Gardner, Gary; Carballeyra, Alain D

    2013-11-01

    Current studies have compared indirect bonding with direct placement of orthodontic brackets; many of these have shown that indirect bonding is generally a more accurate technique. However, the reproducibility of an indirect bonding setup by an orthodontist has yet to be described in the literature. Using cone-beam computed tomography and computer-assisted modeling software, we evaluated the consistency of orthodontists in placing orthodontic brackets at different times. Five orthodontists with experience in indirect bonding were selected to place brackets on 10 different casts at 3 time periods (n = 30 per orthodontist). Each participant completed an initial indirect bonding setup on each cast; subsequent bracket placements were completed twice at monthly intervals for comparison with the initial setup. The casts were scanned using an iCAT cone-beam computed tomography scanner (Imaging Sciences International, Hatfield, Pa) and imported into Geomagic Studio software (Geomagic, Research Triangle Park, NC) for superimposition and analysis. The scans for each time period were superimposed on the initial setup in the imaging software, and differences between bracket positions were calculated. For each superimposition, the measurements recorded were the greatest discrepancies between individual brackets as well as the mean discrepancies and standard deviations between all brackets on each cast. Single-factor and repeated-measure analysis of variance showed no statistically significant differences between time points of each orthodontist, or among the orthodontists for the parameters measured. The mean discrepancy was 0.1 mm for each 10-bracket indirect bonding setup. Orthodontists are consistent in selecting bracket positions for an indirect bonding setup at various time periods. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
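
The single-factor analysis of variance used here reduces to F = MS_between / MS_within. A pure-stdlib sketch; the three groups below are made-up discrepancy readings in millimetres, not study data.

```python
# One-way ANOVA F statistic: ratio of between-group to within-group
# mean squares. Small F => group means are similar relative to noise.

def one_way_anova_f(groups):
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    ms_between = ss_between / (k - 1)        # df = k - 1
    ms_within = ss_within / (n_total - k)    # df = N - k
    return ms_between / ms_within

# Toy bracket-position discrepancies (mm) at three time points.
groups = [[0.08, 0.10, 0.12], [0.09, 0.11, 0.13], [0.10, 0.12, 0.14]]
f_stat = one_way_anova_f(groups)
print(f"F = {f_stat:.2f}")
```

An F value near or below 1, as here, is consistent with the study's finding of no significant differences between time points (the p-value would then be looked up against the F distribution with the stated degrees of freedom).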

  10. Reproducing American Sign Language Sentences: Cognitive Scaffolding in Working Memory

    Directory of Open Access Journals (Sweden)

    Ted eSupalla

    2014-08-01

Full Text Available The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency, as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and bias across groups. Subjects’ recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies in the absence of linguistic knowledge. A qualitative error analysis allows us to capture generalizations about the relationship between error pattern and the cognitive scaffolding which governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with equivalent meaning to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are

  11. Reproducible preclinical research-Is embracing variability the answer?

    Science.gov (United States)

    Karp, Natasha A

    2018-03-01

Translational failures and replication issues of published research are undermining preclinical research and, if the outcomes are questionable, raise ethical concerns over the continued use of animals. Standardization of procedures, environmental conditions, and genetic background has traditionally been proposed as the gold standard approach, as it reduces variability, thereby enhancing sensitivity and supporting reproducibility when the environment is defined precisely. An alternative view is that standardization can identify idiosyncratic effects and hence decrease reproducibility. In support of this alternative view, Voelkl and colleagues present evidence from resampling a large quantity of research data exploring a variety of treatments. They demonstrate that by implementing multi-laboratory experiments with as few as two sites, we can increase reproducibility by embracing variation without increasing the sample size.

  12. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    Science.gov (United States)

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.

  13. Progress toward openness, transparency, and reproducibility in cognitive neuroscience.

    Science.gov (United States)

    Gilmore, Rick O; Diaz, Michele T; Wyble, Brad A; Yarkoni, Tal

    2017-05-01

    Accumulating evidence suggests that many findings in psychological science and cognitive neuroscience may prove difficult to reproduce; statistical power in brain imaging studies is low and has not improved recently; software errors in analysis tools are common and can go undetected for many years; and, a few large-scale studies notwithstanding, open sharing of data, code, and materials remain the rare exception. At the same time, there is a renewed focus on reproducibility, transparency, and openness as essential core values in cognitive neuroscience. The emergence and rapid growth of data archives, meta-analytic tools, software pipelines, and research groups devoted to improved methodology reflect this new sensibility. We review evidence that the field has begun to embrace new open research practices and illustrate how these can begin to address problems of reproducibility, statistical power, and transparency in ways that will ultimately accelerate discovery. © 2017 New York Academy of Sciences.

  14. Reproducibility of clinical research in critical care: a scoping review.

    Science.gov (United States)

    Niven, Daniel J; McCormick, T Jared; Straus, Sharon E; Hemmelgarn, Brenda R; Jeffs, Lianne; Barnes, Tavish R M; Stelfox, Henry T

    2018-02-21

    The ability to reproduce experiments is a defining principle of science. Reproducibility of clinical research has received relatively little scientific attention. However, it is important as it may inform clinical practice, research agendas, and the design of future studies. We used scoping review methods to examine reproducibility within a cohort of randomized trials examining clinical critical care research and published in the top general medical and critical care journals. To identify relevant clinical practices, we searched the New England Journal of Medicine, The Lancet, and JAMA for randomized trials published up to April 2016. To identify a comprehensive set of studies for these practices, included articles informed secondary searches within other high-impact medical and specialty journals. We included late-phase randomized controlled trials examining therapeutic clinical practices in adults admitted to general medical-surgical or specialty intensive care units (ICUs). Included articles were classified using a reproducibility framework. An original study was the first to evaluate a clinical practice. A reproduction attempt re-evaluated that practice in a new set of participants. Overall, 158 practices were examined in 275 included articles. A reproduction attempt was identified for 66 practices (42%, 95% CI 33-50%). Original studies reported larger effects than reproduction attempts (primary endpoint, risk difference 16.0%, 95% CI 11.6-20.5% vs. 8.4%, 95% CI 6.0-10.8%, P = 0.003). More than half of clinical practices with a reproduction attempt demonstrated effects that were inconsistent with the original study (56%, 95% CI 42-68%), among which a large number were reported to be efficacious in the original study and to lack efficacy in the reproduction attempt (34%, 95% CI 19-52%). Two practices reported to be efficacious in the original study were found to be harmful in the reproduction attempt. 
A minority of critical care practices with research published
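
    The risk differences quoted above compare event proportions between trial arms. A minimal sketch of an absolute risk difference with a Wald-style 95% confidence interval (the trial counts below are invented for illustration):

```python
import math

def risk_difference(events_a, n_a, events_b, n_b, z=1.96):
    """Absolute risk difference between two arms, with a Wald 95% CI."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rd = p_a - p_b
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return rd, rd - z * se, rd + z * se

# Hypothetical trial: 30/100 events under treatment vs 45/100 under control.
rd, ci_low, ci_high = risk_difference(30, 100, 45, 100)
print(f"RD = {rd:.3f}, 95% CI ({ci_low:.3f}, {ci_high:.3f})")
```

    A reproduction attempt whose interval excludes the original point estimate is one simple signal of an inconsistent effect.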

  15. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
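
    The core idea, shared versioned code that derives the same concepts for every study, can be sketched with a toy relational extraction; the table, column names, and "worst lactate" concept below are hypothetical stand-ins, not the actual MIMIC-III schema:

```python
import sqlite3

# Toy stand-in for a clinical database (NOT the real MIMIC-III schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE labevents (stay_id INT, lactate REAL)")
conn.executemany("INSERT INTO labevents VALUES (?, ?)",
                 [(1, 1.2), (1, 3.9), (2, 2.1), (2, 2.4)])

# A shared, versioned concept definition: worst (maximum) lactate per stay.
# Every study importing this query derives the concept identically.
WORST_LACTATE_SQL = """
SELECT stay_id, MAX(lactate) AS worst_lactate
FROM labevents GROUP BY stay_id ORDER BY stay_id
"""
rows = conn.execute(WORST_LACTATE_SQL).fetchall()
print(rows)  # → [(1, 3.9), (2, 2.4)]
```

    Keeping such concept queries in one repository, rather than re-implementing them per study, is what makes derived cohorts comparable across analyses.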

  16. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    Science.gov (United States)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, they need to be positively tested and quality assured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conforming test set of artificial-sweat printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and acquire them with the contactless sensors, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains, known from signal processing, and test its suitability for six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is in nearly all cases the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.
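
    The classification step, reducing a pair of scans to simple spatial-domain features and labelling the pair reproducible when those features differ only slightly, can be sketched as follows; the feature choice, threshold, and pixel values are invented stand-ins for the trained classifiers used in the paper:

```python
import statistics

def scan_features(pixels):
    """Simple spatial-domain features of a scan: mean and standard deviation."""
    return statistics.mean(pixels), statistics.pstdev(pixels)

def is_reproducible(scan_a, scan_b, tol=0.05):
    """Label a pair of scans reproducible when every feature differs by less
    than `tol` (relative). Thresholding stands in for a trained classifier."""
    fa, fb = scan_features(scan_a), scan_features(scan_b)
    return all(abs(x - y) <= tol * max(abs(x), abs(y), 1e-9)
               for x, y in zip(fa, fb))

scan1 = [0.1, 0.5, 0.9, 0.5]
scan2 = [0.11, 0.49, 0.9, 0.5]   # small acquisition differences
scan3 = [0.4, 0.4, 0.4, 0.4]     # a very different scan
print(is_reproducible(scan1, scan2), is_reproducible(scan1, scan3))  # → True False
```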

  17. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to presenting computer programs to human readers. The code is rearranged to follow the logic of the program, and that logic is explained in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.
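
    The "tangle" step of literate programming, extracting the executable program from a document that interleaves prose and code, can be sketched as follows; the `::code`/`::end` block markers are invented for illustration and are not Lir's actual syntax:

```python
import re

# A literate source document: prose interleaved with code blocks.
# The ::code/::end markers are an invented illustration, not Lir syntax.
literate_source = """\
First we load the measurements.

::code
data = [1, 2, 3]
::end

Then we summarise them.

::code
total = sum(data)
::end
"""

def tangle(source):
    """Extract and concatenate the code blocks, in document order."""
    blocks = re.findall(r"::code\n(.*?)::end", source, flags=re.DOTALL)
    return "\n".join(blocks)

namespace = {}
exec(tangle(literate_source), namespace)  # run the extracted program
print(namespace["total"])  # → 6
```

    The prose stays with the code it explains, while the machine sees only the tangled program.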

  19. Inter- and intra-laboratory study to determine the reproducibility of toxicogenomics datasets.

    Science.gov (United States)

    Scott, D J; Devonshire, A S; Adeleye, Y A; Schutte, M E; Rodrigues, M R; Wilkes, T M; Sacco, M G; Gribaldo, L; Fabbri, M; Coecke, S; Whelan, M; Skinner, N; Bennett, A; White, A; Foy, C A

    2011-11-28

    The application of toxicogenomics as a predictive tool for chemical risk assessment has been under evaluation by the toxicology community for more than a decade. However, it predominantly remains a tool for investigative research rather than for regulatory risk assessment. In this study, we assessed whether the current generation of microarray technology in combination with an in vitro experimental design was capable of generating robust, reproducible data of sufficient quality to show promise as a tool for regulatory risk assessment. To this end, we designed a prospective collaborative study to determine the level of inter- and intra-laboratory reproducibility between three independent laboratories. All test centres (TCs) adopted the same protocols for all aspects of the toxicogenomic experiment including cell culture, chemical exposure, RNA extraction, microarray data generation and analysis. As a case study, the genotoxic carcinogen benzo[a]pyrene (B[a]P) and the human hepatoma cell line HepG2 were used to generate three comparable toxicogenomic data sets. High levels of technical reproducibility were demonstrated using a widely employed gene expression microarray platform. While differences at the global transcriptome level were observed between the TCs, a common subset of B[a]P responsive genes (n=400 gene probes) was identified at all TCs, which included many genes previously reported in the literature as B[a]P responsive. These data show promise that the current generation of microarray technology, in combination with a standard in vitro experimental design, can produce robust data that can be generated reproducibly in independent laboratories. Future work will need to determine whether such reproducible in vitro model(s) can be predictive for a range of toxic chemicals with different mechanisms of action and thus be considered as part of future testing regimes for regulatory risk assessment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
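
    The reproducible core of such a multi-centre experiment can be taken as the genes called responsive at every test centre, i.e. a set intersection; the gene lists below are invented for illustration:

```python
# Hypothetical responsive-gene calls at three test centres (TCs).
tc1 = {"CYP1A1", "CYP1B1", "NQO1", "GCLC"}
tc2 = {"CYP1A1", "CYP1B1", "NQO1", "TP53"}
tc3 = {"CYP1A1", "CYP1B1", "GCLC", "NQO1"}

# The common subset: genes identified as responsive at all centres.
common = tc1 & tc2 & tc3
print(sorted(common))  # → ['CYP1A1', 'CYP1B1', 'NQO1']
```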

  20. Reserves of reproducibility and accuracy of spectrochemical methods of analysis

    International Nuclear Information System (INIS)

    Britske, M.Eh.; Slabodenyuk, I.V.

    1982-01-01

    Reproducibility and accuracy of analysis by absorption and emission flame spectroscopy are practically adequate when detection limits are comparable. The main part of the error is contributed by fluctuations of the free-atom concentration in the flame torch. The instrumental error, and the part of the error contributed by random fluctuations of the flame temperature, can practically be neglected. Further improvement of the reproducibility can be achieved by stabilizing aerosol generation or by using the internal-standard technique on two-channel spectrometers. The dispersions of the two techniques are compared using indium determination as an example

  1. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method which evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which makes it possible to evaluate esophageal function in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first one, the mean coefficients of variation in esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, the method also showed good reproducibility in measurements performed on 2 separate days. (author)

  2. The Maudsley Outpatient Study of Treatments for Anorexia Nervosa and Related Conditions (MOSAIC): Comparison of the Maudsley Model of Anorexia Nervosa Treatment for Adults (MANTRA) with specialist supportive clinical management (SSCM) in outpatients with broadly defined anorexia nervosa: A randomized controlled trial.

    Science.gov (United States)

    Schmidt, Ulrike; Magill, Nicholas; Renwick, Bethany; Keyes, Alexandra; Kenyon, Martha; Dejong, Hannah; Lose, Anna; Broadbent, Hannah; Loomes, Rachel; Yasin, Huma; Watson, Charlotte; Ghelani, Shreena; Bonin, Eva-Maria; Serpell, Lucy; Richards, Lorna; Johnson-Sabine, Eric; Boughton, Nicky; Whitehead, Linette; Beecham, Jennifer; Treasure, Janet; Landau, Sabine

    2015-08-01

    Anorexia nervosa (AN) in adults has poor outcomes, and treatment evidence is limited. This study evaluated the efficacy and acceptability of a novel, targeted psychological therapy for AN (Maudsley Model of Anorexia Nervosa Treatment for Adults; MANTRA) compared with Specialist Supportive Clinical Management (SSCM). One hundred forty-two outpatients with broadly defined AN (body mass index [BMI] ≤ 18.5 kg/m²) were randomly allocated to receive 20 to 30 weekly sessions (depending on clinical severity) plus add-ons (4 follow-up sessions, optional sessions with dietician and with carers) of MANTRA (n = 72) or SSCM (n = 70). Assessments were administered blind to treatment condition at baseline, 6 months, and 12 months after randomization. The primary outcome was BMI at 12 months. Secondary outcomes included eating disorder symptomatology, other psychopathology, neurocognitive and social-cognitive function, and acceptability. Additional service utilization was also assessed. Outcomes were analyzed using linear mixed models. Both treatments resulted in significant improvements in BMI and reductions in eating disorder symptomatology, distress levels, and clinical impairment over time, with no statistically significant difference between groups at either 6 or 12 months. Improvements in neurocognitive and social-cognitive measures over time were less consistent. One SSCM patient died. Compared with SSCM, MANTRA patients rated their treatment as significantly more acceptable and credible at 12 months. There was no significant difference between groups in additional service consumption. Both treatments appear to have value as first-line outpatient interventions for patients with broadly defined AN. Longer term outcomes remain to be evaluated. (c) 2015 APA, all rights reserved.

  3. Reproducibility of cerebral tissue oxygen saturation measurements by near-infrared spectroscopy in newborn infants

    Science.gov (United States)

    Jenny, Carmen; Biallas, Martin; Trajkovic, Ivo; Fauchère, Jean-Claude; Bucher, Hans Ulrich; Wolf, Martin

    2011-09-01

    Early detection of cerebral hypoxemia is an important aim in neonatology. A relevant parameter for assessing brain oxygenation may be the cerebral tissue oxygen saturation (StO2) measured by near-infrared spectroscopy (NIRS). So far, the reproducibility of StO2 measurements has been too low for clinical application, probably due to inhomogeneities. The aim of this study was to test a novel sensor geometry which reduces the influence of inhomogeneities. Thirty clinically stable newborn infants, with a median gestational age of 33.9 (range 26.9 to 41.9) weeks, birth weight of 2220 (820 to 4230) g, and postnatal age of 5 (1 to 71) days, were studied. At least four StO2 measurements of 1 min duration were carried out using NIRS on the lateral head. The sensor was repositioned between measurements. Reproducibility was calculated by a linear mixed effects model. The mean StO2 was 79.99 +/- 4.47% with a reproducibility of 2.76% and a between-infant variability of 4.20%. Thus, the error of measurement accounts for only 30.1% of the variability. The novel sensor geometry leads to considerably more precise measurements than in previous studies, which reported, e.g., ~5% reproducibility for the NIRO 300. The novel StO2 values hence have higher clinical relevance.
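
    Separating measurement error from between-infant variability, as the linear mixed effects model does here, can be sketched with a classical one-way ANOVA variance-components estimator; the StO2 readings below are invented for illustration:

```python
import statistics

def variance_components(groups):
    """One-way ANOVA estimates of within-group (measurement error) and
    between-group (between-infant) variance, for balanced repeated measures."""
    k = len(groups)                     # number of infants
    n = len(groups[0])                  # repositioned measurements per infant
    grand = statistics.mean(v for g in groups for v in g)
    means = [statistics.mean(g) for g in groups]
    ms_within = sum((v - m) ** 2
                    for g, m in zip(groups, means) for v in g) / (k * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    return ms_within, max((ms_between - ms_within) / n, 0.0)

# Hypothetical StO2 readings (%), four repositioned measurements per infant.
infants = [[78, 80, 79, 81], [84, 85, 83, 84], [74, 75, 76, 75]]
error_var, infant_var = variance_components(infants)
print(error_var, infant_var)  # → 1.0 20.0
```

    The square roots of these components correspond to the "reproducibility" and "between-infant variability" figures quoted in the abstract.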

  4. Language actively reproduces the socio-economic inequalities in ...

    African Journals Online (AJOL)

    Our relations in society as men and women are determined and expressed by the language we speak. However, language does not passively reflect society but rather actively reproduces the inequalities in society. Language is a cognitive process involving the production and understanding of linguistic communication as ...

  5. Intercenter reproducibility of binary typing for Staphylococcus aureus

    NARCIS (Netherlands)

    van Leeuwen, Willem B.; Snoeijers, Sandor; van der Werken-Libregts, Christel; Tuip, Anita; van der Zee, Anneke; Egberink, Diane; de Proost, Monique; Bik, Elisabeth; Lunter, Bjorn; Kluytmans, Jan; Gits, Etty; van Duyn, Inge; Heck, Max; van der Zwaluw, Kim; Wannet, Wim; Noordhoek, Gerda T.; Mulder, Sije; Renders, Nicole; Boers, Miranda; Zaat, Sebastiaan; van der Riet, Daniëlle; Kooistra, Mirjam; Talens, Adriaan; Dijkshoorn, Lenie; van der Reyden, Tanny; Veenendaal, Dick; Bakker, Nancy; Cookson, Barry; Lynch, Alisson; Witte, Wolfgang; Cuny, Christa; Blanc, Dominique; Vernez, Isabelle; Hryniewicz, Waleria; Fiett, Janusz; Struelens, Marc; Deplano, Ariane; Landegent, Jim; Verbrugh, Henri A.; van Belkum, Alex

    2002-01-01

    The reproducibility of the binary typing (BT) protocol developed for epidemiological typing of Staphylococcus aureus was analyzed in a biphasic multicenter study. In a Dutch multicenter pilot study, 10 genetically unique isolates of methicillin-resistant S. aureus (MRSA) were characterized by the BT

  6. Latin America Today: An Atlas of Reproducible Pages. Revised Edition.

    Science.gov (United States)

    World Eagle, Inc., Wellesley, MA.

    This document contains reproducible maps, charts and graphs of Latin America for use by teachers and students. The maps are divided into five categories (1) the land; (2) peoples, countries, cities, and governments; (3) the national economies, product, trade, agriculture, and resources; (4) energy, education, employment, illicit drugs, consumer…

  7. Reproducible cavitation activity in water-particle suspensions

    NARCIS (Netherlands)

    Borkent, B.M.; Arora, M.; Ohl, C.D.

    2007-01-01

    The study of cavitation inception in liquids rarely yields reproducible data, unless special control is taken on the cleanliness of the experimental environment. In this paper, an experimental technique is demonstrated which allows repeatable measurements of cavitation activity in liquid-particle

  8. Reproducibility of BOLD signal change induced by breath holding.

    Science.gov (United States)

    Magon, Stefano; Basso, Gianpaolo; Farace, Paolo; Ricciardi, Giuseppe Kenneth; Beltramello, Alberto; Sbarbati, Andrea

    2009-04-15

    Blood oxygen level dependent (BOLD) contrast is influenced by physiological factors such as blood flow and blood volume that can be a source of variability in fMRI analysis. Previous studies proposed using cerebrovascular response data to normalize or calibrate BOLD maps in order to reduce the variability of fMRI data, both among brain areas in single-subject analysis and across subjects. Breath holding is one of the most widely used methods to investigate vascular reactivity. However, little is known about the robustness and reproducibility of this procedure. In this study we investigated three different breath holding periods. Subjects were asked to hold their breath for 9, 15 or 21 s in three separate runs, and the fMRI protocol was repeated after 15 to 20 days. Our data show that the BOLD response to breath holding after inspiration has a complex shape, due to physiological factors that influence the signal variation with a timing that is highly reproducible. Nevertheless, the reproducibility of the magnitude of the cerebrovascular response to CO2, expressed as the amplitude of the BOLD signal and the number of responding voxels, strongly depends on the duration of the breath holding periods. A breath holding period of 9 s results in high variability of the magnitude of the response, while longer breath holding durations produce more robust and reproducible BOLD responses.

  9. Reproducible and expedient rice regeneration system using in vitro ...

    African Journals Online (AJOL)

    Inevitable prerequisite for expedient regeneration in rice is the selection of totipotent explant and developing an apposite combination of growth hormones. Here, we reported a reproducible regeneration protocol in which basal segments of the stem of the in vitro grown rice plants were used as ex-plant. Using the protocol ...

  10. Reproducible positioning in chest X-ray radiography

    International Nuclear Information System (INIS)

    1974-01-01

    A device is described that can be used to ensure reproducibility in the positioning of the patient during X-ray radiography of the thorax. Signals are taken from an electrocardiographic monitor and from a device recording the respiratory cycle. Radiography is performed only when two preselected signals coincide

  11. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    The reproducibility of experiments is key to the scientific process, and is particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration, because the system stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, together with a Git repository, constitutes the primary reproducibility output of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform-independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
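
    The 'single serialized object' idea can be sketched with Python's standard pickle module; the field names below are illustrative, not ReproPhylo's actual API:

```python
import pickle
import platform

# Sketch: the whole experimental state (parameters, provenance log,
# environment) lives in one serializable object, so an analysis can be
# resumed or verified from a single file. Field names are illustrative.
workflow = {
    "name": "phylo_demo",
    "params": {"aligner": "mafft"},     # hypothetical analysis parameters
    "provenance": [],                   # ordered log of analysis steps
    "environment": {"python": platform.python_version()},
}

workflow["provenance"].append("align sequences")
workflow["provenance"].append("build tree")

blob = pickle.dumps(workflow)           # the single reproducibility artifact
restored = pickle.loads(blob)
print(restored["provenance"])  # → ['align sequences', 'build tree']
```

    Committing that artifact to Git after each step then gives the automatic version history the abstract describes.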

  12. Reproducing (and Disrupting) Heteronormativity: Gendered Sexual Socialization in Preschool Classrooms

    Science.gov (United States)

    Gansen, Heidi M.

    2017-01-01

    Using ethnographic data from 10 months of observations in nine preschool classrooms, I examine gendered sexual socialization children receive from teachers' practices and reproduce through peer interactions. I find heteronormativity permeates preschool classrooms, where teachers construct (and occasionally disrupt) gendered sexuality in a number…

  13. Reproducibility of the Pleth Variability Index in premature infants

    NARCIS (Netherlands)

    Den Boogert, W.J. (Wilhelmina J.); H.A. van Elteren (Hugo); T.G. Goos (Tom); I.K.M. Reiss (Irwin); R.C.J. de Jonge (Rogier); V.J. van den Berg (Victor J.)

    2017-01-01

    textabstractThe aim was to assess the reproducibility of the Pleth Variability Index (PVI), developed for non-invasive monitoring of peripheral perfusion, in preterm neonates below 32 weeks of gestational age. Three PVI measurements were consecutively performed in stable, comfortable preterm

  15. Exploring the Coming Repositories of Reproducible Experiments: Challenges and Opportunities

    DEFF Research Database (Denmark)

    Freire, Juliana; Bonnet, Philippe; Shasha, Dennis

    2011-01-01

    Computational reproducibility efforts in many communities will soon give rise to validated software and data repositories of high quality. A scientist in a field may want to query the components of such repositories to build new software workflows, perhaps after adding the scientist’s own algorithm....... This paper explores research challenges necessary to achieving this goal....

  16. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (…) O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. Good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. Intense self-heating ensured a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot-water-soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or in on-site experiments, except for lignin degradation, which was less extensive than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose this experimental device for research requiring a mass reduction of the initial composted waste mixtures.
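
    The reported coefficients of variation quantify dispersion across the six replicate reactors. A minimal sketch (the replicate values below are invented for illustration):

```python
import statistics

def coefficient_of_variation(values):
    """Sample coefficient of variation, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical total-organic-matter losses (%) in six replicate reactors.
tom_loss = [46.0, 44.5, 47.2, 45.8, 46.9, 45.6]
cv = coefficient_of_variation(tom_loss)
print(round(cv, 1))  # → 2.1
```

    A CV well under the study's 19% ceiling would indicate good inter-replicate reproducibility for that parameter.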

  17. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaete; Benedeti, Augusto Cesar Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge, E-mail: fernando@fatesa.edu.br [Faculdade de Tecnologia em Saude (FATESA), Ribeirao Preto, SP (Brazil); Universidade de Fortaleza (UNIFOR), Fortaleza, CE (Brazil). Departmento de Radiologia; Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina. Departmento de Medicina Clinica; Universidade de Sao Paulo (FFCLRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Filosofia, Ciencias e Letras; Hospital Mae de Deus, Porto Alegre, RS (Brazil)

    2017-05-15

    Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat, in correlation with anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (range 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable most strongly correlated with the areas of abdominal fat (i.e., it showed the highest correlation coefficient). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, with higher intraobserver and interobserver reliability than the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility. (author)
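
    The correlation coefficient of 0.71 reported here is a Pearson correlation between paired measurements. A minimal sketch (the paired readings below are invented for illustration):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired intra-abdominal fat thickness readings (mm).
ultrasound = [10, 22, 35, 18, 27]
ct = [12, 20, 38, 17, 30]
r = pearson_r(ultrasound, ct)
print(round(r, 2))  # → 0.98
```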

  18. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    Purpose: To determine the intra-session and inter-session reproducibility of corneal, macular and retinal nerve fiber layer thickness (RNFL) measurements with the iVue-100 optical coherence tomography in normal eyes. Methods: These parameters were measured in the right eyes of 50 healthy participants with normal ...

  19. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    in determining the utility of a device used for clinical and research purposes. The aim of this study was therefore to determine the reproducibility of corneal, macular and RNFL thickness measurements in normal eyes using the iVue-100 SD-OCT. Subjects and methods: The study was approved by the University of KwaZu-.

  20. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, Isaline C J M; Beelen, A.; Dedding, C; Cardol, M.; Dekker, J

    2005-01-01

    OBJECTIVE: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). DESIGN: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  1. The reproducibility of the Canadian occupational performance measure.

    NARCIS (Netherlands)

    Eyssen, I.C.J.M.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    OBJECTIVE: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). Design: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  2. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, I.C.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    Objective: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). Design: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  3. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, I. C. J. M.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data analysis was based on

  4. The reproducibility of the Canadian Occupational Performance Measure

    NARCIS (Netherlands)

    Eyssen, I.C.; Beelen, A.; Dedding, C.; Cardol, M.; Dekker, J.

    2005-01-01

    OBJECTIVE: To assess the reproducibility (reliability and inter-rater agreement) of the client-centred Canadian Occupational Performance Measure (COPM). Design: The COPM was administered twice, with a mean interval of seven days (SD 1.6, range 4-14), by two different occupational therapists. Data

  5. Multi-Parametric Neuroimaging Reproducibility: A 3T Resource Study

    Science.gov (United States)

    Landman, Bennett A.; Huang, Alan J.; Gifford, Aliya; Vikram, Deepti S.; Lim, Issel Anne L.; Farrell, Jonathan A.D.; Bogovic, John A.; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A.; Joel, Suresh; Mori, Susumu; Pekar, James J.; Barker, Peter B.; Prince, Jerry L.; van Zijl, Peter C.M.

    2010-01-01

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60 minute protocol on a 3T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22–61 y/o). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability and reproducibility of each contrast in a region of interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1–5% variability), while variation on diffusion and several other quantitative scans was higher. PMID:21094686

  6. Reproducibility in the assessment of acute pancreatitis with computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Freire Filho, Edison de Oliveira; Vieira, Renata La Rocca; Yamada, Andre Fukunishi; Shigueoka, David Carlos; Bekhor, Daniel; Freire, Maxime Figueiredo de Oliveira; Ajzen, Sergio; D' Ippolito, Giuseppe [Universidade Federal de Sao Paulo (UNIFESP/EPM), SP (Brazil). Dept. of Imaging Diagnosis]. E-mail: eofilho@ig.com.br; eoffilho@uol.com.br

    2007-11-15

    Objective: To evaluate the reproducibility of unenhanced and contrast-enhanced computed tomography in the assessment of patients with acute pancreatitis. Materials and methods: Fifty-one unenhanced and contrast-enhanced abdominal computed tomography studies of patients with acute pancreatitis were blindly reviewed by two radiologists (observers 1 and 2). The morphological index was separately calculated for unenhanced and contrast-enhanced computed tomography and the disease severity index was established. Intraobserver and interobserver reproducibility of computed tomography was measured by means of the kappa index (κ). Results: Interobserver agreement was κ = 0.666, 0.705, 0.648, 0.547 and 0.631, respectively, for unenhanced and contrast-enhanced morphological index, presence of pancreatic necrosis, pancreatic necrosis extension, and disease severity index. Intraobserver agreement (observers 1 and 2, respectively) was κ = 0.796 and 0.732 for unenhanced morphological index; κ = 0.725 and 0.802 for contrast-enhanced morphological index; κ = 0.674 and 0.849 for presence of pancreatic necrosis; κ = 0.606 and 0.770 for pancreatic necrosis extension; and κ = 0.801 and 0.687 for disease severity index at computed tomography. Conclusion: Computed tomography for determination of morphological index and disease severity index in the staging of acute pancreatitis is a reproducible method. The absence of contrast-enhancement does not affect the reproducibility of the computed tomography morphological index. (author)
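The agreement figures in this record are kappa statistics. As a minimal sketch, Cohen's kappa for two raters can be computed as below; the readers and severity grades are hypothetical, not the study's data:

```python
# Sketch: Cohen's kappa for two raters on the same cases (hypothetical data).
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa over paired categorical ratings from two raters."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = sorted(set(ratings_a) | set(ratings_b))
    # Observed agreement: fraction of cases where the raters coincide.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, from each rater's marginal proportions.
    p_e = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical severity grades from two readers:
reader1 = ["mild", "mild", "severe", "moderate", "mild", "severe", "moderate", "mild"]
reader2 = ["mild", "moderate", "severe", "moderate", "mild", "severe", "mild", "mild"]
print(round(cohens_kappa(reader1, reader2), 3))  # → 0.6
```

A kappa of 0 means no agreement beyond chance; 1 means perfect agreement, which is the scale on which the 0.5-0.8 values above are read.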

  7. Reproducibility of Tactile Assessments for Children with Unilateral Cerebral Palsy

    Science.gov (United States)

    Auld, Megan Louise; Ware, Robert S.; Boyd, Roslyn Nancy; Moseley, G. Lorimer; Johnston, Leanne Marie

    2012-01-01

    A systematic review identified tactile assessments used in children with cerebral palsy (CP), but their reproducibility is unknown. Sixteen children with unilateral CP and 31 typically developing children (TDC) were assessed 2-4 weeks apart. Test-retest percent agreements within one point for children with unilateral CP (and TDC) were…

  8. Reproducibility and Reliability of Repeated Quantitative Fluorescence Angiography

    DEFF Research Database (Denmark)

    Nerup, Nikolaj; Knudsen, Kristine Bach Korsholm; Ambrus, Rikard

    2017-01-01

    …that the camera can detect. As the emission of fluorescence is dependent on the excitatory light intensity, reduction of this may solve the problem. The aim of the present study was to investigate the reproducibility and reliability of repeated quantitative FA during a reduction of excitatory light…

  9. Annotating with Propp's Morphology of the Folktale: Reproducibility and Trainability

    NARCIS (Netherlands)

    Fisseni, B.; Kurji, A.; Löwe, B.

    2014-01-01

    We continue the study of the reproducibility of Propp’s annotations from Bod et al. (2012). We present four experiments in which test subjects were taught Propp’s annotation system; we conclude that Propp’s system needs a significant amount of training, but that with sufficient time investment, it

  10. Statecraft and Study Abroad: Imagining, Narrating and Reproducing the State

    Science.gov (United States)

    Lansing, Jade; Farnum, Rebecca L.

    2017-01-01

    Study abroad in higher education is on the rise, marketed as an effective way to produce global citizens and undermine international boundaries. In practice, however, programmes frequently reify rather than challenge states: participants "study Morocco" rather than "explore Marrakech." This framing reproduces real and…

  11. Using different approaches to assess the reproducibility of a ...

    African Journals Online (AJOL)

    2011-02-24

    Feb 24, 2011 … questionnaire (QFFQ) used for assessment of the habitual dietary intake of Setswana-speaking adults in the North West Province of South Africa. … Food intake was coded and analysed for nutrient intake per day for each subject. … and Ovarian Cancer Study Groups reported the reproducibility of a.

  12. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    Directory of Open Access Journals (Sweden)

    Fernando Marum Mauad

    Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects, of whom 39 (38.6%) were men and 62 (61.4%) were women, with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility.
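The correlation coefficient of 0.71 quoted above is a Pearson r between paired measurements from the two modalities. A minimal sketch, on invented thickness values rather than the study's data:

```python
# Sketch: Pearson correlation between paired measurements from two modalities
# (hypothetical ultrasound vs. CT fat-thickness values, in mm).
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ultrasound_mm = [52, 61, 47, 70, 58, 66]  # invented values
ct_mm         = [55, 64, 50, 75, 57, 69]
print(round(pearson_r(ultrasound_mm, ct_mm), 2))
```

The invented data here correlate more strongly than the study's 0.71; the point is only the form of the computation.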

  13. Reproducibility in density functional theory calculations of solids

    DEFF Research Database (Denmark)

    Lejaeghere, Kurt; Bihlmayer, Gustav; Björkman, Torbjörn

    2016-01-01

    The widespread popularity of density functional theory has given rise to an extensive range of dedicated codes for predicting molecular and crystalline properties. However, each code implements the formalism in a different way, raising questions about the reproducibility of such predictions. We...

  14. Exploring the reproducibility of functional connectivity alterations in Parkinson's disease.

    Science.gov (United States)

    Badea, Liviu; Onu, Mihaela; Wu, Tao; Roceanu, Adina; Bajenaru, Ovidiu

    2017-01-01

    Since anatomic MRI is presently not able to directly discern neuronal loss in Parkinson's Disease (PD), studying the associated functional connectivity (FC) changes seems a promising approach toward developing non-invasive and non-radioactive neuroimaging markers for this disease. While several groups have reported such FC changes in PD, there are also significant discrepancies between studies. Investigating the reproducibility of PD-related FC changes on independent datasets is therefore of crucial importance. We acquired resting-state fMRI scans for 43 subjects (27 patients and 16 normal controls, with 2 replicate scans per subject) and compared the observed FC changes with those obtained in two independent datasets, one made available by the PPMI consortium (91 patients, 18 controls) and a second one by the group of Tao Wu (20 patients, 20 controls). Unfortunately, PD-related functional connectivity changes turned out to be non-reproducible across datasets. This could be due to disease heterogeneity, but also to technical differences. To distinguish between the two, we devised a method to directly check for disease heterogeneity using random splits of a single dataset. Since we still observe non-reproducibility in a large fraction of random splits of the same dataset, we conclude that functional heterogeneity may be a dominating factor behind the lack of reproducibility of FC alterations in different rs-fMRI studies of PD. While global PD-related functional connectivity changes were non-reproducible across datasets, we identified a few individual brain region pairs with marginally consistent FC changes across all three datasets. However, training classifiers on each one of the three datasets to discriminate PD scans from controls produced only low accuracies on the remaining two test datasets. Moreover, classifiers trained and tested on random splits of the same dataset (which are technically homogeneous) also had low test accuracies, directly substantiating

  15. Reproducible Hydrogeophysical Inversions through the Open-Source Library pyGIMLi

    Science.gov (United States)

    Wagner, F. M.; Rücker, C.; Günther, T.

    2017-12-01

    Many tasks in applied geosciences cannot be solved by a single measurement method and require the integration of geophysical, geotechnical and hydrological methods. In the emerging field of hydrogeophysics, researchers strive to gain quantitative information on process-relevant subsurface parameters by means of multi-physical models, which simulate the dynamic process of interest as well as its geophysical response. However, such endeavors are associated with considerable technical challenges, since they require coupling of different numerical models. This represents an obstacle for many practitioners and students. Even technically versatile users tend to build individually tailored solutions by coupling different (and potentially proprietary) forward simulators at the cost of scientific reproducibility. We argue that the reproducibility of studies in computational hydrogeophysics, and therefore the advancement of the field itself, requires versatile open-source software. To this end, we present pyGIMLi - a flexible and computationally efficient framework for modeling and inversion in geophysics. The object-oriented library provides management for structured and unstructured meshes in 2D and 3D, finite-element and finite-volume solvers, various geophysical forward operators, as well as Gauss-Newton based frameworks for constrained, joint and fully-coupled inversions with flexible regularization. In a step-by-step demonstration, it is shown how the hydrogeophysical response of a saline tracer migration can be simulated. Tracer concentration data from boreholes and measured voltages at the surface are subsequently used to estimate the hydraulic conductivity distribution of the aquifer within a single reproducible Python script.
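The Gauss-Newton frameworks the abstract mentions iterate updates of the form below. This is a generic, Tikhonov-regularized sketch on a made-up linear toy problem, not pyGIMLi's actual API; the function names, forward model, and data are all invented for illustration:

```python
# Sketch: one damped Gauss-Newton update with Tikhonov regularization,
# of the kind underlying constrained geophysical inversion. Generic
# illustration only; not pyGIMLi's API.
import numpy as np

def gauss_newton_step(m, forward, jacobian, d_obs, lam=1.0):
    """Solve (J^T J + lam*I) dm = J^T (d_obs - f(m)) and update m."""
    J = jacobian(m)
    r = d_obs - forward(m)
    dm = np.linalg.solve(J.T @ J + lam * np.eye(len(m)), J.T @ r)
    return m + dm

# Toy linear forward model d = G m, so a single step nearly recovers m_true.
G = np.array([[1.0, 2.0], [3.0, 1.0], [1.0, 1.0]])
m_true = np.array([2.0, -1.0])
d_obs = G @ m_true
m = gauss_newton_step(np.zeros(2), lambda x: G @ x, lambda x: G, d_obs, lam=1e-6)
print(np.round(m, 3))  # close to m_true = [2, -1]
```

For nonlinear forward operators the same step is applied iteratively, with the regularization weight lam trading data fit against model smoothness.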

  16. Reproducible simulation of respiratory motion in porcine lung explants

    Energy Technology Data Exchange (ETDEWEB)

    Biederer, J. [Dept. of Diagnostic Radiology, Univ. Hospital Schleswig-Holstein, Campus Kiel (Germany); Dept. of Radiology, German Cancer Research Center, Heidelberg (Germany); Plathow, C. [Dept. of Diagnostic Radiology, Eberhard-Karls-Univ. Tuebingen (Germany); Dept. of Radiology, German Cancer Research Center, Heidelberg (Germany); Schoebinger, M.; Meinzer, H.P. [Dept. of Medical and Biological Informatics, German Cancer Research Center, Heidelberg (Germany); Tetzlaff, R.; Puderbach, M.; Zaporozhan, J.; Kauczor, H.U. [Dept. of Radiology, German Cancer Research Center, Heidelberg (Germany); Bolte, H.; Heller, M. [Dept. of Diagnostic Radiology, Univ. Hospital Schleswig-Holstein, Campus Kiel (Germany)

    2006-11-15

    Purpose: To develop a model for exactly reproducible respiration motion simulations of animal lung explants inside an MR-compatible chest phantom. Materials and Methods: The materials included a piston pump and a flexible silicone reconstruction of a porcine diaphragm and were used in combination with an established MR-compatible chest phantom for porcine heart-lung preparations. The rhythmic inflation and deflation of the diaphragm at the bottom of the artificial thorax with water (1-1.5 L) induced lung tissue displacement resembling diaphragmatic breathing. This system was tested on five porcine heart-lung preparations using 1.5T MRI with transverse and coronal 3D-GRE (TR/TE=3.63/1.58, 256 x 256 matrix, 350 mm FOV, 4 mm slices) and half Fourier T2-FSE (TR/TE=545/29, 256 x 192, 350 mm, 6 mm) as well as multiple row detector CT (16 x 1 mm collimation, pitch 1.5, FOV 400 mm, 120 mAs) acquired at five fixed inspiration levels. Dynamic CT scans and coronal MRI with dynamic 2D-GRE and 2D-SS-GRE sequences (image frequencies of 10/sec and 3/sec, respectively) were acquired during continuous 'breathing' (7/minute). The position of the piston pump was visually correlated with the respiratory motion visible through the transparent wall of the phantom and with dynamic displays of CT and MR images. An elastic body splines analysis of the respiratory motion was performed using CT data. Results: Visual evaluation of MRI and CT showed three-dimensional movement of the lung tissue throughout the respiration cycle. Local tissue displacement inside the lung explants was documented with motion maps calculated from CT. The maximum displacement at the top of the diaphragm (mean 26.26 [SD 1.9] mm on CT and 27.16 [SD 1.5] mm on MRI, respectively [p=0.25; Wilcoxon test]) was in the range of tidal breathing in human patients. Conclusion: The chest phantom with a diaphragmatic pump is a promising platform for multi-modality imaging studies of the effects of respiratory lung

  17. Composting in small laboratory pilots: Performance and reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Lashermes, G.; Barriuso, E. [INRA, UMR1091 Environment and Arable Crops (INRA, AgroParisTech), F-78850 Thiverval-Grignon (France); Le Villio-Poitrenaud, M. [VEOLIA Environment - Research and Innovation, F-78520 Limay (France); Houot, S., E-mail: sabine.houot@grignon.inra.fr [INRA, UMR1091 Environment and Arable Crops (INRA, AgroParisTech), F-78850 Thiverval-Grignon (France)

    2012-02-15

    Highlights: • We design an innovative small-scale composting device including six 4-l reactors. • We investigate the performance and reproducibility of composting on a small scale. • Thermophilic conditions are established by self-heating in all replicates. • Biochemical transformations, organic matter losses and stabilisation are realistic. • The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (<10 l) have been employed in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device safeguarding self-heating to drive the composting process and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O2 consumption and CO2 emissions, and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final…

  18. Reproducibility in protein profiling by MALDI-TOF mass spectrometry

    DEFF Research Database (Denmark)

    Albrethsen, Jakob

    2007-01-01

    BACKGROUND: Protein profiling with high-throughput sample preparation and MALDI-TOF MS analysis is a new potential tool for diagnosis of human diseases. However, analytical reproducibility is a significant challenge in MALDI protein profiling. This minireview summarizes studies of reproducibility, with the reported mean CV of the peak intensity varying among studies from 4% to 26%. There is additional interexperiment variation in peak intensity. Current approaches to improve the analytical performance of MALDI protein profiling include automated sample processing, extensive prefractionation strategies, immunocapture, prestructured target surfaces, standardized matrix (co)crystallization, improved MALDI-TOF MS instrument components, internal standard peptides, quality-control samples, replicate measurements, and algorithms for normalization and peak detection. CONCLUSIONS: Further evaluation and optimization…
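The 4% to 26% figures quoted above are coefficients of variation (CV) of replicate peak intensities. A minimal sketch of the metric, on invented intensities:

```python
# Sketch: coefficient of variation of replicate peak intensities,
# the reproducibility metric quoted in the record above (invented data).
import statistics

def cv_percent(values):
    """CV = sample standard deviation / mean, expressed in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

replicate_intensities = [1040, 980, 1010, 1060, 950]  # arbitrary units
print(round(cv_percent(replicate_intensities), 1))  # → 4.4
```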

  19. Towards a Reproducible Synthesis of High Aspect Ratio Gold Nanorods

    Directory of Open Access Journals (Sweden)

    Susanne Koeppl

    2011-01-01

    The seed-mediated method in the presence of high concentrations of CTAB is frequently implemented in the preparation of high aspect ratio gold nanorods (i.e., nanorods with aspect ratios of 5 or more); however, the reproducibility has still been limited. We rendered the synthesis procedure simpler, decreased the susceptibility to impurities, and improved the reproducibility of the product distribution. As a result of the high aspect ratios, longitudinal plasmon absorptions were shifted up to very high absorption maxima of 1955 nm in UV-vis-NIR spectra (since this band is completely covered in aqueous solution by the strong absorption of water, the gold species were embedded in poly(vinyl alcohol) films for UV-vis-NIR measurements). Finally, the directed particle growth in the (110) direction leads to the conclusion that the adsorption of CTAB molecules at specific crystal faces accounts for nanorod growth, and not cylindrical CTAB micelles, in agreement with other observations.

  20. Reproducible analyses of microbial food for advanced life support systems

    Science.gov (United States)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  1. MASSIVE DATA, THE DIGITIZATION OF SCIENCE, AND REPRODUCIBILITY OF RESULTS

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    As the scientific enterprise becomes increasingly computational and data-driven, the nature of the information communicated must change. Without inclusion of the code and data with published computational results, we are engendering a credibility crisis in science. Controversies such as ClimateGate, the microarray-based drug sensitivity clinical trials under investigation at Duke University, and retractions from prominent journals due to unverified code suggest the need for greater transparency in our computational science. In this talk I argue that the scientific method be restored to (1) a focus on error control as central to scientific communication and (2) complete communication of the underlying methodology producing the results, i.e., reproducibility. I outline barriers to these goals based on recent survey work (Stodden 2010), and suggest solutions such as the “Reproducible Research Standard” (Stodden 2009), giving open licensing options designed to create an intellectual property framework for scien...

  2. Towards reproducibility of research by reuse of IT best practices

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Reproducibility of any research gives much higher credibility both to research results and to the researchers. This is true for any kind of research including computer science, where a lot of tools and approaches have been developed to ensure reproducibility. In this talk I will focus on basic and seemingly simple principles, which sometimes look too obvious to follow, but help researchers build beautiful and reliable systems that produce consistent, measurable results. My talk will cover, among other things, the problem of embedding machine learning techniques into analysis strategy. I will also speak about the most common pitfalls in this process and how to avoid them. In addition, I will demonstrate the research environment based on the principles that I will have outlined. About the speaker Andrey Ustyuzhanin (36) is Head of CERN partnership program at Yandex. He is involved in the development of event indexing and event filtering services which Yandex has been providing for the LHCb experiment sinc...

  3. Reproducibility of the Tronzo and AO classifications for transtrochanteric fractures.

    Science.gov (United States)

    Mattos, Carlos Augusto; Jesus, Alexandre Atsushi Koza; Floter, Michelle Dos Santos; Nunes, Luccas Franco Bettencourt; Sanches, Bárbara de Baptista; Zabeu, José Luís Amim

    2015-01-01

    To analyze the reproducibility of the Tronzo and AO classifications for transtrochanteric fractures. This was a cross-sectional study in which the intraobserver and interobserver concordance between two readings made by 11 observers was analyzed. The analysis of the variations used the kappa statistical method. Moderate concordance was found in relation to the AO classification, while slight concordance was found for the Tronzo classification. This study found that the AO/ASIF classification for transtrochanteric fractures presented greater intra- and interobserver reproducibility and that greater concordance was correlated with greater experience of the observers. Without division into subgroups, the AO/ASIF classification was shown, as described in the literature, to be acceptable for clinical use in relation to transtrochanteric fractures of the femur, although it did not show absolute concordance, given that its concordance level was only moderate. Nonetheless, its concordance was better than that of the Tronzo classification.

  4. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...

  5. A new strategy to deliver synthetic protein drugs: self-reproducible biologics using minicircles.

    Science.gov (United States)

    Yi, Hyoju; Kim, Youngkyun; Kim, Juryun; Jung, Hyerin; Rim, Yeri Alice; Jung, Seung Min; Park, Sung-Hwan; Ju, Ji Hyeon

    2014-08-05

    Biologics are the most successful drugs used in anticytokine therapy. However, they remain partially unsuccessful because of the elevated cost of their synthesis and purification. Development of novel biologics has also been hampered by the high cost. Biologics are made of protein components; thus, theoretically, they can be produced in vivo. Here we tried to invent a novel strategy to allow the production of synthetic drugs in vivo by the host itself. The recombinant minicircles encoding etanercept or tocilizumab, which are synthesized currently by pharmaceutical companies, were injected intravenously into animal models. Self-reproduced etanercept and tocilizumab were detected in the serum of mice. Moreover, arthritis subsided in mice that were injected with minicircle vectors carrying biologics. Self-reproducible biologics need neither factory facilities for drug production nor clinical processes, such as frequent drug injection. Although this novel strategy is in its very early conceptual stage, it seems to represent a potential alternative method for the delivery of biologics.

  6. Reproducibility of preclinical animal research improves with heterogeneity of study samples

    Science.gov (United States)

    Vogt, Lucile; Sena, Emily S.; Würbel, Hanno

    2018-01-01

    Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495
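The effect sizes whose estimation accuracy the study above compares are standardized mean differences between treated and control groups. A minimal sketch of Cohen's d with a pooled standard deviation, on invented infarct-volume numbers:

```python
# Sketch: Cohen's d (standardized mean difference) between a treated and a
# control group, on invented data; not the study's actual measurements.
import statistics

def cohens_d(treated, control):
    """Mean difference divided by the pooled sample standard deviation."""
    n1, n2 = len(treated), len(control)
    s1, s2 = statistics.variance(treated), statistics.variance(control)
    pooled = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treated) - statistics.mean(control)) / pooled

infarct_treated = [22.0, 25.5, 20.1, 24.3, 21.8]  # hypothetical volumes
infarct_control = [30.2, 28.7, 33.1, 29.5, 31.4]
print(round(cohens_d(infarct_treated, infarct_control), 2))  # → -4.03
```

A single-laboratory estimate of this quantity can be far from the population value; the study's point is that pooling a few laboratories improves coverage of the true effect without larger samples.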

  7. Serous tubal intraepithelial carcinoma: diagnostic reproducibility and its implications.

    Science.gov (United States)

    Carlson, Joseph W; Jarboe, Elke A; Kindelberger, David; Nucci, Marisa R; Hirsch, Michelle S; Crum, Christopher P

    2010-07-01

    Serous tubal intraepithelial carcinoma (STIC) is detected in between 5% and 7% of women undergoing risk-reduction salpingo-oophorectomy for mutations in the BRCA1 or 2 genes (BRCA+), and seems to play a role in the pathogenesis of many ovarian and "primary peritoneal" serous carcinomas. The recognition of STIC is germane to the management of BRCA+ women; however, the diagnostic reproducibility of STIC is unknown. Twenty-one cases were selected and classified as STIC or benign, using both hematoxylin and eosin and immunohistochemical stains for p53 and MIB-1. Digital images of 30 hematoxylin and eosin-stained STICs (n=14) or benign tubal epithelium (n=16) were photographed and randomized for blind digital review in a PowerPoint format by 6 experienced gynecologic pathologists and 6 pathology trainees. A generalized kappa statistic for multiple raters was calculated for all groups. For all reviewers, the kappa was 0.333, indicating poor reproducibility; kappa was 0.453 for the experienced gynecologic pathologists (fair-to-good reproducibility), and kappa=0.253 for the pathology residents (poor reproducibility). In the experienced group, 3 of 14 STICs were diagnosed by all 6 reviewers, and 9 of 14 by a majority of the reviewers. These results show that interobserver concordance in the recognition of STIC in high-quality digital images is at best fair-to-good for even experienced gynecologic pathologists, and a proportion cannot be consistently identified even among experienced observers. In view of these findings, a diagnosis of STIC should be corroborated by a second pathologist, if feasible.
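The "generalized kappa statistic for multiple raters" used above is typically Fleiss' kappa. As a minimal sketch, with invented counts for six raters classifying five slides as STIC vs. benign (not the study's data):

```python
# Sketch: Fleiss' kappa for many raters and categorical ratings
# (hypothetical counts, for illustration only).
def fleiss_kappa(rating_counts):
    """rating_counts[i][j] = number of raters placing subject i in category j."""
    n_subjects = len(rating_counts)
    n_raters = sum(rating_counts[0])
    n_categories = len(rating_counts[0])
    # Per-subject agreement: agreeing rater pairs over all rater pairs.
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in rating_counts]
    p_bar = sum(p_i) / n_subjects
    # Chance agreement from overall category proportions.
    p_j = [sum(row[j] for row in rating_counts) / (n_subjects * n_raters)
           for j in range(n_categories)]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

# Columns: [called STIC, called benign]; rows: five hypothetical slides.
counts = [[6, 0], [5, 1], [3, 3], [0, 6], [2, 4]]
print(round(fleiss_kappa(counts), 3))  # → 0.411
```

A value near 0.41, like the study's 0.453 for experienced pathologists, falls in the fair-to-good band of the conventional interpretation scale.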

  8. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    Science.gov (United States)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and non-stationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm²). This configuration was considered conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the cavitation spectrum integration on a logarithmic scale, and the excitation power is automatically corrected. The device generates stable and reproducible cavitation levels for a wide range of cavitation setpoints, from stable cavitation up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is less than 200 ms. The cavitation regulation device was evaluated in terms of the chemical effects of bubble collapse: hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power; in closed loop, by contrast, reproducibility is high. This device was also used to study sonodynamic effects. The regulation provides more reproducible results independent of cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern the internalization of particles (Quantum Dots), molecules (siRNA), or plasmids (GFP, DsRed) into different

  9. Composting in small laboratory pilots: performance and reproducibility.

    Science.gov (United States)

    Lashermes, G; Barriuso, E; Le Villio-Poitrenaud, M; Houot, S

    2012-02-01

    Small-scale reactors are widely used in composting research, but few attempts have assessed the performance of composting considering the transformations of organic matter. Moreover, composting at small scales is often performed by imposing a fixed temperature, thus creating artificial conditions, and the reproducibility of composting has rarely been reported. The objectives of this study are to design an innovative small-scale composting device that safeguards self-heating to drive the composting process, and to assess the performance and reproducibility of composting in small-scale pilots. The experimental setup included six 4-l reactors used for composting a mixture of sewage sludge and green wastes. The performance of the process was assessed by monitoring the temperature, O(2) consumption and CO(2) emissions, and characterising the biochemical evolution of organic matter. Good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for lignin degradation, which was lower than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose this experimental device for research requiring a mass reduction of the initial composted waste mixtures. Copyright © 2011 Elsevier Ltd. All rights reserved.
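    The between-replicate reproducibility criterion above (coefficients of variation generally below 19%) is straightforward to compute; a sketch with hypothetical values for six replicate reactors:

    ```python
    from statistics import mean, stdev

    def coefficient_of_variation(values):
        """Percent coefficient of variation across replicate measurements."""
        return 100.0 * stdev(values) / mean(values)

    # Hypothetical total-organic-matter losses (%) in six replicate reactors
    tom_loss = [44.0, 47.5, 46.2, 45.1, 48.0, 45.2]
    cv = coefficient_of_variation(tom_loss)   # ≈ 3.3%, well under the 19% bound
    ```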

  10. Reproducibility of Computer-Aided Detection Marks in Digital Mammography

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Kim, Sun Mi; Im, Jung Gi; Cha, Joo Hee

    2007-01-01

    To evaluate the performance and reproducibility of a computer-aided detection (CAD) system in mediolateral oblique (MLO) digital mammograms taken serially, without release of breast compression. A CAD system was applied preoperatively to full-field digital mammograms of two MLO views taken without release of breast compression in 82 patients (age range: 33-83 years; mean age: 49 years) with previously diagnosed breast cancers. The total number of visible lesion components in the 82 patients was 101: 66 masses and 35 microcalcifications. We analyzed the sensitivity and reproducibility of the CAD marks. The sensitivity of the CAD system for first MLO views was 71% (47/66) for masses and 80% (28/35) for microcalcifications. The sensitivity of the CAD system for second MLO views was 68% (45/66) for masses and 17% (6/35) for microcalcifications. In 84 ipsilateral serial MLO image sets (two patients had bilateral cancers), identical images, regardless of the existence of CAD marks, were obtained for 35% (29/84), and identical images with CAD marks were obtained for 29% (23/78). For contralateral MLO images, identical images regardless of the existence of CAD marks were obtained for 65% (52/80), and identical images with CAD marks for 28% (11/39). The reproducibility of CAD marks for true positive masses in serial MLO views was 84% (42/50), and that for true positive microcalcifications was 0% (0/34). The CAD system in digital mammograms showed a high sensitivity for detecting masses and microcalcifications. However, the reproducibility of microcalcification marks was very low in MLO views taken serially without release of breast compression. Minute positional changes and patient movement can alter the images and have a significant effect on the algorithm utilized by the CAD system for detecting microcalcifications.

  11. LHC Orbit Correction Reproducibility and Related Machine Protection

    OpenAIRE

    Baer, T; Fuchsberger, K; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real-time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the...

  12. Reproducibility of urinary biomarkers in multiple 24-h urine samples.

    Science.gov (United States)

    Sun, Qi; Bertrand, Kimberly A; Franke, Adrian A; Rosner, Bernard; Curhan, Gary C; Willett, Walter C

    2017-01-01

    Limited knowledge regarding the reproducibility of biomarkers in 24-h urine samples has hindered the collection and use of the samples in epidemiologic studies. We aimed to evaluate the reproducibility of various markers in repeat 24-h urine samples. We calculated intraclass correlation coefficients (ICCs) of biomarkers measured in 24-h urine samples collected from 3168 participants in the NHS (Nurses' Health Study), NHSII (Nurses' Health Study II), and Health Professionals Follow-Up Study. In 742 women with 4 samples each collected over the course of 1 y, ICCs for sodium were 0.32 in the NHS and 0.34 in the NHSII. In 2439 men and women with 2 samples each collected over 1 wk to ≥1 mo, the ICCs ranged from 0.33 to 0.68 for sodium at various intervals between collections. The urinary excretion of potassium, calcium, magnesium, phosphate, sulfate, and other urinary markers showed generally higher reproducibility (ICCs >0.4). In 47 women with two 24-h urine samples, ICCs ranged from 0.15 (catechin) to 0.75 (enterolactone) for polyphenol metabolites. For phthalates, ICCs were generally ≤0.26 except for monobenzyl phthalate (ICC: 0.55), whereas the ICC was 0.39 for bisphenol A (BPA). We further estimated that, for the large majority of the biomarkers, the mean of three 24-h urine samples could provide a correlation of ≥0.8 with true long-term urinary excretion. These data suggest that the urinary excretion of various biomarkers, such as minerals, electrolytes, most polyphenols, and BPA, is reasonably reproducible in 24-h urine samples that are collected within a few days or ≤1 y. Our findings show that three 24-h samples are sufficient for the measurement of long-term exposure status in epidemiologic studies. © 2017 American Society for Nutrition.
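    The estimate that the mean of three 24-h samples correlates ≥0.8 with true long-term excretion follows from the standard Spearman-Brown attenuation argument; a sketch under that assumption:

    ```python
    from math import sqrt

    def replicate_mean_vs_truth(icc, k):
        """Expected correlation between the mean of k replicate measurements
        and the true long-term value, given the single-measurement ICC
        (square root of the Spearman-Brown prophecy formula)."""
        return sqrt(k * icc / (1 + (k - 1) * icc))

    # With ICC = 0.4 (the paper's rough threshold), three samples already reach 0.8
    print(round(replicate_mean_vs_truth(0.4, 3), 3))
    ```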

  13. Multi-parametric neuroimaging reproducibility: a 3-T resource study.

    Science.gov (United States)

    Landman, Bennett A; Huang, Alan J; Gifford, Aliya; Vikram, Deepti S; Lim, Issel Anne L; Farrell, Jonathan A D; Bogovic, John A; Hua, Jun; Chen, Min; Jarso, Samson; Smith, Seth A; Joel, Suresh; Mori, Susumu; Pekar, James J; Barker, Peter B; Prince, Jerry L; van Zijl, Peter C M

    2011-02-14

    Modern MRI image processing methods have yielded quantitative, morphometric, functional, and structural assessments of the human brain. These analyses typically exploit carefully optimized protocols for specific imaging targets. Algorithm investigators have several excellent public data resources to use to test, develop, and optimize their methods. Recently, there has been an increasing focus on combining MRI protocols in multi-parametric studies. Notably, these have included innovative approaches for fusing connectivity inferences with functional and/or anatomical characterizations. Yet, validation of the reproducibility of these interesting and novel methods has been severely hampered by the limited availability of appropriate multi-parametric data. We present an imaging protocol optimized to include state-of-the-art assessment of brain function, structure, micro-architecture, and quantitative parameters within a clinically feasible 60-min protocol on a 3-T MRI scanner. We present scan-rescan reproducibility of these imaging contrasts based on 21 healthy volunteers (11 M/10 F, 22-61 years old). The cortical gray matter, cortical white matter, ventricular cerebrospinal fluid, thalamus, putamen, caudate, cerebellar gray matter, cerebellar white matter, and brainstem were identified with mean volume-wise reproducibility of 3.5%. We tabulate the mean intensity, variability, and reproducibility of each contrast in a region-of-interest approach, which is essential for prospective study planning and retrospective power analysis considerations. Anatomy was highly consistent on structural acquisition (~1-5% variability), while variation on diffusion and several other quantitative scans was higher. Copyright © 2010 Elsevier Inc. All rights reserved.

  14. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    Science.gov (United States)

    2017-08-09

    metabolites with percent standard deviation Cramer-Rao lower bounds ≤20% were included in statistical analyses. One subject’s MRI#1 and one sub...relative to the mean as it is calculated as the standard deviation normalized by the average between visits. MRD provides information about the...inherent technical and physiological consistency of these measurements. This longitudinal study examined the variance and reproducibility of commonly

  15. Intra- and inter-examiner reproducibility of manual probing depth

    OpenAIRE

    Andrade,Roberto; Espinoza,Manuel; Gómez,Elena Maria; Rolando Espinoza,José; Cruz,Elizabeth

    2012-01-01

    The periodontal probe remains the best clinical diagnostic tool for the collection of information regarding the health status and the attachment level of periodontal tissues. The aim of this study was to evaluate intra- and inter-examiner reproducibility of probing depth (PD) measurements made with a manual probe. With the approval of an Ethics Committee, 20 individuals without periodontal disease were selected if they presented at least 6 teeth per quadrant. Using a Williams periodontal prob...

  16. Social Cognition, Social Skill, and the Broad Autism Phenotype

    Science.gov (United States)

    Sasson, Noah J.; Nowlin, Rachel B.; Pinkham, Amy E.

    2013-01-01

    Social-cognitive deficits differentiate parents with the "broad autism phenotype" from non-broad autism phenotype parents more robustly than other neuropsychological features of autism, suggesting that this domain may be particularly informative for identifying genetic and brain processes associated with the phenotype. The current study…

  17. Development of a Broad-Spectrum Antiviral Agent with Activity ...

    African Journals Online (AJOL)

    Development of a Broad-Spectrum Antiviral Agent with Activity Against Herpesvirus Replication and Gene Expression. ... Tropical Journal of Pharmaceutical Research ... Purpose: To evaluate the broad-spectrum antiviral activity of peptide H9 (H9) in vitro in order to gain insight into its underlying molecular mechanisms.

  18. A Cointegration And Error Correction Approach To Broad Money ...

    African Journals Online (AJOL)

    This study considered the stability of broad money demand function in Nigeria using data for 1970 to 2004. The study applied the Cointegration and error correction approach The Johansen Cointegration test shows that long run equilibrium relationship exists between broad money demand and its determinants. While the ...

  19. Validity and reproducibility of a Spanish dietary history.

    Directory of Open Access Journals (Sweden)

    Pilar Guallar-Castillón

    Full Text Available To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E made one year apart. The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients.
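    The validity figures above are plain Pearson correlations between the diet-history estimates and the mean of the seven 24-hour recalls; a self-contained sketch:

    ```python
    from math import sqrt

    def pearson_r(x, y):
        """Pearson correlation coefficient between paired series, e.g. diet-history
        intakes vs. the mean of repeated 24-hour recalls per participant."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / sqrt(sxx * syy)

    # Hypothetical intakes (g/day): diet history vs. mean of the recalls
    dh = [210.0, 180.0, 250.0, 160.0, 230.0]
    recalls = [200.0, 190.0, 240.0, 150.0, 220.0]
    r = pearson_r(dh, recalls)
    ```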

  20. The Reproducibility of Nuclear Morphometric Measurements in Invasive Breast Carcinoma

    Directory of Open Access Journals (Sweden)

    Pauliina Kronqvist

    1997-01-01

    Full Text Available The intraobserver and interobserver reproducibility of computerized nuclear morphometry was determined in repeated measurements of 212 samples of invasive breast cancer. The influence of biological variation and the selection of the measurement area was also tested. Morphometrically determined mean nuclear profile area (Pearson’s r 0.89, grading efficiency (GE) 0.95) and standard deviation (SD) of nuclear profile area (Pearson’s r 0.84, GE 0.89) showed high reproducibility. In this respect, nuclear morphometry equals other established methods of quantitative pathology and exceeds the results of subjective grading of nuclear atypia in invasive breast cancer. A training period of eight days was sufficient to produce a clear improvement in the consistency of nuclear morphometry results. By estimating the sources of variation, it could be shown that the variation associated with the measurement procedure itself is small. Instead, sample-associated variation is responsible for the majority of the variation in the measurements (82.9% in mean nuclear profile area and 65.9% in SD of nuclear profile area). This study points out that, when standardized methods are applied, computerized morphometry is a reproducible and reliable method of assessing nuclear atypia in invasive breast cancer. For further improvement, special emphasis should be put on the rules for selecting the microscope fields and measurement areas.

  1. A Pragmatic Approach for Reproducible Research With Sensitive Data.

    Science.gov (United States)

    Shepherd, Bryan E; Blevins Peratikos, Meridith; Rebeiro, Peter F; Duda, Stephany N; McGowan, Catherine C

    2017-08-15

    Reproducible research is important for assessing the integrity of findings and disseminating methods, but it requires making original study data sets publicly available. This requirement is difficult to meet in settings with sensitive data, which can mean that resulting studies are not reproducible. For studies in which data cannot be shared, we propose a pragmatic approach to make research quasi-reproducible. On a publicly available website without restriction, researchers should post 1) analysis code used in the published study, 2) simulated data, and 3) results obtained by applying the analysis code used in the published study to the simulated data. Although it is not a perfect solution, such an approach makes analyses transparent for critical evaluation and dissemination and is therefore a significant improvement over current practice. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. Aveiro method in reproducing kernel Hilbert spaces under complete dictionary

    Science.gov (United States)

    Mai, Weixiong; Qian, Tao

    2017-12-01

    Aveiro Method is a sparse representation method in reproducing kernel Hilbert spaces (RKHS) that gives orthogonal projections in linear combinations of reproducing kernels over uniqueness sets. It, however, suffers from the determination of uniqueness sets in the underlying RKHS. In fact, in general spaces, uniqueness sets are not easy to identify, let alone the convergence speed aspect of the Aveiro Method. To avoid those difficulties we propose a new Aveiro Method based on a dictionary and the matching pursuit idea. In fact, we do more: the new Aveiro Method is related to the recently proposed Pre-Orthogonal Greedy Algorithm (P-OGA), involving completion of a given dictionary. The new method is called Aveiro Method Under Complete Dictionary (AMUCD). The complete dictionary consists of all directional derivatives of the underlying reproducing kernels. We show that, under the boundary vanishing condition, which holds for the classical Hardy and Paley-Wiener spaces, the complete dictionary enables an efficient expansion of any given element in the Hilbert space. The proposed method reveals new and advanced aspects of both the Aveiro Method and the greedy algorithm.

  3. Dosimetric Algorithm to Reproduce Isodose Curves Obtained from a LINAC

    Directory of Open Access Journals (Sweden)

    Julio Cesar Estrada Espinosa

    2014-01-01

    Full Text Available In this work isodose curves are obtained by the use of a new dosimetric algorithm using numerical data from percentage depth dose (PDD) and the maximum absorbed dose profile, calculated by Monte Carlo for an 18 MV LINAC. The software allows reproducing the absorbed dose percentage in the whole irradiated volume quickly and with a good approximation. To validate the results, the full geometry of an 18 MV LINAC and a water phantom were modeled. On this model, the simulations were run with the MCNPX code to obtain the PDD and profiles at all depths of the radiation beam. The resulting data were used by the code to produce the dose percentages at any point of the irradiated volume. The absorbed dose for any voxel size was also reproduced at any point of the irradiated volume, even when the voxels are considered to be of a pixel’s size. The dosimetric algorithm is able to reproduce the absorbed dose induced by a radiation beam over a water phantom, considering PDD and profiles, whose maximum percent value is in the build-up region. Calculation time for the algorithm is only a few seconds, compared with the days taken when it is carried out by Monte Carlo.
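    A core step in such an algorithm is interpolating the tabulated PDD to an arbitrary depth; a minimal sketch (the PDD table is illustrative, not the paper's Monte Carlo data):

    ```python
    def dose_percent(pdd_table, depth):
        """Linearly interpolate the percentage depth dose (PDD) at a given
        depth from tabulated (depth_cm, percent) pairs."""
        pts = sorted(pdd_table)
        for (d0, p0), (d1, p1) in zip(pts, pts[1:]):
            if d0 <= depth <= d1:
                return p0 + (p1 - p0) * (depth - d0) / (d1 - d0)
        raise ValueError("depth outside tabulated range")

    # Illustrative 18 MV-like PDD: maximum dose ~3.5 cm deep (build-up region)
    pdd = [(0.0, 30.0), (3.5, 100.0), (10.0, 80.0), (20.0, 55.0)]
    print(dose_percent(pdd, 3.5))   # depth of maximum dose -> 100.0
    ```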

  4. Sex is over-rated: on the right to reproduce.

    Science.gov (United States)

    Cutas, Daniela

    2009-03-01

    In this article, I will show that what is respected most in human reproduction and parenting is not a right to reproduce in the way in which this right is explicitly proposed. The only way in which people can become, and function as, parents without having to submit themselves to anyone else's judgements and decisions is by having reproductive sex. Whatever one's intentions, social status, standard of living, income, etc., so long as assistance is not required, that person's reproductive decisions will not be interfered with in any way, at least not until neglect or abuse of their offspring becomes known. Moreover, none of the features that are said to back the right to reproduce (such as bodily integrity or personal autonomy) can justify one's unquestioned access to a relationship with another who is unable to consent (the child). This indicates that the discourse in terms of the right to reproduce, as currently used to justify non-interference with natural reproduction and parenting coupled with the regulation of assisted forms of reproduction and parenting, is at best self-deluding, and that all it protects is people's freedom to have reproductive sex and handle the consequences.

  5. Reproducibility and validity of self-perceived oral health conditions.

    Science.gov (United States)

    Pinelli, Camila; de Castro Monteiro Loffredo, Leonor

    2007-12-01

    The reproducibility and validity of self-perceived periodontal, dental, and temporomandibular joint (TMJ) conditions were investigated. A questionnaire was applied in interview to 200 adults aged from 35 to 44, who were attending as casual patients at Araraquara School of Dentistry, São Paulo State University, São Paulo, Brazil. Clinical examination was based on the guidelines of the World Health Organization manual. The interview and the clinical examination were performed in two occasions, by a calibrated examiner. Reproducibility and validity were, respectively, verified by kappa statistics (kappa) and sensitivity (Sen) and specificity (Spec) values, having clinical examination as the validation criterion. The results showed an almost perfect agreement for self-perceived TMJ (kappa = 0.85) and periodontal conditions (kappa = 0.81), and it was substantial for dental condition (kappa = 0.69). Reproducibility according to clinical examination showed good results (kappa = 0.73 for CPI index, kappa = 0.96 for dental caries, and kappa = 0.74 for TMJ conditions). Sensitivity and specificity values were higher for self-perceived dental (Sen = 0.84, Spec = 1.0) and TMJ conditions (Sen = 1.0, Spec = 0.8). With regard to periodontal condition, specificity was low (0.43), although sensitivity was very high (1.0). Self-perceived oral health was reliable for the examined conditions. Validity was good to detect dental conditions and TMJ disorders, and it was more sensitive than specific to detect the presence of periodontal disease.
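    Sensitivity, specificity, and kappa against the clinical gold standard can all be read off a 2x2 agreement table; a sketch with hypothetical counts (a = both positive, b = self-report positive/exam negative, c = self-report negative/exam positive, d = both negative):

    ```python
    def sensitivity_specificity(a, b, c, d):
        """Sensitivity and specificity of self-report vs. clinical examination."""
        return a / (a + c), d / (b + d)

    def cohen_kappa(a, b, c, d):
        """Cohen's kappa for the same 2x2 agreement table."""
        n = a + b + c + d
        po = (a + d) / n                                        # observed agreement
        pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2   # chance agreement
        return (po - pe) / (1 - pe)

    sens, spec = sensitivity_specificity(40, 5, 5, 50)   # 0.888..., 0.909...
    kappa = cohen_kappa(40, 5, 5, 50)                    # ≈ 0.80, "substantial"
    ```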

  6. Reproducibility of gene expression across generations of Affymetrix microarrays

    Directory of Open Access Journals (Sweden)

    Haslett Judith N

    2003-06-01

    Full Text Available Abstract Background The development of large-scale gene expression profiling technologies is rapidly changing the norms of biological investigation. But the rapid pace of change itself presents challenges. Commercial microarrays are regularly modified to incorporate new genes and improved target sequences. Although the ability to compare datasets across generations is crucial for any long-term research project, to date no means to allow such comparisons have been developed. In this study the reproducibility of gene expression levels across two generations of Affymetrix GeneChips® (HuGeneFL and HG-U95A) was measured. Results Correlation coefficients were computed for gene expression values across chip generations based on different measures of similarity. Comparing the absolute calls assigned to the individual probe sets across the generations found them to be largely unchanged. Conclusion We show that experimental replicates are highly reproducible, but that reproducibility across generations depends on the degree of similarity of the probe sets and the expression level of the corresponding transcript.

  7. Hopping models for ion conduction in noncrystals

    DEFF Research Database (Denmark)

    Dyre, Jeppe; Schrøder, Thomas

    2007-01-01

    semiconductors). These universalities are the subject of much current interest, for instance interpreted in the context of simple hopping models. In the present paper we first discuss the temperature dependence of the dc conductivity in hopping models and the importance of the percolation phenomenon. Next..., the experimental (quasi)universality of the ac conductivity is discussed. It is shown that hopping models are able to reproduce the experimental finding that the response obeys time-temperature superposition, while at the same time a broad range of activation energies is involved in the conduction process. Again...

  8. The Rapid Reproducers Paradox: Population Control and Individual Procreative Rights

    NARCIS (Netherlands)

    Wissenburg, M.L.J.

    1998-01-01

    In this article, I consider the impact of population policies on individual rights (in a very broad sense of the word), a topic that has received disproportionately little attention in debates on the legitimacy of population rights. I first concentrate on arguments in favour of very radical

  9. Analysis of mammalian gene function through broad based phenotypic screens across a consortium of mouse clinics

    Science.gov (United States)

    Adams, David J; Adams, Niels C; Adler, Thure; Aguilar-Pimentel, Antonio; Ali-Hadji, Dalila; Amann, Gregory; André, Philippe; Atkins, Sarah; Auburtin, Aurelie; Ayadi, Abdel; Becker, Julien; Becker, Lore; Bedu, Elodie; Bekeredjian, Raffi; Birling, Marie-Christine; Blake, Andrew; Bottomley, Joanna; Bowl, Mike; Brault, Véronique; Busch, Dirk H; Bussell, James N; Calzada-Wack, Julia; Cater, Heather; Champy, Marie-France; Charles, Philippe; Chevalier, Claire; Chiani, Francesco; Codner, Gemma F; Combe, Roy; Cox, Roger; Dalloneau, Emilie; Dierich, André; Di Fenza, Armida; Doe, Brendan; Duchon, Arnaud; Eickelberg, Oliver; Esapa, Chris T; El Fertak, Lahcen; Feigel, Tanja; Emelyanova, Irina; Estabel, Jeanne; Favor, Jack; Flenniken, Ann; Gambadoro, Alessia; Garrett, Lilian; Gates, Hilary; Gerdin, Anna-Karin; Gkoutos, George; Greenaway, Simon; Glasl, Lisa; Goetz, Patrice; Da Cruz, Isabelle Goncalves; Götz, Alexander; Graw, Jochen; Guimond, Alain; Hans, Wolfgang; Hicks, Geoff; Hölter, Sabine M; Höfler, Heinz; Hancock, John M; Hoehndorf, Robert; Hough, Tertius; Houghton, Richard; Hurt, Anja; Ivandic, Boris; Jacobs, Hughes; Jacquot, Sylvie; Jones, Nora; Karp, Natasha A; Katus, Hugo A; Kitchen, Sharon; Klein-Rodewald, Tanja; Klingenspor, Martin; Klopstock, Thomas; Lalanne, Valerie; Leblanc, Sophie; Lengger, Christoph; le Marchand, Elise; Ludwig, Tonia; Lux, Aline; McKerlie, Colin; Maier, Holger; Mandel, Jean-Louis; Marschall, Susan; Mark, Manuel; Melvin, David G; Meziane, Hamid; Micklich, Kateryna; Mittelhauser, Christophe; Monassier, Laurent; Moulaert, David; Muller, Stéphanie; Naton, Beatrix; Neff, Frauke; Nolan, Patrick M; Nutter, Lauryl MJ; Ollert, Markus; Pavlovic, Guillaume; Pellegata, Natalia S; Peter, Emilie; Petit-Demoulière, Benoit; Pickard, Amanda; Podrini, Christine; Potter, Paul; Pouilly, Laurent; Puk, Oliver; Richardson, David; Rousseau, Stephane; Quintanilla-Fend, Leticia; Quwailid, Mohamed M; Racz, Ildiko; Rathkolb, Birgit; Riet, Fabrice; Rossant, Janet; Roux, 
Michel; Rozman, Jan; Ryder, Ed; Salisbury, Jennifer; Santos, Luis; Schäble, Karl-Heinz; Schiller, Evelyn; Schrewe, Anja; Schulz, Holger; Steinkamp, Ralf; Simon, Michelle; Stewart, Michelle; Stöger, Claudia; Stöger, Tobias; Sun, Minxuan; Sunter, David; Teboul, Lydia; Tilly, Isabelle; Tocchini-Valentini, Glauco P; Tost, Monica; Treise, Irina; Vasseur, Laurent; Velot, Emilie; Vogt-Weisenhorn, Daniela; Wagner, Christelle; Walling, Alison; Weber, Bruno; Wendling, Olivia; Westerberg, Henrik; Willershäuser, Monja; Wolf, Eckhard; Wolter, Anne; Wood, Joe; Wurst, Wolfgang; Yildirim, Ali Önder; Zeh, Ramona; Zimmer, Andreas; Zimprich, Annemarie

    2015-01-01

    The function of the majority of genes in the mouse and human genomes remains unknown. The mouse ES cell knockout resource provides a basis for characterisation of relationships between gene and phenotype. The EUMODIC consortium developed and validated robust methodologies for broad-based phenotyping of knockouts through a pipeline comprising 20 disease-orientated platforms. We developed novel statistical methods for pipeline design and data analysis aimed at detecting reproducible phenotypes with high power. We acquired phenotype data from 449 mutant alleles, representing 320 unique genes, of which half had no prior functional annotation. We captured data from over 27,000 mice finding that 83% of the mutant lines are phenodeviant, with 65% demonstrating pleiotropy. Surprisingly, we found significant differences in phenotype annotation according to zygosity. Novel phenotypes were uncovered for many genes with unknown function providing a powerful basis for hypothesis generation and further investigation in diverse systems. PMID:26214591

  10. Cervical vertebrae maturation method morphologic criteria: poor reproducibility.

    Science.gov (United States)

    Nestman, Trenton S; Marshall, Steven D; Qian, Fang; Holton, Nathan; Franciscus, Robert G; Southard, Thomas E

    2011-08-01

    The cervical vertebrae maturation (CVM) method has been advocated as a predictor of peak mandibular growth. A careful review of the literature showed potential methodologic errors that might influence the high reported reproducibility of the CVM method, and we recently established that the reproducibility of the CVM method was poor when these potential errors were eliminated. The purpose of this study was to further investigate the reproducibility of the individual vertebral patterns. In other words, the purpose was to determine which of the individual CVM vertebral patterns could be classified reliably and which could not. Ten practicing orthodontists, trained in the CVM method, evaluated the morphology of cervical vertebrae C2 through C4 from 30 cephalometric radiographs using questions based on the CVM method. The Fleiss kappa statistic was used to assess interobserver agreement when evaluating each cervical vertebrae morphology question for each subject. The Kendall coefficient of concordance was used to assess the level of interobserver agreement when determining a "derived CVM stage" for each subject. Interobserver agreement was high for assessment of the lower borders of C2, C3, and C4 that were either flat or curved in the CVM method, but interobserver agreement was low for assessment of the vertebral bodies of C3 and C4 when they were either trapezoidal, rectangular horizontal, square, or rectangular vertical; this led to the overall poor reproducibility of the CVM method. These findings were reflected in the Fleiss kappa statistic. Furthermore, nearly 30% of the time, individual morphologic criteria could not be combined to generate a final CVM stage because of incompatible responses to the 5 questions. Intraobserver agreement in this study was only 62%, on average, when the inconclusive stagings were excluded as disagreements. Intraobserver agreement was worse (44%) when the inconclusive stagings were included as disagreements. 
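The agreement figures above rest on the Fleiss kappa statistic for multiple raters. As a minimal illustrative sketch (hypothetical rating counts, not the study's data), Fleiss' kappa for a subjects-by-categories table of rating counts can be computed as:

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for a subjects-by-categories matrix of rating counts.

    counts[i, j] = number of raters assigning subject i to category j;
    every row must sum to the same number of raters n.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]                                  # raters per subject
    p_j = counts.sum(axis=0) / counts.sum()                    # category proportions
    P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))  # per-subject agreement
    P_bar = P_i.mean()                                         # observed agreement
    P_e = np.square(p_j).sum()                                 # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# 3 subjects rated by 4 raters into 2 categories (e.g. lower border flat/curved)
ratings = [[4, 0],   # unanimous
           [2, 2],   # split
           [0, 4]]   # unanimous
print(round(fleiss_kappa(ratings), 3))  # → 0.556
```

A kappa near 1 indicates agreement well above chance; in the study above, the vertebral-body shape questions produced much lower values than the flat/curved lower-border questions.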

  11. Inter-examiner reproducibility of tests for lumbar motor control

    Directory of Open Access Journals (Sweden)

    Elkjaer Arne

    2011-05-01

    Full Text Available Abstract Background Many studies show a relation between reduced lumbar motor control (LMC) and low back pain (LBP). However, test circumstances vary and during test performance, subjects may change position. In other words, the reliability - i.e. reproducibility and validity - of tests for LMC should be based on quantitative data. This has not been considered before. The aim was to analyse the reproducibility of five different quantitative tests for LMC commonly used in daily clinical practice. Methods The five tests for LMC were: repositioning (RPS), sitting forward lean (SFL), sitting knee extension (SKE), and bent knee fall out (BKFO), all measured in cm, and leg lowering (LL), measured in mm Hg. A total of 40 subjects (14 males, 26 females; 25 with and 15 without LBP), with a mean age of 46.5 years (SD 14.8), were examined independently and in random order by two examiners on the same day. LBP subjects were recruited from three physiotherapy clinics with a connection to the clinic's gym or back-school. Non-LBP subjects were recruited from the clinic's staff, acquaintances, and patients without LBP. Results The means and standard deviations for each of the tests were 0.36 (0.27) cm for RPS, 1.01 (0.62) cm for SFL, 0.40 (0.29) cm for SKE, 1.07 (0.52) cm for BKFO, and 32.9 (7.1) mm Hg for LL. All five tests for LMC had high reproducibility, with the following ICCs: 0.90 for RPS, 0.96 for SFL, 0.96 for SKE, 0.94 for BKFO, and 0.98 for LL. Bland and Altman plots showed that most of the differences between examiners A and B were less than 0.20 cm. Conclusion These five tests for LMC displayed excellent reproducibility. However, the diagnostic accuracy of these tests needs to be addressed in larger cohorts of subjects, establishing values for the normal population. Also cut-points between subjects with and without LBP must be determined, taking into account age, level of activity, degree of impairment and participation in sports.
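The inter-examiner ICCs above come from a two-way ANOVA decomposition. A minimal sketch of an absolute-agreement ICC(2,1) for a subjects-by-examiners table follows (hypothetical measurements; the abstract does not state which ICC model the authors used):

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.

    data: n subjects x k raters matrix of measurements (e.g. cm for BKFO).
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ms_r = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_c = n * ((data.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    resid = data - data.mean(axis=1, keepdims=True) - data.mean(axis=0) + grand
    ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))                # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Two examiners measuring the same four subjects (cm); identical readings
# give an ICC of exactly 1.0.
print(icc_2_1([[0.3, 0.3], [0.5, 0.5], [0.9, 0.9], [1.2, 1.2]]))  # → 1.0
```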

  12. Improving reproducibility and external validity. The role of standardization and data reporting of laboratory rat husbandry and housing.

    Science.gov (United States)

    Fontoura-Andrade, José Luiz; Amorim, Rivadávio Fernandes Batista de; Sousa, João Batista de

    2017-03-01

    To identify the most relevant flaws in the standardization of husbandry practices and the lack of transparency in reporting them. This review proposes measures to improve transparency, reproducibility and, eventually, external validity in experimental surgery studies using the rat model. We performed a search of scientific articles in the PubMed database. The survey was conducted from August 2016 to January 2017. The keywords used were "reproducibility", "external validity", "rat model", "rat husbandry", and "rat housing", and the time frame was up to January 2017. Articles were discarded if the abstract or the keywords did not imply that the authors would discuss any relationship of husbandry and housing with the reproducibility and transparency of reporting animal experiments. Reviews and papers that specifically discussed reproducibility and transparency of data reporting were thoroughly explored, including their references, for other articles that could fulfil the inclusion criteria. A total of 246 articles were initially found, but only 44 were selected. Lack of transparency is the rule and not the exception when reporting results with the rat model. This results in poor reproducibility and low external validity, with the consequence of considerable loss of time and financial resources. There is still much to be done to improve the compliance and adherence of researchers, editors and reviewers in adopting guidelines to mitigate some of the challenges that can impair reproducibility and external validity. Authors and reviewers should avoid the pitfalls of absent, insufficient or inaccurate description of relevant information about the rat model used. This information should be correctly published or reported in another source easily available to readers. Environmental conditions are well known by laboratory animal personnel and are well controlled in housing facilities, but usually neglected in experimental laboratories when the rat model is a novelty for the researcher.

  13. Batch-batch stable microbial community in the traditional fermentation process of huyumei broad bean pastes.

    Science.gov (United States)

    Zhu, Linjiang; Fan, Zihao; Kuai, Hui; Li, Qi

    2017-09-01

    During natural fermentation processes, a characteristic microbial community structure (MCS) is naturally formed, and it is interesting to know about its batch-to-batch stability. This issue was explored in a traditional semi-solid-state fermentation process of huyumei, a Chinese broad bean paste product. The results showed that this MCS mainly contained four aerobic Bacillus species (8 log CFU per g), including B. subtilis, B. amyloliquefaciens, B. methylotrophicus, and B. tequilensis, and the facultative anaerobe B. cereus at a low concentration (4 log CFU per g), besides a very small amount of the yeast Zygosaccharomyces rouxii (2 log CFU per g). The dynamic change of the MCS during the brine fermentation process showed that the abundance of dominant species varied within a small range, and at the beginning of the process the growth of lactic acid bacteria was inhibited and Staphylococcus spp. lost their viability. Also, the MCS and its dynamic change proved to be highly reproducible among seven batches of fermentation. Therefore, the MCS forms naturally and stably across different batches of the traditional semi-solid-state fermentation of huyumei. Revealing the microbial community structure and its batch-to-batch stability is helpful for understanding the mechanisms of community formation and flavour production in a traditional fermentation. This issue was explored here for the first time in a traditional semi-solid-state fermentation of huyumei broad bean paste. The fermentation process was revealed to be dominated by a high concentration of four aerobic species of Bacillus, a low concentration of B. cereus, and a small amount of Zygosaccharomyces rouxii. Lactic acid bacteria and Staphylococcus spp. lost their viability at the beginning of fermentation. This community structure proved to be highly reproducible among seven batches. © 2017 The Society for Applied Microbiology.

  14. Investigation of the Intra- and Interlaboratory Reproducibility of a Small Scale Standardized Supersaturation and Precipitation Method.

    Science.gov (United States)

    Plum, Jakob; Madsen, Cecilie M; Teleki, Alexandra; Bevernage, Jan; da Costa Mathews, Claudia; Karlsson, Eva M; Carlert, Sara; Holm, Rene; Müller, Thomas; Matthews, Wayne; Sayers, Alice; Ojala, Krista; Tsinsman, Konstantin; Lingamaneni, Ram; Bergström, Christel As; Rades, Thomas; Müllertz, Anette

    2017-12-04

    The high number of poorly water-soluble compounds in drug development has increased the need for enabling formulations to improve oral bioavailability. One frequently applied approach is to induce supersaturation at the absorptive site, e.g., the small intestine, increasing the amount of dissolved compound available for absorption. However, due to the stochastic nature of nucleation, supersaturating drug delivery systems may lead to inter- and intrapersonal variability. The ability to define a feasible range with respect to the supersaturation level is a crucial factor for a successful formulation. Therefore, an in vitro method is needed with which the ability of a compound to supersaturate can be defined in a reproducible way. Hence, this study investigates the reproducibility of an in vitro small scale standardized supersaturation and precipitation method (SSPM). First, an intralaboratory reproducibility study of felodipine was conducted, after which seven partners contributed data for three model compounds, aprepitant, felodipine, and fenofibrate, to determine the interlaboratory reproducibility of the SSPM. The first part of the SSPM determines the apparent degrees of supersaturation (aDS) to investigate for each compound. Each partner independently determined the maximum possible aDS and induced 100, 87.5, 75, and 50% of their determined maximum possible aDS in the SSPM. The concentration-time profile of the supersaturation and following precipitation was obtained in order to determine the induction time (t_ind) for detectable precipitation. The data showed that the absolute values of t_ind and aDS were not directly comparable between partners; however, upon linearization of the data a reproducible rank ordering of the three model compounds was obtained based on the β-value, which was defined as the slope of the ln(t_ind) versus ln(aDS)^-2 plot. Linear regression of this plot showed that aprepitant had the highest β-value, 15.1, while felodipine and
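The β-value used above for rank ordering is simply the slope of a straight-line fit. A minimal sketch of that regression (illustrative induction times and aDS values, not the study's measurements):

```python
import numpy as np

# Hypothetical induction times (s) measured at four apparent degrees of
# supersaturation (aDS); values are illustrative, not the study's data.
aDS   = np.array([10.0, 8.75, 7.5, 5.0])       # 100, 87.5, 75, 50% of max
t_ind = np.array([30.0, 80.0, 200.0, 1500.0])  # precipitation onset times

x = np.log(aDS) ** -2                   # ln(aDS)^-2
y = np.log(t_ind)                       # ln(t_ind)
beta, intercept = np.polyfit(x, y, 1)   # slope = beta-value used for ranking
print(f"beta = {beta:.1f}")
```

Compounds with a steeper slope are more sensitive to the supersaturation level, which is what makes the β-value usable as a rank-ordering metric across laboratories.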

  15. Reproducibility and Practical Adoption of GEOBIA with Open-Source Software in Docker Containers

    Directory of Open Access Journals (Sweden)

    Christian Knoth

    2017-03-01

    Full Text Available Geographic Object-Based Image Analysis (GEOBIA) mostly uses proprietary software, but the interest in Free and Open-Source Software (FOSS) for GEOBIA is growing. This interest stems not only from cost savings, but also from benefits concerning reproducibility and collaboration. Technical challenges hamper practical reproducibility, especially when multiple software packages are required to conduct an analysis. In this study, we use containerization to package a GEOBIA workflow in a well-defined FOSS environment. We explore the approach using two software stacks to perform an exemplary analysis detecting destruction of buildings in bi-temporal images of a conflict area. The analysis combines feature extraction techniques with segmentation and object-based analysis to detect changes using automatically-defined local reference values and to distinguish disappeared buildings from non-target structures. The resulting workflow is published as FOSS comprising both the model and data in a ready-to-use Docker image and a user interface for interaction with the containerized workflow. The presented solution advances GEOBIA in the following aspects: higher transparency of methodology; easier reuse and adaption of workflows; better transferability between operating systems; complete description of the software environment; and easy application of workflows by image analysis experts and non-experts. As a result, it promotes not only the reproducibility of GEOBIA, but also its practical adoption.

  16. Stable, precise, and reproducible patterning of bicoid and hunchback molecules in the early Drosophila embryo.

    Directory of Open Access Journals (Sweden)

    Yurie Okabe-Oho

    2009-08-01

    Full Text Available Precise patterning of morphogen molecules and their accurate reading out are of key importance in embryonic development. Recent experiments have visualized distributions of proteins in developing embryos and shown that the gradient of concentration of Bicoid morphogen in Drosophila embryos is established rapidly after fertilization and remains stable through syncytial mitoses. This stable Bicoid gradient is read out in a precise way to distribute Hunchback with small fluctuations in each embryo and in a reproducible way, with small embryo-to-embryo fluctuation. The mechanisms of such stable, precise, and reproducible patterning through noisy cellular processes, however, still remain mysterious. To address these issues, here we develop the one- and three-dimensional stochastic models of the early Drosophila embryo. The simulated results show that the fluctuation in expression of the hunchback gene is dominated by the random arrival of Bicoid at the hunchback enhancer. Slow diffusion of Hunchback protein, however, averages out this intense fluctuation, leading to the precise patterning of distribution of Hunchback without loss of sharpness of the boundary of its distribution. The coordinated rates of diffusion and transport of input Bicoid and output Hunchback play decisive roles in suppressing fluctuations arising from the dynamical structure change in embryos and those arising from the random diffusion of molecules, and give rise to the stable, precise, and reproducible patterning of Bicoid and Hunchback distributions.
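The suppression of readout noise by slow Hunchback diffusion described above is, at heart, an averaging effect. A toy Monte Carlo sketch (illustrative numbers only, not the paper's one- or three-dimensional models) shows how averaging Poisson-like arrival noise over m nuclei shrinks the relative fluctuation roughly as 1/sqrt(m):

```python
import numpy as np

rng = np.random.default_rng(0)

# Bicoid arrivals at a single enhancer, modeled as Poisson counts
# (hypothetical mean; the real arrival statistics are more involved).
mean_arrivals = 25.0
n_embryos = 100_000
raw = rng.poisson(mean_arrivals, size=n_embryos)
cv_raw = raw.std() / raw.mean()          # relative fluctuation, ~1/sqrt(25)

# Slow diffusion of the Hunchback output effectively averages the readout
# over a neighborhood of m nuclei.
m = 16
pooled = rng.poisson(mean_arrivals, size=(n_embryos, m)).mean(axis=1)
cv_avg = pooled.std() / pooled.mean()    # reduced by a further ~1/sqrt(m)

print(f"noise reduction factor: {cv_raw / cv_avg:.2f}")  # close to sqrt(16) = 4
```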

  17. Oxygen consumption of rats with broad intestinal resection

    Directory of Open Access Journals (Sweden)

    Luz J.

    2000-01-01

    Full Text Available The study was performed to investigate possible alterations in oxygen consumption in an animal model with broad intestinal resection. Oxygen consumption and the thermal effect of a short meal were measured in rats subjected to short bowel syndrome. Four groups of rats were used. Group I was the control group, group II was sham operated, group III was submitted to 80% jejunum-ileum resection, and group IV was submitted to 80% jejunum-ileum resection with colon interposition. Ninety days after surgery, oxygen consumption was measured over a period of 6 h with the animals fasted overnight. The thermal effect of feeding was determined in another session of oxygen consumption measurement in animals fasted for 12 h. A 12-kcal meal was then introduced into the animal chamber and oxygen consumption was measured for a further 4 h. No differences in fasting oxygen consumption or in the thermal effect of the meal were detected among the groups studied. It is concluded that short bowel syndrome does not affect the overall energy expenditure of rats.

  18. Failure of the Woods-Saxon nuclear potential to simultaneously reproduce precise fusion and elastic scattering measurements

    International Nuclear Information System (INIS)

    Mukherjee, A.; Hinde, D. J.; Dasgupta, M.; Newton, J. O.; Butt, R. D.; Hagino, K.

    2007-01-01

    A precise fusion excitation function has been measured for the 12C + 208Pb reaction at energies around the barrier, allowing the fusion barrier distribution to be extracted. The fusion cross sections at high energies differ significantly from existing fusion data. Coupled reaction channels calculations have been carried out with the code FRESCO. A bare potential previously claimed to uniquely describe a wide range of 12C + 208Pb near-barrier reaction channels failed to reproduce the new fusion data. The nuclear potential diffuseness of 0.95 fm which fits the fusion excitation function over a broad energy range fails to reproduce the elastic scattering. A diffuseness of 0.55 fm reproduces the fusion barrier distribution and elastic scattering data, but significantly overpredicts the fusion cross sections at high energies. This may be due to physical processes not included in the calculations. To constrain calculations, it is desirable to have precisely measured fusion cross sections, especially at energies around the barrier.

  19. Reproducibility of CT Perfusion Parameters in Liver Tumors and Normal Liver

    Science.gov (United States)

    Ng, Chaan S.; Chandler, Adam G.; Wei, Wei; Herron, Delise H.; Anderson, Ella F.; Kurzrock, Razelle; Charnsangavej, Chusilp

    2011-01-01

    Purpose: To assess the reproducibility of computed tomographic (CT) perfusion measurements in liver tumors and normal liver and effects of motion and data acquisition time on parameters. Materials and Methods: Institutional review board approval and written informed consent were obtained for this prospective study. The study complied with HIPAA regulations. Two CT perfusion scans were obtained 2–7 days apart in seven patients with liver tumors with two scanning phases (phase 1: 30-second breath-hold cine; phase 2: six intermittent free-breathing cines) spanning 135 seconds. Blood flow (BF), blood volume (BV), mean transit time (MTT), and permeability–surface area product (PS) for tumors and normal liver were calculated from phase 1 with and without rigid registration and, for combined phases 1 and 2, with manually and rigid-registered phase 2 images, by using deconvolution modeling. Variability was assessed with within-patient coefficients of variation (CVs) and Bland-Altman analyses. Results: For tumors, BF, BV, MTT, and PS values and reproducibility varied by analytical method, the former by up to 11%, 23%, 21%, and 138%, respectively. Median PS values doubled with the addition of phase 2 data to phase 1 data. The best overall reproducibility was obtained with rigidly registered phase 1 and phase 2 images, with within-patient CVs for BF, BV, MTT, and PS of 11.2%, 14.4%, 5.5% and 12.1%, respectively. Normal liver evaluations were similar, except with marginally lower variability. Conclusion: Absolute values and reproducibility of CT perfusion parameters were markedly influenced by motion and data acquisition time. PS, in particular, probably requires data acquisition beyond a single breath hold, for which motion-correction techniques are likely necessary. © RSNA, 2011 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11110331/-/DC1 PMID:21788525
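Variability in the study above was summarized with within-patient coefficients of variation. A minimal sketch of a root-mean-square within-subject CV for paired test-retest scans (hypothetical perfusion values, not the study's data):

```python
import numpy as np

def within_subject_cv(scan1, scan2):
    """Root-mean-square within-subject CV for two repeated measurements.

    For each patient, the SD of the two scans is |x1 - x2| / sqrt(2); the
    per-patient CVs are then pooled as a root mean square.
    """
    scan1 = np.asarray(scan1, dtype=float)
    scan2 = np.asarray(scan2, dtype=float)
    means = (scan1 + scan2) / 2
    sds = np.abs(scan1 - scan2) / np.sqrt(2)
    return np.sqrt(np.mean((sds / means) ** 2))

# Hypothetical blood-flow values (mL/100 g/min) for 5 tumors on 2 visits.
bf_visit1 = [60.0, 45.0, 80.0, 52.0, 70.0]
bf_visit2 = [66.0, 44.0, 74.0, 55.0, 73.0]
print(f"within-patient CV = {100 * within_subject_cv(bf_visit1, bf_visit2):.1f}%")
```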

  20. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques

    Science.gov (United States)

    Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi

    2017-01-01

    Purpose The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Materials and methods Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, as evaluated by the average discrepancies of corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Results Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. This was confirmed by statistical analysis, which revealed significantly smaller average inter-operator discrepancies with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). Conclusion The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
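The superimposition step above uses a least-squares best-fit algorithm. A minimal sketch of such a rigid best-fit for corresponding point sets (the Kabsch/SVD solution; PolyWorks' exact implementation is not described in the abstract):

```python
import numpy as np

def best_fit_rigid(P, Q):
    """Least-squares rigid registration (Kabsch algorithm).

    Returns rotation R and translation t minimizing ||(P @ R.T + t) - Q||
    over N corresponding 3D points (both arrays are N x 3).
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

def mean_discrepancy(P, Q):
    """Average point-to-point distance after best-fit superimposition."""
    R, t = best_fit_rigid(P, Q)
    return np.linalg.norm(P @ R.T + t - Q, axis=1).mean()

# Identical shapes, one rotated and shifted: discrepancy collapses to ~0.
rng = np.random.default_rng(1)
P = rng.normal(size=(50, 3))
c, s = np.cos(0.5), np.sin(0.5)
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([1.0, -2.0, 0.5])
print(mean_discrepancy(P, Q) < 1e-8)  # → True
```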

  1. Reproducibility of precipitation distributions over extratropical continental regions in the CMIP5

    Science.gov (United States)

    Hirota, Nagio; Takayabu, Yukari

    2013-04-01

    Reproducibility of precipitation distributions over extratropical continental regions by CMIP5 climate models in their historical runs is evaluated, in comparison with GPCP (V2.2), CMAP (V0911), and the daily gridded gauge data APHRODITE. Surface temperature, cloud radiative forcing, and atmospheric circulations are also compared with observations from CRU-UEA, CERES, and the ERA-Interim/ERA40/JRA reanalysis data. It is shown that many CMIP5 models underestimate and overestimate summer precipitation over West and East Eurasia, respectively. These precipitation biases correspond to moisture transport associated with a cyclonic circulation bias over the whole continent of Eurasia. Meanwhile, many models underestimate cloud cover over the Eurasian continent, and the associated shortwave cloud radiative forcing results in a significant warm bias. Evaporation feedback amplifies the warm bias over West Eurasia. These processes consistently explain the precipitation biases over the Eurasian continent in summer. We also examined the reproducibility of winter precipitation, but robust results have not been obtained yet due to the large uncertainty in observations associated with the adjustment of snow measurements in windy conditions. Better observational data sets are necessary for further model validation. Acknowledgment: This study is supported by the PMM RA of JAXA, the Green Network of Excellence (GRENE) Program of the Ministry of Education, Culture, Sports, Science and Technology, Japan, and the Environment Research and Technology Development Fund (A-1201) of the Ministry of the Environment, Japan.

  2. The Reliability and Reproducibility of Corneal Confocal Microscopy in Children.

    Science.gov (United States)

    Pacaud, Danièle; Romanchuk, Kenneth G; Tavakoli, Mitra; Gougeon, Claire; Virtanen, Heidi; Ferdousi, Maryam; Nettel-Aguirre, Alberto; Mah, Jean K; Malik, Rayaz A

    2015-08-01

    To assess the image-level and patient-level interrater agreement and repeatability within 1 month for corneal nerve fiber length (CNFL) measured using in vivo corneal confocal microscopy (IVCCM) in children. Seventy-one subjects (mean [SD] age 14.3 [2.6] years, range 8-18 years; 44 with type 1 diabetes and 27 controls; 36 males and 35 females) were included. 547 images (∼6 images per subject) were analyzed manually by two independent and masked observers. One-month repeat visit images were analyzed by a single masked observer in 21 patients. Automated image analysis was then performed using specialized computerized software (ACCMetrics). For CNFL, the ICCs (95% CI) were 0.94 (0.93-0.95) for image-level agreement, 0.86 (0.78-0.91) for patient-level agreement, and 0.88 (0.72-0.95) for the 1-month repeat assessment, and the Bland-Altman plots showed minimal bias between observers. Although there was excellent agreement between manual and automated analysis according to an ICC of 0.89 (0.82-0.93), the Bland-Altman plot showed a consistent bias, with manual measurements providing higher readings. In vivo corneal confocal microscopy image analysis shows good reproducibility, with low intraindividual and interindividual variability in pediatric subjects. Since the image-level reproducibility is stronger than the patient-level reproducibility, refinement of the method for image selection will likely further increase the robustness of this novel, rapid, and noninvasive approach to detect early neuropathy in children with diabetes. Further studies on the use of IVCCM to identify early subclinical neuropathy in children are indicated.

  3. Repeatability and reproducibility of decisions by latent fingerprint examiners.

    Directory of Open Access Journals (Sweden)

    Bradford T Ulery

    Full Text Available The interpretation of forensic fingerprint evidence relies on the expertise of latent print examiners. We tested latent print examiners on the extent to which they reached consistent decisions. This study assessed intra-examiner repeatability by retesting 72 examiners on comparisons of latent and exemplar fingerprints, after an interval of approximately seven months; each examiner was reassigned 25 image pairs for comparison, out of a total pool of 744 image pairs. We compare these repeatability results with reproducibility (inter-examiner) results derived from our previous study. Examiners repeated 89.1% of their individualization decisions, and 90.1% of their exclusion decisions; most of the changed decisions resulted in inconclusive decisions. Repeatability of comparison decisions (individualization, exclusion, inconclusive) was 90.0% for mated pairs, and 85.9% for nonmated pairs. Repeatability and reproducibility were notably lower for comparisons assessed by the examiners as "difficult" than for "easy" or "moderate" comparisons, indicating that examiners' assessments of difficulty may be useful for quality assurance. No false positive errors were repeated (n = 4); 30% of false negative errors were repeated. One percent of latent value decisions were completely reversed (no value even for exclusion vs. of value for individualization). Most of the inter- and intra-examiner variability concerned whether the examiners considered the information available to be sufficient to reach a conclusion; this variability was concentrated on specific image pairs such that repeatability and reproducibility were very high on some comparisons and very low on others. Much of the variability appears to be due to making categorical decisions in borderline cases.

  4. REPRODUCIBILITY OF MASKED HYPERTENSION AMONG ADULTS 30 YEARS AND OLDER

    Science.gov (United States)

    Viera, Anthony J.; Lin, Feng-Chang; Tuttle, Laura A.; Olsson, Emily; Stankevitz, Kristin; Girdler, Susan S.; Klein, J. Larry; Hinderliter, Alan L.

    2015-01-01

    Objective Masked hypertension (MH) refers to non-elevated office blood pressure (BP) with elevated out-of-office BP, but its reproducibility has not been conclusively established. We examined the one-week reproducibility of MH by home BP monitoring (HBPM) and ambulatory BP monitoring (ABPM). Methods We recruited 420 adults not on BP-lowering medication with recent clinic BP between 120/80 and 149/95 mm Hg. For main comparisons, participants with office average <135/85 mm Hg were considered to have MH by ABPM if the awake ABPM average was ≥135/85 mm Hg; they were considered to have MH by HBPM if the HBPM average was ≥135/85 mm Hg. Percent agreements were quantified using kappa. We also examined the prevalence of MH defined as office average <130/80 mm Hg with 24-hour ABPM average ≥130/80 mm Hg. We conducted sensitivity analyses using different threshold BP levels for ABPM-office pairings and HBPM-office pairings for defining MH. Results Prevalence rates of MH based on office-awake ABPM pairings were 44% and 43%, with agreement of 71% (kappa=0.40; 95% CI 0.31–0.49). MH was less prevalent (15% and 17%) using HBPM-office pairings, with agreement of 82% (kappa=0.30; 95% CI 0.16–0.44), and more prevalent when considering the 24-hour average (50% and 48%). MH was also less prevalent when more stringent diagnostic criteria were applied. Office-HBPM pairings and office-awake ABPM pairings had fair agreement on MH classification on both occasions, with kappas of 0.36 and 0.30. Conclusions MH has fair short-term reproducibility, providing further evidence that for some people, out-of-office BP is systematically higher than when measured in the office setting. PMID:24842491
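The visit-to-visit agreement above was quantified with kappa. A minimal sketch of Cohen's kappa for paired binary classifications (hypothetical MH labels on two occasions, not the study's data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired binary (0/1) classifications,
    e.g. masked hypertension yes/no on two occasions."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n  # observed agreement
    pa, pb = sum(a) / n, sum(b) / n             # marginal "yes" rates
    pe = pa * pb + (1 - pa) * (1 - pb)          # chance agreement
    return (po - pe) / (1 - pe)

# 8 hypothetical participants classified as MH (1) or not (0) on two visits.
visit1 = [1, 1, 1, 0, 0, 0, 0, 1]
visit2 = [1, 1, 0, 0, 0, 0, 1, 1]
print(round(cohens_kappa(visit1, visit2), 2))  # → 0.5
```

Kappa discounts the agreement expected by chance from the marginal rates, which is why 75% raw agreement here yields only 0.5, squarely in the "fair to moderate" range reported above.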

  5. Reproducibility of gallbladder ejection fraction measured by fatty meal cholescintigraphy

    International Nuclear Information System (INIS)

    Al-Muqbel, Kusai M.; Hani, M. N. Hani; Elheis, M. A.; Al-Omari, M. H.

    2010-01-01

    There are conflicting data in the literature regarding the reproducibility of the gallbladder ejection fraction (GBEF) measured by fatty meal cholescintigraphy (CS). We aimed to test the reproducibility of GBEF measured by fatty meal CS. Thirty-five subjects (25 healthy volunteers and 10 patients with chronic abdominal pain) underwent fatty meal CS twice in order to measure GBEF1 and GBEF2. The healthy volunteers underwent a repeat scan within 1-13 months from the first scan. The patients underwent a repeat scan within 1-4 years from the first scan and were not found to have chronic acalculous cholecystitis (CAC). Our standard fatty meal was composed of a 60-g Snickers chocolate bar and 200 ml full-fat yogurt. The mean ± SD values for GBEF1 and GBEF2 were 52±17% and 52±16%, respectively. There was a direct linear correlation between the values of GBEF1 and GBEF2 for the subjects, with a correlation coefficient of 0.509 (p=0.002). Subgroup data analysis of the volunteer group showed that there was significant linear correlation between volunteer values of GBEF1 and GBEF2, with a correlation coefficient of 0.473 (p=0.017). Subgroup data analysis of the non-CAC patient group showed no significant correlation between patient values of GBEF1 and GBEF2, likely due to limited sample size. This study showed that fatty meal CS is a reliable test in gallbladder motility evaluation and that GBEF measured by fatty meal CS is reproducible
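The reproducibility claim above rests on a Pearson correlation between the two GBEF measurements. A minimal sketch (hypothetical paired GBEF values, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements,
    e.g. GBEF1 vs GBEF2 from the two scans."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical GBEF pairs (%) for six subjects, scan 1 vs scan 2.
gbef1 = [35.0, 48.0, 52.0, 60.0, 44.0, 70.0]
gbef2 = [40.0, 45.0, 55.0, 58.0, 50.0, 65.0]
print(f"r = {pearson_r(gbef1, gbef2):.2f}")
```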

  6. Reproducibility of gallbladder ejection fraction measured by fatty meal cholescintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Al-Muqbel, Kusai M.; Hani, M. N. Hani; Elheis, M. A.; Al-Omari, M. H. [School of Medicine, Jordan University of Science and Technology, Irbid (Jordan)

    2010-12-15

    There are conflicting data in the literature regarding the reproducibility of the gallbladder ejection fraction (GBEF) measured by fatty meal cholescintigraphy (CS). We aimed to test the reproducibility of GBEF measured by fatty meal CS. Thirty-five subjects (25 healthy volunteers and 10 patients with chronic abdominal pain) underwent fatty meal CS twice in order to measure GBEF1 and GBEF2. The healthy volunteers underwent a repeat scan within 1-13 months from the first scan. The patients underwent a repeat scan within 1-4 years from the first scan and were not found to have chronic acalculous cholecystitis (CAC). Our standard fatty meal was composed of a 60-g Snickers chocolate bar and 200 ml full-fat yogurt. The mean ± SD values for GBEF1 and GBEF2 were 52±17% and 52±16%, respectively. There was a direct linear correlation between the values of GBEF1 and GBEF2 for the subjects, with a correlation coefficient of 0.509 (p=0.002). Subgroup data analysis of the volunteer group showed that there was significant linear correlation between volunteer values of GBEF1 and GBEF2, with a correlation coefficient of 0.473 (p=0.017). Subgroup data analysis of the non-CAC patient group showed no significant correlation between patient values of GBEF1 and GBEF2, likely due to limited sample size. This study showed that fatty meal CS is a reliable test in gallbladder motility evaluation and that GBEF measured by fatty meal CS is reproducible

  7. Repeat: a framework to assess empirical reproducibility in biomedical research

    Directory of Open Access Journals (Sweden)

    Leslie D. McIntosh

    2017-09-01

    Full Text Available Abstract Background The reproducibility of research is essential to rigorous science, yet significant concerns about the reliability and verifiability of biomedical research have recently been highlighted. Ongoing efforts across several domains of science and policy are working to clarify the fundamental characteristics of reproducibility and to enhance the transparency and accessibility of research. Methods The aim of the present work is to develop an assessment tool operationalizing key concepts of research transparency in the biomedical domain, specifically for secondary biomedical data research using electronic health record data. The tool (RepeAT) was developed through a multi-phase process that involved coding and extracting recommendations and practices for improving reproducibility from publications and reports across the biomedical and statistical sciences, field testing the instrument, and refining variables. Results RepeAT includes 119 unique variables grouped into five categories (research design and aim, database and data collection methods, data mining and data cleaning, data analysis, data sharing and documentation). Preliminary results from manually processing 40 scientific manuscripts indicate components of the proposed framework with strong inter-rater reliability, as well as directions for further research and refinement of RepeAT. Conclusions The use of RepeAT may allow the biomedical community to have a better understanding of the current practices of research transparency and accessibility among principal investigators. Common adoption of RepeAT may improve reporting of research practices and the availability of research outputs. Additionally, use of RepeAT will facilitate comparisons of research transparency and accessibility across domains and institutions.

  8. Assessment of precision and reproducibility of a new myograph

    Directory of Open Access Journals (Sweden)

    Piepenbrock Siegfried

    2007-12-01

    Full Text Available Abstract Background The physiological characteristics of muscle activity and the assessment of muscle strength represent important diagnostic information. There are many devices that measure muscle force in humans, but some require voluntary contractions, which are difficult to assess in weak or unconscious patients who are unable to complete a full range of voluntary force assessment tasks. Other devices, which obtain standard muscle contractions by electric stimulation, do not have the technology required to induce and measure reproducible valid contractions at the optimum muscle length. Methods In our study we used a newly developed diagnostic device which accurately measures the reproducibility and time-dependent variability of the muscle force in an individual muscle. A total of 500 in-vivo measurements of supra-maximal isometric single twitch contractions were carried out on the musculus adductor pollicis of 5 test subjects over 10 sessions, with ten repetitions per session. The same protocol was performed on 405 test subjects with two repetitions each to determine a reference interval on healthy subjects. Results Using our test setting, we found a high reproducibility of the muscle contractions of each test subject. The precision of the measurements performed with our device was 98.74%. Only two consecutive measurements are needed in order to assess a real, representative individual value of muscle force. The mean value of the force of contraction was 9.51 N and the 95% reference interval was 4.77–14.25 N. Conclusion The new myograph is a highly reliable measuring device with which the adductor pollicis can be investigated at the optimum length. It has the potential to become a reliable and valid tool for diagnostics in the clinical setting and for monitoring neuromuscular diseases.

  9. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    diagnosis were: CD43, CD79b, CD81, CD200, CD10, and ROR1. Reproducible criteria for component reagents were assessed retrospectively in 14,643 cases from 13 different centres and showed >97% concordance with current approaches. A pilot study to validate staining quality was completed in eleven centres...... identified. Finally, a consensus "recommended" panel of markers to refine diagnosis in borderline cases (CD43, CD79b, CD81, CD200, CD10, ROR1) has been defined and will be prospectively evaluated. This article is protected by copyright. All rights reserved....

  10. Broad Spectrum Sanitizing Wipes with Food Additives, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcide proposes to develop novel multipurpose non-toxic sanitizing wipes that are aqueous based, have shelf life of 3-5 years, have broad spectrum microbicidal...

  11. Broad-scale phosphoprotein profiling of beta adrenergic receptor (β-AR) signaling reveals novel phosphorylation and dephosphorylation events.

    Directory of Open Access Journals (Sweden)

    Andrzej J Chruscinski

    Full Text Available β-adrenergic receptors (β-ARs) are model G-protein coupled receptors that mediate signal transduction in the sympathetic nervous system. Despite the widespread clinical use of agents that target β-ARs, the signaling pathways that operate downstream of β-AR stimulation have not yet been completely elucidated. Here, we utilized a lysate microarray approach to obtain a broad-scale perspective of phosphoprotein signaling downstream of β-AR. We monitored the time course of the phosphorylation states of 54 proteins after β-AR activation in mouse embryonic fibroblast (MEF) cells. In response to stimulation with the non-selective β-AR agonist isoproterenol, we observed previously described phosphorylation events such as ERK1/2(T202/Y204) and CREB(S133), but also novel phosphorylation events such as Cdc2(Y15) and Pyk2(Y402). All of these events were mediated through cAMP and PKA, as they were reproduced by stimulation with the adenylyl cyclase activator forskolin and were blocked by treatment with H89, a PKA inhibitor. In addition, we also observed a number of novel isoproterenol-induced protein dephosphorylation events in target substrates of the PI3K/AKT pathway: GSK3β(S9), 4E-BP1(S65), and p70s6k(T389). These dephosphorylations were dependent on cAMP but independent of PKA, and correlated with reduced PI3K/AKT activity. Isoproterenol stimulation also led to a cAMP-dependent dephosphorylation of PP1α(T320), a modification known to correlate with enhanced activity of this phosphatase. Dephosphorylation of PP1α coincided with the secondary decline in phosphorylation of some PKA-phosphorylated substrates, suggesting that PP1α may act in a feedback loop to return these phosphorylations to baseline. In summary, lysate microarrays are a powerful tool to profile phosphoprotein signaling and have provided a broad-scale perspective of how β-AR signaling can regulate key pathways involved in cell growth and metabolism.

  12. Data Processing Workflows to Support Reproducible Data-driven Research in Hydrology

    Science.gov (United States)

    Goodall, J. L.; Essawy, B.; Xu, H.; Rajasekar, A.; Moore, R. W.

    2015-12-01

    Geoscience analyses often require the use of existing data sets that are large, heterogeneous, and maintained by different organizations. A particular challenge in creating reproducible analyses using these data sets is automating the workflows required to transform raw datasets into model-specific input files and finally into publication-ready visualizations. Data grids, such as the Integrated Rule-Oriented Data System (iRODS), are architectures that allow scientists to access and share large data sets that are geographically distributed on the Internet, but appear to the scientist as a single file management system. The DataNet Federation Consortium (DFC) project is built on iRODS and aims to demonstrate data and computational interoperability across scientific communities. This paper leverages iRODS and the DFC to demonstrate how hydrological modeling workflows can be encapsulated using the iRODS concept of Workflow Structured Objects (WSO). An example use case is presented for automating hydrologic model post-processing routines that demonstrates how WSOs can be created and used within the DFC to automate the creation of data visualizations from large model output collections. By co-locating the workflow used to create the visualization with the data collection, the use case demonstrates how data grid technology aids in the reuse, reproducibility, and sharing of workflows within scientific communities.

  13. The broad line region of AGN: Kinematics and physics

    Directory of Open Access Journals (Sweden)

    Popović L.Č.

    2006-01-01

    Full Text Available In this paper a discussion of the kinematics and physics of the Broad Line Region (BLR) is given. The possible physical conditions in the BLR and problems in the determination of the physical parameters (electron temperature and density) are considered. Moreover, the geometry of the BLR is analysed, along with the probability that (at least a fraction of) the radiation in the Broad Emission Lines (BELs) originates from a relativistic accretion disk.

  14. The nebular spectra of the transitional Type Ia Supernovae 2007on and 2011iv: broad, multiple components indicate aspherical explosion cores

    Science.gov (United States)

    Mazzali, P. A.; Ashall, C.; Pian, E.; Stritzinger, M. D.; Gall, C.; Phillips, M. M.; Höflich, P.; Hsiao, E.

    2018-02-01

    The nebular-epoch spectrum of the rapidly declining, "transitional" type Ia supernova (SN) 2007on showed double emission peaks, which have been interpreted as indicating that the SN was the result of the direct collision of two white dwarfs. The spectrum can be reproduced using two distinct emission components, one red-shifted and one blue-shifted. These components are similar in mass but have slightly different degrees of ionization. They recede from one another at a line-of-sight speed larger than the sum of the combined expansion velocities of their emitting cores, thereby acting as two independent nebulae. While this configuration appears to be consistent with the scenario of two white dwarfs colliding, it may also indicate an off-centre delayed detonation explosion of a near Chandrasekhar-mass white dwarf. In either case, broad emission line widths and a rapidly evolving light curve can be expected for the bolometric luminosity of the SN. This is the case for both SNe 2007on and 2011iv, also a transitional SN Ia which exploded in the same elliptical galaxy, NGC 1404. Although SN 2011iv does not show double-peaked emission line profiles, the width of its emission lines is such that a two-component model yields somewhat better results than a single-component model. Most of the mass ejected is in one component, however, which suggests that SN 2011iv was the result of the off-centre ignition of a Chandrasekhar-mass white dwarf.

  15. The nebular spectra of the transitional Type Ia Supernovae 2007on and 2011iv: broad, multiple components indicate aspherical explosion cores

    Science.gov (United States)

    Mazzali, P. A.; Ashall, C.; Pian, E.; Stritzinger, M. D.; Gall, C.; Phillips, M. M.; Höflich, P.; Hsiao, E.

    2018-05-01

    The nebular-epoch spectrum of the rapidly declining, `transitional' Type Ia supernova (SN) 2007on showed double emission peaks, which have been interpreted as indicating that the SN was the result of the direct collision of two white dwarfs. The spectrum can be reproduced using two distinct emission components, one redshifted and one blueshifted. These components are similar in mass but have slightly different degrees of ionization. They recede from one another at a line-of-sight speed larger than the sum of the combined expansion velocities of their emitting cores, thereby acting as two independent nebulae. While this configuration appears to be consistent with the scenario of two white dwarfs colliding, it may also indicate an off-centre delayed detonation explosion of a near-Chandrasekhar-mass white dwarf. In either case, broad emission line widths and a rapidly evolving light curve can be expected for the bolometric luminosity of the SN. This is the case for both SNe 2007on and 2011iv, also a transitional SN Ia that exploded in the same elliptical galaxy, NGC 1404. Although SN 2011iv does not show double-peaked emission line profiles, the width of its emission lines is such that a two-component model yields somewhat better results than a single-component model. Most of the mass ejected is in one component, however, which suggests that SN 2011iv was the result of the off-centre ignition of a Chandrasekhar-mass white dwarf.

  16. Reproducibility of suppression of Pythium wilt of cucumber by compost

    Directory of Open Access Journals (Sweden)

    Mauritz Vilhelm Vestberg

    2014-10-01

    Full Text Available There is increasing global interest in using compost to suppress soil-borne fungal and bacterial diseases and nematodes. We studied the reproducibility of compost suppressive capacity (SC) against Pythium wilt of cucumber using nine composts produced by the same composting plant in 2008 and 2009. A bioassay was set up in a greenhouse using cucumber inoculated with two strains of Pythium. The composts were used as 20% mixtures (v:v) of a basic steam-sterilized light Sphagnum peat and sand (3:1, v:v). Shoot height was measured weekly during the 5-week experiment. At harvest, the SC was calculated as the % difference in shoot dry weight (DW) between non-inoculated and inoculated cucumbers. The SC was not affected by year of production (2008 or 2009), indicating reproducibility of SC when the raw materials and the composting method are not changed. Differences in shoot height were not as pronounced as those for shoot DW. The results were encouraging, but further studies are still needed to produce compost with guaranteed suppressiveness properties.
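
    The SC computation described in the record reduces to a one-line formula. A minimal sketch, assuming SC is expressed relative to the non-inoculated dry weight (the abstract does not state the denominator, so this convention is an assumption):

```python
def suppressive_capacity(dw_noninoculated_g, dw_inoculated_g):
    """Suppressive capacity (SC): % difference in shoot dry weight (DW)
    between non-inoculated and inoculated cucumbers.
    The denominator convention is an assumption, not from the abstract."""
    return 100.0 * (dw_noninoculated_g - dw_inoculated_g) / dw_noninoculated_g
```

    For example, non-inoculated shoots averaging 10.0 g DW against inoculated shoots averaging 8.0 g DW give an SC of 20%.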

  17. An occlusal plaque index. Measurements of repeatability, reproducibility, and sensitivity.

    Science.gov (United States)

    Splieth, Christian H; Nourallah, Abduhl W

    2006-06-01

    To evaluate a new, computerized method of measuring dental plaque on occlusal surfaces, which exhibit the highest caries prevalence. In 16 patients (6-9 years of age), plaque on the occlusal surfaces of permanent molars was stained (Mira-2-Tone) and photographed with an intra-oral camera. In a conventional picture editing program (PC/Adobe PhotoShop 6.0), the occlusal surface and plaque were measured in pixels and the relative proportion of occlusal plaque was calculated (ANALYSIS 3.0). The repeatability and reproducibility of the method were analyzed by having two examiners re-take and analyze four images four times, assessed via intra- and inter-examiner correlation coefficients, and by re-analyzing 10 images. Sensitivity was tested by re-taking and analyzing the images of the same occlusal surfaces in all patients after instructed brushing with an electric toothbrush. Intra- and inter-examiner correlation coefficients for repeatability and reproducibility of the analysis were excellent (ICC > 0.997 and ICC = 0.98, respectively; 95% confidence interval: 0.955-0.995). The inter- and intra-examiner coefficients for the whole procedure including the re-taking of images were also high (ICC > 0.90). The method was also highly sensitive, demonstrating a statistically significant plaque reduction after brushing (before: mean 29.2% plaque, after: 14.7% plaque; t-test, P = 0.025).
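
    The pixel-based index itself is a simple ratio. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
def occlusal_plaque_percent(plaque_pixels, surface_pixels):
    """Relative proportion of occlusal plaque: stained-plaque pixels as a
    percentage of all occlusal-surface pixels."""
    if surface_pixels == 0:
        raise ValueError("occlusal surface must contain at least one pixel")
    return 100.0 * plaque_pixels / surface_pixels

# Hypothetical pixel counts reproducing the reported means:
before = occlusal_plaque_percent(29_200, 100_000)  # 29.2% plaque before brushing
after = occlusal_plaque_percent(14_700, 100_000)   # 14.7% plaque after brushing
```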

  18. Everware toolkit. Supporting reproducible science and challenge-driven education.

    Science.gov (United States)

    Ustyuzhanin, A.; Head, T.; Babuschkin, I.; Tiunov, A.

    2017-10-01

    Modern science clearly demands a higher level of reproducibility and collaboration. To make research fully reproducible one has to take care of several aspects: research protocol description, data access, environment preservation, workflow pipeline, and analysis script preservation. Version control systems like git help with the workflow and analysis scripts part. Virtualization techniques like Docker or Vagrant can help deal with environments. Jupyter notebooks are a powerful platform for conducting research in a collaborative manner. We present project Everware, which seamlessly integrates git repository management systems such as Github or Gitlab, Docker, and Jupyter, helping with a) sharing the results of real research and b) boosting education activities. With the help of Everware one can share not only the final artifacts of research but the full depth of the research process. This has been shown to be extremely helpful during the organization of several data analysis hackathons and machine learning schools. Using Everware, participants could start from an existing solution instead of starting from scratch, and could begin contributing immediately. Everware allows its users to make use of their own computational resources to run the workflows they are interested in, which leads to higher scalability of the toolkit.

  19. Reproducibility of Variant Calls in Replicate Next Generation Sequencing Experiments.

    Directory of Open Access Journals (Sweden)

    Yuan Qi

    Full Text Available Nucleotide alterations detected by next generation sequencing are not always true biological changes but could represent sequencing errors. Even highly accurate methods can yield substantial error rates when applied to millions of nucleotides. In this study, we examined the reproducibility of nucleotide variant calls in replicate sequencing experiments of the same genomic DNA. We performed targeted sequencing of all known human protein kinase genes (kinome) (~3.2 Mb) using the SOLiD v4 platform. Seventeen breast cancer samples were sequenced in duplicate (n=14) or triplicate (n=3) to assess concordance of all calls and single nucleotide variant (SNV) calls. The concordance rates over the entire sequenced region were >99.99%, while the concordance rates for SNVs were 54.3-75.5%. There was substantial variation in basic sequencing metrics from experiment to experiment. The type of nucleotide substitution and genomic location of the variant had little impact on concordance, but concordance increased with coverage level, variant allele count (VAC), variant allele frequency (VAF), variant allele quality and p-value of SNV-call. The most important determinants of concordance were VAC and VAF. Even using the highest stringency of QC metrics, the reproducibility of SNV calls was around 80%, suggesting that erroneous variant calling can be as high as 20-40% in a single experiment. The sequence data have been deposited into the European Genome-phenome Archive (EGA) with accession number EGAS00001000826.

  20. Reproducible Research Practices and Transparency across the Biomedical Literature.

    Directory of Open Access Journals (Sweden)

    Shareen A Iqbal

    2016-01-01

    Full Text Available There is a growing movement to encourage reproducibility and transparency practices in the scientific community, including public access to raw data and protocols, the conduct of replication studies, systematic integration of evidence in systematic reviews, and the documentation of funding and potential conflicts of interest. In this survey, we assessed the current status of reproducibility and transparency addressing these indicators in a random sample of 441 biomedical journal articles published in 2000-2014. Only one study provided a full protocol and none made all raw data directly available. Replication studies were rare (n = 4), and only 16 studies had their data included in a subsequent systematic review or meta-analysis. The majority of studies did not mention anything about funding or conflicts of interest. The percentage of articles with no statement of conflict decreased substantially between 2000 and 2014 (94.4% in 2000 to 34.6% in 2014); the percentage of articles reporting statements of conflicts (0% in 2000, 15.4% in 2014) or no conflicts (5.6% in 2000, 50.0% in 2014) increased. Articles published in journals in the clinical medicine category versus other fields were almost twice as likely to not include any information on funding and to have private funding. This study provides baseline data to compare future progress in improving these indicators in the scientific literature.

  1. Reproducibility of Variant Calls in Replicate Next Generation Sequencing Experiments

    Science.gov (United States)

    Qi, Yuan; Liu, Xiuping; Liu, Chang-gong; Wang, Bailing; Hess, Kenneth R.; Symmans, W. Fraser; Shi, Weiwei; Pusztai, Lajos

    2015-01-01

    Nucleotide alterations detected by next generation sequencing are not always true biological changes but could represent sequencing errors. Even highly accurate methods can yield substantial error rates when applied to millions of nucleotides. In this study, we examined the reproducibility of nucleotide variant calls in replicate sequencing experiments of the same genomic DNA. We performed targeted sequencing of all known human protein kinase genes (kinome) (~3.2 Mb) using the SOLiD v4 platform. Seventeen breast cancer samples were sequenced in duplicate (n=14) or triplicate (n=3) to assess concordance of all calls and single nucleotide variant (SNV) calls. The concordance rates over the entire sequenced region were >99.99%, while the concordance rates for SNVs were 54.3-75.5%. There was substantial variation in basic sequencing metrics from experiment to experiment. The type of nucleotide substitution and genomic location of the variant had little impact on concordance but concordance increased with coverage level, variant allele count (VAC), variant allele frequency (VAF), variant allele quality and p-value of SNV-call. The most important determinants of concordance were VAC and VAF. Even using the highest stringency of QC metrics the reproducibility of SNV calls was around 80% suggesting that erroneous variant calling can be as high as 20-40% in a single experiment. The sequence data have been deposited into the European Genome-phenome Archive (EGA) with accession number EGAS00001000826. PMID:26136146
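
    As a toy illustration of between-replicate concordance (the paper's exact definition may differ; this sketch measures the overlap of two call sets relative to their union):

```python
def concordance_rate(calls_a, calls_b):
    """Fraction of variant calls shared by two replicate experiments,
    relative to the union of all calls made in either replicate."""
    a, b = set(calls_a), set(calls_b)
    if not (a | b):
        return 1.0  # no calls in either replicate: trivially concordant
    return len(a & b) / len(a | b)

# Hypothetical (chromosome, position, substitution) calls from two replicates:
rep1 = {("chr1", 100, "A>T"), ("chr2", 200, "G>C"), ("chr3", 300, "C>T")}
rep2 = {("chr1", 100, "A>T"), ("chr2", 200, "G>C"), ("chr4", 400, "T>A")}
```

    Here concordance_rate(rep1, rep2) is 0.5: two shared calls out of four distinct calls, mirroring how discordant singleton calls drag SNV concordance well below the whole-region rate.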

  2. Highly Efficient and Reproducible Nonfullerene Solar Cells from Hydrocarbon Solvents

    KAUST Repository

    Wadsworth, Andrew

    2017-06-01

    With chlorinated solvents unlikely to be permitted for use in solution-processed organic solar cells in industry, there must be a focus on developing nonchlorinated solvent systems. Here we report high-efficiency devices utilizing a low-bandgap donor polymer (PffBT4T-2DT) and a nonfullerene acceptor (EH-IDTBR) from hydrocarbon solvents and without using additives. When mesitylene was used as the solvent, rather than chlorobenzene, an improved power conversion efficiency (11.1%) was achieved without the need for pre- or post-treatments. Despite altering the processing conditions to environmentally friendly solvents and room-temperature coating, grazing incident X-ray measurements confirmed that active layers processed from hydrocarbon solvents retained the robust nanomorphology obtained with hot-processed chlorinated solvents. The main advantages of hydrocarbon solvent-processed devices, besides the improved efficiencies, were the reproducibility and storage lifetime of devices. Mesitylene devices showed better reproducibility and shelf life up to 4000 h with PCE dropping by only 8% of its initial value.

  3. A Telescope Inventor's Spyglass Possibly Reproduced in a Brueghel's Painting

    Science.gov (United States)

    Molaro, P.; Selvelli, P.

    2011-06-01

    Jan Brueghel the Elder depicted spyglasses belonging to the Archduke Albert VII of Habsburg in at least five paintings in the period between 1608 and 1625. Albert VII was fascinated by art and science, and he obtained spyglasses directly from Lipperhey and Sacharias Janssen at approximately the time when the telescope was first shown at The Hague at the end of 1608. In the Extensive Landscape with View of the Castle of Mariemont, dated 1608-1612, the Archduke is looking at his Mariemont castle through an optical tube; this is the first known painting of a spyglass. It is quite possible that the painting reproduces one of the first telescopes ever made. Two other of Albert VII's telescopes are prominently reproduced in two Allegories of Sight painted a few years later (1617-1618). They are sophisticated instruments, and their structure, in particular the shape of the eyepiece, suggests that they are composed of two convex lenses in a Keplerian optical configuration, which came into common use only more than two decades later. If this is the case, these paintings are the first available record of a Keplerian telescope.

  4. Data reproducibility of pace strategy in a laboratory test run.

    Science.gov (United States)

    de França, Elias; Xavier, Ana Paula; Hirota, Vinicius Barroso; Côrrea, Sônia Cavalcanti; Caperuto, Érico Chagas

    2016-06-01

    This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s) and in the second test the maximum RTI values were either the RTI chosen in the first test (maximum RTI value), or less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. As results, RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest, and they demonstrated good intraclass correlation.

  5. Data reproducibility of pace strategy in a laboratory test run

    Directory of Open Access Journals (Sweden)

    Elias de França

    2016-06-01

    Full Text Available This data paper contains data related to a reproducibility test for running pacing strategy in an intermittent running test until exhaustion. Ten participants underwent a crossover study (test and retest) with an intermittent running test. The test was composed of three-minute sets (at 1 km/h above Onset Blood Lactate Accumulation) until volitional exhaustion. To assess pace strategy change, in the first test participants chose the rest time interval (RTI) between sets (ranging from 30 to 60 s) and in the second test the maximum RTI values were either the RTI chosen in the first test (maximum RTI value), or less if desired. To verify the reproducibility of the test, rating of perceived exertion (RPE), heart rate (HR) and blood plasma lactate concentration ([La]p) were collected at rest, immediately after each set and at the end of the tests. As results, RTI, RPE, HR, [La]p and time to exhaustion were not statistically different (p>0.05) between test and retest, and they demonstrated good intraclass correlation.

  6. When Quality Beats Quantity: Decision Theory, Drug Discovery, and the Reproducibility Crisis.

    Directory of Open Access Journals (Sweden)

    Jack W Scannell

    Full Text Available A striking contrast runs through the last 60 years of biopharmaceutical discovery, research, and development. Huge scientific and technological gains should have increased the quality of academic science and raised industrial R&D efficiency. However, academia faces a "reproducibility crisis"; inflation-adjusted industrial R&D costs per novel drug increased nearly 100 fold between 1950 and 2010; and drugs are more likely to fail in clinical development today than in the 1970s. The contrast is explicable only if powerful headwinds reversed the gains and/or if many "gains" have proved illusory. However, discussions of reproducibility and R&D productivity rarely address this point explicitly. The main objectives of the primary research in this paper are: (a) to provide quantitatively and historically plausible explanations of the contrast; and (b) to identify factors to which R&D efficiency is sensitive. We present a quantitative decision-theoretic model of the R&D process. The model represents therapeutic candidates (e.g., putative drug targets, molecules in a screening library, etc.) within a "measurement space", with candidates' positions determined by their performance on a variety of assays (e.g., binding affinity, toxicity, in vivo efficacy, etc.) whose results correlate to a greater or lesser degree. We apply decision rules to segment the space, and assess the probability of correct R&D decisions. We find that when searching for rare positives (e.g., candidates that will successfully complete clinical development), changes in the predictive validity of screening and disease models that many people working in drug discovery would regard as small and/or unknowable (i.e., a 0.1 absolute change in correlation coefficient between model output and clinical outcomes in man) can offset large (e.g., 10 fold, even 100 fold) changes in models' brute-force efficiency. We also show how validity and reproducibility correlate across a population of simulated
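
    The core claim (a small gain in predictive validity can offset a large gain in brute-force throughput) can be illustrated with a toy Monte Carlo. All parameters and the function name are illustrative; this is a sketch, not the paper's model:

```python
import math
import random

def screen_ppv(validity, n_candidates=20_000, p_true=0.02, pick_frac=0.02, seed=1):
    """Toy model: each candidate has a latent clinical 'quality'; a screen
    reports a score correlated with it (correlation = validity). We pick the
    top-scoring fraction and return the share of picks that are genuine
    rare positives (the top p_true of candidates by latent quality)."""
    rng = random.Random(seed)
    candidates = []
    for _ in range(n_candidates):
        quality = rng.gauss(0.0, 1.0)
        noise = rng.gauss(0.0, 1.0)
        score = validity * quality + math.sqrt(1.0 - validity**2) * noise
        candidates.append((score, quality))
    # threshold defining the rare "true positives" by latent quality
    qualities = sorted((q for _, q in candidates), reverse=True)
    threshold = qualities[int(p_true * n_candidates) - 1]
    picked = sorted(candidates, reverse=True)[: int(pick_frac * n_candidates)]
    return sum(q >= threshold for _, q in picked) / len(picked)
```

    Comparing screen_ppv(0.4) with screen_ppv(0.5) shows the positive predictive value climbing with validity, whereas raising n_candidates at fixed validity leaves it roughly unchanged; this is the qualitative point the paper's model makes quantitative.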

  7. Estimating carbon dioxide fluxes from temperate mountain grasslands using broad-band vegetation indices

    Directory of Open Access Journals (Sweden)

    G. Wohlfahrt

    2010-02-01

    Full Text Available The broad-band normalised difference vegetation index (NDVI) and the simple ratio (SR) were calculated from measurements of the reflectance of photosynthetically active and short-wave radiation at two temperate mountain grasslands in Austria and related to the net ecosystem CO2 exchange (NEE) measured concurrently by means of the eddy covariance method. There was no significant statistical difference between the relationships of midday mean NEE with narrow- and broad-band NDVI and SR, measured during and calculated for that same time window, respectively. The skill of broad-band NDVI and SR in predicting CO2 fluxes was highest for metrics dominated by gross photosynthesis and lowest for ecosystem respiration, with NEE in between. A method based on a simple light response model, whose parameters were parameterised based on broad-band NDVI, improved predictions of daily NEE and is suggested to hold promise for filling gaps in the NEE time series. Relationships of CO2 flux metrics with broad-band NDVI and SR, however, generally differed between the two studied grassland sites, indicating an influence of additional factors not yet accounted for.
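
    A common way to construct such broad-band indices from paired incident/reflected radiation sensors is sketched below; treating the residual short-wave band as "near-infrared" is an assumption that may differ in detail from the paper's formulation:

```python
def broadband_ndvi_sr(par_in, par_out, sw_in, sw_out):
    """Broad-band NDVI and simple ratio (SR) from incident/reflected
    photosynthetically active (PAR) and short-wave (SW) radiation,
    all in W m-2. The 'NIR' band is approximated as SW minus PAR."""
    rho_par = par_out / par_in                       # PAR-band reflectance
    rho_nir = (sw_out - par_out) / (sw_in - par_in)  # residual ("NIR") reflectance
    ndvi = (rho_nir - rho_par) / (rho_nir + rho_par)
    sr = rho_nir / rho_par
    return ndvi, sr
```

    With, say, 400/20 W m-2 incident/reflected PAR and 1000/200 W m-2 short-wave, this gives rho_par = 0.05, rho_nir = 0.30, NDVI of about 0.71 and SR of 6.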

  8. Hints of correlation between broad-line and radio variations for 3C 120

    International Nuclear Information System (INIS)

    Liu, H. T.; Bai, J. M.; Li, S. K.; Wang, J. M.

    2014-01-01

    In this paper, we investigate the correlation between broad-line and radio variations for the broad-line radio galaxy 3C 120. By the z-transformed discrete correlation function method and the model-independent flux randomization/random subset selection (FR/RSS) Monte Carlo method, we find that broad Hβ line variations lead the 15 GHz variations. The FR/RSS method shows that the Hβ line variations lead the radio variations by a factor of τ ob = 0.34 ± 0.01 yr. This time lag can be used to locate the position of the emitting region of radio outbursts in the jet, on the order of ∼5 lt-yr from the central engine. This distance is much larger than the size of the broad-line region. The large separation of the radio outburst emitting region from the broad-line region will observably influence the gamma-ray emission in 3C 120.
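
    The lag-to-distance conversion typically uses the relativistic-jet relation Δr ≈ β Γ δ c τ_ob / (1 + z). A hedged sketch, with illustrative values of the Lorentz factor Γ and Doppler factor δ (not values fitted in the paper):

```python
import math

def jet_distance_ltyr(tau_ob_yr, lorentz_gamma, doppler_delta, z):
    """Distance (light-years) travelled along the jet during an observed
    lag tau_ob, via Delta_r = beta * Gamma * delta * c * tau_ob / (1 + z).
    Gamma, delta and z supplied by the caller are illustrative assumptions."""
    beta = math.sqrt(1.0 - 1.0 / lorentz_gamma**2)
    return beta * lorentz_gamma * doppler_delta * tau_ob_yr / (1.0 + z)
```

    For example, tau_ob = 0.34 yr with Gamma = delta = 4 and z = 0.033 gives roughly 5 light-years, the order of the separation quoted above.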

  9. New Constraints on Quasar Broad Absorption and Emission Line Regions from Gravitational Microlensing

    Energy Technology Data Exchange (ETDEWEB)

    Hutsemékers, Damien; Braibant, Lorraine; Sluse, Dominique [Institut d'Astrophysique et de Géophysique, Université de Liège, Liège (Belgium); Anguita, Timo [Departamento de Ciencias Fisicas, Universidad Andres Bello, Santiago (Chile); Goosmann, René, E-mail: hutsemekers@astro.ulg.ac.be [Observatoire Astronomique de Strasbourg, Université de Strasbourg, Strasbourg (France)

    2017-09-29

    Gravitational microlensing is a powerful tool allowing one to probe the structure of quasars on sub-parsec scale. We report recent results, focusing on the broad absorption and emission line regions. In particular microlensing reveals the intrinsic absorption hidden in the P Cygni-type line profiles observed in the broad absorption line quasar H1413+117, as well as the existence of an extended continuum source. In addition, polarization microlensing provides constraints on the scattering region. In the quasar Q2237+030, microlensing differently distorts the Hα and CIV broad emission line profiles, indicating that the low- and high-ionization broad emission lines must originate from regions with distinct kinematical properties. We also present simulations of the effect of microlensing on line profiles considering simple but representative models of the broad emission line region. Comparison of observations to simulations allows us to conclude that the Hα emitting region in Q2237+030 is best represented by a Keplerian disk.

  10. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.

    Science.gov (United States)

    Añel, Juan A.

    2017-03-01

    Nowadays, the majority of the scientific community is not aware of the risks and problems associated with inadequate use of computer systems for research, mostly regarding the reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and by insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright laws can have undesirable effects on access to aspects of experimental design that are of great importance, and therefore on the interpretation of results. Plain Language Summary: This article highlights several important issues to ensure the scientific reproducibility of results within the current scientific framework, going beyond simple documentation. Several specific examples are discussed in the field of hydrological modeling.

  11. Chimeric Mice with Competent Hematopoietic Immunity Reproduce Key Features of Severe Lassa Fever.

    Directory of Open Access Journals (Sweden)

    Lisa Oestereich

    2016-05-01

    Lassa fever (LASF) is a highly severe viral syndrome endemic to West African countries. Despite the high annual morbidity and mortality caused by LASF, very little is known about the pathophysiology of the disease. Basic research on LASF has been precluded by the lack of relevant small animal models that reproduce the human disease. Immunocompetent laboratory mice are resistant to infection with Lassa virus (LASV) and, to date, only immunodeficient mice, or mice expressing human HLA, have shown some degree of susceptibility to experimental infection. Here, transplantation of wild-type bone marrow cells into irradiated type I interferon receptor knockout mice (IFNAR-/-) was used to generate chimeric mice that reproduced important features of severe LASF in humans, including high lethality, liver damage, vascular leakage and systemic virus dissemination. In addition, this model indicated that T cell-mediated immunopathology is an important component of LASF pathogenesis that is directly correlated with vascular leakage. Our strategy allows the easy generation of a suitable small animal model in which to test new vaccines and antivirals and to dissect the basic components of LASF pathophysiology.

  12. Reproducibility of quantitative high-throughput BI-RADS features extracted from ultrasound images of breast cancer.

    Science.gov (United States)

    Hu, Yuzhou; Qiao, Mengyun; Guo, Yi; Wang, Yuanyuan; Yu, Jinhua; Li, Jiawei; Chang, Cai

    2017-07-01

    Digital Breast Imaging Reporting and Data System (BI-RADS) features extracted from ultrasound images are essential in computer-aided diagnosis, prediction, and prognosis of breast cancer. This study focuses on the reproducibility of quantitative high-throughput BI-RADS features in the presence of variations due to different segmentation results, various ultrasound machine models, and multiple ultrasound machine settings. Dataset 1 consists of 399 patients with invasive breast cancer and is used as the training set to measure the reproducibility of features, while dataset 2 consists of 138 other patients and is a validation set used to evaluate the diagnostic performance of the final reproducible features. Four hundred and sixty high-throughput BI-RADS features are designed and quantized according to the BI-RADS lexicon. The Concordance Correlation Coefficient (CCC) and Deviation (Dev) are used to assess the effect of the segmentation methods, and the Between-class Distance (BD) is used to study the influence of the machine models. In addition, the features jointly shared by the two methodologies are further investigated for their behavior under multiple machine settings. Subsequently, the absolute value of the Pearson Correlation Coefficient (R abs ) is applied for redundancy elimination. Finally, the features that are reproducible and not redundant are preserved as the stable feature set, and a Support Vector Machine (SVM) classifier with 10-fold cross-validation is employed to verify their diagnostic ability. One hundred and fifty-three features showed high reproducibility with respect to segmentation (CCC > 0.9 and low Dev); the machine models and settings affected the BI-RADS features to various degrees. The 46 reproducible features were robust to these factors and were capable of distinguishing benign and malignant breast tumors. © 2017 American Association of Physicists in Medicine.
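    The core reproducibility screen named above is Lin's Concordance Correlation Coefficient with a 0.9 cut-off. A minimal sketch of that screen, on synthetic data standing in for one feature measured under two segmentation results (the data and variable names are illustrative, not from the study):

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population variances
    cov = ((x - mx) * (y - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

# Synthetic stand-in: one feature for 399 lesions, re-measured after a
# second segmentation that perturbs the values only slightly.
rng = np.random.default_rng(0)
feature_seg_a = rng.normal(10.0, 2.0, 399)
feature_seg_b = feature_seg_a + rng.normal(0.0, 0.2, 399)

reproducible = lins_ccc(feature_seg_a, feature_seg_b) > 0.9  # abstract's cut-off
```

    Unlike Pearson's correlation, the CCC penalizes both a mean shift and a scale difference between the two measurements, which is why it is the standard agreement metric here.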

  13. The oomycete broad-host-range pathogen Phytophthora capsici.

    Science.gov (United States)

    Lamour, Kurt H; Stam, Remco; Jupe, Julietta; Huitema, Edgar

    2012-05-01

    Phytophthora capsici is a highly dynamic and destructive pathogen of vegetables. It attacks all cucurbits, pepper, tomato and eggplant, and, more recently, snap and lima beans. Disease incidence and severity have increased significantly in recent decades, and the molecular resources to study this pathogen are growing and now include a reference genome. At the population level, the epidemiology varies with geographical location: populations in South America are dominated by clonal reproduction, whereas populations in the USA and South Africa comprise many unique genotypes in which sexual reproduction is common. Just as the impact of crop loss as a result of P. capsici has increased in recent decades, there has been a similar increase in the development of new tools and resources to study this devastating pathogen. Phytophthora capsici presents an attractive model for understanding broad-host-range oomycetes, the impact of sexual recombination in field populations and the basic mechanisms of Phytophthora virulence. Taxonomy: Kingdom Chromista; Phylum Oomycota; Class Oomycetes; Order Peronosporales; Family Peronosporaceae; Genus Phytophthora; Species capsici. Symptoms vary considerably according to the host, the plant part infected and environmental conditions. For example, in dry areas (e.g. southwestern USA and southern France), infection on tomato and bell or chilli pepper is generally on the roots and crown, and infected plants have a distinctive black/brown lesion visible at the soil line (Fig. 1). In areas in which rainfall is more common (e.g. eastern USA), all parts of the plant are infected, including the roots, crown, foliage and fruit (Fig. 1). Root infections cause damping off in seedlings, whereas, in older plants, it is common to see stunted growth, wilting and, eventually, death. For tomatoes, it is common to see significant adventitious root growth just above an infected tap root, and the stunted plants, although severely compromised, may not die.

  14. Highly Reproducible Automated Proteomics Sample Preparation Workflow for Quantitative Mass Spectrometry.

    Science.gov (United States)

    Fu, Qin; Kowalski, Michael P; Mastali, Mitra; Parker, Sarah J; Sobhani, Kimia; van den Broek, Irene; Hunter, Christie L; Van Eyk, Jennifer E

    2018-01-05

    Sample preparation for protein quantification by mass spectrometry requires multiple processing steps, including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions in which technical variations can be introduced and propagated. We established an automated sample preparation workflow with a total processing time for 96 samples of 5 h, including a 2 h incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, mean intraday and interday coefficients of variation (CVs) were determined for 5 serum and 5 plasma samples over 5 days; samples repeated on 3 separate days had total CVs below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. The automated trypsin digestion workflow thus yields uniformly processed samples in less than 5 h. Reproducible quantification of peptides was observed across replicates, days, instruments, and laboratory sites, demonstrating the broad applicability of this approach.
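    The precision metric throughout this record is the percent coefficient of variation with a 20% acceptance limit. A minimal sketch of that check (the peak-area values are synthetic; only the 20% threshold comes from the abstract):

```python
import numpy as np

def percent_cv(values):
    """Percent coefficient of variation: sample std / mean x 100."""
    values = np.asarray(values, float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical peak areas for one peptide across five replicate digests.
replicate_peak_areas = [1.02e6, 0.97e6, 1.05e6, 0.99e6, 1.01e6]

cv = percent_cv(replicate_peak_areas)
passes = cv < 20.0   # the abstract's acceptance limit
```

    In practice this is computed per peptide across replicates, days, instruments, and sites, and the fraction of peptides passing (93% at the second site here) summarizes transferability.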

  15. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    Science.gov (United States)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. As a result, even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate large volumes of diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a lightweight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances the usability of core components and integration with widely used software systems. In this talk we present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across the geoscience domains of hydrology, space science, and modeling toolkits.
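    The abstract does not specify the geounit format, so the following is only an illustrative sketch of the general idea: a snapshot could be a manifest of content hashes covering every file needed to reproduce a study, so that any later drift in data or code is detectable.

```python
import hashlib
import pathlib

def snapshot(root):
    """Return a manifest mapping relative file paths to SHA-256 content hashes.

    Illustrative only: not the GeoDataspace/geounit API, whose details the
    abstract does not specify.
    """
    root = pathlib.Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(q for q in root.rglob("*") if q.is_file())
    }
```

    Re-running `snapshot` later and diffing the two manifests shows exactly which inputs or scripts changed since the study was preserved.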

  16. Robust tissue classification for reproducible wound assessment in telemedicine environments

    Science.gov (United States)

    Wannous, Hazem; Treuillet, Sylvie; Lucas, Yves

    2010-04-01

    In telemedicine environments, a standardized and reproducible assessment of wounds, using a simple hand-held digital camera, is an essential requirement. However, to ensure robust tissue classification, particular attention must be paid to the complete design of the color processing chain. We introduce the key steps, including color correction, merging of expert labelings, and segmentation-driven classification based on support vector machines. The tool thus developed ensures stability under changes in lighting conditions, viewpoint, and camera, to achieve accurate and robust classification of skin tissues. Clinical tests demonstrate that such an advanced tool, which forms part of a complete 3-D and color wound assessment system, significantly improves the monitoring of the healing process. It achieves an overlap score of 79.3% against 69.1% for a single expert, after mapping onto the medical reference developed from the image labeling by a college of experts.
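    An overlap score like the one quoted can be read as the percentage of pixels whose predicted tissue label agrees with the expert reference map. A minimal sketch on tiny synthetic label maps (the tissue-class coding is hypothetical, not from the paper):

```python
import numpy as np

def overlap_score(pred, ref):
    """Percentage of pixels whose label matches the reference labeling."""
    pred, ref = np.asarray(pred), np.asarray(ref)
    return 100.0 * (pred == ref).mean()

# Hypothetical 3x3 tissue maps: 0 = granulation, 1 = slough, 2 = necrosis.
reference = np.array([[0, 0, 1],
                      [1, 2, 2],
                      [2, 2, 2]])
predicted = np.array([[0, 0, 1],
                      [1, 1, 2],
                      [2, 2, 2]])

score = overlap_score(predicted, reference)  # one of nine pixels disagrees
```

    Scoring the automatic classifier against a consensus reference built from several experts, rather than against any single expert, is what makes the 79.3% vs. 69.1% comparison meaningful.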

  17. Palladium gates for reproducible quantum dots in silicon.

    Science.gov (United States)

    Brauns, Matthias; Amitonov, Sergey V; Spruijtenburg, Paul-Christiaan; Zwanenburg, Floris A

    2018-04-09

    We replace the established aluminium gates for the formation of quantum dots