WorldWideScience

Sample records for model qualitatively reproduces

  1. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  2. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
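
    As a sketch of the kind of response curve the POD model describes, the snippet below fits a two-parameter logistic POD-versus-concentration curve to qualitative (detect/no-detect) data. The functional form, parameter names and spiking levels are illustrative assumptions, not taken from the paper or from any AOAC reference implementation.

```python
# Minimal sketch (not the AOAC/POD reference implementation): fit a
# two-parameter logistic POD-vs-concentration curve to qualitative
# (detect / no-detect) data. Levels and response rates are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def pod_curve(conc, c50, slope):
    # Assumed form: POD rises from 0 to 1 with log10(concentration),
    # reaching 0.5 at concentration c50.
    return 1.0 / (1.0 + np.exp(-slope * (np.log10(conc) - np.log10(c50))))

levels = np.array([0.1, 0.3, 1.0, 3.0, 10.0])            # spiking levels, e.g. CFU/g
observed_pod = np.array([0.05, 0.25, 0.60, 0.90, 1.00])  # fraction of positive replicates

params, _ = curve_fit(pod_curve, levels, observed_pod, p0=[1.0, 2.0])
print("estimated c50 = %.2f, slope = %.2f" % tuple(params))
print("predicted POD at 2 CFU/g = %.2f" % pod_curve(2.0, *params))
```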

  3. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: 1. standardized neural simulators, 2. shared computational resources, 3. declarative model descriptors, ontologies and standardized annotations; 4. model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  4. Qualitative modeling of the dynamics of detonations with losses

    KAUST Repository

    Faria, Luiz; Kasimov, Aslan R.

    2015-01-01

    We consider a simplified model for the dynamics of one-dimensional detonations with generic losses. It consists of a single partial differential equation that reproduces, at a qualitative level, the essential properties of unsteady detonation waves, including pulsating and chaotic solutions. In particular, we investigate the effects of shock curvature and friction losses on detonation dynamics. To calculate steady-state solutions, a novel approach to solving the detonation eigenvalue problem is introduced that avoids the well-known numerical difficulties associated with the presence of a sonic point. By using unsteady numerical simulations of the simplified model, we also explore the nonlinear stability of steady-state or quasi-steady solutions. © 2014 The Combustion Institute.

  5. Shear wave elastography for breast masses is highly reproducible.

    Science.gov (United States)

    Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude

    2012-05-01

    To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound. 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For the interobserver reproducibility, a blinded observer reviewed images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect while interobserver reproducibility of SWE proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.
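
    The agreement statistics quoted above (weighted kappa for ordinal features, intraclass correlation for quantitative ones) can be computed as sketched below. The ratings and stiffness values are hypothetical, and the ICC form shown is the two-way ICC(2,1), which is an assumption rather than the paper's stated variant.

```python
# Minimal sketch of the agreement statistics used in studies like this one:
# linear-weighted kappa for ordinal ratings and a two-way ICC(2,1) for
# repeated quantitative measurements. All values are hypothetical.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rater1 = [0, 1, 2, 2, 3, 1, 0, 2]   # e.g. ordinal SWE colour scores, observer 1
rater2 = [0, 1, 2, 3, 3, 1, 1, 2]   # observer 2
print("linear-weighted kappa:", cohen_kappa_score(rater1, rater2, weights="linear"))

# ICC(2,1): rows = masses, columns = repeated elasticity measurements (kPa)
x = np.array([[55., 58.], [120., 110.], [30., 33.], [80., 86.]])
n, k = x.shape
ms_rows = k * x.mean(axis=1).var(ddof=1)   # between-subject mean square
ms_cols = n * x.mean(axis=0).var(ddof=1)   # between-measurement mean square
ms_err = (x.var(ddof=0) * x.size - (n - 1) * ms_rows - (k - 1) * ms_cols) / ((n - 1) * (k - 1))
icc = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
print("ICC(2,1):", round(icc, 3))
```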

  6. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    Science.gov (United States)

    Turner, M. A.

    2015-12-01

    Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of

  7. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent, testing capability and a means to document model output and analysis.
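
    The checksum-on-output idea described above can be sketched as follows: record digests of run outputs in a version-controlled manifest so that any change in the solution, whether from a code update or new input data, shows up as a diff. The file layout and names are hypothetical, not the MOM6/SIS2 tooling.

```python
# Minimal sketch (hypothetical paths, not the MOM6/SIS2 tooling): write a
# manifest of SHA-256 checksums for experiment outputs. Committing this file
# to version control makes any change in the solution visible as a diff.
import hashlib, json, pathlib

def sha256_of(path, chunk=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

outputs = sorted(pathlib.Path("experiment/output").glob("*.nc"))  # assumed layout
manifest = {p.name: sha256_of(p) for p in outputs}
pathlib.Path("checksums.json").write_text(json.dumps(manifest, indent=2))
```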

  8. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  9. Qualitative models for space system engineering

    Science.gov (United States)

    Forbus, Kenneth D.

    1990-01-01

    The objectives of this project were: (1) to investigate the implications of qualitative modeling techniques for problems arising in the monitoring, diagnosis, and design of Space Station subsystems and procedures; (2) to identify the issues involved in using qualitative models to enhance and automate engineering functions. These issues include representing operational criteria, fault models, alternate ontologies, and modeling continuous signals at a functional level of description; and (3) to develop a prototype collection of qualitative models for fluid and thermal systems commonly found in Space Station subsystems. Potential applications of qualitative modeling to space-systems engineering, including the notion of intelligent computer-aided engineering are summarized. Emphasis is given to determining which systems of the proposed Space Station provide the most leverage for study, given the current state of the art. Progress on using qualitative models, including development of the molecular collection ontology for reasoning about fluids, the interaction of qualitative and quantitative knowledge in analyzing thermodynamic cycles, and an experiment on building a natural language interface to qualitative reasoning is reported. Finally, some recommendations are made for future research.

  10. Qualitative and quantitative histopathology in transitional cell carcinomas of the urinary bladder. An international investigation of intra- and interobserver reproducibility

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Sasaki, M; Fukuzawa, S

    1994-01-01

    a random, systematic sampling scheme. RESULTS: The results were compared by bivariate correlation analyses and Kendall's tau. The international interobserver reproducibility of qualitative gradings was rather poor (kappa = 0.51), especially for grade 2 tumors (kappa = 0.28). Likewise, the interobserver ... 0.54). This can probably be related to the manual design of the sampling scheme and may be solved by introducing a motorized object stage in the systematic selection of fields of vision for quantitative measurements. However, the nuclear mean size estimators are unaffected by such sampling variability ... of both qualitative and quantitative grading methods. Grading of malignancy was performed by one observer in Japan (using the World Health Organization scheme), and by two observers in Denmark (using the Bergkvist system). A "translation" between the systems, grade for grade, and kappa statistics were...

  11. Qualitative models of magnetic field accelerated propagation in a plasma due to the Hall effect

    International Nuclear Information System (INIS)

    Kukushkin, A.B.; Cherepanov, K.V.

    2000-01-01

    Two qualitatively new models of accelerated magnetic field propagation (relative to normal diffusion) in a plasma due to the Hall effect are developed within the framework of electron magnetohydrodynamics. The first model is based on a simple hydrodynamic approach which, in particular, reproduces a number of known theoretical results. The second one makes it possible to obtain an exact analytical description of the basic characteristics of accelerated magnetic field propagation in an inhomogeneous isothermal plasma, namely, the magnetic field front and its effective width [ru]

  12. Anatomical Reproducibility of a Head Model Molded by a Three-dimensional Printer.

    Science.gov (United States)

    Kondo, Kosuke; Nemoto, Masaaki; Masuda, Hiroyuki; Okonogi, Shinichi; Nomoto, Jun; Harada, Naoyuki; Sugo, Nobuo; Miyazaki, Chikao

    2015-01-01

    We prepared rapid prototyping models of heads with unruptured cerebral aneurysm based on image data of computed tomography angiography (CTA) using a three-dimensional (3D) printer. The objective of this study was to evaluate the anatomical reproducibility and accuracy of these models by comparison with the CTA images on a monitor. The subjects were 22 patients with unruptured cerebral aneurysm who underwent preoperative CTA. Reproducibility of the microsurgical anatomy of skull bone and arteries, the length and thickness of the main arteries, and the size of cerebral aneurysm were compared between the CTA image and rapid prototyping model. The microsurgical anatomy and arteries were favorably reproduced, apart from a few minute regions, in the rapid prototyping models. No significant difference was noted in the measured lengths of the main arteries between the CTA image and rapid prototyping model, but errors were noted in their thickness. It was concluded that these models are useful tools for neurosurgical simulation. The thickness of the main arteries and size of cerebral aneurysm should be judged comprehensively, together with other neuroimaging, in consideration of these errors.

  13. Modeling and evaluating repeatability and reproducibility of ordinal classifications

    NARCIS (Netherlands)

    de Mast, J.; van Wieringen, W.N.

    2010-01-01

    This paper argues that currently available methods for the assessment of the repeatability and reproducibility of ordinal classifications are not satisfactory. The paper aims to study whether we can modify a class of models from Item Response Theory, well established for the study of the reliability

  14. The construction of a two-dimensional reproducing kernel function and its application in a biomedical model.

    Science.gov (United States)

    Guo, Qi; Shen, Shu-Ting

    2016-04-29

    There are two major classes of cardiac tissue models: the ionic model and the FitzHugh-Nagumo model. During computer simulation, each model entails solving a system of complex ordinary differential equations and a partial differential equation with no-flux boundary conditions. The reproducing kernel method possesses significant applications in solving partial differential equations. The derivative of the reproducing kernel function is a wavelet function, which has local properties and sensitivities to singularity. Therefore, study on the application of the reproducing kernel would be advantageous. The objective is to apply new mathematical theory to the numerical solution of the ventricular muscle model so as to improve its precision in comparison with existing methods. A two-dimensional reproducing kernel function is constructed in a suitable reproducing kernel space and applied in computing the solution of a two-dimensional cardiac tissue model by means of the difference method through time and the reproducing kernel method through space. Compared with other methods, this method holds several advantages such as high accuracy in computing solutions, insensitivity to different time steps and a slow propagation speed of error. It is suitable for disorderly scattered node systems without meshing, and can arbitrarily change the location and density of the solution on different time layers. The reproducing kernel method has higher solution accuracy and stability in the solutions of the two-dimensional cardiac tissue model.
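
    For reference, the defining property that gives reproducing kernel methods their name is the standard textbook statement below; this is a general reminder, not the specific two-dimensional kernel constructed in the paper.

```latex
% Standard defining property of a reproducing kernel (general statement, not
% the paper's specific two-dimensional kernel): for a Hilbert space H of
% functions on a domain D with inner product <.,.>_H, a kernel K : D x D -> R
% is reproducing if K(.,y) belongs to H for every y in D and
\[
  f(y) \;=\; \bigl\langle f,\; K(\cdot,\,y) \bigr\rangle_{H}
  \qquad \text{for all } f \in H,\ y \in D .
\]
```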

  15. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  16. Intra- and interobserver reliability and intra-catheter reproducibility using frequency domain optical coherence tomography for the evaluation of morphometric stent parameters and qualitative assessment of stent strut coverage

    International Nuclear Information System (INIS)

    Antonsen, Lisbeth; Thayssen, Per; Junker, Anders; Veien, Karsten Tange; Hansen, Henrik Steen; Hansen, Knud Nørregaard; Hougaard, Mikkel; Jensen, Lisette Okkels

    2015-01-01

    Purpose: Frequency-domain optical coherence tomography (FD-OCT) is a high-resolution imaging tool (~ 10–15 μm), which enables near-histological in-vivo images of the coronary vessel wall. The use of the technique is increasing, both for research- and clinical purposes. This study sought to investigate the intra- and interobserver reliability, as well as the intra-catheter reproducibility of quantitative FD-OCT-assessment of morphometric stent parameters and qualitative FD-OCT-evaluation of strut coverage in 10 randomly selected 6-month follow-up Nobori® biolimus-eluting stents (N-BESs). Methods: Ten N-BESs (213 cross sectional areas (CSAs) and 1897 struts) imaged with OCT 6 months post-implantation were randomly selected and analyzed by 2 experienced analysts, and the same 10 N-BESs were analyzed by one of the analysts 3 months later. Further, 2 consecutive pullbacks randomly performed in another 10 N-BESs (219 CSAs and 1860 struts) were independently assessed by one of the analysts. Results: The intraobserver variability with regard to relative difference of mean luminal area and mean stent area at the CSA-level was very low: 0.1% ± 1.4% and 0.5% ± 3.2%. Interobserver variability also proved to be low: − 2.1% ± 3.3% and 2.1% ± 4.6%, and moreover, very restricted intra-catheter variation was observed: 0.02% ± 6.8% and − 0.18% ± 5.2%. The intraobserver-, interobserver- and intra-catheter reliability for the qualitative evaluation of strut coverage was found to be: kappa (κ) = 0.91 (95% confidence interval (CI): 0.88–0.93, p < 0.01), κ = 0.88 (95% CI: 0.85–0.91, p < 0.01), and κ = 0.73 (95% CI: 0.68–0.78, p < 0.01), respectively. Conclusions: FD-OCT is a reproducible and reliable imaging tool for quantitative evaluation of stented coronary segments, and for qualitative assessment of strut coverage. - Highlights: • Frequency-domain optical coherence tomography (FD-OCT) is increasingly adopted in the catheterization laboratories. • This

  17. Intra- and interobserver reliability and intra-catheter reproducibility using frequency domain optical coherence tomography for the evaluation of morphometric stent parameters and qualitative assessment of stent strut coverage

    Energy Technology Data Exchange (ETDEWEB)

    Antonsen, Lisbeth, E-mail: Lisbeth.antonsen@rsyd.dk; Thayssen, Per; Junker, Anders; Veien, Karsten Tange; Hansen, Henrik Steen; Hansen, Knud Nørregaard; Hougaard, Mikkel; Jensen, Lisette Okkels

    2015-12-15

    Purpose: Frequency-domain optical coherence tomography (FD-OCT) is a high-resolution imaging tool (~ 10–15 μm), which enables near-histological in-vivo images of the coronary vessel wall. The use of the technique is increasing, both for research- and clinical purposes. This study sought to investigate the intra- and interobserver reliability, as well as the intra-catheter reproducibility of quantitative FD-OCT-assessment of morphometric stent parameters and qualitative FD-OCT-evaluation of strut coverage in 10 randomly selected 6-month follow-up Nobori® biolimus-eluting stents (N-BESs). Methods: Ten N-BESs (213 cross sectional areas (CSAs) and 1897 struts) imaged with OCT 6 months post-implantation were randomly selected and analyzed by 2 experienced analysts, and the same 10 N-BESs were analyzed by one of the analysts 3 months later. Further, 2 consecutive pullbacks randomly performed in another 10 N-BESs (219 CSAs and 1860 struts) were independently assessed by one of the analysts. Results: The intraobserver variability with regard to relative difference of mean luminal area and mean stent area at the CSA-level was very low: 0.1% ± 1.4% and 0.5% ± 3.2%. Interobserver variability also proved to be low: − 2.1% ± 3.3% and 2.1% ± 4.6%, and moreover, very restricted intra-catheter variation was observed: 0.02% ± 6.8% and − 0.18% ± 5.2%. The intraobserver-, interobserver- and intra-catheter reliability for the qualitative evaluation of strut coverage was found to be: kappa (κ) = 0.91 (95% confidence interval (CI): 0.88–0.93, p < 0.01), κ = 0.88 (95% CI: 0.85–0.91, p < 0.01), and κ = 0.73 (95% CI: 0.68–0.78, p < 0.01), respectively. Conclusions: FD-OCT is a reproducible and reliable imaging tool for quantitative evaluation of stented coronary segments, and for qualitative assessment of strut coverage. - Highlights: • Frequency-domain optical coherence tomography (FD-OCT) is increasingly adopted in the catheterization laboratories. • This

  18. Interpretative intra- and interobserver reproducibility of Stress/Rest 99mTc-sestamibi myocardial perfusion SPECT using a semi-quantitative 20-segment model

    International Nuclear Information System (INIS)

    Fazeli, M.; Firoozi, F.

    2002-01-01

    It is well established that myocardial perfusion SPECT with 201Tl or 99mTc-sestamibi plays an important role in diagnosis and risk assessment in patients with known or suspected coronary artery disease. Both quantitative and qualitative methods are available for interpretation of images. The use of a semi-quantitative scoring system in which each of 20 segments is scored according to a five-point scheme provides an approach to interpretation that is more systematic and reproducible than simple qualitative evaluation. Only a limited number of studies have dealt with the interpretive observer reproducibility of 99mTc-sestamibi myocardial perfusion imaging. The aim of this study was to assess the intra- and interobserver variability of semi-quantitative SPECT performed with this technique. Among 789 patients who underwent myocardial perfusion SPECT during the last year, 80 patients finally needed coronary angiography as the gold standard. In this group of patients a semi-quantitative visual interpretation was carried out using short-axis and vertical long-axis myocardial tomograms and a 20-segment model. These segments were assigned to six evenly spaced regions in the apical, mid-ventricular and basal short-axis views and two apical segments on the mid-ventricular long-axis slice. Uptake in each segment was graded on a 5-point scale (0 = normal, 1 = equivocal, 2 = moderate, 3 = severe, 4 = absence of uptake). The sestamibi images were interpreted separately twice by two observers without knowledge of each other's findings or the results of angiography. A SPECT study was judged abnormal if there were two or more segments with a stress score equal to or greater than 2. We concluded that semi-quantitative visual analysis is a simple and reproducible method of interpretation.
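
    The scoring arithmetic described above can be sketched as follows. The segment scores are hypothetical, and the "summed stress score" shown is a conventional aggregate index assumed here rather than stated in the abstract; only the abnormality criterion (two or more segments scoring 2 or more) comes from the text.

```python
# Minimal sketch of the semi-quantitative 20-segment scoring: each segment is
# graded 0 (normal) to 4 (absent uptake); a study is judged abnormal if two or
# more segments have a stress score of 2 or more. Scores below are hypothetical.
stress_scores = [0, 0, 1, 2, 3, 0, 0, 0, 1, 0, 2, 0, 0, 0, 0, 1, 0, 0, 0, 0]
assert len(stress_scores) == 20

summed_stress_score = sum(stress_scores)                  # assumed global severity index
n_abnormal_segments = sum(s >= 2 for s in stress_scores)
study_abnormal = n_abnormal_segments >= 2                 # criterion from the abstract

print("summed stress score:", summed_stress_score)
print("abnormal study:", study_abnormal)
```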

  19. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohort for one of the three tested models [Area under the Receiver Operating Curve (AUC) cohort differences model: 0.85], signaling that the validation leans towards transferability. Two out of three models had a lower AUC for validation (0.66 and 0.58), while one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate if a validation targeted transferability (large differences between training/validation cohort) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
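
    The core idea of the cohort differences model can be sketched as follows: train a classifier to predict whether a patient comes from the training or the validation cohort, and inspect its AUC. A high AUC means the cohorts differ, so the validation tests transferability rather than reproducibility. The features and classifier below are illustrative assumptions, not the paper's variables.

```python
# Minimal sketch of the "cohort differences model" idea. A cross-validated
# classifier tries to tell training-cohort patients from validation-cohort
# patients; AUC near 0.5 -> similar cohorts (reproducibility), AUC near 1.0 ->
# different cohorts (transferability). Features are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X_train_cohort = rng.normal(0.0, 1.0, size=(200, 5))   # e.g. age, stage, dose, ...
X_valid_cohort = rng.normal(0.4, 1.0, size=(154, 5))   # shifted -> cohorts differ

X = np.vstack([X_train_cohort, X_valid_cohort])
y = np.r_[np.zeros(len(X_train_cohort)), np.ones(len(X_valid_cohort))]  # 1 = validation cohort

prob = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=5,
                         method="predict_proba")[:, 1]
print("cohort-differences AUC: %.2f" % roc_auc_score(y, prob))
```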

  20. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  1. Angiographic core laboratory reproducibility analyses: implications for planning clinical trials using coronary angiography and left ventriculography end-points.

    Science.gov (United States)

    Steigen, Terje K; Claudio, Cheryl; Abbott, David; Schulzer, Michael; Burton, Jeff; Tymchak, Wayne; Buller, Christopher E; John Mancini, G B

    2008-06-01

    To assess reproducibility of core laboratory performance and impact on sample size calculations. Little information exists about overall reproducibility of core laboratories in contradistinction to performance of individual technicians. Also, qualitative parameters are being adjudicated increasingly as either primary or secondary end-points. The comparative impact of using diverse indexes on sample sizes has not been previously reported. We compared initial and repeat assessments of five quantitative parameters [e.g., minimum lumen diameter (MLD), ejection fraction (EF), etc.] and six qualitative parameters [e.g., TIMI myocardial perfusion grade (TMPG) or thrombus grade (TTG), etc.], as performed by differing technicians and separated by a year or more. Sample sizes were calculated from these results. TMPG and TTG were also adjudicated by a second core laboratory. MLD and EF were the most reproducible, yielding the smallest sample size calculations, whereas percent diameter stenosis and centerline wall motion require substantially larger trials. Of the qualitative parameters, all except TIMI flow grade gave reproducibility characteristics yielding sample sizes of many hundreds of patients. Reproducibility of TMPG and TTG was only moderately good both within and between core laboratories, underscoring an intrinsic difficulty in assessing these. Core laboratories can be shown to provide reproducibility performance that is comparable to performance commonly ascribed to individual technicians. The differences in reproducibility yield huge differences in sample size when comparing quantitative and qualitative parameters. TMPG and TTG are intrinsically difficult to assess and conclusions based on these parameters should arise only from very large trials.
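
    The link between end-point reproducibility and trial size can be sketched with the standard two-sample formula below: the noisier the measurement relative to the difference to be detected, the larger the trial. The standard deviations and target differences are hypothetical, not the values estimated by the core laboratory.

```python
# Minimal sketch: patients per arm from the standard two-sample formula
# n = 2 * ((z_alpha/2 + z_beta) * sd / delta)^2. Numbers are hypothetical.
import math
from scipy.stats import norm

def n_per_arm(sd, delta, alpha=0.05, power=0.80):
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return math.ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

print("highly reproducible end-point:", n_per_arm(sd=0.20, delta=0.15), "patients per arm")
print("noisy end-point:              ", n_per_arm(sd=10.0, delta=4.0), "patients per arm")
```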

  2. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    Science.gov (United States)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed aimed at reproducing various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes for peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers. These latter were reproduced in the models by silicone. The sand forming the models has been previously mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The characteristic components of magnetization isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms

  3. Decision making in government tenders: A formalized qualitative model

    Directory of Open Access Journals (Sweden)

    Štěpán Veselý

    2012-01-01

    Full Text Available The paper presents a simple formalized qualitative model of government tenders (GTs). Qualitative models use just three values: Positive/Increasing, Zero/Constant and Negative/Decreasing. Such quantifiers of trends are the least information intensive. Qualitative models can be useful, since GT evaluation often includes goals such as the efficiency of public purchasing, and variables such as the availability of relevant information or the subjectivity of judgment, that are difficult to quantify. Hence, a significant fraction of available information about GTs is not of numerical nature, e.g. if availability of relevant information is decreasing then efficiency of public purchasing is decreasing as well. Such equationless relations are studied in this paper. A qualitative model of the function F(Goals, Variables) is developed. The model has four goal functions, eight variables, and 39 equationless relations. The model is solved and seven solutions, i.e. scenarios, are obtained. All qualitative states, including first and second qualitative derivatives with respect to time, of all variables are specified for each scenario. Any unsteady-state behavior of the GT model is described by its transitional oriented graph. There are eight possible transitions among seven scenarios. No a priori knowledge of qualitative modeling is required on the reader's part.
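
    The three-valued reasoning used in such models can be sketched as follows: trends take the values increasing, constant or decreasing, and an equationless relation simply propagates a sign from one variable to another. The variable names and the single relation below are illustrative; the paper's model has eight variables and 39 relations.

```python
# Minimal sketch of three-valued qualitative trend propagation:
# +1 = increasing, 0 = constant, -1 = decreasing. A relation (src, tgt, sign)
# means the trend of tgt follows sign * trend of src.
INC, CONST, DEC = +1, 0, -1

def propagate(trends, relations):
    changed = True
    while changed:
        changed = False
        for src, tgt, sign in relations:
            if trends.get(src) is None:
                continue
            implied = sign * trends[src]
            if trends.get(tgt) is None:
                trends[tgt] = implied
                changed = True
    return trends

trends = {"info_availability": DEC, "purchasing_efficiency": None}
relations = [("info_availability", "purchasing_efficiency", +1)]  # same direction
print(propagate(trends, relations))  # purchasing_efficiency becomes -1 (decreasing)
```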

  4. A qualitatively validated mathematical-computational model of the immune response to the yellow fever vaccine.

    Science.gov (United States)

    Bonin, Carla R B; Fernandes, Guilherme C; Dos Santos, Rodrigo W; Lobosco, Marcelo

    2018-05-25

    Although a safe and effective yellow fever vaccine was developed more than 80 years ago, several issues regarding its use remain unclear. For example, what is the minimum dose that can provide immunity against the disease? A useful tool that can help researchers answer this and other related questions is a computational simulator that implements a mathematical model describing the human immune response to vaccination against yellow fever. This work uses a system of ten ordinary differential equations to represent a few important populations in the response process generated by the body after vaccination. The main populations include viruses, APCs, CD8+ T cells, short-lived and long-lived plasma cells, B cells and antibodies. In order to qualitatively validate our model, four experiments were carried out, and their computational results were compared to experimental data obtained from the literature. The four experiments were: a) simulation of a scenario in which an individual was vaccinated against yellow fever for the first time; b) simulation of a booster dose ten years after the first dose; c) simulation of the immune response to the yellow fever vaccine in individuals with different levels of naïve CD8+ T cells; and d) simulation of the immune response to distinct doses of the yellow fever vaccine. This work shows that the simulator was able to qualitatively reproduce some of the experimental results reported in the literature, such as the amount of antibodies and viremia throughout time, as well as to reproduce other behaviors of the immune response reported in the literature, such as those that occur after a booster dose of the vaccine.
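
    The simulation approach described above amounts to integrating a coupled ODE system. The toy virus/antibody pair below illustrates that machinery only; it is not the paper's ten-equation model, and its parameters are arbitrary rather than fitted to vaccine data.

```python
# Minimal sketch: integrate a small ODE system with scipy. A toy two-variable
# virus/antibody model, NOT the paper's ten-equation immune-response model.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, r=1.5, k=0.8, p=0.3, d=0.05):
    v, a = y                      # v: viral load, a: antibody level (arbitrary units)
    dv = r * v - k * v * a        # viral replication minus antibody-mediated clearance
    da = p * v - d * a            # antibody production driven by antigen, with decay
    return [dv, da]

sol = solve_ivp(rhs, t_span=(0, 30), y0=[1e-3, 0.0], dense_output=True)
t = np.linspace(0, 30, 7)
print(np.round(sol.sol(t), 3))    # viremia rises, then is cleared as antibodies build up
```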

  5. The Accuracy and Reproducibility of Linear Measurements Made on CBCT-derived Digital Models.

    Science.gov (United States)

    Maroua, Ahmad L; Ajaj, Mowaffak; Hajeer, Mohammad Y

    2016-04-01

    To evaluate the accuracy and reproducibility of linear measurements made on cone-beam computed tomography (CBCT)-derived digital models. A total of 25 patients (44% female, 18.7 ± 4 years) who had CBCT images for diagnostic purposes were included. Plaster models were obtained and digital models were extracted from CBCT scans. Seven linear measurements from predetermined landmarks were measured and analyzed on plaster models and the corresponding digital models. The measurements included arch length and width at different sites. Paired t test and Bland-Altman analysis were used to evaluate the accuracy of measurements on digital models compared to the plaster models. Also, intraclass correlation coefficients (ICCs) were used to evaluate the reproducibility of the measurements in order to assess the intraobserver reliability. The statistical analysis showed significant differences in 5 of the 14 variables, and the mean differences ranged from -0.48 to 0.51 mm. The Bland-Altman analysis revealed that the mean difference between variables was (0.14 ± 0.56) and (0.05 ± 0.96) mm and limits of agreement between the two methods ranged from -1.2 to 0.96 and from -1.8 to 1.9 mm in the maxilla and the mandible, respectively. The intraobserver reliability values were determined for all 14 variables on the two types of models separately. The mean ICC value for the plaster models was 0.984 (0.924-0.999), while it was 0.946 for the CBCT models (range from 0.850 to 0.985). Linear measurements obtained from the CBCT-derived models appeared to have a high level of accuracy and reproducibility.
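
    A Bland-Altman agreement analysis of the kind reported above boils down to the bias and 95% limits of agreement between paired measurements, as sketched below with hypothetical values (not the study's data).

```python
# Minimal sketch of a Bland-Altman analysis: bias (mean difference) and 95%
# limits of agreement between plaster and CBCT-derived measurements (mm).
# The paired values are hypothetical.
import numpy as np

plaster = np.array([35.2, 41.0, 28.7, 33.5, 39.8, 30.1])
digital = np.array([35.0, 41.4, 28.2, 33.9, 39.5, 30.6])

diff = digital - plaster
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f} mm, 95% limits of agreement = [{loa_low:.2f}, {loa_high:.2f}] mm")
```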

  6. Examination of reproducibility in microbiological degradation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy source. Toluene was degraded under aerobic conditions at a constant temperature of 28 °C. The experiments were modelled by a Monod model - extended to meet the air/liquid system, and the parameter values were estimated using a statistical nonlinear estimation procedure. Model reduction analysis resulted in a simpler model without the biomass decay term. In order to test for model reduction and reproducibility of parameter estimates, a likelihood ratio test was employed. The limited reproducibility for these experiments implied that all 9 batch experiments could not be described by the same set...
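
    The model structure and test described above can be sketched as follows: a Monod-type batch system for substrate and biomass with an optional decay term, and a likelihood-ratio statistic comparing the full fit against the reduced fit without decay. The parameter values and log-likelihoods below are illustrative, not the estimates from the study, and the gas/liquid partitioning of the extended model is omitted.

```python
# Minimal sketch: Monod batch kinetics (substrate S, biomass X, decay b) and a
# likelihood-ratio test for dropping the decay term. Numbers are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import chi2

def monod(t, y, mu_max=0.5, Ks=2.0, Y=0.6, b=0.02):
    S, X = y
    growth = mu_max * S / (Ks + S) * X
    return [-growth / Y, growth - b * X]

sol = solve_ivp(monod, (0, 48), y0=[50.0, 1.0], t_eval=np.linspace(0, 48, 9))
print(np.round(sol.y, 2))   # substrate depletion and biomass growth over 48 h

# Likelihood-ratio test for the reduced model (b = 0), 1 parameter difference;
# the log-likelihoods below are hypothetical.
lr = 2 * (-101.3 - (-102.1))
print("LR = %.2f, p = %.3f" % (lr, chi2.sf(lr, df=1)))
```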

  7. Intra- and interobserver reliability and intra-catheter reproducibility using frequency domain optical coherence tomography for the evaluation of morphometric stent parameters and qualitative assessment of stent strut coverage

    DEFF Research Database (Denmark)

    Antonsen, Lisbeth; Thayssen, Per; Junker, Anders

    2015-01-01

    to investigate the intra- and interobserver reliability, as well as the intra-catheter reproducibility of quantitative FD-OCT-assessment of morphometric stent parameters and qualitative FD-OCT-evaluation of strut coverage in 10 randomly selected 6-month follow-up Nobori® biolimus-eluting stents (N-BESs). METHODS: ... in another 10 N-BESs (219 CSAs and 1860 struts) were independently assessed by one of the analysts. RESULTS: The intraobserver variability with regard to relative difference of mean luminal area and mean stent area at the CSA-level was very low: 0.1%±1.4% and 0.5%±3.2%. Interobserver variability also proved ... (CI): 0.88-0.93, p < 0.01) ... stented coronary segments, and for qualitative assessment of strut coverage.

  8. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  9. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physical system are usually constructed based on stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with the expert knowledge, uncertain reasoning, and other qualitative information, a qualitative and quantitative combined modeling specification was proposed based on a hierarchical model structure framework. The new modeling approach is based on a hierarchical model structure which includes the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe the complex system more comprehensively, and that the survival probability of the target is higher when qualitative models are introduced into the quantitative simulation.

  10. Controller Synthesis using Qualitative Models and Constraints

    OpenAIRE

    Ramamoorthy, Subramanian; Kuipers, Benjamin J

    2004-01-01

    Many engineering systems require the synthesis of global behaviors in nonlinear dynamical systems. Multiple model approaches to control design make it possible to synthesize robust and optimal versions of such global behaviors. We propose a methodology called Qualitative Heterogeneous Control that enables this type of control design. This methodology is based on a separation of concerns between qualitative correctness and quantitative optimization. Qualitative sufficient conditions are derived...

  11. Reproducibility of qualitative assessments of temporal lobe atrophy in MRI studies.

    Science.gov (United States)

    Sarria-Estrada, S; Acevedo, C; Mitjana, R; Frascheri, L; Siurana, S; Auger, C; Rovira, A

    2015-01-01

    To determine the reproducibility of the Scheltens visual rating scale in establishing atrophy of the medial temporal lobe. We used coronal T1-weighted inversion recovery sequences on a 1.5 Tesla MRI scanner to study 25 patients with clinically diagnosed Alzheimer's disease or mild cognitive decline and 25 subjects without cognitive decline. Five neuroradiologists trained to apply the Scheltens visual rating scale analyzed the images. We used the intraclass correlation coefficient to evaluate interrater and intrarater agreement. Raters scored 20 (80%) of the 25 patients with mild cognitive decline or Alzheimer's disease between 2 and 4; by contrast, they scored 21 (84%) of the 25 subjects without cognitive decline between 0 and 1. The interrater agreement was consistently greater than 0.82, with a 95% confidence interval of (0.7-0.9). The intrarater agreement ranged from 0.82 to 0.87, with a 95% confidence interval of (0.56-0.93). The Scheltens visual rating scale is reproducible among observers, and this finding supports its use in clinical practice. Copyright © 2013 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  12. NRFixer: Sentiment Based Model for Predicting the Fixability of Non-Reproducible Bugs

    Directory of Open Access Journals (Sweden)

    Anjali Goyal

    2017-08-01

    Full Text Available Software maintenance is an essential step in the software development life cycle. Nowadays, software companies spend approximately 45% of the total cost on maintenance activities. Large software projects maintain bug repositories to collect, organize and resolve bug reports. Sometimes it is difficult to reproduce the reported bug with the information present in a bug report and thus this bug is marked with resolution non-reproducible (NR). When NR bugs are reconsidered, a few of them might get fixed (NR-to-fix), leaving the others with the same resolution (NR). To analyse the behaviour of developers towards NR-to-fix and NR bugs, the sentiment analysis of NR bug report textual contents has been conducted. The sentiment analysis of bug reports shows that NR bugs' sentiments incline towards more negativity than reproducible bugs. Also, there is a noticeable opinion drift found in the sentiments of NR-to-fix bug reports. Observations driven from this analysis were an inspiration to develop a model that can judge the fixability of NR bugs. Thus a framework, NRFixer, which predicts the probability of NR bug fixation, is proposed. NRFixer was evaluated with two dimensions. The first dimension considers meta-fields of bug reports (model-1) and the other dimension additionally incorporates the sentiments (model-2) of developers for prediction. Both models were compared using various machine learning classifiers (Zero-R, naive Bayes, J48, random tree and random forest). The bug reports of Firefox and Eclipse projects were used to test NRFixer. In Firefox and Eclipse projects, J48 and Naive Bayes classifiers achieve the best prediction accuracy, respectively. It was observed that the inclusion of sentiments in the prediction model shows a rise in the prediction accuracy ranging from 2 to 5% for various classifiers.
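
    The two set-ups compared in the paper (meta-fields only vs. meta-fields plus sentiment) can be sketched as below. The features, labels and classifier choice are hypothetical stand-ins; the paper used Firefox and Eclipse bug reports with several classifiers (Zero-R, naive Bayes, J48, random tree, random forest).

```python
# Minimal sketch of model-1 (meta-fields only) vs. model-2 (meta-fields plus a
# sentiment score) for predicting whether an NR bug will later be fixed.
# All features and labels are synthetic.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 300
meta = rng.normal(size=(n, 3))        # e.g. #comments, #CC'd users, severity code
sentiment = rng.normal(size=(n, 1))   # mean sentiment of report comments
y = (0.8 * sentiment[:, 0] + 0.3 * meta[:, 0] + rng.normal(size=n) > 0).astype(int)  # 1 = NR-to-fix

model1 = cross_val_score(GaussianNB(), meta, y, cv=5).mean()
model2 = cross_val_score(GaussianNB(), np.hstack([meta, sentiment]), y, cv=5).mean()
print(f"accuracy, meta-fields only: {model1:.2f}; with sentiment: {model2:.2f}")
```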

  13. Statistical Methods for the Qualitative Assessment of Dynamic Models with Time Delay (R Package qualV)

    Directory of Open Access Journals (Sweden)

    Stefanie Jachner

    2007-06-01

    Full Text Available Results of ecological models differ, to some extent, more from measured data than from empirical knowledge. Existing techniques for validation based on quantitative assessments sometimes cause an underestimation of the performance of models due to time shifts, accelerations and delays or systematic differences between measurement and simulation. However, for the application of such models it is often more important to reproduce essential patterns instead of seemingly exact numerical values. This paper presents techniques to identify patterns and numerical methods to measure the consistency of patterns between observations and model results. An orthogonal set of deviance measures for absolute, relative and ordinal scale was compiled to provide information about the type of difference. Furthermore, two different approaches accounting for time shifts were presented. The first one transforms the time axis to take time delays and speed differences into account. The second one describes known qualitative criteria dividing time series into interval units in accordance with their main features. The methods differ in their basic concepts and in the form of the resulting criteria. Both approaches and the deviance measures discussed are implemented in an R package. All methods are demonstrated by means of water quality measurements and simulation data. The proposed quality criteria allow one to recognize systematic differences and time shifts between time series and to conclude about the quantitative and qualitative similarity of patterns.

  14. Qualitative models of global warming amplifiers

    NARCIS (Netherlands)

    Milošević, U.; Bredeweg, B.; de Kleer, J.; Forbus, K.D.

    2010-01-01

    There is growing interest from ecological experts to create qualitative models of phenomena for which numerical information is sparse or missing. We present a number of successful models in the field of environmental science, namely, the domain of global warming. The motivation behind the effort is

  15. Assessment of the potential forecasting skill of a global hydrological model in reproducing the occurrence of monthly flow extremes

    Directory of Open Access Journals (Sweden)

    N. Candogan Yossef

    2012-11-01

    Full Text Available As an initial step in assessing the prospect of using global hydrological models (GHMs for hydrological forecasting, this study investigates the skill of the GHM PCR-GLOBWB in reproducing the occurrence of past extremes in monthly discharge on a global scale. Global terrestrial hydrology from 1958 until 2001 is simulated by forcing PCR-GLOBWB with daily meteorological data obtained by downscaling the CRU dataset to daily fields using the ERA-40 reanalysis. Simulated discharge values are compared with observed monthly streamflow records for a selection of 20 large river basins that represent all continents and a wide range of climatic zones.

    We assess model skill in three ways, all of which contribute different information on the potential forecasting skill of a GHM. First, the general skill of the model in reproducing hydrographs is evaluated. Second, model skill in reproducing significantly higher and lower flows than the monthly normals is assessed in terms of skill scores used for forecasts of categorical events. Third, model skill in reproducing flood and drought events is assessed by constructing binary contingency tables for floods and droughts for each basin. The skill is then compared to that of a simple estimation of discharge from the water balance (P − E).

    The results show that the model has skill in all three types of assessments. After bias correction the model skill in simulating hydrographs is improved considerably. For most basins it is higher than that of the climatology. The skill is highest in reproducing monthly anomalies. The model also has skill in reproducing floods and droughts, with a markedly higher skill in floods. The model skill far exceeds that of the water balance estimate. We conclude that the prospect for using PCR-GLOBWB for monthly and seasonal forecasting of the occurrence of hydrological extremes is positive. We argue that this conclusion applies equally to other similar GHMs and
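
    The binary contingency-table verification described above reduces to counting hits, misses, false alarms and correct negatives for each basin, from which standard scores follow, as sketched below with hypothetical monthly flood series.

```python
# Minimal sketch of contingency-table skill for monthly flood occurrence:
# hit rate, false-alarm ratio and the Heidke skill score. Series are hypothetical.
import numpy as np

obs = np.array([0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0], dtype=bool)   # observed floods
sim = np.array([0, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0], dtype=bool)   # simulated floods

hits = np.sum(sim & obs)
misses = np.sum(~sim & obs)
false_alarms = np.sum(sim & ~obs)
correct_negatives = np.sum(~sim & ~obs)

hit_rate = hits / (hits + misses)
far = false_alarms / (hits + false_alarms)
n = obs.size
expected = ((hits + misses) * (hits + false_alarms) +
            (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
hss = (hits + correct_negatives - expected) / (n - expected)   # Heidke skill score
print(f"hit rate={hit_rate:.2f}, FAR={far:.2f}, HSS={hss:.2f}")
```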

  16. Environmental Consequences of Wildlife Tourism: The Use of Formalised Qualitative Models

    Directory of Open Access Journals (Sweden)

    Veselý Štěpán

    2015-09-01

    Full Text Available The paper presents a simple qualitative model of environmental consequences of wildlife tourism. Qualitative models use just three values: Positive/Increasing, Zero/Constant and Negative/Decreasing. Such quantifiers of trends are the least information intensive. Qualitative models can be useful, since models of wildlife tourism include such variables as, for example, Biodiversity (BIO), Animals’ habituation to tourists (HAB) or Plant composition change (PLA), that are sometimes difficult or costly to quantify. Hence, a significant fraction of available information about wildlife tourism and its consequences is not of numerical nature, for example, if HAB is increasing then BIO is decreasing. Such equationless relations are studied in this paper. The model has 10 variables and 20 equationless pairwise interrelations among them. The model is solved and 15 solutions, that is, scenarios are obtained. All qualitative states, including the first and second qualitative derivatives with respect to time, of all variables are specified for each scenario.

  17. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Being represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated from the potential of such a combination in safety analysis based on models comprising both software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  18. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    International Nuclear Information System (INIS)

    Tian, Jun-fang; Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao

    2012-01-01

    Velocity effect and critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown has been explained. Various congested patterns induced by the on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and of free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► Velocity effect is added into the average space gap cellular automaton model. ► The physics of traffic breakdown has been explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.
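
    For readers unfamiliar with traffic cellular automata, the sketch below implements the classic Nagel-Schreckenberg update rule (acceleration, braking to the space gap, random slowdown, movement) on a circular road. It is a simpler relative of the average-space-gap model discussed in the record, not a reimplementation of it; all parameter values are illustrative.

```python
import random

def nasch_step(positions, velocities, road_length, v_max=5, p_slow=0.3):
    """One parallel update of the classic Nagel-Schreckenberg cellular
    automaton: accelerate, brake to the space gap, randomly slow down, move."""
    n = len(positions)
    order = sorted(range(n), key=lambda i: positions[i])
    pos = [positions[i] for i in order]
    vel = [velocities[i] for i in order]
    new_vel = []
    for i in range(n):
        gap = (pos[(i + 1) % n] - pos[i] - 1) % road_length  # empty cells ahead
        v = min(vel[i] + 1, v_max)                            # acceleration
        v = min(v, gap)                                       # safety braking
        if v > 0 and random.random() < p_slow:
            v -= 1                                            # random slowdown
        new_vel.append(v)
    new_pos = [(pos[i] + new_vel[i]) % road_length for i in range(n)]
    return new_pos, new_vel

# 30 vehicles on a circular road of 100 cells, initially at rest
random.seed(1)
pos, vel = random.sample(range(100), 30), [0] * 30
for _ in range(200):
    pos, vel = nasch_step(pos, vel, 100)
print("mean velocity after 200 steps:", sum(vel) / len(vel))
```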

  19. COMBINE archive and OMEX format : One file to share all information to reproduce a modeling project

    NARCIS (Netherlands)

    Bergmann, Frank T.; Olivier, Brett G.; Soiland-Reyes, Stian

    2014-01-01

    Background: With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models,

  20. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using commercial software STAR-CCM+, both with constant and with randomly varying injection rates. Stochastic simulations are able to capture variability in the experiments associated with the varying pump injection rate.

  1. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    Science.gov (United States)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, industry and AM users raise the question of how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper aims to investigate and evaluate the repeatability and reproducibility of the FDM process through a systematic approach in order to answer this frequently asked question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in parts printed by the FDM process. After running the simulation and analysis of the data, the FDM process capability is evaluated, which helps industry better understand the performance of FDM technology.
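
    The sketch below shows the standard ANOVA-based gage R&R calculation for a balanced crossed study (parts x operators x replicates), separating repeatability, reproducibility and part-to-part variation. The design dimensions and synthetic measurements are assumptions for illustration; the study's actual factors and data differ.

```python
import numpy as np

def gage_rr(data):
    """ANOVA-based gage R&R for a balanced crossed study.
    data: array of shape (parts, operators, replicates) of measured dimensions.
    Returns variance components; a generic sketch of the standard ANOVA method,
    not the study's exact design."""
    y = np.asarray(data, float)
    p, o, r = y.shape
    grand = y.mean()

    ss_part = o * r * ((y.mean(axis=(1, 2)) - grand) ** 2).sum()
    ss_oper = p * r * ((y.mean(axis=(0, 2)) - grand) ** 2).sum()
    ss_equip = ((y - y.mean(axis=2, keepdims=True)) ** 2).sum()   # repeatability
    ss_total = ((y - grand) ** 2).sum()
    ss_inter = ss_total - ss_part - ss_oper - ss_equip            # part x operator

    ms_part = ss_part / (p - 1)
    ms_oper = ss_oper / (o - 1)
    ms_inter = ss_inter / ((p - 1) * (o - 1))
    ms_equip = ss_equip / (p * o * (r - 1))

    var_repeat = ms_equip
    var_inter = max(0.0, (ms_inter - ms_equip) / r)
    var_oper = max(0.0, (ms_oper - ms_inter) / (p * r))
    var_part = max(0.0, (ms_part - ms_inter) / (o * r))
    var_grr = var_repeat + var_oper + var_inter
    pct_grr = 100.0 * np.sqrt(var_grr / (var_grr + var_part))
    return {"repeatability": var_repeat,
            "reproducibility": var_oper + var_inter,
            "gage_rr": var_grr, "part_to_part": var_part,
            "%GRR_of_study_variation": pct_grr}

# Synthetic example: 10 parts, 3 operators, 2 replicates
rng = np.random.default_rng(42)
parts = rng.normal(25.0, 0.10, size=(10, 1, 1))      # true part dimensions
operators = rng.normal(0.0, 0.02, size=(1, 3, 1))    # operator bias
noise = rng.normal(0.0, 0.03, size=(10, 3, 2))        # measurement noise
print(gage_rr(parts + operators + noise))
```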

  2. Can a coupled meteorology–chemistry model reproduce the ...

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  3. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. A multiple-scale validation method based on SDG (Signed Directed Graph) models and qualitative trends is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. Multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  4. Modeling with Young Students--Quantitative and Qualitative.

    Science.gov (United States)

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate the quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  5. Effective Form of Reproducing the Total Financial Potential of Ukraine

    Directory of Open Access Journals (Sweden)

    Portna Oksana V.

    2015-03-01

    Full Text Available Developing scientific principles for reproducing the total financial potential of the country, and its effective form in particular, is an urgent problem in both the theoretical and practical aspects of the study; its solution is intended to ensure the active mobilization and effective use of the total financial potential of Ukraine and, as a result, its expanded reproduction, which would contribute to realizing the internal capacity for stabilizing the national economy. The purpose of the article is to disclose the essence of the effective form of reproducing the total financial potential of the country and to analyze the results of reproducing the total financial potential of Ukraine. It has been proved that the basis for the effective form of reproducing the total financial potential of the country is the volume and flow of resources that are associated with the «real» economy and that affect and define the dynamics of GDP, i.e. the resource and process forms of reproducing the total financial potential of Ukraine (which precede the effective one). The analysis of reproducing the total financial potential of Ukraine has shown that in the analyzed period the financial possibilities of the country increased, but a steady decline of the total financial potential was observed. When considering the amount of resources involved in production and creating net value added and GDP, reproduction occurs on a restricted basis. Growth of the total financial potential of Ukraine is connected only with extensive quantitative factors rather than intensive qualitative changes.

  6. Pharmacokinetic Modelling to Predict FVIII:C Response to Desmopressin and Its Reproducibility in Nonsevere Haemophilia A Patients.

    Science.gov (United States)

    Schütte, Lisette M; van Hest, Reinier M; Stoof, Sara C M; Leebeek, Frank W G; Cnossen, Marjon H; Kruip, Marieke J H A; Mathôt, Ron A A

    2018-04-01

    Nonsevere haemophilia A (HA) patients can be treated with desmopressin. The response of factor VIII activity (FVIII:C) differs between patients and is difficult to predict. Our aims were to describe the FVIII:C response after desmopressin and its reproducibility by population pharmacokinetic (PK) modelling. Retrospective data of 128 nonsevere HA patients (age 7-75 years) receiving an intravenous or intranasal dose of desmopressin were used. PK modelling of FVIII:C was performed by nonlinear mixed effect modelling. Reproducibility of the FVIII:C response was defined as less than 25% difference in peak FVIII:C between administrations. A total of 623 FVIII:C measurements from 142 desmopressin administrations were available; 14 patients had received two administrations on different occasions. The FVIII:C time profile was best described by a two-compartment model with first-order absorption and elimination. Interindividual variability of the estimated baseline FVIII:C, central volume of distribution and clearance was 37, 43 and 50%, respectively. The most recently measured FVIII:C (FVIII-recent) was significantly associated with the FVIII:C response to desmopressin; the median FVIII:C increase was 0.47 IU/mL (interquartile range: 0.32-0.65 IU/mL, n = 142). The FVIII:C response was reproducible in 6 out of 14 patients receiving two desmopressin administrations. The FVIII:C response to desmopressin in nonsevere HA patients was adequately described by a population PK model. Large variability in the FVIII:C response was observed, which could only partially be explained by FVIII-recent. The FVIII:C response was not reproducible in a small subset of patients. Therefore, monitoring FVIII:C around surgeries or bleeding might be considered, and further research is needed. Schattauer Stuttgart.
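
    As a structural illustration of the model class named above, the sketch below simulates a two-compartment model with first-order absorption and elimination and converts the central-compartment amount into an FVIII:C-like concentration. All parameter values, the dose and the baseline are hypothetical placeholders, not the population estimates reported in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def two_compartment_oral(t, y, ka, cl, vc, q, vp):
    """Two-compartment model with first-order absorption and elimination
    (the structural model reported above). States: amounts in the depot,
    central and peripheral compartments; units are purely illustrative."""
    depot, central, peripheral = y
    d_depot = -ka * depot
    d_central = (ka * depot - (cl / vc) * central
                 - (q / vc) * central + (q / vp) * peripheral)
    d_peripheral = (q / vc) * central - (q / vp) * peripheral
    return [d_depot, d_central, d_peripheral]

# Hypothetical parameter values and dose, for illustration only; these are not
# the fitted population estimates from the study.
ka, cl, vc, q, vp = 2.0, 0.15, 3.0, 0.1, 1.0
dose, baseline = 1.0, 0.10   # arbitrary dose units; baseline FVIII:C in IU/mL

sol = solve_ivp(two_compartment_oral, (0.0, 24.0), [dose, 0.0, 0.0],
                args=(ka, cl, vc, q, vp), dense_output=True, max_step=0.1)
t = np.linspace(0.0, 24.0, 200)
fviii_c = baseline + sol.sol(t)[1] / vc   # FVIII:C = baseline + central amount / Vc
print("simulated peak FVIII:C (IU/mL):", round(float(fviii_c.max()), 3))
```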

  7. A reproducible accelerated in vitro release testing method for PLGA microspheres.

    Science.gov (United States)

    Shen, Jie; Lee, Kyulim; Choi, Stephanie; Qu, Wen; Wang, Yan; Burgess, Diane J

    2016-02-10

    The objective of the present study was to develop a discriminatory and reproducible accelerated in vitro release method for long-acting PLGA microspheres with inner structure/porosity differences. Risperidone was chosen as a model drug. Qualitatively and quantitatively equivalent PLGA microspheres with different inner structure/porosity were obtained using different manufacturing processes. Physicochemical properties as well as degradation profiles of the prepared microspheres were investigated. Furthermore, in vitro release testing of the prepared risperidone microspheres was performed using the most common in vitro release methods (i.e., sample-and-separate and flow through) for this type of product. The obtained compositionally equivalent risperidone microspheres had similar drug loading but different inner structure/porosity. When microsphere particle size appeared similar, porous risperidone microspheres showed faster microsphere degradation and drug release compared with less porous microspheres. Both in vitro release methods investigated were able to differentiate risperidone microsphere formulations with differences in porosity under real-time (37 °C) and accelerated (45 °C) testing conditions. Notably, only the accelerated USP apparatus 4 method showed good reproducibility for highly porous risperidone microspheres. These results indicated that the accelerated USP apparatus 4 method is an appropriate fast quality control tool for long-acting PLGA microspheres (even with porous structures). Copyright © 2015 Elsevier B.V. All rights reserved.

  8. From alginate impressions to digital virtual models: accuracy and reproducibility.

    Science.gov (United States)

    Dalstra, Michel; Melsen, Birte

    2009-03-01

    To compare the accuracy and reproducibility of measurements performed on digital virtual models with those taken on plaster casts from models poured immediately after the impression was taken, the 'gold standard', and from plaster models poured following a 3-5 day shipping procedure of the alginate impression. Direct comparison of two measuring techniques. The study was conducted at the Department of Orthodontics, School of Dentistry, University of Aarhus, Denmark in 2006/2007. Twelve randomly selected orthodontic graduate students with informed consent. Three sets of alginate impressions were taken from the participants within 1 hour. Plaster models were poured immediately from two of the sets, while the third set was kept in transit in the mail for 3-5 days. Upon return a plaster model was poured as well. Finally digital models were made from the plaster models. A number of measurements were performed on the plaster casts with a digital calliper and on the corresponding digital models using the virtual measuring tool of the accompanying software. Afterwards these measurements were compared statistically. No statistical differences were found between the three sets of plaster models. The intra- and inter-observer variability are smaller for the measurements performed on the digital models. Sending alginate impressions by mail does not affect the quality and accuracy of plaster casts poured from them afterwards. Virtual measurements performed on digital models display less variability than the corresponding measurements performed with a calliper on the actual models.

  9. Effect of Initial Conditions on Reproducibility of Scientific Research

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered as explanations for failure to reproduce scientific research findings, ranging from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependency on initial conditions, by which small changes can result in large differences in the research findings when reproduction is attempted at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the output of the logistic equation into a logistic map function to model the stability of the results in repeated experiments over time. We illustrate the approach by modeling the effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the rate of change in the baseline conditions varies between about 3.5 and about 4 between experiments, no research findings could be reproduced. However, when the rate of change between the experiments is ≤2.5, the results become highly predictable between experiments. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between experiments. Better control of the baseline conditions between experiments may help improve the reproducibility of scientific findings. PMID:25132705
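
    The logistic-map mechanism invoked above can be demonstrated in a few lines: for a low rate of change the map settles to a fixed point, while for a rate within the 3.5 to 4 range cited above two nearly identical starting values drift apart. The iteration count and starting values are arbitrary choices for illustration.

```python
import numpy as np

def replicate_outcomes(r, p0=0.5, n_iter=100):
    """Iterate the logistic map p_{k+1} = r * p_k * (1 - p_k), used above as a
    toy model of how a study finding evolves across repeated experiments."""
    p, trajectory = p0, []
    for _ in range(n_iter):
        p = r * p * (1 - p)
        trajectory.append(p)
    return np.array(trajectory)

# Rate of change <= 2.5: the map settles to a fixed point, so findings reproduce.
stable = replicate_outcomes(r=2.5)
# Rate of change of 3.9 (within the 3.5-4 range cited above): chaotic regime,
# so two almost identical starting values diverge and findings fail to reproduce.
chaotic_a = replicate_outcomes(r=3.9, p0=0.5)
chaotic_b = replicate_outcomes(r=3.9, p0=0.5001)

print("stable regime, last three values:", stable[-3:].round(4))
print("chaotic regime, gap after 100 steps:",
      round(abs(chaotic_a[-1] - chaotic_b[-1]), 4))
```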

  10. Accuracy, reproducibility, and time efficiency of dental measurements using different technologies.

    Science.gov (United States)

    Grünheid, Thorsten; Patel, Nishant; De Felippe, Nanci L; Wey, Andrew; Gaillard, Philippe R; Larson, Brent E

    2014-02-01

    Historically, orthodontists have taken dental measurements on plaster models. Technological advances now allow orthodontists to take these measurements on digital models. In this study, we aimed to assess the accuracy, reproducibility, and time efficiency of dental measurements taken on 3 types of digital models. emodels (GeoDigm, Falcon Heights, Minn), SureSmile models (OraMetrix, Richardson, Tex), and AnatoModels (Anatomage, San Jose, Calif) were made for 30 patients. Mesiodistal tooth-width measurements taken on these digital models were timed and compared with those on the corresponding plaster models, which were used as the gold standard. Accuracy and reproducibility were assessed using the Bland-Altman method. Differences in time efficiency were tested for statistical significance with 1-way analysis of variance. Measurements on SureSmile models were the most accurate, followed by those on emodels and AnatoModels. Measurements taken on SureSmile models were also the most reproducible. Measurements taken on SureSmile models and emodels were significantly faster than those taken on AnatoModels and plaster models. Tooth-width measurements on digital models can be as accurate as, and might be more reproducible and significantly faster than, those taken on plaster models. Of the models studied, the SureSmile models provided the best combination of accuracy, reproducibility, and time efficiency of measurement. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
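
    Since the record relies on the Bland-Altman method, the sketch below shows the basic calculation (bias and 95% limits of agreement between two measurement methods). The tooth-width values are made-up placeholders used only to show the mechanics, not data from the study.

```python
import numpy as np

def bland_altman(measure_a, measure_b):
    """Bland-Altman agreement statistics between two measurement methods:
    mean difference (bias) and 95% limits of agreement."""
    a, b = np.asarray(measure_a, float), np.asarray(measure_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical mesiodistal tooth widths (mm) measured on plaster casts and on
# the corresponding digital models
plaster = np.array([8.41, 7.02, 7.55, 9.12, 6.88, 7.93])
digital = np.array([8.39, 7.05, 7.52, 9.10, 6.91, 7.90])
bias, (low, high) = bland_altman(digital, plaster)
print(f"bias = {bias:.3f} mm, 95% limits of agreement = [{low:.3f}, {high:.3f}] mm")
```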

  11. Modeling Stop-and-Go Waves in Pedestrian Dynamics

    OpenAIRE

    Portz, Andrea; Seyfried, Armin

    2010-01-01

    Several spatially continuous pedestrian dynamics models have been validated against empirical data. We try to reproduce the experimental fundamental diagram (velocity versus density) with simulations. In addition to this quantitative criterion, we also try to reproduce stop-and-go waves as a qualitative criterion. Stop-and-go waves are a characteristic phenomenon of single-file movement. Only one of the three investigated models satisfies both criteria.

  12. Determining the optimal number of independent components for reproducible transcriptomic data analysis.

    Science.gov (United States)

    Kairov, Ulykbek; Cantini, Laura; Greco, Alessandro; Molkenov, Askhat; Czerwinska, Urszula; Barillot, Emmanuel; Zinovyev, Andrei

    2017-09-11

    Independent Component Analysis (ICA) is a method that models gene expression data as an action of a set of statistically independent hidden factors. The output of ICA depends on a fundamental parameter: the number of components (factors) to compute. The optimal choice of this parameter, related to determining the effective data dimension, remains an open question in the application of blind source separation techniques to transcriptomic data. Here we address the question of optimizing the number of statistically independent components in the analysis of transcriptomic data for reproducibility of the components in multiple runs of ICA (within the same or within varying effective dimensions) and in multiple independent datasets. To this end, we introduce ranking of independent components based on their stability in multiple ICA computation runs and define a distinguished number of components (Most Stable Transcriptome Dimension, MSTD) corresponding to the point of the qualitative change of the stability profile. Based on a large body of data, we demonstrate that a sufficient number of dimensions is required for biological interpretability of the ICA decomposition and that the most stable components with ranks below MSTD have more chances to be reproduced in independent studies compared to the less stable ones. At the same time, we show that a transcriptomics dataset can be reduced to a relatively high number of dimensions without losing the interpretability of ICA, even though higher dimensions give rise to components driven by small gene sets. We suggest a protocol of ICA application to transcriptomics data with a possibility of prioritizing components with respect to their reproducibility that strengthens the biological interpretation. Computing too few components (much less than MSTD) is not optimal for interpretability of the results. The components ranked within MSTD range have more chances to be reproduced in independent studies.
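
    The stability-based ranking described above can be prototyped by running FastICA several times from different random starts and scoring each component by how well it reappears in the other runs. This simplified correlation-based score, the toy data and the number of runs are assumptions; the published MSTD procedure uses a more elaborate stability analysis.

```python
import numpy as np
from sklearn.decomposition import FastICA

def component_stability(X, n_components, n_runs=10, seed=0):
    """Run FastICA several times from different random starts and score each
    component of the first run by its best absolute correlation with the
    components of every other run (averaged over runs). A score near 1 marks
    a component that is reproducible across runs."""
    rng = np.random.default_rng(seed)
    runs = []
    for _ in range(n_runs):
        ica = FastICA(n_components=n_components, max_iter=1000,
                      random_state=int(rng.integers(10**6)))
        runs.append(ica.fit_transform(X))     # shape: samples x components
    reference = runs[0]
    scores = []
    for i in range(n_components):
        best_matches = []
        for other in runs[1:]:
            corrs = [abs(np.corrcoef(reference[:, i], other[:, j])[0, 1])
                     for j in range(n_components)]
            best_matches.append(max(corrs))
        scores.append(float(np.mean(best_matches)))
    return scores

# Toy "expression matrix": 200 samples x 50 genes with one planted signal
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))
X += np.outer(np.sin(np.linspace(0, 6, 200)), rng.normal(size=50))
print([round(s, 2) for s in component_stability(X, n_components=5)])
```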

  13. Learning about Ecological Systems by Constructing Qualitative Models with DynaLearn

    Science.gov (United States)

    Leiba, Moshe; Zuzovsky, Ruth; Mioduser, David; Benayahu, Yehuda; Nachmias, Rafi

    2012-01-01

    A qualitative model of a system is an abstraction that captures ordinal knowledge and predicts the set of qualitatively possible behaviours of the system, given a qualitative description of its structure and initial state. This paper examines an innovative approach to science education using an interactive learning environment that supports…

  14. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Science.gov (United States)

    Sanchez-Gomez, Emilia; Somot, S.; Déqué, M.

    2009-10-01

    One of the main concerns in regional climate modeling is to which extent limited-area regional climate models (RCM) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the time period 1961-2000. Two sets of experiments have been completed with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the weather regimes behavior in terms of composite pattern, mean frequency of occurrence and persistence reasonably well. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models which is stronger in summer than in winter. This spread is due to two reasons: (1) we are dealing with different models and (2) each RCM produces an internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At daily time scale, the model spread has also a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large-scale of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not affect significantly the model performance for large-scale circulation.

  15. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Energy Technology Data Exchange (ETDEWEB)

    Somot, S.; Deque, M. [Meteo-France CNRM/GMGEC CNRS/GAME, Toulouse (France); Sanchez-Gomez, Emilia

    2009-10-15

    One of the main concerns in regional climate modeling is to which extent limited-area regional climate models (RCM) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the time period 1961-2000. Two sets of experiments have been completed with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the weather regimes behavior in terms of composite pattern, mean frequency of occurrence and persistence reasonably well. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models which is stronger in summer than in winter. This spread is due to two reasons: (1) we are dealing with different models and (2) each RCM produces an internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At daily time scale, the model spread has also a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large-scale of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not affect significantly the model performance for large-scale circulation. (orig.)

  16. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination, or the reproducibility of results is very poor. On the other hand, a defect can be detected on each of several subsequent examinations, giving high reliability, while the reproducibility of the results remains poor.

  17. QML-AiNet: An immune network approach to learning qualitative differential equation models.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2015-02-01

    In this paper, we explore the application of Opt-AiNet, an immune network approach for search and optimisation problems, to learning qualitative models in the form of qualitative differential equations. The Opt-AiNet algorithm is adapted to qualitative model learning problems, resulting in the proposed system QML-AiNet. The potential of QML-AiNet to address the scalability and multimodal search space issues of qualitative model learning has been investigated. More importantly, to further improve the efficiency of QML-AiNet, we also modify the mutation operator according to the features of discrete qualitative model space. Experimental results show that the performance of QML-AiNet is comparable to QML-CLONALG, a QML system using the clonal selection algorithm (CLONALG). More importantly, QML-AiNet with the modified mutation operator can significantly improve the scalability of QML and is much more efficient than QML-CLONALG.
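
    To make the idea of a mutation operator over a discrete qualitative model space concrete, the sketch below encodes a candidate model as a tuple of indices into finite sets of admissible constraints and resamples a fitness-dependent fraction of them. The slot contents, the fitness scaling and the encoding are illustrative assumptions, not the actual QML-AiNet representation.

```python
import random

# A candidate qualitative model is encoded as a tuple of indices, one per model
# "slot", each selecting one constraint from a finite set of admissible choices.
# The slot contents below are illustrative placeholders.
CONSTRAINT_CHOICES = [
    ["add(x, y, z)", "mult(x, y, z)", "minus(x, z)"],
    ["deriv(x, dx)", "deriv(y, dy)"],
    ["M+(x, y)", "M-(x, y)", "M+(y, z)", "M-(y, z)"],
]

def mutate(candidate, fitness, p_base=0.5):
    """Discrete mutation: resample a random subset of slots from their
    admissible choices. Better candidates (normalised fitness closer to 1) are
    perturbed less, mimicking fitness-proportional mutation adapted to a
    discrete qualitative model space."""
    rate = p_base * (1.0 - fitness)
    mutant = list(candidate)
    for slot, choices in enumerate(CONSTRAINT_CHOICES):
        if random.random() < rate:
            mutant[slot] = random.randrange(len(choices))
    return tuple(mutant)

random.seed(3)
parent = (0, 1, 2)
print([mutate(parent, fitness=0.8) for _ in range(3)])
```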

  18. Modeling arson - An exercise in qualitative model building

    Science.gov (United States)

    Heineke, J. M.

    1975-01-01

    A detailed example is given of the role of von Neumann and Morgenstern's 1944 'expected utility theorem' (in the theory of games and economic behavior) in qualitative model building. Specifically, an arsonist's decision as to the amount of time to allocate to arson and related activities is modeled, and the responsiveness of this time allocation to changes in various policy parameters is examined. Both the activity modeled and the method of presentation are intended to provide an introduction to the scope and power of the expected utility theorem in modeling situations of 'choice under uncertainty'. The robustness of such a model is shown to vary inversely with the number of preference restrictions used in the analysis. The fewer the restrictions, the wider is the class of agents to which the model is applicable, and accordingly more confidence is put in the derived results. A methodological discussion on modeling human behavior is included.

  19. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  20. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  1. GeoTrust Hub: A Platform For Sharing And Reproducing Geoscience Applications

    Science.gov (United States)

    Malik, T.; Tarboton, D. G.; Goodall, J. L.; Choi, E.; Bhatt, A.; Peckham, S. D.; Foster, I.; Ton That, D. H.; Essawy, B.; Yuan, Z.; Dash, P. K.; Fils, G.; Gan, T.; Fadugba, O. I.; Saxena, A.; Valentic, T. A.

    2017-12-01

    Recent requirements of scholarly communication emphasize the reproducibility of scientific claims. Text-based research papers are considered poor media to establish reproducibility. Papers must be accompanied by "research objects", aggregations of digital artifacts that together with the paper provide an authoritative record of a piece of research. We will present GeoTrust Hub (http://geotrusthub.org), a platform for creating, sharing, and reproducing reusable research objects. GeoTrust Hub provides tools for scientists to create `geounits'--reusable research objects. Geounits are self-contained, annotated, and versioned containers that describe and package computational experiments in an efficient and light-weight manner. Geounits can be shared on public repositories such as HydroShare and FigShare, and, using their respective APIs, reproduced on provisioned clouds. The latter feature enables science applications to have a lifetime beyond sharing, wherein they can be independently verified and trust established as they are repeatedly reused. Through research use cases from several geoscience laboratories across the United States, we will demonstrate how tools provided by GeoTrust Hub, along with HydroShare as its public repository for geounits, are advancing the state of reproducible research in the geosciences. For each use case, we will address different computational reproducibility requirements. Our first use case will be an example of setup reproducibility, which enables a scientist to set up and reproduce an output from a model with complex configuration and development environments. Our second use case will be an example of algorithm/data reproducibility, wherein a shared data science model/dataset can be substituted with an alternate one to verify model output results, and finally an example of interactive reproducibility, in which an experiment is dependent on specific versions of data to produce the result. Toward this we will use software and data

  2. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    Directory of Open Access Journals (Sweden)

    Marit Kramski

    Full Text Available The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bio-terroristic potential. Moreover, zoonotic infections with Cowpox (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after onset of symptoms, even when very low infectious viral doses of 5×10² pfu were applied intranasally. Infectious virus was demonstrated in blood, saliva and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation mimicking the natural route of smallpox infection led to reproducible infection. In vivo titration resulted in an MID50 (minimal monkey infectious dose 50%) of 8.3×10² pfu of calpox virus, which is approximately 10,000-fold lower than MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs. Furthermore, this model can help study mechanisms of OPV pathogenesis.

  3. Reproducibility of somatosensory spatial perceptual maps.

    Science.gov (United States)

    Steenbergen, Peter; Buitenweg, Jan R; Trojan, Jörg; Veltink, Peter H

    2013-02-01

    Various studies have shown subjects to mislocalize cutaneous stimuli in an idiosyncratic manner. Spatial properties of individual localization behavior can be represented in the form of perceptual maps. Individual differences in these maps may reflect properties of internal body representations, and perceptual maps may therefore be a useful method for studying these representations. For this to be the case, individual perceptual maps need to be reproducible, which has not yet been demonstrated. We assessed the reproducibility of localizations measured twice on subsequent days. Ten subjects participated in the experiments. Non-painful electrocutaneous stimuli were applied at seven sites on the lower arm. Subjects localized the stimuli on a photograph of their own arm, which was presented on a tablet screen overlaying the real arm. Reproducibility was assessed by calculating intraclass correlation coefficients (ICC) for the mean localizations of each electrode site and the slope and offset of regression models of the localizations, which represent scaling and displacement of perceptual maps relative to the stimulated sites. The ICCs of the mean localizations ranged from 0.68 to 0.93; the ICCs of the regression parameters were 0.88 for the intercept and 0.92 for the slope. These results indicate a high degree of reproducibility. We conclude that localization patterns of non-painful electrocutaneous stimuli on the arm are reproducible on subsequent days. Reproducibility is a necessary property of perceptual maps for these to reflect properties of a subject's internal body representations. Perceptual maps are therefore a promising method for studying body representations.

  4. Configurational Model for Conductivity of Stabilized Fluorite Structure Oxides

    DEFF Research Database (Denmark)

    Poulsen, Finn Willy

    1981-01-01

    The formalism developed here furnishes means by which ionic configurations, solid solution limits, and conductivity mechanisms in doped fluorite structures can be described. The present model differs markedly from previous models but qualitatively reproduces reality. The analysis reported...

  5. The Systems Biology Markup Language (SBML) Level 3 Package: Qualitative Models, Version 1, Release 1.

    Science.gov (United States)

    Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš

    2015-09-04

    Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.
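
    As a flavour of the logical (Boolean) models the qual package targets, the sketch below hand-codes a tiny regulatory network with synchronous updates and finds the attractor it reaches. The species names and rules are invented for illustration; a real model would be serialized to and from SBML-qual with a dedicated library rather than written like this.

```python
# A tiny Boolean (logical) regulatory network of the kind SBML qual encodes.
RULES = {
    "SignalIn": lambda s: s["SignalIn"],               # constant input
    "GeneA": lambda s: s["SignalIn"],                  # activated by the input
    "GeneB": lambda s: s["GeneA"] and not s["GeneC"],  # A activates, C represses
    "GeneC": lambda s: s["GeneB"],                     # B activates C
}

def update(state):
    """Synchronous update: every species recomputes its level at once."""
    return {name: int(rule(state)) for name, rule in RULES.items()}

def attractor(state, max_steps=50):
    """Iterate until a state repeats and return the cycle that is reached."""
    seen = []
    for _ in range(max_steps):
        if state in seen:
            return seen[seen.index(state):]
        seen.append(state)
        state = update(state)
    return []

start = {"SignalIn": 1, "GeneA": 0, "GeneB": 0, "GeneC": 0}
for s in attractor(start):   # the negative feedback loop yields an oscillation
    print(s)
```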

  6. Modelling soil erosion at European scale: towards harmonization and reproducibility

    Science.gov (United States)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite the efforts, the prediction value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is here proposed. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
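
    The core of the (R)USLE approach referenced above is a simple multiplicative relation, A = R * K * LS * C * P; the sketch below spells it out. The factor values are arbitrary placeholders used only to show the structure of the computation, not estimates produced by the study.

```python
def rusle_soil_loss(r, k, ls, c, p):
    """(R)USLE mean annual soil loss, A = R * K * LS * C * P (t/ha/yr)."""
    return r * k * ls * c * p

# Placeholder factor values:
# R - rainfall erosivity, K - soil erodibility, LS - slope length/steepness,
# C - cover management, P - support practice.
print(rusle_soil_loss(r=800, k=0.03, ls=1.2, c=0.15, p=1.0), "t/ha/yr")
```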

  7. Reproducing tailing in breakthrough curves: Are statistical models equally representative and predictive?

    Science.gov (United States)

    Pedretti, Daniele; Bianchi, Marco

    2018-03-01

    Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models for both the empirical fitting of these curves, and the prediction of transport using upscaling models based on best-fitted estimated parameters (e.g. the power law slope or exponent). The predictive capacity of power law based upscaling models can be however questioned due to the difficulties to link model parameters with the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implication of statistical subsampling, which often renders power laws undistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulties to reconcile fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (1.5 tailing becomes heavier. Strong fluctuations occur when the number of samples is limited, due to the effects of subsampling. On the other hand, when the power law model embeds a cutoff (PLCO), the best-fitted exponent (αCO) is insensitive to the degree of tailing and to the effects of subsampling and tends to a constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated to the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple
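
    The contrast between the two tail models discussed above can be reproduced with a small fitting exercise: a pure power law C(t) = a*t^(-alpha) and a power law with a late-time exponential cutoff C(t) = a*t^(-alpha)*exp(-lambda*t) are both fitted to a synthetic, noisy tail generated from the tempered model. The data, parameter values and fitting choices are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(t, a, alpha):
    """Pure power law tail, C(t) = a * t**(-alpha)."""
    return a * t ** (-alpha)

def power_law_cutoff(t, a, alpha, lam):
    """Power law with a late-time exponential cutoff, a * t**(-alpha) * exp(-lam*t)."""
    return a * t ** (-alpha) * np.exp(-lam * t)

# Synthetic late-time tail generated from the tempered (cutoff) model plus noise
rng = np.random.default_rng(0)
t = np.logspace(0.5, 3, 60)
conc = power_law_cutoff(t, a=1.0, alpha=1.2, lam=2e-3)
conc *= rng.lognormal(0.0, 0.05, size=t.size)

p_pl, _ = curve_fit(power_law, t, conc, p0=[1.0, 1.5], bounds=(0, np.inf))
p_co, _ = curve_fit(power_law_cutoff, t, conc, p0=[1.0, 1.5, 1e-3], bounds=(0, np.inf))
print("PL fit:   alpha = %.2f" % p_pl[1])
print("PLCO fit: alpha = %.2f, lambda = %.4f" % (p_co[1], p_co[2]))
```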

  8. A reproducible brain tumour model established from human glioblastoma biopsies

    International Nuclear Information System (INIS)

    Wang, Jian; Chekenya, Martha; Bjerkvig, Rolf; Enger, Per Ø; Miletic, Hrvoje; Sakariassen, Per Ø; Huszthy, Peter C; Jacobsen, Hege; Brekkå, Narve; Li, Xingang; Zhao, Peng; Mørk, Sverre

    2009-01-01

    Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. In this work, we collected data on growth kinetics from a material of 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. The tumour take rate for xenografted GBM biopsies were 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth, several months prior to the onset of symptoms. In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression

  9. A reproducible brain tumour model established from human glioblastoma biopsies

    Directory of Open Access Journals (Sweden)

    Li Xingang

    2009-12-01

    Full Text Available Abstract Background Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. Methods In this work, we collected data on growth kinetics from a material of 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. Results The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth, several months prior to the onset of symptoms. Conclusions In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  10. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  11. Graphical means for inspecting qualitative models of system behaviour

    NARCIS (Netherlands)

    Bouwer, A.; Bredeweg, B.

    2010-01-01

    This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are

  12. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise reproducible computation is possible, whether computational research in DOE improves its publication process, and whether reproducible results can be achieved apart from the peer review process.

  13. Qualitative mechanism models and the rationalization of procedures

    Science.gov (United States)

    Farley, Arthur M.

    1989-01-01

    A qualitative, cluster-based approach to the representation of hydraulic systems is described and its potential for generating and explaining procedures is demonstrated. Many ideas are formalized and implemented as part of an interactive, computer-based system. The system allows for designing, displaying, and reasoning about hydraulic systems. The interactive system has an interface consisting of three windows: a design/control window, a cluster window, and a diagnosis/plan window. A qualitative mechanism model for the ORS (Orbital Refueling System) is presented to coordinate with ongoing research on this system being conducted at NASA Ames Research Center.

  14. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods built on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  15. A qualitative evaluation approach for energy system modelling frameworks

    DEFF Research Database (Denmark)

    Wiese, Frauke; Hilpert, Simon; Kaldemeyer, Cord

    2018-01-01

    properties define how useful it is in regard to the existing challenges. For energy system models, evaluation methods exist, but we argue that many decisions upon properties are rather made on the model generator or framework level. Thus, this paper presents a qualitative approach to evaluate frameworks...

  16. A qualitative model construction method of nuclear power plants for effective diagnostic knowledge generation

    International Nuclear Information System (INIS)

    Yoshikawa, Shinji; Endou, Akira; Kitamura, Yoshinobu; Sasajima, Munehiko; Ikeda, Mitsuru; Mizoguchi, Riichiro.

    1994-01-01

    This paper discusses a method to construct a qualitative model of a nuclear power plant in order to generate effective diagnostic knowledge. The proposed method is to prepare deep knowledge to be provided to a knowledge compiler based upon qualitative reasoning (QR). The necessity of knowledge compilation for nuclear plant diagnosis is explained first; the problems conventionally experienced in qualitative reasoning and a proposed method to overcome them are presented next; then a sample procedure for building a qualitative nuclear plant model is demonstrated. (author)

  17. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    Science.gov (United States)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  18. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    Science.gov (United States)

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
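
    Because an OMEX file is just a ZIP container with a manifest, a minimal archive can be assembled with standard tooling, as in the Python sketch below. The manifest namespace and format URIs are written from memory of the COMBINE specification and should be verified against the official documents; in practice a dedicated library such as libCombine would be used.

        # Minimal sketch of packing a COMBINE/OMEX-style archive: a ZIP file
        # holding a manifest plus the model and simulation description.  The
        # format URIs below are assumptions to be checked against the spec.
        import zipfile

        MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
        <omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
          <content location="." format="http://identifiers.org/combine.specifications/omex"/>
          <content location="./model.xml" format="http://identifiers.org/combine.specifications/sbml"/>
          <content location="./simulation.sedml" format="http://identifiers.org/combine.specifications/sed-ml"/>
        </omexManifest>
        """

        with zipfile.ZipFile("experiment.omex", "w", zipfile.ZIP_DEFLATED) as omex:
            omex.writestr("manifest.xml", MANIFEST)
            omex.writestr("model.xml", "<sbml><!-- model definition goes here --></sbml>")
            omex.writestr("simulation.sedml", "<sedML><!-- simulation setup goes here --></sedML>")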

  19. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    Full Text Available On the way to autonomous service robots, spatial reasoning plays a main role since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications including representation, reasoning and a practical application.
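
    The record gives no equations, but the core idea of interval-based qualitative kinematics can be illustrated with a small sketch: finite differences over uncertain position samples yield velocity and acceleration intervals whose signs provide the qualitative labels. The data and uncertainty below are hypothetical, and this is an illustrative reconstruction rather than the authors' model.

        # Illustrative sketch (not the authors' model): derive a qualitative
        # acceleration label from position samples carrying interval uncertainty.
        def minus(a, b):
            """Interval subtraction a - b for intervals given as (lo, hi)."""
            return (a[0] - b[1], a[1] - b[0])

        def sign(interval):
            lo, hi = interval
            if lo > 0:
                return "+"
            if hi < 0:
                return "-"
            return "0/ambiguous"

        eps = 0.05                                 # hypothetical measurement uncertainty (m)
        positions = [0.0, 1.0, 2.4, 4.2]           # hypothetical person positions (m), dt = 1 s
        pos_iv = [(p - eps, p + eps) for p in positions]

        vel_iv = [minus(b, a) for a, b in zip(pos_iv, pos_iv[1:])]   # division by dt=1 omitted
        acc_iv = [minus(b, a) for a, b in zip(vel_iv, vel_iv[1:])]

        print("qualitative velocity:    ", [sign(v) for v in vel_iv])
        print("qualitative acceleration:", [sign(a) for a in acc_iv])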

  20. Improving the Pattern Reproducibility of Multiple-Point-Based Prior Models Using Frequency Matching

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2014-01-01

    Some multiple-point-based sampling algorithms, such as the snesim algorithm, rely on sequential simulation. The conditional probability distributions that are used for the simulation are based on statistics of multiple-point data events obtained from a training image. During the simulation, data...... events with zero probability in the training image statistics may occur. This is handled by pruning the set of conditioning data until an event with non-zero probability is found. The resulting probability distribution sampled by such algorithms is a pruned mixture model. The pruning strategy leads...... to a probability distribution that lacks some of the information provided by the multiple-point statistics from the training image, which reduces the reproducibility of the training image patterns in the outcome realizations. When pruned mixture models are used as prior models for inverse problems, local re...

  1. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental study in wet laboratory. In this way, natural biochemical systems can be better understood.

  2. Reproducing the nonlinear dynamic behavior of a structured beam with a generalized continuum model

    Science.gov (United States)

    Vila, J.; Fernández-Sáez, J.; Zaera, R.

    2018-04-01

    In this paper we study the coupled axial-transverse nonlinear vibrations of a class of one-dimensional structured solids by application of the so-called Inertia Gradient Nonlinear continuum model. To show the accuracy of this axiomatic model, previously proposed by the authors, its predictions are compared with numerical results from a previously defined finite discrete chain of lumped masses and springs, for several numbers of particles. A continualization of the discrete model equations based on Taylor series allowed us to set equivalent values of the mechanical properties in both the discrete and the axiomatic continuum models. Contrary to the classical continuum model, the inertia gradient nonlinear continuum model used herein is able to capture scale effects, which arise for modes in which the wavelength is comparable to the characteristic distance of the structured solid. The main conclusion of the work is that the proposed generalized continuum model captures the scale effects in both linear and nonlinear regimes, reproducing the behavior of the 1D nonlinear discrete model adequately.

  3. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, and evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produces published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which are relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  4. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...

  5. Global qualitative analysis of a quartic ecological model

    NARCIS (Netherlands)

    Broer, Hendrik; Gaiko, Valery A.

    2010-01-01

    in this paper we complete the global qualitative analysis of a quartic ecological model. In particular, studying global bifurcations of singular points and limit cycles, we prove that the corresponding dynamical system has at most two limit cycles. (C) 2009 Elsevier Ltd. All rights reserved.

  6. The spruce budworm and forest: a qualitative comparison of ODE and Boolean models

    Directory of Open Access Journals (Sweden)

    Raina Robeva

    2016-01-01

    Full Text Available Boolean and polynomial models of biological systems have emerged recently as viable companions to differential equations models. It is not immediately clear however whether such models are capable of capturing the multi-stable behaviour of certain biological systems: this behaviour is often sensitive to changes in the values of the model parameters, while Boolean and polynomial models are qualitative in nature. In the past few years, Boolean models of gene regulatory systems have been shown to capture multi-stability at the molecular level, confirming that such models can be used to obtain information about the system’s qualitative dynamics when precise information regarding its parameters may not be available. In this paper, we examine Boolean approximations of a classical ODE model of budworm outbreaks in a forest and show that these models exhibit a qualitative behaviour consistent with that derived from the ODE models. In particular, we demonstrate that these models can capture the bistable nature of insect population outbreaks, thus showing that Boolean models can be successfully utilized beyond the molecular level.
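
    The bistability referred to above is easy to reproduce from the ODE side. The sketch below integrates the classical dimensionless Ludwig-Jones-Holling budworm equation for two initial densities that straddle the unstable equilibrium; the parameter values are illustrative choices inside the bistable regime, not values taken from the paper.

        # Bistability in the classical (dimensionless) spruce budworm ODE:
        #   du/dt = r*u*(1 - u/q) - u**2 / (1 + u**2)
        # Parameters are illustrative picks inside the bistable regime.
        import numpy as np
        from scipy.integrate import solve_ivp

        r, q = 0.4, 10.0

        def budworm(t, u):
            return r * u * (1.0 - u / q) - u**2 / (1.0 + u**2)

        t_span, t_eval = (0.0, 200.0), np.linspace(0.0, 200.0, 400)
        low = solve_ivp(budworm, t_span, [1.0], t_eval=t_eval)   # starts below the unstable threshold
        high = solve_ivp(budworm, t_span, [5.0], t_eval=t_eval)  # starts above the unstable threshold

        print(f"u0 = 1.0 settles near {low.y[0, -1]:.2f}  (endemic/refuge state)")
        print(f"u0 = 5.0 settles near {high.y[0, -1]:.2f}  (outbreak state)")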

  7. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations; 50°C for 1, 2, 5 and 10 minutes and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1, 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  8. Size Control of Sessile Microbubbles for Reproducibly Driven Acoustic Streaming

    Science.gov (United States)

    Volk, Andreas; Kähler, Christian J.

    2018-05-01

    Acoustically actuated bubbles are receiving growing interest in microfluidic applications, as they induce a streaming field that can be used for particle sorting and fluid mixing. An essential but often unspoken challenge in such applications is to maintain a constant bubble size to achieve reproducible conditions. We present an automatized system for the size control of a cylindrical bubble that is formed at a blind side pit of a polydimethylsiloxane microchannel. Using a pressure control system, we adapt the protrusion depth of the bubble into the microchannel to a precision of approximately 0.5 μ m on a timescale of seconds. By comparing the streaming field generated by bubbles of width 80 μ m with a protrusion depth between -12 and 60 μ m , we find that the mean velocity of the induced streaming fields varies by more than a factor of 4. We also find a qualitative change of the topology of the streaming field. Both observations confirm the importance of the bubble size control system in order to achieve reproducible and reliable bubble-driven streaming experiments.

  9. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.
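
    The sketch below does not use the actual magni.reproducibility API (its calls are not reproduced here); it only mirrors the general pattern the record describes, namely storing run metadata next to the result of a small Mandelbrot computation.

        # Generic illustration of storing run metadata alongside a result
        # (this does NOT use the magni.reproducibility API; it only mirrors the idea).
        import json
        import platform
        import sys
        from datetime import datetime, timezone

        import numpy as np

        def mandelbrot(width=200, height=200, max_iter=50):
            """Escape-time iteration counts on a small grid of the complex plane."""
            x = np.linspace(-2.0, 1.0, width)
            y = np.linspace(-1.5, 1.5, height)
            c = x[None, :] + 1j * y[:, None]
            z = np.zeros_like(c)
            counts = np.zeros(c.shape, dtype=int)
            for _ in range(max_iter):
                mask = np.abs(z) <= 2.0
                z[mask] = z[mask] ** 2 + c[mask]
                counts += mask
            return counts

        params = {"width": 200, "height": 200, "max_iter": 50}
        result = mandelbrot(**params)

        np.save("mandelbrot_result.npy", result)
        with open("mandelbrot_metadata.json", "w") as fh:
            json.dump(
                {
                    "parameters": params,
                    "timestamp": datetime.now(timezone.utc).isoformat(),
                    "python": sys.version,
                    "platform": platform.platform(),
                    "numpy": np.__version__,
                },
                fh,
                indent=2,
            )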

  10. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes......, as well as overall preference, was based on consistency tests of binary paired-comparison judgments and on modeling the choice frequencies using probabilistic choice models. As a result, the preferences of non-expert listeners could be measured reliably at a ratio scale level. Principal components derived...

  11. Accelerating transition dynamics in city regions: A qualitative modeling perspective

    NARCIS (Netherlands)

    P.J. Valkering (Pieter); Yücel, G. (Gönenç); Gebetsroither-Geringer, E. (Ernst); Markvica, K. (Karin); Meynaerts, E. (Erika); N. Frantzeskaki (Niki)

    2017-01-01

    textabstractIn this article, we take stock of the findings from conceptual and empirical work on the role of transition initiatives for accelerating transitions as input for modeling acceleration dynamics. We applied the qualitative modeling approach of causal loop diagrams to capture the dynamics

  12. A qualitative reasoning model of algal bloom in the Danube Delta Biosphere Reserve (DDBR)

    NARCIS (Netherlands)

    Cioaca, E.; Linnebank, F.E.; Bredeweg, B.; Salles, P.

    2009-01-01

    This paper presents a Qualitative Reasoning model of the algal bloom phenomenon and its effects in the Danube Delta Biosphere Reserve (DDBR) in Romania. Qualitative Reasoning models represent processes and their cause-effect relationships in a flexible and conceptually rich manner and as such can be

  13. Demography-based adaptive network model reproduces the spatial organization of human linguistic groups

    Science.gov (United States)

    Capitán, José A.; Manrubia, Susanna

    2015-12-01

    The distribution of human linguistic groups presents a number of interesting and nontrivial patterns. The distributions of the number of speakers per language and the area each group covers follow log-normal distributions, while population and area fulfill an allometric relationship. The topology of networks of spatial contacts between different linguistic groups has been recently characterized, showing atypical properties of the degree distribution and clustering, among others. Human demography, spatial conflicts, and the construction of networks of contacts between linguistic groups are mutually dependent processes. Here we introduce an adaptive network model that takes all of them into account and successfully reproduces, using only four model parameters, not only those features of linguistic groups already described in the literature, but also correlations between demographic and topological properties uncovered in this work. Besides their relevance when modeling and understanding processes related to human biogeography, our adaptive network model admits a number of generalizations that broaden its scope and make it suitable to represent interactions between agents based on population dynamics and competition for space.

  14. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models

  15. Recruiting Transcultural Qualitative Research Participants: A Conceptual Model

    Directory of Open Access Journals (Sweden)

    Phyllis Eide

    2005-06-01

    Full Text Available Working with diverse populations poses many challenges to the qualitative researcher who is a member of the dominant culture. Traditional methods of recruitment and selection (such as flyers and advertisements) are often unproductive, leading to missed contributions from potential participants who were not recruited and researcher frustration. In this article, the authors explore recruitment issues related to the concept of personal knowing based on experiences with Aboriginal Hawai'ian and Micronesian populations, wherein knowing and being known are crucial to successful recruitment of participants. They present a conceptual model that incorporates key concepts of knowing the other, cultural context, and trust to guide other qualitative transcultural researchers. They also describe challenges, implications, and concrete suggestions for recruitment of participants.

  16. The Use of Modelling for Theory Building in Qualitative Analysis

    Science.gov (United States)

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  17. Double Regge model for non diffractive A1 production

    International Nuclear Information System (INIS)

    Anjos, J.C.; Endler, A.; Santoro, A.; Simao, F.R.A.

    1977-07-01

    A Reggeized double-nucleon-exchange model is shown to be able to reproduce qualitatively the non-diffractive A1 production recently observed in the reaction K⁻p → Σ⁻π⁺π⁻π⁺ at 4.15 GeV/c

  18. Empirical questions for collective-behaviour modelling

    Indian Academy of Sciences (India)

    The collective behaviour of groups of social animals has been an active topic of study ... Models have been successful at reproducing qualitative features of ... quantitative and detailed empirical results for a range of animal systems. ... standard method [23], the redundant information recorded by the cameras can be used to.

  19. Reproducibility analysis of measurements with a mechanical semiautomatic eye model for evaluation of intraocular lenses

    Science.gov (United States)

    Rank, Elisabet; Traxler, Lukas; Bayer, Natascha; Reutterer, Bernd; Lux, Kirsten; Drauschke, Andreas

    2014-03-01

    Mechanical eye models are used to validate ex vivo the optical quality of intraocular lenses (IOLs). The quality measurement and test instructions for IOLs are defined in ISO 11979-2. However, it has been noted in the literature that these test instructions can lead to inaccurate measurements for some modern IOL designs. The reproducibility of the alignment and measurement processes is presented, performed with a semiautomatic mechanical ex vivo eye model at scale 1:1 based on the optical properties published by Liou and Brennan. The cornea, the iris aperture and the IOL itself are separately changeable within the eye model. The adjustment of the IOL can be manipulated by automatic decentration and tilt of the IOL with reference to the optical axis of the whole system, which is defined by the line connecting the central point of the artificial cornea and the iris aperture. With the presented measurement setup two quality criteria are measurable: the modulation transfer function (MTF) and the Strehl ratio. First, the reproducibility of the alignment process used to define the initial lateral position and tilt with reference to the optical axis of the system is investigated. Furthermore, different IOL holders are tested with regard to stable holding of the IOL. The measurement is performed as a before-after comparison of the lens position using a typical decentration and tilt tolerance analysis path. The modulation transfer function (MTF) and Strehl ratio (S) before and after this tolerance analysis are compared, and requirements for lens holder construction are deduced from the presented results.

  20. AI/OR computational model for integrating qualitative and quantitative design methods

    Science.gov (United States)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  1. Reproducibility of temporomandibular joint tomography. Influence of shifted X-ray beam and tomographic focal plane on reproducibility

    International Nuclear Information System (INIS)

    Saito, Masashi

    1999-01-01

    Proper tomographic focal plane and x-ray beam direction are the most important factors in obtaining accurate images of the temporomandibular joint (TMJ). In this study, to clarify the magnitude of the effect of these two factors on image quality, we evaluated the reproducibility of tomograms by measuring the distortion when the x-ray beam was shifted from the correct center of the object. The effect of deviation of the tomographic focal plane on image quality was evaluated by the modulation transfer function (MTF). Two types of tomograms, a plane type and a rotational type, were used in this study. A TMJ model was made from Teflon for the evaluation with a shifted x-ray beam. The x-ray images were obtained by tilting the model from 0 to 10 degrees in 2-degree increments. These x-ray images were processed by computer image analysis, and the distance between the condyle and the joint space was then measured. To evaluate the influence of a shifted tomographic focal plane on image sharpness, the x-ray images from each setting were analyzed by MTF. To obtain the MTF, a knife-edge made from Pb was used. The images were scanned with a microdensitometer at the central focal plane and at 0, 0.5 and 1 mm away. The density curves were analyzed by Fourier analysis and the MTF was calculated. The reproducibility of the images worsened when the x-ray beam was shifted; this tendency was similar for both tomogram types. Object characteristics such as the anterior and posterior portions of the joint space affected the deterioration of reproducibility. Deviation of the tomographic focal plane also decreased the reproducibility of the x-ray images. The rotational type showed a better MTF, but it deteriorated markedly with slight changes of the tomographic focal plane. In contrast, the plane type showed a lower MTF, but the image was stable against shifts of the tomographic focal plane. (author)
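
    The knife-edge procedure described above (edge profile, derivative, Fourier transform) is straightforward to express in code. The sketch below computes an MTF from a synthetic, Gaussian-blurred edge profile; the profile, blur and sampling step are invented for illustration and do not correspond to the paper's measurements.

        # Knife-edge MTF sketch: edge spread function -> line spread function -> |FFT|.
        # The edge profile is synthetic (Gaussian-blurred step), purely illustrative.
        import numpy as np
        from scipy.special import erf

        dx = 0.01                                   # sampling step along the scan (mm)
        x = np.arange(-5.0, 5.0, dx)
        blur_sigma = 0.15                           # hypothetical system blur (mm)

        # Edge spread function (ESF): a step edge blurred by the imaging system.
        esf = 0.5 * (1.0 + erf(x / (np.sqrt(2) * blur_sigma)))

        # Line spread function (LSF) is the derivative of the ESF.
        lsf = np.gradient(esf, dx)
        lsf /= lsf.sum() * dx                       # normalize to unit area

        # MTF is the magnitude of the Fourier transform of the LSF.
        freqs = np.fft.rfftfreq(lsf.size, d=dx)     # spatial frequency (cycles/mm)
        mtf = np.abs(np.fft.rfft(lsf)) * dx
        mtf /= mtf[0]                               # MTF(0) = 1 by convention

        # Report the frequency at which the MTF first drops below 10%.
        f10 = freqs[np.argmax(mtf < 0.1)]
        print(f"MTF falls below 0.1 at about {f10:.1f} cycles/mm")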

  2. Contrasting response to nutrient manipulation in Arctic mesocosms are reproduced by a minimum microbial food web model.

    Science.gov (United States)

    Larsen, Aud; Egge, Jorun K; Nejstgaard, Jens C; Di Capua, Iole; Thyrhaug, Runar; Bratbak, Gunnar; Thingstad, T Frede

    2015-03-01

    A minimum mathematical model of the marine pelagic microbial food web has previously been shown to reproduce central aspects of the observed system response to different bottom-up manipulations in the Microbial Ecosystem Dynamics (MEDEA) mesocosm experiment in Danish waters. In this study, we apply this model to two mesocosm experiments (Polar Aquatic Microbial Ecology, PAME-I and PAME-II) conducted at the Arctic location Kongsfjorden, Svalbard. The different responses of the microbial community to similar nutrient manipulation in the three mesocosm experiments may be described as diatom-dominated (MEDEA), bacteria-dominated (PAME-I), and flagellate-dominated (PAME-II). When ciliates are allowed to feed on small diatoms, the model describing the diatom-dominated MEDEA experiment gives a bacteria-dominated response as observed in PAME-I, in which the diatom community comprised almost exclusively small-sized cells. Introducing a high initial mesozooplankton stock as observed in PAME-II, the model gives a flagellate-dominated response, in accordance with the response observed in that experiment. The ability of the model, originally developed for temperate waters, to reproduce population dynamics in a 10°C colder Arctic fjord does not support the existence of important shifts in population balances over this temperature range. Rather, it suggests a quite resilient microbial food web when adapted to in situ temperature. The sensitivity of the model response to its mesozooplankton component suggests, however, that the seasonal vertical migration of Arctic copepods may be a strong forcing factor on Arctic microbial food webs.

  3. Disaster Reintegration Model: A Qualitative Analysis on Developing Korean Disaster Mental Health Support Model

    Directory of Open Access Journals (Sweden)

    Yun-Jung Choi

    2018-02-01

    Full Text Available This study sought to describe the mental health problems experienced by Korean disaster survivors, using a qualitative research method to provide empirical resources for effective disaster mental health support in Korea. Participants were 16 adults or elderly adults who had experienced one or more disasters at least 12 months earlier, recruited via theoretical sampling. Participants underwent in-depth individual interviews on their disaster experiences, which were recorded and transcribed for qualitative analysis following Strauss and Corbin's (1998) grounded theory. After open coding, participants' experiences were categorized into 130 codes, 43 sub-categories and 17 categories. The categories were further analyzed in a paradigm model, a conditional model and the Disaster Reintegration Model, which proposes potentially effective mental health recovery strategies for disaster survivors, health providers and administrators. To provide effective assistance for the mental health recovery of disaster survivors, both personal and public resilience should be promoted while considering both cultural and spiritual elements.

  4. Can a coupled meteorology–chemistry model reproduce the historical trend in aerosol direct radiative effects over the Northern Hemisphere?

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere h...

  5. COGNITIVE MODELING AS A METHOD OF QUALITATIVE ANALYSIS OF IT PROJECTS

    Directory of Open Access Journals (Sweden)

    Інна Ігорівна ОНИЩЕНКО

    2016-03-01

    Full Text Available Using an example project implementing an automated CRM system, the possibilities and features of cognitive modeling in the qualitative analysis of project risks are demonstrated, with the aim of determining additional risk characteristics. The construction of cognitive models of IT project risks within qualitative risk analysis is proposed, with additional assessments serving as a method of ranking risks that characterizes the relationships between them. The proposed cognitive model reflects the relationships between the risks of an IT project and is used to assess the negative and positive impact of particular risks on the remaining risks of implementing the automated CRM system. The fact that one risk can influence other project risks may increase the priority of a risk with low direct impact on results, owing to its relationships with the other project risks.

  6. Reproducibility in a multiprocessor system

    Science.gov (United States)

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed; a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system and a scan of the system state.

  7. Climate model biases in seasonality of continental water storage revealed by satellite gravimetry

    Science.gov (United States)

    Swenson, Sean; Milly, P.C.D.

    2006-01-01

    Satellite gravimetric observations of monthly changes in continental water storage are compared with outputs from five climate models. All models qualitatively reproduce the global pattern of annual storage amplitude, and the seasonal cycle of global average storage is reproduced well, consistent with earlier studies. However, global average agreements mask systematic model biases in low latitudes. Seasonal extrema of low‐latitude, hemispheric storage generally occur too early in the models, and model‐specific errors in amplitude of the low‐latitude annual variations are substantial. These errors are potentially explicable in terms of neglected or suboptimally parameterized water stores in the land models and precipitation biases in the climate models.

  8. Modeling microtubule oscillations

    DEFF Research Database (Denmark)

    Jobs, E.; Wolf, D.E.; Flyvbjerg, H.

    1997-01-01

    Synchronization of molecular reactions in a macroscopic volume may cause the volume's physical properties to change dynamically and thus reveal much about the reactions. As an example, experimental time series for so-called microtubule oscillations are analyzed in terms of a minimal model...... for this complex polymerization-depolymerization cycle. The model reproduces well the qualitatively different time series that result from different experimental conditions, and illuminates the role and importance of individual processes in the cycle. Simple experiments are suggested that can further test...... and define the model and the polymer's reaction cycle....

  9. A Bayesian Perspective on the Reproducibility Project: Psychology.

    Science.gov (United States)

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors, a quantity that can be used to express comparative evidence for a hypothesis but also for the null hypothesis, for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., Bayes factor < 10). The majority of studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempt provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than in the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.
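
    The paper's own computation corrects for publication bias; as a simpler stand-in, the sketch below uses the common BIC approximation to the Bayes factor for a two-group comparison, which conveys how evidence for the null versus the alternative can be quantified. The data are simulated and the approximation is illustrative, not the authors' method.

        # BIC approximation to the Bayes factor for a two-group mean comparison
        # (illustrative stand-in; the paper uses bias-corrected Bayes factors).
        #   BF01 ~ exp((BIC_alt - BIC_null) / 2), following Wagenmakers (2007).
        import numpy as np

        rng = np.random.default_rng(0)
        group_a = rng.normal(0.0, 1.0, size=30)     # hypothetical replication data
        group_b = rng.normal(0.3, 1.0, size=30)

        def bic_normal(residuals, n_params):
            """BIC of a normal model with the given residuals and parameter count."""
            n = residuals.size
            sigma2 = np.mean(residuals**2)
            log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
            return n_params * np.log(n) - 2.0 * log_lik

        pooled = np.concatenate([group_a, group_b])
        bic_null = bic_normal(pooled - pooled.mean(), n_params=2)       # one mean + variance
        bic_alt = bic_normal(
            np.concatenate([group_a - group_a.mean(), group_b - group_b.mean()]),
            n_params=3,                                                 # two means + variance
        )

        bf01 = np.exp((bic_alt - bic_null) / 2.0)
        print(f"Approximate BF01 (evidence for the null) = {bf01:.2f}")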

  10. Contextual sensitivity in scientific reproducibility

    Science.gov (United States)

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  11. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  12. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  13. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  14. Reproducing American Sign Language Sentences: Cognitive Scaffolding in Working Memory

    Directory of Open Access Journals (Sweden)

    Ted eSupalla

    2014-08-01

    Full Text Available The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency, as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and biases across groups. Subjects' recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies in the absence of linguistic knowledge. A qualitative error analysis allows us to capture generalizations about the relationship between error pattern and the cognitive scaffolding that governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with meaning equivalent to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less-fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are

  15. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...
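
    For readers new to the topic, the defining property on which all of these applications rest can be stated in two lines. The following is the standard textbook definition, not material specific to this volume.

        % Standard definition of a reproducing kernel Hilbert space (RKHS).
        Let $H$ be a Hilbert space of functions on a set $E$ with inner product
        $\langle \cdot,\cdot \rangle_H$. A function $K : E \times E \to \mathbb{C}$ is the
        \emph{reproducing kernel} of $H$ if, for every $y \in E$,
        \begin{align}
          K(\cdot, y) &\in H, \\
          f(y) &= \langle f, K(\cdot, y) \rangle_H \quad \text{for all } f \in H.
        \end{align}
        It follows that $K(x,y) = \langle K(\cdot,y), K(\cdot,x) \rangle_H$ and that
        $K$ is Hermitian and positive semidefinite.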

  16. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  17. Validation of EURO-CORDEX regional climate models in reproducing the variability of precipitation extremes in Romania

    Science.gov (United States)

    Dumitrescu, Alexandru; Busuioc, Aristita

    2016-04-01

    EURO-CORDEX is the European branch of the international CORDEX initiative that aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of the individual models in reproducing the variability of precipitation extremes in Romania. Here an ensemble of three EURO-CORDEX regional climate models (RCMs) under scenario RCP4.5 is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. Compared to previous studies, in which RCM validation for the Romanian climate has mainly addressed the mean state at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to obtain a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and the observed grid-point values was made by calculating three extreme precipitation indices, recommended by the Expert Team on Climate Change Detection Indices (ETCCDI), for the 1976-2005 period: R10MM, the annual count of days when precipitation ≥ 10 mm; RX5DAY, the annual maximum 5-day precipitation; and R95P%, the fraction of annual total precipitation due to daily precipitation above the 95th percentile. The RCMs' capability to reproduce the mean state of these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions). This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian
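
    The three ETCCDI indices named above have simple operational definitions. The Python sketch below computes them for one grid point and one year of daily precipitation; the synthetic data and the 1 mm wet-day threshold used for the percentile are illustrative assumptions.

        # Compute the three ETCCDI-style indices used above (R10MM, RX5DAY, R95P share)
        # for a single grid point and one year of daily precipitation in mm/day.
        # The data are synthetic; the >= 1 mm wet-day convention is an assumption here.
        import numpy as np

        rng = np.random.default_rng(1)
        daily_pr = rng.gamma(shape=0.4, scale=6.0, size=365)    # hypothetical mm/day

        # R10MM: number of days with precipitation >= 10 mm.
        r10mm = int(np.sum(daily_pr >= 10.0))

        # RX5DAY: maximum precipitation accumulated over any 5 consecutive days.
        five_day_sums = np.convolve(daily_pr, np.ones(5), mode="valid")
        rx5day = float(five_day_sums.max())

        # R95P share: fraction of the annual total falling on very wet days,
        # i.e. wet days (>= 1 mm) exceeding the 95th percentile of wet-day amounts.
        wet = daily_pr[daily_pr >= 1.0]
        p95 = np.percentile(wet, 95)
        r95p_share = daily_pr[daily_pr > p95].sum() / daily_pr.sum()

        print(f"R10MM  = {r10mm} days")
        print(f"RX5DAY = {rx5day:.1f} mm")
        print(f"R95P   = {100 * r95p_share:.1f} % of annual total")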

  18. Qualitative feature extractions of chaotic systems

    International Nuclear Information System (INIS)

    Vicha, T.; Dohnal, M.

    2008-01-01

    The theory of chaos offers useful tools for systems analysis. However, models of complex systems are based on a network of inconsistent, sparse and uncertain knowledge items. Traditional quantitative methods of chaos analysis are therefore not applicable. The paper by the same authors [Vicha T, Dohnal M. Qualitative identification of chaotic systems behaviours. Chaos, Solitons and Fractals, in press, [Log. No. 601019]] presents a qualitative interpretation of some chaos concepts. There are only three qualitative values: positive/increasing, negative/decreasing and zero/constant. It means that any set of qualitative multidimensional descriptions of unsteady-state behaviours is discrete and finite. A finite upper limit exists for the total number of qualitatively distinguishable scenarios. A set of 21 published chaotic models is solved qualitatively and the 21 sets of all existing qualitative scenarios are presented. The intersection of all 21 scenario sets is empty: there is no behaviour that is common to all 21 models. The set of 21 qualitative models (e.g. Lorenz, Roessler) can be used to compare the chaotic behaviours of an unknown qualitative model against them, to evaluate whether its chaotic behaviour is close to, e.g., the Lorenz chaotic model, and how much

  19. Measurement of cerebral blood flow by intravenous xenon-133 technique and a mobile system. Reproducibility using the Obrist model compared to total curve analysis

    DEFF Research Database (Denmark)

    Schroeder, T; Holstein, P; Lassen, N A

    1986-01-01

    and side-to-side asymmetry. Data were analysed according to the Obrist model and the results compared with those obtained using a model correcting for the air passage artifact. Reproducibility was of the same order of magnitude as reported using stationary equipment. The side-to-side CBF asymmetry...... was considerably more reproducible than CBF level. Using a single detector instead of five regional values averaged as the hemispheric flow increased standard deviation of CBF level by 10-20%, while the variation in asymmetry was doubled. In optimal measuring conditions the two models revealed no significant...... differences, but in low flow situations the artifact model yielded significantly more stable results. The present apparatus, equipped with 3-5 detectors covering each hemisphere, offers the opportunity of performing serial CBF measurements in situations not otherwise feasible....

  20. Qualitative and quantitative combined nonlinear dynamics model and its application in analysis of price, supply–demand ratio and selling rate

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    The qualitative and quantitative combined nonlinear dynamics model proposed in this paper fills a gap in nonlinear dynamics modelling with respect to combined qualitative and quantitative methods, allowing the qualitative and quantitative models to be combined so that each compensates for the weaknesses of the other. The combined approach overcomes the weakness that a qualitative model cannot be applied and verified in a quantitative manner, as well as the high cost and long time required to repeatedly construct and verify a quantitative model. The combined model is therefore more practical and efficient, which is of great significance for nonlinear dynamics. The combined qualitative and quantitative modelling and model-analysis method presented in this paper is not only applicable to nonlinear dynamics but can also be adopted in the modelling and model analysis of other fields. Additionally, the analytical method for the combined qualitative and quantitative nonlinear dynamics model proposed in this paper can satisfactorily resolve the problems with the existing analytical methods for nonlinear dynamics models of the price system. The three-dimensional dynamics model of price, supply-demand ratio and selling rate established in this paper makes estimates of the best commodity prices from the model results, thereby providing a theoretical basis for the government's macro-control of prices. Meanwhile, the model also offers theoretical guidance on how to enhance people's purchasing power and consumption levels through price regulation, and hence how to improve people's living standards.

  1. A stable and reproducible human blood-brain barrier model derived from hematopoietic stem cells.

    Directory of Open Access Journals (Sweden)

    Romeo Cecchelli

    Full Text Available The human blood-brain barrier (BBB) is a selective barrier formed by human brain endothelial cells (hBECs), which is important to ensure adequate neuronal function and to protect the central nervous system (CNS) from disease. The development of human in vitro BBB models is thus of utmost importance for drug discovery programs related to CNS diseases. Here, we describe a method to generate a human BBB model using cord blood-derived hematopoietic stem cells. The cells were initially differentiated into ECs, followed by the induction of BBB properties by co-culture with pericytes. The brain-like endothelial cells (BLECs) express tight junctions and transporters typically observed in brain endothelium and maintain expression of most in vivo BBB properties for at least 20 days. The model is very reproducible, since it can be generated from stem cells isolated from different donors and in different laboratories, and it could be used to predict the CNS distribution of compounds in humans. Finally, we provide evidence that the Wnt/β-catenin signaling pathway mediates in part the BBB-inductive properties of pericytes.

  2. Prediction of lung tumour position based on spirometry and on abdominal displacement: Accuracy and reproducibility

    International Nuclear Information System (INIS)

    Hoisak, Jeremy D.P.; Sixel, Katharina E.; Tirona, Romeo; Cheung, Patrick C.F.; Pignol, Jean-Philippe

    2006-01-01

    Background and purpose: A simulation investigating the accuracy and reproducibility of a tumour motion prediction model over clinical time frames is presented. The model is formed from surrogate and tumour motion measurements, and used to predict the future position of the tumour from surrogate measurements alone. Patients and methods: Data were acquired from five non-small cell lung cancer patients, on 3 days. Measurements of respiratory volume by spirometry and abdominal displacement by a real-time position tracking system were acquired simultaneously with X-ray fluoroscopy measurements of superior-inferior tumour displacement. A model of tumour motion was established and used to predict future tumour position, based on surrogate input data. The calculated position was compared against true tumour motion as seen on fluoroscopy. Three different imaging strategies, pre-treatment, pre-fraction and intrafractional imaging, were employed in establishing the fitting parameters of the prediction model. The impact of each imaging strategy upon accuracy and reproducibility was quantified. Results: When establishing the predictive model using pre-treatment imaging, four of five patients exhibited poor interfractional reproducibility for either surrogate in subsequent sessions. Simulating the formulation of the predictive model prior to each fraction resulted in improved interfractional reproducibility. The accuracy of the prediction model was only improved in one of five patients when intrafractional imaging was used. Conclusions: Employing a prediction model established from measurements acquired at planning resulted in localization errors. Pre-fractional imaging improved the accuracy and reproducibility of the prediction model. Intrafractional imaging was of less value, suggesting that the accuracy limit of a surrogate-based prediction model is reached with once-daily imaging
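
    The abstract does not give the functional form of the prediction model; the sketch below shows one common, simple choice, a least-squares linear map from the abdominal-displacement surrogate to superior-inferior tumour position, built either once at planning or refit before the fraction, with error evaluated on later samples. All numbers are synthetic assumptions.

        # Illustrative surrogate-based prediction: linear fit of SI tumour position
        # against abdominal displacement, refit "pre-fraction", evaluated later.
        # The linear form and all numbers are assumptions, not the paper's model.
        import numpy as np

        rng = np.random.default_rng(2)

        def session(n=120, gain=8.0, offset=2.0, noise=0.8):
            """Synthetic breathing trace: surrogate (cm) and tumour SI position (mm)."""
            t = np.linspace(0.0, 60.0, n)
            surrogate = 0.5 * np.sin(2 * np.pi * t / 4.0) + 0.05 * rng.normal(size=n)
            tumour = gain * surrogate + offset + noise * rng.normal(size=n)
            return surrogate, tumour

        # "Planning" session used to build the model, later session used for treatment.
        s_plan, y_plan = session(gain=8.0, offset=2.0)
        s_treat, y_treat = session(gain=9.5, offset=4.0)    # anatomy drifted between days

        coeffs_pretreat = np.polyfit(s_plan, y_plan, deg=1)                  # fit once at planning
        coeffs_prefraction = np.polyfit(s_treat[:30], y_treat[:30], deg=1)   # refit each fraction

        for label, coeffs in [("pre-treatment model", coeffs_pretreat),
                              ("pre-fraction model", coeffs_prefraction)]:
            pred = np.polyval(coeffs, s_treat[30:])
            rmse = np.sqrt(np.mean((pred - y_treat[30:]) ** 2))
            print(f"{label}: RMSE = {rmse:.2f} mm on the treatment fraction")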

  3. Diagnostic reasoning using qualitative causal models

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1992-01-01

    The application of expert systems to reasoning problems involving real-time data from plant measurements has been a topic of much research, but few practical systems have been deployed. One obstacle to wider use of expert systems in applications involving real-time data is the lack of adequate knowledge representation methodologies for dynamic processes. Knowledge bases composed mainly of rules have disadvantages when applied to dynamic processes and real-time data. This paper describes a methodology for the development of qualitative causal models that can be used as knowledge bases for reasoning about process dynamic behavior. These models provide a systematic method for knowledge base construction, considerably reducing the engineering effort required. They also offer much better opportunities for verification and validation of the knowledge base, thus increasing the possibility of applying expert systems to reasoning about mission-critical systems. Starting with the Signed Directed Graph (SDG) method, which has been successfully applied to describe the behavior of diverse dynamic processes, the paper shows how certain non-physical behaviors that result from abstraction may be eliminated by applying causal constraints to the models. The resulting Extended Signed Directed Graph (ESDG) may then be compiled to produce a model for use in process fault diagnosis. This model-based reasoning methodology is used in the MOBIAS system being developed by Duke Power Company under EPRI sponsorship. 15 refs., 4 figs
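
    The signed-directed-graph idea lends itself to a very small sketch: nodes are process variables, edges carry a plus or minus influence sign, and a hypothesised fault is propagated through the graph to predict the qualitative pattern of deviations, which is then compared with observed alarms. The miniature tank process below is invented for illustration and is unrelated to the MOBIAS system.

        # Miniature signed-directed-graph (SDG) fault diagnosis sketch.
        # The toy process graph and fault candidates are invented for illustration.
        # Edge sign +1: an increase in the source drives an increase in the target.
        edges = {
            ("feed_flow", "level"): +1,
            ("outlet_valve", "level"): -1,
            ("level", "pressure"): +1,
            ("pressure", "relief_flow"): +1,
        }

        def propagate(root, root_sign):
            """Predict qualitative deviations (+1/-1) reachable from a fault hypothesis."""
            deviations = {root: root_sign}
            changed = True
            while changed:
                changed = False
                for (src, dst), sign in edges.items():
                    if src in deviations and dst not in deviations:
                        deviations[dst] = deviations[src] * sign
                        changed = True
            return deviations

        observed = {"level": +1, "pressure": +1, "relief_flow": +1}   # e.g. from alarms

        candidates = {"feed_flow too high": ("feed_flow", +1),
                      "outlet_valve stuck open": ("outlet_valve", +1)}
        for name, (node, sign) in candidates.items():
            predicted = propagate(node, sign)
            consistent = all(predicted.get(var) == dev for var, dev in observed.items())
            print(f"{name}: {'consistent' if consistent else 'inconsistent'} with observations")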

  4. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  5. A qualitative model of limiting factors for a salmon life cycle in the context of river rehabilitation

    NARCIS (Netherlands)

    Noble, R.A.A.; Bredeweg, B.; Linnebank, F.; Salles, P.; Cowx, I.G.

    2009-01-01

    Qualitative Reasoning modelling has been promoted as a tool for formalising, integrating and exploring conceptual knowledge in ecological systems, such as river rehabilitation, which draw different information from multiple domains. A qualitative model was developed in Garp3 to capture and formalise

  6. A simple spatiotemporal rabies model for skunk and bat interaction in northeast Texas.

    Science.gov (United States)

    Borchering, Rebecca K; Liu, Hao; Steinhaus, Mara C; Gardner, Carl L; Kuang, Yang

    2012-12-07

    We formulate a simple partial differential equation model in an effort to qualitatively reproduce the spread dynamics and spatial pattern of rabies in northeast Texas with overlapping reservoir species (skunks and bats). Most existing models ignore reservoir species or model them with patchy models by ordinary differential equations. In our model, we incorporate interspecies rabies infection in addition to rabid population random movement. We apply this model to the confirmed case data from northeast Texas with most parameter values obtained or computed from the literature. Results of simulations using both our skunk-only model and our skunk and bat model demonstrate that the model with overlapping reservoir species more accurately reproduces the progression of rabies spread in northeast Texas. Copyright © 2012 Elsevier Ltd. All rights reserved.
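
    A minimal sketch of this class of model: one-dimensional reaction-diffusion equations for rabid skunks and rabid bats with within- and between-species transmission and random movement of the rabid animals, integrated by explicit finite differences. All parameter values and the domain are placeholders, not those of the published model.

    import numpy as np

    nx, dx = 201, 1.0                      # 200 km domain, 1 km cells
    dt, steps = 0.002, 5000                # roughly 10 years of simulated spread
    D_s, D_b = 20.0, 100.0                 # diffusion of rabid skunks / bats (km^2 per yr)
    beta_ss, beta_bb = 1.5, 1.2            # within-species transmission rates (per yr)
    beta_bs, beta_sb = 0.3, 0.1            # bat-to-skunk and skunk-to-bat transmission
    mu = 1.0                               # removal rate of rabid animals (per yr)

    S, B = np.ones(nx), np.ones(nx)        # susceptible skunks and bats (scaled density)
    Is, Ib = np.zeros(nx), np.zeros(nx)    # rabid skunks and bats
    Is[nx // 2] = 0.01                     # initial rabid-skunk focus at the centre

    def lap(u):                            # 1D Laplacian (periodic ends; adequate here)
        return (np.roll(u, 1) + np.roll(u, -1) - 2.0 * u) / dx ** 2

    for _ in range(steps):
        new_s = (beta_ss * Is + beta_bs * Ib) * S      # newly rabid skunks
        new_b = (beta_bb * Ib + beta_sb * Is) * B      # newly rabid bats
        S += dt * (-new_s)
        B += dt * (-new_b)
        Is += dt * (new_s - mu * Is + D_s * lap(Is))
        Ib += dt * (new_b - mu * Ib + D_b * lap(Ib))

    for name, I in (("skunk", Is), ("bat", Ib)):
        hit = np.flatnonzero(I > 1e-3)
        front = (hit.max() - nx // 2) * dx if hit.size else 0.0
        print(f"rabid {name} front ~{front:.0f} km from the focus after {steps * dt:.0f} yr")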

  7. The development of a qualitative dynamic attribute value model for healthcare institutes.

    Science.gov (United States)

    Lee, Wan-I

    2010-01-01

    Understanding customers has become an urgent topic for increasing competitiveness. The purpose of the study was to develop a qualitative dynamic attribute value model that provides healthcare institute managers with insight into customer values. An initial open-ended questionnaire survey was conducted to select participants purposefully. A total of 427 questionnaires were administered in two hospitals in Taiwan (one district hospital with 635 beds and one academic hospital with 2495 beds), and 419 questionnaires were returned within nine weeks. Qualitative in-depth interviews were then used to explore customers' perspectives on value and to build a model of partial differential equations. The study identifies nine categories of value (cost, equipment, physician background, physician care, environment, timing arrangement, relationship, brand image and additional value) and uses them to construct an objective network for customer value and a qualitative dynamic attribute value model, where the network shows the value process of loyalty development via its effect on customer satisfaction, customer relationship, customer loyalty and healthcare service. One set predicts the customer relationship based on commitment, including service quality, communication and empathy. At the same time, customer loyalty, based on trust, involves buzz marketing, brand and image. Customer value in the current instance is useful for traversing original customer attributes and identifying customers with different service shares.

  8. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    Science.gov (United States)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting agent hypothesis instead of the widely-used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time-scales. We find also that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturbate the financial system.
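
    A minimal sketch of the return-interval analysis referred to above: given a volatility series, collect the waiting times between successive exceedances of a threshold and compare their distributions after scaling by the mean interval. The synthetic series below is a toy with volatility clustering; it is not the herding-based model of the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    log_vol = np.zeros(n)
    for i in range(1, n):                      # slowly mean-reverting log-volatility
        log_vol[i] = 0.98 * log_vol[i - 1] + 0.2 * rng.normal()
    volatility = np.exp(log_vol)

    def scaled_return_intervals(series, quantile):
        """Waiting times between exceedances of the given quantile, scaled by their mean."""
        threshold = np.quantile(series, quantile)
        exceedances = np.flatnonzero(series > threshold)
        tau = np.diff(exceedances)
        return tau / tau.mean()

    # if the scaled distributions collapse onto one curve, the tail probabilities agree
    for q in (0.90, 0.95, 0.99):
        tau = scaled_return_intervals(volatility, q)
        print(f"q = {q:.2f}: P(tau > 3<tau>) = {np.mean(tau > 3.0):.3f}")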

  9. A novel, comprehensive, and reproducible porcine model for determining the timing of bruises in forensic pathology

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2016-01-01

    Purpose Calculating the timing of bruises is crucial in forensic pathology but is a challenging discipline in both human and veterinary medicine. A mechanical device for inflicting bruises in pigs was developed and validated, and the pathological reactions in the bruises were studied over time......-dependent response. Combining these parameters, bruises could be grouped as being either less than 4 h old or between 4 and 10 h of age. Gross lesions and changes in the epidermis and dermis were inconclusive with respect to time determination. Conclusions The model was reproducible and resembled forensic cases...

  10. Highly reproducible and sensitive silver nanorod array for the rapid detection of Allura Red in candy

    Science.gov (United States)

    Yao, Yue; Wang, Wen; Tian, Kangzhen; Ingram, Whitney Marvella; Cheng, Jie; Qu, Lulu; Li, Haitao; Han, Caiqin

    2018-04-01

    Allura Red (AR) is a highly stable synthetic red azo dye, which is widely used in the food industry to dye food and increase its attraction to consumers. However, the excessive consumption of AR can result in adverse health effects in humans. Therefore, a highly reproducible silver nanorod (AgNR) array was developed for surface enhanced Raman scattering (SERS) detection of AR in candy. The relative standard deviation (RSD) of AgNR substrates obtained from the same batch and from different batches was 5.7% and 11.0%, respectively, demonstrating the high reproducibility. Using these highly reproducible AgNR arrays as the SERS substrates, AR was detected successfully, and its characteristic peaks were assigned by density functional theory (DFT) calculation. The limit of detection (LOD) of AR was determined to be 0.05 mg/L with a wide linear range of 0.8-100 mg/L. Furthermore, the AgNR SERS arrays can detect AR directly in different candy samples within 3 min without any complicated pretreatment. These results suggest the AgNR array can be used for rapid and qualitative SERS detection of AR, holding great promise for expanding SERS applications in the field of food safety control.
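
    A minimal sketch of how a calibration curve and limit of detection of the kind quoted above can be obtained: peak intensity is regressed against concentration over the linear range and the LOD is taken as three times the blank standard deviation divided by the slope. The intensities below are synthetic placeholders, not measured SERS data.

    import numpy as np

    rng = np.random.default_rng(5)
    conc = np.array([0.8, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])      # mg/L, linear range
    intensity = 120.0 * conc + rng.normal(0.0, 60.0, conc.size)    # marker-band height
    blank = rng.normal(0.0, 60.0, 10)                              # repeated blank readings

    slope, intercept = np.polyfit(conc, intensity, 1)
    r2 = np.corrcoef(conc, intensity)[0, 1] ** 2
    lod = 3.0 * blank.std(ddof=1) / slope

    print(f"calibration: I = {slope:.1f} * c + {intercept:.1f}  (R^2 = {r2:.3f})")
    print(f"estimated LOD = {lod:.2f} mg/L")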

  11. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    Full Text Available The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs-locomotor bouts-matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.

  12. Repeatability and reproducibility of Population Viability Analysis (PVA) and the implications for threatened species management

    Directory of Open Access Journals (Sweden)

    Clare Morrison

    2016-08-01

    Full Text Available Conservation triage focuses on prioritizing species, populations or habitats based on urgency, biodiversity benefits, recovery potential as well as cost. Population Viability Analysis (PVA) is frequently used in population focused conservation prioritizations. The critical nature of many of these management decisions requires that PVA models are repeatable and reproducible to reliably rank species and/or populations quantitatively. This paper assessed the repeatability and reproducibility of a subset of previously published PVA models. We attempted to rerun baseline models from 90 publicly available PVA studies published between 2000-2012 using the two most common PVA modelling software programs, VORTEX and RAMAS-GIS. Forty percent (n = 36) failed, 50% (n = 45) were both repeatable and reproducible, and 10% (n = 9) had missing baseline models. Repeatability was not linked to taxa, IUCN category, PVA program version used, year published or the quality of publication outlet, suggesting that the problem is systemic within the discipline. Complete and systematic presentation of PVA parameters and results are needed to ensure that the scientific input into conservation planning is both robust and reliable, thereby increasing the chances of making decisions that are both beneficial and defensible. The implications for conservation triage may be far reaching if population viability models cannot be reproduced with confidence, thus undermining their intended value.

  13. On the origin of reproducible sequential activity in neural circuits

    Science.gov (United States)

    Afraimovich, V. S.; Zhigulin, V. P.; Rabinovich, M. I.

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses is an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model, that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in presence of noise in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.
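
    A minimal sketch of a generalized Lotka-Volterra rate model of the kind discussed above: with asymmetric inhibitory coupling each unit is inhibited weakly by its predecessor and strongly by everyone else, so activity is handed along the network in a fixed order, and small noise keeps the transient sequence moving and reproducible. Network size, coupling strengths and noise level are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    N = 5
    rho = np.full((N, N), 1.5)             # strong inhibition by default
    for i in range(N):
        rho[i, i] = 1.0                    # self-interaction
        rho[i, (i - 1) % N] = 0.5          # weak inhibition from the previous unit
    growth = np.ones(N)

    dt, steps = 0.01, 60_000
    a = np.full(N, 1e-3)
    a[0] = 0.2                             # unit 0 starts active
    winners = []
    for _ in range(steps):
        da = a * (growth - rho @ a)        # generalized Lotka-Volterra rates
        noise = 1e-4 * rng.normal(size=N) * np.sqrt(dt)
        a = np.clip(a + dt * da + noise, 1e-9, None)
        winners.append(int(np.argmax(a)))

    # order in which units dominate: the heteroclinic-like sequence 0 -> 1 -> 2 -> ...
    order = [winners[0]]
    for w in winners[1:]:
        if w != order[-1]:
            order.append(w)
    print("sequence of dominant units:", order[:10])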

  14. Mouse Models of Diet-Induced Nonalcoholic Steatohepatitis Reproduce the Heterogeneity of the Human Disease

    Science.gov (United States)

    Machado, Mariana Verdelho; Michelotti, Gregory Alexander; Xie, Guanhua; de Almeida, Thiago Pereira; Boursier, Jerome; Bohnic, Brittany; Guy, Cynthia D.; Diehl, Anna Mae

    2015-01-01

    Background and aims Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Methods Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. Results The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Conclusion Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH. PMID:26017539

  15. Mouse models of diet-induced nonalcoholic steatohepatitis reproduce the heterogeneity of the human disease.

    Directory of Open Access Journals (Sweden)

    Mariana Verdelho Machado

    Full Text Available Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH.

  16. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    osteosarcoma model was shown to be feasible: the take rate was high, surgical mortality was negligible and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement and may be recommended for monitoring tumor growth in this model.

  17. Reproducibility study of TLD-100 micro-cubes at radiotherapy dose level

    International Nuclear Information System (INIS)

    Rosa, Luiz Antonio R. da; Regulla, Dieter F.; Fill, Ute A.

    1999-01-01

    The precision of the thermoluminescent response of Harshaw micro-cube dosimeters (TLD-100), evaluated in both Harshaw thermoluminescent readers 5500 and 3500, for 1 Gy dose value, was investigated. The mean reproducibility for micro-cubes, pre-readout annealed at 100 deg. C for 15 min, evaluated with the manual planchet reader 3500, is 0.61% (1 standard deviation). When micro-cubes are evaluated with the automated hot-gas reader 5500, reproducibility values are undoubtedly worse, mean reproducibility for numerically stabilised dosimeters being equal to 3.27% (1 standard deviation). These results indicate that the reader model 5500, or, at least, the instrument used for the present measurements, is not adequate for micro-cube evaluation, if precise and accurate dosimetry is required. The difference in precision is apparently due to geometry inconsistencies in the orientation of the imperfect micro-cube faces during readout, requiring careful and manual reproducible arrangement of the selected micro-cube faces in contact with the manual reader planchet

  18. Global Qualitative Flow-Path Modeling for Local State Determination in Simulation and Analysis

    Science.gov (United States)

    Malin, Jane T. (Inventor); Fleming, Land D. (Inventor)

    1998-01-01

    For qualitative modeling and analysis, a general qualitative abstraction of power transmission variables (flow and effort) for elements of flow paths is discussed; it includes information on resistance, net flow, permissible directions of flow, and qualitative potential. Each type of component model has flow-related variables and an associated internal flow map, connected into an overall flow network of the system. For storage devices, the implicit power transfer to the environment is represented by "virtual" circuits that include an environmental junction. A heterogeneous aggregation method simplifies the path structure. A method determines global flow-path changes during dynamic simulation and analysis, and identifies corresponding local flow state changes that are effects of global configuration changes. Flow-path determination is triggered by any change in a flow-related device variable in a simulation or analysis. Components (path elements) that may be affected are identified, and flow-related attributes favoring flow in the two possible directions are collected for each of them. Next, flow-related attributes are determined for each affected path element, based on possibly conflicting indications of flow direction. Spurious qualitative ambiguities are minimized by using relative magnitudes and permissible directions of flow, and by favoring flow sources over effort sources when comparing flow tendencies. The results are output to local flow states of affected components.

  19. Towards a structured approach to building qualitative reasoning models and simulations

    NARCIS (Netherlands)

    Bredeweg, B.; Salles, P.; Bouwer, A.; Liem, J.; Nuttle, T.; Cioca, E.; Nakova, E.; Noble, R.; Caldas, A.L.R.; Uzunov, Y.; Varadinova, E.; Zitek, A.

    2008-01-01

    Successful transfer and uptake of qualitative reasoning technology for modelling and simulation in a variety of domains has been hampered by the lack of a structured methodology to support formalisation of ideas. We present a framework that structures and supports the capture of conceptual knowledge

  20. Getting added value from using qualitative research with randomized controlled trials: a qualitative interview study

    Science.gov (United States)

    2014-01-01

    Background Qualitative research is undertaken with randomized controlled trials of health interventions. Our aim was to explore the perceptions of researchers with experience of this endeavour to understand the added value of qualitative research to the trial in practice. Methods A telephone semi-structured interview study with 18 researchers with experience of undertaking the trial and/or the qualitative research. Results Interviewees described the added value of qualitative research for the trial, explaining how it solved problems at the pretrial stage, explained findings, and helped to increase the utility of the evidence generated by the trial. From the interviews, we identified three models of relationship of the qualitative research to the trial. In ‘the peripheral’ model, the trial was an opportunity to undertake qualitative research, with no intention that it would add value to the trial. In ‘the add-on’ model, the qualitative researcher understood the potential value of the qualitative research but it was viewed as a separate and complementary endeavour by the trial lead investigator and wider team. Interviewees described how this could limit the value of the qualitative research to the trial. Finally ‘the integral’ model played out in two ways. In ‘integral-in-theory’ studies, the lead investigator viewed the qualitative research as essential to the trial. However, in practice the qualitative research was under-resourced relative to the trial, potentially limiting its ability to add value to the trial. In ‘integral-in-practice’ studies, interviewees described how the qualitative research was planned from the beginning of the study, senior qualitative expertise was on the team from beginning to end, and staff and time were dedicated to the qualitative research. In these studies interviewees described the qualitative research adding value to the trial although this value was not necessarily visible beyond the original research team due

  1. Getting added value from using qualitative research with randomized controlled trials: a qualitative interview study.

    Science.gov (United States)

    O'Cathain, Alicia; Goode, Jackie; Drabble, Sarah J; Thomas, Kate J; Rudolph, Anne; Hewison, Jenny

    2014-06-09

    Qualitative research is undertaken with randomized controlled trials of health interventions. Our aim was to explore the perceptions of researchers with experience of this endeavour to understand the added value of qualitative research to the trial in practice. A telephone semi-structured interview study with 18 researchers with experience of undertaking the trial and/or the qualitative research. Interviewees described the added value of qualitative research for the trial, explaining how it solved problems at the pretrial stage, explained findings, and helped to increase the utility of the evidence generated by the trial. From the interviews, we identified three models of relationship of the qualitative research to the trial. In 'the peripheral' model, the trial was an opportunity to undertake qualitative research, with no intention that it would add value to the trial. In 'the add-on' model, the qualitative researcher understood the potential value of the qualitative research but it was viewed as a separate and complementary endeavour by the trial lead investigator and wider team. Interviewees described how this could limit the value of the qualitative research to the trial. Finally 'the integral' model played out in two ways. In 'integral-in-theory' studies, the lead investigator viewed the qualitative research as essential to the trial. However, in practice the qualitative research was under-resourced relative to the trial, potentially limiting its ability to add value to the trial. In 'integral-in-practice' studies, interviewees described how the qualitative research was planned from the beginning of the study, senior qualitative expertise was on the team from beginning to end, and staff and time were dedicated to the qualitative research. In these studies interviewees described the qualitative research adding value to the trial although this value was not necessarily visible beyond the original research team due to the challenges of publishing this research

  2. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    Science.gov (United States)

    2017-08-09

    project.org/) and SPSS (IBM Corp., Armonk, NY) for data analysis. Mean and confidence intervals for each measure are found in Tables 1–7. To assess...visits, and was calculated using a two-way mixed model in SPSS. MCV and MRD values closer to 0 are considered to be the most reproducible, and ICC
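
    A minimal sketch of the reproducibility statistics mentioned in this record: a two-way mixed-model intraclass correlation coefficient for single measurements (often written ICC(3,1)) and a mean within-subject coefficient of variation, computed from a subjects-by-visits table. The data are synthetic placeholders, and the formulas follow the standard Shrout-Fleiss definitions rather than any particular SPSS configuration.

    import numpy as np

    rng = np.random.default_rng(3)
    n_subj, n_visit = 12, 3
    subject_effect = rng.normal(100.0, 10.0, size=(n_subj, 1))            # between-subject spread
    data = subject_effect + rng.normal(0.0, 2.0, size=(n_subj, n_visit))  # visit-to-visit noise

    grand = data.mean()
    subj_mean = data.mean(axis=1, keepdims=True)
    visit_mean = data.mean(axis=0, keepdims=True)

    ms_subj = n_visit * np.sum((subj_mean - grand) ** 2) / (n_subj - 1)
    ms_err = np.sum((data - subj_mean - visit_mean + grand) ** 2) / ((n_subj - 1) * (n_visit - 1))
    icc_31 = (ms_subj - ms_err) / (ms_subj + (n_visit - 1) * ms_err)

    within_cv = np.mean(data.std(axis=1, ddof=1) / data.mean(axis=1))
    print(f"ICC(3,1) = {icc_31:.3f}, mean within-subject CV = {100 * within_cv:.1f}%")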

  3. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  4. Evaluating quantitative and qualitative models: an application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  5. Relevant principal factors affecting the reproducibility of insect primary culture.

    Science.gov (United States)

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.

  6. Scenario development, qualitative causal analysis and system dynamics

    Directory of Open Access Journals (Sweden)

    Michael H. Ruge

    2009-02-01

    Full Text Available The aim of this article is to demonstrate that technology assessments can be supported by methods such as scenario modeling and qualitative causal analysis. At Siemens, these techniques are used to develop preliminary, purely qualitative models. These comprehensive models, or parts of them, may be extended to system dynamics models. While it is currently not possible to automatically generate a system dynamics model (or, vice versa, to obtain a qualitative simulation model from a system dynamics model), the two techniques of scenario development and qualitative causal analysis provide valuable indications on how to proceed towards a system dynamics model. For the qualitative analysis phase, the Siemens-proprietary prototype Computer-Aided Technology Assessment Software (CATS) supports complete cycle and submodel analysis. Keywords: Health care, telecommunications, qualitative model, sensitivity analysis, system dynamics.

  7. Reproducible Hydrogeophysical Inversions through the Open-Source Library pyGIMLi

    Science.gov (United States)

    Wagner, F. M.; Rücker, C.; Günther, T.

    2017-12-01

    Many tasks in applied geosciences cannot be solved by a single measurement method and require the integration of geophysical, geotechnical and hydrological methods. In the emerging field of hydrogeophysics, researchers strive to gain quantitative information on process-relevant subsurface parameters by means of multi-physical models, which simulate the dynamic process of interest as well as its geophysical response. However, such endeavors are associated with considerable technical challenges, since they require coupling of different numerical models. This represents an obstacle for many practitioners and students. Even technically versatile users tend to build individually tailored solutions by coupling different (and potentially proprietary) forward simulators at the cost of scientific reproducibility. We argue that the reproducibility of studies in computational hydrogeophysics, and therefore the advancement of the field itself, requires versatile open-source software. To this end, we present pyGIMLi - a flexible and computationally efficient framework for modeling and inversion in geophysics. The object-oriented library provides management for structured and unstructured meshes in 2D and 3D, finite-element and finite-volume solvers, various geophysical forward operators, as well as Gauss-Newton based frameworks for constrained, joint and fully-coupled inversions with flexible regularization. In a step-by-step demonstration, it is shown how the hydrogeophysical response of a saline tracer migration can be simulated. Tracer concentration data from boreholes and measured voltages at the surface are subsequently used to estimate the hydraulic conductivity distribution of the aquifer within a single reproducible Python script.

  8. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.

  9. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  10. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in investigating and reporting of behavioral phenotypes.

  11. Reproducing Epidemiologic Research and Ensuring Transparency.

    Science.gov (United States)

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Learning Qualitative Differential Equation models: a survey of algorithms and applications.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2010-03-01

    Over the last two decades, qualitative reasoning (QR) has become an important domain in Artificial Intelligence. QDE (Qualitative Differential Equation) model learning (QML), as a branch of QR, has also received an increasing amount of attention; many systems have been proposed to solve various significant problems in this field. QML has been applied to a wide range of fields, including physics, biology and medical science. In this paper, we first identify the scope of this review by distinguishing QML systems from other related model-learning systems, and then review all the noteworthy QML systems within this scope. The applications of QML in several application domains are also introduced briefly. Finally, the future directions of QML are explored from different perspectives.

  13. Qualitative Content Analysis

    OpenAIRE

    Philipp Mayring

    2000-01-01

    The article describes an approach to systematic, rule-guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...

  14. Qualitative mathematics for the social sciences mathematical models for research on cultural dynamics

    CERN Document Server

    Rudolph, Lee

    2012-01-01

    In this book Lee Rudolph brings together international contributors who combine psychological and mathematical perspectives to analyse how qualitative mathematics can be used to create models of social and psychological processes. Bridging the gap between the fields with an imaginative and stimulating collection of contributed chapters, the volume updates the current research on the subject, which until now has been rather limited, focussing largely on the use of statistics. Qualitative Mathematics for the Social Sciences contains a variety of useful illustrative figures, in

  15. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  16. Properties of galaxies reproduced by a hydrodynamic simulation

    Science.gov (United States)

    Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the 'cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs on a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the 'metal' and hydrogen content of galaxies on small scales.

  17. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

    Full Text Available The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking, and the lack of open, transparent and fair benchmark sets present another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  18. Reproducing the Wechsler Intelligence Scale for Children-Fifth Edition: Factor Model Results

    Science.gov (United States)

    Beaujean, A. Alexander

    2016-01-01

    One of the ways to increase the reproducibility of research is for authors to provide a sufficient description of the data analytic procedures so that others can replicate the results. The publishers of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) do not follow these guidelines when reporting their confirmatory factor…

  19. Qualitative identification of chaotic systems behaviours

    International Nuclear Information System (INIS)

    Vicha, T.; Dohnal, M.

    2008-01-01

    There are only three qualitative values: positive, negative and zero. This means that there is a maximal number of qualitatively distinguishable scenarios, prescribed by the number of variables and the highest qualitative derivative taken into consideration. There are several chaos-related tasks which can be solved only with great difficulty at the numerical level if multidimensional problems are studied. One of them is the identification of all qualitatively different behaviours. To make sure that all distinctive qualitative scenarios are identified, a qualitative interpretation of a classical quantitative phase portrait is used. The highest derivatives are usually the second derivatives, as it is not possible to safely identify higher derivatives if tasks related to ecology or economics are studied. Two classical models are discussed: damped oscillation (non-chaotic) and the Lorenz model (chaotic). There are 191 scenarios of the Lorenz model if only the second derivatives are considered. If the third derivatives are taken into consideration, then the number of scenarios is 2619. Complete qualitative results are given in detail.
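
    A minimal sketch of one way to enumerate qualitatively distinct states along a trajectory of the Lorenz model: each state is the sign triplet (value, first derivative, second derivative) for x, y and z, and distinct sign tuples are counted. Because only a single trajectory is sampled, the count is a lower bound on the number of scenarios; the integration settings are illustrative.

    import numpy as np

    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    dt, steps = 1e-3, 200_000
    s = np.array([1.0, 1.0, 1.0])
    scenarios = set()
    prev_ds = None
    for _ in range(steps):
        ds = f(s)
        if prev_ds is not None:
            dds = (ds - prev_ds) / dt                    # numerical second derivative
            signs = np.sign(np.concatenate([s, ds, dds])).astype(int)
            scenarios.add(tuple(signs))
        prev_ds = ds
        s = s + dt * ds                                  # explicit Euler, fine for a sketch
    print(f"qualitative scenarios visited by this trajectory: {len(scenarios)}")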

  20. Criminalisation of clients: reproducing vulnerabilities for violence and poor health among street-based sex workers in Canada—a qualitative study

    Science.gov (United States)

    Krüsi, A; Pacey, K; Bird, L; Taylor, C; Chettiar, J; Allan, S; Bennett, D; Montaner, J S; Kerr, T; Shannon, K

    2014-01-01

    Objectives To explore how criminalisation and policing of sex buyers (clients) rather than sex workers shapes sex workers’ working conditions and sexual transactions including risk of violence and HIV/sexually transmitted infections (STIs). Design Qualitative and ethnographic study triangulated with sex work-related violence prevalence data and publicly available police statistics. Setting Vancouver, Canada, provides a unique opportunity to evaluate the impact of policies that criminalise clients as the local police department adopted a sex work enforcement policy in January 2013 that prioritises sex workers’ safety over arrest, while continuing to target clients. Participants 26 cisgender and 5 transgender women who were street-based sex workers (n=31) participated in semistructured interviews about their working conditions. All had exchanged sex for money in the previous 30 days in Vancouver. Outcome measures Thematic analysis of interview transcripts and ethnographic field notes focused on how police enforcement of clients shaped sex workers’ working conditions and sexual transactions, including risk of violence and HIV/STIs, over an 11-month period postpolicy implementation (January–November 2013). Results Sex workers’ narratives and ethnographic observations indicated that while police sustained a high level of visibility, they eased charging or arresting sex workers and showed increased concern for their safety. However, participants’ accounts and police statistics indicated continued police enforcement of clients. This profoundly impacted the safety strategies sex workers employed. Sex workers continued to mistrust police, had to rush screening clients and were displaced to outlying areas with increased risks of violence, including being forced to engage in unprotected sex. Conclusions These findings suggest that criminalisation and policing strategies that target clients reproduce the harms created by the criminalisation of sex work, in

  1. A Proposed Model of Retransformed Qualitative Data within a Mixed Methods Research Design

    Science.gov (United States)

    Palladino, John M.

    2009-01-01

    Most models of mixed methods research design provide equal emphasis of qualitative and quantitative data analyses and interpretation. Other models stress one method more than the other. The present article is a discourse about the investigator's decision to employ a mixed method design to examine special education teachers' advocacy and…

  2. Enriched reproducing kernel particle method for fractional advection-diffusion equation

    Science.gov (United States)

    Ying, Yuping; Lian, Yanping; Tang, Shaoqiang; Liu, Wing Kam

    2018-06-01

    The reproducing kernel particle method (RKPM) has been efficiently applied to problems with large deformations, high gradients and high modal density. In this paper, it is extended to solve a nonlocal problem modeled by a fractional advection-diffusion equation (FADE), which exhibits a boundary layer with low regularity. We formulate this method based on a moving least-squares approach. Via the enrichment of fractional-order power functions to the traditional integer-order basis for RKPM, leading terms of the solution to the FADE can be exactly reproduced, which guarantees a good approximation to the boundary layer. Numerical tests are performed to verify the proposed approach.
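
    A minimal sketch of the enrichment idea: a one-dimensional moving least-squares approximation, the local building block of RKPM, reproduces exactly any function contained in its basis, so appending the fractional power x**alpha to the usual polynomial basis captures a low-regularity profile near the boundary. The test function, node spacing and weight function are illustrative assumptions, not the scheme of the paper.

    import numpy as np

    alpha = 0.6
    nodes = np.linspace(0.0, 1.0, 21)
    u_nodes = nodes ** alpha                 # target with a weak singularity at x = 0
    support = 0.15                           # radius of the compact MLS weight

    def mls_value(x, basis):
        """Moving least-squares fit at x from nodal values, for a given basis."""
        w = np.maximum(1.0 - np.abs(x - nodes) / support, 0.0) ** 2
        P = np.column_stack([b(nodes) for b in basis])          # basis at the nodes
        p = np.array([b(np.array([x]))[0] for b in basis])      # basis at x
        M = P.T @ (w[:, None] * P)
        coeff = np.linalg.solve(M, P.T @ (w * u_nodes))
        return p @ coeff

    poly = [lambda s: np.ones_like(s), lambda s: s, lambda s: s ** 2]
    enriched = poly + [lambda s: s ** alpha]

    xs = np.linspace(0.01, 0.3, 30)
    err_poly = max(abs(mls_value(x, poly) - x ** alpha) for x in xs)
    err_enr = max(abs(mls_value(x, enriched) - x ** alpha) for x in xs)
    print(f"max error near x = 0: polynomial basis {err_poly:.1e}, enriched basis {err_enr:.1e}")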

  3. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  4. Teaching Qualitative Research for Human Services Students: A Three-Phase Model

    Science.gov (United States)

    Goussinsky, Ruhama; Reshef, Arie; Yanay-Ventura, Galit; Yassour-Borochowitz, Dalit

    2011-01-01

    Qualitative research is an inherent part of the human services profession, since it emphasizes the great and multifaceted complexity characterizing human experience and the sociocultural context in which humans act. In the department of human services at Emek Yezreel College, Israel, we have developed a three-phase model to ensure a relatively…

  5. Low-frequency Raman spectra of sub- and supercritical CO2: qualitative analysis of the diffusion coefficient behavior.

    Science.gov (United States)

    Idrissi, A; Longelin, S; Damay, P; Leclercq, F

    2005-09-01

    We report the results of the low-frequency Raman experiments on CO(2) which were carried out in a wide density range, along the liquid-gas coexistence curve in a temperature range of 293-303 K, and on the critical isochore of 94.4 cm(3) mol(-1) in a temperature range of 304-315 K. In our approach, the qualitative behavior of the diffusion coefficient D is predicted, assuming the following: first, that the low-frequency Raman spectra can be interpreted in terms of the translation rotation motions; second, that the random force could be replaced by the total force to calculate the friction coefficient; and finally, that the Einstein frequency is associated with the position of the maximum of the low-frequency Raman spectrum. The results show that the diffusion coefficient increases along the coexistence curve, and its values are almost constant on the critical isochore. The predicted values reproduce qualitatively those obtained by other techniques. The values of D were also calculated by molecular-dynamics simulation and they qualitatively reproduce the behavior of D.

  6. Entangled states that cannot reproduce original classical games in their quantum version

    International Nuclear Information System (INIS)

    Shimamura, Junichi; Oezdemir, S.K.; Morikoshi, Fumiaki; Imoto, Nobuyuki

    2004-01-01

    A model of a quantum version of classical games should reproduce the original classical games in order to be able to make a comparative analysis of quantum and classical effects. We analyze a class of symmetric multipartite entangled states and their effect on the reproducibility of the classical games. We present the necessary and sufficient condition for the reproducibility of the original classical games. Satisfying this condition means that complete orthogonal bases can be constructed from a given multipartite entangled state provided that each party is restricted to two local unitary operators. We prove that most of the states belonging to the class of symmetric states with respect to permutations, including the N-qubit W state, do not satisfy this condition

  7. Ecological Applications of Qualitative Reasoning

    NARCIS (Netherlands)

    Bredeweg, B.; Salles, P.; Neumann, M.; Recknagel, F.

    2006-01-01

    Representing qualitative ecological knowledge is of great interest for ecological modelling. QR provides means to build conceptual models and to make qualitative knowledge explicit, organized and manageable by means of symbolic computing. This chapter discusses the main characteristics of QR using

  8. Biodiversity and soil quality in agroecosystems: the use of a qualitative multi-attribute model

    DEFF Research Database (Denmark)

    Cortet, J.; Bohanec, M.; Griffiths, B.

    2009-01-01

    In ecological impact assessment, special emphasis is put on soil biology and estimating soil quality from the observed biological parameters. The aim of this study is to propose a tool that is easy for scientists and decision makers to use in assessing agroecosystem soil quality from these biological... parameters. This tool was developed as a collaboration between ECOGEN (www.ecogen.dk) soil experts and decision analysts. Methodologically, we have addressed this goal using model-based Decision Support Systems (DSS), taking the approach of qualitative multi-attribute modelling. The approach is based... on developing various hierarchical multi-attribute models that consist of qualitative attributes and utility (aggregation) functions, represented by decision rules. The assessment of soil quality is based on two main indicators: (1) soil diversity (assessed through microfauna, mesofauna and macrofauna richness...

  9. EMC3-EIRENE modeling of toroidally-localized divertor gas injection experiments on Alcator C-Mod

    Energy Technology Data Exchange (ETDEWEB)

    Lore, J.D., E-mail: lorejd@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Reinke, M.L. [York Plasma Institute, Department of Physics, University of York, Heslington, York YO10 5DD (United Kingdom); LaBombard, B. [Plasma Science and Fusion Center, MIT, Cambridge, MA 02139 (United States); Lipschultz, B. [York Plasma Institute, Department of Physics, University of York, Heslington, York YO10 5DD (United Kingdom); Churchill, R.M. [Plasma Science and Fusion Center, MIT, Cambridge, MA 02139 (United States); Pitts, R.A. [ITER Organization, Route de Vinon sur Verdon, 13115 Saint Paul Lez Durance (France); Feng, Y. [Max Planck Institute for Plasma Physics, Greifswald (Germany)

    2015-08-15

    Experiments on Alcator C-Mod with toroidally and poloidally localized divertor nitrogen injection have been modeled using the three-dimensional edge transport code EMC3-EIRENE to elucidate the mechanisms driving measured toroidal asymmetries. In these experiments five toroidally distributed gas injectors in the private flux region were sequentially activated in separate discharges resulting in clear evidence of toroidal asymmetries in radiated power and nitrogen line emission as well as a ∼50% toroidal modulation in electron pressure at the divertor target. The pressure modulation is qualitatively reproduced by the modeling, with the simulation yielding a toroidal asymmetry in the heat flow to the outer strike point. Toroidal variation in impurity line emission is qualitatively matched in the scrape-off layer above the strike point, however kinetic corrections and cross-field drifts are likely required to quantitatively reproduce impurity behavior in the private flux region and electron temperatures and densities directly in front of the target.

  10. Efficient and reproducible myogenic differentiation from human iPS cells: prospects for modeling Miyoshi Myopathy in vitro.

    Directory of Open Access Journals (Sweden)

    Akihito Tanaka

    Full Text Available The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70-90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs.

  11. Efficient and Reproducible Myogenic Differentiation from Human iPS Cells: Prospects for Modeling Miyoshi Myopathy In Vitro

    Science.gov (United States)

    Tanaka, Akihito; Woltjen, Knut; Miyake, Katsuya; Hotta, Akitsu; Ikeya, Makoto; Yamamoto, Takuya; Nishino, Tokiko; Shoji, Emi; Sehara-Fujisawa, Atsuko; Manabe, Yasuko; Fujii, Nobuharu; Hanaoka, Kazunori; Era, Takumi; Yamashita, Satoshi; Isobe, Ken-ichi; Kimura, En; Sakurai, Hidetoshi

    2013-01-01

    The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70–90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs. PMID:23626698

  12. How well can DFT reproduce key interactions in Ziegler-Natta systems?

    KAUST Repository

    Correa, Andrea; Bahri-Laleh, Naeimeh; Cavallo, Luigi

    2013-01-01

    The performance of density functional theory in reproducing some of the main interactions occurring in MgCl2-supported Ziegler-Natta catalytic systems is assessed. Eight model systems, representative of key interactions occurring in Ziegler

  13. Opening up the black box: an introduction to qualitative research methods in anaesthesia.

    Science.gov (United States)

    Shelton, C L; Smith, A F; Mort, M

    2014-03-01

    Qualitative research methods are a group of techniques designed to allow the researcher to understand phenomena in their natural setting. A wide range is used, including focus groups, interviews, observation, and discourse analysis techniques, which may be used within research approaches such as grounded theory or ethnography. Qualitative studies in the anaesthetic setting have been used to define excellence in anaesthesia, explore the reasons behind drug errors, investigate the acquisition of expertise and examine incentives for hand-hygiene in the operating theatre. Understanding how and why people act the way they do is essential for the advancement of anaesthetic practice, and rigorous, well-designed qualitative research can generate useful data and important insights. Meticulous social scientific methods, transparency, reproducibility and reflexivity are markers of quality in qualitative research. Tools such as the consolidated criteria for reporting qualitative research checklist and the critical appraisal skills programme are available to help authors, reviewers and readers unfamiliar with qualitative research assess its merits. © 2013 The Association of Anaesthetists of Great Britain and Ireland.

  14. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence to work collectively, but
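
    As a rough illustration of the idea (not the actual QSAR-ML schema, whose element names are defined by the published specification), a dataset descriptor only needs to record the structures, the ontology identifier of each descriptor, and the exact software implementation and version used to compute it. The sketch below, in Python with invented field names and values, captures that minimum.

        from dataclasses import dataclass, asdict, field
        import json

        @dataclass
        class DescriptorRef:
            ontology_id: str     # identifier from the descriptor ontology (placeholder value)
            software: str        # implementation providing the descriptor
            version: str         # exact version, so the calculation can be repeated

        @dataclass
        class QsarDataset:
            structures: list                                  # e.g. SMILES strings
            descriptors: list = field(default_factory=list)

            def to_json(self) -> str:
                # A real exchange format would be validated XML (QSAR-ML); JSON is
                # used here only to keep the illustration short.
                return json.dumps(asdict(self), indent=2)

        ds = QsarDataset(
            structures=["CCO", "c1ccccc1"],                        # illustrative molecules
            descriptors=[DescriptorRef("descriptor-ontology:0001",  # hypothetical ontology ID
                                       "CDK", "1.4.19")],
        )
        print(ds.to_json())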

  15. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusion regarding descriptors by defining them crisply. This makes it easy to join

  16. Reproducibility of central lumbar vertebral BMD

    International Nuclear Information System (INIS)

    Chan, F.; Pocock, N.; Griffiths, M.; Majerovic, Y.; Freund, J.

    1997-01-01

    Full text: Lumbar vertebral bone mineral density (BMD) using dual X-ray absorptiometry (DXA) has generally been calculated from a region of interest which includes the entire vertebral body. Although this region excludes part of the transverse processes, it does include the outer cortical shell of the vertebra. Recent software has been devised to calculate BMD in a central vertebral region of interest which excludes the outer cortical envelope. Theoretically this area may be more sensitive to detecting osteoporosis which affects trabecular bone to a greater extent than cortical bone. Apart from the sensitivity of BMD estimation, the reproducibility of any measurement is important owing to the slow rate of change of bone mass. We have evaluated the reproducibility of this new vertebral region of interest in 23 women who had duplicate lumbar spine DXA scans performed on the same day. The patients were repositioned between each measurement. Central vertebral analysis was performed for L2-L4 and the reproducibility of area, bone mineral content (BMC) and BMD calculated as the coefficient of variation; these values were compared with those from conventional analysis. Thus we have shown that the reproducibility of the central BMD is comparable to the conventional analysis which is essential if this technique is to provide any additional clinical data. The reasons for the decrease in reproducibility of the area and hence BMC requires further investigation
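
    For readers unfamiliar with the precision metric used here, the coefficient of variation for duplicate scans is usually reported as a root-mean-square value across subjects. The following sketch, with purely hypothetical BMD numbers, shows one common way to compute it; it is an illustration of the metric, not the authors' analysis code.

        import numpy as np

        def rms_cv_percent(first_scan, second_scan):
            """Root-mean-square coefficient of variation (%) over duplicate scans."""
            a = np.asarray(first_scan, dtype=float)
            b = np.asarray(second_scan, dtype=float)
            means = (a + b) / 2.0
            sds = np.abs(a - b) / np.sqrt(2.0)   # SD of a duplicate pair
            return 100.0 * np.sqrt(np.mean((sds / means) ** 2))

        # Hypothetical central-region L2-L4 BMD values (g/cm^2), scan and repositioned rescan
        bmd_scan1 = [0.91, 1.02, 0.85, 0.97]
        bmd_scan2 = [0.93, 1.00, 0.88, 0.96]
        print(f"RMS CV = {rms_cv_percent(bmd_scan1, bmd_scan2):.2f}%")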

  17. Three-dimensional surgical modelling with an open-source software protocol: study of precision and reproducibility in mandibular reconstruction with the fibula free flap.

    Science.gov (United States)

    Ganry, L; Quilichini, J; Bandini, C M; Leyder, P; Hersant, B; Meningaud, J P

    2017-08-01

    Very few surgical teams currently use totally independent and free solutions to perform three-dimensional (3D) surgical modelling for osseous free flaps in reconstructive surgery. This study assessed the precision and technical reproducibility of a 3D surgical modelling protocol using free open-source software in mandibular reconstruction with fibula free flaps and surgical guides. Precision was assessed through comparisons of the 3D surgical guide to the sterilized 3D-printed guide, determining accuracy to the millimetre level. Reproducibility was assessed in three surgical cases by volumetric comparison to the millimetre level. For the 3D surgical modelling, a difference of less than 0.1 mm was observed. Almost no deformations were noted; the precision of the free flap modelling was between 0.1 mm and 0.4 mm, and the average precision of the complete reconstructed mandible was less than 1 mm. The open-source software protocol demonstrated high accuracy without complications. However, the precision of the surgical case depends on the surgeon's 3D surgical modelling. Therefore, surgeons need training on the use of this protocol before applying it to surgical cases; this constitutes a limitation. Further studies should address the transfer of expertise. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  18. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm² and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced

  19. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, 17 individual reflections, and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections using eight subjects for noise and speech. The results have shown that the first-order floor and ceiling reflections are likely to individually contribute to the timbre of reproduced speech. For a noise signal, additional reflections from the left sidewall will contribute individually. The level of the reverberant field has been found

  20. A conceptual framework to model long-run qualitative change in the energy system

    OpenAIRE

    Ebersberger, Bernd

    2004-01-01

    A conceptual framework to model long-run qualitative change in the energy system / A. Pyka, B. Ebersberger, H. Hanusch. - In: Evolution and economic complexity / ed. J. Stanley Metcalfe ... - Cheltenham [u.a.] : Elgar, 2004. - S. 191-213

  1. A qualitative model of the salmon life cycle in the context of river rehabilitation

    NARCIS (Netherlands)

    Noble, R.A.A.; Bredeweg, B.; Linnebank, F.; Salles, P.; Cowx, I.G.; Žabkar, J.; Bratko, I.

    2009-01-01

    A qualitative model was developed in Garp3 to capture and formalise knowledge about river rehabilitation and the management of an Atlantic salmon population. The model integrates information about the ecology of the salmon life cycle, the environmental factors that may limit the survival of key life

  2. Scientific Reproducibility in Biomedical Research: Provenance Metadata Ontology for Semantic Annotation of Study Description.

    Science.gov (United States)

    Sahoo, Satya S; Valdez, Joshua; Rueschman, Michael

    2016-01-01

    Scientific reproducibility is key to scientific progress as it allows the research community to build on validated results, protect patients from potentially harmful trial drugs derived from incorrect results, and reduce wastage of valuable resources. The National Institutes of Health (NIH) recently published a systematic guideline titled "Rigor and Reproducibility" for supporting reproducible research studies, which has also been accepted by several scientific journals. These journals will require published articles to conform to these new guidelines. Provenance metadata describes the history or origin of data, and it has long been used in computer science to capture metadata information for ensuring data quality and supporting scientific reproducibility. In this paper, we describe the development of the Provenance for Clinical and healthcare Research (ProvCaRe) framework together with a provenance ontology to support scientific reproducibility by formally modeling a core set of data elements representing the details of a research study. We extend the PROV Ontology (PROV-O), which has been recommended as the provenance representation model by the World Wide Web Consortium (W3C), to represent both (a) data provenance and (b) process provenance. We use 124 study variables from 6 clinical research studies from the National Sleep Research Resource (NSRR) to evaluate the coverage of the provenance ontology. NSRR is the largest repository of NIH-funded sleep datasets, with 50,000 studies from 36,000 participants. The provenance ontology reuses ontology concepts from existing biomedical ontologies, for example the Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), to model the provenance information of research studies. The ProvCaRe framework is being developed as part of the Big Data to Knowledge (BD2K) data provenance project.
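
    The distinction between data provenance and process provenance can be made concrete with a few PROV-O triples. The sketch below uses the Python rdflib library (assumed available) and an invented placeholder namespace standing in for ProvCaRe-specific terms; it is not taken from the ProvCaRe implementation.

        from rdflib import Graph, Literal, Namespace
        from rdflib.namespace import PROV, RDF, RDFS

        PCR = Namespace("http://example.org/provcare#")   # placeholder namespace, not the real one

        g = Graph()
        g.bind("prov", PROV)
        g.bind("pcr", PCR)

        variable = PCR["sleep_study_variable"]    # hypothetical study variable
        raw_data = PCR["raw_recording"]           # hypothetical source data
        scoring  = PCR["scoring_activity"]        # hypothetical processing step

        # Data provenance: the derived study variable traces back to the raw recording.
        g.add((variable, RDF.type, PROV.Entity))
        g.add((raw_data, RDF.type, PROV.Entity))
        g.add((variable, PROV.wasDerivedFrom, raw_data))

        # Process provenance: the derivation itself is recorded as an activity.
        g.add((scoring, RDF.type, PROV.Activity))
        g.add((scoring, PROV.used, raw_data))
        g.add((variable, PROV.wasGeneratedBy, scoring))
        g.add((scoring, RDFS.label, Literal("manual scoring step (hypothetical)")))

        print(g.serialize(format="turtle"))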

  3. Why are models unable to reproduce multi-decadal trends in lower tropospheric baseline ozone levels?

    Science.gov (United States)

    Hu, L.; Liu, J.; Mickley, L. J.; Strahan, S. E.; Steenrod, S.

    2017-12-01

    Assessments of tropospheric ozone radiative forcing rely on accurate model simulations. Parrish et al (2014) found that three chemistry-climate models (CCMs) overestimate present-day O3 mixing ratios and capture only 50% of the observed O3 increase over the last five decades at 12 baseline sites in the northern mid-latitudes, indicating large uncertainties in our understanding of the ozone trends and their implications for radiative forcing. Here we present comparisons of outputs from two chemical transport models (CTMs) - GEOS-Chem and the Global Modeling Initiative model - with O3 observations from the same sites and from the global ozonesonde network. Both CTMs are driven by reanalysis meteorological data (MERRA or MERRA2) and thus are expected to be different in atmospheric transport processes relative to those freely running CCMs. We test whether recent model developments leading to more active ozone chemistry affect the computed ozone sensitivity to perturbations in emissions. Preliminary results suggest these CTMs can reproduce present-day ozone levels but fail to capture the multi-decadal trend since 1980. Both models yield widespread overpredictions of free tropospheric ozone in the 1980s. Sensitivity studies in GEOS-Chem suggest that the model estimate of natural background ozone is too high. We discuss factors that contribute to the variability and trends of tropospheric ozone over the last 30 years, with a focus on intermodel differences in spatial resolution and in the representation of stratospheric chemistry, stratosphere-troposphere exchange, halogen chemistry, and biogenic VOC emissions and chemistry. We also discuss uncertainty in the historical emission inventories used in models, and how these affect the simulated ozone trends.

  4. How to Use Qualitative Analysis to Support a DSA

    International Nuclear Information System (INIS)

    Coutts, D.A.

    2003-01-01

    The use of judgement-based analyses that produce qualitative results can be a very effective method to demonstrate the safety posture of a nuclear facility. Such methodologies are recognized as appropriate through the graded approach established by 10 CFR 830, Nuclear Safety Management, and DOE-STD-3009. To successfully implement judgement-based analysis requires recognition of the uncertainties and biases that may be inherent in this approach. This paper will summarize the common errors that can occur when conducting judgement-based analyses and recommend techniques to improve the reproducibility and accuracy of such qualitative analyses. This paper will examine some of the Apparent and Not-So-Apparent Weaknesses associated with expert judgement and how to minimize these weaknesses. Examples related to the development of Documented Safety Analyses will be presented

  5. On the Bengtsson-Frauendorf cranked-quasiparticle model

    International Nuclear Information System (INIS)

    Pal, K.F.; Nagarajan, M.A.; Rowley, N.

    1989-01-01

    The cranked-quasiparticle model of Bengtsson and Frauendorf (non-self-consistent HFB) is compared with some exact calculations of particles moving in a cranked, deformed mean field but interacting via rotationally invariant two-body forces. In order to make the exact calculations manageable, a single shell is used but despite this small basis the quasiparticle model is shown to have a high degree of success. The usual choice of pair gap is discussed and shown to be good. The general structures of band crossings in the exact calculations are well reproduced and some crossing frequencies are given quantitatively though the odd-particle systems require blocking. Interaction strengths are not well reproduced though some qualitative features, e.g. oscillations, are obtained. These interactions are generally underestimated, an effect which causes the HFB yrast band to behave less collectively than it should. (orig.)

  6. Disease management projects and the Chronic Care Model in action: Baseline qualitative research

    NARCIS (Netherlands)

    B.J. Hipple Walters (Bethany); S.A. Adams (Samantha); A.P. Nieboer (Anna); R.A. Bal (Roland)

    2012-01-01

    Background: Disease management programs, especially those based on the Chronic Care Model (CCM), are increasingly common in the Netherlands. While disease management programs have been well-researched quantitatively and economically, less qualitative research has been done. The overall aim

  7. Inter- and intra-laboratory study to determine the reproducibility of toxicogenomics datasets.

    Science.gov (United States)

    Scott, D J; Devonshire, A S; Adeleye, Y A; Schutte, M E; Rodrigues, M R; Wilkes, T M; Sacco, M G; Gribaldo, L; Fabbri, M; Coecke, S; Whelan, M; Skinner, N; Bennett, A; White, A; Foy, C A

    2011-11-28

    The application of toxicogenomics as a predictive tool for chemical risk assessment has been under evaluation by the toxicology community for more than a decade. However, it predominately remains a tool for investigative research rather than for regulatory risk assessment. In this study, we assessed whether the current generation of microarray technology in combination with an in vitro experimental design was capable of generating robust, reproducible data of sufficient quality to show promise as a tool for regulatory risk assessment. To this end, we designed a prospective collaborative study to determine the level of inter- and intra-laboratory reproducibility between three independent laboratories. All test centres (TCs) adopted the same protocols for all aspects of the toxicogenomic experiment including cell culture, chemical exposure, RNA extraction, microarray data generation and analysis. As a case study, the genotoxic carcinogen benzo[a]pyrene (B[a]P) and the human hepatoma cell line HepG2 were used to generate three comparable toxicogenomic data sets. High levels of technical reproducibility were demonstrated using a widely employed gene expression microarray platform. While differences at the global transcriptome level were observed between the TCs, a common subset of B[a]P responsive genes (n=400 gene probes) was identified at all TCs which included many genes previously reported in the literature as B[a]P responsive. These data show promise that the current generation of microarray technology, in combination with a standard in vitro experimental design, can produce robust data that can be generated reproducibly in independent laboratories. Future work will need to determine whether such reproducible in vitro model(s) can be predictive for a range of toxic chemicals with different mechanisms of action and thus be considered as part of future testing regimes for regulatory risk assessment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  8. Qualitative Knowledge Representations for Intelligent Nuclear Power Plants

    International Nuclear Information System (INIS)

    Cha, Kyoungho; Huh, Young H.

    1993-01-01

    Qualitative Physics (QP) has been applied systematically to the qualitative modeling of physical systems for the past two decades. Designing intelligent systems for nuclear power plants (NPPs) requires an efficient representation of qualitative knowledge about the behavior and structure of an NPP or its components. A novel representation of qualitative knowledge also enables intelligent systems to derive meaningful conclusions from incomplete or uncertain knowledge of plant behavior. We look mainly into representative QP work on nuclear applications and the representation of qualitative knowledge for diagnostic models, the qualitative simulation of an NPP operator's mental model, and the qualitative interpretation of raw measured data from an NPP. We present the challenging areas for QP applications in the nuclear industry. QP technology will make NPPs more intelligent

  9. Reproducibility of radionuclide gastroesophageal reflux studies using quantitative parameters and potential role of quantitative assessment in follow-up

    International Nuclear Information System (INIS)

    Fatima, S.; Khursheed, K.; Nasir, W.; Saeed, M.A.; Fatmi, S.; Jafri, S.; Asghar, S.

    2004-01-01

    Radionuclide gastroesophageal reflux studies have been widely used in the assessment of gastroesophageal reflux disease (GERD) in infants and children. Various qualitative and quantitative parameters have been used for the interpretation of reflux studies, but there is little consensus on the use of these parameters in routine gastroesophageal reflux scintigraphic studies. The aim of this study was to evaluate the methodological issues underlying the qualitative and quantitative assessment of gastroesophageal reflux and to determine the potential power of the reflux index calculation in follow-up assessment of reflux-positive patients. Methods: A total of 147 patients suffering from recurrent lower respiratory tract infection or asthma and having strong clinical suspicion of GER were recruited in the study. A dynamic scintigraphic study was acquired for 30 minutes after oral administration of 99mTc phytate. Each study was analyzed three times by two nuclear medicine physicians. Clinical symptoms were graded according to predefined criteria and their correlation with reflux severity was assessed. Time-activity curves were generated by drawing ROIs over the esophagus. The reflux index was calculated by the standard formula, and a cut-off value of 4% was used for RI calculation. Reflux indices were used for follow-up assessments in reflux-positive patients. Kappa statistics and the chi-square test were used to evaluate the agreement and concordance between qualitative and quantitative parameters. Results: The overall incidence of reflux in the total study population was 63.94% (94 patients). The kappa values for both qualitative and quantitative parameters showed good agreement for intra- and inter-observer reproducibility (kappa value > 0.75). Concordance between visual analysis and time-activity curves was not observed. The reflux index and visual interpretation showed concordance in the interpretation. The severity of clinical symptoms was directly related to the severity of the reflux observed in the

  10. On the solutions of electrohydrodynamic flow with fractional differential equations by reproducing kernel method

    Directory of Open Access Journals (Sweden)

    Akgül Ali

    2016-01-01

    Full Text Available In this manuscript we investigate electrohydrodynamic flow. For several values of the relevant parameters we show that the approximate solution depends on a reproducing kernel model. The results obtained prove that the reproducing kernel method (RKM) is very effective. We obtain good results without any transformation or discretization. Numerical experiments on test examples show that our proposed schemes are of high accuracy and strongly support the theoretical results.
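
    For orientation, the defining property that gives the method its name is that point evaluation in the underlying Hilbert space is an inner product against the kernel. The equations below are a generic statement of this property and of a typical RKM expansion, not the authors' specific construction for the fractional electrohydrodynamic flow problem.

        % Reproducing property: point evaluation in the Hilbert space H with kernel K
        % is an inner product against the kernel section K(., x).
        \[
          u(x) \;=\; \bigl\langle\, u,\; K(\cdot, x) \,\bigr\rangle_{H}
          \qquad \text{for all } u \in H,\ x \in \Omega .
        \]
        % A typical RKM approximate solution is a finite expansion in kernel-derived
        % basis functions, with the coefficients c_i fixed by imposing the (fractional)
        % differential equation at collocation points:
        \[
          u_n(x) \;=\; \sum_{i=1}^{n} c_i\, \overline{\psi}_i(x).
        \]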

  11. A CFBPN Artificial Neural Network Model for Educational Qualitative Data Analyses: Example of Students' Attitudes Based on Kellerts' Typologies

    Science.gov (United States)

    Yorek, Nurettin; Ugulu, Ilker

    2015-01-01

    In this study, artificial neural networks are suggested as a model that can be "trained" to yield qualitative results out of a huge amount of categorical data. It can be said that this is a new approach applied in educational qualitative data analysis. In this direction, a cascade-forward back-propagation neural network (CFBPN) model was…
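
    A cascade-forward network differs from a plain feed-forward MLP in that the output layer receives the raw inputs in addition to the hidden-layer activations. The sketch below, written in PyTorch with invented layer sizes and data, illustrates only that wiring; it is not the authors' trained model or their Kellert-typology coding scheme.

        import torch
        import torch.nn as nn

        class CascadeForwardNet(nn.Module):
            """Cascade-forward net: the output layer sees the raw inputs as well as
            the hidden activations (a plain MLP would see only the latter)."""
            def __init__(self, n_inputs: int, n_hidden: int, n_outputs: int):
                super().__init__()
                self.hidden = nn.Linear(n_inputs, n_hidden)
                self.output = nn.Linear(n_inputs + n_hidden, n_outputs)

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                h = torch.tanh(self.hidden(x))
                return self.output(torch.cat([x, h], dim=1))

        # Invented sizes: 20 one-hot coded categorical answers in, 4 typology scores out.
        model = CascadeForwardNet(n_inputs=20, n_hidden=8, n_outputs=4)
        fake_responses = torch.rand(32, 20)      # 32 fictitious respondents
        print(model(fake_responses).shape)       # torch.Size([32, 4])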

  12. Universal free school breakfast: a qualitative model for breakfast behaviors

    Directory of Open Access Journals (Sweden)

    Louise eHarvey-Golding

    2015-06-01

    Full Text Available In recent years the provision of school breakfast has increased significantly in the UK. However, research examining the effectiveness of school breakfast is still within relative stages of infancy, and findings to date have been rather mixed. Moreover, previous evaluations of school breakfast schemes have been predominantly quantitative in their methodologies. Presently there are few qualitative studies examining the subjective perceptions and experiences of stakeholders, and thereby an absence of knowledge regarding the sociocultural impacts of school breakfast. The purpose of this study was to investigate the beliefs, views and attitudes, and breakfast consumption behaviors, among key stakeholders, served by a council-wide universal free school breakfast initiative, within the North West of England, UK. A sample of children, parents and school staff were recruited from three primary schools, participating in the universal free school breakfast scheme, to partake in semi-structured interviews and small focus groups. A Grounded Theory analysis of the data collected identified a theoretical model of breakfast behaviors, underpinned by the subjective perceptions and experiences of these key stakeholders. The model comprises of three domains relating to breakfast behaviors, and the internal and external factors that are perceived to influence breakfast behaviors, among children, parents and school staff. Findings were validated using triangulation methods, member checks and inter-rater reliability measures. In presenting this theoretically grounded model for breakfast behaviors, this paper provides a unique qualitative insight into the breakfast consumption behaviors and barriers to breakfast consumption, within a socioeconomically deprived community, participating in a universal free school breakfast intervention program.

  13. A 40-year accumulation dataset for Adelie Land, Antarctica and its application for model validation

    Energy Technology Data Exchange (ETDEWEB)

    Agosta, Cecile; Favier, Vincent [UJF-Grenoble 1 / CNRS, Laboratoire de Glaciologie et de Geophysique de l'Environnement UMR 5183, Saint Martin d'Heres (France); Genthon, Christophe; Gallee, Hubert; Krinner, Gerhard [CNRS / UJF-Grenoble 1, Laboratoire de Glaciologie et de Geophysique de l'Environnement UMR 5183, Saint Martin d'Heres (France); Lenaerts, Jan T.M.; Broeke, Michiel R. van den [Utrecht University, Institute for Marine and Atmospheric Research Utrecht (Netherlands)

    2012-01-15

    The GLACIOCLIM-SAMBA (GS) Antarctic accumulation monitoring network, which extends from the coast of Adelie Land to the Antarctic plateau, has been surveyed annually since 2004. The network includes a 156-km stake-line from the coast inland, along which accumulation shows high spatial and interannual variability with a mean value of 362 mm water equivalent a⁻¹. In this paper, this accumulation is compared with older accumulation reports from between 1971 and 1991. The mean and annual standard deviation and the km-scale spatial pattern of accumulation were seen to be very similar in the older and more recent data. The data did not reveal any significant accumulation trend over the last 40 years. The ECMWF analysis-based forecasts (ERA-40 and ERA-Interim), a stretched-grid global general circulation model (LMDZ4) and three regional circulation models (PMM5, MAR and RACMO2), all with high resolution over Antarctica (27-125 km), were tested against the GS reports. All of them except MAR qualitatively reproduced the meso-scale spatial pattern of the annual-mean accumulation. MAR significantly underestimated mean accumulation, while LMDZ4 and RACMO2 overestimated it. ERA-40 and the regional models that use ERA-40 as lateral boundary condition qualitatively reproduced the chronology of interannual variability but underestimated the magnitude of interannual variations. Two widely used climatologies for Antarctic accumulation agreed well with the mean GS data. The model-based climatology was also able to reproduce the observed spatial pattern. These data thus provide new stringent constraints on models and other large-scale evaluations of the Antarctic accumulation. (orig.)

  14. Ratio-scaling of listener preference of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian

    2005-01-01

    … a non-trivial assumption in the case of complex spatial sounds. In the present study the Bradley-Terry-Luce (BTL) model was employed to investigate the unidimensionality of preference judgments made by 40 listeners on multichannel reproduced sound. Short musical excerpts were played back in eight reproduction modes (mono … music). As a main result, the BTL model was found to predict the choice frequencies well. This implies that listeners were able to integrate the complex nature of the sounds into a unidimensional preference judgment. It further implies the existence of a preference scale on which the reproduction modes...
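
    The BTL model assigns each reproduction mode a positive worth parameter such that the probability of preferring mode i over mode j is p_i/(p_i + p_j); fitting these parameters to paired-comparison counts yields the unidimensional preference scale referred to above. The sketch below fits the model with Hunter's minorization-maximization updates on invented win counts for four modes; it is not the study's data or analysis code.

        import numpy as np

        # wins[i, j] = number of times mode i was preferred over mode j
        # (invented counts, four modes only; every pair compared 20 times).
        wins = np.array([
            [ 0, 12,  9,  5],
            [ 8,  0, 11,  6],
            [11,  9,  0,  7],
            [15, 14, 13,  0],
        ], dtype=float)

        n = wins.shape[0]
        pair_totals = wins + wins.T          # comparisons per pair
        total_wins = wins.sum(axis=1)        # wins per mode
        p = np.ones(n)                       # BTL worth parameters (defined up to scale)

        # Hunter's (2004) minorization-maximization updates for the Bradley-Terry model.
        for _ in range(500):
            denom = np.array([
                sum(pair_totals[i, j] / (p[i] + p[j]) for j in range(n) if j != i)
                for i in range(n)
            ])
            p = total_wins / denom
            p /= p.sum()                     # fix the arbitrary scale

        print("Estimated preference scale:", np.round(p, 3))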

  15. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  16. Qualitative knowledge engineering for nuclear applications

    International Nuclear Information System (INIS)

    Kim, Jae H.; Kim, Ko R.; Lee, Jae C.

    1996-01-01

    After the TMI nuclear power plant accident, the two topics of plant safety and operational efficiency became more important areas for artificial intelligence, and they have different characteristics. Qualitative deep models are a promising AI technology that can overcome several handicaps of existing expert systems, such as the lack of common-sense reasoning. The application of AI to large and complex systems like nuclear power plants is typically and effectively done through a module-based hierarchical system, in which each module has to be built with a suitable AI technique. Through the experience of hierarchical system construction, we aimed to develop basic AI application schemes for power plant safety and operational efficiency as well as basic technologies for autonomous power plants. The goal of the research is to develop qualitative reasoning technologies for nuclear power plants. For this purpose, qualitative modeling technologies and qualitative behaviour prediction technologies for the power plant were developed. In addition, the feasibility of applying typical qualitative reasoning technologies to power plants was studied. The goal of the application is to develop intelligent control technologies and support technologies for power plants. For these purposes, we analyzed the operation of power plants according to their operational purpose: power generation operation, shut-down, and start-up operation. As a result, qualitative models of basic components were sketched, including pipes, valves, pumps and heat exchangers. Finally, plant behaviour prediction technologies based on a qualitative plant heat transfer model and design support technologies based on second-order differential equations were developed. For the construction of an AI system for power plants, we studied mixed module-based hierarchical software. As a testbed, we considered the spent fuel system and the feedwater system. We also studied the integration

  17. Prognostic Value and Reproducibility of Pretreatment CT Texture Features in Stage III Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Fried, David V. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States); Tucker, Susan L. [Department of Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhou, Shouhao [Division of Quantitative Sciences, Department of Bioinformatics and Computational Biology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liao, Zhongxing [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Mawlawi, Osama [Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States); Ibbott, Geoffrey [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States); Court, Laurence E., E-mail: LECourt@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Graduate School of Biomedical Sciences, The University of Texas Health Science Center at Houston, Houston, Texas (United States)

    2014-11-15

    Purpose: To determine whether pretreatment CT texture features can improve patient risk stratification beyond conventional prognostic factors (CPFs) in stage III non-small cell lung cancer (NSCLC). Methods and Materials: We retrospectively reviewed 91 cases with stage III NSCLC treated with definitive chemoradiation therapy. All patients underwent pretreatment diagnostic contrast enhanced computed tomography (CE-CT) followed by 4-dimensional CT (4D-CT) for treatment simulation. We used the average-CT and expiratory (T50-CT) images from the 4D-CT along with the CE-CT for texture extraction. Histogram, gradient, co-occurrence, gray tone difference, and filtration-based techniques were used for texture feature extraction. Penalized Cox regression implementing cross-validation was used for covariate selection and modeling. Models incorporating texture features from the 33 image types and CPFs were compared to those with models incorporating CPFs alone for overall survival (OS), local-regional control (LRC), and freedom from distant metastases (FFDM). Predictive Kaplan-Meier curves were generated using leave-one-out cross-validation. Patients were stratified based on whether their predicted outcome was above or below the median. Reproducibility of texture features was evaluated using test-retest scans from independent patients and quantified using concordance correlation coefficients (CCC). We compared models incorporating the reproducibility seen on test-retest scans to our original models and determined the classification reproducibility. Results: Models incorporating both texture features and CPFs demonstrated a significant improvement in risk stratification compared to models using CPFs alone for OS (P=.046), LRC (P=.01), and FFDM (P=.005). The average CCCs were 0.89, 0.91, and 0.67 for texture features extracted from the average-CT, T50-CT, and CE-CT, respectively. Incorporating reproducibility within our models yielded 80.4% (±3.7% SD), 78.3% (±4.0% SD), and 78
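
    The concordance correlation coefficient quoted above is Lin's CCC, which penalizes both poor correlation and systematic shifts between test and retest values. A minimal sketch of the computation, on hypothetical feature values, is given below; it is an illustration of the metric, not the authors' code.

        import numpy as np

        def lins_ccc(x, y):
            """Lin's concordance correlation coefficient between two measurements."""
            x = np.asarray(x, dtype=float)
            y = np.asarray(y, dtype=float)
            sxy = np.mean((x - x.mean()) * (y - y.mean()))
            return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        # Hypothetical values of one texture feature on test and retest scans.
        test   = [0.42, 0.55, 0.61, 0.38, 0.50]
        retest = [0.44, 0.53, 0.63, 0.40, 0.47]
        print(f"CCC = {lins_ccc(test, retest):.3f}")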

  18. Prognostic Value and Reproducibility of Pretreatment CT Texture Features in Stage III Non-Small Cell Lung Cancer

    International Nuclear Information System (INIS)

    Fried, David V.; Tucker, Susan L.; Zhou, Shouhao; Liao, Zhongxing; Mawlawi, Osama; Ibbott, Geoffrey; Court, Laurence E.

    2014-01-01

    Purpose: To determine whether pretreatment CT texture features can improve patient risk stratification beyond conventional prognostic factors (CPFs) in stage III non-small cell lung cancer (NSCLC). Methods and Materials: We retrospectively reviewed 91 cases with stage III NSCLC treated with definitive chemoradiation therapy. All patients underwent pretreatment diagnostic contrast enhanced computed tomography (CE-CT) followed by 4-dimensional CT (4D-CT) for treatment simulation. We used the average-CT and expiratory (T50-CT) images from the 4D-CT along with the CE-CT for texture extraction. Histogram, gradient, co-occurrence, gray tone difference, and filtration-based techniques were used for texture feature extraction. Penalized Cox regression implementing cross-validation was used for covariate selection and modeling. Models incorporating texture features from the 33 image types and CPFs were compared to those with models incorporating CPFs alone for overall survival (OS), local-regional control (LRC), and freedom from distant metastases (FFDM). Predictive Kaplan-Meier curves were generated using leave-one-out cross-validation. Patients were stratified based on whether their predicted outcome was above or below the median. Reproducibility of texture features was evaluated using test-retest scans from independent patients and quantified using concordance correlation coefficients (CCC). We compared models incorporating the reproducibility seen on test-retest scans to our original models and determined the classification reproducibility. Results: Models incorporating both texture features and CPFs demonstrated a significant improvement in risk stratification compared to models using CPFs alone for OS (P=.046), LRC (P=.01), and FFDM (P=.005). The average CCCs were 0.89, 0.91, and 0.67 for texture features extracted from the average-CT, T50-CT, and CE-CT, respectively. Incorporating reproducibility within our models yielded 80.4% (±3.7% SD), 78.3% (±4.0% SD), and 78

  19. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  20. Does systematic variation improve the reproducibility of animal experiments?

    NARCIS (Netherlands)

    Jonker, R.M.; Guenther, A.; Engqvist, L.; Schmoll, T.

    2013-01-01

    Reproducibility of results is a fundamental tenet of science. In this journal, Richter et al.1 tested whether systematic variation in experimental conditions (heterogenization) affects the reproducibility of results. Comparing this approach with the current standard of ensuring reproducibility

  1. TU-AB-BRC-05: Creation of a Monte Carlo TrueBeam Model by Reproducing Varian Phase Space Data

    International Nuclear Information System (INIS)

    O’Grady, K; Davis, S; Seuntjens, J

    2016-01-01

    Purpose: To create a Varian TrueBeam 6 MV FFF Monte Carlo model using BEAMnrc/EGSnrc that accurately reproduces the Varian representative dataset, followed by tuning the model’s source parameters to accurately reproduce in-house measurements. Methods: A BEAMnrc TrueBeam model for 6 MV FFF has been created by modifying a validated 6 MV Varian CL21EX model. Geometric dimensions and materials were adjusted in a trial and error approach to match the fluence and spectra of TrueBeam phase spaces output by the Varian VirtuaLinac. Once the model’s phase space matched Varian’s counterpart using the default source parameters, it was validated to match 10 × 10 cm² Varian representative data obtained with the IBA CC13. The source parameters were then tuned to match in-house 5 × 5 cm² PTW microDiamond measurements. All dose to water simulations included detector models to include the effects of volume averaging and the non-water equivalence of the chamber materials, allowing for more accurate source parameter selection. Results: The Varian phase space spectra and fluence were matched with excellent agreement. The in-house model’s PDD agreement with CC13 TrueBeam representative data was within 0.9% local percent difference beyond the first 3 mm. Profile agreement at 10 cm depth was within 0.9% local percent difference and 1.3 mm distance-to-agreement in the central axis and penumbra regions, respectively. Once the source parameters were tuned, PDD agreement with microDiamond measurements was within 0.9% local percent difference beyond 2 mm. The microDiamond profile agreement at 10 cm depth was within 0.6% local percent difference and 0.4 mm distance-to-agreement in the central axis and penumbra regions, respectively. Conclusion: An accurate in-house Monte Carlo model of the Varian TrueBeam was achieved independently of the Varian phase space solution and was tuned to in-house measurements. KO acknowledges partial support by the CREATE Medical Physics Research

  2. Qualitative analysis in reliability and safety studies

    International Nuclear Information System (INIS)

    Worrell, R.B.; Burdick, G.R.

    1976-01-01

    The qualitative evaluation of system logic models is described as it pertains to assessing the reliability and safety characteristics of nuclear systems. Qualitative analysis of system logic models, i.e., models couched in an event (Boolean) algebra, is defined, and the advantages inherent in qualitative analysis are explained. Certain qualitative procedures that were developed as a part of fault-tree analysis are presented for illustration. Five fault-tree analysis computer-programs that contain a qualitative procedure for determining minimal cut sets are surveyed. For each program the minimal cut-set algorithm and limitations on its use are described. The recently developed common-cause analysis for studying the effect of common-causes of failure on system behavior is explained. This qualitative procedure does not require altering the fault tree, but does use minimal cut sets from the fault tree as part of its input. The method is applied using two different computer programs. 25 refs
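
    A minimal cut set is a smallest set of basic events whose joint occurrence guarantees the top event. The generic textbook expansion is sketched below on an invented two-pump fault tree; none of the surveyed programs is being reproduced here, only the underlying qualitative procedure.

        # A fault tree as nested ('AND', ...) / ('OR', ...) tuples with basic events as strings.
        # The tree itself is invented for illustration.
        TOP = ('OR',
               ('AND', 'pump_A_fails', 'pump_B_fails'),
               ('AND', 'offsite_power_lost', ('OR', 'diesel_fails', 'operator_error')))

        def cut_sets(node):
            """Expand a gate into (not necessarily minimal) cut sets."""
            if isinstance(node, str):
                return [frozenset([node])]
            gate, *children = node
            expanded = [cut_sets(child) for child in children]
            if gate == 'OR':                       # any child's cut set suffices
                return [cs for sets in expanded for cs in sets]
            if gate == 'AND':                      # need one cut set from every child
                combos = [frozenset()]
                for sets in expanded:
                    combos = [acc | cs for acc in combos for cs in sets]
                return combos
            raise ValueError(f"unknown gate {gate!r}")

        def minimal(sets):
            """Drop any cut set that strictly contains another one (absorption)."""
            unique = set(sets)
            return sorted((s for s in unique if not any(t < s for t in unique)), key=sorted)

        for mcs in minimal(cut_sets(TOP)):
            print(sorted(mcs))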

  3. Qualitative models to predict impacts of human interventions in a wetland ecosystem

    Directory of Open Access Journals (Sweden)

    S. Loiselle

    2002-07-01

    Full Text Available The large shallow wetlands that dominate much of the South American continent are rich in biodiversity and complexity. Many of these undamaged ecosystems are presently being examined for their potential economic utility, putting pressure on local authorities and the conservation community to find ways of correctly utilising the available natural resources without compromising ecosystem functioning and overall integrity. Contrary to many northern hemisphere ecosystems, there have been few long-term ecological studies of these systems, leading to a lack of quantitative data on which to construct ecological or resource-use models. As a result, decision makers, even well-meaning ones, have difficulty in determining whether particular economic activities can potentially cause significant damage to the ecosystem and how one should go about monitoring the impacts of such activities. While the direct impact of many activities is often known, the secondary indirect impacts are usually less clear and can depend on local ecological conditions.

    The use of qualitative models is a helpful tool to highlight potential feedback mechanisms and secondary effects of management action on ecosystem integrity. The harvesting of a single, apparently abundant, species can have indirect secondary effects on key trophic and abiotic compartments. In this paper, loop model analysis is used to qualitatively examine secondary effects of potential economic activities in a large wetland area in northeast Argentina, the Esteros del Ibera. Based on interaction with local actors together with observed ecological information, loop models were constructed to reflect relationships between biotic and abiotic compartments. A series of analyses were made to study the effect of different economic scenarios on key ecosystem compartments. Important impacts on key biotic compartments (phytoplankton, zooplankton, ichthyofauna, aquatic macrophytes and on the abiotic environment
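
    One common formalisation of such loop models is a sign-structured community matrix, from which the qualitative response of every compartment to a sustained (press) perturbation of another can be read off. The sketch below uses invented compartments and placeholder magnitudes; only the signs of the output are meant to be interpreted, and it is not the authors' Esteros del Ibera model.

        import numpy as np

        # Sign-structured community matrix A: A[i, j] is the effect of compartment j on
        # compartment i.  Compartments and magnitudes are invented placeholders:
        # 0 = aquatic macrophytes, 1 = grazing fish, 2 = piscivorous fish.
        A = np.array([
            [-1.0, -0.5,  0.0],
            [ 0.5, -1.0, -0.5],
            [ 0.0,  0.5, -1.0],
        ])

        # At a stable equilibrium, the response of compartment i to a sustained (press)
        # increase in the growth rate of compartment j is proportional to (-A^{-1})[i, j];
        # only the signs are read qualitatively.
        response_signs = np.sign(np.round(-np.linalg.inv(A), 10))
        print(response_signs)
        # Column j then predicts the direct and indirect effects of, e.g., harvesting
        # compartment j (a negative press reverses the signs in that column).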

  4. A Reliable and Reproducible Model for Assessing the Effect of Different Concentrations of α-Solanine on Rat Bone Marrow Mesenchymal Stem Cells

    Directory of Open Access Journals (Sweden)

    Adriana Ordóñez-Vásquez

    2017-01-01

    Full Text Available Alpha-solanine (α-solanine) is a glycoalkaloid present in potato (Solanum tuberosum). It has been of particular interest because of its toxicity and potential teratogenic effects that include abnormalities of the central nervous system, such as exencephaly, encephalocele, and anophthalmia. Various types of cell culture have been used as experimental models to determine the effect of α-solanine on cell physiology. The morphological changes in the mesenchymal stem cell upon exposure to α-solanine have not been established. This study aimed to describe a reliable and reproducible model for assessing the structural changes induced by exposure of mouse bone marrow mesenchymal stem cells (MSCs) to different concentrations of α-solanine for 24 h. The results demonstrate that nonlethal concentrations of α-solanine (2–6 μM) changed the morphology of the cells, including an increase in the number of nucleoli, suggesting elevated protein synthesis, and the formation of spicules. In addition, treatment with α-solanine reduced the number of adherent cells and the formation of colonies in culture. Immunophenotypic characterization and staining of MSCs are proposed as a reproducible method that allows description of cells exposed to the glycoalkaloid α-solanine.

  5. Reproducibility of image quality for moving objects using respiratory-gated computed tomography. A study using a phantom model

    International Nuclear Information System (INIS)

    Fukumitsu, Nobuyoshi; Ishida, Masaya; Terunuma, Toshiyuki

    2012-01-01

    Investigating the reproducibility of computed tomography (CT) image quality is essential in respiratory-gated radiation treatment planning for radiotherapy of movable tumors. Seven series of regular and six series of irregular respiratory motions were performed using a thorax dynamic phantom. For the regular respiratory motions, the respiratory cycle was changed from 2.5 to 4 s and the amplitude was changed from 4 to 10 mm. For the irregular respiratory motions, a cycle of 2.5 to 4 s or an amplitude of 4 to 10 mm was added to the base data (i.e., 3.5-s cycle, 6-mm amplitude) every three cycles. Images of the object were acquired six times using respiratory-gated data acquisition. The volume of the object was calculated, and the reproducibility of the volume was assessed based on its variability. The registered images of the object were added, and the reproducibility of the shape was assessed based on the degree of overlap of the objects. The variability in volume and shape differed significantly as the respiratory cycle changed for the regular respiratory motions. For irregular respiratory motion, shape reproducibility was even poorer, and the percentage of overlap among the six images was 35.26% in the 2.5- and 3.5-s cycle mixed group. Amplitude changes did not produce significant differences in the variability of the volumes and shapes. Respiratory cycle changes reduced the reproducibility of image quality in respiratory-gated CT. (author)

  6. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  7. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open-source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  8. Evaluation of recent quantitative magnetospheric magnetic field models

    International Nuclear Information System (INIS)

    Walker, R.J.

    1976-01-01

    Recent quantitative magnetospheric field models contain many features not found in earlier models. Magnetopause models which include the effects of the dipole tilt were presented. More realistic models of the tail field include tail currents which close on the magnetopause, cross-tail currents of finite thickness, and cross-tail current models which model the position of the neutral sheet as a function of tilt. Finally, models have attempted to calculate the field of currents distributed in the inner magnetosphere. As the purpose of a magnetospheric model is to provide a mathematical description of the field that reasonably reproduces the observed magnetospheric field, several recent models were compared with the observed ΔB (B_observed - B_main field) contours. Models containing only contributions from magnetopause and tail current systems are able to reproduce the observed quiet time field only in an extremely qualitative way. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. However, the distributed current models are valid only for zero tilt. Even the models which reproduce the average observed field reasonably well may not give physically reasonable field gradients. Three of the models evaluated contain regions in the near tail in which the field gradient reverses direction. One region in which all the models fall short is that around the polar cusp, though most can be used to calculate the position of the last closed field line reasonably well.

  9. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  10. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle's Capacity to Generate Force.

    Science.gov (United States)

    Call, Jarrod A; Lowe, Dawn A

    2016-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allows researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury.

  11. Reproducibility of precipitation distributions over extratropical continental regions in the CMIP5

    Science.gov (United States)

    Hirota, Nagio; Takayabu, Yukari

    2013-04-01

    The reproducibility of precipitation distributions over extratropical continental regions by CMIP5 climate models in their historical runs is evaluated, in comparison with GPCP (V2.2), CMAP (V0911), and the daily gridded gauge data APHRODITE. Surface temperature, cloud radiative forcing, and atmospheric circulations are also compared with observations from CRU-UEA, CERES, and the ERA-Interim/ERA40/JRA reanalysis data. It is shown that many CMIP5 models underestimate and overestimate summer precipitation over West and East Eurasia, respectively. These precipitation biases correspond to moisture transport associated with a cyclonic circulation bias over the whole continent of Eurasia. Meanwhile, many models underestimate cloud over the Eurasian continent, and the associated shortwave cloud radiative forcing results in a significant warm bias. Evaporation feedback amplifies the warm bias over West Eurasia. These processes consistently explain the precipitation biases over the Eurasian continent in summer. We also examined the reproducibility of winter precipitation, but robust results have not yet been obtained because of the large observational uncertainty associated with the adjustment of snow measurements in windy conditions. Better observational data sets are necessary for further model validation. Acknowledgment: This study is supported by the PMM RA of JAXA, the Green Network of Excellence (GRENE) Program of the Ministry of Education, Culture, Sports, Science and Technology, Japan, and the Environment Research and Technology Development Fund (A-1201) of the Ministry of the Environment, Japan.

  12. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  13. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude quotations for both artificial and natural reflectors was studied for several combinations of instrument and search unit, all being of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (95% confidence interval). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study. [fr]

  14. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Vol. 539, No. 7628 (2016), p. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords: reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  15. Model of physico-chemical effect on flow accelerated corrosion in power plant

    International Nuclear Information System (INIS)

    Fujiwara, Kazutoshi; Domae, Masafumi; Yoneda, Kimitoshi; Inada, Fumio

    2011-01-01

    Highlights: → Model of chemical effect on FAC was developed. → Equation to evaluate the dissolved oxygen concentration for FAC suppression was derived. → The model explains qualitatively the effect of parameters on FAC rate. → Diffusion of soluble species well reproduces the unique FAC behavior. - Abstract: Flow accelerated corrosion (FAC) is caused by the accelerated dissolution of the protective oxide film under conditions of high flow rate and has been one of the most important subjects in fossil and nuclear power plants. The dominant factors of FAC are water chemistry, material, and fluid dynamics. Understanding the thinning mechanism is very important for estimating the quantitative effects of the dominant factors on FAC. In this study, a novel model of the chemical effect on FAC under the steady-state condition was developed in consideration of the diffusion of soluble iron and chromium species, dissolved hydrogen, and dissolved oxygen. The formula to evaluate the critical concentration of dissolved oxygen for FAC suppression was derived. The present model reproduced qualitatively the effect of major environmental parameters on the FAC rate. The model could explain the following facts. (1) The FAC rate shows a peak around 413 K. (2) The FAC rate decreases with an increase in Cr content. (3) The FAC rate decreases with an increase in pH. (4) The FAC rate decreases with an increase in dissolved oxygen concentration. (5) The maximum of the critical dissolved oxygen concentration is observed around 353 K. (6) The critical dissolved oxygen concentration decreases with an increase in pH. We conclude that the diffusion of soluble species from the saturated layer under the steady-state condition well reproduces the unique FAC behavior with variation of water chemistry parameters.
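
    The core mechanism invoked here, dissolution limited by diffusion of soluble iron from the saturated layer at the oxide-water interface into the bulk flow, can be illustrated with a generic mass-transfer sketch. The Berger-Hau-type Sherwood correlation and all numerical values below are illustrative assumptions, not the model developed in the paper.

```python
def fac_rate_mass_transfer(c_sat, c_bulk, velocity, diameter, diffusivity, kin_visc):
    """Generic mass-transfer-limited dissolution rate (illustration only).

    Rate = k_m * (c_sat - c_bulk), with the mass-transfer coefficient k_m taken
    from a Berger-Hau-type pipe-flow correlation Sh = 0.0165 Re^0.86 Sc^0.33
    (an assumed correlation, not the formulation used in the paper).
    """
    re = velocity * diameter / kin_visc           # Reynolds number
    sc = kin_visc / diffusivity                   # Schmidt number
    sh = 0.0165 * re**0.86 * sc**0.33             # Sherwood number
    k_m = sh * diffusivity / diameter             # mass-transfer coefficient (m/s)
    return k_m * (c_sat - c_bulk)                 # dissolution flux (kg m^-2 s^-1)

# Illustrative numbers: higher velocity raises k_m and hence the thinning rate,
# while a bulk concentration approaching saturation suppresses it.
print(fac_rate_mass_transfer(c_sat=1e-3, c_bulk=1e-4, velocity=3.0,
                             diameter=0.1, diffusivity=1e-9, kin_visc=3e-7))
```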

  16. Impact of SLA assimilation in the Sicily Channel Regional Model: model skills and mesoscale features

    Directory of Open Access Journals (Sweden)

    A. Olita

    2012-07-01

    Full Text Available The impact of the assimilation of MyOcean sea level anomaly along-track data on the analyses of the Sicily Channel Regional Model was studied. The numerical model has a resolution of 1/32° and is capable of reproducing mesoscale and sub-mesoscale features. The impact of the SLA assimilation is studied by comparing a simulation (SIM), which does not assimilate data, with an analysis (AN) assimilating SLA along-track multi-mission data produced in the framework of the MyOcean project. The quality of the analysis was evaluated by computing the RMSE of the misfits between the analysis background and the observations (sea level before assimilation). A qualitative evaluation of the ability of the analyses to reproduce mesoscale structures is accomplished by comparing model results with ocean colour and SST satellite data, able to detect such features on the ocean surface. CTD profiles allowed us to evaluate the impact of the SLA assimilation along the water column. We found a significant improvement for the AN solution in terms of SLA RMSE with respect to SIM (the averaged RMSE of AN SLA misfits over 2 years is about 0.5 cm smaller than for SIM). Comparison with CTD data shows a questionable improvement produced by the assimilation process in terms of vertical features: AN is better in temperature, while for salinity it becomes worse than SIM at the surface. This suggests that a better a-priori description of the vertical error covariances would be desirable. The qualitative comparison of the simulation and analyses with synoptic satellite independent data proves that SLA assimilation allows some dynamical features (above all the circulation in the Ionian portion of the domain) and mesoscale structures, otherwise misplaced or neglected by SIM, to be correctly reproduced. Such mesoscale changes also imply that the eddy momentum fluxes (i.e. Reynolds stresses) show major changes in the Ionian area. Changes in Reynolds stresses reflect a different pumping of eastward momentum from the eddy to
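
    The skill metric used above, the RMSE of background-minus-observation misfits, is simple to compute once the model fields have been interpolated to the along-track observation points; the array names and values below are invented for illustration.

```python
import numpy as np

def misfit_rmse(background_sla, observed_sla):
    """RMSE of (model background - observation) misfits, ignoring missing obs."""
    misfit = np.asarray(background_sla, float) - np.asarray(observed_sla, float)
    misfit = misfit[~np.isnan(misfit)]
    return np.sqrt(np.mean(misfit ** 2))

# Along-track SLA observations vs. model values at the same points (metres).
obs = np.array([0.03, -0.01, 0.02, np.nan, 0.04])
sim = np.array([0.05, -0.02, 0.01, 0.02, 0.06])        # free-running simulation (SIM)
an = np.array([0.035, -0.012, 0.018, 0.021, 0.045])    # analysis with assimilation (AN)
print(misfit_rmse(sim, obs), misfit_rmse(an, obs))     # AN should give the smaller RMSE
```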

  17. Qualitative and quantitative guidelines for the comparison of environmental model predictions

    International Nuclear Information System (INIS)

    Scott, M.

    1995-03-01

    The question of how to assess or compare predictions from a number of models is one of concern in the validation of models, in understanding the effects of different models and model parameterizations on model output, and ultimately in assessing model reliability. Comparison of model predictions with observed data is the basic tool of model validation while comparison of predictions amongst different models provides one measure of model credibility. The guidance provided here is intended to provide qualitative and quantitative approaches (including graphical and statistical techniques) to such comparisons for use within the BIOMOVS II project. It is hoped that others may find it useful. It contains little technical information on the actual methods but several references are provided for the interested reader. The guidelines are illustrated on data from the VAMP CB scenario. Unfortunately, these data do not permit all of the possible approaches to be demonstrated since predicted uncertainties were not provided. The questions considered are concerned with a) intercomparison of model predictions and b) comparison of model predictions with the observed data. A series of examples illustrating some of the different types of data structure and some possible analyses have been constructed. A bibliography of references on model validation is provided. It is important to note that the results of the various techniques discussed here, whether qualitative or quantitative, should not be considered in isolation. Overall model performance must also include an evaluation of model structure and formulation, i.e. conceptual model uncertainties, and results for performance measures must be interpreted in this context. Consider a number of models which are used to provide predictions of a number of quantities at a number of time points. In the case of the VAMP CB scenario, the results include predictions of total deposition of Cs-137 and time dependent concentrations in various
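
    As a minimal illustration of the kind of quantitative comparison discussed in these guidelines, the sketch below computes a few common predicted-versus-observed measures (mean bias, RMSE, and geometric mean bias, which is often used for lognormally distributed environmental data). The numbers are invented and the particular measures are a choice made here, not a BIOMOVS II prescription.

```python
import numpy as np

def comparison_measures(predicted, observed):
    """A few simple measures for comparing model predictions with observations."""
    p, o = np.asarray(predicted, float), np.asarray(observed, float)
    bias = np.mean(p - o)                     # mean error
    rmse = np.sqrt(np.mean((p - o) ** 2))     # root-mean-square error
    gmb = np.exp(np.mean(np.log(p / o)))      # geometric mean bias (requires p, o > 0)
    return {"bias": bias, "rmse": rmse, "geometric_mean_bias": gmb}

# Invented Cs-137 deposition values (kBq/m2) from two models vs. observations.
observed = [12.0, 8.5, 15.0, 6.2]
model_a = [10.5, 9.0, 13.8, 7.0]
model_b = [20.1, 15.3, 27.0, 11.8]
for name, pred in [("A", model_a), ("B", model_b)]:
    print(name, comparison_measures(pred, observed))
```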

  18. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  19. Undefined cellulase formulations hinder scientific reproducibility.

    Science.gov (United States)

    Himmel, Michael E; Abbas, Charles A; Baker, John O; Bayer, Edward A; Bomble, Yannick J; Brunecky, Roman; Chen, Xiaowen; Felby, Claus; Jeoh, Tina; Kumar, Rajeev; McCleary, Barry V; Pletschke, Brett I; Tucker, Melvin P; Wyman, Charles E; Decker, Stephen R

    2017-01-01

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, being a critical tenet of science publishing, experimental reproducibility is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  20. Integrated decision-making about housing, energy and wellbeing: a qualitative system dynamics model.

    Science.gov (United States)

    Macmillan, Alexandra; Davies, Michael; Shrubsole, Clive; Luxford, Naomi; May, Neil; Chiu, Lai Fong; Trutnevyte, Evelina; Bobrova, Yekatherina; Chalabi, Zaid

    2016-03-08

    The UK government has an ambitious goal to reduce carbon emissions from the housing stock through energy efficiency improvements. This single policy goal is a strong driver for change in the housing system, but comes with positive and negative "unintended consequences" across a broad range of outcomes for health, equity and environmental sustainability. The resulting policies are also already experiencing under-performance through a failure to consider housing as a complex system. This research aimed to move from considering disparate objectives of housing policies in isolation to mapping the links between environmental, economic, social and health outcomes as a complex system. We aimed to support a broad range of housing policy stakeholders to improve their understanding of housing as a complex system through a collaborative learning process. We used participatory system dynamics modelling to develop a qualitative causal theory linking housing, energy and wellbeing. Qualitative interviews were followed by two interactive workshops to develop the model, involving representatives from national and local government, housing industries, non-government organisations, communities and academia. More than 50 stakeholders from 37 organisations participated. The process resulted in a shared understanding of wellbeing as it relates to housing; an agreed set of criteria against which to assess to future policy options; and a comprehensive set of causal loop diagrams describing the housing, energy and wellbeing system. The causal loop diagrams cover seven interconnected themes: community connection and quality of neighbourhoods; energy efficiency and climate change; fuel poverty and indoor temperature; household crowding; housing affordability; land ownership, value and development patterns; and ventilation and indoor air pollution. The collaborative learning process and the model have been useful for shifting the thinking of a wide range of housing stakeholders towards a more

  1. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    Full Text Available This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to it twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurement results of the peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between mean TEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with the Peak VO2 uptake.
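
    The two reproducibility tools named here, the intra-class correlation coefficient and the Bland-Altman limits of agreement, can be sketched in a few lines. The ICC form chosen (two-way random effects, absolute agreement, single measurement) and the example numbers are assumptions for illustration, not necessarily the exact variant used in the study.

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement.

    scores: array of shape (n_subjects, k_administrations), e.g. test/retest columns.
    """
    x = np.asarray(scores, float)
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)
    resid = x - x.mean(axis=1, keepdims=True) - x.mean(axis=0, keepdims=True) + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

def bland_altman_limits(test, retest):
    """Mean difference and 95% limits of agreement between two administrations."""
    diff = np.asarray(test, float) - np.asarray(retest, float)
    mean_diff, sd_diff = diff.mean(), diff.std(ddof=1)
    return mean_diff, mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff

# Invented daily energy expenditure estimates (kcal/day) from two administrations.
test = [2100, 1850, 2400, 1990, 2250]
retest = [2050, 1900, 2350, 2010, 2300]
print(icc_2_1(np.column_stack([test, retest])))
print(bland_altman_limits(test, retest))
```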

  2. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    Science.gov (United States)

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies in consistently quantifying the performance of three commercial intracranial stents, and contribute to reinforce the confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and

  3. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) Several statistics programs can be used in the same document. (2) Documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use the tools together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  4. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature-three each from the domains of perception/action, memory, and language, respectively-and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as testing situation and prior recent experience with the experiment to yield highly robust effects.

  5. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  6. An examination of qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations

    International Nuclear Information System (INIS)

    Herbert, M.; Williams, G.

    1986-01-01

    New qualitative techniques for representing the behaviour of physical systems have recently been developed. These allow a qualitative representation to be formally derived from a quantitative plant model. One such technique, Incremental Qualitative Analysis, is based on manipulating qualitative differential equations, called confluences, using sign algebra. This is described and its potential for reducing the amount of information presented to the reactor operator is discussed. In order to illustrate the technique, a specific example relating to the influence of failures associated with a pressurized water reactor pressuriser is presented. It is shown that, although failures cannot necessarily be diagnosed unambiguously, the number of possible failures inferred is low. Techniques for discriminating between these possible failures are discussed. (author)
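
    The sign algebra that confluences are manipulated with can be illustrated in a few lines: qualitative values are restricted to '-', '0', '+' and '?' (ambiguous), and a confluence such as dP + dV - dT = 0 constrains the signs of related deviations. The confluence and the variable names are illustrative assumptions, not taken from the pressuriser example in the paper.

```python
# Qualitative sign algebra over the values '-', '0', '+', and '?' (unknown/ambiguous).
def sign_add(a, b):
    """Qualitative sum of two signed deviations."""
    if a == '0':
        return b
    if b == '0':
        return a
    if a == b:
        return a
    return '?'            # '+' plus '-' cannot be resolved without magnitudes

def sign_neg(a):
    return {'+': '-', '-': '+', '0': '0', '?': '?'}[a]

# Illustrative confluence dP + dV - dT = 0: given the signs of two deviations,
# infer the sign of the third.
def infer_dT(dP, dV):
    return sign_add(dP, dV)             # dT = dP + dV

def infer_dV(dP, dT):
    return sign_add(dT, sign_neg(dP))   # dV = dT - dP

print(infer_dT('+', '0'))   # pressure rising, volume steady  -> temperature '+'
print(infer_dT('+', '-'))   # opposing deviations             -> '?' (ambiguous)
print(infer_dV('-', '0'))   # pressure falling, temp steady   -> volume '+'
```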

  7. Evaluation of Oceanic Surface Observation for Reproducing the Upper Ocean Structure in ECHAM5/MPI-OM

    Science.gov (United States)

    Luo, Hao; Zheng, Fei; Zhu, Jiang

    2017-12-01

    Better constraints of initial conditions from data assimilation are necessary for climate simulations and predictions, and they are particularly important for the ocean due to its long climate memory; as such, ocean data assimilation (ODA) is regarded as an effective tool for seasonal to decadal predictions. In this work, an ODA system is established for a coupled climate model (ECHAM5/MPI-OM), which can assimilate all available oceanic observations using an ensemble optimal interpolation approach. To validate and isolate the performance of different surface observations in reproducing air-sea climate variations in the model, a set of observing system simulation experiments (OSSEs) was performed over 150 model years. Generally, assimilating sea surface temperature, sea surface salinity, and sea surface height (SSH) can reasonably reproduce the climate variability and vertical structure of the upper ocean, and assimilating SSH achieves the best results compared to the true states. For the El Niño-Southern Oscillation (ENSO), assimilating different surface observations captures true aspects of ENSO well, but assimilating SSH can further enhance the accuracy of ENSO-related feedback processes in the coupled model, leading to a more reasonable ENSO evolution and air-sea interaction over the tropical Pacific. For ocean heat content, there are still limitations in reproducing the long time-scale variability in the North Atlantic, even if SSH has been taken into consideration. These results demonstrate the effectiveness of assimilating surface observations in capturing the interannual signal and, to some extent, the decadal signal but still highlight the necessity of assimilating profile data to reproduce specific decadal variability.

  8. The use of real-time cell analyzer technology in drug discovery: defining optimal cell culture conditions and assay reproducibility with different adherent cellular models.

    Science.gov (United States)

    Atienzar, Franck A; Tilmant, Karen; Gerets, Helga H; Toussaint, Gaelle; Speeckaert, Sebastien; Hanon, Etienne; Depelchin, Olympe; Dhalluin, Stephane

    2011-07-01

    The use of impedance-based label-free technology applied to drug discovery is nowadays receiving more and more attention. Indeed, such a simple and noninvasive assay that interferes minimally with cell morphology and function allows one to perform kinetic measurements and to obtain information on proliferation, migration, cytotoxicity, and receptor-mediated signaling. The objective of the study was to further assess the usefulness of a real-time cell analyzer (RTCA) platform based on impedance in the context of quality control and data reproducibility. The data indicate that this technology is useful to determine the best coating and cellular density conditions for different adherent cellular models including hepatocytes, cardiomyocytes, fibroblasts, and hybrid neuroblastoma/neuronal cells. Based on 31 independent experiments, the reproducibility of cell index data generated from HepG2 cells exposed to DMSO and to Triton X-100 was satisfactory, with a coefficient of variation close to 10%. Cell index data were also well reproduced when cardiomyocytes and fibroblasts were exposed to 21 compounds three times (correlation >0.91). Overall, the RTCA technology appears to be a powerful and reliable tool in drug discovery because of the reasonable throughput, rapid and efficient performance, technical optimization, and cell quality control.

  9. From qualitative reasoning models to Bayesian-based learner modeling

    NARCIS (Netherlands)

    Milošević, U.; Bredeweg, B.; de Kleer, J.; Forbus, K.D.

    2010-01-01

    Assessing the knowledge of a student is a fundamental part of intelligent learning environments. We present a Bayesian network based approach to dealing with uncertainty when estimating a learner’s state of knowledge in the context of Qualitative Reasoning (QR). A proposal for a global architecture
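
    A minimal sketch of the kind of Bayesian update such a learner model performs when estimating a learner's state of knowledge: the prior on mastery, the slip and guess probabilities, and the sequence of answers are all invented for illustration and are not taken from the architecture proposed in the paper.

```python
def update_mastery(prior, correct, slip=0.1, guess=0.2):
    """Posterior P(concept mastered) after observing one answer (Bayes' rule).

    slip  = P(wrong answer | mastered); guess = P(correct answer | not mastered).
    """
    if correct:
        numerator = prior * (1 - slip)
        denominator = numerator + (1 - prior) * guess
    else:
        numerator = prior * slip
        denominator = numerator + (1 - prior) * (1 - guess)
    return numerator / denominator

# A learner answers a sequence of qualitative-reasoning exercises on one concept.
belief = 0.5                      # uninformative prior
for answer in [True, True, False, True]:
    belief = update_mastery(belief, answer)
    print(round(belief, 3))
```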

  10. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
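
    The basic observed statistic, the proportion of replicates identified, together with a simple confidence interval, can be computed as below; the Wilson score interval and the example counts are illustrative choices, not requirements of the report.

```python
import math

def poi_with_wilson_ci(identified, replicates, z=1.96):
    """Probability of identification (POI) and an approximate 95% Wilson interval."""
    p = identified / replicates
    denom = 1 + z**2 / replicates
    centre = (p + z**2 / (2 * replicates)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / replicates + z**2 / (4 * replicates**2))
    return p, centre - half, centre + half

# Example: 11 of 12 replicates of the target botanical material identified.
print(poi_with_wilson_ci(11, 12))
```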

  11. Intestinal microdialysis--applicability, reproducibility and local tissue response in a pig model

    DEFF Research Database (Denmark)

    Emmertsen, K J; Wara, P; Sørensen, Flemming Brandt

    2005-01-01

    BACKGROUND AND AIMS: Microdialysis has been applied to the intestinal wall for the purpose of monitoring local ischemia. The aim of this study was to investigate the applicability, reproducibility and local response to microdialysis in the intestinal wall. MATERIALS AND METHODS: In 12 pigs two...... the probes were processed for histological examination. RESULTS: Large intra- and inter-group differences in the relative recovery were found between all locations. Absolute values of metabolites showed no significant changes during the study period. The lactate in blood was 25-30% of the intra-tissue values...

  12. Development of a Three-Dimensional Hand Model Using Three-Dimensional Stereophotogrammetry: Assessment of Image Reproducibility.

    Directory of Open Access Journals (Sweden)

    Inge A Hoevenaren

    Full Text Available Using three-dimensional (3D) stereophotogrammetry, precise images and reconstructions of the human body can be produced. Over the last few years, this technique has mainly been developed in the field of maxillofacial reconstructive surgery, creating fusion images with computed tomography (CT) data for precise planning and prediction of treatment outcome. In hand surgery, however, 3D stereophotogrammetry is not yet being used in clinical settings. A total of 34 three-dimensional hand photographs were analyzed to investigate the reproducibility. For every individual, 3D photographs were captured at two different time points (baseline T0 and one week later T1). Using two different registration methods, the reproducibility of the methods was analyzed. Furthermore, the differences between 3D photos of men and women were compared in a distance map as a first clinical pilot testing our registration method. The absolute mean registration error for the complete hand was 1.46 mm. This reduced to an error of 0.56 mm when the region was isolated to the palm of the hand. When comparing hands of both sexes, it was seen that the male hand was larger (broader base and longer fingers) than the female hand. This study shows that 3D stereophotogrammetry can produce reproducible images of the hand without harmful side effects for the patient, thus proving to be a reliable method for soft tissue analysis. Its potential use in everyday practice of hand surgery needs to be further explored.

  13. Thermodynamics of strongly interacting system from reparametrized Polyakov-Nambu-Jona-Lasinio model

    International Nuclear Information System (INIS)

    Bhattacharyya, Abhijit; Ghosh, Sanjay K.; Maity, Soumitra; Raha, Sibaji; Ray, Rajarshi; Saha, Kinkar; Upadhaya, Sudipa

    2017-01-01

    The Polyakov-Nambu-Jona-Lasinio model has been quite successful in describing various qualitative features of observables for strongly interacting matter that are measurable in heavy-ion collision experiments. The question remains, however, of the quantitative uncertainties in the model results. Such an estimation is possible only by contrasting these results with those obtained from first principles using the lattice QCD framework. Recently a variety of lattice QCD data were reported in the realistic continuum limit. Here we make a first attempt at reparametrizing the model so as to reproduce these lattice data.

  14. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  15. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built–up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....

  16. Acute multi-sgRNA knockdown of KEOPS complex genes reproduces the microcephaly phenotype of the stable knockout zebrafish model.

    Directory of Open Access Journals (Sweden)

    Tilman Jobst-Schwan

    Full Text Available Until recently, morpholino oligonucleotides have been widely employed in zebrafish as an acute and efficient loss-of-function assay. However, off-target effects and reproducibility issues when compared to stable knockout lines have compromised their further use. Here we employed an acute CRISPR/Cas approach using multiple single guide RNAs targeting simultaneously different positions in two exemplar genes (osgep or tprkb) to increase the likelihood of generating mutations on both alleles in the injected F0 generation and to achieve a similar effect as morpholinos but with the reproducibility of stable lines. This multi single guide RNA approach resulted in median likelihoods for at least one mutation on each allele of >99% and sgRNA-specific insertion/deletion profiles as revealed by deep sequencing. Immunoblot showed a significant reduction for Osgep and Tprkb proteins. For both genes, the acute multi-sgRNA knockout recapitulated the microcephaly phenotype and reduction in survival that we observed previously in stable knockout lines, though in milder form. Finally, we quantify the degree of mutagenesis by deep sequencing, and provide a mathematical model to quantitate the chance of a biallelic loss-of-function mutation. Our findings can be generalized to acute and stable CRISPR/Cas targeting for any zebrafish gene of interest.
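
    The chance of a biallelic loss-of-function mutation can be illustrated with a simple independence model: if each of n guides independently mutates a given allele with some probability, and only a fraction of mutations disrupt the reading frame, the biallelic probability follows directly. The per-guide efficiency, the frameshift fraction, and the independence assumption are illustrative and may differ from the model fitted in the paper.

```python
def biallelic_knockout_probability(per_guide_hit, n_guides, frameshift_fraction=2/3):
    """Chance that both alleles carry a loss-of-function mutation.

    Assumes each guide independently mutates a given allele with probability
    per_guide_hit and that a fraction of mutations cause a frameshift; this is
    an illustrative independence model, not the paper's fitted model.
    """
    p_lof_per_guide = per_guide_hit * frameshift_fraction
    p_allele_intact = (1 - p_lof_per_guide) ** n_guides
    p_allele_hit = 1 - p_allele_intact
    return p_allele_hit ** 2        # both alleles, assumed independent

# Four guides per gene, each mutating a given allele 60% of the time.
print(biallelic_knockout_probability(per_guide_hit=0.6, n_guides=4))
```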

  17. A coupled RL and transport model for mixed-field proton irradiation of Al2O3:C

    DEFF Research Database (Denmark)

    Greilich, Steffen; Edmund, Jens Morgenthaler; Jain, Mayank

    2008-01-01

    effects and inelastic hadronic scattering occur in proton therapy dosimetry. To investigate these aspects in relation to our system, we have combined simulation of particle transportation with a luminescence generation code based on track structure theory. The model was found to qualitatively reproduce...... the main features in experimental data from proton irradiations. (c) 2008 Elsevier Ltd. All rights reserved....

  18. Reproducibility of summertime diurnal precipitation over northern Eurasia simulated by CMIP5 climate models

    Science.gov (United States)

    Hirota, N.; Takayabu, Y. N.

    2015-12-01

    Reproducibility of diurnal precipitation over northern Eurasia simulated by CMIP5 climate models in their historical runs was evaluated, in comparison with station data (NCDC-9813) and satellite data (GSMaP-V5). We first calculated diurnal cycles by averaging precipitation at each local solar time (LST) in June-July-August during 1981-2000 over the continent of northern Eurasia (0-180E, 45-90N). Then we examined the occurrence time of maximum precipitation and the contribution of diurnally varying precipitation to the total precipitation. The contribution of diurnal precipitation was about 21% in both NCDC-9813 and GSMaP-V5. The maximum precipitation occurred at 18 LST in NCDC-9813 but 16 LST in GSMaP-V5, indicating some uncertainties even in the observational datasets. The diurnal contribution of the CMIP5 models varied largely from 11% to 62%, and their timing of the precipitation maximum ranged from 11 LST to 20 LST. Interestingly, the contribution and the timing had a strong negative correlation of -0.65: the models with larger diurnal precipitation showed the precipitation maximum earlier, around noon. Next, we compared the sensitivity of precipitation to surface temperature and tropospheric humidity between 5 models with large diurnal precipitation (LDMs) and 5 models with small diurnal precipitation (SDMs). Precipitation in LDMs showed high sensitivity to surface temperature, indicating its close relationship with local instability. On the other hand, synoptic disturbances were more active in SDMs, with a dominant role of large-scale condensation, and precipitation in SDMs was more closely related to tropospheric moisture. Therefore, the relative importance of local instability and synoptic disturbances is suggested to be an important factor in determining the contribution and timing of the diurnal precipitation. Acknowledgment: This study is supported by Green Network of Excellence (GRENE) Program by the Ministry of Education, Culture, Sports, Science and Technology
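
    The two diagnostics described, the local solar time of maximum precipitation and the contribution of the diurnal cycle to the total, can be sketched as below once hourly precipitation has been binned by LST. The synthetic data and the particular definition of the contribution (mean absolute departure of the 24-hour composite from its daily mean, relative to that mean) are assumptions for illustration and may differ from the paper's definition.

```python
import numpy as np

def diurnal_diagnostics(precip_hourly_lst):
    """Diurnal-cycle diagnostics from JJA precipitation binned by local solar time.

    precip_hourly_lst: array of shape (n_days, 24). Returns the LST hour of the
    composite maximum and one plausible "diurnal contribution" measure.
    """
    composite = np.asarray(precip_hourly_lst, float).mean(axis=0)   # 24-hour composite
    hour_of_max = int(np.argmax(composite))
    contribution = np.mean(np.abs(composite - composite.mean())) / composite.mean()
    return hour_of_max, contribution

# Synthetic example: precipitation peaking in the late afternoon (~17 LST).
hours = np.arange(24)
base = 1.0 + 0.4 * np.cos(2 * np.pi * (hours - 17) / 24)
days = base + 0.1 * np.random.default_rng(1).normal(size=(92, 24))
print(diurnal_diagnostics(days))
```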

  19. Two-Finger Tightness: What Is It? Measuring Torque and Reproducibility in a Simulated Model.

    Science.gov (United States)

    Acker, William B; Tai, Bruce L; Belmont, Barry; Shih, Albert J; Irwin, Todd A; Holmes, James R

    2016-05-01

    Residents in training are often directed to insert screws using "two-finger tightness" to impart adequate torque but minimize the chance of a screw stripping in bone. This study seeks to quantify and describe two-finger tightness and to assess the variability of its application by residents in training. Cortical bone was simulated using a polyurethane foam block (30-pcf density) that was prepared with predrilled holes for tightening 3.5 × 14-mm long cortical screws and mounted to a custom-built apparatus on a load cell to capture torque data. Thirty-three residents in training, ranging from the first through fifth years of residency, along with 8 staff members, were directed to tighten 6 screws to two-finger tightness in the test block, and peak torque values were recorded. The participants were blinded to their torque values. Stripping torque (2.73 ± 0.56 N·m) was determined from 36 trials and served as a threshold for failed screw placement. The average torques varied substantially with regard to absolute torque values, thus poorly defining two-finger tightness. Junior residents less consistently reproduced torque compared with other groups (0.29 and 0.32, respectively). These data quantify absolute values of two-finger tightness but demonstrate considerable variability in absolute torque values, percentage of stripping torque, and ability to consistently reproduce given torque levels. Increased years in training are weakly correlated with reproducibility, but experience does not seem to affect absolute torque levels. These results question the usefulness of two-finger tightness as a teaching tool and highlight the need for improvement in resident motor skill training and development within a teaching curriculum. Torque measuring devices may be useful simulation tools for this purpose.

  20. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Science.gov (United States)

    Nada, Rania M; Maal, Thomas J J; Breuning, K Hero; Bergé, Stefaan J; Mostafa, Yehya A; Kuijpers-Jagtman, Anne Marie

    2011-02-09

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre- and post-treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged from 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones can be considered an accurate and reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.

  1. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    Full Text Available The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties in networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data for both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC=0.86), global efficiency (ICC=0.83), path length (ICC=0.79), and local efficiency (ICC=0.75); the ICC score for degree was found to be low (ICC=0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics. All graph metrics fell within the limits for repeatability. Together, these results suggest that with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
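
    The graph metrics compared across runs can be computed with standard tools; the sketch below uses networkx on a toy small-world graph standing in for a thresholded functional connectivity network, so the graph itself and the two "runs" are invented for illustration.

```python
import networkx as nx

def graph_metrics(g):
    """Small-world-style metrics of the kind compared between runs in the study."""
    return {
        "clustering_coefficient": nx.average_clustering(g),
        "global_efficiency": nx.global_efficiency(g),
        "local_efficiency": nx.local_efficiency(g),
        "path_length": nx.average_shortest_path_length(g),
        "mean_degree": sum(dict(g.degree()).values()) / g.number_of_nodes(),
    }

# Two toy "runs": small-world graphs standing in for thresholded fMRI networks.
run1 = nx.connected_watts_strogatz_graph(90, k=8, p=0.1, seed=1)
run2 = nx.connected_watts_strogatz_graph(90, k=8, p=0.1, seed=2)
m1, m2 = graph_metrics(run1), graph_metrics(run2)
for key in m1:
    print(key, round(m1[key], 3), round(m2[key], 3))
```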

  2. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to present computer programs to human readers. The code is rearranged to follow the logic of the program, and to explain that logic in a natural language. The code executed by the computer is extracted from the literate source code. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study. Our aim was to investigate the role of endosomal trafficking regulators to the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid transport related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is an open-source software available at github.com/borisvassilev/lir.

  3. A quantitative evaluation of a qualitative risk assessment framework: Examining the assumptions and predictions of the Productivity Susceptibility Analysis (PSA)

    Science.gov (United States)

    2018-01-01

    Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluating the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869

  4. [Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].

    Science.gov (United States)

    Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping

    2014-06-01

    The stability and adaptability of near-infrared spectral qualitative analysis models were studied. Separate modeling can significantly improve the stability and adaptability of a model, but its ability to improve adaptability is limited. Joint modeling can improve not only the adaptability of the model but also its stability; at the same time, compared with separate modeling, it shortens the modeling time, reduces the modeling workload, extends the term of validity of the model, and improves modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate-modeling method is relatively low and cannot meet application requirements, whereas the joint-modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of the jointly built model are better than those of the separately built model, and the method has good application value.

  5. Matrix model approximations of fuzzy scalar field theories and their phase diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Tekel, Juraj [Department of Theoretical Physics, Faculty of Mathematics, Physics and Informatics, Comenius University, Mlynska Dolina, Bratislava, 842 48 (Slovakia)

    2015-12-29

    We present an analysis of two different approximations to the scalar field theory on the fuzzy sphere, a nonperturbative and a perturbative one, which are both multitrace matrix models. We show that the former reproduces a phase diagram with correct features in qualitative agreement with previous numerical studies and that the latter gives a phase diagram with features not expected in the phase diagram of the field theory.

  6. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  7. Simple cortical and thalamic neuron models for digital arithmetic circuit implementation

    Directory of Open Access Journals (Sweden)

    Takuya eNanami

    2016-05-01

    Full Text Available Trade-off between reproducibility of neuronal activities and computational efficiency is one of the crucial subjects in computational neuroscience and neuromorphic engineering. A wide variety of neuronal models have been studied from different viewpoints. The digital spiking silicon neuron (DSSN) model is a qualitative model that focuses on efficient implementation by digital arithmetic circuits. We expanded the DSSN model and found appropriate parameter sets with which it reproduces the dynamical behaviors of the ionic-conductance models of four classes of cortical and thalamic neurons. We first developed a 4-variable model by reducing the number of variables in the ionic-conductance models and elucidated its mathematical structures using bifurcation analysis. Then, expanded DSSN models were constructed that reproduce these mathematical structures and capture the characteristic behavior of each neuron class. We confirmed that statistics of the neuronal spike sequences are similar in the DSSN and the ionic-conductance models. The computational cost of the DSSN model is larger than that of the recent sophisticated Integrate-and-Fire-based models, but smaller than that of the ionic-conductance models. This model is intended to provide another meeting point for the above trade-off that satisfies the demand for large-scale neuronal network simulation with closer-to-biology models.
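
    The DSSN equations themselves are not reproduced here; the sketch below only illustrates the general class of reduced two-variable qualitative neuron models the abstract refers to, using the FitzHugh-Nagumo equations integrated with forward Euler (all parameter values are standard textbook choices, not the paper's).

```python
import numpy as np

def fitzhugh_nagumo(i_ext=0.5, dt=0.01, steps=20000, a=0.7, b=0.8, tau=12.5):
    """Illustrative two-variable qualitative neuron model (FitzHugh-Nagumo),
    integrated with forward Euler.  This is NOT the DSSN model itself, only a
    sketch of the class of reduced models the abstract discusses."""
    v, w = -1.0, -0.5
    trace = np.empty(steps)
    for k in range(steps):
        dv = v - v ** 3 / 3.0 - w + i_ext
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        trace[k] = v
    return trace

# Count upward threshold crossings as "spikes" in the simulated window.
spikes = int((np.diff((fitzhugh_nagumo() > 1.0).astype(int)) == 1).sum())
print("number of spikes in the simulated window:", spikes)
```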

  8. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility of natural head position with time on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies confirmed this reproducibility with time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  9. Audiovisual biofeedback improves diaphragm motion reproducibility in MRI

    Science.gov (United States)

    Kim, Taeho; Pollock, Sean; Lee, Danny; O’Brien, Ricky; Keall, Paul

    2012-01-01

    Purpose: In lung radiotherapy, variations in cycle-to-cycle breathing results in four-dimensional computed tomography imaging artifacts, leading to inaccurate beam coverage and tumor targeting. In previous studies, the effect of audiovisual (AV) biofeedback on the external respiratory signal reproducibility has been investigated but the internal anatomy motion has not been fully studied. The aim of this study is to test the hypothesis that AV biofeedback improves diaphragm motion reproducibility of internal anatomy using magnetic resonance imaging (MRI). Methods: To test the hypothesis 15 healthy human subjects were enrolled in an ethics-approved AV biofeedback study consisting of two imaging sessions spaced ∼1 week apart. Within each session MR images were acquired under free breathing and AV biofeedback conditions. The respiratory signal to the AV biofeedback system utilized optical monitoring of an external marker placed on the abdomen. Synchronously, serial thoracic 2D MR images were obtained to measure the diaphragm motion using a fast gradient-recalled-echo MR pulse sequence in both coronal and sagittal planes. The improvement in the diaphragm motion reproducibility using the AV biofeedback system was quantified by comparing cycle-to-cycle variability in displacement, respiratory period, and baseline drift. Additionally, the variation in improvement between the two sessions was also quantified. Results: The average root mean square error (RMSE) of diaphragm cycle-to-cycle displacement was reduced from 2.6 mm with free breathing to 1.6 mm (38% reduction) with the implementation of AV biofeedback (p-value biofeedback (p-value biofeedback (p-value = 0.012). The diaphragm motion reproducibility improvements with AV biofeedback were consistent with the abdominal motion reproducibility that was observed from the external marker motion variation. Conclusions: This study was the first to investigate the potential of AV biofeedback to improve the motion

  10. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

    Full Text Available Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  11. The qualitative and quantitative accuracy of DFT methods in computing 1J(C–F), 1J(C–N) and nJ(F–F) spin–spin coupling of fluorobenzene and fluoropyridine molecules

    International Nuclear Information System (INIS)

    Adeniyi, Adebayo A.; Ajibade, Peter A.

    2015-01-01

    The qualitative and quantitative quality of DFT methods combined with different basis sets in computing the J-coupling of the types 1 J(C–F) and n J(F–F) are investigated for the fluorobenzene and fluoropyridine derivatives. Interestingly, all of the computational methods perfectly reproduced the experimental order for n J(F–F) but many failed to reproduce the experimental order for 1 J(C–F) coupling. The functional PBEPBE gives the best quantitative values that are closer to the experimental spin–spin coupling when combined with the basis sets aug-cc-pVDZ and DGDZVP but is also part of the methods that fail to perfectly reproduce the experimental order for the 1 J(C–F) coupling. The basis set DGDZVP combined with all the methods except with PBEPBE perfectly reproduces the 1 J(C–F) experimental order. All the methods reproduce either the positive or the negative sign of the experimental spin–spin coupling except for the basis set 6-31+G(d,p) which fails to reproduce the experimental positive value of 3 J(F–F) regardless of what type of DFT methods was used. The values of the FC term is far higher than all other Ramsey terms in the one bond 1 J(C–F) coupling but in the two, three and four bonds n J(F–F) the values of PSO and SD are higher. - Graphical abstract: DFT methods were used to compute the J-coupling of molecules benf, benf2, benf2c, benf2c2, pyrf, pyrfc and pyrfc2, and are presented. Right combination of DFT functional with basis set can reproduce high level EOM-CCSD and experimental J-coupling results. All the methods can reproduce the qualitative order of the experimental J-coupling but not all reproduce the quantitative. The best quantitative results were obtained from PBEPBE combined with the high basis set aug-cc-pVDZ Also, PBEPBE combines with lower basis set DGDZVP to give a highly similar value. - Highlights: • DFT methods were used to compute the J-coupling of the molecules. • Right combination of DFT functional with basis

  12. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.

    Science.gov (United States)

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  13. Short- and long-term reproducibility of radioisotopic examination of gastric emptying

    Energy Technology Data Exchange (ETDEWEB)

    Jonderko, K. (Silesian School of Medicine, Katowice (Poland). Dept. of Gastroenterology)

    1990-01-01

    Reproducibility of gastric emptying (GE) of a radiolabelled solid meal was assessed. The short-term reproducibility was evaluated on the basis of 12 paired GE examinations performed 1-3 days apart. Twelve paired GE examinations taken 3-8 months apart enabled long-term reproducibility assessment. Reproducibility of GE parameters was expressed in terms of the coefficient of variation, CV. No significant between-day variation of solid GE was found either regarding the short-term or the long-term reproducibility. Although slightly higher CV values characterized the long-term reproducibility of the GE parameters considered, the variations of the differences between repeated GE examinations did not differ significantly between short- and long-term GE reproducibility. The results obtained justify the use of radioisotopic GE measurement for the assessment of early and late results of pharmacologic or surgical management. (author).
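
    As a reminder of how the reproducibility figure is formed, the sketch below computes a within-subject coefficient of variation from paired examinations under one common convention (per-subject SD of the two repeated measurements divided by their mean, averaged across subjects); the half-emptying times are invented for illustration and are not the study's data.

```python
import statistics as st

def within_pair_cv(pairs):
    """One common convention: per subject, CV = SD of the two repeated
    measurements divided by their mean; report the average across subjects."""
    cvs = []
    for first, second in pairs:
        mean = (first + second) / 2.0
        sd = st.stdev([first, second])
        cvs.append(sd / mean)
    return 100.0 * sum(cvs) / len(cvs)

# Hypothetical gastric half-emptying times (minutes) from two visits per subject.
t_half = [(72, 80), (95, 90), (60, 66), (110, 103)]
print(f"mean within-subject CV: {within_pair_cv(t_half):.1f}%")
```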

  14. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    Full Text Available The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.

  15. Reproducing an extreme flood with uncertain post-event information

    Directory of Open Access Journals (Sweden)

    D. Fuentes-Andino

    2017-07-01

    Full Text Available Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that it is possible to estimate, in a trustworthy way considering large data uncertainties, this extreme 1998 flood discharge and the extent of the inundations that followed from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum–Cunge–Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identification of these locations is useful to improve model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event. The method proposed here can be part of a Bayesian framework in which more events
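
    The GLUE procedure itself is simple to sketch: sample many parameter sets, score each simulation against the uncertain observations with an informal likelihood, keep the "behavioural" runs, and report ensemble bounds. The toy model, parameter ranges, and threshold below are assumptions for illustration only and merely stand in for the TOPMODEL/routing/LISFLOOD-FP chain used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(params, n_times=50):
    """Stand-in for the hydrological/hydraulic model chain: maps a
    two-parameter vector (peak level, time of peak) to a simulated series."""
    peak, timing = params
    t = np.arange(n_times)
    return peak * np.exp(-0.5 * ((t - timing) / 5.0) ** 2)

# "Observed" high-water levels with large post-event measurement uncertainty.
observed = toy_model((120.0, 25.0)) + rng.normal(0.0, 8.0, 50)

# GLUE: Monte Carlo sampling, informal likelihood, behavioural threshold.
samples = rng.uniform([50.0, 10.0], [200.0, 40.0], size=(5000, 2))
likelihood = np.array(
    [1.0 / (1.0 + np.mean((toy_model(p) - observed) ** 2)) for p in samples]
)
behavioural = likelihood >= np.quantile(likelihood, 0.95)   # keep the best 5%
ensemble = np.array([toy_model(p) for p in samples[behavioural]])

# Simple 5-95% uncertainty bounds from the behavioural ensemble.
lower = np.percentile(ensemble, 5, axis=0)
upper = np.percentile(ensemble, 95, axis=0)
coverage = np.mean((observed >= lower) & (observed <= upper))
print("behavioural runs:", int(behavioural.sum()),
      "| fraction of observations inside the bounds:", round(float(coverage), 2))
```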

  16. A computational model incorporating neural stem cell dynamics reproduces glioma incidence across the lifespan in the human population.

    Directory of Open Access Journals (Sweden)

    Roman Bauer

    Full Text Available Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way, and do not directly relate to empirical findings. To decrease this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically-determined estimates of neural stem cell number, cell division rate, mutation rate and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert differential susceptibility throughout the population. Overall, our model supports the hypothesis that glioma is caused by randomly-occurring oncogenic mutations within the neural stem cell population. Based on this model, we assess the influence of the (experimentally indicated) decrease in the number of neural stem cells and increase of cell division rate during aging. Our model provides multiple testable predictions, and suggests that different temporal sequences of oncogenic mutations can lead to tumorigenesis. Finally, we conclude that four or five oncogenic mutations are sufficient for the formation of glioma.
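
    A minimal sketch of the underlying idea (oncogenic mutations accumulating at random in a pool of neural stem cells until some cell reaches the required number of hits) is given below. All parameter values are invented for illustration, and the sketch omits the age-dependent changes in stem cell number and division rate that the paper models.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_onset_ages(n_people=1000, years=100, n_cells=1000,
                        mut_rate=0.003, hits_needed=4):
    """Return the age at which each simulated individual (if ever) first carries
    a neural stem cell with `hits_needed` oncogenic mutations.  Parameter values
    are illustrative, not the paper's calibrated estimates."""
    onsets = []
    for _ in range(n_people):
        hits = np.zeros(n_cells, dtype=np.int16)
        for age in range(years):
            hits += rng.random(n_cells) < mut_rate   # new mutations this year
            if (hits >= hits_needed).any():
                onsets.append(age)
                break
    return np.array(onsets)

onsets = simulate_onset_ages()
counts, _ = np.histogram(onsets, bins=range(0, 101, 10))
for decade, cases in enumerate(counts):
    print(f"ages {10 * decade:2d}-{10 * decade + 9}: {cases} simulated cases")
```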

  17. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research, however are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right as it is a time consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third-party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  18. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of the computer-aided detection (CAD) system for digital mammograms. We applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean:25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7, age range:20-66 years, mean age:47.9 years). On the mammograms, lesions were visible in 19 patients and these were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR) and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15), 67%(10 of 15) for masses and those for calcification clusters were 100% (12 of 12). The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR for the masses was 0.21-0.22/image, the FPR for the clusters was 0.03-0.04/image and the total FPR was 0.24-0.26/image. Among 132 mammography images, the identical images regardless of the existence of CAD marks were 59% (78 of 132), and the identical images with CAD marks were 22% (15 of 69). The reproducibility of the CAD marks for the true positive mass was 67% (12 of 18) and 71% (17 of 24) for the true positive cluster. The reproducibility of CAD marks for the false positive mass was 8% (4 of 53), and the reproducibility of CAD marks for the false positive clusters was 14% (1 of 7). The reproducibility of the total mass marks was 23% (16 of 71), and the reproducibility of the total cluster marks was 58% (18 of 31). CAD system showed higher sensitivity and reproducibility of CAD marks for the calcification clusters which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied considering this limitation

  19. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Directory of Open Access Journals (Sweden)

    Wilke Daniel N.

    2017-01-01

    Full Text Available The packing density and flat bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e. the effective particle density can be adjusted. In this study the particle volumes reduce drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass within 2% of each other.

  20. Towards reproducible experimental studies for non-convex polyhedral shaped particles

    Science.gov (United States)

    Wilke, Daniel N.; Pizette, Patrick; Govender, Nicolin; Abriak, Nor-Edine

    2017-06-01

    The packing density and flat bottomed hopper discharge of non-convex polyhedral particles are investigated in a systematic experimental study. The motivation for this study is two-fold. Firstly, to establish an approach to deliver quality experimental particle packing data for non-convex polyhedral particles that can be used for characterization and validation purposes of discrete element codes. Secondly, to make the reproducibility of experimental setups as convenient and readily available as possible using affordable and accessible technology. The primary technology for this study is fused deposition modeling used to 3D print polylactic acid (PLA) particles using readily available 3D printer technology. A total of 8000 biodegradable particles were printed, 1000 white particles and 1000 black particles for each of the four particle types considered in this study. Reproducibility is one benefit of using fused deposition modeling to print particles, but an extremely important additional benefit is that specific particle properties can be explicitly controlled. As an example, in this study the volume fraction of each particle can be controlled, i.e. the effective particle density can be adjusted. In this study the particle volumes reduce drastically as the non-convexity is increased; however, all printed white particles in this study have the same mass within 2% of each other.

  1. Qualitative methods in nuclear reactor dynamics. Issue 23

    International Nuclear Information System (INIS)

    Goryachenko, V.D.

    1983-01-01

    The applicability of qualitative methods from the theory of nonlinear oscillations, including bifurcation theory, to problems of nonlinear nuclear reactor dynamics is investigated. Basic results of the qualitative theory of dynamic systems on a phase plane and of the bifurcation theory of multidimensional dynamic systems are briefly outlined. Four models are considered: reactor dynamics with two reactivity temperature coefficients neglecting delayed neutrons, slow process dynamics in a reactor with two reactivity temperature coefficients, a simplified model of the reactor as an object with delay, and a reactor with linear feedback. It is concluded that these models reveal qualitative peculiarities of reactor dynamics and thereby create conditions for the more purposeful use of more complicated models.
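
    As an illustration of the kind of low-order model such qualitative analysis is applied to, the sketch below integrates a two-variable reactor model (relative power with a temperature reactivity feedback, delayed neutrons neglected) so that its phase-plane behaviour can be inspected. The equations and coefficients are a generic textbook-style example, not the specific models of the report.

```python
import numpy as np

def reactor_trajectory(rho0=0.002, alpha=-0.01, gamma=0.1, k_c=0.05,
                       lam=1e-3, dt=1e-4, steps=200_000):
    """Two-variable qualitative reactor model (delayed neutrons neglected):
        dP/dt = (rho0 + alpha*T) * P / lam    # relative power
        dT/dt = gamma * (P - 1.0) - k_c * T   # lumped temperature with cooling
    A negative temperature coefficient (alpha < 0) gives a damped oscillation
    around the equilibrium (P, T) = (1.1, 0.2) for these illustrative values."""
    P, T = 1.0, 0.1   # start away from the equilibrium point
    traj = np.empty((steps, 2))
    for k in range(steps):
        dP = (rho0 + alpha * T) * P / lam
        dT = gamma * (P - 1.0) - k_c * T
        P += dt * dP
        T += dt * dT
        traj[k] = P, T
    return traj

traj = reactor_trajectory()
print("power range:", traj[:, 0].min().round(3), "to", traj[:, 0].max().round(3))
print("temperature range:", traj[:, 1].min().round(3), "to", traj[:, 1].max().round(3))
```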

  2. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Full Text Available Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature.Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification.This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  3. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Full Text Available Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged from 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.
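
    The figure of merit used above, the mean absolute distance between two registered surface models, can be sketched as a closest-point computation on point clouds. The example below uses synthetic points and is only a simplified stand-in for the point-to-surface distances computed by dedicated 3D inspection software.

```python
import numpy as np

def mean_absolute_distance(model_a, model_b):
    """For every vertex of model_a, find the closest vertex of model_b and
    average the distances (a simple point-cloud approximation of the
    surface-to-surface comparison made after voxel based registration)."""
    diffs = model_a[:, None, :] - model_b[None, :, :]        # (Na, Nb, 3)
    closest = np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1)  # per vertex of model_a
    return closest.mean()

rng = np.random.default_rng(1)
surface = rng.normal(size=(500, 3))                              # synthetic model A
registered = surface + rng.normal(scale=0.02, size=(500, 3))     # model B, small residual
print(f"mean absolute distance: {mean_absolute_distance(surface, registered):.3f} (same units as input)")
```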

  4. Involving mental health service users in suicide-related research: a qualitative inquiry model.

    Science.gov (United States)

    Lees, David; Procter, Nicholas; Fassett, Denise; Handley, Christine

    2016-03-01

    To describe the research model developed and successfully deployed as part of a multi-method qualitative study investigating suicidal service-users' experiences of mental health nursing care. Quality mental health care is essential to limiting the occurrence and burden of suicide, however there is a lack of relevant research informing practice in this context. Research utilising first-person accounts of suicidality is of particular importance to expanding the existing evidence base. However, conducting ethical research to support this imperative is challenging. The model discussed here illustrates specific and more generally applicable principles for qualitative research regarding sensitive topics and involving potentially vulnerable service-users. Researching into mental health service users with first-person experience of suicidality requires stakeholder and institutional support, researcher competency, and participant recruitment, consent, confidentiality, support and protection. Research with service users into their experiences of sensitive issues such as suicidality can result in rich and valuable data, and may also provide positive experiences of collaboration and inclusivity. If challenges are not met, objectification and marginalisation of service-users may be reinforced, and limitations in the evidence base and service provision may be perpetuated.

  5. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    Science.gov (United States)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate a large amount of diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a light-weight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components, and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across geoscience domains of hydrology, space science, and modeling toolkits.

  6. Deuteron-deuteron scattering at 3.0, 3.4 and 3.7 GeV/c, theoretical interpretation in the Glauber model

    International Nuclear Information System (INIS)

    Ballot, J.L.; L'Huillier, M.; Combes, M.P.; Tatischeff, B.

    1984-01-01

    An analysis of deuteron-deuteron scattering data at 3.0, 3.4 and 3.7 GeV/c is given in the framework of the Glauber NN multiple scattering model. The model accounts qualitatively well for the larger momentum transfer data. The model cannot reproduce the observed experimental bump at small momentum transfer in the region of missing mass 2 GeV/c, because pionic processes have not been considered. Nevertheless, it shows that the nucleon background is important.

  7. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    Science.gov (United States)

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P 30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  8. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    We could synthesize Ag nanocubes highly reproducibly by conducting the polyol synthesis with an HCl etchant in the dark, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed, and consequently the selective self-nucleation of Ag single crystals and their selective growth reaction could be promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was carried out under illumination, owing to the photoreduction of AgCl to Ag.

  9. The operator model as a framework of research on errors and temporal, qualitative and analogical reasoning

    International Nuclear Information System (INIS)

    Decortis, F.; Drozdowicz, B.; Masson, M.

    1990-01-01

    In this paper the needs and requirements for developing a cognitive model of a human operator are discussed and the computer architecture, currently being developed, is described. Given the approach taken, namely the division of the problem into specialised tasks within an area and the use of the chosen architecture, it is possible to build independently several cognitive and psychological models, such as error and stress models, as well as models of temporal, qualitative and analogical reasoning. (author)

  10. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  11. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper and as often happened in the history of science, experimental and theoretical progress can take even decades. It is likely to be many years before investments in LENR experiments will yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments will significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  12. Reproducibility, controllability, and optimization of LENR experiments

    International Nuclear Information System (INIS)

    Nagel, David J.

    2006-01-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper and as often happened in the history of science, experimental and theoretical progress can take even decades. It is likely to be many years before investments in LENR experiments will yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments will significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  13. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    Directory of Open Access Journals (Sweden)

    Eiji Watanabe

    2018-03-01

    Full Text Available The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one of basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  14. Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach

    Science.gov (United States)

    Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil

    2016-01-01

    Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode in the system. The great advantage of structural model decomposition is that (i) it allows to design residuals that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals will need to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.
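
    The core reasoning step can be sketched as matching observed residual deviations against per-fault qualitative signatures. The residual names, fault names, and signature table below are invented for illustration, and the sketch ignores the mode changes and observation delays that the paper's framework handles.

```python
# Each fault has an expected signature of residual deviations:
# "+" above nominal, "-" below nominal, "0" unaffected.
FAULT_SIGNATURES = {
    "valve_stuck_closed":  {"r_flow": "-", "r_pressure": "+", "r_voltage": "0"},
    "pump_degradation":    {"r_flow": "-", "r_pressure": "-", "r_voltage": "0"},
    "sensor_bias_voltage": {"r_flow": "0", "r_pressure": "0", "r_voltage": "+"},
}

def isolate(observed):
    """Keep the faults whose signature is consistent with every residual
    deviation observed so far (unobserved residuals do not discriminate)."""
    candidates = []
    for fault, signature in FAULT_SIGNATURES.items():
        if all(signature[res] == dev for res, dev in observed.items()):
            candidates.append(fault)
    return candidates

print(isolate({"r_flow": "-"}))                      # two candidates remain
print(isolate({"r_flow": "-", "r_pressure": "+"}))   # isolated to a single fault
```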

  15. Use of Game Theory to model patient engagement after surgery: a qualitative analysis.

    Science.gov (United States)

    Castellanos, Stephen A; Buentello, Gerardo; Gutierrez-Meza, Diana; Forgues, Angela; Haubert, Lisa; Artinyan, Avo; Macdonald, Cameron L; Suliburk, James W

    2018-01-01

    Patient engagement is challenging to define and operationalize. Qualitative analysis allows us to explore patient perspectives on this topic and establish themes. A game theoretic signaling model also provides a framework through which to further explore engagement. Over a 6-mo period, thirty-eight interviews were conducted within 6 wk of discharge in patients undergoing thyroid, parathyroid, or colorectal surgery. Interviews were transcribed, anonymized, and analyzed using the NVivo 11 platform. A signaling model was then developed depicting the doctor-patient interaction surrounding the patient's choice to reach out to their physician with postoperative concerns based upon the patient's perspective of the doctor's availability. This was defined as "engagement". We applied the model to the qualitative data to determine possible causations for a patient's engagement or lack thereof. A private hospital's and a safety net hospital's populations were contrasted. The private patient population was more likely to engage than their safety-net counterparts. Using our model in conjunction with patient data, we determined possible etiologies for this engagement to be due to the private patient's perceived probability of dealing with an available doctor and apparent signals from the doctor indicating so. For the safety-net population, decreased access to care caused them to be less willing to engage with a doctor perceived as possibly unavailable. A physician who understands these Game Theory concepts may be able to alter their interactions with their patients, tailoring responses and demeanor to fit the patient's circumstances and possible barriers to engagement. Copyright © 2017 Elsevier Inc. All rights reserved.
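
    A toy expected-utility version of the signalling idea is sketched below: the patient reaches out only if the perceived probability that the doctor is available makes engaging worth more than staying silent. The payoff values and availability probabilities are invented for illustration and are not taken from the study.

```python
def patient_engages(p_available, benefit=1.0, cost_unanswered=0.6, cost_effort=0.1):
    """Engage if the expected payoff of reaching out exceeds the payoff of
    staying silent (taken as 0).  Payoff structure is purely illustrative."""
    expected = p_available * benefit - (1 - p_available) * cost_unanswered - cost_effort
    return expected > 0

for label, p in [("private hospital patient", 0.8), ("safety-net patient", 0.3)]:
    print(f"{label:25s} perceived availability {p:.1f} -> engages: {patient_engages(p)}")
```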

  16. Reproducing Kernels and Coherent States on Julia Sets

    Energy Technology Data Exchange (ETDEWEB)

    Thirulogasanthar, K., E-mail: santhar@cs.concordia.ca; Krzyzak, A. [Concordia University, Department of Computer Science and Software Engineering (Canada)], E-mail: krzyzak@cs.concordia.ca; Honnouvo, G. [Concordia University, Department of Mathematics and Statistics (Canada)], E-mail: g_honnouvo@yahoo.fr

    2007-11-15

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems.
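
    The generic construction alluded to can be written down directly: given an orthonormal family of square-integrable vectors, a reproducing kernel and the associated coherent states follow in the standard way. The formulas below are the textbook form of this construction, not the paper's specific Julia-set realization of the family.

```latex
% Generic construction from an orthonormal family \{\Phi_n\}_{n\ge 0} \subset L^2(\mathfrak{D},d\mu);
% the paper's particular family adapted to a Julia set is not reproduced here.
K(z,w) \;=\; \sum_{n\ge 0} \Phi_n(z)\,\overline{\Phi_n(w)},
\qquad
f(w) \;=\; \langle f,\, K(\cdot,w)\rangle \quad \text{for all } f \in \mathcal{H}_K
\quad \text{(reproducing property)},
\qquad
|z\rangle \;=\; K(z,z)^{-1/2} \sum_{n\ge 0} \Phi_n(z)\, e_n .
```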

  17. Reproducing Kernels and Coherent States on Julia Sets

    International Nuclear Information System (INIS)

    Thirulogasanthar, K.; Krzyzak, A.; Honnouvo, G.

    2007-01-01

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems

  18. In vivo reproducibility of robotic probe placement for an integrated US-CT image-guided radiation therapy system

    Science.gov (United States)

    Lediju Bell, Muyinatu A.; Sen, H. Tutkun; Iordachita, Iulian; Kazanzides, Peter; Wong, John

    2014-03-01

    Radiation therapy is used to treat cancer by delivering high-dose radiation to a pre-defined target volume. Ultrasound (US) has the potential to provide real-time, image-guidance of radiation therapy to identify when a target moves outside of the treatment volume (e.g. due to breathing), but the associated probe-induced tissue deformation causes local anatomical deviations from the treatment plan. If the US probe is placed to achieve similar tissue deformations in the CT images required for treatment planning, its presence causes streak artifacts that will interfere with treatment planning calculations. To overcome these challenges, we propose robot-assisted placement of a real ultrasound probe, followed by probe removal and replacement with a geometrically-identical, CT-compatible model probe. This work is the first to investigate in vivo deformation reproducibility with the proposed approach. A dog's prostate, liver, and pancreas were each implanted with three 2.38-mm spherical metallic markers, and the US probe was placed to visualize the implanted markers in each organ. The real and model probes were automatically removed and returned to the same position (i.e. position control), and CT images were acquired with each probe placement. The model probe was also removed and returned with the same normal force measured with the real US probe (i.e. force control). Marker positions in CT images were analyzed to determine reproducibility, and a corollary reproducibility study was performed on ex vivo tissue. In vivo results indicate that tissue deformations with the real probe were repeatable under position control for the prostate, liver, and pancreas, with median 3D reproducibility of 0.3 mm, 0.3 mm, and 1.6 mm, respectively, compared to 0.6 mm for the ex vivo tissue. For the prostate, the mean 3D tissue displacement errors between the real and model probes were 0.2 mm under position control and 0.6 mm under force control, which are both within acceptable

  19. Reproducibility of preclinical animal research improves with heterogeneity of study samples

    Science.gov (United States)

    Vogt, Lucile; Sena, Emily S.; Würbel, Hanno

    2018-01-01

    Single-laboratory studies conducted under highly standardized conditions are the gold standard in preclinical animal research. Using simulations based on 440 preclinical studies across 13 different interventions in animal models of stroke, myocardial infarction, and breast cancer, we compared the accuracy of effect size estimates between single-laboratory and multi-laboratory study designs. Single-laboratory studies generally failed to predict effect size accurately, and larger sample sizes rendered effect size estimates even less accurate. By contrast, multi-laboratory designs including as few as 2 to 4 laboratories increased coverage probability by up to 42 percentage points without a need for larger sample sizes. These findings demonstrate that within-study standardization is a major cause of poor reproducibility. More representative study samples are required to improve the external validity and reproducibility of preclinical animal research and to prevent wasting animals and resources for inconclusive research. PMID:29470495
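
    The gist of the simulation argument can be sketched with a toy random-effects model: each laboratory contributes its own offset, and the coverage of a naive 95% confidence interval is compared between a single-laboratory and a multi-laboratory design of the same total size. The variance components and sample sizes below are invented for illustration, not the values estimated from the 440 studies.

```python
import numpy as np

rng = np.random.default_rng(7)

def coverage(n_labs, animals_per_lab, true_effect=0.5, between_lab_sd=0.3,
             within_sd=1.0, n_sim=2000):
    """Fraction of simulated studies whose naive 95% CI covers the true effect
    when each lab has its own random offset (illustrative sketch only)."""
    hits = 0
    for _ in range(n_sim):
        lab_offsets = rng.normal(0.0, between_lab_sd, n_labs)
        data = np.concatenate([
            rng.normal(true_effect + off, within_sd, animals_per_lab)
            for off in lab_offsets
        ])
        mean = data.mean()
        sem = data.std(ddof=1) / np.sqrt(data.size)
        if abs(mean - true_effect) <= 1.96 * sem:
            hits += 1
    return hits / n_sim

print("single lab, n=24 :", coverage(1, 24))
print("3 labs,     n=8  :", coverage(3, 8))
```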

  20. When Quality Beats Quantity: Decision Theory, Drug Discovery, and the Reproducibility Crisis.

    Directory of Open Access Journals (Sweden)

    Jack W Scannell

    Full Text Available A striking contrast runs through the last 60 years of biopharmaceutical discovery, research, and development. Huge scientific and technological gains should have increased the quality of academic science and raised industrial R&D efficiency. However, academia faces a "reproducibility crisis"; inflation-adjusted industrial R&D costs per novel drug increased nearly 100 fold between 1950 and 2010; and drugs are more likely to fail in clinical development today than in the 1970s. The contrast is explicable only if powerful headwinds reversed the gains and/or if many "gains" have proved illusory. However, discussions of reproducibility and R&D productivity rarely address this point explicitly. The main objectives of the primary research in this paper are: (a) to provide quantitatively and historically plausible explanations of the contrast; and (b) to identify factors to which R&D efficiency is sensitive. We present a quantitative decision-theoretic model of the R&D process. The model represents therapeutic candidates (e.g., putative drug targets, molecules in a screening library, etc.) within a "measurement space", with candidates' positions determined by their performance on a variety of assays (e.g., binding affinity, toxicity, in vivo efficacy, etc.) whose results correlate to a greater or lesser degree. We apply decision rules to segment the space, and assess the probability of correct R&D decisions. We find that when searching for rare positives (e.g., candidates that will successfully complete clinical development), changes in the predictive validity of screening and disease models that many people working in drug discovery would regard as small and/or unknowable (i.e., a 0.1 absolute change in correlation coefficient between model output and clinical outcomes in man) can offset large (e.g., 10-fold, even 100-fold) changes in models' brute-force efficiency. We also show how validity and reproducibility correlate across a population of simulated
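
    The decision-theoretic point can be sketched with a toy correlated-Gaussian model: a screening score predicts a latent clinical outcome with correlation "validity", and we ask how often the top-ranked candidate is a genuine rare positive. The base rate, validity values, and candidate counts below are invented for illustration; they only echo the paper's qualitative claim that a small gain in validity can rival a large gain in throughput.

```python
import numpy as np

rng = np.random.default_rng(3)

def top_candidate_success(validity, n_candidates, base_rate=0.01, n_sim=400):
    """Probability that the highest-scoring candidate is a true positive, when
    the screening score correlates with the latent clinical value with
    correlation `validity` (toy model with made-up numbers)."""
    successes = 0
    for _ in range(n_sim):
        truth = rng.normal(size=n_candidates)                # latent clinical value
        noise = rng.normal(size=n_candidates)
        score = validity * truth + np.sqrt(1 - validity ** 2) * noise
        best = np.argmax(score)
        threshold = np.quantile(truth, 1.0 - base_rate)      # top 1% count as true positives
        successes += int(truth[best] >= threshold)
    return successes / n_sim

for validity, n in [(0.4, 1_000), (0.4, 30_000), (0.5, 1_000)]:
    print(f"validity={validity}, candidates={n:>6}: "
          f"P(best pick is a true positive) = {top_candidate_success(validity, n):.2f}")
```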

  1. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combined qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  2. Reproducibility of the sella turcica landmark in three dimensions using a sella turcica-specific reference system

    International Nuclear Information System (INIS)

    Pittayapat, Pisha; Jacobs, Reinhilde; Odri, Guillaume A.; De Faria Vasconcelos, Karla; Willems, Guy; Olszewski, Raphael

    2015-01-01

    This study was performed to assess the reproducibility of identifying the sella turcica landmark in a three-dimensional (3D) model by using a new sella-specific landmark reference system. Thirty-two cone-beam computed tomographic scans (3D Accuitomo 170, J. Morita, Kyoto, Japan) were retrospectively collected. The 3D data were exported into the Digital Imaging and Communications in Medicine standard and then imported into the Maxilim software (Medicim NV, Sint-Niklaas, Belgium) to create 3D surface models. Five observers identified four osseous landmarks in order to create the reference frame and then identified two sella landmarks. The x, y, and z coordinates of each landmark were exported. The observations were repeated after four weeks. Statistical analysis was performed using the multiple paired t-test with Bonferroni correction (intraobserver precision: p<0.005, interobserver precision: p<0.0011). The intraobserver mean precision of all landmarks was <1 mm. Significant differences were found when comparing the intraobserver precision of each observer (p<0.005). For the sella landmarks, the intraobserver mean precision ranged from 0.43±0.34 mm to 0.51±0.46 mm. The intraobserver reproducibility was generally good. The overall interobserver mean precision was <1 mm. Significant differences between each pair of observers for all anatomical landmarks were found (p<0.0011). The interobserver reproducibility of sella landmarks was good, with >50% precision in locating the landmark within 1 mm. A newly developed reference system offers high precision and reproducibility for sella turcica identification in a 3D model without being based on two-dimensional images derived from 3D data.

  3. Reproducibility of the sella turcica landmark in three dimensions using a sella turcica-specific reference system

    Energy Technology Data Exchange (ETDEWEB)

    Pittayapat, Pisha; Jacobs, Reinhilde [University Hospitals Leuven, University of Leuven, Leuven (Belgium); Odri, Guillaume A. [Service de Chirurgie Orthopedique et Traumatologique, Centre Hospitalier Regional d' Orleans, Orleans Cedex2 (France); De Faria Vasconcelos, Karla [Dept. of Oral Diagnosis, Division of Oral Radiology, Piracicaba Dental School, University of Campinas, Sao Paulo (Brazil); Willems, Guy [Dept. of Oral Health Sciences, Orthodontics, KU Leuven and Dentistry, University Hospitals Leuven, University of Leuven, Leuven (Belgium); Olszewski, Raphael [Dept. of Oral and Maxillofacial Surgery, Cliniques Universitaires Saint Luc, Universite Catholique de Louvain, Brussels (Belgium)

    2015-03-15

    This study was performed to assess the reproducibility of identifying the sella turcica landmark in a three-dimensional (3D) model by using a new sella-specific landmark reference system. Thirty-two cone-beam computed tomographic scans (3D Accuitomo 170, J. Morita, Kyoto, Japan) were retrospectively collected. The 3D data were exported into the Digital Imaging and Communications in Medicine standard and then imported into the Maxilim software (Medicim NV, Sint-Niklaas, Belgium) to create 3D surface models. Five observers identified four osseous landmarks in order to create the reference frame and then identified two sella landmarks. The x, y, and z coordinates of each landmark were exported. The observations were repeated after four weeks. Statistical analysis was performed using the multiple paired t-test with Bonferroni correction (intraobserver precision: p<0.005, interobserver precision: p<0.0011). The intraobserver mean precision of all landmarks was <1 mm. Significant differences were found when comparing the intraobserver precision of each observer (p<0.005). For the sella landmarks, the intraobserver mean precision ranged from 0.43±0.34 mm to 0.51±0.46 mm. The intraobserver reproducibility was generally good. The overall interobserver mean precision was <1 mm. Significant differences between each pair of observers for all anatomical landmarks were found (p<0.0011). The interobserver reproducibility of sella landmarks was good, with >50% precision in locating the landmark within 1 mm. A newly developed reference system offers high precision and reproducibility for sella turcica identification in a 3D model without being based on two-dimensional images derived from 3D data.

  4. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the di...

  5. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    textabstractMany argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature—three each from the domains of perception/action, memory, and language, respectively—and found that they are highly reproducible. Not only can

  6. Ceramic molar crown reproducibility by digital workflow manufacturing: An in vitro study.

    Science.gov (United States)

    Jeong, Ii-Do; Kim, Woong-Chul; Park, Jinyoung; Kim, Chong-Myeong; Kim, Ji-Hwan

    2017-08-01

    This in vitro study aimed to analyze and compare the reproducibility of zirconia and lithium disilicate crowns manufactured by digital workflow. A typodont model with a prepped upper first molar was set in a phantom head, and a digital impression was obtained with a video intraoral scanner (CEREC Omnicam; Sirona GmbH), from which a single crown was designed and manufactured with CAD/CAM into a zirconia crown and lithium disilicate crown (n=12). Reproducibility of each crown was quantitatively retrieved by superimposing the digitized data of the crown in 3D inspection software, and differences were graphically mapped in color. Areas with large differences were analyzed with digital microscopy. Mean quadratic deviations (RMS) quantitatively obtained from each ceramic group were statistically analyzed with Student's t-test (α=.05). The RMS value of lithium disilicate crown was 29.2 (4.1) µm and 17.6 (5.5) µm on the outer and inner surfaces, respectively, whereas these values were 18.6 (2.0) µm and 20.6 (5.1) µm for the zirconia crown. Reproducibility of zirconia and lithium disilicate crowns had a statistically significant difference only on the outer surface ( P <.001). The outer surface of lithium disilicate crown showed over-contouring on the buccal surface and under-contouring on the inner occlusal surface. The outer surface of zirconia crown showed both over- and under-contouring on the buccal surface, and the inner surface showed under-contouring in the marginal areas. Restoration manufacturing by digital workflow will enhance the reproducibility of zirconia single crowns more than that of lithium disilicate single crowns.
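
    The RMS values quoted above come from superimposing digitized crown surfaces in 3D inspection software. As a rough illustration of the underlying quantity, the sketch below computes a root-mean-square deviation between corresponding surface points of a reference design and an already aligned, scanned crown; the point correspondence and the data are assumed for illustration, not taken from the study.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical corresponding surface points (N x 3, in micrometres):
        # the designed (CAD) surface and the already-aligned scanned crown surface.
        design_pts = rng.uniform(0, 10000, size=(5000, 3))
        crown_pts = design_pts + rng.normal(0, 20, size=design_pts.shape)

        # Point-to-point deviations and their root-mean-square value
        deviations = np.linalg.norm(crown_pts - design_pts, axis=1)
        rms = np.sqrt(np.mean(deviations ** 2))
        print(f"RMS deviation: {rms:.1f} um")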

  7. Structure and thermodynamics of core-softened models for alcohols

    International Nuclear Information System (INIS)

    Munaò, Gianmarco; Urbic, Tomaz

    2015-01-01

    The phase behavior and the fluid structure of coarse-grain models for alcohols are studied by means of reference interaction site model (RISM) theory and Monte Carlo simulations. Specifically, we model ethanol and 1-propanol as linear rigid chains constituted by three (trimers) and four (tetramers) partially fused spheres, respectively. Thermodynamic properties of these models are examined in the RISM context, by employing closed formulæ for the calculation of free energy and pressure. Gas-liquid coexistence curves for trimers and tetramers are reported and compared with already existing data for a dimer model of methanol. Critical temperatures slightly increase with the number of CH2 groups in the chain, while critical pressures and densities decrease. Such a behavior qualitatively reproduces the trend observed in experiments on methanol, ethanol, and 1-propanol and suggests that our coarse-grain models, despite their simplicity, can reproduce the essential features of the phase behavior of such alcohols. The fluid structure of these models is investigated by computing the radial distribution function g_ij(r) and static structure factor S_ij(k); the latter shows the presence of a low-k peak at intermediate-high packing fractions and low temperatures, suggesting the presence of aggregates for both trimers and tetramers.

  8. Modelling Market Dynamics with a "Market Game"

    Science.gov (United States)

    Katahira, Kei; Chen, Yu

    In the financial market, traders, especially speculators, typically behave so as to yield capital gains from the difference between selling and buying prices. Making use of the structure of the Minority Game, we build a novel market toy model which takes account of this speculative mindset, involving a round-trip trade, to analyze the market dynamics as a system. Even though the micro-level behavioral rules of players in this new model are quite simple, its macroscopic aggregate output reproduces well-known stylized facts such as volatility clustering and heavy tails. The proposed model may become a new alternative bottom-up approach for studying the emergence of those stylized qualitative properties of asset returns.
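
    The model above builds on the Minority Game. The sketch below is a minimal standard Minority Game, not the authors' round-trip extension; the agent count, memory length, strategy count and number of steps are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(1)
        n_agents, memory, n_strategies, steps = 101, 3, 2, 500
        n_histories = 2 ** memory

        # Each strategy maps every possible history (bit pattern of recent minority
        # outcomes) to an action in {-1, +1}.
        strategies = rng.choice([-1, 1], size=(n_agents, n_strategies, n_histories))
        scores = np.zeros((n_agents, n_strategies))
        history = int(rng.integers(n_histories))   # encoded recent history
        attendance = []

        for _ in range(steps):
            # Every agent plays its currently best-scoring strategy.
            best = scores.argmax(axis=1)
            actions = strategies[np.arange(n_agents), best, history]
            a_t = int(actions.sum())
            attendance.append(a_t)
            minority_side = -np.sign(a_t) if a_t != 0 else rng.choice([-1, 1])
            # Reward every strategy that would have chosen the minority side.
            scores += (strategies[:, :, history] == minority_side)
            # Append the new outcome to the encoded history.
            history = ((history << 1) | int(minority_side == 1)) % n_histories

        print("attendance volatility:", np.std(attendance))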

  9. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
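
    The IDL routines named above belong to a commercial package, but analogous, fully reproducible histogram-equalization enhancements can be scripted in open tools. A minimal sketch with scikit-image follows; the input file name is hypothetical, and the function names are those of recent scikit-image releases.

        import numpy as np
        from skimage import io, exposure

        # Load a grayscale latent-print image (hypothetical file name).
        img = io.imread("latent_print.png", as_gray=True)

        # Global histogram equalization and contrast-limited adaptive equalization;
        # both are deterministic, so the enhancement is exactly reproducible.
        heq = exposure.equalize_hist(img)
        aheq = exposure.equalize_adapthist(img, clip_limit=0.02)

        io.imsave("latent_heq.png", (heq * 255).astype(np.uint8))
        io.imsave("latent_aheq.png", (aheq * 255).astype(np.uint8))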

  10. An improved cost-effective, reproducible method for evaluation of bone loss in a rodent model.

    Science.gov (United States)

    Fine, Daniel H; Schreiner, Helen; Nasri-Heir, Cibele; Greenberg, Barbara; Jiang, Shuying; Markowitz, Kenneth; Furgang, David

    2009-02-01

    This study was designed to investigate the utility of two "new" definitions for assessment of bone loss in a rodent model of periodontitis. Eighteen rats were divided into three groups. Group 1 was infected by Aggregatibacter actinomycetemcomitans (Aa), group 2 was infected with an Aa leukotoxin knock-out, and group 3 received no Aa (controls). Microbial sampling and antibody titres were determined. Initially, two examiners measured the distance from the cemento-enamel-junction to alveolar bone crest using the three following methods; (1) total area of bone loss by radiograph, (2) linear bone loss by radiograph, (3) a direct visual measurement (DVM) of horizontal bone loss. Two "new" definitions were adopted; (1) any site in infected animals showing bone loss >2 standard deviations above the mean seen at that site in control animals was recorded as bone loss, (2) any animal with two or more sites in any quadrant affected by bone loss was considered as diseased. Using the "new" definitions both evaluators independently found that infected animals had significantly more disease than controls (DVM system; p<0.05). The DVM method provides a simple, cost effective, and reproducible method for studying periodontal disease in rodents.
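
    The two definitions above translate directly into a simple classification rule. A minimal sketch follows, with the site measurements and quadrant grouping invented for illustration (in this toy example all four sites are taken to lie in one quadrant).

        import numpy as np

        # Hypothetical CEJ-to-bone-crest distances (mm): rows = animals, columns = sites.
        controls = np.array([[0.30, 0.32, 0.28, 0.35],
                             [0.29, 0.31, 0.30, 0.33],
                             [0.31, 0.30, 0.29, 0.34]])
        infected = np.array([[0.52, 0.60, 0.31, 0.36],
                             [0.33, 0.30, 0.29, 0.35]])

        # Definition 1: a site shows bone loss if it exceeds the control mean at that
        # site by more than 2 standard deviations.
        threshold = controls.mean(axis=0) + 2 * controls.std(axis=0, ddof=1)
        site_loss = infected > threshold          # boolean: per animal and site

        # Definition 2: an animal is "diseased" if two or more sites in a quadrant
        # show bone loss.
        diseased = site_loss.sum(axis=1) >= 2
        print(diseased)                           # e.g. [ True False]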

  11. Prediction of wall friction for fluids at supercritical pressure with CFD models

    International Nuclear Information System (INIS)

    Angelucci, M.; Ambrosini, W.; Forgione, N.

    2011-01-01

    In this paper, the STAR-CCM+ CFD code is used in the attempt to reproduce the values of friction factor observed in experimental data at supercritical pressures at various operating conditions. A short survey of available data and correlations for smooth pipe friction in circular pipes puts the basis for the discussion, reporting observed trends of friction factor in the liquid-like and the gas-like regions and within the transitional region around the pseudo-critical temperature. For smooth pipes, a general decrease of the friction factor in the transitional region is reported, constituting one of the relevant effects to be predicted by the computational fluid-dynamic models. A limited number of low-Reynolds number models is adopted, making use of refined near-wall discretisations as required by the constraint y+ < 1 at the wall. In particular, the Lien k-ε and the SST k-ω models are considered. The values of the wall shear stress calculated by the code are then post-processed on the basis of bulk fluid properties to obtain the Fanning and then the Darcy-Weisbach friction factors, based on their classical definitions. The obtained values are compared with those provided by experimental tests and correlations, finding a reasonable qualitative agreement. Expectedly, the agreement is better in the gas-like and liquid-like regions, where fluid property changes are moderate, than in the transitional region, where the trends provided by available correlations are reproduced only in a qualitative way. (author)
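
    The post-processing step described above follows the classical friction-factor definitions. A small sketch of the conversion from wall shear stress to Fanning and Darcy-Weisbach friction factors is given below; the numerical values are illustrative only, not taken from the simulations.

        def friction_factors(tau_wall, rho_bulk, u_bulk):
            """Classical definitions: f_Fanning = 2*tau_w/(rho*u^2), f_Darcy = 4*f_Fanning."""
            f_fanning = 2.0 * tau_wall / (rho_bulk * u_bulk ** 2)
            return f_fanning, 4.0 * f_fanning

        # Illustrative bulk conditions (not from the paper):
        tau_w = 12.0     # wall shear stress, Pa
        rho = 700.0      # bulk density, kg/m^3
        u = 1.5          # bulk velocity, m/s
        f_f, f_d = friction_factors(tau_w, rho, u)
        print(f"Fanning: {f_f:.4f}, Darcy-Weisbach: {f_d:.4f}")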

  12. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

    The aim of this study was to measure validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of food and beverages including; energy drinks; soft drinks/soda; coffee and tea and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then, a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the C-FFQ was compared to itself and showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.
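
    A minimal sketch of the kind of agreement statistics reported above (Pearson correlation and a Bland-Altman style mean difference between the C-FFQ and the food diary) follows; the caffeine values are invented for illustration.

        import numpy as np

        # Hypothetical daily caffeine intake (mg) for the same participants,
        # estimated from the questionnaire and from the seven-day food diary.
        ffq = np.array([180.0, 250.0, 90.0, 310.0, 150.0, 420.0, 60.0, 200.0])
        diary = np.array([150.0, 230.0, 120.0, 260.0, 140.0, 350.0, 80.0, 170.0])

        r = np.corrcoef(ffq, diary)[0, 1]    # Pearson correlation
        diff = ffq - diary
        bias = diff.mean()                   # Bland-Altman mean difference
        loa = 1.96 * diff.std(ddof=1)        # half-width of the limits of agreement
        print(f"r = {r:.2f}, bias = {bias:.0f} mg, "
              f"limits of agreement = {bias - loa:.0f} to {bias + loa:.0f} mg")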

  13. Numerical modeling of an estuary: A comprehensive skill assessment

    Science.gov (United States)

    Warner, J.C.; Geyer, W.R.; Lerczak, J.A.

    2005-01-01

    Numerical simulations of the Hudson River estuary using a terrain-following, three-dimensional model (Regional Ocean Modeling System (ROMS)) are compared with an extensive set of time series and spatially resolved measurements over a 43 day period with large variations in tidal forcing and river discharge. The model is particularly effective at reproducing the observed temporal variations in both the salinity and current structure, including tidal, spring neap, and river discharge-induced variability. Large observed variations in stratification between neap and spring tides are captured qualitatively and quantitatively by the model. The observed structure and variations of the longitudinal salinity gradient are also well reproduced. The most notable discrepancy between the model and the data is in the vertical salinity structure. While the surface-to-bottom salinity difference is well reproduced, the stratification in the model tends to extend all the way to the water surface, whereas the observations indicate a distinct pycnocline and a surface mixed layer. Because the southern boundary condition is located near the mouth of the estuary, the salinity within the domain is particularly sensitive to the specification of salinity at the boundary. A boundary condition for the horizontal salinity gradient, based on the local value of salinity, is developed to incorporate physical processes beyond the open boundary not resolved by the model. Model results are sensitive to the specification of the bottom roughness length and vertical stability functions, insofar as they influence the intensity of vertical mixing. The results only varied slightly between different turbulence closure methods of k-ε, k-ω, and k-kl. Copyright 2005 by the American Geophysical Union.

  14. How well do CMIP5 Climate Models Reproduce the Hydrologic Cycle of the Colorado River Basin?

    Science.gov (United States)

    Gautam, J.; Mascaro, G.

    2017-12-01

    The Colorado River, which is the primary source of water for nearly 40 million people in the arid Southwestern states of the United States, has been experiencing an extended drought since 2000, which has led to a significant reduction in water supply. As the water demands increase, one of the major challenges for water management in the region has been the quantification of uncertainties associated with streamflow predictions in the Colorado River Basin (CRB) under potential changes of future climate. Hence, testing the reliability of model predictions in the CRB is critical in addressing this challenge. In this study, we evaluated the performances of 17 General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase Five (CMIP5) and 4 Regional Climate Models (RCMs) in reproducing the statistical properties of the hydrologic cycle in the CRB. We evaluated the water balance components at four nested sub-basins along with the inter-annual and intra-annual changes of precipitation (P), evaporation (E), runoff (R) and temperature (T) from 1979 to 2005. Most of the models captured the net water balance fairly well in the most-upstream basin but simulated a weak hydrological cycle in the evaporation channel at the downstream locations. The simulated monthly variability of P had different patterns, with correlation coefficients ranging from -0.6 to 0.8 depending on the sub-basin, with models from the same parent institution clustering together. Apart from the most-upstream sub-basin, where the models were mainly characterized by a negative seasonal bias in SON (of up to -50%), most of them had a positive bias in all seasons (of up to +260%) in the other three sub-basins. The models, however, captured the monthly variability of T well at all sites, with small inter-model variabilities and a relatively similar range of bias (-7 °C to +5 °C) across all seasons. The Mann-Kendall test was applied to the annual P and T time-series, where the majority of the models
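
    The Mann-Kendall test mentioned at the end is simple to script. A self-contained sketch of the standard statistic (without a tie correction) on an invented annual series follows; it is not the authors' analysis code.

        import numpy as np
        from math import erf, sqrt

        def mann_kendall(series):
            """Mann-Kendall S statistic, normal-approximation Z, and two-sided p (no ties)."""
            x = np.asarray(series, dtype=float)
            n = len(x)
            s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = 0.0 if s == 0 else (s - np.sign(s)) / sqrt(var_s)   # continuity correction
            p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))         # two-sided p-value
            return s, z, p

        # Illustrative annual precipitation anomalies (not CMIP5 output):
        print(mann_kendall([1.2, 0.8, 1.5, 0.9, 1.7, 2.0, 1.4, 2.3]))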

  15. A fuzzy-logic-based approach to qualitative safety modelling for marine systems

    International Nuclear Information System (INIS)

    Sii, H.S.; Ruxton, Tom; Wang Jin

    2001-01-01

    Safety assessment based on conventional tools (e.g. probability risk assessment (PRA)) may not be well suited for dealing with systems having a high level of uncertainty, particularly in the feasibility and concept design stages of a maritime or offshore system. By contrast, a safety model using fuzzy logic approach employing fuzzy IF-THEN rules can model the qualitative aspects of human knowledge and reasoning processes without employing precise quantitative analyses. A fuzzy-logic-based approach may be more appropriately used to carry out risk analysis in the initial design stages. This provides a tool for working directly with the linguistic terms commonly used in carrying out safety assessment. This research focuses on the development and representation of linguistic variables to model risk levels subjectively. These variables are then quantified using fuzzy sets. In this paper, the development of a safety model using fuzzy logic approach for modelling various design variables for maritime and offshore safety based decision making in the concept design stage is presented. An example is used to illustrate the proposed approach
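
    A small sketch of the kind of building blocks such a model rests on follows: triangular membership functions for a linguistic variable and a min-based evaluation of one fuzzy IF-THEN rule. The variable names, scales and break points are purely illustrative and are not taken from the paper.

        def triangular(x, a, b, c):
            """Membership of x in a triangular fuzzy set with feet a, c and peak b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Hypothetical linguistic terms for "failure likelihood" on a 0-10 scale.
        low = lambda x: triangular(x, -1, 0, 4)
        moderate = lambda x: triangular(x, 2, 5, 8)
        high = lambda x: triangular(x, 6, 10, 11)

        # Rule: IF likelihood is high AND consequence severity is high THEN risk is high.
        likelihood, severity = 7.5, 8.0
        rule_strength = min(high(likelihood), high(severity))   # fuzzy AND via min
        print(f"degree to which 'risk is high' fires: {rule_strength:.2f}")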

  16. Spring-block Model for Barkhausen Noise

    International Nuclear Information System (INIS)

    Kovacs, K.; Brechet, Y.; Neda, Z.

    2005-01-01

    A simple mechanical spring-block model is used for studying Barkhausen noise (BN). The model incorporates the generally accepted physics of domain wall movement and pinning. Computer simulations on this model reproduce the main features of the hysteresis loop and Barkhausen jumps. The statistics of the obtained Barkhausen jumps follows several scaling laws, in qualitative agreement with experimental results. The model consists of a one-dimensional frictional spring-block system. The blocks model the Bloch walls that separate inversely oriented magnetic domains, and springs correspond to the magnetized regions. Three types of realistic forces are modelled with this system: 1. the force resulting from the magnetic energy of the neighboring domains in external magnetic field (modelled by forces having alternating orientations and acting directly on the blocks); 2. the force resulting from the magnetic self-energy of each domain (modelled by the elastic forces of the springs); 3. the pinning forces acting on the domain walls (modelled by position dependent static friction acting on blocks). The dynamics of the system is governed by searching for equilibrium: one particular domain wall can jump to the next pinning center if the resultant of forces 1. and 2. is greater than the pinning force. The external magnetic field is successively increased (or decreased) and the system is relaxed to mechanical equilibrium. During the simulations we are monitoring the variation of the magnetization, focusing on the shape of the hysteresis loop, power spectrum, jump size (avalanche size) distribution, signal duration distribution, and signal area distribution. The simulated shape of the hysteresis loops fulfills all the requirements for real magnetization phenomena. The power spectrum indicates different behavior in the low (1/f noise) and high (white noise) frequency region. All the relevant distribution functions show scaling behavior over several decades of magnitude with a naturally

  17. Fault diagnosis of air conditioning systems based on qualitative bond graph

    International Nuclear Information System (INIS)

    Ghiaus, C.

    1999-01-01

    The bond graph method represents a unified approach for modeling engineering systems. The main idea is that power transfer bonds the components of a system. The bond graph model is the same for both quantitative representation, in which parameters have numerical values, and qualitative approach, in which they are classified qualitatively. To infer the cause of faults using a qualitative method, a system of qualitative equations must be solved. However, the characteristics of qualitative operators require specific methods for solving systems of equations having qualitative variables. This paper proposes both a method for recursively solving the qualitative system of equations derived from bond graph, and a bond graph model of a direct-expansion, mechanical vapor-compression air conditioning system. Results from diagnosing two faults in a real air conditioning system are presented and discussed. Occasionally, more than one fault candidate is inferred for the same set of qualitative values derived from measurements. In these cases, additional information is required to localize the fault. Fault diagnosis is initiated by a fault detection mechanism which also classifies the quantitative measurements into qualitative values; the fault detection is not presented here. (author)

  18. [Qualitative Determination of Organic Vapour Using Violet and Visible Spectrum].

    Science.gov (United States)

    Jiang, Bo; Hu, Wen-zhong; Liu, Chang-jian; Zheng, Wei; Qi, Xiao-hui; Jiang, Ai-li; Wang, Yan-ying

    2015-12-01

    Vapours of organic matter were determined qualitatively by ultraviolet-visible absorption spectroscopy. The vapours were measured with an ultraviolet-visible spectrophotometer using polyethylene film as the medium, and the ultraviolet and visible absorption spectra of the vegetable oil vapours of soybean oil, sunflower seed oil, peanut oil, rapeseed oil, sesame oil, cotton seed oil and tung tree seed oil, and of the organic compound vapours of acetone, ethyl acetate, 95% ethanol and glacial acetic acid were obtained. Experimental results showed that spectra of the vegetable oil vapours and the organic compound vapours could be obtained well, since the ultraviolet and visible spectrum of the polyethylene film itself could be subtracted by zeroing the spectrograph. Different kinds of vegetable oils could be distinguished well in the spectra, since the λ(max), λ(min), number of absorption peaks, peak positions and inflection points in the ultraviolet and visible spectra obtained from the vegetable oil vapours all differed, and the vapours of the organic compounds were also well determined. The method had good reproducibility: the ultraviolet and visible absorption spectra of sunflower seed oil vapour in 10 repeated determinations were identical. The experimental results indicated that polyethylene film as a medium can be used for qualitative ultraviolet and visible absorption spectroscopy. The method for determination of the vapours of vegetable oils and organic compounds has the advantages of fast analysis, good reproducibility, accuracy, reliability and low cost. The ultraviolet and visible absorption spectrum of an organic vapour can provide characteristic information on the material vapour and structural information on the organic compound, and can serve as a novel test method for identifying vapours of compounds and organic matter.

  19. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    Science.gov (United States)

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
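
    The accuracy metric described above is a voxel-wise absolute relative change between attenuation-corrected PET volumes. A minimal sketch of that comparison on invented arrays follows; the volumes, dimensions and brain mask are hypothetical stand-ins for the reconstructed data.

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical attenuation-corrected PET volumes on the same grid, plus a mask.
        pet_ct = rng.uniform(1000, 5000, size=(64, 64, 32))          # scaled-CT-based AC
        pet_mr = pet_ct * rng.normal(1.0, 0.02, size=pet_ct.shape)   # MR-based AC
        mask = pet_ct > 0                                            # all voxels in this toy case

        # Voxel-wise absolute relative change, in percent, summarized over the mask.
        rc = 100.0 * np.abs(pet_mr - pet_ct) / pet_ct
        print(f"absolute RC: {rc[mask].mean():.2f} +/- {rc[mask].std(ddof=1):.2f} %")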

  20. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Kevin T. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts Institute of Technology, Division of Health Sciences and Technology, Cambridge, MA (United States); Izquierdo-Garcia, David; Catana, Ciprian [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Poynton, Clare B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts General Hospital, Department of Psychiatry, Boston, MA (United States); University of California, San Francisco, Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Chonde, Daniel B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Harvard University, Program in Biophysics, Cambridge, MA (United States)

    2017-03-15

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps (''μ-maps'') were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map (''PAC-map'') generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach. (orig.)

  1. 3D-modeling of the spine using EOS imaging system: Inter-reader reproducibility and reliability.

    Directory of Open Access Journals (Sweden)

    Johannes Rehm

    To retrospectively assess the interreader reproducibility and reliability of EOS 3D full spine reconstructions in patients with adolescent idiopathic scoliosis (AIS). 73 patients with a mean age of 17 years and moderate AIS (median Cobb angle 18.2°) obtained low-dose standing biplanar radiographs with EOS. Two independent readers performed "full spine" 3D reconstructions of the spine with the "full-spine" method, adjusting the bone contour of every thoracic and lumbar vertebra (Th1-L5). Interreader reproducibility was assessed regarding rotation of every single vertebra in the coronal (i.e. frontal), sagittal (i.e. lateral) and axial planes, T1/T12 kyphosis, T4/T12 kyphosis, L1/L5 lordosis, L1/S1 lordosis and pelvic parameters. Radiation exposure, scan time and 3D reconstruction time were recorded. Interclass correlation (ICC) ranged between 0.83 and 0.98 for frontal vertebral rotation, between 0.94 and 0.99 for lateral vertebral rotation and between 0.51 and 0.88 for axial vertebral rotation. ICC was 0.92 for T1/T12 kyphosis, 0.95 for T4/T12 kyphosis, 0.90 for L1/L5 lordosis, 0.85 for L1/S1 lordosis, 0.97 for pelvic incidence, 0.96 for sacral slope, 0.98 for sagittal pelvic tilt and 0.94 for lateral pelvic tilt. The mean time for reconstruction was 14.9 minutes (reader 1: 14.6 minutes, reader 2: 15.2 minutes; p<0.0001). The mean total absorbed dose was 593.4 μGy ± 212.3 per patient. EOS "full spine" 3D angle measurement of vertebral rotation proved to be reliable and was performed in an acceptable reconstruction time. Interreader reproducibility of axial rotation was limited to some degree in the upper and middle thoracic spine due to the obtuse angulation of the pedicles and the processi spinosi in the frontal view, somewhat complicating their delineation.
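
    The ICC values reported above are of the kind computed as two-way intraclass correlation coefficients. A self-contained sketch of ICC(2,1) (two-way random effects, absolute agreement, single measurement, after Shrout and Fleiss) on an invented subjects-by-readers matrix follows.

        import numpy as np

        def icc_2_1(y):
            """ICC(2,1) from an n_subjects x k_raters matrix of measurements."""
            y = np.asarray(y, dtype=float)
            n, k = y.shape
            grand = y.mean()
            ss_total = ((y - grand) ** 2).sum()
            ss_rows = k * ((y.mean(axis=1) - grand) ** 2).sum()   # between subjects
            ss_cols = n * ((y.mean(axis=0) - grand) ** 2).sum()   # between raters
            ms_rows = ss_rows / (n - 1)
            ms_cols = ss_cols / (k - 1)
            ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (
                ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

        # Hypothetical T4/T12 kyphosis angles (degrees), two readers, five patients.
        angles = [[22.1, 23.0], [35.4, 34.8], [18.9, 19.5], [27.3, 26.9], [31.0, 30.2]]
        print(f"ICC(2,1) = {icc_2_1(angles):.2f}")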

  2. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    Science.gov (United States)

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.
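
    Kappa statistics of the kind used above quantify chance-corrected agreement between observers. A minimal sketch with scikit-learn on invented presence/absence ratings of one histological criterion follows.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical ratings (1 = criterion present, 0 = absent) by two pathologists
        # examining the same 15 melanocytic lesions.
        observer_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1]
        observer_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 0, 0]

        kappa = cohen_kappa_score(observer_a, observer_b)
        print(f"Cohen's kappa = {kappa:.2f}")   # about 0.60 for these example ratings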

  3. Reproducible and controllable induction voltage adder for scaled beam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko [Department of Energy Sciences, Tokyo Institute of Technology, 4259 Nagatsuta, Midori-ku, Yokohama 226-8502 (Japan)

    2016-08-15

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  4. Qualitative Validation of the IMM Model for ISS and STS Programs

    Science.gov (United States)

    Kerstman, E.; Walton, M.; Reyes, D.; Boley, L.; Saile, L.; Young, M.; Arellano, J.; Garcia, Y.; Myers, J. G.

    2016-01-01

    To validate and further improve the Integrated Medical Model (IMM), medical event data were obtained from 32 ISS and 122 STS person-missions. Using the crew characteristics from these observed missions, IMM v4.0 was used to forecast medical events and medical resource utilization. The IMM medical condition incidence values were compared to the actual observed medical event incidence values, and the IMM forecasted medical resource utilization was compared to actual observed medical resource utilization. Qualitative comparisons of these parameters were conducted for both the ISS and STS programs. The results of these analyses will provide validation of IMM v4.0 and reveal areas of the model requiring adjustments to improve the overall accuracy of IMM outputs. This validation effort should result in enhanced credibility of the IMM and improved confidence in the use of IMM as a decision support tool for human space flight.

  5. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer... The test procedure combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin irritant substance or product is included in each test as a positive control... high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated 2x, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility...

  6. Reproducibility of tumor uptake heterogeneity characterization through textural feature analysis in 18F-FDG PET.

    Science.gov (United States)

    Tixier, Florent; Hatt, Mathieu; Le Rest, Catherine Cheze; Le Pogam, Adrien; Corcos, Laurent; Visvikis, Dimitris

    2012-05-01

    (18)F-FDG PET measurement of standardized uptake value (SUV) is increasingly used for monitoring therapy response and predicting outcome. Alternative parameters computed through textural analysis were recently proposed to quantify the heterogeneity of tracer uptake by tumors as a significant predictor of response. The primary objective of this study was to evaluate the reproducibility of these heterogeneity measurements. Double baseline (18)F-FDG PET scans were acquired within 4 d of each other for 16 patients before any treatment was considered. A Bland-Altman analysis was performed on 8 parameters based on histogram measurements and 17 parameters based on textural heterogeneity features after discretization with values between 8 and 128. The reproducibility of maximum and mean SUV was similar to that in previously reported studies, with a mean percentage difference of 4.7% ± 19.5% and 5.5% ± 21.2%, respectively. By comparison, better reproducibility was measured for some textural features describing local heterogeneity of tracer uptake, such as entropy and homogeneity, with a mean percentage difference of -2% ± 5.4% and 1.8% ± 11.5%, respectively. Several regional heterogeneity parameters such as variability in the intensity and size of regions of homogeneous activity distribution had reproducibility similar to that of SUV measurements, with 95% confidence intervals of -22.5% to 3.1% and -1.1% to 23.5%, respectively. These parameters were largely insensitive to the discretization range. Several parameters derived from textural analysis describing heterogeneity of tracer uptake by tumors on local and regional scales had reproducibility similar to or better than that of simple SUV measurements. These reproducibility results suggest that these (18)F-FDG PET-derived parameters, which have already been shown to have predictive and prognostic value in certain cancer models, may be used to monitor therapy response and predict patient outcome.
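
    Local heterogeneity features such as homogeneity and entropy are typically computed from a grey-level co-occurrence matrix after discretizing tracer uptake. A small sketch with scikit-image on an invented uptake patch, discretized to 8 levels, follows; the function names are those of recent scikit-image releases, and entropy is computed by hand because it is not among the built-in co-occurrence properties.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(3)
        # Hypothetical tumour uptake patch already resampled to 8 grey levels (0-7).
        patch = rng.integers(0, 8, size=(32, 32)).astype(np.uint8)

        # Co-occurrence matrix for one offset/direction, normalized to probabilities.
        glcm = graycomatrix(patch, distances=[1], angles=[0],
                            levels=8, symmetric=True, normed=True)

        homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
        p = glcm[:, :, 0, 0]
        entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
        print(f"homogeneity = {homogeneity:.3f}, entropy = {entropy:.2f}")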

  7. Qualitative phase space reconstruction analysis of supply-chain inventory time series

    Directory of Open Access Journals (Sweden)

    Jinliang Wu

    2010-10-01

    Economic systems are usually too complex to be analysed, but some advanced methods have been developed in order to do so, such as system dynamics modelling, multi-agent modelling, complex adaptive system modelling and qualitative modelling. In this paper, we considered a supply-chain (SC) system including several kinds of products. Using historical suppliers' demand data, we first applied the phase space analysis method and then used qualitative analysis to improve the complex system's performance. Quantitative methods can forecast the quantitative SC demands, but they cannot indicate the qualitative aspects of the SC, so when we apply quantitative methods to a SC system we obtain only large amounts of demand data. By contrast, qualitative methods can show the qualitative change and trend of the SC demand. We therefore used qualitative methods to improve the quantitative forecasting results. Comparing the quantitative-only method and the combined method used in this paper, we found that the combined method is far more accurate. Not only is the inventory cost lower, but the forecasting accuracy is also better.
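
    Phase space reconstruction of a demand series is usually done by time-delay embedding. The sketch below shows that step with an assumed delay and embedding dimension; the demand values are invented and the parameter choices are illustrative, not those of the paper.

        import numpy as np

        def delay_embed(series, dim, tau):
            """Time-delay embedding: rows are [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
            x = np.asarray(series, dtype=float)
            n_points = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n_points] for i in range(dim)])

        # Hypothetical weekly demand for one product.
        demand = [120, 135, 128, 150, 160, 142, 155, 170, 165, 158, 172, 180]
        embedded = delay_embed(demand, dim=3, tau=2)
        print(embedded.shape)   # (8, 3): eight reconstructed phase-space points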

  8. Qualitative approaches to use of the RE-AIM framework: rationale and methods.

    Science.gov (United States)

    Holtrop, Jodi Summers; Rabin, Borsika A; Glasgow, Russell E

    2018-03-13

    There have been over 430 publications using the RE-AIM model for planning and evaluation of health programs and policies, as well as numerous applications of the model in grant proposals and national programs. Full use of the model includes use of qualitative methods to understand why and how results were obtained on different RE-AIM dimensions; however, recent reviews have revealed that qualitative methods have been used infrequently. Having quantitative and qualitative methods and results iteratively inform each other should enhance understanding and lessons learned. Because there have been few published examples of qualitative approaches and methods using RE-AIM for planning or assessment and no guidance on how qualitative approaches can inform these processes, we provide guidance on qualitative methods to address the RE-AIM model and its various dimensions. The intended audience is researchers interested in applying RE-AIM or similar implementation models, but the methods discussed should also be relevant to those in community or clinical settings. We present directions for, examples of, and guidance on how qualitative methods can be used to address each of the five RE-AIM dimensions. Formative qualitative methods can be helpful in planning interventions and designing for dissemination. Summative qualitative methods are useful when used in an iterative, mixed methods approach for understanding how and why different patterns of results occur. In summary, qualitative and mixed methods approaches to RE-AIM help understand complex situations and results, why and how outcomes were obtained, and contextual factors not easily assessed using quantitative measures.

  9. Can CFMIP2 models reproduce the leading modes of cloud vertical structure in the CALIPSO-GOCCP observations?

    Science.gov (United States)

    Wang, Fang; Yang, Song

    2018-02-01

    Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP), i.e. tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds in middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclonic clouds and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models of phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by CFMIP2 models. The effects of CVS modes on relative cloud radiative forcing (RSCRF/RLCRF) (RSCRF being calculated at the surface while RLCRF at the top of the atmosphere) are studied in terms of the principal component regression method. Results show that CFMIP2 models tend to overestimate (underestimate or simulate with the opposite sign) the RSCRF/RLCRF radiative effects (REs) of ECCM (THCM and SACM) per unit global mean OPC compared to observations. These RE biases may be attributed to two factors, one of which is underestimation (overestimation) of low/middle clouds (high clouds) (i.e. stronger (weaker) REs per unit low/middle (high) cloud) in simulated global mean cloud profiles, the other being eigenvector biases in CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to improvement of CVS, especially cloud parameterization associated with particular physical processes (e.g. downwelling regimes with the Hadley circulation, extratropical storm tracks and others), which

  10. Efficient and reproducible identification of mismatch repair deficient colon cancer

    DEFF Research Database (Denmark)

    Joost, Patrick; Bendahl, Pär-Ola; Halvarsson, Britta

    2013-01-01

    BACKGROUND: The identification of mismatch-repair (MMR) defective colon cancer is clinically relevant for diagnostic, prognostic and potentially also for treatment predictive purposes. Preselection of tumors for MMR analysis can be obtained with predictive models, which need to demonstrate ease...... of application and favorable reproducibility. METHODS: We validated the MMR index for the identification of prognostically favorable MMR deficient colon cancers and compared performance to 5 other prediction models. In total, 474 colon cancers diagnosed ≥ age 50 were evaluated with correlation between...... clinicopathologic variables and immunohistochemical MMR protein expression. RESULTS: Female sex, age ≥60 years, proximal tumor location, expanding growth pattern, lack of dirty necrosis, mucinous differentiation and presence of tumor-infiltrating lymphocytes significantly correlated with MMR deficiency. Presence...

  11. Teaching qualitative research as a means of socialization to nursing.

    Science.gov (United States)

    Arieli, Daniella; Tamir, Batya; Man, Michal

    2015-06-01

    The aim of the present article is to present a model for teaching qualitative research as part of nursing education. The uniqueness of the course model is that it seeks to combine two objectives: (1) initial familiarization of the students with the clinical-nursing environment and the role of the nurse; and (2) understanding the qualitative research approach and inculcation of basic qualitative research skills. The article describes how teaching two central genres in qualitative research - ethnographic and narrative research - constitutes a way of teaching the important skills, concepts, and values of the nursing profession. The article presents the model's structure, details its principal stages, and explains the rationale of each stage. It also presents the central findings of an evaluation of the model's implementation in eight groups over a two-year period. In this way the article seeks to contribute to nursing education literature in general, and to those engaged in clinical training and teaching qualitative research in nursing education in particular. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. 
Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel|info:eu-repo/dai/nl/407629971; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  13. Reproducibility in the analysis of multigated radionuclide studies of left ventricular ejection fraction

    International Nuclear Information System (INIS)

    Gjorup, T.; Kelbaek, H.; Vestergaard, B.; Fogh, J.; Munck, O.; Jensen, A.M.

    1989-01-01

    The authors determined the reproducibility (the standard deviation [SD]) in the analysis of multigated radionuclide studies of left ventricular ejection fraction (LVEF). Radionuclide studies from a consecutive series of 38 patients suspected of ischemic heart disease were analyzed independently by four nuclear medicine physicians and four laboratory technicians. Each study was analyzed three times by each of the observers. Based on the analyses of the eight observers, the SD could be estimated by the use of a variance component model for LVEF determinations calculated as the average of the analyses of an arbitrary number of observers making an arbitrary number of analyses. This study presents the SDs for LVEF determinations based on the analyses of one to five observers making one to five analyses each. The SD of a LVEF determination decreased from 3.96% to 2.98% when an observer increased his number of analyses from one to five. A more pronounced decrease in the SD, from 3.96% to 1.77%, was obtained when the LVEF determinations were based on the average of a single analysis made by each of one to five observers. However, when dealing with the difference between LVEF determinations from two studies, the highest reproducibility was obtained if the LVEF determinations at both studies were based on the analyses made by the same observer. No significant difference was found in the reproducibility of analyses made by nuclear medicine physicians and laboratory technicians. Our study revealed that to increase the reproducibility of LVEF determinations, special efforts should be made to standardize the outlining of the end-systolic region of interest

  14. Simulation of antiproton-nucleus interactions in the framework of the UrQMD model

    International Nuclear Information System (INIS)

    Galoyan, A.S.; Polanski, A.

    2003-01-01

    This paper proposes to apply the Ultra-Relativistic Quantum Molecular Dynamics (UrQMD) approach to implement the PANDA project (GSI, Germany). Simulation of antiproton-nucleus (p̄A) interactions has been performed at antiproton energies from 1 to 200 GeV by using the UrQMD model. We have studied average multiplicities, multiplicity distributions of various types of secondary particles, correlations between the multiplicities, and rapidity and transverse momentum distributions of the particles. The UrQMD model predictions for inelastic p̄A collisions have been found to reproduce the experimental data qualitatively. However, to reach quantitative agreement, especially in the fragmentation regions, the UrQMD model needs to be modified

  15. A Microstructure-Based Model to Characterize Micromechanical Parameters Controlling Compressive and Tensile Failure in Crystallized Rock

    Science.gov (United States)

    Kazerani, T.; Zhao, J.

    2014-03-01

    A discrete element model is proposed to examine rock strength and failure. The model is implemented by UDEC which is developed for this purpose. The material is represented as a collection of irregular-sized deformable particles interacting at their cohesive boundaries. The interface between two adjacent particles is viewed as a flexible contact whose stress-displacement law is assumed to control the material fracture and fragmentation process. To reproduce rock anisotropy, an innovative orthotropic cohesive law is developed for contact which allows the interfacial shear and tensile behaviours to be different from each other. The model is applied to a crystallized igneous rock and the individual and interactional effects of the microstructural parameters on the material compressive and tensile failure response are examined. A new methodical calibration process is also established. It is shown that the model successfully reproduces the rock mechanical behaviour quantitatively and qualitatively. Ultimately, the model is used to understand how and under what circumstances micro-tensile and micro-shear cracking mechanisms control the material failure at different loading paths.

  16. Reproducibility problems of in-service ultrasonic testing results

    International Nuclear Information System (INIS)

    Honcu, E.

    1974-01-01

    The reproducibility of the results of ultrasonic testing is the basic precondition for its successful application in in-service inspection of changes in the quality of components of nuclear power installations. The results of periodic ultrasonic inspections are not satisfactory from the point of view of reproducibility. Regardless, the ultrasonic pulse-type method is suitable for evaluating the quality of most components of nuclear installations and often the sole method which may be recommended for inspection with regard to its technical and economic aspects. (J.B.)

  17. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.

    Science.gov (United States)

    Añel, Juan A.

    2017-03-01

    Nowadays, the majority of the scientific community is not aware of the risks and problems associated with an inadequate use of computer systems for research, mostly for reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright laws can have undesirable effects on access to aspects of great importance of the design of experiments and therefore to the interpretation of results. Plain Language Summary: This article highlights several important issues to ensure the scientific reproducibility of results within the current scientific framework, going beyond simple documentation. Several specific examples are discussed in the field of hydrological modeling.

  18. Numerical estimation of wall friction ratio near the pseudo-critical point with CFD-models

    International Nuclear Information System (INIS)

    Angelucci, M.; Ambrosini, W.; Forgione, N.

    2013-01-01

    In this paper, the STAR-CCM+ CFD code is used in an attempt to reproduce the values of friction factor observed in experimental data at supercritical pressures at various operating conditions. A short survey of available data and correlations for smooth pipe friction in circular pipes lays the basis for the discussion, reporting observed trends of friction factor in the liquid-like and the gas-like regions and within the transitional region across the pseudo-critical temperature. For smooth pipes, a general decrease of the friction factor in the transitional region is reported, constituting one of the relevant effects to be predicted by the computational fluid-dynamic models. A limited number of low-Reynolds number models are adopted, making use of refined near-wall discretisation as required by the constraint y⁺ < 1 at the wall. In particular, the Lien k–ε and the SST k–ω models are considered. The values of the wall shear stress calculated by the code are then post-processed on the basis of bulk fluid properties to obtain the Fanning and then the Darcy–Weisbach friction factors, based on their classical definitions. The obtained values are compared with those provided by experimental tests and correlations, finding a reasonable qualitative agreement. Expectedly, the agreement is better in the gas-like and liquid-like regions, where fluid property changes are moderate, than in the transitional region, where the trends provided by available correlations are reproduced only in a qualitative way.
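
    The post-processing step described above follows the classical definitions: the Fanning factor is the wall shear stress normalised by the bulk dynamic pressure, and the Darcy–Weisbach factor is four times the Fanning value. A minimal sketch of that conversion is given below; the numerical values are only illustrative, not data from the study.

        # Post-processing a CFD wall shear stress into friction factors using bulk
        # properties (classical definitions); the numbers below are only illustrative.
        def fanning_friction_factor(tau_wall, rho_bulk, u_bulk):
            """f_F = tau_w / (0.5 * rho * u^2)"""
            return tau_wall / (0.5 * rho_bulk * u_bulk**2)

        def darcy_friction_factor(tau_wall, rho_bulk, u_bulk):
            """f_D = 4 * f_F (Darcy-Weisbach factor)"""
            return 4.0 * fanning_friction_factor(tau_wall, rho_bulk, u_bulk)

        tau_w = 5.0       # Pa, wall shear stress reported by the CFD code (example value)
        rho = 700.0       # kg/m^3, bulk density at the local bulk temperature (example value)
        u = 1.2           # m/s, bulk velocity (example value)
        print(f"Fanning f = {fanning_friction_factor(tau_w, rho, u):.5f}")
        print(f"Darcy   f = {darcy_friction_factor(tau_w, rho, u):.5f}")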

  19. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  20. Mate-finding as an overlooked critical determinant of dispersal variation in sexually-reproducing animals.

    Science.gov (United States)

    Gilroy, James J; Lockwood, Julie L

    2012-01-01

    Dispersal is a critically important process in ecology, but robust predictive models of animal dispersal remain elusive. We identify a potentially ubiquitous component of variation in animal dispersal that has been largely overlooked until now: the influence of mate encounters on settlement probability. We use an individual-based model to simulate dispersal in sexually-reproducing organisms that follow a simple set of movement rules based on conspecific encounters, within an environment lacking spatial habitat heterogeneity. We show that dispersal distances vary dramatically with fluctuations in population density in such a model, even in the absence of variation in dispersive traits between individuals. In a simple random-walk model with promiscuous mating, dispersal distributions become increasingly 'fat-tailed' at low population densities due to the increasing scarcity of mates. Similar variation arises in models incorporating territoriality. In a model with polygynous mating, we show that patterns of sex-biased dispersal can even be reversed across a gradient of population density, despite underlying dispersal mechanisms remaining unchanged. We show that some widespread dispersal patterns found in nature (e.g. fat tailed distributions) can arise as a result of demographic variability in the absence of heterogeneity in dispersive traits across the population. This implies that models in which individual dispersal distances are considered to be fixed traits might be unrealistic, as dispersal distances vary widely under a single dispersal mechanism when settlement is influenced by mate encounters. Mechanistic models offer a promising means of advancing our understanding of dispersal in sexually-reproducing organisms.
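
    A minimal illustration of the mechanism described above is a random walker that settles only once a mate is encountered, so that dispersal distance grows as mate density falls. The sketch below makes that point with made-up parameters; it is not the authors' model specification.

        # Random-walk disperser that settles only after encountering a mate; mean
        # dispersal distance grows as mate density falls. Purely illustrative.
        import random, math

        def disperse(mate_density, encounter_radius=1.0, step=1.0, max_steps=10_000):
            x = y = 0.0
            for _ in range(max_steps):
                # probability of a mate within the encounter radius on this step
                p_encounter = 1.0 - math.exp(-mate_density * math.pi * encounter_radius**2)
                if random.random() < p_encounter:
                    break                      # mate found: settle here
                angle = random.uniform(0.0, 2.0 * math.pi)
                x += step * math.cos(angle)
                y += step * math.sin(angle)
            return math.hypot(x, y)            # net dispersal distance

        for density in (0.5, 0.05, 0.005):     # mates per unit area
            mean_d = sum(disperse(density) for _ in range(500)) / 500
            print(f"density {density}: mean dispersal distance ~ {mean_d:.1f}")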

  1. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  2. Using prediction markets to estimate the reproducibility of scientific research

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988

  3. Using prediction markets to estimate the reproducibility of scientific research.

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A; Johannesson, Magnus

    2015-12-15

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants' individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a "statistically significant" finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications.
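
    The argument that a low prior probability (median about 9% in these studies) leaves substantial doubt even after one significant result can be illustrated with a simple Bayes update; the assumed power and significance level below are illustrative, not values taken from the paper.

        # Bayes update illustrating why a ~9% prior leaves doubt after one
        # "significant" finding. Power 0.8 and alpha 0.05 are assumptions.
        def posterior_true_given_significant(prior, power=0.8, alpha=0.05):
            p_sig = prior * power + (1.0 - prior) * alpha
            return prior * power / p_sig

        p1 = posterior_true_given_significant(0.09)
        p2 = posterior_true_given_significant(p1)   # a second, well-powered replication
        print(f"after the original significant result: {p1:.2f}")
        print(f"after a significant replication:       {p2:.2f}")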

  4. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale composting reactors (six 4-l vessels) were used, with monitoring of O2 consumption and CO2 emissions and characterisation of the biochemical evolution of organic matter. Good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.

  5. Reproducibility of biomarkers in induced sputum and in serum from chronic smokers.

    Science.gov (United States)

    Zuiker, Rob G J A; Kamerling, Ingrid M C; Morelli, Nicoletta; Calderon, Cesar; Boot, J Diderik; de Kam, Marieke; Diamant, Zuzana; Burggraaf, Jacobus; Cohen, Adam F

    2015-08-01

    Soluble inflammatory markers obtained from non-invasive airway sampling such as induced sputum may be useful biomarkers for targeted pharmaceutical interventions. However, before these soluble markers can be used as potential targets, their variability and reproducibility need to be established in distinct study populations. This study aimed to assess the reproducibility of biomarkers obtained from induced sputum and serum in chronic smokers and non-smokers. Sputum and serum samples were obtained from 16 healthy non-smokers and 16 asymptomatic chronic smokers (for both groups: 8M/8F, 30-52 years, FEV1 ≥80% pred.; ≥10 pack years for the smokers) on 2 separate visits 4-10 days apart. Soluble markers in serum and sputum were analysed by ELISA. The differences between smokers vs non-smokers were analysed with a t-test and variability was assessed on log-transformed data by a mixed model ANOVA. Analysable sputum samples could be obtained from all 32 subjects. In both study populations neutrophils and macrophages were the predominant cell types. Serum Pulmonary Surfactant Associated Protein D had favourable reproducibility criteria for reliability ratio (0.99), intra-subject coefficient of variation (11.2%) and the Bland-Altman limits of agreement. Furthermore, chronic smokers, compared to non-smokers, had significantly higher sputum concentrations of IL-8 (1094.6 pg/mL vs 460.8 pg/mL, p = 0.006), higher serum concentrations of Pulmonary Surfactant Associated Protein D (110.9 pg/mL vs 64.7 pg/mL, p = 0.019), and lower concentrations of Serum Amyloid A (1352.4 pg/mL vs 2297.5 pg/mL, p = 0.022). Serum Pulmonary Surfactant Associated Protein D proved to be a biomarker that fulfilled the criteria for reproducibility in both study groups. Copyright © 2015 Elsevier Ltd. All rights reserved.
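
    The reproducibility metrics referred to above (Bland-Altman limits of agreement and the intra-subject coefficient of variation on log-transformed data) can be computed for a paired-visit biomarker as in the sketch below; the values are invented for illustration.

        # Bland-Altman limits of agreement and intra-subject CV (on log-transformed
        # data) for a biomarker measured on two visits. The data are made up.
        import numpy as np

        visit1 = np.array([105.0, 98.0, 120.0, 87.0, 110.0, 95.0])
        visit2 = np.array([112.0, 95.0, 115.0, 90.0, 118.0, 92.0])

        diff = visit1 - visit2
        bias = diff.mean()
        loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
        print(f"Bland-Altman bias {bias:.1f}, limits of agreement {loa[0]:.1f} to {loa[1]:.1f}")

        # intra-subject CV from the within-subject variance of log-transformed values
        log_vals = np.log(np.vstack([visit1, visit2]))
        within_var = ((log_vals[0] - log_vals[1]) ** 2 / 2.0).mean()
        cv = np.sqrt(np.exp(within_var) - 1.0) * 100.0
        print(f"intra-subject CV ~ {cv:.1f}%")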

  6. Investigation of the Intra- and Interlaboratory Reproducibility of a Small Scale Standardized Supersaturation and Precipitation Method

    DEFF Research Database (Denmark)

    Plum, Jakob; Madsen, Cecilie M; Teleki, Alexandra

    2017-01-01

    … compound available for absorption. However, due to the stochastic nature of nucleation, supersaturating drug delivery systems may lead to inter- and intrapersonal variability. The ability to define a feasible range with respect to the supersaturation level is a crucial factor for a successful formulation … A reproducibility study of felodipine was conducted, after which seven partners contributed with data for three model compounds; aprepitant, felodipine, and fenofibrate, to determine the interlaboratory reproducibility of the SSPM. The first part of the SSPM determines the apparent degrees of supersaturation (a… … order for the three model compounds using the SSPM (aprepitant > felodipine ≈ fenofibrate). The α-value is dependent on the experimental setup and can be used as a parameter to evaluate the uniformity of the data set. This study indicated that the SSPM was able to obtain the same rank order of the β…

  7. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method which evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which can evaluate esophageal function in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first one, the mean coefficients of variation of esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, this method also showed good reproducibility based on the results obtained on the 2 separate days. (author)

  8. An Evaluation Model of Quantitative and Qualitative Fuzzy Multi-Criteria Decision-Making Approach for Location Selection of Transshipment Ports

    Directory of Open Access Journals (Sweden)

    Ji-Feng Ding

    2013-01-01

    Full Text Available The role of container logistics centres as home bases for merchandise transportation has become increasingly important. Container carriers need to select a suitable transshipment port location to meet the requirements of container shipping logistics. In the light of this, the main purpose of this paper is to develop a fuzzy multi-criteria decision-making (MCDM) model to evaluate the best selection of transshipment ports for container carriers. First, some concepts and methods used to develop the proposed model are briefly introduced. The performance values of quantitative and qualitative subcriteria are discussed to evaluate the fuzzy ratings. Then, the ideal and anti-ideal concepts and the modified distance measure method are used in the proposed model. Finally, a step-by-step example is illustrated to study the computational process of the quantitative and qualitative fuzzy MCDM model. The proposed approach has successfully accomplished our goal. In addition, the proposed fuzzy MCDM model can be empirically employed to select the best location of transshipment port for container carriers in future studies.
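
    The ideal and anti-ideal concepts combined with a distance measure amount to a TOPSIS-style closeness coefficient once the fuzzy ratings have been defuzzified. The sketch below illustrates that step only; the ports, criteria, weights and scores are hypothetical and do not come from the paper.

        # Ideal / anti-ideal distance sketch (TOPSIS-style closeness coefficient) on
        # defuzzified ratings. Ports, criteria, weights and scores are hypothetical.
        import numpy as np

        ports = ["Port A", "Port B", "Port C"]
        # rows: ports, cols: three illustrative benefit-type criteria
        scores = np.array([[0.7, 0.8, 0.6],
                           [0.9, 0.6, 0.7],
                           [0.6, 0.7, 0.9]])
        weights = np.array([0.5, 0.3, 0.2])

        weighted = scores * weights
        ideal = weighted.max(axis=0)        # ideal solution
        anti_ideal = weighted.min(axis=0)   # anti-ideal solution

        d_plus = np.linalg.norm(weighted - ideal, axis=1)
        d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
        closeness = d_minus / (d_plus + d_minus)

        for port, cc in sorted(zip(ports, closeness), key=lambda t: -t[1]):
            print(f"{port}: closeness coefficient {cc:.3f}")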

  9. Priority research directions in the area of qualitative methodology

    OpenAIRE

    Melnikova, Olga; Khoroshilov, Dmitry

    2010-01-01

    The basic directions of modern theoretical and practical research in the area of qualitative methodology in Russia are discussed in the article. The complexity of research is considered from three points of view: the development of methodology of qualitative analysis, qualitative methods, and verbal and nonverbal projective techniques. The authors present an integrative model of the qualitative analysis, the research on specificity of the use of discourse-analysis method and projective techni...

  10. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  11. Alpha-transfer reactions and the pairing-vibration model

    International Nuclear Information System (INIS)

    Betts, R.R.

    1977-01-01

    The pairing-vibration model with isospin is extended to include α-transfer reactions. Selection rules and expressions for transition strengths are derived and compared with experimental results for A = 40–66 nuclei. The selection rules are found to be followed quite well in the examples studied. The systematics of ground-state transition strengths are qualitatively quite well reproduced, although the quantitative agreement is poor. When the changing nature of the pairing quanta is incorporated using two-particle transfer data, the agreement becomes quantitatively good. Evidence is presented for clustering other than that due to pairing in ⁴⁰Ca and ⁴⁴Ti.

  12. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Full Text Available Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  13. Magnet stability and reproducibility

    CERN Document Server

    Marks, N

    2010-01-01

    Magnet stability and reproducibility have become increasingly important as greater precision and beams with smaller dimensions are required for research, medical and other purposes. The observed causes of mechanical and electrical instability are introduced and the engineering arrangements needed to minimize these problems are discussed; the resulting performance of a state-of-the-art synchrotron source (Diamond) is then presented. The need for orbit feedback to obtain the best possible beam stability is briefly introduced, omitting any details of the necessary technical equipment, which are outside the scope of the presentation.

  14. Model for self-polarization and motility of keratocyte fragments

    KAUST Repository

    Ziebert, F.; Swaminathan, S.; Aranson, I. S.

    2011-01-01

    Computational modelling of cell motility on substrates is a formidable challenge; regulatory pathways are intertwined and forces that influence cell motion are not fully quantified. Additional challenges arise from the need to describe a moving deformable cell boundary. Here, we present a simple mathematical model coupling cell shape dynamics, treated by the phase-field approach, to a vector field describing the mean orientation (polarization) of the actin filament network. The model successfully reproduces the primary phenomenology of cell motility: discontinuous onset of motion, diversity of cell shapes and shape oscillations. The results are in qualitative agreement with recent experiments on motility of keratocyte cells and cell fragments. The asymmetry of the shapes is captured to a large extent in this simple model, which may prove useful for the interpretation of experiments.

  15. Model for self-polarization and motility of keratocyte fragments

    KAUST Repository

    Ziebert, F.

    2011-10-19

    Computational modelling of cell motility on substrates is a formidable challenge; regulatory pathways are intertwined and forces that influence cell motion are not fully quantified. Additional challenges arise from the need to describe a moving deformable cell boundary. Here, we present a simple mathematical model coupling cell shape dynamics, treated by the phase-field approach, to a vector field describing the mean orientation (polarization) of the actin filament network. The model successfully reproduces the primary phenomenology of cell motility: discontinuous onset of motion, diversity of cell shapes and shape oscillations. The results are in qualitative agreement with recent experiments on motility of keratocyte cells and cell fragments. The asymmetry of the shapes is captured to a large extent in this simple model, which may prove useful for the interpretation of experiments.

  16. A rotating bag model for hadrons. 2

    International Nuclear Information System (INIS)

    Iwasaki, Masaharu

    1994-01-01

    The MIT bag model is modified in order to describe the rotational motion of hadrons. It has a kind of 'diatomic molecular' structure: the rotational excitation of the MIT bag is described by two polarized colored sub-bags which are connected with each other by the gluon flux. One sub-bag contains a quark and the other has an antiquark for mesons. For baryons, the latter sub-bag contains the remaining two quarks instead of the antiquark. The Regge trajectories of hadrons are explained qualitatively by our new model with the usual MIT bag parameters. In particular, the Regge slopes are reproduced fairly well. It is also pointed out that the gluon flux plays an important role in the rotational motion of hadrons. (author)
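
    The Regge trajectories referred to above are, in their standard linear form, a relation between hadron spin and squared mass; the slope value quoted below is the commonly cited empirical one, not a result of this paper.

        J(M^2) = \alpha(0) + \alpha' M^2, \qquad \alpha' \approx 0.9\ \mathrm{GeV}^{-2}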

  17. Clinical application of qualitative assessment for breast masses in shear-wave elastography

    International Nuclear Information System (INIS)

    Gweon, Hye Mi; Youk, Ji Hyun; Son, Eun Ju; Kim, Jeong-Ah

    2013-01-01

    Purpose: To evaluate the interobserver agreement and the diagnostic performance of various qualitative features in shear-wave elastography (SWE) for breast masses. Materials and methods: A total of 153 breast lesions in 152 women who underwent B-mode ultrasound and SWE before biopsy were included. Qualitative analysis in SWE was performed using two different classifications: E values (Ecol: 6-point color score; Ehomo: homogeneity score; Esha: shape score) and a four-color pattern classification. Two radiologists reviewed five data sets: B-mode ultrasound, SWE, and combination of both for E values and four-color pattern. The BI-RADS categories were assessed for the B-mode and combined sets. Interobserver agreement was assessed using weighted κ statistics. Areas under the receiver operating characteristic curve (AUC), sensitivity, and specificity were analyzed. Results: Interobserver agreement was substantial for Ecol (κ = 0.79), Ehomo (κ = 0.77) and four-color pattern (κ = 0.64), and moderate for Esha (κ = 0.56). Better-performing qualitative features were Ecol and four-color pattern (AUCs, 0.932 and 0.925) compared with Ehomo and Esha (AUCs, 0.857 and 0.864; P < 0.05). The diagnostic performance of B-mode ultrasound (AUC, 0.950) was not significantly different from combined sets with E value and with four-color pattern (AUCs, 0.962 and 0.954). When all qualitative values were negative, leading to a downgrade of the BI-RADS category, the specificity increased significantly from 16.5% to 56.1% (E value) and 57.0% (four-color pattern) (P < 0.001) without improvement in sensitivity. Conclusion: The qualitative SWE features were highly reproducible and showed good diagnostic performance in suspicious breast masses. Adding qualitative SWE to B-mode ultrasound increased specificity in decision making for biopsy recommendation.

  18. Clinical application of qualitative assessment for breast masses in shear-wave elastography

    Energy Technology Data Exchange (ETDEWEB)

    Gweon, Hye Mi; Youk, Ji Hyun, E-mail: jhyouk@yuhs.ac; Son, Eun Ju; Kim, Jeong-Ah

    2013-11-01

    Purpose: To evaluate the interobserver agreement and the diagnostic performance of various qualitative features in shear-wave elastography (SWE) for breast masses. Materials and methods: A total of 153 breast lesions in 152 women who underwent B-mode ultrasound and SWE before biopsy were included. Qualitative analysis in SWE was performed using two different classifications: E values (Ecol: 6-point color score; Ehomo: homogeneity score; Esha: shape score) and a four-color pattern classification. Two radiologists reviewed five data sets: B-mode ultrasound, SWE, and combination of both for E values and four-color pattern. The BI-RADS categories were assessed for the B-mode and combined sets. Interobserver agreement was assessed using weighted κ statistics. Areas under the receiver operating characteristic curve (AUC), sensitivity, and specificity were analyzed. Results: Interobserver agreement was substantial for Ecol (κ = 0.79), Ehomo (κ = 0.77) and four-color pattern (κ = 0.64), and moderate for Esha (κ = 0.56). Better-performing qualitative features were Ecol and four-color pattern (AUCs, 0.932 and 0.925) compared with Ehomo and Esha (AUCs, 0.857 and 0.864; P < 0.05). The diagnostic performance of B-mode ultrasound (AUC, 0.950) was not significantly different from combined sets with E value and with four-color pattern (AUCs, 0.962 and 0.954). When all qualitative values were negative, leading to a downgrade of the BI-RADS category, the specificity increased significantly from 16.5% to 56.1% (E value) and 57.0% (four-color pattern) (P < 0.001) without improvement in sensitivity. Conclusion: The qualitative SWE features were highly reproducible and showed good diagnostic performance in suspicious breast masses. Adding qualitative SWE to B-mode ultrasound increased specificity in decision making for biopsy recommendation.
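
    Interobserver agreement of ordinal scores such as those above is commonly summarised with a weighted κ; a minimal sketch using scikit-learn is shown below, with made-up reader scores.

        # Linearly / quadratically weighted kappa for two readers' ordinal scores.
        # The scores are made-up examples, not data from the study.
        from sklearn.metrics import cohen_kappa_score

        reader1 = [1, 2, 2, 3, 4, 4, 5, 3, 2, 1]   # e.g., 6-point color scores (Ecol)
        reader2 = [1, 2, 3, 3, 4, 5, 5, 3, 2, 2]

        print("linear weighted kappa:   ", round(cohen_kappa_score(reader1, reader2, weights="linear"), 2))
        print("quadratic weighted kappa:", round(cohen_kappa_score(reader1, reader2, weights="quadratic"), 2))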

  19. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining their credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of the research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed using quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary methods has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  20. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  1. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys fabricated by selective laser melting: Al-12Si, Cu-10Sn and 316L (face-centered cubic structure), and CoCrMo and commercially pure Ti (CP-Ti) (hexagonal close-packed structure). The room temperature tensile properties of the Al-12Si samples show good consistency in results within the experimental errors. Similarly reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  2. Religious views of the 'medical' rehabilitation model: a pilot qualitative study.

    Science.gov (United States)

    Yamey, Gavin; Greenwood, Richard

    2004-04-22

    To explore the religious beliefs that patients may bring to the rehabilitation process, and the hypothesis that these beliefs may diverge from the medical model of rehabilitation. Qualitative semi-structured interviews with representatives of six major religions--Islam, Buddhism, Christianity, Judaism, Sikhism, and Hinduism. Representatives were either health care professionals or religious leaders, all with an interest in how their religion approached health issues. There were three recurrent themes in the interviews: religious explanations for injury and illness; beliefs about recovery; religious duties of care towards family members. The Buddhist, Sikh, and Hindu interviewees described beliefs about karma--unfortunate events happening due to a person's former deeds. Fatalistic ideas, involving God having control over an individual's recovery, were expressed by the Muslim, Jewish, and Christian interviewees. All interviewees expressed the fundamental importance of a family's religious duty of care towards ill or injured relatives, and all expressed some views that were compatible with the medical model of rehabilitation. Religious beliefs may both diverge from and resonate with the medical rehabilitation model. Understanding these beliefs may be valuable in facilitating the rehabilitation of diverse religious groups.

  3. Can Computational Sediment Transport Models Reproduce the Observed Variability of Channel Networks in Modern Deltas?

    Science.gov (United States)

    Nesvold, E.; Mukerji, T.

    2017-12-01

    River deltas display complex channel networks that can be characterized through the framework of graph theory, as shown by Tejedor et al. (2015). Deltaic patterns may also be useful in a Bayesian approach to uncertainty quantification of the subsurface, but this requires a prior distribution of the networks of ancient deltas. By considering subaerial deltas, one can at least obtain a snapshot in time of the channel network spectrum across deltas. In this study, the directed graph structure is semi-automatically extracted from satellite imagery using techniques from statistical processing and machine learning. Once the network is labeled with vertices and edges, spatial trends and width and sinuosity distributions can also be found easily. Since imagery is inherently 2D, computational sediment transport models can serve as a link between 2D network structure and 3D depositional elements; the numerous empirical rules and parameters built into such models makes it necessary to validate the output with field data. For this purpose we have used a set of 110 modern deltas, with average water discharge ranging from 10 - 200,000 m3/s, as a benchmark for natural variability. Both graph theoretic and more general distributions are established. A key question is whether it is possible to reproduce this deltaic network spectrum with computational models. Delft3D was used to solve the shallow water equations coupled with sediment transport. The experimental setup was relatively simple; incoming channelized flow onto a tilted plane, with varying wave and tidal energy, sediment types and grain size distributions, river discharge and a few other input parameters. Each realization was run until a delta had fully developed: between 50 and 500 years (with a morphology acceleration factor). It is shown that input parameters should not be sampled independently from the natural ranges, since this may result in deltaic output that falls well outside the natural spectrum. Since we are
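
    As a toy illustration of the graph-theoretic framework mentioned above, a delta channel network can be held as a directed graph whose vertices are junctions and outlets and whose edges point downstream; the topology and metrics below are invented for illustration and are unrelated to the 110-delta data set.

        # Toy directed delta network: vertices are channel junctions/outlets,
        # edges point downstream. Topology and metrics are illustrative only.
        import networkx as nx

        G = nx.DiGraph()
        G.add_edges_from([
            ("apex", "j1"), ("apex", "j2"),
            ("j1", "o1"), ("j1", "j3"),
            ("j2", "j3"), ("j2", "o4"),
            ("j3", "o2"), ("j3", "o3"),
        ])

        outlets = [n for n in G.nodes if G.out_degree(n) == 0]
        n_paths = sum(len(list(nx.all_simple_paths(G, "apex", o))) for o in outlets)
        print("outlets:", outlets)
        print("apex-to-outlet paths:", n_paths)
        print("bifurcations (out-degree > 1):", [n for n in G.nodes if G.out_degree(n) > 1])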

  4. In-vitro accuracy and reproducibility evaluation of probing depth measurements of selected periodontal probes

    Directory of Open Access Journals (Sweden)

    K.N. Al Shayeb

    2014-01-01

    Conclusion: Depth measurements with the Chapple UB-CF-15 probe were more accurate and reproducible compared to measurements with the Vivacare TPS and Williams 14 W probes. This in vitro model may be useful for intra-examiner calibration or clinician training prior to the clinical evaluation of patients or in longitudinal studies involving periodontal evaluation.

  5. Dynamic Contrast-enhanced MR Imaging in Renal Cell Carcinoma: Reproducibility of Histogram Analysis on Pharmacokinetic Parameters

    Science.gov (United States)

    Wang, Hai-yi; Su, Zi-hua; Xu, Xiao; Sun, Zhi-peng; Duan, Fei-xue; Song, Yuan-yuan; Li, Lu; Wang, Ying-wei; Ma, Xin; Guo, Ai-tao; Ma, Lin; Ye, Hui-yi

    2016-01-01

    Pharmacokinetic parameters derived from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) have been increasingly used to evaluate the permeability of tumor vessel. Histogram metrics are a recognized promising method of quantitative MR imaging that has been recently introduced in analysis of DCE-MRI pharmacokinetic parameters in oncology due to tumor heterogeneity. In this study, 21 patients with renal cell carcinoma (RCC) underwent paired DCE-MRI studies on a 3.0 T MR system. Extended Tofts model and population-based arterial input function were used to calculate kinetic parameters of RCC tumors. Mean value and histogram metrics (Mode, Skewness and Kurtosis) of each pharmacokinetic parameter were generated automatically using ImageJ software. Intra- and inter-observer reproducibility and scan–rescan reproducibility were evaluated using intra-class correlation coefficients (ICCs) and coefficient of variation (CoV). Our results demonstrated that the histogram method (Mode, Skewness and Kurtosis) was not superior to the conventional Mean value method in reproducibility evaluation on DCE-MRI pharmacokinetic parameters (K trans & Ve) in renal cell carcinoma, especially for Skewness and Kurtosis which showed lower intra-, inter-observer and scan-rescan reproducibility than Mean value. Our findings suggest that additional studies are necessary before wide incorporation of histogram metrics in quantitative analysis of DCE-MRI pharmacokinetic parameters. PMID:27380733
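
    The histogram metrics and the scan-rescan coefficient of variation compared above can be computed from a parameter map as sketched below; the Ktrans values are simulated, not patient data.

        # Histogram metrics (mean, mode, skewness, kurtosis) of a simulated Ktrans
        # sample and a simple scan-rescan coefficient of variation of the mean.
        import numpy as np
        from scipy import stats

        ktrans_scan1 = np.random.default_rng(0).gamma(shape=2.0, scale=0.1, size=500)
        ktrans_scan2 = np.random.default_rng(1).gamma(shape=2.0, scale=0.1, size=500)

        for name, vals in [("scan 1", ktrans_scan1), ("scan 2", ktrans_scan2)]:
            hist, edges = np.histogram(vals, bins=50)
            mode = edges[hist.argmax()]
            print(f"{name}: mean {vals.mean():.3f}, mode {mode:.3f}, "
                  f"skewness {stats.skew(vals):.2f}, kurtosis {stats.kurtosis(vals):.2f}")

        means = np.array([ktrans_scan1.mean(), ktrans_scan2.mean()])
        cov = means.std(ddof=1) / means.mean() * 100.0
        print(f"scan-rescan CoV of the mean: {cov:.1f}%")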

  6. Photoionization Models for the Inner Gaseous Disks of Herbig Be Stars: Evidence against Magnetospheric Accretion?

    Energy Technology Data Exchange (ETDEWEB)

    Patel, P.; Sigut, T. A. A.; Landstreet, J. D., E-mail: ppatel54@uwo.ca [Department of Physics and Astronomy, The University of Western Ontario, London, ON N6A 3K7 (Canada)

    2017-02-20

    We investigate the physical properties of the inner gaseous disks of three hot Herbig B2e stars, HD 76534, HD 114981, and HD 216629, by modeling CFHT-ESPaDOns spectra using non-LTE radiative transfer codes. We assume that the emission lines are produced in a circumstellar disk heated solely by photospheric radiation from the central star in order to test whether the optical and near-infrared emission lines can be reproduced without invoking magnetospheric accretion. The inner gaseous disk density was assumed to follow a simple power-law in the equatorial plane, and we searched for models that could reproduce observed lines of H i (Hα and Hβ), He i, Ca ii, and Fe ii. For the three stars, good matches were found for all emission line profiles individually; however, no density model based on a single power-law was able to reproduce all of the observed emission lines. Among the single power-law models, the one with the gas density varying as ∼10⁻¹⁰(R*/R)³ g cm⁻³ in the equatorial plane of a 25 R* (0.78 au) disk did the best overall job of representing the optical emission lines of the three stars. This model implies a mass for the Hα-emitting portion of the inner gaseous disk of ∼10⁻⁹ M*. We conclude that the optical emission line spectra of these HBe stars can be qualitatively reproduced by a ≈1 au, geometrically thin, circumstellar disk of negligible mass compared to the central star in Keplerian rotation and radiative equilibrium.

  7. Photoionization Models for the Inner Gaseous Disks of Herbig Be Stars: Evidence against Magnetospheric Accretion?

    International Nuclear Information System (INIS)

    Patel, P.; Sigut, T. A. A.; Landstreet, J. D.

    2017-01-01

    We investigate the physical properties of the inner gaseous disks of three hot Herbig B2e stars, HD 76534, HD 114981, and HD 216629, by modeling CFHT-ESPaDOns spectra using non-LTE radiative transfer codes. We assume that the emission lines are produced in a circumstellar disk heated solely by photospheric radiation from the central star in order to test whether the optical and near-infrared emission lines can be reproduced without invoking magnetospheric accretion. The inner gaseous disk density was assumed to follow a simple power-law in the equatorial plane, and we searched for models that could reproduce observed lines of H i (Hα and Hβ), He i, Ca ii, and Fe ii. For the three stars, good matches were found for all emission line profiles individually; however, no density model based on a single power-law was able to reproduce all of the observed emission lines. Among the single power-law models, the one with the gas density varying as ∼10⁻¹⁰(R*/R)³ g cm⁻³ in the equatorial plane of a 25 R* (0.78 au) disk did the best overall job of representing the optical emission lines of the three stars. This model implies a mass for the Hα-emitting portion of the inner gaseous disk of ∼10⁻⁹ M*. We conclude that the optical emission line spectra of these HBe stars can be qualitatively reproduced by a ≈1 au, geometrically thin, circumstellar disk of negligible mass compared to the central star in Keplerian rotation and radiative equilibrium.

  8. An evaluation of WRF's ability to reproduce the surface wind over complex terrain based on typical circulation patterns.

    NARCIS (Netherlands)

    Jiménez, P.A.; Dudhia, J.; González-Rouco, J.F.; Montávez, J.P.; Garcia-Bustamante, E.; Navarro, J.; Vilà-Guerau de Arellano, J.; Munoz-Roldán, A.

    2013-01-01

    The performance of the Weather Research and Forecasting (WRF) model in reproducing the surface wind circulations over complex terrain is examined. The atmospheric evolution is simulated using two versions of the WRF model during an over 13-year period (1992 to 2005) over a complex terrain region

  9. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

    Using a blackboard architecture and a qualitative model, an expert system was developed to assist the user in defining the computers for High Energy Physics computing. The COMEX system requires an IBM AT personal computer or compatible with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs

  10. The persistence of subsistence: qualitative social-ecological modeling of indigenous aquatic hunting and gathering in tropical Australia

    Directory of Open Access Journals (Sweden)

    Marcus Barber

    2015-03-01

    Full Text Available Subsistence remains critical to indigenous people in settler-colonial states such as Australia, providing key foundations for indigenous identities and for wider state recognition. However, the drivers of contemporary subsistence are rarely fully articulated and analyzed in terms of likely changing conditions. Our interdisciplinary team combined past research experience gained from multiple sites with published literature to create two generalized qualitative models of the socio-cultural and environmental influences on indigenous aquatic subsistence in northern Australia. One model focused on the longer term (inter-year to generational) persistence of subsistence at the community scale, the other model on the shorter term (day to season) drivers of effort by active individuals. The specification of driver definitions and relationships demonstrates the complexities of even generalized and materialist models of contemporary subsistence practices. The qualitative models were analyzed for emergent properties and for responses to plausible changes in key variables: access, habitat degradation, social security availability, and community dysfunction. Positive human community condition is shown to be critical to the long-term persistence of subsistence, but complex interactions of negative and positive drivers shape subsistence effort expended at the individual scale and within shorter time frames. Such models enable motivations, complexities, and the potential management and policy levers of significance to be identified, defined, causally related, and debated. The models can be used to augment future models of human-natural systems, be tested against case-specific field conditions and/or indigenous perspectives, and aid preliminary assessments of the effects on subsistence of changes in social and environmental conditions, including policy settings.

  11. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
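
    The stochastic kinetics that such tools scale up to spatial reaction-diffusion problems can be illustrated with a minimal well-mixed Gillespie (direct method) simulation, as sketched below. This is not the PyURDME API; the reaction, rates and counts are made up.

        # Minimal well-mixed Gillespie direct method for A <-> B. Illustrative only;
        # not the PyURDME API.
        import math, random

        def gillespie(a0=100, b0=0, k_f=0.5, k_b=0.2, t_end=20.0, seed=42):
            rng = random.Random(seed)
            t, a, b = 0.0, a0, b0
            trajectory = [(t, a, b)]
            while t < t_end:
                rates = [k_f * a, k_b * b]              # propensities of A->B and B->A
                total = sum(rates)
                if total == 0.0:
                    break
                t += -math.log(rng.random()) / total    # exponential waiting time
                if rng.random() * total < rates[0]:
                    a, b = a - 1, b + 1
                else:
                    a, b = a + 1, b - 1
                trajectory.append((t, a, b))
            return trajectory

        final_t, final_a, final_b = gillespie()[-1]
        print(f"t={final_t:.2f}: A={final_a}, B={final_b}")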

  12. Qualitative and numerical study of Bianchi IX Models

    International Nuclear Information System (INIS)

    Francisco, G.; Matsas, G.E.A.

    1987-01-01

    The qualitative behaviour of trajectories in the Mixmaster universe is studied. The Lyapunov exponents computed directly from the differential equations and from the Poincare map are shown to be different. A detailed discussion of the role of these exponents in analysing the effect of chaos on trajectories is presented. (Author) [pt
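
    As a minimal illustration of the quantity being discussed (not of the Bianchi IX equations themselves), the largest Lyapunov exponent of a one-dimensional map can be estimated by averaging the log of the derivative along an orbit, as sketched below.

        # Largest Lyapunov exponent of the logistic map x -> r x (1 - x), estimated
        # by averaging log|f'(x)| along an orbit. Illustrates the concept only.
        import math

        def logistic_lyapunov(r=4.0, x0=0.3, n_transient=1000, n_iter=100_000):
            x = x0
            for _ in range(n_transient):          # discard transient
                x = r * x * (1.0 - x)
            acc = 0.0
            for _ in range(n_iter):
                acc += math.log(abs(r * (1.0 - 2.0 * x)))
                x = r * x * (1.0 - x)
            return acc / n_iter

        print(f"estimated exponent at r=4: {logistic_lyapunov():.3f}  (theory: ln 2 ~ 0.693)")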

  13. Evolvix BEST Names for semantic reproducibility across code2brain interfaces.

    Science.gov (United States)

    Loewe, Laurence; Scheuer, Katherine S; Keel, Seth A; Vyas, Vaibhav; Liblit, Ben; Hanlon, Bret; Ferris, Michael C; Yin, John; Dutra, Inês; Pietsch, Anthony; Javid, Christine G; Moog, Cecilia L; Meyer, Jocelyn; Dresel, Jerdon; McLoone, Brian; Loberger, Sonya; Movaghar, Arezoo; Gilchrist-Scott, Morgaine; Sabri, Yazeed; Sescleifer, Dave; Pereda-Zorrilla, Ivan; Zietlow, Andrew; Smith, Rodrigo; Pietenpol, Samantha; Goldfinger, Jacob; Atzen, Sarah L; Freiberg, Erika; Waters, Noah P; Nusbaum, Claire; Nolan, Erik; Hotz, Alyssa; Kliman, Richard M; Mentewab, Ayalew; Fregien, Nathan; Loewe, Martha

    2017-01-01

    Names in programming are vital for understanding the meaning of code and big data. We define code2brain (C2B) interfaces as maps in compilers and brains between meaning and naming syntax, which help to understand executable code. While working toward an Evolvix syntax for general-purpose programming that makes accurate modeling easy for biologists, we observed how names affect C2B quality. To protect learning and coding investments, C2B interfaces require long-term backward compatibility and semantic reproducibility (accurate reproduction of computational meaning from coder-brains to reader-brains by code alone). Semantic reproducibility is often assumed until confusing synonyms degrade modeling in biology to deciphering exercises. We highlight empirical naming priorities from diverse individuals and roles of names in different modes of computing to show how naming easily becomes impossibly difficult. We present the Evolvix BEST (Brief, Explicit, Summarizing, Technical) Names concept for reducing naming priority conflicts, test it on a real challenge by naming subfolders for the Project Organization Stabilizing Tool system, and provide naming questionnaires designed to facilitate C2B debugging by improving names used as keywords in a stabilizing programming language. Our experiences inspired us to develop Evolvix using a flipped programming language design approach with some unexpected features and BEST Names at its core. © 2016 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.

  14. Reproducibility in cyclostratigraphy: initiating an intercomparison project

    Science.gov (United States)

    Sinnesael, Matthias; De Vleeschouwer, David; Zeeden, Christian; Claeys, Philippe

    2017-04-01

    The study of astronomical climate forcing and the application of cyclostratigraphy have experienced a spectacular growth over the last decades. In the field of cyclostratigraphy a broad range in methodological approaches exist. However, comparative study between the different approaches is lacking. Different cases demand different approaches, but with the growing importance of the field, questions arise about reproducibility, uncertainties and standardization of results. The radioisotopic dating community, in particular, has done far-reaching efforts to improve reproducibility and intercomparison of radioisotopic dates and their errors. To satisfy this need in cyclostratigraphy, we initiate a comparable framework for the community. The aims are to investigate and quantify reproducibility of, and uncertainties related to cyclostratigraphic studies and to provide a platform to discuss the merits and pitfalls of different methodologies, and their applicabilities. With this poster, we ask the feedback from the community on how to design this comparative framework in a useful, meaningful and productive manner. In parallel, we would like to discuss how reproducibility should be tested and what uncertainties should stand for in cyclostratigraphy. On the other hand, we intend to trigger interest for a cyclostratigraphic intercomparison project. This intercomparison project would imply the analysis of artificial and genuine geological records by individual researchers. All participants would be free to determine their method of choice. However, a handful of criterions will be required for an outcome to be comparable. The different results would be compared (e.g. during a workshop or a special session), and the lessons learned from the comparison could potentially be reported in a review paper. The aim of an intercomparison project is not to rank the different methods according to their merits, but to get insight into which specific methods are most suitable for which

  15. Numerical modeling of plasma plume evolution against ambient background gas in laser blow off experiments

    International Nuclear Information System (INIS)

    Patel, Bhavesh G.; Das, Amita; Kaw, Predhiman; Singh, Rajesh; Kumar, Ajai

    2012-01-01

    Two dimensional numerical modelling based on simplified hydrodynamic evolution for an expanding plasma plume (created by laser blow off) against an ambient background gas has been carried out. A comparison with experimental observations shows that these simulations capture most features of the plasma plume expansion. The plume location and other gross features are reproduced as per the experimental observation in quantitative detail. The plume shape evolution and its dependence on the ambient background gas are in good qualitative agreement with the experiment. This suggests that a simplified hydrodynamic expansion model is adequate for the description of plasma plume expansion.

  16. Reproducibility and consistency of proteomic experiments on natural populations of a non-model aquatic insect.

    Science.gov (United States)

    Hidalgo-Galiana, Amparo; Monge, Marta; Biron, David G; Canals, Francesc; Ribera, Ignacio; Cieslak, Alexandra

    2014-01-01

    Population proteomics has a great potential to address evolutionary and ecological questions, but its use in wild populations of non-model organisms is hampered by uncontrolled sources of variation. Here we compare the response to temperature extremes of two geographically distant populations of a diving beetle species (Agabus ramblae) using 2-D DIGE. After one week of acclimation in the laboratory under standard conditions, a third of the specimens of each population were placed at either 4 or 27°C for 12 h, with another third left as a control. We then compared the protein expression level of three replicated samples of 2-3 specimens for each treatment. Within each population, variation between replicated samples of the same treatment was always lower than variation between treatments, except for some control samples that retained a wider range of expression levels. The two populations had a similar response, without significant differences in the number of protein spots over- or under-expressed in the pairwise comparisons between treatments. We identified exemplary proteins among those differently expressed between treatments, which proved to be proteins known to be related to thermal response or stress. Overall, our results indicate that specimens collected in the wild are suitable for proteomic analyses, as the additional sources of variation were not enough to mask the consistency and reproducibility of the response to the temperature treatments.

  17. The quest for improved reproducibility in MALDI mass spectrometry.

    Science.gov (United States)

    O'Rourke, Matthew B; Djordjevic, Steven P; Padula, Matthew P

    2018-03-01

    Reproducibility has been one of the biggest hurdles faced when attempting to develop quantitative protocols for MALDI mass spectrometry. The heterogeneous nature of sample recrystallization has made automated sample acquisition somewhat "hit and miss" with manual intervention needed to ensure that all sample spots have been analyzed. In this review, we explore the last 30 years of literature and anecdotal evidence that has attempted to address and improve reproducibility in MALDI MS. Though many methods have been attempted, we have discovered a significant publication history surrounding the use of nitrocellulose as a substrate to improve homogeneity of crystal formation and therefore reproducibility. We therefore propose that this is the most promising avenue of research for developing a comprehensive and universal preparation protocol for quantitative MALDI MS analysis. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 37:217-228, 2018. © 2016 Wiley Periodicals, Inc.

  18. Reproducibility of the cutoff probe for the measurement of electron density

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. W.; Oh, W. Y. [Department of Mechanical Engineering, Korea Advanced Institute of Science and Technology, Daejeon 305-701 (Korea, Republic of); You, S. J., E-mail: sjyou@cnu.ac.kr [Department of Physics, Chungnam National University, Daejeon 305-701 (Korea, Republic of); Kwon, J. H.; You, K. H.; Seo, B. H.; Kim, J. H., E-mail: jhkim86@kriss.re.kr [Center for Vacuum Technology, Korea Research Institute of Standards and Science, Daejeon 305-306 (Korea, Republic of); Yoon, J.-S. [Plasma Technology Research Center, National Fusion Research Institute, Gunsan 573-540 (Korea, Republic of)

    2016-06-15

    Since plasma process control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement is quite important for cutoff probe application research. To test this, a comparative study among different cutoff probe measurements was performed in this paper. The comparative study revealed a remarkable result: the cutoff probe has great reproducibility for the electron density measurement, i.e., there is little difference among measurements made by different probes built by different experimenters. The reasons for this result are discussed using the basic measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.
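
    The "basic measurement principle" invoked above is the standard relation between the plasma (cutoff) frequency and the electron density; a minimal sketch of that conversion is given below. The function name and the example frequency are our own illustrative choices, not part of the study.

```python
# Hypothetical sketch: converting a measured cutoff (plasma) frequency into an
# electron density, the standard relation a cutoff probe relies on.
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
M_E = 9.1093837015e-31    # electron mass, kg
Q_E = 1.602176634e-19     # elementary charge, C

def electron_density_from_cutoff(f_cutoff_hz: float) -> float:
    """Return electron density (m^-3) from the measured cutoff frequency (Hz)."""
    omega_p = 2.0 * math.pi * f_cutoff_hz          # plasma angular frequency
    return EPS0 * M_E * omega_p**2 / Q_E**2

# Example: a cutoff near 2.8 GHz corresponds to roughly 1e17 m^-3.
print(f"{electron_density_from_cutoff(2.8e9):.3e} m^-3")
```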

  19. Reproducibility of the cutoff probe for the measurement of electron density

    International Nuclear Information System (INIS)

    Kim, D. W.; Oh, W. Y.; You, S. J.; Kwon, J. H.; You, K. H.; Seo, B. H.; Kim, J. H.; Yoon, J.-S.

    2016-01-01

    Since plasma process control based on plasma diagnostics has attracted considerable attention in industry, the reproducibility of the diagnostics used in this application has become of great interest. Because the cutoff probe is one of the potential candidates for this application, knowing the reproducibility of the cutoff probe measurement is quite important for cutoff probe application research. To test this, a comparative study among different cutoff probe measurements was performed in this paper. The comparative study revealed a remarkable result: the cutoff probe has great reproducibility for the electron density measurement, i.e., there is little difference among measurements made by different probes built by different experimenters. The reasons for this result are discussed using the basic measurement principle of the cutoff probe and a comparative experiment with a Langmuir probe.

  20. Qualitative Analysis of a Diffusive Ratio-Dependent Holling-Tanner Predator-Prey Model with Smith Growth

    Directory of Open Access Journals (Sweden)

    Zongmin Yue

    2013-01-01

    Full Text Available We investigated the dynamics of a diffusive ratio-dependent Holling-Tanner predator-prey model with Smith growth subject to zero-flux boundary conditions. Some qualitative properties, including dissipation, persistence, and the local and global stability of the positive constant solution, are discussed. Moreover, we give refined a priori estimates of positive solutions and derive some results for the existence and nonexistence of nonconstant positive steady states.
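
    For readers unfamiliar with this model class, one commonly used form of a diffusive ratio-dependent Holling-Tanner system with Smith (food-limited) growth under zero-flux boundary conditions is sketched below. This is our schematic transcription; the paper's exact parameterization may differ.

```latex
% Schematic form (our transcription); u = prey density, v = predator density.
\begin{aligned}
\frac{\partial u}{\partial t} - d_1 \Delta u &=
  \underbrace{\frac{r\,u\,(K - u)}{K + c\,u}}_{\text{Smith growth}}
  \;-\;
  \underbrace{\frac{m\,u\,v}{u + a\,v}}_{\text{ratio-dependent response}}, \\
\frac{\partial v}{\partial t} - d_2 \Delta v &= s\,v\left(1 - \frac{h\,v}{u}\right),
\end{aligned}
\qquad
\frac{\partial u}{\partial \nu} = \frac{\partial v}{\partial \nu} = 0
\ \text{on}\ \partial\Omega .
```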

  1. Reproducibility between conventional and digital periapical radiography for bone height measurement

    Directory of Open Access Journals (Sweden)

    Miguel Simancas Pallares

    2015-10-01

    Conclusions. Reproducibility between the methods was considered poor, including in the subgroup analysis; agreement between conventional and digital periapical radiography is therefore minimal. These methods should be used in periodontics with full knowledge of their technical features and the advantages of each system.

  2. Qualitative analysis of cosmological models in Brans-Dicke theory, solutions from non-minimal coupling and viscous universe

    International Nuclear Information System (INIS)

    Romero Filho, C.A.

    1988-01-01

    Using dynamical systems theory we investigate homogeneous and isotropic models in Brans-Dicke theory for perfect fluids with a general equation of state and arbitrary ω. Phase diagrams are drawn on the Poincaré sphere, which permits a qualitative analysis of the models. Based on this analysis we construct a method for generating classes of solutions in Brans-Dicke theory. The same technique is used for studying models arising from the non-minimal coupling of electromagnetism with gravity. In addition, viscous fluids are considered and non-singular solutions with bulk viscosity are found. (author)

  3. Reproducibility Between Brain Uptake Ratio Using Anatomic Standardization and Patlak-Plot Methods.

    Science.gov (United States)

    Shibutani, Takayuki; Onoguchi, Masahisa; Noguchi, Atsushi; Yamada, Tomoki; Tsuchihashi, Hiroko; Nakajima, Tadashi; Kinuya, Seigo

    2015-12-01

    The Patlak-plot and conventional methods of determining brain uptake ratio (BUR) have some problems with reproducibility. We formulated a method of determining BUR using anatomic standardization (BUR-AS) in a statistical parametric mapping algorithm to improve reproducibility. The objective of this study was to demonstrate the inter- and intraoperator reproducibility of mean cerebral blood flow as determined using BUR-AS in comparison to the conventional-BUR (BUR-C) and Patlak-plot methods. The images of 30 patients who underwent brain perfusion SPECT were retrospectively used in this study. The images were reconstructed using ordered-subset expectation maximization and processed using an automatic quantitative analysis tool for cerebral blood flow of ECD. The mean SPECT count was calculated from axial basal ganglia slices of the normal side (slices 31-40) drawn using a 3-dimensional stereotactic region-of-interest template after anatomic standardization. The mean cerebral blood flow was calculated from the mean SPECT count. Reproducibility was evaluated using coefficient of variation and Bland-Altman plotting. For both inter- and intraoperator reproducibility, the BUR-AS method had the lowest coefficient of variation and smallest error range about the Bland-Altman plot. Mean CBF obtained using the BUR-AS method had the highest reproducibility. Compared with the Patlak-plot and BUR-C methods, the BUR-AS method provides greater inter- and intraoperator reproducibility of cerebral blood flow measurement. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
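
    The reproducibility statistics used here (coefficient of variation and Bland-Altman analysis) can be computed as sketched below. This is a generic illustration, not the authors' processing pipeline, and the operator measurements in the example are invented.

```python
# Generic sketch of paired-measurement reproducibility statistics.
import numpy as np

def within_subject_cv(x1, x2):
    """Within-subject coefficient of variation (%) for paired repeated measurements."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    pair_sd = np.abs(x1 - x2) / np.sqrt(2.0)     # SD of each pair (two repeats)
    pair_mean = (x1 + x2) / 2.0
    return 100.0 * np.mean(pair_sd / pair_mean)

def bland_altman_limits(x1, x2):
    """Mean difference (bias) and 95% limits of agreement."""
    diff = np.asarray(x1, float) - np.asarray(x2, float)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Invented example: mean CBF (ml/100 g/min) measured twice by the same operator.
run1 = [42.1, 38.5, 45.0, 40.2, 36.8]
run2 = [41.5, 39.0, 44.2, 40.8, 37.5]
print(within_subject_cv(run1, run2))
print(bland_altman_limits(run1, run2))
```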

  4. Reproducibility study of [18F]FPP(RGD)2 uptake in murine models of human tumor xenografts

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Edwin; Liu, Shuangdong; Chin, Frederick; Cheng, Zhen [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Gowrishankar, Gayatri; Yaghoubi, Shahriar [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Wedgeworth, James Patrick [Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Berndorff, Dietmar; Gekeler, Volker [Bayer Schering Pharma AG, Global Drug Discovery, Berlin (Germany); Gambhir, Sanjiv S. [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Canary Center at Stanford for Cancer Early Detection, Nuclear Medicine, Departments of Radiology and Bioengineering, Molecular Imaging Program at Stanford, Stanford, CA (United States)

    2011-04-15

    An 18F-labeled PEGylated arginine-glycine-aspartic acid (RGD) dimer [18F]FPP(RGD)2 has been used to image tumor αvβ3 integrin levels in preclinical and clinical studies. Serial positron emission tomography (PET) studies may be useful for monitoring antiangiogenic therapy response or for drug screening; however, the reproducibility of serial scans has not been determined for this PET probe. The purpose of this study was to determine the reproducibility of the integrin αvβ3-targeted PET probe, [18F]FPP(RGD)2, using small animal PET. Human HCT116 colon cancer xenografts were implanted into nude mice (n = 12) in the breast and scapular region and grown to mean diameters of 5-15 mm for approximately 2.5 weeks. A 3-min acquisition was performed on a small animal PET scanner approximately 1 h after administration of [18F]FPP(RGD)2 (1.9-3.8 MBq, 50-100 μCi) via the tail vein. A second small animal PET scan was performed approximately 6 h later after reinjection of the probe to assess for reproducibility. Images were analyzed by drawing an ellipsoidal region of interest (ROI) around the tumor xenograft activity. Percentage injected dose per gram (%ID/g) values were calculated from the mean or maximum activity in the ROIs. Coefficients of variation and differences in %ID/g values between studies from the same day were calculated to determine the reproducibility. The coefficient of variation (mean ± SD) for %IDmean/g and %IDmax/g values between [18F]FPP(RGD)2 small animal PET scans performed 6 h apart on the same day were 11.1 ± 7.6% and 10.4 ± 9.3%, respectively. The corresponding differences in %IDmean/g and %IDmax/g values between scans were -0.025 ± 0.067 and -0.039 ± 0.426. Immunofluorescence studies revealed a direct relationship between extent of αvβ3 integrin expression in tumors and tumor vasculature
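
    The %ID/g figure of merit used above can be computed as sketched below; the function assumes decay-corrected activity concentrations and unit tissue density, and the example numbers are invented rather than taken from the study.

```python
# Hypothetical sketch of the %ID/g figure of merit (not the authors' ROI workflow).
def percent_id_per_gram(roi_mean_bq_per_ml: float,
                        injected_dose_bq: float,
                        tissue_density_g_per_ml: float = 1.0) -> float:
    """Percent injected dose per gram of tissue, assuming decay-corrected inputs."""
    activity_per_gram = roi_mean_bq_per_ml / tissue_density_g_per_ml
    return 100.0 * activity_per_gram / injected_dose_bq

# Invented example: a tumor ROI of 75 kBq/ml after injecting 2.5 MBq gives 3 %ID/g.
print(percent_id_per_gram(75e3, 2.5e6))
```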

  5. [Qualitative research methodology in health care].

    Science.gov (United States)

    Bedregal, Paula; Besoain, Carolina; Reinoso, Alejandro; Zubarew, Tamara

    2017-03-01

    Health care research requires different methodological approaches, such as qualitative and quantitative analyses, to understand the phenomena under study. Qualitative research is usually the least considered. Central elements of the qualitative method are that the object of study is constituted by perceptions, emotions and beliefs, non-random sampling by purpose, a circular process of knowledge construction, and methodological rigor throughout the research process, from the quality of the design to the consistency of results. The objective of this work is to contribute to the methodological knowledge about qualitative research in health services, based on the implementation of the study, “The transition process from pediatric to adult services: perspectives from adolescents with chronic diseases, caregivers and health professionals”. The information gathered through the qualitative methodology facilitated the understanding of critical points, barriers and facilitators of the transition process of adolescents with chronic diseases, considering the perspective of users and the health team. This study allowed the design of a transition services model from pediatric to adult health services based on the needs of adolescents with chronic diseases, their caregivers and the health team.

  6. Establishing rigour in qualitative radiography research

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, F.J. [School of Healthcare Professions, University of Salford, Salford M6 6PU (United Kingdom)], E-mail: f.j.murphy@salford.ac.uk; Yielder, J. [Medical Imaging, School of Health Sciences, Unitec, Auckland (New Zealand)

    2010-02-15

    The vast majority of radiography research is subject to critique and evaluation from peers in order to justify the method and the outcome of the study. Within the quantitative domain, which the majority of medical imaging publications tend to fall into, there are prescribed methods for establishing scientific rigour and quality in order to critique a study. However, researchers within the qualitative paradigm, which is a developing area of radiography research, are often unclear about the most appropriate methods to measure the rigour (standards and quality) of a research study. This article considers the issues related to rigour, reliability and validity within qualitative research. The concepts of reliability and validity are briefly discussed within traditional positivism and then the attempts to use these terms as a measure of quality within qualitative research are explored. Alternative methods for research rigour in interpretive research (meanings and emotions) are suggested in order to complement the existing radiography framework for qualitative studies. The authors propose the use of an established model that is adapted to reflect the iterative process of qualitative research. Although a mechanistic approach to establishing rigour is rejected by many qualitative researchers, it is argued that a guide for novice researchers within a developing research base such as radiography is appropriate in order to establish the credibility and trustworthiness of a qualitative study.

  7. Establishing rigour in qualitative radiography research

    International Nuclear Information System (INIS)

    Murphy, F.J.; Yielder, J.

    2010-01-01

    The vast majority of radiography research is subject to critique and evaluation from peers in order to justify the method and the outcome of the study. Within the quantitative domain, which the majority of medical imaging publications tend to fall into, there are prescribed methods for establishing scientific rigour and quality in order to critique a study. However, researchers within the qualitative paradigm, which is a developing area of radiography research, are often unclear about the most appropriate methods to measure the rigour (standards and quality) of a research study. This article considers the issues related to rigour, reliability and validity within qualitative research. The concepts of reliability and validity are briefly discussed within traditional positivism and then the attempts to use these terms as a measure of quality within qualitative research are explored. Alternative methods for research rigour in interpretive research (meanings and emotions) are suggested in order to complement the existing radiography framework for qualitative studies. The authors propose the use of an established model that is adapted to reflect the iterative process of qualitative research. Although a mechanistic approach to establishing rigour is rejected by many qualitative researchers, it is argued that a guide for novice researchers within a developing research base such as radiography is appropriate in order to establish the credibility and trustworthiness of a qualitative study.

  8. Two-phase 1D+1D model of a DMFC: development and validation on extensive operating conditions range

    Energy Technology Data Exchange (ETDEWEB)

    Casalegno, A.; Marchesi, R.; Parenti, D. [Dipartimento di Energetica, Politecnico di Milano (Italy)

    2008-02-15

    A two-phase 1D+1D model of a direct methanol fuel cell (DMFC) is developed, considering the overall mass balance, methanol transport in the gas phase through the anode diffusion layer, and methanol and water crossover. The model is quantitatively validated on an extensive range of operating conditions (24 polarisation curves). The model accurately reproduces DMFC performance within the validation range and, outside it, is able to predict performance under feasible operating conditions. Finally, the estimates of methanol crossover flux are qualitatively and quantitatively similar to experimental measurements, and the trends of the main local quantities are consistent with results obtained with more complex models. (Abstract Copyright [2008], Wiley Periodicals, Inc.)

  9. Reproducibility of 3.0 Tesla Magnetic Resonance Spectroscopy for Measuring Hepatic Fat Content

    NARCIS (Netherlands)

    van Werven, Jochem R.; Hoogduin, Johannes M.; Nederveen, Aart J.; van Vliet, Andre A.; Wajs, Ewa; Vandenberk, Petra; Stroes, Erik S. G.; Stoker, Jaap

    Purpose: To investigate reproducibility of proton magnetic resonance spectroscopy (H-1-MRS) to measure hepatic triglyceride content (HTGC). Materials and Methods: In 24 subjects, HTGC was evaluated using H-1-MRS at 3.0 Tesla. We studied "between-weeks" reproducibility and reproducibility of H-1-MRS

  10. Reproducibility and Practical Adoption of GEOBIA with Open-Source Software in Docker Containers

    Directory of Open Access Journals (Sweden)

    Christian Knoth

    2017-03-01

    Full Text Available Geographic Object-Based Image Analysis (GEOBIA) mostly uses proprietary software, but the interest in Free and Open-Source Software (FOSS) for GEOBIA is growing. This interest stems not only from cost savings, but also from benefits concerning reproducibility and collaboration. Technical challenges hamper practical reproducibility, especially when multiple software packages are required to conduct an analysis. In this study, we use containerization to package a GEOBIA workflow in a well-defined FOSS environment. We explore the approach using two software stacks to perform an exemplary analysis detecting destruction of buildings in bi-temporal images of a conflict area. The analysis combines feature extraction techniques with segmentation and object-based analysis to detect changes using automatically-defined local reference values and to distinguish disappeared buildings from non-target structures. The resulting workflow is published as FOSS comprising both the model and data in a ready-to-use Docker image and a user interface for interaction with the containerized workflow. The presented solution advances GEOBIA in the following aspects: higher transparency of methodology; easier reuse and adaptation of workflows; better transferability between operating systems; complete description of the software environment; and easy application of workflows by image analysis experts and non-experts. As a result, it promotes not only the reproducibility of GEOBIA, but also its practical adoption.
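
    As an illustration of the containerization idea, the sketch below runs a pinned container image over mounted input and output directories; the image name and entrypoint arguments are hypothetical placeholders, not the authors' published image or interface.

```python
# Minimal sketch: run a containerized GEOBIA workflow reproducibly by pinning
# the image tag and mounting input/output directories. Image name is hypothetical.
import subprocess
from pathlib import Path

IMAGE = "example/geobia-workflow:1.0"   # hypothetical, pinned tag for reproducibility
data_dir = Path("data").resolve()
out_dir = Path("results").resolve()
out_dir.mkdir(exist_ok=True)

subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", f"{data_dir}:/in:ro",      # read-only input images
        "-v", f"{out_dir}:/out",         # results written back to the host
        IMAGE,
        "run-analysis", "--input", "/in", "--output", "/out",  # hypothetical entrypoint
    ],
    check=True,
)
```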

  11. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    Directory of Open Access Journals (Sweden)

    Sonal eGoswami

    2012-06-01

    Full Text Available Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze. Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma, PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  12. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome.

    Science.gov (United States)

    Goswami, Sonal; Samuel, Sherin; Sierra, Olga R; Cascardi, Michele; Paré, Denis

    2012-01-01

    Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze (EPM). Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  13. Qualitative models and experimental investigation of chaotic NOR gates and set/reset flip-flops

    Science.gov (United States)

    Rahman, Aminur; Jordan, Ian; Blackmore, Denis

    2018-01-01

    It has been observed through experiments and SPICE simulations that logical circuits based upon Chua's circuit exhibit complex dynamical behaviour. This behaviour can be used to design analogues of more complex logic families and some properties can be exploited for electronics applications. Some of these circuits have been modelled as systems of ordinary differential equations. However, as the number of components in newer circuits increases so does the complexity. This renders continuous dynamical systems models impractical and necessitates new modelling techniques. In recent years, some discrete dynamical models have been developed using various simplifying assumptions. To create a robust modelling framework for chaotic logical circuits, we developed both deterministic and stochastic discrete dynamical models, which exploit the natural recurrence behaviour, for two chaotic NOR gates and a chaotic set/reset flip-flop. This work presents a complete applied mathematical investigation of logical circuits. Experiments on our own designs of the above circuits are modelled and the models are rigorously analysed and simulated showing surprisingly close qualitative agreement with the experiments. Furthermore, the models are designed to accommodate dynamics of similarly designed circuits. This will allow researchers to develop ever more complex chaotic logical circuits with a simple modelling framework.

  14. PDB-NMA of a protein homodimer reproduces distinct experimental motility asymmetry

    Science.gov (United States)

    Tirion, Monique M.; ben-Avraham, Daniel

    2018-03-01

    We have extended our analytically derived PDB-NMA formulation, Atomic Torsional Modal Analysis or ATMAN (Tirion and ben-Avraham 2015 Phys. Rev. E 91 032712), to include protein dimers using mixed internal and Cartesian coordinates. A test case on a 1.3 Å resolution model of a small homodimer, ActVA-ORF6, consisting of two 112-residue subunits identically folded in a compact 50 Å sphere, reproduces the distinct experimental Debye-Waller motility asymmetry for the two chains, demonstrating that structure sensitively selects vibrational signatures. The vibrational analysis of this PDB entry, together with biochemical and crystallographic data, demonstrates the cooperative nature of the dimeric interaction of the two subunits and suggests a mechanical model for subunit interconversion during the catalytic cycle.

  15. Statistical and non statistical models for delayed neutron emission: applications to nuclei near A = 90

    International Nuclear Information System (INIS)

    De Oliveira, Z.M.

    1980-01-01

    A detailed analysis of the simple statistical model description of delayed neutron emission from 87Br, 137I, 85As and 135Sb has been performed. In agreement with experimental findings, structure in the β-strength function is required to reproduce the envelope of the neutron spectrum from 87Br. For 85As and 135Sb the model is found incapable of simultaneously reproducing the envelopes of the delayed neutron spectra and the neutron branching ratios to excited states in the final nuclei for any choice of β-strength function. The results indicate that partial widths for neutron emission are not compatible with optical-model transmission coefficients. The simple shell model with pairing is shown to qualitatively describe the main features of the β-strength functions for the decay of 87Br and 91,93,95,97Rb. It is found that the locations of apparent resonances in the experimental data are in rough agreement with the locations of the centroids of strength calculated with this model. An extension of the shell-model picture which includes the Gamow-Teller residual interaction is used to investigate the decay properties of 84-86As, 86-92Br and 88-102Rb. For a realistic choice of interaction strength, the half-lives of these isotopes are fairly well reproduced and semiquantitative agreement with experimental β-strength functions is found. Delayed neutron emission probabilities are reproduced for precursors nearer stability, with systematic deviations observed for the heavier nuclei. Contrary to the assumption of a structureless Gamow-Teller giant resonance embodied in the gross theory of β-decay, we find that structures in the tails of the Gamow-Teller giant resonances are expected which strongly influence the decay properties of nuclides in this region

  16. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research
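
    To give a flavour of the information a SED-ML document encodes, the snippet below assembles and pretty-prints a schematic Level 1 Version 1 fragment. The element and attribute names follow the specification as commonly published (uniformTimeCourse, model, task, a KiSAO algorithm identifier), but the fragment is illustrative only, has not been validated against the official schema, and the model file name is a placeholder.

```python
# Schematic SED-ML Level 1 Version 1 fragment (illustrative; not schema-validated).
from xml.dom import minidom

sedml = """<?xml version="1.0" encoding="UTF-8"?>
<sedML xmlns="http://sed-ml.org/" level="1" version="1">
  <listOfSimulations>
    <uniformTimeCourse id="sim1" initialTime="0" outputStartTime="0"
                       outputEndTime="100" numberOfPoints="1000">
      <algorithm kisaoID="KISAO:0000019"/> <!-- a CVODE-style ODE integrator -->
    </uniformTimeCourse>
  </listOfSimulations>
  <listOfModels>
    <model id="model1" language="urn:sedml:language:sbml" source="model.xml"/>
  </listOfModels>
  <listOfTasks>
    <task id="task1" modelReference="model1" simulationReference="sim1"/>
  </listOfTasks>
</sedML>"""

# Parse to confirm the fragment is well-formed and print it with indentation.
print(minidom.parseString(sedml).toprettyxml(indent="  "))
```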

  17. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  18. Oxygen distribution in tumors: A qualitative analysis and modeling study providing a novel Monte Carlo approach

    International Nuclear Information System (INIS)

    Lagerlöf, Jakob H.; Kindblom, Jon; Bernhardt, Peter

    2014-01-01

    Purpose: To construct a Monte Carlo (MC)-based simulation model for analyzing the dependence of tumor oxygen distribution on different variables related to tumor vasculature [blood velocity, vessel-to-vessel proximity (vessel proximity), and inflowing oxygen partial pressure (pO2)]. Methods: A voxel-based tissue model containing parallel capillaries with square cross-sections (sides of 10 μm) was constructed. Green's function was used for diffusion calculations and Michaelis-Menten kinetics to manage oxygen consumption. The model was tuned to approximately reproduce the oxygenation status of a renal carcinoma; the depth oxygenation curves (DOC) were fitted with an analytical expression to facilitate rapid MC simulations of tumor oxygen distribution. DOCs were simulated with three variables at three settings each (blood velocity, vessel proximity, and inflowing pO2), which resulted in 27 combinations of conditions. To create a model that simulated variable oxygen distributions, the oxygen tension at a specific point was randomly sampled with trilinear interpolation in the dataset from the first simulation. Six correlations between blood velocity, vessel proximity, and inflowing pO2 were hypothesized. Variable models with correlated parameters were compared to each other and to a nonvariable, DOC-based model to evaluate the differences in simulated oxygen distributions and tumor radiosensitivities for different tumor sizes. Results: For tumors with radii ranging from 5 to 30 mm, the nonvariable DOC model tended to generate normal or log-normal oxygen distributions, with a cut-off at zero. The pO2 distributions simulated with the six-variable DOC models were quite different from the distributions generated with the nonvariable DOC model; in the former case the variable models simulated oxygen distributions that were more similar to in vivo results found in the literature. For larger tumors, the oxygen distributions became truncated in the lower
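
    The consumption and sampling steps mentioned above can be illustrated as follows. The Michaelis-Menten relation is standard, but the parameter values, the table of pO2 versus depth, and the use of 1-D linear (rather than trilinear) interpolation are simplifications invented for this sketch.

```python
# Illustrative sketch only: Michaelis-Menten oxygen consumption and random
# sampling of pO2 from a precomputed table (all parameter values are invented).
import numpy as np

def mm_consumption(po2_mmHg: float, vmax: float = 15.0, km: float = 2.5) -> float:
    """Michaelis-Menten consumption rate: v = Vmax * p / (p + Km)."""
    return vmax * po2_mmHg / (po2_mmHg + km)

rng = np.random.default_rng(0)

# Invented depth-oxygenation curve: pO2 (mmHg) versus distance from a vessel (um).
depths = np.linspace(0.0, 150.0, 16)
po2_table = 40.0 * np.exp(-depths / 60.0)

def sample_po2(n: int) -> np.ndarray:
    """Sample pO2 at random depths by linear interpolation into the table."""
    d = rng.uniform(depths.min(), depths.max(), size=n)
    return np.interp(d, depths, po2_table)

samples = sample_po2(10_000)
print(samples.mean(), mm_consumption(samples.mean()))
```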

  19. Travelling Methods: Tracing the Globalization of Qualitative Communication Research

    Directory of Open Access Journals (Sweden)

    Bryan C. Taylor

    2016-05-01

    Full Text Available Existing discussion of the relationships between globalization, communication research, and qualitative methods emphasizes two images: the challenges posed by globalization to existing communication theory and research methods, and the impact of post-colonial politics and ethics on qualitative research. We draw in this paper on a third image – qualitative research methods as artifacts of globalization – to explore the globalization of qualitative communication research methods. Following a review of literature which tentatively models this process, we discuss two case studies of qualitative research in the disciplinary subfields of intercultural communication and media audience studies. These cases elaborate the forces which influence the articulation of national, disciplinary, and methodological identities which mediate the globalization of qualitative communication research methods.

  20. Completely reproducible description of digital sound data with cellular automata

    International Nuclear Information System (INIS)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-01-01

    A novel method of compressive and completely reproducible description of digital sound data by means of rule dynamics of CA (cellular automata) is proposed. The digital data of spoken words and music recorded with the standard format of a compact disk are reproduced completely by this method with use of only two rules in a one-dimensional CA without loss of information
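
    The "rule dynamics" referred to above can be illustrated with a one-dimensional, two-state elementary CA update; the snippet below applies a single rule to a bit string and is purely illustrative, since it does not reproduce the authors' two-rule encoding scheme.

```python
# Illustrative one-dimensional elementary CA step (not the authors' encoding scheme).
def ca_step(cells: list[int], rule: int) -> list[int]:
    """Apply an elementary CA rule (0-255) once, with periodic boundaries."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right   # value 0..7
        out.append((rule >> neighborhood) & 1)               # look up the rule bit
    return out

state = [0, 1, 0, 0, 1, 1, 0, 1]
for _ in range(4):
    state = ca_step(state, rule=90)   # rule 90: XOR of the two neighbours
    print(state)
```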

  1. Empirical particle transport model for tokamaks

    International Nuclear Information System (INIS)

    Petravic, M.; Kuo-Petravic, G.

    1986-08-01

    A simple empirical particle transport model has been constructed with the purpose of gaining insight into the L- to H-mode transition in tokamaks. The aim was to construct the simplest possible model which would reproduce the measured density profiles in the L-regime, and also produce a qualitatively correct transition to the H-regime without having to assume a completely different transport mode for the bulk of the plasma. Rather than using completely ad hoc constructions for the particle diffusion coefficient, we assume D = (1/5)χ_total, where χ_total ≅ χ_e is the thermal diffusivity, and then use the κ_e = n_e χ_e values derived from experiments. The observed temperature profiles are then automatically reproduced, but nontrivially, the correct density profiles are also obtained, for realistic fueling rates and profiles. Our conclusion is that it is sufficient to reduce the transport coefficients within a few centimeters of the surface to produce the H-mode behavior. An additional simple assumption, concerning the particle mean-free path, leads to a convective transport term which reverses sign a few centimeters inside the surface, as required by the H-mode density profiles
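
    Restated in a standard diffusive-plus-convective form (our notation, not a verbatim quotation of the paper), the model's assumptions read as below, where v denotes the convective velocity said to reverse sign just inside the surface.

```latex
% Our transcription of the stated assumptions.
\Gamma_{\mathrm{particle}} = -\,D\,\nabla n \;+\; n\,v ,
\qquad
D = \tfrac{1}{5}\,\chi_{\mathrm{total}} ,
\qquad
\chi_{\mathrm{total}} \simeq \chi_e = \frac{\kappa_e}{n_e}.
```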

  2. Reproducibility of gene expression across generations of Affymetrix microarrays

    Directory of Open Access Journals (Sweden)

    Haslett Judith N

    2003-06-01

    Full Text Available Abstract Background The development of large-scale gene expression profiling technologies is rapidly changing the norms of biological investigation. But the rapid pace of change itself presents challenges. Commercial microarrays are regularly modified to incorporate new genes and improved target sequences. Although the ability to compare datasets across generations is crucial for any long-term research project, to date no means to allow such comparisons have been developed. In this study the reproducibility of gene expression levels across two generations of Affymetrix GeneChips® (HuGeneFL and HG-U95A) was measured. Results Correlation coefficients were computed for gene expression values across chip generations based on different measures of similarity. Comparing the absolute calls assigned to the individual probe sets across the generations found them to be largely unchanged. Conclusion We show that experimental replicates are highly reproducible, but that reproducibility across generations depends on the degree of similarity of the probe sets and the expression level of the corresponding transcript.

  3. Reproducibility of the Portuguese version of the PEDro Scale

    Directory of Open Access Journals (Sweden)

    Silvia Regina Shiwa

    2011-10-01

    Full Text Available The objective of this study was to test the inter-rater reproducibility of the Portuguese version of the PEDro Scale. Seven physiotherapists rated the methodological quality of 50 reports of randomized controlled trials written in Portuguese indexed on the PEDro database. Each report was also rated using the English version of the PEDro Scale. Reproducibility was evaluated by comparing two separate ratings of reports written in Portuguese and comparing the Portuguese PEDro score with the English version of the scale. Kappa coefficients ranged from 0.53 to 1.00 for individual item and an intraclass correlation coefficient (ICC of 0.82 for the total PEDro score was observed. The standard error of the measurement of the scale was 0.58. The Portuguese version of the scale was comparable with the English version, with an ICC of 0.78. The inter-rater reproducibility of the Brazilian Portuguese PEDro Scale is adequate and similar to the original English version.

  4. A Qualitative Simulation Framework in Smalltalk Based on Fuzzy Arithmetic

    Science.gov (United States)

    Richard L. Olson; Daniel L. Schmoldt; David L. Peterson

    1996-01-01

    For many systems, it is not practical to collect and correlate empirical data necessary to formulate a mathematical model. However, it is often sufficient to predict qualitative dynamics effects (as opposed to system quantities), especially for research purposes. In this effort, an object-oriented application framework (AF) was developed for the qualitative modeling of...

  5. The Mathematics of Psychotherapy: A Nonlinear Model of Change Dynamics.

    Science.gov (United States)

    Schiepek, Gunter; Aas, Benjamin; Viol, Kathrin

    2016-07-01

    Psychotherapy is a dynamic process produced by a complex system of interacting variables. Even though there are qualitative models of such systems the link between structure and function, between network and network dynamics is still missing. The aim of this study is to realize these links. The proposed model is composed of five state variables (P: problem severity, S: success and therapeutic progress, M: motivation to change, E: emotions, I: insight and new perspectives) interconnected by 16 functions. The shape of each function is modified by four parameters (a: capability to form a trustful working alliance, c: mentalization and emotion regulation, r: behavioral resources and skills, m: self-efficacy and reward expectation). Psychologically, the parameters play the role of competencies or traits, which translate into the concept of control parameters in synergetics. The qualitative model was transferred into five coupled, deterministic, nonlinear difference equations generating the dynamics of each variable as a function of other variables. The mathematical model is able to reproduce important features of psychotherapy processes. Examples of parameter-dependent bifurcation diagrams are given. Beyond the illustrated similarities between simulated and empirical dynamics, the model has to be further developed, systematically tested by simulated experiments, and compared to empirical data.
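
    The abstract does not give the 16 coupling functions, but the general structure (five coupled deterministic nonlinear difference equations whose couplings are shaped by four trait-like parameters) can be sketched as below. The specific functional forms here are invented placeholders for illustration, not the published model.

```python
# Structural sketch only: five coupled nonlinear difference equations with
# parameter-shaped couplings. Functional forms are invented placeholders.
import math

def step(state, params):
    """One iteration of a generic 5-variable coupled map."""
    P, S, M, E, I = state            # problem, success, motivation, emotion, insight
    a, c, r, m = params              # alliance, regulation, resources, self-efficacy
    logistic = lambda x: 1.0 / (1.0 + math.exp(-x))
    P_next = P + 0.1 * E - 0.2 * a * S - 0.1 * c * I
    S_next = S + 0.3 * r * logistic(M + I - P) - 0.1 * S
    M_next = M + 0.2 * m * (S - P) - 0.05 * M
    E_next = 0.8 * E + 0.3 * P - 0.2 * c * S
    I_next = I + 0.2 * a * logistic(E) * M - 0.05 * I
    return [P_next, S_next, M_next, E_next, I_next]

state = [0.8, 0.1, 0.3, 0.5, 0.1]
params = (0.6, 0.5, 0.7, 0.4)        # fixed "traits" acting as control parameters
for t in range(5):
    state = step(state, params)
    print([round(x, 3) for x in state])
```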

  6. A new strategy to deliver synthetic protein drugs: self-reproducible biologics using minicircles.

    Science.gov (United States)

    Yi, Hyoju; Kim, Youngkyun; Kim, Juryun; Jung, Hyerin; Rim, Yeri Alice; Jung, Seung Min; Park, Sung-Hwan; Ju, Ji Hyeon

    2014-08-05

    Biologics are the most successful drugs used in anticytokine therapy. However, they remain partially unsuccessful because of the elevated cost of their synthesis and purification. Development of novel biologics has also been hampered by the high cost. Biologics are made of protein components; thus, theoretically, they can be produced in vivo. Here we tried to invent a novel strategy to allow the production of synthetic drugs in vivo by the host itself. The recombinant minicircles encoding etanercept or tocilizumab, which are synthesized currently by pharmaceutical companies, were injected intravenously into animal models. Self-reproduced etanercept and tocilizumab were detected in the serum of mice. Moreover, arthritis subsided in mice that were injected with minicircle vectors carrying biologics. Self-reproducible biologics need neither factory facilities for drug production nor clinical processes, such as frequent drug injection. Although this novel strategy is in its very early conceptual stage, it seems to represent a potential alternative method for the delivery of biologics.

  7. Intra- and interobserver reproducibility of shear wave elastography for evaluation of the breast lesions

    International Nuclear Information System (INIS)

    Hong, Min Ji; Kim, Hak Hee

    2017-01-01

    To evaluate reproducibility of shear wave elastography (SWE) for breast lesions within and between observers and compare the reproducibility of SWE features. For intraobserver reproducibility, 225 masses in 208 patients were included, and two consecutive SWE images were acquired by each observer. For interobserver reproducibility, SWE images of the same mass were obtained by another observer before surgery in 40 patients. Intraclass correlation coefficients (ICC) were used to determine intra- and interobserver reproducibility. Intraobserver reliability for mean elasticity (Emean) and maximum elasticity (Emax) was excellent (ICC = 0.803, 0.799). ICCs for SWE ratio and minimum elasticity (Emin) were fair to good (ICC = 0.703, 0.539). Emean showed excellent ICC regardless of histopathologic type and tumor size. Emax, SWE ratio and Emin represented excellent or fair to good reproducibility based on histopathologic type and tumor size. In the interobserver study, ICCs for Emean, Emax and SWE ratio were excellent. Emean, Emax and SWE ratio represented excellent ICC irrespective of histopathologic type. ICC for Emean was excellent regardless of tumor size. SWE ratio and Emax showed fair to good interobserver reproducibility based on tumor size. Emin represented poor interobserver reliability. Emean in SWE was highly reproducible within and between observers
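
    Intraclass correlation coefficients of the kind reported here (two-way random effects, absolute agreement, single measures, often denoted ICC(2,1)) can be computed from a subjects-by-raters matrix as sketched below. This is a generic implementation following the Shrout and Fleiss mean-square formulation, not the authors' software, and the example measurements are invented.

```python
# Generic ICC(2,1): two-way random effects, absolute agreement, single measures.
import numpy as np

def icc_2_1(data) -> float:
    """data: an (n_subjects, k_raters) array of measurements."""
    x = np.asarray(data, float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()      # between subjects
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()      # between raters
    ss_total = ((x - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented example: 5 lesions, each measured by 2 observers (elasticity in kPa).
measurements = [[30.1, 31.0], [55.2, 52.8], [12.4, 13.1], [80.5, 78.9], [41.0, 42.2]]
print(round(icc_2_1(measurements), 3))
```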

  8. Intra- and interobserver reproducibility of shear wave elastography for evaluation of the breast lesions

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Min Ji [Dept. of Radiology, Gil Hospital, Gachon University of Medicine and Science, Incheon (Korea, Republic of); Kim, Hak Hee [Dept. of Radiology, and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul (Korea, Republic of)

    2017-03-15

    To evaluate reproducibility of shear wave elastography (SWE) for breast lesions within and between observers and compare the reproducibility of SWE features. For intraobserver reproducibility, 225 masses in 208 patients were included, and two consecutive SWE images were acquired by each observer. For interobserver reproducibility, SWE images of the same mass were obtained by another observer before surgery in 40 patients. Intraclass correlation coefficients (ICC) were used to determine intra- and interobserver reproducibility. Intraobserver reliability for mean elasticity (Emean) and maximum elasticity (Emax) was excellent (ICC = 0.803, 0.799). ICCs for SWE ratio and minimum elasticity (Emin) were fair to good (ICC = 0.703, 0.539). Emean showed excellent ICC regardless of histopathologic type and tumor size. Emax, SWE ratio and Emin represented excellent or fair to good reproducibility based on histopathologic type and tumor size. In the interobserver study, ICCs for Emean, Emax and SWE ratio were excellent. Emean, Emax and SWE ratio represented excellent ICC irrespective of histopathologic type. ICC for Emean was excellent regardless of tumor size. SWE ratio and Emax showed fair to good interobserver reproducibility based on tumor size. Emin represented poor interobserver reliability. Emean in SWE was highly reproducible within and between observers.

  9. Reproducibility of abdominal fat assessment by ultrasound and computed tomography.

    Science.gov (United States)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaeté; Benedeti, Augusto César Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge

    2017-01-01

    To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects (39 men, 38.6%, and 62 women, 61.4%) with a mean age of 66.3 years (range, 60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than had the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility.

  10. An empirical analysis of journal policy effectiveness for computational reproducibility.

    Science.gov (United States)

    Stodden, Victoria; Seiler, Jennifer; Ma, Zhaokun

    2018-03-13

    A key component of scientific communication is sufficient information for other researchers in the field to reproduce published findings. For computational and data-enabled research, this has often been interpreted to mean making available the raw data from which results were generated, the computer code that generated the findings, and any additional information needed such as workflows and input parameters. Many journals are revising author guidelines to include data and code availability. This work evaluates the effectiveness of journal policy that requires the data and code necessary for reproducibility be made available postpublication by the authors upon request. We assess the effectiveness of such a policy by (i) requesting data and code from authors and (ii) attempting replication of the published findings. We chose a random sample of 204 scientific papers published in the journal Science after the implementation of their policy in February 2011. We found that we were able to obtain artifacts from 44% of our sample and were able to reproduce the findings for 26%. We find this policy (author remission of data and code postpublication upon request) an improvement over no policy, but currently insufficient for reproducibility.

  11. Reproducing Kernel Method for Solving Nonlinear Differential-Difference Equations

    Directory of Open Access Journals (Sweden)

    Reza Mokhtari

    2012-01-01

    Full Text Available On the basis of reproducing kernel Hilbert space theory, an iterative algorithm for solving some nonlinear differential-difference equations (NDDEs) is presented. The analytical solution is shown in series form in a reproducing kernel space, and the approximate solution u_n is constructed by truncating the series to n terms. The convergence of u_n to the analytical solution is also proved. Results obtained by the proposed method imply that it can be considered a simple and accurate method for solving such differential-difference problems.

  12. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  13. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  14. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
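
    The simulation loop described here (select the next scheduled event, execute it, repeat until the event queue is empty) is the classic discrete-event pattern. The sketch below is a generic illustration of that loop, not the disclosed tool itself; the event labels and the toggle example are invented.

```python
# Minimal generic discrete-event loop (illustrative only; not the disclosed tool).
import heapq

def simulate(initial_events, horizon=float("inf")):
    """initial_events: iterable of (time, label, action); each action(time, schedule)
    may schedule follow-up events via the provided callback."""
    queue = list(initial_events)
    heapq.heapify(queue)
    schedule = lambda t, label, action: heapq.heappush(queue, (t, label, action))
    while queue:                               # run until the event queue is emptied
        time, label, action = heapq.heappop(queue)
        if time > horizon:
            break
        print(f"t={time:.2f}  {label}")
        action(time, schedule)

# Example: a valve that toggles every 1.5 time units until t = 5.
def toggle(t, schedule):
    if t + 1.5 <= 5.0:
        schedule(t + 1.5, "valve toggles", toggle)

simulate([(0.0, "valve toggles", toggle)])
```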

  15. Reproducibility of cervical range of motion in patients with neck pain

    NARCIS (Netherlands)

    Pool, JJM; van Mameren, H; Deville, WJLM; Assendelft, WJJ; de Vet, HCW; de Winter, AF; Koes, BW; Bouter, LM; Hoving, J.L.

    2005-01-01

    Background: Reproducibility measurements of the range of motion are an important prerequisite for the interpretation of study results. The aim of the study is to assess the intra-rater and interrater reproducibility of the measurement of active Range of Motion ( ROM) in patients with neck pain using

  16. Reproducibility of clinical research in critical care: a scoping review.

    Science.gov (United States)

    Niven, Daniel J; McCormick, T Jared; Straus, Sharon E; Hemmelgarn, Brenda R; Jeffs, Lianne; Barnes, Tavish R M; Stelfox, Henry T

    2018-02-21

    The ability to reproduce experiments is a defining principle of science. Reproducibility of clinical research has received relatively little scientific attention. However, it is important as it may inform clinical practice, research agendas, and the design of future studies. We used scoping review methods to examine reproducibility within a cohort of randomized trials examining clinical critical care research and published in the top general medical and critical care journals. To identify relevant clinical practices, we searched the New England Journal of Medicine, The Lancet, and JAMA for randomized trials published up to April 2016. To identify a comprehensive set of studies for these practices, included articles informed secondary searches within other high-impact medical and specialty journals. We included late-phase randomized controlled trials examining therapeutic clinical practices in adults admitted to general medical-surgical or specialty intensive care units (ICUs). Included articles were classified using a reproducibility framework. An original study was the first to evaluate a clinical practice. A reproduction attempt re-evaluated that practice in a new set of participants. Overall, 158 practices were examined in 275 included articles. A reproduction attempt was identified for 66 practices (42%, 95% CI 33-50%). Original studies reported larger effects than reproduction attempts (primary endpoint, risk difference 16.0%, 95% CI 11.6-20.5% vs. 8.4%, 95% CI 6.0-10.8%, P = 0.003). More than half of clinical practices with a reproduction attempt demonstrated effects that were inconsistent with the original study (56%, 95% CI 42-68%), among which a large number were reported to be efficacious in the original study and to lack efficacy in the reproduction attempt (34%, 95% CI 19-52%). Two practices reported to be efficacious in the original study were found to be harmful in the reproduction attempt. A minority of critical care practices with research published

  17. Reproducibility of Psychological Experiments as a Problem of Post-Nonclassical Science

    Directory of Open Access Journals (Sweden)

    Vachkov I.V.,

    2016-04-01

    Full Text Available A fundamental project on reproducibility carried out in the USA by Brian Nosek in 2015 (the Reproducibility Project) revealed a serious methodological problem in psychology: the issue of replication of psychological experiments. Reproducibility has been traditionally perceived as one of the basic principles of the scientific method. However, methodological analysis of the modern post-nonclassical stage in the development of science suggests that this might be a bit too uncompromising as applied to psychology. It seems that the very criteria of scientific research need to be reconsidered with regard to the specifics of post-nonclassical science, and, as the authors put it, reproducibility might as a result lose its key status or even be excluded altogether. The reviewed problem and the proposed ways of coping with it are of high importance to research and practice in psychology as they define the strategies for organizing, conducting and evaluating experimental research.

  18. Coupled RipCAS-DFLOW (CoRD) Software and Data Management System for Reproducible Floodplain Vegetation Succession Modeling

    Science.gov (United States)

    Turner, M. A.; Miller, S.; Gregory, A.; Cadol, D. D.; Stone, M. C.; Sheneman, L.

    2016-12-01

    We present the Coupled RipCAS-DFLOW (CoRD) modeling system created to encapsulate the workflow to analyze the effects of stream flooding on vegetation succession. CoRD provides an intuitive command-line and web interface to run DFLOW and RipCAS in succession over many years automatically, which is a challenge because, for our application, DFLOW must be run on a supercomputing cluster via the PBS job scheduler. RipCAS is a vegetation succession model, and DFLOW is a 2D open channel flow model. Data adaptors have been developed to seamlessly convert DFLOW outputs into RipCAS inputs, and vice versa. CoRD provides automated statistical analysis and visualization, plus automatic syncing of input and output files and model run metadata to the hydrological data management system HydroShare using its excellent Python REST client. This combination of technologies and data management techniques allows the results to be shared with collaborators and eventually published. Perhaps most importantly, it allows results to be easily reproduced via either the command-line or web user interface. This system is a result of collaboration between software developers and hydrologists participating in the Western Consortium for Watershed Analysis, Visualization, and Exploration (WC-WAVE). Because of the computing-intensive nature of this particular workflow, including automating job submission/monitoring and data adaptors, software engineering expertise is required. However, the hydrologists provide the software developers with a purpose and ensure a useful, intuitive tool is developed. Our hydrologists contribute software, too: RipCAS was developed from scratch by hydrologists on the team as a specialized, open-source version of the Computer Aided Simulation Model for Instream Flow and Riparia (CASiMiR) vegetation model; our hydrologists running DFLOW provided numerous examples and help with the supercomputing system. This project is written in Python, a popular language in the

  19. Scapular dyskinesis in trapezius myalgia and intraexaminer reproducibility of clinical tests

    DEFF Research Database (Denmark)

    Juul-Kristensen, Birgit; Hilt, Kenneth; Enoch, Flemming

    2011-01-01

    dyskinesis, general health, and work ability, and 19 cases and 14 controls participated in the reproducibility study. Intraexaminer reproducibility was good to excellent for 6 of 10 clinical variables (Intraclass Correlation Coefficient [ICC] 0.76-0.91; kappa 0.84-1.00), and fair to good for four variables...

  20. Reproducibility of cervical range of motion in patients with neck pain

    NARCIS (Netherlands)

    Hoving, Jan Lucas; Pool, Jan J. M.; van Mameren, Henk; Devillé, Walter J. L. M.; Assendelft, Willem J. J.; de Vet, Henrica C. W.; de Winter, Andrea F.; Koes, Bart W.; Bouter, Lex M.

    2005-01-01

    BACKGROUND: Reproducibility measurements of the range of motion are an important prerequisite for the interpretation of study results. The aim of the study is to assess the intra-rater and inter-rater reproducibility of the measurement of active Range of Motion (ROM) in patients with neck pain using

  1. Centrifugal compressor fault diagnosis based on qualitative simulation and thermal parameters

    Science.gov (United States)

    Lu, Yunsong; Wang, Fuli; Jia, Mingxing; Qi, Yuanchen

    2016-12-01

    This paper concerns fault diagnosis of a centrifugal compressor based on thermal parameters. An improved qualitative simulation (QSIM) based fault diagnosis method is proposed to diagnose faults of the centrifugal compressor in a gas-steam combined-cycle power plant (CCPP). The qualitative models under normal and two faulty conditions have been built through analysis of the operating principle of the centrifugal compressor. To solve the problem of qualitative description of the observations of system variables, a qualitative trend extraction algorithm is applied to extract the trends of the observations. For qualitative state matching, a sliding-window-based matching strategy, which consists of variable operating-range constraints and qualitative constraints, is proposed. The matching results are used to determine which QSIM model is more consistent with the running state of the system. The correct diagnosis of two typical faults, seal leakage and valve sticking, in the centrifugal compressor has validated the targeted performance of the proposed method, showing the advantage of the fault roots contained in the thermal parameters.

  2. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data

    Directory of Open Access Journals (Sweden)

    Kuroda Mitzi I

    2010-07-01

    Full Text Available Abstract Background Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome-scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is used often to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. Results We develop the Quantized correlation coefficient (QCC) that is much less dependent on the amount of signal. This involves discretization of data into set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then properly focuses more on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. Conclusions To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.
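    A minimal sketch of the quantize-merge-correlate idea described above, in Python with numpy/scipy. The number of quantile bins and the number of background bins merged are illustrative assumptions; the record does not state the published defaults, so this is not the authors' implementation.

        import numpy as np
        from scipy.stats import pearsonr, rankdata

        def qcc(x, y, n_quantiles=20, n_background=10):
            """Sketch of a quantized correlation coefficient for two replicates.

            Probes are ranked into n_quantiles quantile levels, the lowest
            n_background levels (assumed background) are merged into a single
            level, and the Pearson correlation is recomputed on the quantized
            values. Bin counts are assumptions for illustration only.
            """
            def quantize(v):
                q = np.floor(rankdata(v) / len(v) * n_quantiles)
                q = np.clip(q, 0, n_quantiles - 1)
                # merge low-signal quantiles into one background level
                return np.where(q < n_background, float(n_background - 1), q)
            return pearsonr(quantize(x), quantize(y))[0]

        # toy replicates: a small enriched region on top of background noise
        rng = np.random.default_rng(0)
        signal = np.concatenate([np.zeros(900), np.full(100, 3.0)])
        rep1 = signal + rng.normal(size=1000)
        rep2 = signal + rng.normal(size=1000)
        print(pearsonr(rep1, rep2)[0], qcc(rep1, rep2))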

  3. Quantized correlation coefficient for measuring reproducibility of ChIP-chip data.

    Science.gov (United States)

    Peng, Shouyong; Kuroda, Mitzi I; Park, Peter J

    2010-07-27

    Chromatin immunoprecipitation followed by microarray hybridization (ChIP-chip) is used to study protein-DNA interactions and histone modifications on a genome-scale. To ensure data quality, these experiments are usually performed in replicates, and a correlation coefficient between replicates is used often to assess reproducibility. However, the correlation coefficient can be misleading because it is affected not only by the reproducibility of the signal but also by the amount of binding signal present in the data. We develop the Quantized correlation coefficient (QCC) that is much less dependent on the amount of signal. This involves discretization of data into set of quantiles (quantization), a merging procedure to group the background probes, and recalculation of the Pearson correlation coefficient. This procedure reduces the influence of the background noise on the statistic, which then properly focuses more on the reproducibility of the signal. The performance of this procedure is tested in both simulated and real ChIP-chip data. For replicates with different levels of enrichment over background and coverage, we find that QCC reflects reproducibility more accurately and is more robust than the standard Pearson or Spearman correlation coefficients. The quantization and the merging procedure can also suggest a proper quantile threshold for separating signal from background for further analysis. To measure reproducibility of ChIP-chip data correctly, a correlation coefficient that is robust to the amount of signal present should be used. QCC is one such measure. The QCC statistic can also be applied in a variety of other contexts for measuring reproducibility, including analysis of array CGH data for DNA copy number and gene expression data.

  4. Reproducibility of the dynamics of facial expressions in unilateral facial palsy.

    Science.gov (United States)

    Alagha, M A; Ju, X; Morley, S; Ayoub, A

    2018-02-01

    The aim of this study was to assess the reproducibility of non-verbal facial expressions in unilateral facial paralysis using dynamic four-dimensional (4D) imaging. The Di4D system was used to record five facial expressions of 20 adult patients. The system captured 60 three-dimensional (3D) images per second; each facial expression took 3-4 seconds, which was recorded in real time. Thus a set of 180 3D facial images was generated for each expression. The procedure was repeated after 30 min to assess the reproducibility of the expressions. A mathematical facial mesh consisting of thousands of quasi-point 'vertices' was conformed to the face in order to determine the morphological characteristics in a comprehensive manner. The vertices were tracked throughout the sequence of the 180 images. Five key 3D facial frames from each sequence of images were analyzed. Comparisons were made between the first and second capture of each facial expression to assess the reproducibility of facial movements. Corresponding images were aligned using partial Procrustes analysis, and the root mean square distance between them was calculated and analyzed statistically (paired Student t-test, P < 0.05). Facial expressions of lip purse, cheek puff, and raising of eyebrows were reproducible. Facial expressions of maximum smile and forceful eye closure were not reproducible. The limited coordination of various groups of facial muscles contributed to the lack of reproducibility of these facial expressions. 4D imaging is a useful clinical tool for the assessment of facial expressions. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
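    For illustration, a minimal numpy sketch of the comparison described above: aligning two sets of corresponding mesh vertices by partial Procrustes analysis (translation and rotation only, via the Kabsch algorithm) and computing the root mean square distance. This is a generic sketch under those assumptions, not the Di4D analysis pipeline.

        import numpy as np

        def partial_procrustes_rms(X, Y):
            """Align Y to X by translation and rotation (no scaling) and return
            the root mean square distance between corresponding vertices.
            X, Y: (n, 3) arrays of tracked, corresponding mesh vertices."""
            Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
            # Kabsch algorithm: optimal rotation mapping Yc onto Xc
            U, _, Vt = np.linalg.svd(Yc.T @ Xc)
            d = np.sign(np.linalg.det(U @ Vt))
            R = (U * np.array([1.0, 1.0, d])) @ Vt
            Y_aligned = Yc @ R
            return np.sqrt(np.mean(np.sum((Xc - Y_aligned) ** 2, axis=1)))

        # toy check: a rigidly moved copy of a mesh should give an RMS near zero
        rng = np.random.default_rng(0)
        mesh = rng.normal(size=(500, 3))
        theta = 0.3
        Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
        print(partial_procrustes_rms(mesh, mesh @ Rz.T + [5.0, 2.0, -1.0]))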

  5. Qualitative analysis of homogeneous universes

    International Nuclear Information System (INIS)

    Novello, M.; Araujo, R.A.

    1980-01-01

    The qualitative behaviour of cosmological models is investigated in two cases: homogeneous and isotropic universes containing viscous fluids in a Stokesian non-linear regime; rotating expanding universes in a state in which matter is out of thermal equilibrium. (Author) [pt

  6. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  7. Reproducibility, Controllability, and Optimization of Lenr Experiments

    Science.gov (United States)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  8. Respiratory-Gated Helical Computed Tomography of Lung: Reproducibility of Small Volumes in an Ex Vivo Model

    International Nuclear Information System (INIS)

    Biederer, Juergen; Dinkel, Julien; Bolte, Hendrik; Welzel, Thomas; Hoffmann, Beata M.Sc.; Thierfelder, Carsten; Mende, Ulrich; Debus, Juergen; Heller, Martin; Kauczor, Hans-Ulrich

    2007-01-01

    Purpose: Motion-adapted radiotherapy with gated irradiation or tracking of tumor positions requires dedicated imaging techniques such as four-dimensional (4D) helical computed tomography (CT) for patient selection and treatment planning. The objective was to evaluate the reproducibility of spatial information for small objects on respiratory-gated 4D helical CT using computer-assisted volumetry of lung nodules in a ventilated ex vivo system. Methods and Materials: Five porcine lungs were inflated inside a chest phantom and prepared with 55 artificial nodules (mean diameter, 8.4 mm ± 1.8). The lungs were respirated by a flexible diaphragm and scanned with 40-row detector CT (collimation, 24 x 1.2 mm; pitch, 0.1; rotation time, 1 s; slice thickness, 1.5 mm; increment, 0.8 mm). The 4D-CT scans acquired during respiration (eight per minute) and reconstructed at 0-100% inspiration and equivalent static scans were scored for motion-related artifacts (0 or absent to 3 or relevant). The reproducibility of nodule volumetry (three readers) was assessed using the variation coefficient (VC). Results: The mean volumes from the static and dynamic inspiratory scans were equal (364.9 and 360.8 mm³, respectively, p = 0.24). The static and dynamic end-expiratory volumes were slightly greater (371.9 and 369.7 mm³, respectively, p = 0.019). The VC for volumetry (static) was 3.1%, with no significant difference between 20 apical and 20 caudal nodules (2.6% and 3.5%, p = 0.25). In dynamic scans, the VC was greater (3.9%, p = 0.004; apical and caudal, 2.6% and 4.9%; p = 0.004), with a significant difference between static and dynamic in the 20 caudal nodules (3.5% and 4.9%, p = 0.015). This was consistent with greater motion-related artifacts and image noise at the diaphragm (p < 0.05). The VC for interobserver variability was 0.6%. Conclusion: Residual motion-related artifacts had only minimal influence on volumetry of small solid lesions. This indicates a high reproducibility of

  9. On-line quantile regression in the RKHS (Reproducing Kernel Hilbert Space) for operational probabilistic forecasting of wind power

    International Nuclear Information System (INIS)

    Gallego-Castillo, Cristobal; Bessa, Ricardo; Cavalcante, Laura; Lopez-Garcia, Oscar

    2016-01-01

    Wind power probabilistic forecast is being used as input in several decision-making problems, such as stochastic unit commitment, operating reserve setting and electricity market bidding. This work introduces a new on-line quantile regression model based on the Reproducing Kernel Hilbert Space (RKHS) framework. Its application to the field of wind power forecasting involves a discussion on the choice of the bias term of the quantile models, and the consideration of the operational framework in order to mimic real conditions. Benchmarking against linear and spline quantile regression models was performed for a real case study over an 18-month period. Model parameter selection was based on k-fold cross-validation. Results showed a noticeable improvement in terms of calibration, a key criterion for the wind power industry. Modest improvements in terms of Continuous Ranked Probability Score (CRPS) were also observed for prediction horizons between 6 and 20 h ahead. - Highlights: • New online quantile regression model based on the Reproducing Kernel Hilbert Space. • First application to operational probabilistic wind power forecasting. • Modest improvements of CRPS for prediction horizons between 6 and 20 h ahead. • Noticeable improvements in terms of calibration due to online learning.
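    As a much simpler illustration of quantile regression than the RKHS model in this record (and not a reimplementation of it), the sketch below shows the pinball loss that such models minimize and an online subgradient update for a plain linear quantile model; all variable names, rates, and data are illustrative assumptions.

        import numpy as np

        def pinball_loss(y, q_hat, tau):
            """Quantile (pinball) loss, the objective minimized in quantile regression."""
            u = y - q_hat
            return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

        def online_linear_quantile(X, y, tau=0.9, lr=0.05, n_epochs=5):
            """Online subgradient updates for a linear tau-quantile model (didactic only)."""
            w = np.zeros(X.shape[1])
            for _ in range(n_epochs):
                for x_t, y_t in zip(X, y):
                    # subgradient of the pinball loss with respect to w
                    w += lr * (tau - float(y_t < x_t @ w)) * x_t
            return w

        # toy data: estimate a high quantile of y from a single feature plus intercept
        rng = np.random.default_rng(0)
        X = np.column_stack([np.ones(2000), rng.uniform(0.0, 1.0, 2000)])
        y = 2.0 * X[:, 1] + rng.gamma(2.0, 0.5, 2000)
        w = online_linear_quantile(X, y, tau=0.9)
        print(w, pinball_loss(y, X @ w, 0.9))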

  10. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions...... of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  11. Cervical vertebrae maturation method morphologic criteria: poor reproducibility.

    Science.gov (United States)

    Nestman, Trenton S; Marshall, Steven D; Qian, Fang; Holton, Nathan; Franciscus, Robert G; Southard, Thomas E

    2011-08-01

    The cervical vertebrae maturation (CVM) method has been advocated as a predictor of peak mandibular growth. A careful review of the literature showed potential methodologic errors that might influence the high reported reproducibility of the CVM method, and we recently established that the reproducibility of the CVM method was poor when these potential errors were eliminated. The purpose of this study was to further investigate the reproducibility of the individual vertebral patterns. In other words, the purpose was to determine which of the individual CVM vertebral patterns could be classified reliably and which could not. Ten practicing orthodontists, trained in the CVM method, evaluated the morphology of cervical vertebrae C2 through C4 from 30 cephalometric radiographs using questions based on the CVM method. The Fleiss kappa statistic was used to assess interobserver agreement when evaluating each cervical vertebrae morphology question for each subject. The Kendall coefficient of concordance was used to assess the level of interobserver agreement when determining a "derived CVM stage" for each subject. Interobserver agreement was high for assessment of the lower borders of C2, C3, and C4 that were either flat or curved in the CVM method, but interobserver agreement was low for assessment of the vertebral bodies of C3 and C4 when they were either trapezoidal, rectangular horizontal, square, or rectangular vertical; this led to the overall poor reproducibility of the CVM method. These findings were reflected in the Fleiss kappa statistic. Furthermore, nearly 30% of the time, individual morphologic criteria could not be combined to generate a final CVM stage because of incompatible responses to the 5 questions. Intraobserver agreement in this study was only 62%, on average, when the inconclusive stagings were excluded as disagreements. Intraobserver agreement was worse (44%) when the inconclusive stagings were included as disagreements. For the group of subjects
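    The multi-rater agreement statistic referred to above can be computed with the standard Fleiss kappa formula; a minimal Python sketch follows. The rating table in the example is entirely hypothetical and is not the study's data.

        import numpy as np

        def fleiss_kappa(counts):
            """Fleiss' kappa for agreement among multiple raters.
            counts: (n_subjects, n_categories) array where counts[i, j] is the
            number of raters assigning subject i to category j; every row must
            sum to the same number of raters."""
            counts = np.asarray(counts, dtype=float)
            n_subjects, _ = counts.shape
            n_raters = counts[0].sum()
            p_j = counts.sum(axis=0) / (n_subjects * n_raters)   # category proportions
            P_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
            P_bar, P_e = P_i.mean(), np.sum(p_j ** 2)
            return (P_bar - P_e) / (1.0 - P_e)

        # hypothetical example: 5 radiographs, 10 raters, 4 vertebral-shape categories
        ratings = np.array([[10, 0, 0, 0],
                            [ 7, 2, 1, 0],
                            [ 4, 3, 2, 1],
                            [ 0, 5, 5, 0],
                            [ 2, 2, 3, 3]])
        print(round(fleiss_kappa(ratings), 3))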

  12. Simplified Qualitative Discrete Numerical Model to Determine Cracking Pattern in Brittle Materials by Means of Finite Element Method

    OpenAIRE

    Ochoa-Avendaño, J.; Garzon-Alvarado, D. A.; Linero, Dorian L.; Cerrolaza, M.

    2017-01-01

    This paper presents the formulation, implementation, and validation of a simplified qualitative model to determine the crack path of solids considering static loads, infinitesimal strain, and plane stress condition. This model is based on finite element method with a special meshing technique, where nonlinear link elements are included between the faces of the linear triangular elements. The stiffness loss of some link elements represents the crack opening. Three experimental tests of bending...

  13. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    Science.gov (United States)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first order equation, which can be solved analytically. Moreover, we endeavor to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from the medicinal plants.
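    The record states that the kinetic system was reduced to a first-order equation with an analytic solution but does not give the equation itself; the sketch below therefore only illustrates a generic first-order accumulation law with an assumed multiplicative inhibition factor, not the authors' actual model, rate law, or parameter values.

        import numpy as np

        def first_order_accumulation(t, k, p_max, inhibition=0.0, p0=0.0):
            """Analytic solution of dP/dt = k_eff * (p_max - P) with
            k_eff = k * (1 - inhibition); a generic first-order law used only
            to illustrate the kind of reduction described in the record."""
            k_eff = k * (1.0 - inhibition)
            return p_max - (p_max - p0) * np.exp(-k_eff * t)

        t = np.linspace(0.0, 60.0, 7)  # minutes (illustrative)
        print(first_order_accumulation(t, k=0.1, p_max=1.0))                  # uninhibited
        print(first_order_accumulation(t, k=0.1, p_max=1.0, inhibition=0.7))  # with antioxidant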

  14. Retrospective Correction of Physiological Noise: Impact on Sensitivity, Specificity, and Reproducibility of Resting-State Functional Connectivity in a Reading Network Model.

    Science.gov (United States)

    Krishnamurthy, Venkatagiri; Krishnamurthy, Lisa C; Schwam, Dina M; Ealey, Ashley; Shin, Jaemin; Greenberg, Daphne; Morris, Robin D

    2018-03-01

    It is well accepted that physiological noise (PN) obscures the detection of neural fluctuations in resting-state functional connectivity (rsFC) magnetic resonance imaging. However, a clear consensus for an optimal PN correction (PNC) methodology and how it can impact the rsFC signal characteristics is still lacking. In this study, we probe the impact of three PNC methods: RETROICOR (Glover et al., 2000), ANATICOR (Jo et al., 2010), and RVTMBPM (Bianciardi et al., 2009). Using a reading network model, we systematically explore the effects of PNC optimization on sensitivity, specificity, and reproducibility of rsFC signals. In terms of specificity, ANATICOR was found to be effective in removing local white matter (WM) fluctuations and also resulted in aggressive removal of expected cortical-to-subcortical functional connections. The ability of RETROICOR to remove PN was equivalent to removal of simulated random PN such that it artificially inflated the connection strength, thereby decreasing sensitivity. RVTMBPM maintained specificity and sensitivity by balanced removal of vasodilatory PN and local WM nuisance edges. Another aspect of this work was exploring the effects of PNC on identifying reading group differences. Most PNC methods accounted for between-subject PN variability resulting in reduced intersession reproducibility. This effect facilitated the detection of the most consistent group differences. RVTMBPM was most effective in detecting significant group differences due to its inherent sensitivity to removing spatially structured and temporally repeating PN arising from dense vasculature. Finally, results suggest that combining all three PNC methods resulted in "overcorrection" by removing signal along with noise.

  15. Microbial community development in a dynamic gut model is reproducible, colon region specific, and selective for Bacteroidetes and Clostridium cluster IX.

    Science.gov (United States)

    Van den Abbeele, Pieter; Grootaert, Charlotte; Marzorati, Massimo; Possemiers, Sam; Verstraete, Willy; Gérard, Philippe; Rabot, Sylvie; Bruneau, Aurélia; El Aidy, Sahar; Derrien, Muriel; Zoetendal, Erwin; Kleerebezem, Michiel; Smidt, Hauke; Van de Wiele, Tom

    2010-08-01

    Dynamic, multicompartment in vitro gastrointestinal simulators are often used to monitor gut microbial dynamics and activity. These reactors need to harbor a microbial community that is stable upon inoculation, colon region specific, and relevant to in vivo conditions. Together with the reproducibility of the colonization process, these criteria are often overlooked when the modulatory properties from different treatments are compared. We therefore investigated the microbial colonization process in two identical simulators of the human intestinal microbial ecosystem (SHIME), simultaneously inoculated with the same human fecal microbiota with a high-resolution phylogenetic microarray: the human intestinal tract chip (HITChip). Following inoculation of the in vitro colon compartments, microbial community composition reached steady state after 2 weeks, whereas 3 weeks were required to reach functional stability. This dynamic colonization process was reproducible in both SHIME units and resulted in highly diverse microbial communities which were colon region specific, with the proximal regions harboring saccharolytic microbes (e.g., Bacteroides spp. and Eubacterium spp.) and the distal regions harboring mucin-degrading microbes (e.g., Akkermansia spp.). Importantly, the shift from an in vivo to an in vitro environment resulted in an increased Bacteroidetes/Firmicutes ratio, whereas Clostridium cluster IX (propionate producers) was enriched compared to clusters IV and XIVa (butyrate producers). This was supported by proportionally higher in vitro propionate concentrations. In conclusion, high-resolution analysis of in vitro-cultured gut microbiota offers new insight on the microbial colonization process and indicates the importance of digestive parameters that may be crucial in the development of new in vitro models.

  16. Reproducibility of heart rate variability parameters measured in healthy subjects at rest and after a postural change maneuver

    Directory of Open Access Journals (Sweden)

    E.M. Dantas

    2010-10-01

    Full Text Available Heart rate variability (HRV) provides important information about cardiac autonomic modulation. Since it is a noninvasive and inexpensive method, HRV has been used to evaluate several parameters of cardiovascular health. However, the internal reproducibility of this method has been challenged in some studies. Our aim was to determine the intra-individual reproducibility of HRV parameters in short-term recordings obtained in supine and orthostatic positions. Electrocardiographic (ECG) recordings were obtained from 30 healthy subjects (20-49 years, 14 men) using a digital apparatus (sampling ratio = 250 Hz). ECG was recorded for 10 min in the supine position and for 10 min in the orthostatic position. The procedure was repeated 2-3 h later. Time and frequency domain analyses were performed. Frequency domain included low (LF, 0.04-0.15 Hz) and high frequency (HF, 0.15-0.4 Hz) bands. Power spectral analysis was performed by the autoregressive method and model order was set at 16. Intra-subject agreement was assessed by linear regression analysis, test of difference in variances and limits of agreement. Most HRV measures (pNN50, RMSSD, LF, HF, and LF/HF ratio) were reproducible independent of body position. Better correlation indexes (r > 0.6) were obtained in the orthostatic position. Bland-Altman plots revealed that most values were inside the agreement limits, indicating concordance between measures. Only SDNN and NNv in the supine position were not reproducible. Our results showed reproducibility of HRV parameters when recorded in the same individual with a short time between two exams. The increased sympathetic activity occurring in the orthostatic position probably facilitates reproducibility of the HRV indexes.
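    For reference, the time-domain indices named above (SDNN, RMSSD, pNN50) have standard definitions that can be computed directly from the NN (RR) interval series; a short sketch with simulated intervals is given below. The frequency-domain (autoregressive) analysis used in the study is not reproduced here.

        import numpy as np

        def hrv_time_domain(nn_ms):
            """Standard time-domain HRV indices from NN (RR) intervals in milliseconds."""
            nn = np.asarray(nn_ms, dtype=float)
            diff = np.diff(nn)
            return {
                "SDNN": nn.std(ddof=1),                         # overall variability
                "RMSSD": np.sqrt(np.mean(diff ** 2)),           # short-term variability
                "pNN50": 100.0 * np.mean(np.abs(diff) > 50.0),  # % of successive diffs > 50 ms
            }

        # simulated 10-min recording at roughly 70 bpm with random beat-to-beat variation
        rng = np.random.default_rng(1)
        rr = 857.0 + rng.normal(0.0, 40.0, size=700)
        print(hrv_time_domain(rr))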

  17. Application of qualitative reasoning with functional knowledge represented by Multilevel Flow Modeling to diagnosis of accidental situation in nuclear power plant

    International Nuclear Information System (INIS)

    Yoshida, Kazuo; Tanabe, Fumiya; Kawase, Katumi.

    1996-01-01

    It has been proposed to use the Multilevel Flow Modeling (MFM) by M. Lind as a framework for functional knowledge representation for qualitative reasoning in a complex process system such as a nuclear power plant. Building a knowledge base with the MFM framework makes it possible to represent functional characteristics at different levels of abstraction and aggregation. A pilot inference system based on qualitative reasoning with MFM has been developed to diagnose the cause of abnormal events in a typical PWR power plant. Some single failure events have been diagnosed with this system to verify the proposed method. In the verification study, some investigation has also been performed to clarify the effects of this knowledge representation on the efficiency of reasoning and the ambiguity of qualitative reasoning. (author)

  18. Validation of the 3D Skin Comet assay using full thickness skin models: Transferability and reproducibility.

    Science.gov (United States)

    Reisinger, Kerstin; Blatz, Veronika; Brinkmann, Joep; Downs, Thomas R; Fischer, Anja; Henkler, Frank; Hoffmann, Sebastian; Krul, Cyrille; Liebsch, Manfred; Luch, Andreas; Pirow, Ralph; Reus, Astrid A; Schulz, Markus; Pfuhler, Stefan

    2018-03-01

    Recently revised OECD Testing Guidelines highlight the importance of considering the first site-of-contact when investigating the genotoxic hazard. Thus far, only in vivo approaches are available to address the dermal route of exposure. The 3D Skin Comet and Reconstructed Skin Micronucleus (RSMN) assays intend to close this gap in the in vitro genotoxicity toolbox by investigating DNA damage after topical application. This represents the most relevant route of exposure for a variety of compounds found in household products, cosmetics, and industrial chemicals. The comet assay methodology is able to detect both chromosomal damage and DNA lesions that may give rise to gene mutations, thereby complementing the RSMN which detects only chromosomal damage. Here, the comet assay was adapted to two reconstructed full thickness human skin models: the EpiDerm™- and Phenion ® Full-Thickness Skin Models. First, tissue-specific protocols for the isolation of single cells and the general comet assay were transferred to European and US-American laboratories. After establishment of the assay, the protocol was then further optimized with appropriate cytotoxicity measurements and the use of aphidicolin, a DNA repair inhibitor, to improve the assay's sensitivity. In the first phase of an ongoing validation study eight chemicals were tested in three laboratories each using the Phenion ® Full-Thickness Skin Model, informing several validation modules. Ultimately, the 3D Skin Comet assay demonstrated a high predictive capacity and good intra- and inter-laboratory reproducibility with four laboratories reaching a 100% predictivity and the fifth yielding 70%. The data are intended to demonstrate the use of the 3D Skin Comet assay as a new in vitro tool for following up on positive findings from the standard in vitro genotoxicity test battery for dermally applied chemicals, ultimately helping to drive the regulatory acceptance of the assay. To expand the database, the validation will

  19. Recent advances on multidimensional liquid chromatography–mass spectrometry for proteomics: From qualitative to quantitative analysis—A review

    International Nuclear Information System (INIS)

    Wu Qi; Yuan Huiming; Zhang Lihua; Zhang Yukui

    2012-01-01

    Highlights: ► We discuss progress of MDLC–MS systems in qualitative and quantitative proteomics. ► Both “Top-down” and “bottom-up” strategies are discussed in detail. ► On-line integrations of stable isotope labeling process are highlighted. ► This review gives insights into further directions for higher level integration. - Abstract: With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography–mass spectrometry (MDLC–MS) due to its high peak capacity and separation efficiency. Recently, many efforts have been put to improve MDLC-based strategies including “top-down” and “bottom-up” to enable highly sensitive qualitative and quantitative analysis of proteins, as well as accelerate the whole analytical procedure. Integrated platforms with combination of sample pretreatment, multidimensional separations and identification were also developed to achieve high throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarized the recent advances of such techniques and their applications in qualitative and quantitative analysis of proteomes.

  20. Reproducibility of abdominal fat assessment by ultrasound and computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Mauad, Fernando Marum; Chagas-Neto, Francisco Abaete; Benedeti, Augusto Cesar Garcia Saab; Nogueira-Barbosa, Marcello Henrique; Muglia, Valdair Francisco; Carneiro, Antonio Adilton Oliveira; Muller, Enrico Mattana; Elias Junior, Jorge, E-mail: fernando@fatesa.edu.br [Faculdade de Tecnologia em Saude (FATESA), Ribeirao Preto, SP (Brazil); Universidade de Fortaleza (UNIFOR), Fortaleza, CE (Brazil). Departmento de Radiologia; Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina. Departmento de Medicina Clinica; Universidade de Sao Paulo (FFCLRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Filosofia, Ciencias e Letras; Hospital Mae de Deus, Porto Alegre, RS (Brazil)

    2017-05-15

    Objective: To test the accuracy and reproducibility of ultrasound and computed tomography (CT) for the quantification of abdominal fat in correlation with the anthropometric, clinical, and biochemical assessments. Materials and Methods: Using ultrasound and CT, we determined the thickness of subcutaneous and intra-abdominal fat in 101 subjects-of whom 39 (38.6%) were men and 62 (61.4%) were women-with a mean age of 66.3 years (60-80 years). The ultrasound data were correlated with the anthropometric, clinical, and biochemical parameters, as well as with the areas measured by abdominal CT. Results: Intra-abdominal thickness was the variable for which the correlation with the areas of abdominal fat was strongest (i.e., the correlation coefficient was highest). We also tested the reproducibility of ultrasound and CT for the assessment of abdominal fat and found that CT measurements of abdominal fat showed greater reproducibility, having higher intraobserver and interobserver reliability than had the ultrasound measurements. There was a significant correlation between ultrasound and CT, with a correlation coefficient of 0.71. Conclusion: In the assessment of abdominal fat, the intraobserver and interobserver reliability were greater for CT than for ultrasound, although both methods showed high accuracy and good reproducibility. (author)

  1. Current induced torques and interfacial spin-orbit coupling: Semiclassical modeling

    KAUST Repository

    Haney, Paul M.; Lee, Hyun-Woo; Lee, Kyung-Jin; Manchon, Aurelien; Stiles, M. D.

    2013-01-01

    , that qualitatively reproduces the behavior, but quantitatively differs in some regimes. We show that the Boltzmann equation with physically reasonable parameters can match the torques for any particular sample, but in some cases, it fails to describe

  2. Empirical evaluation of cross-site reproducibility in radiomic features for characterizing prostate MRI

    Science.gov (United States)

    Chirra, Prathyush; Leo, Patrick; Yim, Michael; Bloch, B. Nicolas; Rastinehad, Ardeshir R.; Purysko, Andrei; Rosen, Mark; Madabhushi, Anant; Viswanath, Satish

    2018-02-01

    The recent advent of radiomics has enabled the development of prognostic and predictive tools which use routine imaging, but a key question that still remains is how reproducible these features may be across multiple sites and scanners. This is especially relevant in the context of MRI data, where signal intensity values lack tissue-specific, quantitative meaning, as well as being dependent on acquisition parameters (magnetic field strength, image resolution, type of receiver coil). In this paper we present the first empirical study of the reproducibility of 5 different radiomic feature families in a multi-site setting; specifically, for characterizing prostate MRI appearance. Our cohort comprised 147 patient T2w MRI datasets from 4 different sites, all of which were first pre-processed to correct for acquisition-related artifacts such as bias field, differing voxel resolutions, as well as intensity drift (non-standardness). 406 3D voxel-wise radiomic features were extracted and evaluated in a cross-site setting to determine how reproducible they were within a relatively homogeneous non-tumor tissue region; using 2 different measures of reproducibility: Multivariate Coefficient of Variation and Instability Score. Our results demonstrated that Haralick features were most reproducible between all 4 sites. By comparison, Laws features were among the least reproducible between sites, as well as performing highly variably across their entire parameter space. Similarly, the Gabor feature family demonstrated good cross-site reproducibility, but for certain parameter combinations alone. These trends indicate that despite extensive pre-processing, only a subset of radiomic features and associated parameters may be reproducible enough for use within radiomics-based machine learning classifier schemes.
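    A hedged, univariate simplification of the cross-site reproducibility measures mentioned above: the coefficient of variation of each feature's site-level mean. The multivariate coefficient of variation and the Instability Score used in the paper are not reproduced here, and the example array is entirely hypothetical.

        import numpy as np

        def per_feature_cv(site_feature_means):
            """Coefficient of variation of each radiomic feature across sites.
            site_feature_means: (n_sites, n_features) array of mean feature values
            computed within a comparable (e.g., non-tumor) region at each site."""
            m = np.asarray(site_feature_means, dtype=float)
            return m.std(axis=0, ddof=1) / np.abs(m.mean(axis=0))

        # hypothetical site-level means for 3 features at 4 sites
        sites = np.array([[1.02, 5.1, 0.30],
                          [0.98, 7.9, 0.31],
                          [1.05, 3.2, 0.29],
                          [0.99, 6.4, 0.33]])
        print(per_feature_cv(sites))  # lower CV suggests better cross-site reproducibility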

  3. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    Science.gov (United States)

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.

  4. Horizontal, anomalous U(1) symmetry for the more minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Nelson, A.E.; Wright, D.

    1997-01-01

    We construct explicit examples with a horizontal, "anomalous" U(1) gauge group, which, in a supersymmetric extension of the standard model, reproduce qualitative features of the fermion spectrum and CKM matrix, and suppress FCNC and proton decay rates without the imposition of global symmetries. We review the motivation for such "more" minimal supersymmetric standard models and their predictions for the sparticle spectrum. There is a mass hierarchy in the scalar sector which is the inverse of the fermion mass hierarchy. We show in detail why ΔS=2 FCNCs are greatly suppressed when compared with naive estimates for nondegenerate squarks. copyright 1997 The American Physical Society

  5. The new AP Physics exams: Integrating qualitative and quantitative reasoning

    Science.gov (United States)

    Elby, Andrew

    2015-04-01

    When physics instructors and education researchers emphasize the importance of integrating qualitative and quantitative reasoning in problem solving, they usually mean using those types of reasoning serially and separately: first students should analyze the physical situation qualitatively/conceptually to figure out the relevant equations, then they should process those equations quantitatively to generate a solution, and finally they should use qualitative reasoning to check that answer for plausibility (Heller, Keith, & Anderson, 1992). The new AP Physics 1 and 2 exams will, of course, reward this approach to problem solving. But one kind of free response question will demand and reward a further integration of qualitative and quantitative reasoning, namely mathematical modeling and sense-making--inventing new equations to capture a physical situation and focusing on proportionalities, inverse proportionalities, and other functional relations to infer what the equation ``says'' about the physical world. In this talk, I discuss examples of these qualitative-quantitative translation questions, highlighting how they differ from both standard quantitative and standard qualitative questions. I then discuss the kinds of modeling activities that can help AP and college students develop these skills and habits of mind.

  6. Serous tubal intraepithelial carcinoma: diagnostic reproducibility and its implications.

    Science.gov (United States)

    Carlson, Joseph W; Jarboe, Elke A; Kindelberger, David; Nucci, Marisa R; Hirsch, Michelle S; Crum, Christopher P

    2010-07-01

    Serous tubal intraepithelial carcinoma (STIC) is detected in between 5% and 7% of women undergoing risk-reduction salpingo-oophorectomy for mutations in the BRCA1 or 2 genes (BRCA+), and seems to play a role in the pathogenesis of many ovarian and "primary peritoneal" serous carcinomas. The recognition of STIC is germane to the management of BRCA+ women; however, the diagnostic reproducibility of STIC is unknown. Twenty-one cases were selected and classified as STIC or benign, using both hematoxylin and eosin and immunohistochemical stains for p53 and MIB-1. Digital images of 30 hematoxylin and eosin-stained STICs (n=14) or benign tubal epithelium (n=16) were photographed and randomized for blind digital review in a PowerPoint format by 6 experienced gynecologic pathologists and 6 pathology trainees. A generalized kappa statistic for multiple raters was calculated for all groups. For all reviewers, the kappa was 0.333, indicating poor reproducibility; kappa was 0.453 for the experienced gynecologic pathologists (fair-to-good reproducibility), and kappa = 0.253 for the pathology residents (poor reproducibility). In the experienced group, 3 of 14 STICs were diagnosed by all 6 reviewers, and 9 of 14 by a majority of the reviewers. These results show that interobserver concordance in the recognition of STIC in high-quality digital images is at best fair-to-good for even experienced gynecologic pathologists, and a proportion cannot be consistently identified even among experienced observers. In view of these findings, a diagnosis of STIC should be corroborated by a second pathologist, if feasible.

  7. LHC Orbit Correction Reproducibility and Related Machine Protection

    CERN Document Server

    Baer, T; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure an adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the machine in a vulnerable state, a novel software-based interlock system for orbit corrector currents was developed. In this paper, the principle of the new interlock system is described and the reproducibility of the LHC orbit correction is discussed against the background of this system.

  8. Understanding reproducibility of human IVF traits to predict next IVF cycle outcome.

    Science.gov (United States)

    Wu, Bin; Shi, Juanzi; Zhao, Wanqiu; Lu, Suzhen; Silva, Marta; Gelety, Timothy J

    2014-10-01

    Evaluating the failed IVF cycle often provides useful prognostic information. Before undergoing another attempt, patients experiencing an unsuccessful IVF cycle frequently request information about the probability of future success. Here, we introduced the concept of reproducibility and formulae to predict the next IVF cycle outcome. The experimental design was based on the retrospective review of IVF cycle data from 2006 to 2013 in two different IVF centers and statistical analysis. The reproducibility coefficients (r) of IVF traits including number of oocytes retrieved, oocyte maturity, fertilization, embryo quality and pregnancy were estimated using the interclass correlation coefficient between the repeated IVF cycle measurements for the same patient by variance component analysis. The formulae were designed to predict next IVF cycle outcome. The number of oocytes retrieved from patients and their fertilization rate had the highest reproducibility coefficients (r = 0.81 ~ 0.84), which indicated a very close correlation between the first retrieval cycle and subsequent IVF cycles. Oocyte maturity and number of top quality embryos had middle level reproducibility (r = 0.38 ~ 0.76) and pregnancy rate had a relative lower reproducibility (r = 0.23 ~ 0.27). Based on these parameters, the next outcome for these IVF traits might be accurately predicted by the designed formulae. The introduction of the concept of reproducibility to our human IVF program allows us to predict future IVF cycle outcomes. The traits of oocyte numbers retrieved, oocyte maturity, fertilization, and top quality embryos had higher or middle reproducibility, which provides a basis for accurate prediction of future IVF outcomes. Based on this prediction, physicians may counsel their patients or change patient's stimulation plans, and laboratory embryologists may improve their IVF techniques accordingly.
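    One common way to estimate a reproducibility (intraclass-type) coefficient between repeated cycles of the same patients is a one-way random-effects ANOVA; a minimal sketch follows. This is a generic estimator under stated assumptions, not necessarily the exact variance-component model used by the authors, and the example numbers are invented.

        import numpy as np

        def icc_oneway(data):
            """ICC(1,1) from a one-way random-effects ANOVA.
            data: (n_subjects, k_repeats) array, e.g. oocytes retrieved in two
            successive IVF cycles of the same patients."""
            y = np.asarray(data, dtype=float)
            n, k = y.shape
            ms_between = k * np.sum((y.mean(axis=1) - y.mean()) ** 2) / (n - 1)
            ms_within = np.sum((y - y.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        # invented example: 6 patients, 2 retrieval cycles each
        cycles = np.array([[12, 11], [5, 6], [20, 18], [8, 10], [15, 14], [3, 4]])
        print(round(icc_oneway(cycles), 2))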

  9. On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry

    Science.gov (United States)

    Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri

    2018-02-01

    Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducible cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Thus our results demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.

  10. Using a nursing theory or a model in nursing PhD dissertations: a qualitative study from Turkey.

    Science.gov (United States)

    Mete, Samiye; Gokçe İsbir, Gozde

    2015-04-01

    The aim of this study was to reveal the experiences of nursing students and their advisors using theories and models in their PhD dissertations. The study adopted a descriptive qualitative approach. This study was performed with 10 PhD candidates and their five advisors from a nursing faculty. The results of the study were categorized into four groups: reasons for using a theory/model in a PhD dissertation, reasons for preferring a given model, causes of difficulties in using models in PhD dissertations, and facilitating factors of using theories and models in PhD dissertations. The use of a theory or model was also reported to contribute to the methodology of the research and to the professional development of the students and advisors. © 2014 NANDA International, Inc.

  11. Preliminary clinical nursing leadership competency model: a qualitative study from Thailand.

    Science.gov (United States)

    Supamanee, Treeyaphan; Krairiksh, Marisa; Singhakhumfu, Laddawan; Turale, Sue

    2011-12-01

    This qualitative study explored the clinical nursing leadership competency perspectives of Thai nurses working in a university hospital. To collect data, in-depth interviews were undertaken with 23 nurse administrators, and focus groups were used with 31 registered nurses. Data were analyzed using content analysis, and theory development was guided by the Iceberg model. Nurses' clinical leadership competencies emerged, comprising hidden characteristics and surface characteristics. The hidden characteristics composed three elements: motive (respect from the nursing and healthcare team and being secure in life), self-concept (representing positive attitudes and values), and traits (personal qualities necessary for leadership). The surface characteristics comprised specific knowledge of nurse leaders about clinical leadership, management and nursing informatics, and clinical skills, such as coordination, effective communication, problem solving, and clinical decision-making. The study findings help nursing to gain greater knowledge of the essence of clinical nursing leadership competencies, a matter critical for theory development in leadership. This study's results later led to the instigation of a training program for registered nurse leaders at the study site, and the formation of a preliminary clinical nursing leadership competency model. © 2011 Blackwell Publishing Asia Pty Ltd.

  12. Reproducibility of airway luminal size in asthma measured by HRCT.

    Science.gov (United States)

    Brown, Robert H; Henderson, Robert J; Sugar, Elizabeth A; Holbrook, Janet T; Wise, Robert A

    2017-10-01

    Brown RH, Henderson RJ, Sugar EA, Holbrook JT, Wise RA, on behalf of the American Lung Association Airways Clinical Research Centers. Reproducibility of airway luminal size in asthma measured by HRCT. J Appl Physiol 123: 876-883, 2017. First published July 13, 2017; doi:10.1152/japplphysiol.00307.2017.-High-resolution CT (HRCT) is a well-established imaging technology used to measure lung and airway morphology in vivo. However, there is a surprising lack of studies examining HRCT reproducibility. The CPAP Trial was a multicenter, randomized, three-parallel-arm, sham-controlled 12-wk clinical trial to assess the use of a nocturnal continuous positive airway pressure (CPAP) device on airway reactivity to methacholine. The lack of a treatment effect of CPAP on clinical or HRCT measures provided an opportunity for the current analysis. We assessed the reproducibility of HRCT imaging over 12 wk. Intraclass correlation coefficients (ICCs) were calculated for individual airway segments, individual lung lobes, both lungs, and air trapping. The ICC [95% confidence interval (CI)] for airway luminal size at total lung capacity ranged from 0.95 (0.91, 0.97) to 0.47 (0.27, 0.69). The ICC (95% CI) for airway luminal size at functional residual capacity ranged from 0.91 (0.85, 0.95) to 0.32 (0.11, 0.65). The ICC measurements for airway distensibility index and wall thickness were lower, ranging from poor (0.08) to moderate (0.63) agreement. The ICC for air trapping at functional residual capacity was 0.89 (0.81, 0.94) and varied only modestly by lobe from 0.76 (0.61, 0.87) to 0.95 (0.92, 0.97). In stable well-controlled asthmatic subjects, it is possible to reproducibly image unstimulated airway luminal areas over time, by region, and by size at total lung capacity throughout the lungs. Therefore, any changes in luminal size on repeat CT imaging are more likely due to changes in disease state and less likely due to normal variability. NEW & NOTEWORTHY There is a surprising lack

  13. Reproducing Electric Field Observations during Magnetic Storms by means of Rigorous 3-D Modelling and Distortion Matrix Co-estimation

    Science.gov (United States)

    Püthe, Christoph; Manoj, Chandrasekharan; Kuvshinov, Alexey

    2015-04-01

    Electric fields induced in the conducting Earth during magnetic storms drive currents in power transmission grids, telecommunication lines or buried pipelines. These geomagnetically induced currents (GIC) can cause severe service disruptions. The prediction of GIC is thus of great importance for public and industry. A key step in the prediction of the hazard to technological systems during magnetic storms is the calculation of the geoelectric field. To address this issue for mid-latitude regions, we developed a method that involves 3-D modelling of induction processes in a heterogeneous Earth and the construction of a model of the magnetospheric source. The latter is described by low-degree spherical harmonics; its temporal evolution is derived from observatory magnetic data. Time series of the electric field can be computed for every location on Earth's surface. The actual electric field however is known to be perturbed by galvanic effects, arising from very local near-surface heterogeneities or topography, which cannot be included in the conductivity model. Galvanic effects are commonly accounted for with a real-valued time-independent distortion matrix, which linearly relates measured and computed electric fields. Using data of various magnetic storms that occurred between 2000 and 2003, we estimated distortion matrices for observatory sites onshore and on the ocean bottom. Strong correlations between modellings and measurements validate our method. The distortion matrix estimates prove to be reliable, as they are accurately reproduced for different magnetic storms. We further show that 3-D modelling is crucial for a correct separation of galvanic and inductive effects and a precise prediction of electric field time series during magnetic storms. Since the required computational resources are negligible, our approach is suitable for a real-time prediction of GIC. For this purpose, a reliable forecast of the source field, e.g. based on data from satellites
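    The galvanic distortion described above is a real-valued, time-independent matrix relating measured and modelled horizontal electric fields; under that assumption it can be estimated by ordinary least squares over storm-time samples. The sketch below is a generic illustration with synthetic data, not the authors' estimation code.

        import numpy as np

        def estimate_distortion_matrix(e_modelled, e_measured):
            """Least-squares estimate of a real 2x2 distortion matrix D with
            e_measured(t) ~ D @ e_modelled(t); inputs are (n_times, 2) arrays
            of horizontal components (Ex, Ey) at a single site."""
            Dt, *_ = np.linalg.lstsq(e_modelled, e_measured, rcond=None)
            return Dt.T

        # synthetic check: recover a known distortion from noisy "observations"
        rng = np.random.default_rng(2)
        D_true = np.array([[1.3, -0.2], [0.4, 0.8]])
        e_mod = rng.normal(size=(500, 2))
        e_obs = e_mod @ D_true.T + 0.05 * rng.normal(size=(500, 2))
        print(estimate_distortion_matrix(e_mod, e_obs).round(2))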

  14. Qualitative modeling of the decision-making process using electrooculography.

    Science.gov (United States)

    Zargari Marandi, Ramtin; Sabzpoushan, S H

    2015-12-01

    A novel method based on electrooculography (EOG) has been introduced in this work to study the decision-making process. An experiment was designed and implemented wherein subjects were asked to choose between two items from the same category that were presented within a limited time. The EOG and voice signals of the subjects were recorded during the experiment. A calibration task was performed to map the EOG signals to their corresponding gaze positions on the screen by using an artificial neural network. To analyze the data, 16 parameters were extracted from the response time and EOG signals of the subjects. Evaluation and comparison of the parameters, together with subjects' choices, revealed functional information. On the basis of this information, subjects switched their eye gazes between items about three times on average. We also found, according to statistical hypothesis testing-that is, a t test, t(10) = 71.62, SE = 1.25, p < .0001-that the correspondence rate of a subjects' gaze at the moment of selection with the selected item was significant. Ultimately, on the basis of these results, we propose a qualitative choice model for the decision-making task.

  15. Measurements of boat motion in waves at Durban harbour for qualitative validation of motion model

    CSIR Research Space (South Africa)

    Mosikare, OR

    2010-09-01

    Full Text Available Measurements of Boat Motion in Waves at Durban Harbour for Qualitative Validation of Motion Model. O.R. Mosikare (1,2), N.J. Theron (1), W. Van der Molen (1). (1) University of Pretoria, South Africa, 0001; (2) Council for Scientific and Industrial Research, Meiring Naude Rd, Brummeria, 0001...

  16. The Maudsley Model of Family-Based Treatment for Anorexia Nervosa: A Qualitative Evaluation of Parent-to-Parent Consultation

    Science.gov (United States)

    Rhodes, Paul; Brown, Jac; Madden, Sloane

    2009-01-01

    This article describes the qualitative analysis of a randomized control trial that explores the use of parent-to-parent consultations as an augmentation to the Maudsley model of family-based treatment for anorexia. Twenty families were randomized into two groups, 10 receiving standard treatment and 10 receiving an additional parent-to-parent…

  17. Thermoluminescence of zircon: a kinetic model

    CERN Document Server

    Turkin, A A; Vainshtein, D I; Hartog, H W D

    2003-01-01

    The mineral zircon, ZrSiO₄, belongs to a class of promising materials for geochronometry by means of thermoluminescence (TL) dating. The development of a reliable and reproducible method for TL dating with zircon requires detailed knowledge of the processes taking place during exposure to ionizing radiation, long-term storage, annealing at moderate temperatures and heating at a constant rate (TL measurements). To understand these processes one needs a kinetic model of TL. This paper is devoted to the construction of such a model. The goal is to study the qualitative behaviour of the system and to determine the parameters and processes controlling TL phenomena of zircon. The model considers the following processes: (i) Filling of electron and hole traps at the excitation stage as a function of the dose rate and the dose for both (low dose rate) natural and (high dose rate) laboratory irradiation. (ii) Time dependence of TL fading in samples irradiated under laboratory conditions. (iii) Short time anneali...
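    As a minimal illustration of the "heating at a constant rate" stage such a kinetic model must describe, the sketch below evaluates a single-trap, first-order (Randall-Wilkins) glow curve. The trap depth, frequency factor and heating rate are generic illustrative values, not parameters of the zircon model.

```python
import numpy as np

# First-order (Randall-Wilkins) glow curve for one electron trap during a linear
# temperature ramp. All parameter values are illustrative, not fitted for zircon.
k_B = 8.617e-5          # Boltzmann constant (eV/K)
E = 1.2                 # trap depth (eV), assumed
s = 1e12                # frequency factor (1/s), assumed
beta = 2.0              # heating rate (K/s), assumed
n0 = 1.0                # initial trapped-electron population (arbitrary units)

T = np.linspace(300.0, 700.0, 2000)                   # temperature ramp (K)
p = s * np.exp(-E / (k_B * T))                        # escape probability per second
# Remaining population: n(T) = n0 * exp(-(1/beta) * integral of p dT)
n = n0 * np.exp(-np.cumsum(p) * (T[1] - T[0]) / beta)
intensity = n * p / beta                              # TL intensity per unit temperature

print("glow peak at T =", round(float(T[np.argmax(intensity)]), 1), "K")
```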

  18. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques.

    Directory of Open Access Journals (Sweden)

    Emi Kamimura

    Full Text Available The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by the two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, evaluated as the average discrepancy of corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. This was confirmed by statistical analysis, which revealed significantly smaller average inter-operator discrepancies with the digital impression technique (0.014 ± 0.02 mm) than with the conventional impression technique (0.023 ± 0.01 mm). The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator.

  19. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques

    Science.gov (United States)

    Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi

    2017-01-01

    Purpose The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Materials and methods Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility as evaluated by average discrepancies of corresponding 3D data was compared between the two techniques (Wilcoxon signed-rank test). Results The visual inspection of superimposed datasets revealed that discrepancies between repeated digital impression were smaller than observed with silicone impression. Confirmation was forthcoming from statistical analysis revealing significantly smaller average inter-operator reproducibility using a digital impression technique (0.014± 0.02 mm) than when using a conventional impression technique (0.023 ± 0.01 mm). Conclusion The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
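    Both records above superimpose the two operators' STL datasets with a least-squares best-fit algorithm and report the average discrepancy. The sketch below illustrates the underlying idea on synthetic point sets with known correspondence: a rigid Kabsch (least-squares) fit followed by the mean residual distance. The commercial software's actual algorithm, which also has to establish correspondence between meshes, is not reproduced here.

```python
import numpy as np

def best_fit_discrepancy(p, q):
    """Rigidly align point set q to p by a least-squares (Kabsch) fit and return
    the mean residual point-to-point distance. p, q: (n, 3) arrays of corresponding
    points (an assumption of this sketch; real scan meshes need correspondence search)."""
    p_c, q_c = p - p.mean(axis=0), q - q.mean(axis=0)
    u, _, vt = np.linalg.svd(q_c.T @ p_c)
    d = np.sign(np.linalg.det(u @ vt))
    r = (u @ np.diag([1.0, 1.0, d]) @ vt).T          # proper rotation (no reflection)
    q_aligned = q_c @ r.T + p.mean(axis=0)
    return np.linalg.norm(p - q_aligned, axis=1).mean()

rng = np.random.default_rng(1)
scan_a = rng.random((500, 3))                         # operator A's digitised surface (synthetic)
angle = np.deg2rad(3.0)                               # operator B: rotated, shifted, noisier copy
rot = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                [np.sin(angle),  np.cos(angle), 0.0],
                [0.0, 0.0, 1.0]])
scan_b = scan_a @ rot.T + 0.02 + 0.001 * rng.standard_normal((500, 3))

print("mean discrepancy:", round(best_fit_discrepancy(scan_a, scan_b), 4))
```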

  20. Validity and reproducibility of a Spanish dietary history.

    Directory of Open Access Journals (Sweden)

    Pilar Guallar-Castillón

    Full Text Available To assess the validity and reproducibility of food and nutrient intake estimated with the electronic diet history of ENRICA (DH-E), which collects information on numerous aspects of the Spanish diet. The validity of food and nutrient intake was estimated using Pearson correlation coefficients between the DH-E and the mean of seven 24-hour recalls collected every 2 months over the previous year. The reproducibility was estimated using intraclass correlation coefficients between two DH-E made one year apart. The correlation coefficients between the DH-E and the mean of seven 24-hour recalls for the main food groups were cereals (r = 0.66), meat (r = 0.66), fish (r = 0.42), vegetables (r = 0.62) and fruits (r = 0.44). The mean correlation coefficient for all 15 food groups considered was 0.53. The correlations for macronutrients were: energy (r = 0.76), proteins (r = 0.58), lipids (r = 0.73), saturated fat (r = 0.73), monounsaturated fat (r = 0.59), polyunsaturated fat (r = 0.57), and carbohydrates (r = 0.66). The mean correlation coefficient for all 41 nutrients studied was 0.55. The intraclass correlation coefficient between the two DH-E was greater than 0.40 for most foods and nutrients. The DH-E shows good validity and reproducibility for estimating usual intake of foods and nutrients.

  1. The repeatability and reproducibility of the BioRID IIg in a repeatable laboratory seat based on a production car seat.

    Science.gov (United States)

    Hynd, David; Depinet, Paul; Lorenz, Bernd

    2013-01-01

    The United Nations Economic Commission for Europe Informal Group on GTR No. 7 Phase 2 are working to define a build level for the BioRID II rear impact (whiplash) crash test dummy that ensures repeatable and reproducible performance in a test procedure that has been proposed for future legislation. This includes the specification of dummy hardware, as well as the development of comprehensive certification procedures for the dummy. This study evaluated whether the dummy build level and certification procedures deliver the desired level of repeatability and reproducibility. A custom-designed laboratory seat was made using the seat base, back, and head restraint from a production car seat to ensure a representative interface with the dummy. The seat back was reinforced for use in multiple tests and the recliner mechanism was replaced by an external spring-damper mechanism. A total of 65 tests were performed with 6 BioRID IIg dummies using the draft GTR No. 7 sled pulse and seating procedure. All dummies were subject to the build, maintenance, and certification procedures defined by the Informal Group. The test condition was highly repeatable, with a very repeatable pulse, a well-controlled seat back response, and minimal observed degradation of seat foams. The results showed qualitatively reasonable repeatability and reproducibility for the upper torso and head accelerations, as well as for T1 Fx and upper neck Fx. However, reproducibility was not acceptable for T1 and upper neck Fz or for T1 and upper neck My. The Informal Group has not selected injury or seat assessment criteria for use with BioRID II, so it is not known whether these channels would be used in the regulation. However, the ramping-up behavior of the dummy showed poor reproducibility, which would be expected to affect the reproducibility of dummy measurements in general. Pelvis and spine characteristics were found to significantly influence the dummy measurements for which poor reproducibility was

  2. Reproducibility of the serum lipid response to coffee oil in healthy volunteers

    Directory of Open Access Journals (Sweden)

    Katan Martijn B

    2003-10-01

    Full Text Available Abstract Background Humans and animals show a certain consistency in the response of their serum lipids to fat-modified diets. This may indicate a genetic basis underlying this response. Coffee oil might be used as a model substance to investigate which genes determine differences in the serum lipid response. Before carrying out such studies our objective was to investigate to what extent the effect of coffee oil on serum lipid concentrations is reproducible within subjects. Methods The serum lipid response of 32 healthy volunteers was measured twice in separate five-week periods in which coffee oil was administered (69 mg cafestol/day). Results Total cholesterol levels increased by 24% in period 1 (range: 0 to 52%) and 18% in period 2 (1 to 48%), LDL cholesterol by 29% (-9 to 71%) and 20% (-12 to 57%), triglycerides by 66% (16 to 175%) and 58% (-13 to 202%), and HDL cholesterol did not change significantly: the range of the HDL response was -19 to 25% in period 1 and -20 to 33% in period 2. The correlation between the two responses was 0.20 (95% CI -0.16, 0.51) for total cholesterol, 0.16 (95% CI -0.20, 0.48) for LDL, 0.67 (95% CI 0.42, 0.83) for HDL, and 0.77 (95% CI 0.56, 0.88) for triglycerides. Conclusions The responses of total and LDL cholesterol to coffee oil were poorly reproducible within subjects. The responses of HDL and triglycerides, however, appeared to be highly reproducible. Therefore, investigating the genetic sources of the variation in the serum-lipid response to coffee oil is more promising for HDL and triglycerides.

  3. Reproducibility and comparative validity of a food frequency questionnaire for Australian adults.

    Science.gov (United States)

    Collins, Clare E; Boggess, May M; Watson, Jane F; Guest, Maya; Duncanson, Kerith; Pezdirc, Kristine; Rollo, Megan; Hutchesson, Melinda J; Burrows, Tracy L

    2014-10-01

    Food frequency questionnaires (FFQ) are used in epidemiological studies to investigate the relationship between diet and disease. There is a need for a valid and reliable adult FFQ with a contemporary food list in Australia. To evaluate the reproducibility and comparative validity of the Australian Eating Survey (AES) FFQ in adults compared to weighed food records (WFRs). Two rounds of AES and three-day WFRs were conducted in 97 adults (31 males; median age and BMI for males 44.9 years and 26.2 kg/m², for females 41.3 years and 24.0 kg/m²). Reproducibility was assessed over six months using Wilcoxon signed-rank tests and comparative validity was assessed by intraclass correlation coefficients (ICC) estimated by fitting a mixed effects model for each nutrient to account for age, sex and BMI to allow estimation of between and within person variance. Reproducibility was found to be good for both WFR and FFQ since there were no significant differences between round 1 and 2 administrations. For comparative validity, FFQ ICCs were at least as large as those for WFR. The ICC of the WFR-FFQ difference for total energy intake was 0.6 (95% CI 0.43, 0.77) and the median ICC for all nutrients was 0.47, with all ICCs between 0.15 (%E from saturated fat) and 0.7 (g/day sugars). Compared to WFR the AES FFQ is suitable for reliably estimating the dietary intakes of Australian adults across a wide range of nutrients. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
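    The study estimates ICCs from a mixed-effects model adjusted for age, sex and BMI. The simplified sketch below computes a one-way random-effects ICC from two questionnaire administrations without any covariate adjustment, on synthetic energy-intake data; it only illustrates the variance-components idea behind the reported coefficients.

```python
import numpy as np

def icc_oneway(measurements):
    """One-way random-effects ICC(1,1) from an (n_subjects, k_repeats) array.
    Simplified sketch: the covariate adjustment used in the study is omitted."""
    n, k = measurements.shape
    grand = measurements.mean()
    subject_means = measurements.mean(axis=1)
    ms_between = k * np.sum((subject_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((measurements - subject_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Synthetic example: energy intake (kJ/day) reported by 97 adults on two FFQ rounds
rng = np.random.default_rng(2)
true_intake = rng.normal(9000, 1500, size=97)
round1 = true_intake + rng.normal(0, 800, size=97)
round2 = true_intake + rng.normal(0, 800, size=97)

print("ICC:", round(icc_oneway(np.column_stack([round1, round2])), 2))
```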

  4. Reproducible positioning in chest X-ray radiography

    International Nuclear Information System (INIS)

    1974-01-01

    A device is described that can be used to ensure reproducibility in the positioning of the patient during X-ray radiography of the thorax. Signals are taken from an electrocardiographic monitor and from a device recording the respiratory cycle. Radiography is performed only when two preselected signals coincide

  5. A Qualitative Application of the Belsky Model to Explore Early Care and Education Teachers' Mealtime History, Beliefs, and Interactions.

    Science.gov (United States)

    Swindle, Taren M; Patterson, Zachary; Boden, Carrie J

    Studies on factors associated with nutrition practices in early care and education settings often focus on sociodemographic and programmatic characteristics. This qualitative study adapted and applied Belsky's determinants of parenting model to inform a broader exploration of Early Care and Education Teachers (ECETs) practices. Qualitative cross-sectional study with ECETs. The researchers interviewed ECETs in their communities across a Southern state. Purposive sampling was employed to recruit ECETs (n = 28) from Head Start or state-funded centers serving low-income families. Developmental histories of ECETs regarding food and nutrition, beliefs about child nutrition, and teaching interactions related to food. Qualitative interviews were coded using a deductive content analysis approach. Three distinct interrelationships were observed across the themes. First, rules and routines regarding food and mealtime in the educators' childhood often aligned with educator beliefs and behaviors at meals in their classroom. Second, some ECETs described motivations to leave a healthy food legacy for children in their class. Finally, an experience of food insecurity appeared in narratives that also emphasized making sure children got enough through various strategies. The influence of ECET developmental histories and their related beliefs can be addressed through professional development and ongoing support. Future study should quantify model constructs in a larger sample and study their relationships over time. Copyright © 2017 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  6. A catalyzing phantom for reproducible dynamic conversion of hyperpolarized [1-¹³C]-pyruvate.

    Science.gov (United States)

    Walker, Christopher M; Lee, Jaehyuk; Ramirez, Marc S; Schellingerhout, Dawid; Millward, Steven; Bankson, James A

    2013-01-01

    In vivo real time spectroscopic imaging of hyperpolarized ¹³C labeled metabolites shows substantial promise for the assessment of physiological processes that were previously inaccessible. However, reliable and reproducible methods of measurement are necessary to maximize the effectiveness of imaging biomarkers that may one day guide personalized care for diseases such as cancer. Animal models of human disease serve as poor reference standards due to the complexity, heterogeneity, and transient nature of advancing disease. In this study, we describe the reproducible conversion of hyperpolarized [1-¹³C]-pyruvate to [1-¹³C]-lactate using a novel synthetic enzyme phantom system. The rate of reaction can be controlled and tuned to mimic normal or pathologic conditions of varying degree. Variations observed in the use of this phantom compare favorably against within-group variations observed in recent animal studies. This novel phantom system provides crucial capabilities as a reference standard for the optimization, comparison, and certification of quantitative imaging strategies for hyperpolarized tracers.

  7. A catalyzing phantom for reproducible dynamic conversion of hyperpolarized [1-¹³C]-pyruvate.

    Directory of Open Access Journals (Sweden)

    Christopher M Walker

    Full Text Available In vivo real time spectroscopic imaging of hyperpolarized ¹³C labeled metabolites shows substantial promise for the assessment of physiological processes that were previously inaccessible. However, reliable and reproducible methods of measurement are necessary to maximize the effectiveness of imaging biomarkers that may one day guide personalized care for diseases such as cancer. Animal models of human disease serve as poor reference standards due to the complexity, heterogeneity, and transient nature of advancing disease. In this study, we describe the reproducible conversion of hyperpolarized [1-¹³C]-pyruvate to [1-¹³C]-lactate using a novel synthetic enzyme phantom system. The rate of reaction can be controlled and tuned to mimic normal or pathologic conditions of varying degree. Variations observed in the use of this phantom compare favorably against within-group variations observed in recent animal studies. This novel phantom system provides crucial capabilities as a reference standard for the optimization, comparison, and certification of quantitative imaging strategies for hyperpolarized tracers.

  8. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    OpenAIRE

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2007-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often ...

  9. Reproducibility of Manual Platelet Estimation Following Automated Low Platelet Counts

    Directory of Open Access Journals (Sweden)

    Zainab S Al-Hosni

    2016-11-01

    Full Text Available Objectives: Manual platelet estimation is one of the methods used when automated platelet estimates are very low. However, the reproducibility of manual platelet estimation has not been adequately studied. We sought to assess the reproducibility of manual platelet estimation following automated low platelet counts and to evaluate the impact of the level of experience of the person counting on the reproducibility of manual platelet estimates. Methods: In this cross-sectional study, peripheral blood films of patients with platelet counts less than 100 × 10⁹/L were retrieved and given to four raters to perform manual platelet estimation independently using a predefined method (average of platelet counts in 10 fields using a 100× objective, multiplied by 20). Data were analyzed using the intraclass correlation coefficient (ICC) as a method of reproducibility assessment. Results: The ICC across the four raters was 0.840, indicating excellent agreement. The median difference of the two most experienced raters was 0 (range: -64 to 78). The level of platelet estimate by the least-experienced rater predicted the disagreement (p = 0.037). When assessing the difference between pairs of raters, there was no significant difference in the ICC (p = 0.420). Conclusions: The agreement between different raters using manual platelet estimation was excellent. Further confirmation is necessary, with a prospective study using a gold standard method of platelet counts.

  10. Reproducibility in the assessment of acute pancreatitis with computed tomography

    International Nuclear Information System (INIS)

    Freire Filho, Edison de Oliveira; Vieira, Renata La Rocca; Yamada, Andre Fukunishi; Shigueoka, David Carlos; Bekhor, Daniel; Freire, Maxime Figueiredo de Oliveira; Ajzen, Sergio; D'Ippolito, Giuseppe

    2007-01-01

    Objective: To evaluate the reproducibility of unenhanced and contrast-enhanced computed tomography in the assessment of patients with acute pancreatitis. Materials and methods: Fifty-one unenhanced and contrast-enhanced abdominal computed tomography studies of patients with acute pancreatitis were blindly reviewed by two radiologists (observers 1 and 2). The morphological index was separately calculated for unenhanced and contrast-enhanced computed tomography and the disease severity index was established. Intraobserver and interobserver reproducibility of computed tomography was measured by means of the kappa index (κ). Results: Interobserver agreement was κ = 0.666, 0.705, 0.648, 0.547 and 0.631, respectively, for unenhanced and contrast-enhanced morphological index, presence of pancreatic necrosis, pancreatic necrosis extension, and disease severity index. Intraobserver agreement (observers 1 and 2, respectively) was κ = 0.796 and 0.732 for unenhanced morphological index; κ = 0.725 and 0.802 for contrast-enhanced morphological index; κ = 0.674 and 0.849 for presence of pancreatic necrosis; κ = 0.606 and 0.770 for pancreatic necrosis extension; and κ = 0.801 and 0.687 for disease severity index at computed tomography. Conclusion: Computed tomography for determination of the morphological index and disease severity index in the staging of acute pancreatitis is a quite reproducible method. The absence of contrast enhancement does not affect the reproducibility of the computed tomography morphological index. (author)

  11. Qualitative research.

    Science.gov (United States)

    Gelling, Leslie

    2015-03-25

    Qualitative research has an important role in helping nurses and other healthcare professionals understand patient experiences of health and illness. Qualitative researchers have a large number of methodological options and therefore should take care in planning and conducting their research. This article offers a brief overview of some of the key issues qualitative researchers should consider.

  12. Handling Imprecision in Qualitative Data Warehouse: Urban Building Sites Annoyance Analysis Use Case

    Science.gov (United States)

    Amanzougarene, F.; Chachoua, M.; Zeitouni, K.

    2013-05-01

    Data warehouse means a decision support database allowing the integration, organization, historisation, and management of data from heterogeneous sources, with the aim of exploiting them for decision-making. Data warehouses are essentially based on the multidimensional model. This model organizes data into facts (subjects of analysis) and dimensions (axes of analysis). In classical data warehouses, facts are composed of numerical measures and the dimensions that characterize them. Dimensions are organized into hierarchical levels of detail. Based on the navigation and aggregation mechanisms offered by OLAP (On-Line Analytical Processing) tools, facts can be analyzed according to the desired level of detail. In real-world applications, facts are not always numerical, and can be of a qualitative nature. In addition, sometimes a human expert or a learned model such as a decision tree provides a qualitative evaluation of a phenomenon based on its different parameters, i.e. dimensions. Conventional data warehouses are thus not adapted to qualitative reasoning and lack the ability to deal with qualitative data. In previous work, we proposed an original approach to qualitative data warehouse modeling, which permits integrating qualitative measures. Based on the computing-with-words methodology, we extended the classical multidimensional data model to allow the aggregation and analysis of qualitative data in an OLAP environment. We implemented this model in a Spatial Decision Support System to help managers of public spaces reduce annoyances and improve the quality of life of citizens. In this paper, we focus our study on the representation and management of imprecision in the annoyance analysis process. The main objective of this process is to determine the least harmful scenario of urban building sites, particularly in dense urban environments.
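    As a toy illustration of aggregating a qualitative measure along a dimension hierarchy, the sketch below rolls an ordinal annoyance label up from site level to district level using the median rank of the labels rather than a numerical sum. Column names, the label scale and the data are invented; the authors' computing-with-words aggregation is more elaborate than this.

```python
import pandas as pd

# Toy fact table with a qualitative (ordinal) annoyance measure.
# The label scale and all column names/values are invented for illustration.
scale = ["none", "low", "moderate", "high", "very high"]
facts = pd.DataFrame({
    "district":  ["north", "north", "north", "south", "south", "south"],
    "site":      ["A", "A", "B", "C", "C", "D"],
    "annoyance": ["low", "moderate", "high", "none", "low", "very high"],
})
facts["annoyance"] = pd.Categorical(facts["annoyance"], categories=scale, ordered=True)

def ordinal_median(series):
    # Aggregate ordinal labels by their median rank instead of a numerical mean.
    ranks = series.cat.codes.sort_values()
    return scale[int(ranks.iloc[len(ranks) // 2])]

# Roll-up from site level to district level
print(facts.groupby("district")["annoyance"].apply(ordinal_median))
```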

  13. "Personified as Paragon of Suffering...... Optimistic Being of Achieving Normalcy:" A Conceptual Model Derived from Qualitative Research

    Science.gov (United States)

    Nayak, Shalini G; Pai, Mamatha Shivananda; George, Linu Sara

    2018-01-01

    Background: Conceptual models developed through qualitative research are based on the unique experiences of suffering and the individual adaptations of each participant. A wide array of problems are faced by head-and-neck cancer (HNC) patients due to disease pathology and treatment modalities, which are sufficient to influence the quality of life (QOL). Men possess greater self-acceptance and are better equipped with intrapersonal strength to cope with stress and adequacy compared to women. Methodology: A qualitative phenomenology study was conducted among seven women suffering from HNC, with the objective to understand their experiences of suffering and to describe the phenomenon. Data were collected by face-to-face, in-depth, open-ended interviews. Data were analyzed using Open Code software (OPC 4.0) by following the steps of the Colaizzi process. Results: The phenomenon that emerged out of the lived experiences of HNC women was "Personified as paragon of suffering... optimistic being of achieving normalcy," with five major themes and 13 subthemes. Conclusion: The conceptual model developed with the phenomenological approach is very specific to women suffering from HNC, and will contribute to developing strategies to improve the QOL of these women. PMID:29440812

  14. An efficient and reproducible process for transmission electron microscopy (TEM) of rare cell populations

    Science.gov (United States)

    Kumar, Sachin; Ciraolo, Georgianne; Hinge, Ashwini; Filippi, Marie-Dominique

    2014-01-01

    Transmission electron microscopy (TEM) provides ultra-structural details of cells at the sub-organelle level. However, details of the cellular ultrastructure, and of the cellular organization and content of various organelles, in rare populations, particularly those in suspension such as hematopoietic stem cells (HSCs), have remained elusive. This is mainly due to the requirement of millions of cells for TEM studies. Thus, there is a vital need for a method that allows TEM studies with low cell numbers of such rare populations. We describe an alternative and novel approach for TEM studies of rare cell populations. Here we performed a TEM study from 10,000 HSCs with relative ease. In particular, tiny cell pellets were identified by Evans blue staining after PFA-GA fixation. The cell pellet was pre-embedded in agarose in a small microcentrifuge tube and processed for dehydration, infiltration and embedding. Semi-thin and ultra-thin sections identified clusters of numerous cells per section with well-preserved morphology and ultrastructural details of the golgi complex and mitochondria. Together, this method provides an efficient, easy and reproducible process to perform qualitative and quantitative TEM analysis from limited biological samples, including cells in suspension. PMID:24291346

  15. Qualitative cosmology

    International Nuclear Information System (INIS)

    Khalatnikov, I.M.; Belinskij, V.A.

    1984-01-01

    Application of the qualitative theory of dynamic systems to the analysis of homogeneous cosmological models is described. Together with the well-known cases requiring an ideal liquid, the properties of the cosmological evolution of matter with dissipative processes due to viscosity are considered. New cosmological effects occur when the viscosity terms are of the same order as the remaining terms in the equations of gravitation, or even exceed them. In these cases the description of the dissipative process by means of only two viscosity coefficients (volume and shift) may become inapplicable, because all the remaining terms in the decomposition of the dissipative addition to the energy-momentum tensor in velocity gradients can be large; the application of equations with hydrodynamic viscosity should then be considered as a model of dissipative effects in cosmology.

  16. Inter-examiner reproducibility of tests for lumbar motor control

    Directory of Open Access Journals (Sweden)

    Elkjaer Arne

    2011-05-01

    Full Text Available Abstract Background Many studies show a relation between reduced lumbar motor control (LMC) and low back pain (LBP). However, test circumstances vary and during test performance, subjects may change position. In other words, the reliability - i.e. reproducibility and validity - of tests for LMC should be based on quantitative data. This has not been considered before. The aim was to analyse the reproducibility of five different quantitative tests for LMC commonly used in daily clinical practice. Methods The five tests for LMC were: repositioning (RPS), sitting forward lean (SFL), sitting knee extension (SKE), and bent knee fall out (BKFO), all measured in cm, and leg lowering (LL), measured in mm Hg. A total of 40 subjects (14 males, 26 females), 25 with and 15 without LBP, with a mean age of 46.5 years (SD 14.8), were examined independently and in random order by two examiners on the same day. LBP subjects were recruited from three physiotherapy clinics with a connection to the clinic's gym or back-school. Non-LBP subjects were recruited from the clinic's staff acquaintances, and from patients without LBP. Results The means and standard deviations for each of the tests were 0.36 (0.27) cm for RPS, 1.01 (0.62) cm for SFL, 0.40 (0.29) cm for SKE, 1.07 (0.52) cm for BKFO, and 32.9 (7.1) mm Hg for LL. All five tests for LMC had reproducibility with the following ICCs: 0.90 for RPS, 0.96 for SFL, 0.96 for SKE, 0.94 for BKFO, and 0.98 for LL. Bland and Altman plots showed that most of the differences between examiners A and B were less than 0.20 cm. Conclusion These five tests for LMC displayed excellent reproducibility. However, the diagnostic accuracy of these tests needs to be addressed in larger cohorts of subjects, establishing values for the normal population. Also cut-points between subjects with and without LBP must be determined, taking into account age, level of activity, degree of impairment and participation in sports. Whether reproducibility of these

  17. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    Science.gov (United States)

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Measurement of Trabecular Bone Parameters in Porcine Vertebral Bodies Using Multidetector CT: Evaluation of Reproducibility of 3-Dimensional CT Histomorphometry

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Sung Hwan; Goo, Jin Mo [Dept. of Radiology, Seoul National University Hospital, Seoul National University College of Medicine, Seoul (Korea, Republic of)]; Moon, Kyung Chul [Dept. of Pathology, Seoul National University Hospital, Seoul National University College of Medicine, Seoul (Korea, Republic of)]; An, Sang Bu [Dept. of Radiology, National Cancer Center, Goyang (Korea, Republic of)]; Kim, Kwang Gi [Dept. of Biomedical Engineering, Division of Basic and Applied Sciences, National Cancer Center, Goyang (Korea, Republic of)]

    2011-05-15

    To evaluate the reproducibility of 3-dimensional histomorphometry for the microarchitecture analysis of trabecular bone parameters using multidetector computed tomography (MDCT). Thirty-six specimens from porcine vertebral bodies were imaged five times with a 64-detector row MDCT system using the same scan protocols. Locations of the specimens were nearly identical through the scans. Three-dimensional structural parameters of trabecular bone were derived from the five data sets using image analyzing software. The features measured by the analysis programs were trabecular bone volume, trabecular bone volume/tissue volume, trabecular thickness, trabecular separation, trabecular number, trabecular bone pattern factor, and structural model index. The structural trabecular parameters showed excellent reproducibility through repeated scanning. Intraclass correlation coefficients of all seven structural parameters were in the range of 0.998 to 1.000. Coefficients of variation of the six structural parameters, excluding structural model index, were not over 1.6%. The measurement of trabecular structural parameters using multidetector CT and a three-dimensional histomorphometry analysis program was validated and showed excellent reproducibility. This method could be used as a noninvasive and easily available test in a clinical setting.

  19. Information Risk Management: Qualitative or Quantitative? Cross industry lessons from medical and financial fields

    OpenAIRE

    Upasna Saluja; Norbik Bashah Idris

    2012-01-01

    Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk under check. As a norm most organizations end up choosing the more flexible, easier to deploy and customize qualitative models of risk assessment. In practice one sees that such models often call upon the practitioners to make qualitative judgments on a relative rating scale which brings in consid...

  20. Qualitative research, tourism

    DEFF Research Database (Denmark)

    Ren, Carina Bregnholm

    2016-01-01

    Qualitative research refers to research applying a range of qualitative methods in order to inductively explore, interpret, and understand a given field or object under study. Qualitative research in tourism takes its inspiration primarily from the cultural and social... However, the understanding of qualitative research as unable (or rather unwilling) to deliver the types of outcome which "explain and predict" tourism has impacted upon its ability to gain general acceptance. Only slowly has tourism research made room for the changes in social and cultural sciences, which since the 1960s... of qualitative research has meant a need to question and redefine criteria and research standards otherwise used in tourism research, as the qualitative approach does not (seek to) conform to ideals such as truth, objectivity, and validity retrieved in the positivist sciences. In order to develop new ways by which...

  1. The general theory of the Quasi-reproducible experiments: How to describe the measured data of complex systems?

    Science.gov (United States)

    Nigmatullin, Raoul R.; Maione, Guido; Lino, Paolo; Saponaro, Fabrizio; Zhang, Wei

    2017-01-01

    In this paper, we suggest a general theory that enables one to describe experiments associated with reproducible or quasi-reproducible data reflecting the dynamical and self-similar properties of a wide class of complex systems. By a complex system we understand a system for which a model based on microscopic principles and suppositions about the nature of the matter is absent. Such a microscopic model is usually determined as "the best fit" model. The behavior of the complex system relative to a control variable (time, frequency, wavelength, etc.) can be described in terms of the so-called intermediate model (IM). One can prove that the fitting parameters of the IM are associated with the amplitude-frequency response of the segment of the Prony series. The segment of the Prony series, comprising the set of decomposition coefficients and the set of exponential functions (with k = 1,2,…,K), is limited by the final mode K. The exponential functions of this decomposition depend on time and are found by the original algorithm described in the paper. This approach serves as a logical continuation of the results obtained earlier in the paper [Nigmatullin RR, W. Zhang and Striccoli D. General theory of experiment containing reproducible data: The reduction to an ideal experiment. Commun Nonlinear Sci Numer Simul, 27, (2015), pp 175-192] for reproducible experiments and includes the previous results as a partial case. In this paper, we consider a more complex case, when the available data form short samplings or exhibit some instability during the process of measurement. We give some justified evidence and conditions proving the validity of this theory for the description of a wide class of complex systems in terms of the reduced set of fitting parameters belonging to the segment of the Prony series. The elimination of uncontrollable factors, expressed in the form of the apparatus function, is discussed. To illustrate how to apply the theory and take advantage of its
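    To make the Prony-segment idea concrete, the sketch below fits a short sum of exponentials to noisy synthetic data by linear least squares, with the exponents fixed on a coarse trial grid. The paper's own algorithm determines the exponents from the data itself; that step is deliberately not reproduced here.

```python
import numpy as np

# Fit a Prony-type segment f(t) ~= sum_k a_k * exp(lambda_k * t) to measured data.
# Only the amplitudes a_k are found by linear least squares; the trial exponents
# lambda_k are assumed, unlike in the paper's own algorithm.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 5.0, 400)
data = 2.0 * np.exp(-0.7 * t) + 0.5 * np.exp(-3.0 * t) + 0.02 * rng.standard_normal(t.size)

lambdas = np.array([-0.5, -1.0, -2.0, -4.0])          # assumed trial exponents (K = 4)
basis = np.exp(np.outer(t, lambdas))                  # (n_samples, K) design matrix
amplitudes, *_ = np.linalg.lstsq(basis, data, rcond=None)

fit = basis @ amplitudes
print("amplitudes:", np.round(amplitudes, 3))
print("rms residual:", round(float(np.sqrt(np.mean((data - fit) ** 2))), 4))
```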

  2. Reproducibility of prompts in computer-aided detection (CAD) of breast cancer

    International Nuclear Information System (INIS)

    Taylor, C.G.; Champness, J.; Reddy, M.; Taylor, P.; Potts, H.W.W.; Given-Wilson, R.

    2003-01-01

    AIM: We evaluated the reproducibility of prompts using the R2 ImageChecker M2000 computer-aided detection (CAD) system. MATERIALS AND METHODS: Forty selected two-view mammograms of women with breast cancer were digitized and analysed using the ImageChecker on 10 separate occasions. The mammograms were chosen to provide both straightforward and subtle signs of malignancy. Data analysed included mammographic abnormality, pathology, and whether the cancer was prompted or given an emphasized prompt. RESULTS: Correct prompts were generated in 86 out of 100 occasions for screen-detected cancers. Reproducibility was less in the other categories of more subtle cancers: 21% for cancers previously missed by CAD, a group that contained more grade 1 and small (<10 mm) tumours. Prompts for calcifications were more reproducible than those for masses (76% versus 53%) and these cancers were more likely to have an emphasized prompt. CONCLUSIONS: Probably the most important cause of variability of prompts is shifts in film position between sequential digitizations. Consequently subtle lesions that are only just above the threshold for display may not be prompted on repeat scanning. However, users of CAD should be aware that even emphasized prompts are not consistently reproducible

  3. Reproducing {sup 137}Cs vertical migration in Spanish soils - Reproducing {sup 137}Cs and {sup 90}Sr vertical migration in Spanish mainland

    Energy Technology Data Exchange (ETDEWEB)

    Olondo, C.; Legarda, F.; Herranz, M.; Idoeta, R. [The University of the Basque Country - UPV/EHU, Nuclear Engineering and Fluid Mechanics Dept. Faculty of Engineering, Alda. Urquijo 48013, Bilbao (Spain)

    2014-07-01

    As a result of a study of the migration of caesium and strontium activity in Spanish mainland soils, a convective-diffusive migration equation has been obtained that adequately reproduces the movement that an activity deposit would follow in this territory. Taking into account the dependence of the apparent convection velocity on rainfall, a new migration parameter has been defined that depends only on the soil's properties. By fitting the migration equation to experimental activity profiles with a least-squares method, the values taken by the migration parameters in the studied soils, characteristic of that area, have been obtained. Mean values of these parameters were then obtained for each of the groups that, depending on soil texture, had been defined in the study of the movement of both radionuclides in soils, and to which these soils belong. Using these mean values and the obtained equation, the experimentally determined vertical activity profiles have been properly reproduced. In order to validate these values, a new sampling programme is being carried out in the north of Spain and, with the information from the new sampling points, it will be verified whether the obtained mean values also reproduce the vertical activity profiles of these new sampling points. (authors)
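    As an illustration of fitting a convective-diffusive migration equation to a depth profile, the sketch below uses a common simplified (infinite-medium Gaussian) solution for an instantaneous surface deposit and recovers the apparent convection velocity and diffusion coefficient by least squares from synthetic data. The functional form, the time since deposition and all parameter values are assumptions, not the values derived in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def activity_profile(z, a0, v, d, t=20.0):
    """Simplified solution of the 1-D convective-diffusive migration equation for an
    instantaneous surface deposit (infinite-medium Gaussian form):
    A(z, t) = a0 / sqrt(pi * d * t) * exp(-(z - v*t)**2 / (4*d*t)).
    z in cm, v in cm/yr, d in cm^2/yr, t in years since deposition (assumed)."""
    return a0 / np.sqrt(np.pi * d * t) * np.exp(-(z - v * t) ** 2 / (4.0 * d * t))

# Synthetic depth profile standing in for a measured 137Cs profile
rng = np.random.default_rng(4)
depth = np.arange(0.0, 30.0, 2.0)                       # section mid-depths (cm)
truth = activity_profile(depth, a0=50.0, v=0.3, d=1.0)  # assumed "true" parameters
observed = truth * (1.0 + 0.05 * rng.standard_normal(depth.size))

params, _ = curve_fit(activity_profile, depth, observed,
                      p0=[40.0, 0.2, 0.5], bounds=(0.0, np.inf))
print("fitted a0, v, d:", np.round(params, 3))
```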

  4. Numerical modeling of hypolimnetic oxygenation by electrolysis of water

    Directory of Open Access Journals (Sweden)

    Jaćimović Nenad M.

    2017-01-01

    Full Text Available The paper presents a novel method for hypolimnetic oxygenation by electrolysis of water. The performance of the method is investigated through a laboratory and a field experiment. The laboratory experiment was conducted in a 90 L vessel, while the field experiment was conducted at Lake Biwa in Japan. In order to provide better insight into the processes involved, a numerical model for the simulation of bubble flow is developed, taking into account gas compressibility and oxygen dissolution. The model simultaneously solves 3-D volume-averaged two-fluid governing equations. The developed model is first verified by simulation of bubble flow experiments reported in the literature, where good qualitative agreement between measured and simulated results is observed. In the second part, the model is applied to the simulation of the conducted water electrolysis experiments. The model reproduced the observed oxygen concentration dynamics reasonably well. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. 37009]

  5. A simulation-based analytic model of radio galaxies

    Science.gov (United States)

    Hardcastle, M. J.

    2018-04-01

    I derive and discuss a simple semi-analytical model of the evolution of powerful radio galaxies which is not based on assumptions of self-similar growth, but rather implements some insights about the dynamics and energetics of these systems derived from numerical simulations, and can be applied to arbitrary pressure/density profiles of the host environment. The model can qualitatively and quantitatively reproduce the source dynamics and synchrotron light curves derived from numerical modelling. Approximate corrections for radiative and adiabatic losses allow it to predict the evolution of radio spectral index and of inverse-Compton emission both for active and `remnant' sources after the jet has turned off. Code to implement the model is publicly available. Using a standard model with a light relativistic (electron-positron) jet, subequipartition magnetic fields, and a range of realistic group/cluster environments, I simulate populations of sources and show that the model can reproduce the range of properties of powerful radio sources as well as observed trends in the relationship between jet power and radio luminosity, and predicts their dependence on redshift and environment. I show that the distribution of source lifetimes has a significant effect on both the source length distribution and the fraction of remnant sources expected in observations, and so can in principle be constrained by observations. The remnant fraction is expected to be low even at low redshift and low observing frequency due to the rapid luminosity evolution of remnants, and to tend rapidly to zero at high redshift due to inverse-Compton losses.

  6. Qualitative Computing and Qualitative Research: Addressing the Challenges of Technology and Globalization

    Directory of Open Access Journals (Sweden)

    César A. Cisneros Puebla

    2012-05-01

    Full Text Available Qualitative computing has been part of our lives for thirty years. Today, we urgently call for an evaluation of its international impact on qualitative research. Evaluating the international impact of qualitative research and qualitative computing requires a consideration of the vast amount of qualitative research over the last decades, as well as thoughtfulness about the uneven and unequal way in which qualitative research and qualitative computing are present in different fields of study and geographical regions. To understand the international impact of qualitative computing requires evaluation of the digital divide and the huge differences between center and peripheries. The international impact of qualitative research, and, in particular qualitative computing, is the question at the heart of this array of selected papers from the "Qualitative Computing: Diverse Worlds and Research Practices Conference." In this article, we introduce the reader to the goals, motivation, and atmosphere at the conference, taking place in Istanbul, Turkey, in 2011. The dialogue generated there is still in the air, and this introduction is a call to spread that voice. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1202285

  7. Novel burn device for rapid, reproducible burn wound generation.

    Science.gov (United States)

    Kim, J Y; Dunham, D M; Supp, D M; Sen, C K; Powell, H M

    2016-03-01

    Scarring following full thickness burns leads to significant reductions in range of motion and quality of life for burn patients. To effectively study scar development and the efficacy of anti-scarring treatments in a large animal model (female red Duroc pigs), reproducible, uniform, full-thickness, burn wounds are needed to reduce variability in observed results that occur with burn depth. Prior studies have proposed that initial temperature of the burner, contact time with skin, thermal capacity of burner material, and the amount of pressure applied to the skin need to be strictly controlled to ensure reproducibility. The purpose of this study was to develop a new burner that enables temperature and pressure to be digitally controlled and monitored in real-time throughout burn wound creation and compare it to a standard burn device. A custom burn device was manufactured with an electrically heated burn stylus and a temperature control feedback loop via an electronic microstat. Pressure monitoring was controlled by incorporation of a digital scale into the device, which measured downward force. The standard device was comprised of a heat resistant handle with a long rod connected to the burn stylus, which was heated using a hot plate. To quantify skin surface temperature and internal stylus temperature as a function of contact time, the burners were heated to the target temperature (200±5°C) and pressed into the skin for 40s to create the thermal injuries. Time to reach target temperature and elapsed time between burns were recorded. In addition, each unit was evaluated for reproducibility within and across three independent users by generating burn wounds at contact times spanning from 5 to 40s at a constant pressure and at pressures of 1 or 3lbs with a constant contact time of 40s. Biopsies were collected for histological analysis and burn depth quantification using digital image analysis (ImageJ). The custom burn device maintained both its internal
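    The custom device holds the stylus at the target temperature with a feedback loop driven by an electronic microstat. A minimal sketch of the kind of on/off (hysteresis) control loop such a microstat could implement is shown below; the setpoint, hysteresis band and thermal constants are illustrative, not the device's specifications.

```python
# Minimal sketch of an on/off (hysteresis) temperature control loop for a heated
# burn stylus. Setpoint, band and thermal constants are assumed, not the device's.
def thermostat_step(temperature_c, heater_on, setpoint_c=200.0, band_c=5.0):
    """Return the new heater state given the current stylus temperature."""
    if temperature_c < setpoint_c - band_c:
        return True
    if temperature_c > setpoint_c + band_c:
        return False
    return heater_on                      # inside the band: keep the previous state

# Crude first-order thermal simulation, 0.1 s per step
temperature, heater = 25.0, True
for step in range(600):                   # 600 x 0.1 s = 60 s
    drive = 400.0 if heater else 25.0     # heating-element target vs ambient (assumed)
    temperature += 0.02 * (drive - temperature)   # first-order lag, dt folded into the gain
    heater = thermostat_step(temperature, heater)

print(f"stylus temperature after 60 s: {temperature:.1f} C, heater on: {heater}")
```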

  8. Reproducibility of a 3-dimensional gyroscope in measuring shoulder anteflexion and abduction

    Directory of Open Access Journals (Sweden)

    Penning Ludo I F

    2012-07-01

    Full Text Available Abstract Background Few studies have investigated the use of a 3-dimensional gyroscope for measuring the range of motion (ROM) in the impaired shoulder. Reproducibility of digital inclinometer and visual estimation is poor. This study aims to investigate the reproducibility of a tri-axial gyroscope in measurement of anteflexion, abduction and related rotations in the impaired shoulder. Methods Fifty-eight patients with either subacromial impingement (27) or osteoarthritis of the shoulder (31) participated. Active anteflexion, abduction and related rotations were measured with a tri-axial gyroscope according to a test-retest protocol. Severity of shoulder impairment and patient-perceived pain were assessed by the Disability of Arm Shoulder and Hand score (DASH) and the Visual Analogue Scale (VAS). VAS scores were recorded before and after testing. Results In two of the three hospitals patients with osteoarthritis (n = 31) were measured, in the third hospital patients with subacromial impingement (n = 27). There were significant differences among hospitals for the VAS and DASH scores measured before and after testing. The mean differences between the test and retest means for anteflexion were −6 degrees (affected side) and 9 degrees (contralateral side), and for abduction 15 degrees (affected side) and 10 degrees (contralateral side). Bland & Altman plots showed that the confidence intervals for the mean differences fall within −6 up to 15 degrees; individual test-retest differences could exceed these limits. A simulation according to 'Generalizability Theory' produces very good coefficients for anteflexion and related rotation as a comprehensive measure of reproducibility. Optimal reproducibility is achieved with 2 repetitions for anteflexion. Conclusions Measurements were influenced by patient-perceived pain. Differences in VAS and DASH might be explained by different underlying pathology. These differences in shoulder pathology however did not alter
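    The abstract reports mean test-retest differences and Bland & Altman limits. A minimal sketch of computing the bias and 95% limits of agreement for a test-retest pair is shown below, on synthetic anteflexion angles; the numbers are invented and do not reproduce the study data.

```python
import numpy as np

def bland_altman(test, retest):
    """Mean difference (bias) and 95% limits of agreement for a test-retest pair."""
    diff = np.asarray(test) - np.asarray(retest)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Synthetic anteflexion angles (degrees) for 58 patients, invented for illustration
rng = np.random.default_rng(5)
true_angle = rng.normal(140, 15, size=58)
test = true_angle + rng.normal(0, 4, size=58)
retest = true_angle + rng.normal(0, 4, size=58)

bias, lower, upper = bland_altman(test, retest)
print(f"bias = {bias:.1f} deg, 95% limits of agreement = [{lower:.1f}, {upper:.1f}] deg")
```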

  9. Reproducibility of liver position using active breathing coordinator for liver cancer radiotherapy

    International Nuclear Information System (INIS)

    Eccles, Cynthia; Brock, Kristy K.; Bissonnette, Jean-Pierre; Hawkins, Maria; Dawson, Laura A.

    2006-01-01

    Purpose: To measure the intrabreath-hold liver motion and the intrafraction and interfraction reproducibility of liver position relative to vertebral bodies using an active breathing coordinator (ABC) in patients with unresectable liver cancer treated with hypofractionated stereotactic body radiation therapy (SBRT). Methods: Tolerability of ABC and organ motion during ABC was assessed using kV fluoroscopy in 34 patients. For patients treated with ABC, repeat breath-hold CT scans in the ABC breath-hold position were acquired at simulation to estimate the volumetric intrafraction reproducibility of the liver relative to the vertebral bodies. In addition, preceding each radiation therapy fraction, with the liver immobilized using ABC, repeat anteroposterior (AP) megavoltage verification images were obtained. Off-line alignments were completed to determine intrafraction reproducibility (from repeat images obtained before one treatment) and interfraction reproducibility (from comparisons of the final image for each fraction with the AP) of diaphragm position relative to vertebral bodies. For each image set, the vertebral bodies were aligned, and the resultant craniocaudal (CC) offset in diaphragm position was measured. Liver position during ABC was also evaluated from kV fluoroscopy acquired at the time of simulation, kV fluoroscopy at the time of treatment, and from MV beam's-eye view movie loops acquired during treatment. Results: Twenty-one of 34 patients were screened to be suitable for ABC. The average free breathing range of these patients was 13 mm (range, 5-1 mm). Fluoroscopy revealed that the average maximal diaphragm motion during ABC breath-hold was 1.4 mm (range, 0-3.4 mm). The MV treatment movie loops confirmed diaphragm stability during treatment. For a measure of intrafraction reproducibility, an analysis of 36 repeat ABC computed tomography (CT) scans in 14 patients was conducted. The average mean difference in the liver surface position was -0.9 mm, -0

  10. The Globalization of Qualitative Research: Challenging Anglo-American Domination and Local Hegemonic Discourse

    Directory of Open Access Journals (Sweden)

    Ping-Chun Hsiung

    2012-01-01

    Full Text Available Over the past decades, scholarly interest has led to publications on the practices and development of qualitative research (QR in countries outside of the Anglo-American core. Much of the writing is descriptive, providing an overview of the QR path and development in a particular country. Recently, qualitative researchers in the periphery have begun to articulate a collective professional identity in relation to the Anglo-American core by questioning both the dominance of the Anglo-American core and the current divide between QR in the core and the periphery. To date, insufficient effort has been made to develop this collective professional identity in order to overcome Anglo-American domination in the periphery and to indigenize QR. In this article, I propose a globally-informed, locally-situated analytical framework as a means of developing a globalized QR (GQR. I argue that qualitative scholars in the periphery must simultaneously confront Anglo-American domination and local hegemonic discourses. I discuss what scholars in the core and periphery can do to lead to a shift in the current division of labor that sees scholarship in the core producing theory and methods while those in the periphery consume and reproduce it. More attention needs to be paid to the indigenization of QR in the periphery. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1201216

  11. The reproducibility of random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    RAPD) profiles of Streptococcus thermophilus strains by using the polymerase chain reaction (PCR). Several factors can cause the amplification of false and non reproducible bands in the RAPD profiles. We tested three primers, OPI-02 MOD, ...

  12. Digital Differentiation in Young People’s Internet Use—Eliminating or Reproducing Disability Stereotypes

    Directory of Open Access Journals (Sweden)

    Sylvia Söderström

    2013-05-01

    Full Text Available Norwegian authorities’ policy aims at securing an information society for all, emphasizing the importance of accessible and usable Information and Communication Technology (ICT for everyone. While the body of research on young people’s use of ICT is quite comprehensive, research addressing digital differentiation in young people with disabilities’ use of ICT is still in its early days. This article investigates how young people with disabilities’ use, or non-use, of assistive ICT creates digital differentiations. The investigation elaborates on how the anticipations and stereotypes of disability establish an authoritative definition of assistive ICT, and the consequence this creates for the use of the Web by young people with disabilities. The object of the article is to provide enhanced insight into the field of technology and disability by illuminating how assistive ICT sometimes eliminates and sometimes reproduces stereotypes and digital differentiations. The investigation draws on a qualitative interview study with 23 young Norwegians with disabilities, aged 15–20 years. I draw on a theoretical perspective to analyze the findings of the study, which employs the concept of identity multiplicity. The article’s closing discussion expands on technology’s significance in young people’s negotiations of impairment and of perceptions of disability

  13. Timbral aspects of reproduced sound in small rooms. II

    DEFF Research Database (Denmark)

    Bech, Søren

    1996-01-01

    A single loudspeaker with frequency-dependent directivity characteristics, positioned in a room of normal size with frequency-dependent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, seventeen individual reflections and the reverberant field. The threshold of detection, and just-noticeable differences for an increase in level, were measured for individual reflections. The results have confirmed that the first-order floor reflection is likely to contribute individually to the timbre of reproduced noise. However, for a speech signal none of the investigated reflections will contribute individually to the timbre. It is suggested that the threshold of detection is determined by the spectral changes in the dominant frequency range of 500 Hz to 2 kHz. For increases in the level of individual reflections, the most likely...

  14. An integrated qualitative and quantitative modeling framework for computer‐assisted HAZOP studies

    DEFF Research Database (Denmark)

    Wu, Jing; Zhang, Laibin; Hu, Jinqiu

    2014-01-01

    ... safety-critical operations, its causes and consequences. The outcome is a qualitative hazard analysis of selected process deviations from normal operations and their consequences as input to a traditional HAZOP table. The list of unacceptable high-risk deviations identified by the qualitative HAZOP ... -assisted HAZOP studies introduced in this article allows the HAZOP team to devote more attention to high-consequence hazards. © 2014 American Institute of Chemical Engineers AIChE J 60: 4150–4173, 2014...

  15. A computational model for histone mark propagation reproduces the distribution of heterochromatin in different human cell types.

    Science.gov (United States)

    Schwämmle, Veit; Jensen, Ole Nørregaard

    2013-01-01

    Chromatin is a highly compact and dynamic nuclear structure that consists of DNA and associated proteins. The main organizational unit is the nucleosome, which consists of a histone octamer with DNA wrapped around it. Histone proteins are implicated in the regulation of eukaryote genes and they carry numerous reversible post-translational modifications that control DNA-protein interactions and the recruitment of chromatin binding proteins. Heterochromatin, the transcriptionally inactive part of the genome, is densely packed and contains histone H3 that is methylated at Lys 9 (H3K9me). The propagation of H3K9me in nucleosomes along the DNA in chromatin is antagonized by methylation of H3 lysine 4 (H3K4me) and acetylation of several lysines, which are associated with euchromatin and active genes. We show that these related histone modifications form antagonistic domains on a coarse scale. These histone marks are assumed to be initiated within distinct nucleation sites in the DNA and to propagate bi-directionally. We propose a simple computer model that simulates the distribution of heterochromatin in human chromosomes. The simulations are in agreement with previously reported experimental observations from two different human cell lines. We reproduced different types of barriers between heterochromatin and euchromatin, providing a unified model for their function. The effects of changes in the nucleation site distribution and in propagation rates were studied. The former occurs mainly with the aim of (de-)activating single genes or gene groups, while the latter has the power to control the transcriptional programs of entire chromosomes. Generally, the regulatory program of gene transcription is controlled by the distribution of nucleation sites along the DNA string.
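
    The propagation-with-antagonism mechanism described above can be illustrated with a very small Monte Carlo toy. The sketch below is not the authors' model; the nucleosome count, nucleation-site positions and rates are assumptions chosen only to show how bidirectional spreading from nucleation sites, opposed by mark turnover and by the antagonistic mark, produces coarse heterochromatin-like and euchromatin-like domains.

    # A minimal sketch, not the published model: 1D Monte Carlo of antagonistic
    # histone-mark spreading from fixed nucleation sites (all parameters assumed).
    # State per nucleosome: 0 = unmodified, +1 = active mark, -1 = repressive mark.
    import numpy as np

    rng = np.random.default_rng(0)

    N_NUC = 500            # nucleosomes in the simulated region
    STEPS = 200_000        # single-nucleosome Monte Carlo updates
    P_PROP = 0.20          # chance that a marked neighbour spreads its mark
    P_TURNOVER = 0.01      # chance that an existing mark is erased
    SITES_R = [100, 350]   # hypothetical nucleation sites of the repressive mark
    SITES_A = [250]        # hypothetical nucleation site of the active mark

    state = np.zeros(N_NUC, dtype=int)

    for _ in range(STEPS):
        i = int(rng.integers(N_NUC))
        if i in SITES_R:           # nucleation sites are continuously re-marked
            state[i] = -1
            continue
        if i in SITES_A:
            state[i] = +1
            continue
        if state[i] != 0 and rng.random() < P_TURNOVER:
            state[i] = 0           # spontaneous turnover erases the mark
            continue
        # Bidirectional propagation: an unmodified nucleosome copies the mark of a
        # random neighbour; a marked one must first be erased (the antagonism).
        j = i + (1 if rng.random() < 0.5 else -1)
        if state[i] == 0 and 0 <= j < N_NUC and state[j] != 0 and rng.random() < P_PROP:
            state[i] = state[j]

    print("fraction repressive:", round(float(np.mean(state == -1)), 3))
    print("fraction active:    ", round(float(np.mean(state == +1)), 3))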

  16. A toy model that predicts the qualitative role of bar bend in a push jerk.

    Science.gov (United States)

    Santos, Aaron; Meltzer, Norman E

    2009-11-01

    In this work, we describe a simple coarse-grained model of a barbell that can be used to determine the qualitative role of bar bend during a jerk. In simulations of this model, we observed a narrow time window during which the lifter can leverage the elasticity of the bar in order to lift the weight to a maximal height. This time window shifted to later times as the weight was increased. In addition, we found that the optimal time to initiate the drive was strongly correlated with the time at which the bar had reached a maximum upward velocity after recoiling. By isolating the effect of the bar, we obtained a generalized strategy for lifting heavy weight in the jerk.
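
    A toy of this kind is easy to reproduce numerically. The sketch below is only a stand-in with assumed numbers (plate mass, bar stiffness, hand trajectory), not the authors' model: the plates are a point mass coupled to the hands through a spring representing bar bend, the hands dip and then drive upward at a chosen onset time, and the plate velocity at the end of a fixed-length drive serves as a proxy for how high the bar will travel. Scanning the onset shows the narrow favourable window tied to the bar's recoil phase.

    # A minimal sketch with assumed parameters, not the published model: plates of mass m
    # hang from the hands through a spring of stiffness k (the bar bend); the hand height
    # is prescribed (dip, then drive) and the drive onset time is scanned.
    import numpy as np

    def hand_height(t, t_drive, dip=0.10, t_dip=0.15, v_drive=1.5):
        """Prescribed hand trajectory: a 10 cm dip over 0.15 s, then an upward drive."""
        h = -dip * min(t / t_dip, 1.0)
        if t > t_drive:
            h += v_drive * (t - t_drive)
        return h

    def plate_velocity_after_drive(t_drive, drive_dur=0.25, m=100.0, k=30000.0, dt=1e-4):
        """Plate velocity when the drive ends - a proxy for the height the bar reaches.
        Positions are measured from static equilibrium, so gravity drops out."""
        x, v, t = 0.0, 0.0, 0.0
        while t < t_drive + drive_dur:
            a = (k / m) * (hand_height(t, t_drive) - x)   # spring pulls plates toward hands
            v += a * dt
            x += v * dt
            t += dt
        return v

    onsets = np.arange(0.16, 0.60, 0.01)
    velocities = [plate_velocity_after_drive(t) for t in onsets]
    best = onsets[int(np.argmax(velocities))]
    print(f"best drive onset ~ {best:.2f} s after the start of the dip "
          f"(peak plate velocity {max(velocities):.2f} m/s)")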

  17. Current induced torques and interfacial spin-orbit coupling: Semiclassical modeling

    KAUST Repository

    Haney, Paul M.

    2013-05-07

    In bilayer nanowires consisting of a ferromagnetic layer and a nonmagnetic layer with strong spin-orbit coupling, currents create torques on the magnetization beyond those found in simple ferromagnetic nanowires. The resulting magnetic dynamics appear to require torques that can be separated into two terms, dampinglike and fieldlike. The dampinglike torque is typically derived from models describing the bulk spin Hall effect and the spin transfer torque, and the fieldlike torque is typically derived from a Rashba model describing interfacial spin-orbit coupling. We derive a model based on the Boltzmann equation that unifies these approaches. We also consider an approximation to the Boltzmann equation, the drift-diffusion model, that qualitatively reproduces the behavior, but quantitatively differs in some regimes. We show that the Boltzmann equation with physically reasonable parameters can match the torques for any particular sample, but in some cases, it fails to describe the experimentally observed thickness dependencies.
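
    The dampinglike/fieldlike decomposition used in this record is easy to state explicitly. The short sketch below is a generic textbook decomposition, not the paper's Boltzmann or drift-diffusion calculation: given a magnetization direction m and the spin-polarization direction sigma fixed by the heavy-metal layer, it builds a torque from assumed dampinglike and fieldlike coefficients and projects an arbitrary torque back onto those two directions.

    # A generic sketch (not the paper's transport calculation): build and decompose a
    # spin-orbit torque into its dampinglike and fieldlike parts; coefficients are assumed.
    import numpy as np

    def unit(v):
        v = np.asarray(v, dtype=float)
        return v / np.linalg.norm(v)

    def sot_torque(m, sigma, tau_dl, tau_fl):
        """Torque per unit moment: tau_dl * m x (sigma x m) + tau_fl * (sigma x m)."""
        m, sigma = unit(m), unit(sigma)
        fl_dir = np.cross(sigma, m)
        dl_dir = np.cross(m, fl_dir)
        return tau_dl * dl_dir + tau_fl * fl_dir

    def decompose(torque, m, sigma):
        """Project a torque onto the (unit) dampinglike and fieldlike directions."""
        m, sigma = unit(m), unit(sigma)
        fl = np.cross(sigma, m)
        fl /= np.linalg.norm(fl)
        dl = np.cross(m, fl)
        return float(np.dot(torque, dl)), float(np.dot(torque, fl))

    # Usual bilayer geometry: current along x, spin polarization sigma along y,
    # magnetization tilted in the x-z plane.
    m = [np.sin(0.3), 0.0, np.cos(0.3)]
    sigma = [0.0, 1.0, 0.0]
    T = sot_torque(m, sigma, tau_dl=1.0, tau_fl=0.3)   # arbitrary units
    print("torque vector:", np.round(T, 3), "-> (dampinglike, fieldlike) =", decompose(T, m, sigma))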

  18. Stochastic models of edge turbulent transport in the thermonuclear reactors

    International Nuclear Information System (INIS)

    Volchenkov, Dima

    2005-01-01

    A two-dimensional stochastic model of turbulent transport in the scrape-off layer (SOL) of thermonuclear reactors is considered. Convective instability arising in the system with respect to perturbations reveals itself in strong outward bursts of particle density propagating ballistically across the SOL. A criterion of stability for the fluctuations of particle density is formulated. The possibility of stabilizing the system depends upon the particular type of plasma wave interactions and the particular scenario of turbulence. Biasing the limiter surface would provide fairly good insulation of the chamber walls except in the resonant cases. The probability density function (pdf) of the particle flux for large-magnitude flux events is modeled with a simple discrete-time toy model of one-dimensional random walks terminating at the boundary. The spectra of wandering times determine the pdf of the particle flux in the model and qualitatively reproduce the experimental statistics of transport events.
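
    The toy model named in this record, one-dimensional random walks absorbed at a boundary, is simple enough to sketch directly. The code below is a generic illustration with assumed parameters, not the author's implementation: it samples "wandering times" (steps until absorption) and prints a log-binned histogram, whose heavy tail is what such first-passage models use to mimic rare, large transport events.

    # A generic sketch with assumed parameters, not the author's model: unbiased
    # discrete-time random walks absorbed at x = 0; the distribution of wandering
    # times has the heavy first-passage tail (~ t^(-3/2)).
    import numpy as np

    rng = np.random.default_rng(1)

    def wandering_time(max_steps=10_000):
        x, t = 1, 0                      # start one site from the absorbing boundary
        while x > 0 and t < max_steps:
            x += 1 if rng.random() < 0.5 else -1
            t += 1
        return t

    times = np.array([wandering_time() for _ in range(5_000)])

    # Crude look at the tail: log-binned, normalized histogram of absorption times.
    bins = np.logspace(0, 4, 15)
    hist, edges = np.histogram(times, bins=bins, density=True)
    for lo, hi, h in zip(edges[:-1], edges[1:], hist):
        if h > 0:
            print(f"t in [{lo:7.0f}, {hi:7.0f}): pdf ~ {h:.2e}")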

  19. Regulating Ultrasound Cavitation in order to Induce Reproducible Sonoporation

    Science.gov (United States)

    Mestas, J.-L.; Alberti, L.; El Maalouf, J.; Béra, J.-C.; Gilles, B.

    2010-03-01

    Sonoporation is thought to be linked to cavitation, which generally appears to be a non-reproducible and non-stationary phenomenon. In order to obtain an acceptable trade-off between cell mortality and transfection, a regulated cavitation generator based on an acoustical cavitation measurement was developed and tested. The medium to be sonicated is placed in a sample tray. This tray is immersed in degassed water and positioned above the face of a flat ultrasonic transducer (frequency: 445 kHz; intensity range: 0.08-1.09 W/cm2). This configuration was considered conducive to standing-wave generation through reflection at the air/medium interface in the well, thus enhancing the cavitation phenomenon. Laterally to the transducer, a homemade hydrophone was oriented to receive the acoustical signal from the bubbles. From this spectral signal, recorded at intervals of 5 ms, a cavitation index was calculated as the mean of the integrated cavitation spectrum on a logarithmic scale, and the excitation power was automatically corrected. The device generates a stable and reproducible cavitation level for a wide range of cavitation setpoints, from stable cavitation up to fully developed inertial cavitation. For the ultrasound intensity range used, the time delay of the response is lower than 200 ms. The cavitation regulation device was evaluated in terms of the chemical effect of bubble collapse: hydroxyl radical production was measured on terephthalic acid solutions. In open loop, the results show great variability whatever the excitation power; in closed loop, on the contrary, they are highly reproducible. This device was implemented for the study of sonodynamic effects. The regulation provides more reproducible results independent of cell medium and experimental conditions (temperature, pressure). Other applications of this regulated cavitation device concern the internalization of different particles (quantum dots), molecules (siRNA) or plasmids (GFP, DsRed) into different...
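
    The regulation principle described here, computing a cavitation index from the bubble spectrum every few milliseconds and correcting the excitation power toward a setpoint, can be summarized in a few lines. The sketch below is a hypothetical software loop, not the authors' hardware: the spectrum "acquisition" is faked, the index is the mean of the log-scaled spectrum as stated in the abstract, and the gains, bounds and plant response are assumptions.

    # A hypothetical sketch of the regulation loop (acquisition is faked, gains and
    # bounds are assumed); the cavitation index is the mean of the log-scaled spectrum.
    import numpy as np

    rng = np.random.default_rng(2)

    def cavitation_index(spectrum):
        """Mean of the cavitation spectrum on a logarithmic (dB-like) scale."""
        return float(np.mean(10.0 * np.log10(spectrum + 1e-12)))

    def measured_spectrum(power):
        """Stand-in for the hydrophone acquisition: level rises with power, with noise."""
        return power * (1.0 + 0.3 * rng.random(1024))

    def regulate(setpoint_db, steps=400, kp=0.05, ki=0.01, dt=0.005):
        """Every dt (5 ms) measure the index and apply a PI correction to the power."""
        power, integral, idx = 0.1, 0.0, 0.0
        for _ in range(steps):
            idx = cavitation_index(measured_spectrum(power))
            err = setpoint_db - idx
            integral += err * dt
            # 0.01-1.1 W/cm^2 roughly matches the intensity range quoted above.
            power = float(np.clip(power + kp * err + ki * integral, 0.01, 1.1))
        return power, idx

    power, idx = regulate(setpoint_db=-3.0)
    print(f"steady state: power ~ {power:.2f} W/cm^2, cavitation index ~ {idx:.1f} dB")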

  20. Reproducibility of Computer-Aided Detection Marks in Digital Mammography

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Kim, Sun Mi; Im, Jung Gi; Cha, Joo Hee

    2007-01-01

    To evaluate the performance and reproducibility of a computer-aided detection (CAD) system in mediolateral oblique (MLO) digital mammograms taken serially, without release of breast compression. A CAD system was applied preoperatively to the full-field digital mammograms of two MLO views taken without release of breast compression in 82 patients (age range: 33-83 years; mean age: 49 years) with previously diagnosed breast cancers. The total number of visible lesion components in the 82 patients was 101: 66 masses and 35 microcalcifications. We analyzed the sensitivity and reproducibility of the CAD marks. The sensitivity of the CAD system for the first MLO views was 71% (47/66) for masses and 80% (28/35) for microcalcifications. The sensitivity for the second MLO views was 68% (45/66) for masses and 17% (6/35) for microcalcifications. In 84 ipsilateral serial MLO image sets (two patients had bilateral cancers), identical images, regardless of the existence of CAD marks, were obtained for 35% (29/84), and identical images with CAD marks were obtained for 29% (23/78). For contralateral MLO images, identical images regardless of the existence of CAD marks were obtained for 65% (52/80), and identical images with CAD marks for 28% (11/39). The reproducibility of CAD marks for the true-positive masses in serial MLO views was 84% (42/50), and that for the true-positive microcalcifications was 0% (0/34). The CAD system in digital mammograms showed a high sensitivity for detecting masses and microcalcifications. However, the reproducibility of microcalcification marks was very low in MLO views taken serially without release of breast compression. Minute positional changes and patient movement can alter the images and have a significant effect on the algorithm utilized by the CAD system for detecting microcalcifications.

  1. Sample Size in Qualitative Interview Studies: Guided by Information Power.

    Science.gov (United States)

    Malterud, Kirsti; Siersma, Volkert Dirk; Guassora, Ann Dorrit

    2015-11-27

    Sample sizes must be determined in qualitative studies, as in quantitative studies, but not by the same means. The prevailing concept for sample size in qualitative studies is "saturation." Saturation is closely tied to a specific methodology, and the term is inconsistently applied. We propose the concept "information power" to guide adequate sample size for qualitative studies. Information power indicates that the more information the sample holds, relevant for the actual study, the fewer participants are needed. We suggest that the size of a sample with sufficient information power depends on (a) the aim of the study, (b) sample specificity, (c) use of established theory, (d) quality of dialogue, and (e) analysis strategy. We present a model where these elements of information and their relevant dimensions are related to information power. Application of this model in the planning and during data collection of a qualitative study is discussed. © The Author(s) 2015.

  2. Prototyping qualitative controllers for fuzzy-logic controller design

    International Nuclear Information System (INIS)

    Bakhtiari, S.; Jabedar-Maralani, P.

    1999-05-01

    Qualitative controls can be designed for linear and nonlinear models with the same computational complexity. At the same time they show the general form of the proper control. These properties can help ease the design process for quantitative controls. In this paper qualitative controls are used as prototypes for the design of linear or nonlinear, and in particular Sugeno-type fuzzy, controls. The LMS identification method is used to approximate the qualitative control with the nearest fuzzy control. The method is applied to the problem of position control in a permanent magnet synchronous motor; moreover, the performance and the robustness of the two controllers are compared
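
    The core idea, using the LMS algorithm to fit a Sugeno-type fuzzy controller to a qualitative prototype, can be sketched compactly. The code below is only an illustration under assumptions: the "qualitative prototype" is replaced by a saturated proportional law, and a zero-order Sugeno controller with five triangular membership functions has its consequent singletons adapted by LMS to match that prototype.

    # An illustrative sketch (prototype law, rule base and step size are assumed):
    # LMS adaptation of the consequents of a zero-order Sugeno fuzzy controller so
    # that its output approximates a given prototype control law.
    import numpy as np

    rng = np.random.default_rng(3)

    def prototype_control(e):
        """Stand-in for the qualitative prototype: proportional action saturating at +/-1."""
        return float(np.clip(2.0 * e, -1.0, 1.0))

    centers = np.linspace(-1.0, 1.0, 5)     # rule centers over the error range
    theta = np.zeros_like(centers)          # consequent singletons (adapted below)

    def firing(e, width=0.5):
        """Normalized triangular membership degrees (the fuzzy regressor)."""
        mu = np.maximum(0.0, 1.0 - np.abs(e - centers) / width)
        return mu / mu.sum()

    eta = 0.2                               # LMS step size
    for _ in range(20_000):
        e = rng.uniform(-1.0, 1.0)
        phi = firing(e)
        err = prototype_control(e) - float(phi @ theta)
        theta += eta * err * phi            # LMS update

    print("adapted consequents:", np.round(theta, 3))
    e_test = 0.25
    print("fuzzy output at e = 0.25:", round(float(firing(e_test) @ theta), 3),
          " prototype:", prototype_control(e_test))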

  3. Reproducibility of heart rate variability, blood pressure variability and baroreceptor sensitivity during rest and head-up tilt

    DEFF Research Database (Denmark)

    Højgaard, Michael V; Agner, Erik; Kanters, Jørgen K

    2005-01-01

    OBJECTIVE: Previous studies have indicated moderate-to-poor reproducibility of heart rate variability (HRV), but the reproducibility of blood pressure variability (BPV) and spectral measures of baroreceptor sensitivity (BRS) is not well established. METHODS: We measured normal-to-normal heart beat ... pressures were extracted for the assessment of day-to-day and short-term reproducibility. Power spectrum analysis (Fourier) and transfer function analysis were performed. Reproducibility was assessed using the coefficient of variation (CV). The reproducibility of the mean RR interval and mean systolic, diastolic and mean blood pressure was good (CV ...), with poorer reproducibility of the spectral parameters of HRV (CV range 18-36%) and BPV (16-44%) and moderate reproducibility of BRS (14-20%). CONCLUSION: Spectral estimates of BRS had only moderate reproducibility, although...

  4. Stable, precise, and reproducible patterning of bicoid and hunchback molecules in the early Drosophila embryo.

    Directory of Open Access Journals (Sweden)

    Yurie Okabe-Oho

    2009-08-01

    Full Text Available Precise patterning of morphogen molecules and their accurate readout are of key importance in embryonic development. Recent experiments have visualized distributions of proteins in developing embryos and shown that the gradient of concentration of Bicoid morphogen in Drosophila embryos is established rapidly after fertilization and remains stable through syncytial mitoses. This stable Bicoid gradient is read out in a precise way, distributing Hunchback with small fluctuations within each embryo, and in a reproducible way, with small embryo-to-embryo fluctuation. The mechanisms of such stable, precise, and reproducible patterning through noisy cellular processes, however, still remain mysterious. To address these issues, here we develop one- and three-dimensional stochastic models of the early Drosophila embryo. The simulation results show that the fluctuation in expression of the hunchback gene is dominated by the random arrival of Bicoid at the hunchback enhancer. Slow diffusion of Hunchback protein, however, averages out this intense fluctuation, leading to precise patterning of the Hunchback distribution without loss of sharpness at its boundary. The coordinated rates of diffusion and transport of input Bicoid and output Hunchback play decisive roles in suppressing fluctuations arising from the dynamical structure change in embryos and those arising from the random diffusion of molecules, and give rise to the stable, precise, and reproducible patterning of the Bicoid and Hunchback distributions.
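
    The mechanism highlighted by this record, noisy enhancer readout of a stable Bicoid gradient smoothed by slow Hunchback diffusion, can be caricatured in a few lines. The sketch below is not the authors' one- or three-dimensional model: the gradient decay length, Hill readout, sampling depth and the averaging window standing in for Hunchback diffusion are all assumptions, chosen only to show how spatial averaging tightens the boundary position without blurring it away.

    # A caricature with assumed parameters, not the published stochastic model:
    # noisy per-nucleus readout of an exponential Bicoid gradient, with local
    # averaging standing in for slow Hunchback diffusion.
    import numpy as np

    rng = np.random.default_rng(4)

    N_NUCLEI = 200
    x = np.linspace(0.0, 1.0, N_NUCLEI)        # position along the embryo axis (fraction)
    bicoid = np.exp(-x / 0.2)                  # stable exponential gradient (decay length assumed)
    K = np.exp(-0.5 / 0.2)                     # readout threshold = Bicoid level at mid-embryo

    def noisy_hunchback(n_samples=50, hill_n=5):
        """Each nucleus estimates Bicoid occupancy from a finite number of random arrivals."""
        p_on = bicoid**hill_n / (bicoid**hill_n + K**hill_n)
        return rng.binomial(n_samples, p_on) / n_samples

    def boundary(profile, threshold=0.5):
        """Anterior-most position where the Hunchback profile falls below the threshold."""
        return float(x[np.argmax(profile < threshold)])

    raw = np.array([boundary(noisy_hunchback()) for _ in range(200)])
    kernel = np.ones(5) / 5                    # averaging over ~5 nuclei mimics diffusion
    smooth = np.array([boundary(np.convolve(noisy_hunchback(), kernel, mode="same"))
                       for _ in range(200)])

    print(f"boundary without averaging: {raw.mean():.3f} +/- {raw.std():.4f}")
    print(f"boundary with averaging   : {smooth.mean():.3f} +/- {smooth.std():.4f}")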

  5. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    Science.gov (United States)

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2008-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often present in addition to the results of bivariate and multivariable analyses. Qualitative metasummary, which includes the extraction, grouping, and formatting of findings, and the calculation of frequency and intensity effect sizes, can be used to produce mixed research syntheses and to conduct a posteriori analyses of the relationship between reports and findings. PMID:17243111
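
    The two quantities named at the end of this record can be written down directly. The snippet below uses simplified, assumed definitions (frequency effect size as the share of reports containing a finding; intensity effect size as the share of all distinct findings contained in a report) and made-up report data, purely to make the calculation concrete.

    # A small illustration with made-up data and simplified definitions (assumed):
    # frequency effect size of a finding = share of reports containing it;
    # intensity effect size of a report = share of all distinct findings it contains.
    findings_by_report = {
        "report_A": {"f1", "f2", "f3"},
        "report_B": {"f1", "f3"},
        "report_C": {"f2"},
    }

    all_findings = set().union(*findings_by_report.values())
    n_reports = len(findings_by_report)

    frequency = {f: sum(f in fs for fs in findings_by_report.values()) / n_reports
                 for f in sorted(all_findings)}
    intensity = {r: len(fs) / len(all_findings) for r, fs in findings_by_report.items()}

    print("frequency effect sizes:", frequency)
    print("intensity effect sizes:", intensity)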

  6. Reproducibility Test for Thermoluminescence Dosimeter (TLD) Using TLD Radpro

    International Nuclear Information System (INIS)

    Nur Khairunisa Zahidi; Ahmad Bazlie Abdul Kadir; Faizal Azrin Abdul Razalim

    2016-01-01

    Thermoluminescence dosimeters (TLDs) are one type of dosimeter often used as a substitute for the film badge. Like a film badge, a TLD is worn for a period of time and then must be processed to determine the dose received. This study tested the reproducibility of TLDs using the Radpro reader. It aimed to determine the dose obtained by TLD-100 chips when irradiated with a Co-60 gamma source and to test the effectiveness of the TLD Radpro reader as a machine for analysing TLDs. Ten TLD-100 chips were irradiated using an Eldorado machine with a Co-60 source at a distance of 5 meters from the source with a 2 mSv dose exposure. After the irradiation process, the TLD-100 chips were read using the TLD Radpro reader. These steps were repeated nine times to obtain the reproducibility coefficient, r_i. The doses obtained from the experiment were almost equivalent to the actual dose. Results show that the average value obtained for the reproducibility coefficient, r_i, is 6.39 %, which is less than 10 %. In conclusion, the doses obtained from the experiment are considered accurate because their values were almost equivalent to the actual dose, and the TLD Radpro reader was verified as a good reader for analysing TLDs. (author)
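
    The study's acceptance criterion (reproducibility coefficient below 10 %) is straightforward to compute if, as assumed here, r_i is taken as the coefficient of variation of repeated readings. The readings below are hypothetical; only the calculation is of interest.

    # A minimal sketch assuming the reproducibility coefficient r_i is the coefficient
    # of variation of repeated readings; the readings themselves are hypothetical.
    import numpy as np

    readings_mSv = np.array([1.98, 2.05, 2.11, 1.89, 2.02, 2.17, 1.95, 2.08, 1.92, 2.04])

    mean = readings_mSv.mean()
    r_i = 100.0 * readings_mSv.std(ddof=1) / mean

    print(f"mean dose          : {mean:.2f} mSv (delivered dose: 2.00 mSv)")
    print(f"reproducibility r_i: {r_i:.2f} %  (study acceptance criterion: < 10 %)")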

  7. Repeatability and Reproducibility of Fibre-Based Nanogenerator Synthesized by Electrospinning Machine

    International Nuclear Information System (INIS)

    Suyitno; Huda, Sholiehul; Arifin, Zainal; Hadi, Syamsul; Lambang, Raymundus Lullus

    2014-01-01

    Zinc oxide fibre-based nanogenerators, synthesized easily by an electrospinning machine, are promising for harvesting electricity from mechanical energy. However, repeatability and reproducibility are two major factors that need to be investigated to minimize product failure and to determine the feasibility of mass production of nanogenerators. The green fibres of zinc oxide were produced by an electrospinning machine from a zinc acetate and polyvinyl alcohol solution at a flow rate of 4 μL/min, followed by sintering at a temperature of 550°C with a heating rate of 240°C/h. Each of the 10 nanogenerators was tested by three trained operators with three repetitions at a compressive load of 0.5 kg. The nanogenerators showed maximum output voltages ranging from 203 to 217 mV. The repeatability and reproducibility value of the nanogenerators was approximately 24.29%, indicating that the nanogenerators are still acceptable for mass production. The relatively low reproducibility was mainly due to the operators, so the checklist needs to be made easier and simpler for all the variables affecting the quality of the fibres. Reducing the repeatability and reproducibility value is worth studying further, for example by creating a rotating collector so that the thickness and orientation of the fibres can be controlled better.
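
    A repeatability-and-reproducibility figure of the kind quoted here (a crossed study: 10 devices x 3 operators x 3 trials) can be estimated with a simple variance decomposition. The sketch below is not the authors' analysis; the synthetic data and the simplified decomposition (pooled within-cell variance for repeatability, between-operator variance for reproducibility) are assumptions used only to show the structure of the calculation.

    # A simplified sketch (synthetic data, simplified decomposition - not the authors'
    # analysis): crossed gauge study with 10 devices, 3 operators and 3 trials each.
    import numpy as np

    rng = np.random.default_rng(5)

    n_parts, n_ops, n_trials = 10, 3, 3
    part_effect = rng.normal(210.0, 5.0, n_parts)     # device-to-device voltage, mV (assumed)
    op_bias = rng.normal(0.0, 1.0, n_ops)             # operator-to-operator bias (assumed)
    noise_sd = 0.8                                    # trial-to-trial noise (assumed)
    data = (part_effect[:, None, None] + op_bias[None, :, None]
            + rng.normal(0.0, noise_sd, (n_parts, n_ops, n_trials)))

    var_repeat = data.var(axis=2, ddof=1).mean()                      # within operator-part cells
    op_means = data.mean(axis=(0, 2))                                 # one mean per operator
    var_reprod = max(op_means.var(ddof=1) - var_repeat / (n_parts * n_trials), 0.0)
    var_total = data.var(ddof=1)

    grr = 100.0 * np.sqrt(var_repeat + var_reprod) / np.sqrt(var_total)
    print(f"repeatability variance  : {var_repeat:.2f} mV^2")
    print(f"reproducibility variance: {var_reprod:.2f} mV^2")
    print(f"%R&R                    : {grr:.1f} % of total spread")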

  8. Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study

    Directory of Open Access Journals (Sweden)

    van de Water Tanja

    2017-01-01

    Full Text Available Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) and for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time, discriminating between elite and non-elite players (F = 6.650, p < 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with a national ranking for elite players (p = 0.70, p < 0.05). In conclusion, the reproducibility and validity of the inhibitory control assessment were not confirmed; however, the BRIT appears to be a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players’ performance.

  9. Assessing Cognitive Performance in Badminton Players: A Reproducibility and Validity Study.

    Science.gov (United States)

    van de Water, Tanja; Huijgen, Barbara; Faber, Irene; Elferink-Gemser, Marije

    2017-01-01

    Fast reaction and good inhibitory control are associated with elite sports performance. To evaluate the reproducibility and validity of a newly developed Badminton Reaction Inhibition Test (BRIT), fifteen elite (25 ± 4 years) and nine non-elite (24 ± 4 years) Dutch male badminton players participated in the study. The BRIT measured four components: domain-general reaction time, badminton-specific reaction time, domain-general inhibitory control and badminton-specific inhibitory control. Five participants were retested within three weeks on the badminton-specific components. Reproducibility was acceptable for badminton-specific reaction time (ICC = 0.626, CV = 6%) and for badminton-specific inhibitory control (ICC = 0.317, CV = 13%). Good construct validity was shown for badminton-specific reaction time discriminating between elite and non-elite players (F = 6.650, p < 0.05). Concurrent validity for domain-general reaction time was good, as it was associated with a national ranking for elite players (p = 0.70, p < 0.05), but not for badminton-specific reaction time, nor for both components of inhibitory control (p > 0.05). In conclusion, the reproducibility and validity of the inhibitory control assessment were not confirmed; however, the BRIT appears to be a reproducible and valid measure of reaction time in badminton players. Reaction time measured with the BRIT may provide input for training programs aiming to improve badminton players' performance.
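
    The test-retest agreement statistics quoted in both versions of this record (ICC and CV for the five retested players) can be computed as follows. The numbers below are hypothetical reaction times and the formula is the generic ICC(2,1) (two-way random effects, absolute agreement, single measurement), so this is an illustration of the statistic, not a reanalysis of the study.

    # An illustration with hypothetical data: test-retest ICC(2,1) and within-subject
    # CV for five retested players (milliseconds; rows = players, columns = sessions).
    import numpy as np

    x = np.array([[520.0, 560.0],
                  [488.0, 455.0],
                  [601.0, 570.0],
                  [455.0, 495.0],
                  [540.0, 505.0]])

    n, k = x.shape
    grand = x.mean()
    ss_total = np.sum((x - grand) ** 2)
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)       # between players
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)       # between sessions
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))

    icc_2_1 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
    cv_percent = 100.0 * np.sqrt(np.mean(x.var(axis=1, ddof=1))) / grand

    print(f"ICC(2,1) = {icc_2_1:.3f}, within-subject CV = {cv_percent:.1f} %")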

  10. Projected Shell Model Description of Positive Parity Band of 130Pr Nucleus

    Science.gov (United States)

    Singh, Suram; Kumar, Amit; Singh, Dhanvir; Sharma, Chetan; Bharti, Arun; Bhat, G. H.; Sheikh, J. A.

    2018-02-01

    A theoretical investigation of the positive-parity yrast band of the odd-odd 130Pr nucleus is performed by applying the projected shell model. The present study is undertaken to investigate and verify theoretically, in terms of quasi-particle (qp) configurations, the very recently observed side band in 130Pr. From the analysis of the band diagram, both the yrast and the side band are found to arise from the two-qp configuration πh11/2 ⊗ νh11/2. The present calculations qualitatively reproduce the known experimental data for the yrast states, transition energies, and B(M1)/B(E2) ratios of this nucleus. The recently observed positive-parity side band is also reproduced by the present calculations. The energy states of the side band are predicted up to spin 25+, far above the highest experimentally known spin of 18+, and this could serve as motivation for future experiments. In addition, the reduced transition probability B(E2) for interband transitions has been calculated for the first time within the projected shell model, which may encourage further work by other research groups.

  11. Qualitative methods textbooks

    OpenAIRE

    Barndt, William

    2003-01-01

    Over the past few years, the number of political science departments offering qualitative methods courses has grown substantially. The number of qualitative methods textbooks has kept pace, providing instructors with an overwhelming array of choices. But how does one decide which text to choose from this exhortatory smorgasbord? The scholarship desperately needs to be evaluated. Yet the task is not entirely straightforward: qualitative methods textbooks reflect the diversity inherent in qualitative metho...

  12. Using a theory-driven conceptual framework in qualitative health research.

    Science.gov (United States)

    Macfarlane, Anne; O'Reilly-de Brún, Mary

    2012-05-01

    The role and merits of highly inductive research designs in qualitative health research are well established, and there has been a powerful proliferation of grounded theory method in the field. However, tight qualitative research designs informed by social theory can be useful to sensitize researchers to concepts and processes that they might not necessarily identify through inductive processes. In this article, we provide a reflexive account of our experience of using a theory-driven conceptual framework, the Normalization Process Model, in a qualitative evaluation of general practitioners' uptake of a free, pilot, language interpreting service in the Republic of Ireland. We reflect on our decisions about whether or not to use the Model, and describe our actual use of it to inform research questions, sampling, coding, and data analysis. We conclude with reflections on the added value that the Model and tight design brought to our research.

  13. Nursing students' perceptions of a collaborative clinical placement model: A qualitative descriptive study.

    Science.gov (United States)

    van der Riet, Pamela; Levett-Jones, Tracy; Courtney-Pratt, Helen

    2018-03-01

    Clinical placements are specifically designed to facilitate authentic learning opportunities and are an integral component of undergraduate nursing programs. However, as academics and clinicians frequently point out, clinical placements are fraught with problems that are long-standing and multidimensional in nature. Collaborative placement models, grounded in a tripartite relationship between students, university staff and clinical partners, and designed to foster students' sense of belonging, have recently been implemented to address many of the challenges associated with clinical placements. In this study a qualitative descriptive design was undertaken with the aim of exploring 14 third-year nursing students' perceptions of a collaborative clinical placement model undertaken in an Australian university. Students participated in audio-recorded focus groups following their final clinical placement. Thematic analysis of the interview data resulted in identification of six main themes: Convenience and Camaraderie, Familiarity and Confidence, Welcomed and Wanted, Belongingness and Support, Employment, and The Need for Broader Clinical Experiences. The collaborative clinical model fostered a sense of familiarity for many of the participants, and this led to belongingness, acceptance, confidence and meaningful learning experiences. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Esthetic Evaluation of Implant Crowns and Peri-Implant Soft Tissue in the Anterior Maxilla: Comparison and Reproducibility of Three Different Indices.

    Science.gov (United States)

    Tettamanti, Sandro; Millen, Christopher; Gavric, Jelena; Buser, Daniel; Belser, Urs C; Brägger, Urs; Wittneben, Julia-Gabriela

    2016-06-01

    A successful implant reconstruction with optimal esthetics consists of a visually pleasing prosthesis and complete and healthy surrounding soft tissue. In the current literature, numerous indices used to qualitatively assess esthetics have been described. However, studies comparing the indices and their reproducibility are scarce. The aim of this study was to compare three different esthetic indices for the evaluation of single implant-supported crowns. A total of 10 prosthodontists (P), 10 orthodontists (O), 10 general dentists (G), and 10 lay people (L) independently performed the same assessment using 30 photographs and corresponding casts with three different esthetic indices (Peri-Implant and Crown Index [PICI], Implant Crown Aesthetic Index [ICAI], Pink Esthetic Score/White Esthetic Score [PES/WES]) and repeated the evaluations 4 weeks later. The PES/WES and the PICI showed significantly higher esthetic scores (pink, white, total) and clinical acceptance compared with the ICAI in all four groups and in both assessments. The highest intraobserver agreement was achieved using the PES/WES and the least with the ICAI. The mean Kappa per group ranged from 0.18 (group L with ICAI) to 0.63 (group G with PICI). In comparison with the ICAI, the PES/WES and PICI were more reproducible. Therefore, PES/WES and PICI seem to be more suitable as esthetic indices for single implant crowns. © 2015 Wiley Periodicals, Inc.
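
    The intra-observer agreement reported per rater group is a Cohen's kappa between the first and second assessment of the same cases. The snippet below computes kappa for a made-up set of binary acceptability ratings; it is a generic illustration of the statistic, not data from the study.

    # A generic illustration with made-up ratings (not study data): Cohen's kappa
    # between the first and second assessment of the same 20 cases on a binary
    # acceptable / not-acceptable scale.
    from collections import Counter

    first  = ["acc", "acc", "not", "acc", "not", "acc", "acc", "not", "acc", "acc",
              "not", "acc", "acc", "acc", "not", "acc", "not", "not", "acc", "acc"]
    second = ["acc", "not", "not", "acc", "not", "acc", "acc", "not", "acc", "acc",
              "acc", "acc", "acc", "acc", "not", "acc", "not", "acc", "acc", "acc"]

    n = len(first)
    observed = sum(a == b for a, b in zip(first, second)) / n

    c1, c2 = Counter(first), Counter(second)
    expected = sum((c1[c] / n) * (c2[c] / n) for c in set(first) | set(second))

    kappa = (observed - expected) / (1.0 - expected)
    print(f"observed agreement = {observed:.2f}, chance agreement = {expected:.2f}, kappa = {kappa:.2f}")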

  15. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and
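
    The classification step named here, a self-organizing map assigning each grid cell to one of twelve land system archetypes, can be sketched without the web-service wrapper. The code below is a generic SOM on synthetic indicator data, not the GLUES/WPS4R workflow; the grid size, indicators and training schedule are assumptions, and the point is only to show how the data-driven archetypes arise.

    # A generic sketch (synthetic indicators, assumed training schedule - not the
    # GLUES/WPS4R workflow): a small self-organizing map whose 12 nodes play the
    # role of land system archetypes.
    import numpy as np

    rng = np.random.default_rng(6)

    cells = rng.normal(size=(2000, 3))        # 2000 grid cells x 3 standardized indicators
    GRID = (3, 4)                             # 3 x 4 = 12 archetypes
    n_nodes = GRID[0] * GRID[1]
    weights = rng.normal(size=(n_nodes, cells.shape[1]))
    coords = np.array([(i, j) for i in range(GRID[0]) for j in range(GRID[1])], dtype=float)

    def train(x, w, epochs=10, lr0=0.5, sigma0=1.5):
        for epoch in range(epochs):
            lr = lr0 * (1.0 - epoch / epochs)
            sigma = max(sigma0 * (1.0 - epoch / epochs), 0.5)
            for v in rng.permutation(x):
                bmu = int(np.argmin(((w - v) ** 2).sum(axis=1)))    # best-matching unit
                d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
                h = np.exp(-d2 / (2.0 * sigma ** 2))                # neighbourhood kernel
                w += lr * h[:, None] * (v - w)
        return w

    weights = train(cells, weights)
    archetype = np.array([int(np.argmin(((weights - v) ** 2).sum(axis=1))) for v in cells])
    print("cells per archetype:", np.bincount(archetype, minlength=n_nodes))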

  16. Qualitative Assessment of Inquiry-Based Teaching Methods

    Science.gov (United States)

    Briggs, Michael; Long, George; Owens, Katrina

    2011-01-01

    A new approach to teaching method assessment using student focused qualitative studies and the theoretical framework of mental models is proposed. The methodology is considered specifically for the advantages it offers when applied to the assessment of inquiry-based teaching methods. The theoretical foundation of mental models is discussed, and…

  17. Can cancer researchers accurately judge whether preclinical reports will reproduce?

    Directory of Open Access Journals (Sweden)

    Daniel Benjamin

    2017-06-01

    Full Text Available There is vigorous debate about the reproducibility of research findings in cancer biology. Whether scientists can accurately assess which experiments will reproduce original findings is important to determining the pace at which science self-corrects. We collected forecasts from basic and preclinical cancer researchers on the first 6 replication studies conducted by the Reproducibility Project: Cancer Biology (RP:CB) to assess the accuracy of expert judgments on specific replication outcomes. On average, researchers forecasted a 75% probability of replicating the statistical significance and a 50% probability of replicating the effect size, yet none of these studies successfully replicated on either criterion (for the 5 studies with results reported). Accuracy was related to expertise: experts with higher h-indices were more accurate, whereas experts with more topic-specific expertise were less accurate. Our findings suggest that experts, especially those with specialized knowledge, were overconfident about the RP:CB replicating individual experiments within published reports; researcher optimism likely reflects a combination of overestimating the validity of original studies and underestimating the difficulties of repeating their methodologies.

  18. Intercenter reproducibility of binary typing for Staphylococcus aureus

    NARCIS (Netherlands)

    van Leeuwen, Willem B.; Snoeijers, Sandor; van der Werken-Libregts, Christel; Tuip, Anita; van der Zee, Anneke; Egberink, Diane; de Proost, Monique; Bik, Elisabeth; Lunter, Bjorn; Kluytmans, Jan; Gits, Etty; van Duyn, Inge; Heck, Max; van der Zwaluw, Kim; Wannet, Wim; Noordhoek, Gerda T.; Mulder, Sije; Renders, Nicole; Boers, Miranda; Zaat, Sebastiaan; van der Riet, Daniëlle; Kooistra, Mirjam; Talens, Adriaan; Dijkshoorn, Lenie; van der Reyden, Tanny; Veenendaal, Dick; Bakker, Nancy; Cookson, Barry; Lynch, Alisson; Witte, Wolfgang; Cuny, Christa; Blanc, Dominique; Vernez, Isabelle; Hryniewicz, Waleria; Fiett, Janusz; Struelens, Marc; Deplano, Ariane; Landegent, Jim; Verbrugh, Henri A.; van Belkum, Alex

    2002-01-01

    The reproducibility of the binary typing (BT) protocol developed for epidemiological typing of Staphylococcus aureus was analyzed in a biphasic multicenter study. In a Dutch multicenter pilot study, 10 genetically unique isolates of methicillin-resistant S. aureus (MRSA) were characterized by the BT

  19. Intra-observer reproducibility and diagnostic performance of breast shear-wave elastography in Asian women.

    Science.gov (United States)

    Park, Hye Young; Han, Kyung Hwa; Yoon, Jung Hyun; Moon, Hee Jung; Kim, Min Jung; Kim, Eun-Kyung

    2014-06-01

    Our aim was to evaluate the intra-observer reproducibility of shear-wave elastography (SWE) in Asian women. Sixty-four breast masses (24 malignant, 40 benign) were examined with SWE in 53 consecutive Asian women (mean age, 44.9 years). Two SWE images were obtained for each of the lesions. Intra-observer reproducibility was assessed by intra-class correlation coefficients (ICC). We also evaluated various clinicoradiologic factors that can influence reproducibility in SWE. The ICC of intra-observer reproducibility was 0.789. In the clinicoradiologic factor evaluation, masses surrounded by mixed fatty and glandular tissue (ICC: 0.619) showed lower intra-observer reproducibility compared with lesions that were surrounded by glandular tissue alone (ICC: 0.937; p ...). Overall, intra-observer reproducibility of breast SWE was excellent in Asian women. However, it may decrease when breast tissue is in a heterogeneous background. Therefore, SWE should be performed carefully in these cases. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  20. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...