WorldWideScience

Sample records for model reproduces qualitatively

  1. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators, (2) shared computational resources, (3) declarative model descriptors, ontologies and standardized annotations, and (4) model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  2. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and pore-scale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-pore-scale geometry differences in the microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using the commercial software STAR-CCM+, both with constant and with randomly varying injection rate. Stochastic simulations are able to capture the variability in the experiments associated with the varying pump injection rate.
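
    The stochastic simulations described above treat the syringe-pump injection rate as a randomly fluctuating boundary condition. A minimal sketch of how such a fluctuating rate series could be generated is given below; the nominal rate, noise level and function name are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def noisy_injection_rate(q_nominal, n_steps, sigma_rel=0.05, seed=0):
        """Synthetic injection-rate series fluctuating around a nominal pump rate;
        sigma_rel is the relative standard deviation of the fluctuations."""
        rng = np.random.default_rng(seed)
        return q_nominal * (1.0 + sigma_rel * rng.standard_normal(n_steps))

    # Example: a nominal 1.0 uL/min rate sampled over 1000 time steps, which could
    # be imposed as the inlet boundary condition of a pore-scale flow solver.
    q = noisy_injection_rate(1.0, 1000)
    print(q.mean(), q.std())
    ```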

  3. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  4. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

    Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  5. Modeling and evaluating repeatability and reproducibility of ordinal classifications

    NARCIS (Netherlands)

    de Mast, J.; van Wieringen, W.N.

    2010-01-01

    This paper argues that currently available methods for the assessment of the repeatability and reproducibility of ordinal classifications are not satisfactory. The paper aims to study whether we can modify a class of models from Item Response Theory, well established for the study of the reliability

  6. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.

  7. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than provide raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility makes using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model, we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent, testing capability and a means to document model output and analysis.
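
    As a concrete illustration of the checksum bookkeeping described above, the sketch below verifies experiment output against digests kept under version control. The file names (checksums.json, output/) are hypothetical and the approach is generic, not the MOM6/SIS2 project's actual tooling.

    ```python
    import hashlib
    import json
    from pathlib import Path

    def sha256sum(path, chunk=1 << 20):
        """Stream a file and return its SHA-256 hex digest."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    def changed_outputs(manifest_file="checksums.json", out_dir="output"):
        """Compare current experiment output against checksums kept under version control."""
        expected = json.loads(Path(manifest_file).read_text())
        return {name: digest for name, digest in expected.items()
                if sha256sum(Path(out_dir) / name) != digest}

    # An empty dict means the solutions are bitwise unchanged; any entry documents
    # a solution change to be explained by a code update or new input data.
    ```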

  8. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    Science.gov (United States)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed to reproduce various tectonic settings in thin-skinned tectonics. Our models analyze, in particular, those features reported in the literature as possible causes of peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models, the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers; the latter were reproduced in the models by silicone. The sand forming the models had been previously mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field with intensity varying between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating field demagnetization was used to isolate the principal components. The isolated characteristic components of magnetization were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward-propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms.

  9. From alginate impressions to digital virtual models: accuracy and reproducibility.

    Science.gov (United States)

    Dalstra, Michel; Melsen, Birte

    2009-03-01

    The aim was to compare the accuracy and reproducibility of measurements performed on digital virtual models with those taken on plaster casts poured immediately after the impression was taken (the 'gold standard') and on plaster models poured after a 3-5 day shipping procedure of the alginate impression. The design was a direct comparison of two measuring techniques. The study was conducted at the Department of Orthodontics, School of Dentistry, University of Aarhus, Denmark in 2006/2007. Twelve randomly selected orthodontic graduate students participated with informed consent. Three sets of alginate impressions were taken from the participants within 1 hour. Plaster models were poured immediately from two of the sets, while the third set was kept in transit in the mail for 3-5 days. Upon return, a plaster model was poured as well. Finally, digital models were made from the plaster models. A number of measurements were performed on the plaster casts with a digital calliper and on the corresponding digital models using the virtual measuring tool of the accompanying software. Afterwards, these measurements were compared statistically. No statistical differences were found between the three sets of plaster models. The intra- and inter-observer variability was smaller for the measurements performed on the digital models. Sending alginate impressions by mail does not affect the quality and accuracy of plaster casts poured from them afterwards. Virtual measurements performed on digital models display less variability than the corresponding measurements performed with a calliper on the actual models.

  10. A reproducible brain tumour model established from human glioblastoma biopsies

    International Nuclear Information System (INIS)

    Wang, Jian; Chekenya, Martha; Bjerkvig, Rolf; Enger, Per Ø; Miletic, Hrvoje; Sakariassen, Per Ø; Huszthy, Peter C; Jacobsen, Hege; Brekkå, Narve; Li, Xingang; Zhao, Peng; Mørk, Sverre

    2009-01-01

    Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. In this work, we collected data on growth kinetics from 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  11. Can a coupled meteorology–chemistry model reproduce the ...

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  12. Modelling soil erosion at European scale: towards harmonization and reproducibility

    Science.gov (United States)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite these efforts, the predictive value of existing models is still limited, especially at regional and continental scale, because a systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is proposed here. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented, merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for a better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
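
    For reference, the classical RUSLE relation that the extended model builds on is written below; the additional stoniness correction is shown generically as St, which is a notational assumption used here only to illustrate the kind of extension described, not the authors' exact formulation.

    ```latex
    % Mean annual soil loss A (t ha^{-1} yr^{-1}) from rainfall erosivity R, soil
    % erodibility K, slope length and steepness L and S, cover management C and
    % support practice P; St denotes a generic stoniness correction (assumed notation).
    A = R \cdot K \cdot L \cdot S \cdot C \cdot P \cdot St
    ```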

  13. A reproducible brain tumour model established from human glioblastoma biopsies

    Directory of Open Access Journals (Sweden)

    Li Xingang

    2009-12-01

    Background: Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. Methods: In this work, we collected data on growth kinetics from 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. Results: The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. Conclusions: In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  14. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes, and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10-minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5-second duration burns, by day 7 post-burn the 80°C and 90°C scalds showed damage significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  15. Qualitative and quantitative histopathology in transitional cell carcinomas of the urinary bladder. An international investigation of intra- and interobserver reproducibility

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Sasaki, M; Fukuzawa, S

    1994-01-01

    ... a random, systematic sampling scheme. RESULTS: The results were compared by bivariate correlation analyses and Kendall's tau. The international interobserver reproducibility of qualitative gradings was rather poor (kappa = 0.51), especially for grade 2 tumors (kappa = 0.28). Likewise, the interobserver ... 0.54). This can probably be related to the manual design of the sampling scheme and may be solved by introducing a motorized object stage in the systematic selection of fields of vision for quantitative measurements. However, the nuclear mean size estimators are unaffected by such sampling variability ... of both qualitative and quantitative grading methods. Grading of malignancy was performed by one observer in Japan (using the World Health Organization scheme), and by two observers in Denmark (using the Bergkvist system). A "translation" between the systems, grade for grade, and kappa statistics were ...

  16. Reproducibility of qualitative assessments of temporal lobe atrophy in MRI studies.

    Science.gov (United States)

    Sarria-Estrada, S; Acevedo, C; Mitjana, R; Frascheri, L; Siurana, S; Auger, C; Rovira, A

    2015-01-01

    The aim was to determine the reproducibility of the Scheltens visual rating scale in establishing atrophy of the medial temporal lobe. We used coronal T1-weighted inversion recovery sequences on a 1.5 Tesla MRI scanner to study 25 patients with clinically diagnosed Alzheimer's disease or mild cognitive decline and 25 subjects without cognitive decline. Five neuroradiologists trained to apply the Scheltens visual rating scale analyzed the images. We used the intraclass correlation coefficient to evaluate interrater and intrarater agreement. Raters scored 20 (80%) of the 25 patients with mild cognitive decline or Alzheimer's disease between 2 and 4; by contrast, they scored 21 (84%) of the 25 subjects without cognitive decline between 0 and 1. The interrater agreement was consistently greater than 0.82, with a 95% confidence interval of (0.7-0.9). The intrarater agreement ranged from 0.82 to 0.87, with a 95% confidence interval of (0.56-0.93). The Scheltens visual rating scale is reproducible among observers, and this finding supports its use in clinical practice.

  17. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    The orthotopic osteosarcoma model was shown to be feasible: the take rate was high, surgical mortality was negligible and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement and may be recommended for monitoring tumor growth in this model.

  18. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    Science.gov (United States)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first-order equation, which can be solved analytically. Moreover, we endeavored to algorithmically and heuristically recreate the processes and to construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from medicinal plants.
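
    The reduction to a first-order equation mentioned above admits a closed-form solution; a generic form is sketched below, where x stands for the concentration of the oxidizable substrate and k for an effective rate constant. This is an illustrative simplification, not the authors' exact reduced equation.

    ```latex
    \frac{\mathrm{d}x}{\mathrm{d}t} = -k\,x
    \quad\Longrightarrow\quad
    x(t) = x_0\, e^{-kt}
    ```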

  19. Using a 1-D model to reproduce diurnal SST signals

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob L.

    2014-01-01

    The diurnal variability of SST has been extensively studied, as it poses challenges for validating and calibrating satellite sensors, merging SST time series, and for oceanic and atmospheric modelling. As heat is significantly trapped close to the surface, the diurnal signal's maximum amplitude is best captured by radiometers. The availability of infra-red retrievals from a geostationary orbit allows the hourly monitoring of the diurnal SST evolution. When infra-red SSTs are validated with in situ measurements, a general mismatch is found, associated with the different reference depth of each type of measurement. A generally preferred approach to bridge the gap between in situ and remotely obtained measurements is through modelling of the upper-ocean temperature. This ESA-supported study focuses on the implementation of the 1-dimensional General Ocean Turbulence Model (GOTM) in order to resolve ...

  20. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    Science.gov (United States)

    Turner, M. A.

    2015-12-01

    Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of

  1. Controller Synthesis using Qualitative Models and Constraints

    OpenAIRE

    Ramamoorthy, Subramanian; Kuipers, Benjamin J

    2004-01-01

    Many engineering systems require the synthesis of global behaviors in nonlinear dynamical systems. Multiple model approaches to control design make it possible to synthesize robust and optimal versions of such global behaviors. We propose a methodology called Qualitative Heterogeneous Control that enables this type of control design. This methodology is based on a separation of concerns between qualitative correctness and quantitative optimization. Qualitative sufficient conditions are derived ...

  2. Qualitative models for space system engineering

    Science.gov (United States)

    Forbus, Kenneth D.

    1990-01-01

    The objectives of this project were: (1) to investigate the implications of qualitative modeling techniques for problems arising in the monitoring, diagnosis, and design of Space Station subsystems and procedures; (2) to identify the issues involved in using qualitative models to enhance and automate engineering functions. These issues include representing operational criteria, fault models, alternate ontologies, and modeling continuous signals at a functional level of description; and (3) to develop a prototype collection of qualitative models for fluid and thermal systems commonly found in Space Station subsystems. Potential applications of qualitative modeling to space-systems engineering, including the notion of intelligent computer-aided engineering, are summarized. Emphasis is given to determining which systems of the proposed Space Station provide the most leverage for study, given the current state of the art. Progress on using qualitative models, including development of the molecular collection ontology for reasoning about fluids, the interaction of qualitative and quantitative knowledge in analyzing thermodynamic cycles, and an experiment on building a natural language interface to qualitative reasoning, is reported. Finally, some recommendations are made for future research.

  3. COMBINE archive and OMEX format : One file to share all information to reproduce a modeling project

    NARCIS (Netherlands)

    Bergmann, Frank T.; Olivier, Brett G.; Soiland-Reyes, Stian

    2014-01-01

    Background: With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models,

  4. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation will target transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than its reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between the training and validation cohorts for one of the three tested models [Area under the Receiver Operating Curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two out of three models had a lower AUC for validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences).
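
    The cohort differences model amounts to training a classifier to distinguish training-cohort from validation-cohort patients and reading its AUC as a measure of cohort difference. A minimal sketch of that idea is given below with synthetic feature matrices; the feature values, classifier choice and cross-validation setup are illustrative assumptions, not the study's implementation.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    # X_train, X_valid: patient feature matrices of the two cohorts (synthetic here).
    rng = np.random.default_rng(0)
    X_train = rng.normal(0.0, 1.0, size=(120, 5))
    X_valid = rng.normal(0.3, 1.0, size=(80, 5))

    X = np.vstack([X_train, X_valid])
    y = np.r_[np.zeros(len(X_train)), np.ones(len(X_valid))]  # 1 = validation cohort

    # Cross-validated probability of belonging to the validation cohort.
    p = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=5,
                          method="predict_proba")[:, 1]
    print(f"cohort differences AUC: {roc_auc_score(y, p):.2f}")
    # AUC close to 0.5 -> similar cohorts (validation of reproducibility);
    # AUC well above 0.5 -> different cohorts (validation of transferability).
    ```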

  5. Learning Actions Models: Qualitative Approach

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2015-01-01

    In dynamic epistemic logic, actions are described using action models. In this paper we introduce a framework for studying learnability of action models from observations. We present first results concerning propositional action models. First we check two basic learnability criteria: finite identifiability ...

  6. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.
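
    The POD concept models the probability of a positive result as a continuous function of analyte concentration. The sketch below fits a logistic POD curve to hypothetical single-laboratory data; the curve form, concentrations and counts are illustrative assumptions, not the harmonized procedure defined in the paper. In practice, repeatability and reproducibility would be estimated from replicate and inter-laboratory POD data rather than a single curve fit.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def pod_curve(c, c50, slope):
        """Logistic probability-of-detection curve as a function of concentration c."""
        return 1.0 / (1.0 + np.exp(-slope * (np.log10(c) - np.log10(c50))))

    # Hypothetical validation data: spike concentrations (CFU/g), positives, replicates.
    conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    positives = np.array([2, 5, 9, 12, 12])
    replicates = np.array([12, 12, 12, 12, 12])
    pod_observed = positives / replicates

    params, _ = curve_fit(pod_curve, conc, pod_observed, p0=[2.0, 2.0])
    print("estimated c50 and slope:", params)
    ```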

  7. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    Science.gov (United States)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, the question is raised by industry and AM users of how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper aims to investigate and evaluate the repeatability and reproducibility of the FDM process through a systematic approach to answer this frequently asked question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in the parts printed by the FDM process. After running the simulation and analyzing the data, the FDM process capability is evaluated, which would help industry better understand the performance of FDM technology.
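
    The variance decomposition underlying a crossed gage R&R study is summarized below in its standard textbook form; it is quoted here for orientation only and is not the paper's specific ANOVA table.

    ```latex
    \sigma^2_{\mathrm{total}} = \sigma^2_{\mathrm{part}}
      + \underbrace{\sigma^2_{\mathrm{repeatability}}
      + \sigma^2_{\mathrm{reproducibility}}}_{\sigma^2_{\mathrm{GRR}}},
    \qquad
    \%\mathrm{GRR} = 100\,\frac{\sigma_{\mathrm{GRR}}}{\sigma_{\mathrm{total}}}
    ```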

  8. Qualitative models of global warming amplifiers

    NARCIS (Netherlands)

    Milošević, U.; Bredeweg, B.; de Kleer, J.; Forbus, K.D.

    2010-01-01

    There is growing interest from ecological experts to create qualitative models of phenomena for which numerical information is sparse or missing. We present a number of successful models in the field of environmental science, namely, the domain of global warming. The motivation behind the effort is

  9. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  10. Anatomical Reproducibility of a Head Model Molded by a Three-dimensional Printer.

    Science.gov (United States)

    Kondo, Kosuke; Nemoto, Masaaki; Masuda, Hiroyuki; Okonogi, Shinichi; Nomoto, Jun; Harada, Naoyuki; Sugo, Nobuo; Miyazaki, Chikao

    2015-01-01

    We prepared rapid prototyping models of heads with unruptured cerebral aneurysm based on image data of computed tomography angiography (CTA) using a three-dimensional (3D) printer. The objective of this study was to evaluate the anatomical reproducibility and accuracy of these models by comparison with the CTA images on a monitor. The subjects were 22 patients with unruptured cerebral aneurysm who underwent preoperative CTA. Reproducibility of the microsurgical anatomy of skull bone and arteries, the length and thickness of the main arteries, and the size of cerebral aneurysm were compared between the CTA image and rapid prototyping model. The microsurgical anatomy and arteries were favorably reproduced, apart from a few minute regions, in the rapid prototyping models. No significant difference was noted in the measured lengths of the main arteries between the CTA image and rapid prototyping model, but errors were noted in their thickness. It was concluded that these models are useful tools for neurosurgical simulation. The thickness of the main arteries and size of cerebral aneurysm should be judged comprehensively, together with other neuroimaging, in consideration of these errors.

  11. The Accuracy and Reproducibility of Linear Measurements Made on CBCT-derived Digital Models.

    Science.gov (United States)

    Maroua, Ahmad L; Ajaj, Mowaffak; Hajeer, Mohammad Y

    2016-04-01

    The aim was to evaluate the accuracy and reproducibility of linear measurements made on cone-beam computed tomography (CBCT)-derived digital models. A total of 25 patients (44% female, 18.7 ± 4 years) who had CBCT images for diagnostic purposes were included. Plaster models were obtained and digital models were extracted from CBCT scans. Seven linear measurements from predetermined landmarks were measured and analyzed on plaster models and the corresponding digital models. The measurements included arch length and width at different sites. Paired t test and Bland-Altman analysis were used to evaluate the accuracy of measurements on digital models compared to the plaster models. Also, intraclass correlation coefficients (ICCs) were used to evaluate the reproducibility of the measurements in order to assess the intraobserver reliability. The statistical analysis showed significant differences in 5 out of 14 variables, and the mean differences ranged from -0.48 to 0.51 mm. The Bland-Altman analysis revealed that the mean differences were (0.14 ± 0.56) and (0.05 ± 0.96) mm and the limits of agreement between the two methods ranged from -1.2 to 0.96 and from -1.8 to 1.9 mm in the maxilla and the mandible, respectively. The intraobserver reliability values were determined for all 14 variables on the two types of models separately. The mean ICC value for the plaster models was 0.984 (0.924-0.999), while it was 0.946 for the CBCT models (range from 0.850 to 0.985). Linear measurements obtained from the CBCT-derived models appeared to have a high level of accuracy and reproducibility.
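
    The bias and limits of agreement reported above come from a standard Bland-Altman calculation on paired measurements. A minimal sketch is shown below with hypothetical paired arch-width values; the numbers are not taken from the study.

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Return the bias (mean difference) and 95% limits of agreement
        between two sets of paired measurements (e.g. plaster vs. digital)."""
        diff = np.asarray(a) - np.asarray(b)
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Hypothetical paired arch-width measurements in mm.
    plaster = np.array([35.2, 36.1, 34.8, 35.9, 36.4])
    digital = np.array([35.0, 36.3, 34.9, 35.6, 36.8])
    print(bland_altman(plaster, digital))
    ```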

  12. A Qualitative Acceleration Model Based on Intervals

    Directory of Open Access Journals (Sweden)

    Ester MARTINEZ-MARTIN

    2013-08-01

    On the way to autonomous service robots, spatial reasoning plays a major role, since it properly deals with problems involving uncertainty. In particular, we are interested in knowing people's pose to avoid collisions. With that aim, in this paper, we present a qualitative acceleration model for robotic applications, including representation, reasoning, and a practical application.

  13. Qualitative modeling of the dynamics of detonations with losses

    KAUST Repository

    Faria, Luiz; Kasimov, Aslan R.

    2015-01-01

    We consider a simplified model for the dynamics of one-dimensional detonations with generic losses. It consists of a single partial differential equation that reproduces, at a qualitative level, the essential properties of unsteady detonation waves, including pulsating and chaotic solutions. In particular, we investigate the effects of shock curvature and friction losses on detonation dynamics. To calculate steady-state solutions, a novel approach to solving the detonation eigenvalue problem is introduced that avoids the well-known numerical difficulties associated with the presence of a sonic point. By using unsteady numerical simulations of the simplified model, we also explore the nonlinear stability of steady-state or quasi-steady solutions.

  14. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    International Nuclear Information System (INIS)

    Tian, Jun-fang; Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao

    2012-01-01

    Velocity effect and critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown has been explained. Various congested patterns induced by the on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and of free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► The velocity effect is added to the average space gap cellular automaton model. ► The physics of traffic breakdown has been explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.
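
    The average space gap model itself is not reproduced here; as a generic illustration of how cellular automaton traffic models encode acceleration, braking and stochastic slowdown rules, the sketch below implements the classic Nagel-Schreckenberg update on a ring road (a different, well-known model). All parameters are illustrative.

    ```python
    import numpy as np

    def nasch_step(pos, vel, road_len, rng, v_max=5, p_slow=0.3):
        """One Nagel-Schreckenberg update: accelerate, brake to the gap ahead,
        random slowdown, then move (parallel update on a ring road)."""
        order = np.argsort(pos)
        pos, vel = pos[order], vel[order]
        gaps = (np.roll(pos, -1) - pos - 1) % road_len   # empty cells ahead of each car
        vel = np.minimum(vel + 1, v_max)                 # acceleration
        vel = np.minimum(vel, gaps)                      # braking to avoid collisions
        slow = rng.random(len(vel)) < p_slow             # random slowdown
        vel = np.where(slow, np.maximum(vel - 1, 0), vel)
        return (pos + vel) % road_len, vel

    # Hypothetical ring road: 100 cells, 20 cars, 50 update steps.
    road_len, n_cars = 100, 20
    rng = np.random.default_rng(1)
    pos = np.sort(rng.choice(road_len, size=n_cars, replace=False))
    vel = np.zeros(n_cars, dtype=int)
    for _ in range(50):
        pos, vel = nasch_step(pos, vel, road_len, rng)
    print("mean speed:", vel.mean())
    ```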

  15. NRFixer: Sentiment Based Model for Predicting the Fixability of Non-Reproducible Bugs

    Directory of Open Access Journals (Sweden)

    Anjali Goyal

    2017-08-01

    Software maintenance is an essential step in the software development life cycle. Nowadays, software companies spend approximately 45% of total cost on maintenance activities. Large software projects maintain bug repositories to collect, organize and resolve bug reports. Sometimes it is difficult to reproduce the reported bug with the information present in a bug report, and thus the bug is marked with resolution non-reproducible (NR). When NR bugs are reconsidered, a few of them might get fixed (NR-to-fix), leaving the others with the same resolution (NR). To analyse the behaviour of developers towards NR-to-fix and NR bugs, sentiment analysis of NR bug report textual contents has been conducted. The sentiment analysis of bug reports shows that NR bugs' sentiments incline towards more negativity than those of reproducible bugs. Also, there is a noticeable opinion drift found in the sentiments of NR-to-fix bug reports. Observations drawn from this analysis were an inspiration to develop a model that can judge the fixability of NR bugs. Thus a framework, NRFixer, which predicts the probability of NR bug fixation, is proposed. NRFixer was evaluated along two dimensions. The first dimension considers meta-fields of bug reports (model-1), and the other dimension additionally incorporates the sentiments of developers (model-2) for prediction. Both models were compared using various machine learning classifiers (Zero-R, naive Bayes, J48, random tree and random forest). The bug reports of the Firefox and Eclipse projects were used to test NRFixer. In the Firefox and Eclipse projects, J48 and naive Bayes classifiers achieved the best prediction accuracy, respectively. It was observed that the inclusion of sentiments in the prediction model yields a rise in prediction accuracy ranging from 2 to 5% for the various classifiers.

  16. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    Science.gov (United States)

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
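
    A minimal sketch of what packing such an archive can look like in plain Python is given below. The manifest layout and format identifiers follow the OMEX convention as described above, but the exact strings and placeholder file contents are assumptions for illustration; real projects would normally rely on a dedicated library such as libCOMBINE rather than writing the ZIP by hand.

    ```python
    import zipfile

    # Placeholder contents; in practice these would be real SBML and SED-ML files.
    model_xml = "<sbml/>"
    simulation_sedml = "<sedML/>"

    manifest = """<?xml version="1.0" encoding="UTF-8"?>
    <omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
      <content location="." format="http://identifiers.org/combine.specifications/omex"/>
      <content location="./model.xml"
               format="http://identifiers.org/combine.specifications/sbml"/>
      <content location="./simulation.sedml"
               format="http://identifiers.org/combine.specifications/sed-ml"/>
    </omexManifest>
    """

    with zipfile.ZipFile("experiment.omex", "w", zipfile.ZIP_DEFLATED) as omex:
        omex.writestr("manifest.xml", manifest)              # manifest listing the content
        omex.writestr("model.xml", model_xml)                # the model itself
        omex.writestr("simulation.sedml", simulation_sedml)  # the simulation description
    ```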

  17. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Being represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated from the potential of such a combination in safety analysis based on models comprising both software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  18. A novel, comprehensive, and reproducible porcine model for determining the timing of bruises in forensic pathology

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2016-01-01

    Purpose: Calculating the timing of bruises is crucial in forensic pathology but is a challenging discipline in both human and veterinary medicine. A mechanical device for inflicting bruises in pigs was developed and validated, and the pathological reactions in the bruises were studied over time ... -dependent response. Combining these parameters, bruises could be grouped as being either less than 4 h old or between 4 and 10 h of age. Gross lesions and changes in the epidermis and dermis were inconclusive with respect to time determination. Conclusions: The model was reproducible and resembled forensic cases ...

  19. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    Directory of Open Access Journals (Sweden)

    Marit Kramski

    The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bio-terroristic potential. Moreover, zoonotic infections with Cowpox virus (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also via the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after onset of symptoms, even when very low infectious viral doses of 5x10^2 pfu were applied intranasally. Infectious virus was demonstrated in blood, saliva and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation mimicking the natural route of smallpox infection led to reproducible infection. In vivo titration resulted in an MID50 (minimal monkey infectious dose 50%) of 8.3x10^2 pfu of calpox virus, which is approximately 10,000-fold lower than the MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs. Furthermore, this model can help study mechanisms of OPV pathogenesis.

  20. Improving the Pattern Reproducibility of Multiple-Point-Based Prior Models Using Frequency Matching

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2014-01-01

    Some multiple-point-based sampling algorithms, such as the snesim algorithm, rely on sequential simulation. The conditional probability distributions that are used for the simulation are based on statistics of multiple-point data events obtained from a training image. During the simulation, data events with zero probability in the training image statistics may occur. This is handled by pruning the set of conditioning data until an event with non-zero probability is found. The resulting probability distribution sampled by such algorithms is a pruned mixture model. The pruning strategy leads to a probability distribution that lacks some of the information provided by the multiple-point statistics from the training image, which reduces the reproducibility of the training image patterns in the outcome realizations. When pruned mixture models are used as prior models for inverse problems, local re...

  1. Reproducing the nonlinear dynamic behavior of a structured beam with a generalized continuum model

    Science.gov (United States)

    Vila, J.; Fernández-Sáez, J.; Zaera, R.

    2018-04-01

    In this paper we study the coupled axial-transverse nonlinear vibrations of a kind of one-dimensional structured solid by application of the so-called Inertia Gradient Nonlinear continuum model. To show the accuracy of this axiomatic model, previously proposed by the authors, its predictions are compared with numerical results from a previously defined finite discrete chain of lumped masses and springs, for several numbers of particles. A continualization of the discrete model equations based on Taylor series allowed us to set equivalent values of the mechanical properties in both the discrete and the axiomatic continuum models. Contrary to the classical continuum model, the inertia gradient nonlinear continuum model used herein is able to capture scale effects, which arise for modes in which the wavelength is comparable to the characteristic distance of the structured solid. The main conclusion of the work is that the proposed generalized continuum model captures the scale effects in both the linear and nonlinear regimes, reproducing the behavior of the 1D nonlinear discrete model adequately.
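    A minimal sketch of the kind of lumped-mass reference model the continuum predictions are compared against: a chain of point masses coupled by springs, integrated in time. The parameter values are illustrative, only the linear spring coupling is shown, and the axial-transverse nonlinear coupling of the paper is not included:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    N, m, k = 20, 1.0, 100.0       # number of particles, mass, stiffness (illustrative)

    def rhs(t, y):
        u, v = y[:N], y[N:]
        a = np.zeros(N)
        a[1:-1] = k / m * (u[2:] - 2.0 * u[1:-1] + u[:-2])   # nearest-neighbour spring forces
        a[0] = 0.0                                           # clamped end
        a[-1] = k / m * (u[-2] - u[-1])                      # free end couples to one neighbour
        return np.concatenate([v, a])

    u0 = np.sin(np.pi * np.arange(N) / N)                    # initial displacement shape
    sol = solve_ivp(rhs, (0.0, 2.0), np.concatenate([u0, np.zeros(N)]), max_step=1e-2)
    print(sol.y[:N, -1])                                     # particle displacements at the final time
    ```

    Increasing N while rescaling the mass and stiffness is the discrete-to-continuum limit in which the scale effects captured by the gradient-enriched model become small.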

  2. Mouse Models of Diet-Induced Nonalcoholic Steatohepatitis Reproduce the Heterogeneity of the Human Disease

    Science.gov (United States)

    Machado, Mariana Verdelho; Michelotti, Gregory Alexander; Xie, Guanhua; de Almeida, Thiago Pereira; Boursier, Jerome; Bohnic, Brittany; Guy, Cynthia D.; Diehl, Anna Mae

    2015-01-01

    Background and aims Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Methods Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. Results The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Conclusion Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH. PMID:26017539

  3. Mouse models of diet-induced nonalcoholic steatohepatitis reproduce the heterogeneity of the human disease.

    Directory of Open Access Journals (Sweden)

    Mariana Verdelho Machado

    Full Text Available Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH.

  4. Reproducibility analysis of measurements with a mechanical semiautomatic eye model for evaluation of intraocular lenses

    Science.gov (United States)

    Rank, Elisabet; Traxler, Lukas; Bayer, Natascha; Reutterer, Bernd; Lux, Kirsten; Drauschke, Andreas

    2014-03-01

    Mechanical eye models are used to validate ex vivo the optical quality of intraocular lenses (IOLs). The quality measurement and test instructions for IOLs are defined in ISO 11979-2. However, it has been noted in the literature that these test instructions could lead to inaccurate measurements for some modern IOL designs. The reproducibility of the alignment and measurement processes is presented, performed with a semiautomatic mechanical ex vivo eye model based on the optical properties published by Liou and Brennan at a 1:1 scale. The cornea, the iris aperture and the IOL itself are separately changeable within the eye model. The adjustment of the IOL can be manipulated by automatic decentration and tilt of the IOL in reference to the optical axis of the whole system, which is defined by the line connecting the central point of the artificial cornea and the iris aperture. With the presented measurement setup two quality criteria are measurable: the modulation transfer function (MTF) and the Strehl ratio. First, the reproducibility of the alignment process for defining the initial conditions of lateral position and tilt in reference to the optical axis of the system is investigated. Furthermore, different IOL holders are tested with respect to stable holding of the IOL. The measurement is performed by a before-after comparison of the lens position using a typical decentration and tilt tolerance analysis path. The modulation transfer function MTF and Strehl ratio S before and after this tolerance analysis are compared, and requirements for lens holder construction are deduced from the presented results.

  5. The construction of a two-dimensional reproducing kernel function and its application in a biomedical model.

    Science.gov (United States)

    Guo, Qi; Shen, Shu-Ting

    2016-04-29

    There are two major classes of cardiac tissue models: the ionic model and the FitzHugh-Nagumo model. During computer simulation, each model entails solving a system of complex ordinary differential equations and a partial differential equation with non-flux boundary conditions. The reproducing kernel method possesses significant applications in solving partial differential equations. The derivative of the reproducing kernel function is a wavelet function, which has local properties and sensitivity to singularity. Therefore, study of the application of the reproducing kernel would be advantageous. The objective is to apply new mathematical theory to the numerical solution of the ventricular muscle model so as to improve its precision in comparison with other methods presently available. A two-dimensional reproducing kernel function in space is constructed and applied in computing the solution of the two-dimensional cardiac tissue model by means of the difference method through time and the reproducing kernel method through space. Compared with other methods, this method holds several advantages such as high accuracy in computing solutions, insensitivity to different time steps and a slow propagation speed of error. It is suitable for disorderly scattered node systems without meshing, and can arbitrarily change the location and density of the solution on different time layers. The reproducing kernel method has higher solution accuracy and stability in the solutions of the two-dimensional cardiac tissue model.

  6. Why are models unable to reproduce multi-decadal trends in lower tropospheric baseline ozone levels?

    Science.gov (United States)

    Hu, L.; Liu, J.; Mickley, L. J.; Strahan, S. E.; Steenrod, S.

    2017-12-01

    Assessments of tropospheric ozone radiative forcing rely on accurate model simulations. Parrish et al (2014) found that three chemistry-climate models (CCMs) overestimate present-day O3 mixing ratios and capture only 50% of the observed O3 increase over the last five decades at 12 baseline sites in the northern mid-latitudes, indicating large uncertainties in our understanding of the ozone trends and their implications for radiative forcing. Here we present comparisons of outputs from two chemical transport models (CTMs) - GEOS-Chem and the Global Modeling Initiative model - with O3 observations from the same sites and from the global ozonesonde network. Both CTMs are driven by reanalysis meteorological data (MERRA or MERRA2) and thus are expected to be different in atmospheric transport processes relative to those freely running CCMs. We test whether recent model developments leading to more active ozone chemistry affect the computed ozone sensitivity to perturbations in emissions. Preliminary results suggest these CTMs can reproduce present-day ozone levels but fail to capture the multi-decadal trend since 1980. Both models yield widespread overpredictions of free tropospheric ozone in the 1980s. Sensitivity studies in GEOS-Chem suggest that the model estimate of natural background ozone is too high. We discuss factors that contribute to the variability and trends of tropospheric ozone over the last 30 years, with a focus on intermodel differences in spatial resolution and in the representation of stratospheric chemistry, stratosphere-troposphere exchange, halogen chemistry, and biogenic VOC emissions and chemistry. We also discuss uncertainty in the historical emission inventories used in models, and how these affect the simulated ozone trends.

  7. Reproducing tailing in breakthrough curves: Are statistical models equally representative and predictive?

    Science.gov (United States)

    Pedretti, Daniele; Bianchi, Marco

    2018-03-01

    Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models for both the empirical fitting of these curves, and the prediction of transport using upscaling models based on best-fitted estimated parameters (e.g. the power law slope or exponent). The predictive capacity of power law based upscaling models can be however questioned due to the difficulties to link model parameters with the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implication of statistical subsampling, which often renders power laws undistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulties to reconcile fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (1.5 tailing becomes heavier. Strong fluctuations occur when the number of samples is limited, due to the effects of subsampling. On the other hand, when the power law model embeds a cutoff (PLCO), the best-fitted exponent (αCO) is insensitive to the degree of tailing and to the effects of subsampling and tends to a constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated to the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple
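    A minimal sketch of fitting the two late-time tail models discussed above, a pure power law (PL) and a power law with an exponential cutoff (PLCO), to a synthetic breakthrough-curve tail; the data, noise model and starting values are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def pl(t, c, alpha):
        return c * t ** (-alpha)                      # pure power-law tail

    def plco(t, c, alpha, lam):
        return c * t ** (-alpha) * np.exp(-lam * t)   # power law with late-time cutoff rate lam

    rng = np.random.default_rng(3)
    t = np.logspace(0, 3, 60)
    btc = plco(t, 1.0, 1.1, 2e-3) * rng.lognormal(0.0, 0.05, t.size)   # noisy synthetic tail

    p_pl, _ = curve_fit(pl, t, btc, p0=[1.0, 1.5])
    p_plco, _ = curve_fit(plco, t, btc, p0=[1.0, 1.5, 1e-3])
    print("PL slope  :", round(p_pl[1], 2))
    print("PLCO slope:", round(p_plco[1], 2), " cutoff rate:", p_plco[2])
    ```

    Comparing the two fitted slopes on subsampled data is one way to see the sensitivity of the PL exponent, versus the stability of the PLCO exponent, that the abstract describes.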

  8. Demography-based adaptive network model reproduces the spatial organization of human linguistic groups

    Science.gov (United States)

    Capitán, José A.; Manrubia, Susanna

    2015-12-01

    The distribution of human linguistic groups presents a number of interesting and nontrivial patterns. The distributions of the number of speakers per language and the area each group covers follow log-normal distributions, while population and area fulfill an allometric relationship. The topology of networks of spatial contacts between different linguistic groups has been recently characterized, showing atypical properties of the degree distribution and clustering, among others. Human demography, spatial conflicts, and the construction of networks of contacts between linguistic groups are mutually dependent processes. Here we introduce an adaptive network model that takes all of them into account and successfully reproduces, using only four model parameters, not only those features of linguistic groups already described in the literature, but also correlations between demographic and topological properties uncovered in this work. Besides their relevance when modeling and understanding processes related to human biogeography, our adaptive network model admits a number of generalizations that broaden its scope and make it suitable to represent interactions between agents based on population dynamics and competition for space.

  9. Diagnostic reasoning using qualitative causal models

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1992-01-01

    The application of expert systems to reasoning problems involving real-time data from plant measurements has been a topic of much research, but few practical systems have been deployed. One obstacle to wider use of expert systems in applications involving real-time data is the lack of adequate knowledge representation methodologies for dynamic processes. Knowledge bases composed mainly of rules have disadvantages when applied to dynamic processes and real-time data. This paper describes a methodology for the development of qualitative causal models that can be used as knowledge bases for reasoning about process dynamic behavior. These models provide a systematic method for knowledge base construction, considerably reducing the engineering effort required. They also offer much better opportunities for verification and validation of the knowledge base, thus increasing the possibility of the application of expert systems to reasoning about mission critical systems. Starting with the Signed Directed Graph (SDG) method that has been successfully applied to describe the behavior of diverse dynamic processes, the paper shows how certain non-physical behaviors that result from abstraction may be eliminated by applying causal constraint to the models. The resulting Extended Signed Directed Graph (ESDG) may then be compiled to produce a model for use in process fault diagnosis. This model based reasoning methodology is used in the MOBIAS system being developed by Duke Power Company under EPRI sponsorship. 15 refs., 4 figs

  10. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physics system are usually constructed based on the stochastic models, which have both qualitative and quantitative characteristics inherently. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with the expert knowledge, uncertain reasoning, and other qualitative information, a qualitative and quantitative combined modeling specification was proposed based on a hierarchical model structure framework. The new modeling approach is based on a hierarchical model structure which includes the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The result shows that the proposed method can describe the complex system more comprehensively, and the survival probability of the target is higher by introducing qualitative models into quantitative simulation.

  11. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    Science.gov (United States)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting agent hypothesis instead of the widely-used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time-scales. We find also that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturb the financial system.
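    A minimal sketch of the quantity being modelled above, the volatility return interval, i.e., the waiting time between successive events where absolute returns exceed a threshold q; the return series here is synthetic and heavy-tailed, and all names are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    returns = rng.standard_t(df=3, size=100_000)      # heavy-tailed synthetic return series
    volatility = np.abs(returns)

    q = np.quantile(volatility, 0.95)                 # threshold defining volatility "events"
    event_times = np.flatnonzero(volatility > q)
    intervals = np.diff(event_times)                  # volatility return intervals

    mean_tau = intervals.mean()
    scaled = intervals / mean_tau                     # scaling variable tau / <tau>
    print(f"mean interval: {mean_tau:.1f} steps, P(tau/<tau> > 3) = {(scaled > 3).mean():.3f}")
    ```

    In the empirical analyses the abstract refers to, the distribution of the scaled intervals collapses across assets and time scales; the sketch only shows how the intervals themselves are extracted.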

  12. A stable and reproducible human blood-brain barrier model derived from hematopoietic stem cells.

    Directory of Open Access Journals (Sweden)

    Romeo Cecchelli

    Full Text Available The human blood-brain barrier (BBB) is a selective barrier formed by human brain endothelial cells (hBECs), which is important to ensure adequate neuronal function and protect the central nervous system (CNS) from disease. The development of human in vitro BBB models is thus of utmost importance for drug discovery programs related to CNS diseases. Here, we describe a method to generate a human BBB model using cord blood-derived hematopoietic stem cells. The cells were initially differentiated into ECs followed by the induction of BBB properties by co-culture with pericytes. The brain-like endothelial cells (BLECs) express tight junctions and transporters typically observed in brain endothelium and maintain expression of most in vivo BBB properties for at least 20 days. The model is very reproducible since it can be generated from stem cells isolated from different donors and in different laboratories, and could be used to predict CNS distribution of compounds in human. Finally, we provide evidence that the Wnt/β-catenin signaling pathway mediates in part the BBB inductive properties of pericytes.

  13. How well do CMIP5 Climate Models Reproduce the Hydrologic Cycle of the Colorado River Basin?

    Science.gov (United States)

    Gautam, J.; Mascaro, G.

    2017-12-01

    The Colorado River, which is the primary source of water for nearly 40 million people in the arid Southwestern states of the United States, has been experiencing an extended drought since 2000, which has led to a significant reduction in water supply. As the water demands increase, one of the major challenges for water management in the region has been the quantification of uncertainties associated with streamflow predictions in the Colorado River Basin (CRB) under potential changes of future climate. Hence, testing the reliability of model predictions in the CRB is critical in addressing this challenge. In this study, we evaluated the performances of 17 General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase Five (CMIP5) and 4 Regional Climate Models (RCMs) in reproducing the statistical properties of the hydrologic cycle in the CRB. We evaluated the water balance components at four nested sub-basins along with the inter-annual and intra-annual changes of precipitation (P), evaporation (E), runoff (R) and temperature (T) from 1979 to 2005. Most of the models captured the net water balance fairly well in the most-upstream basin but simulated a weak hydrological cycle in the evaporation channel at the downstream locations. The simulated monthly variability of P had different patterns, with correlation coefficients ranging from -0.6 to 0.8 depending on the sub-basin and the models from same parent institution clustering together. Apart from the most-upstream sub-basin where the models were mainly characterized by a negative seasonal bias in SON (of up to -50%), most of them had a positive bias in all seasons (of up to +260%) in the other three sub-basins. The models, however, captured the monthly variability of T well at all sites with small inter-model variabilities and a relatively similar range of bias (-7 °C to +5 °C) across all seasons. Mann-Kendall test was applied to the annual P and T time-series where majority of the models
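    A minimal sketch of the Mann-Kendall trend test mentioned above, implemented directly (S statistic and its normal approximation without tie correction, which is an assumption; the study's exact implementation is not specified):

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        x = np.asarray(x, float)
        n = x.size
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0              # variance of S, no tie correction
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return s, z, 2.0 * (1.0 - norm.cdf(abs(z)))           # S, Z, two-sided p-value

    annual_t = np.array([7.1, 7.3, 7.0, 7.4, 7.6, 7.5, 7.9, 8.0, 7.8, 8.2])   # toy annual means
    print(mann_kendall(annual_t))
    ```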

  14. Reproducibility of summertime diurnal precipitation over northern Eurasia simulated by CMIP5 climate models

    Science.gov (United States)

    Hirota, N.; Takayabu, Y. N.

    2015-12-01

    The reproducibility of summertime diurnal precipitation over northern Eurasia simulated by CMIP5 climate models in their historical runs was evaluated in comparison with station data (NCDC-9813) and satellite data (GSMaP-V5). We first calculated diurnal cycles by averaging precipitation at each local solar time (LST) in June-July-August during 1981-2000 over the continent of northern Eurasia (0-180E, 45-90N). We then examined the occurrence time of maximum precipitation and the contribution of diurnally varying precipitation to the total precipitation. The contribution of diurnal precipitation was about 21% in both NCDC-9813 and GSMaP-V5. The maximum precipitation occurred at 18LST in NCDC-9813 but 16LST in GSMaP-V5, indicating some uncertainty even in the observational datasets. The diurnal contribution of the CMIP5 models varied widely from 11% to 62%, and their timing of the precipitation maximum ranged from 11LST to 20LST. Interestingly, the contribution and the timing had a strong negative correlation of -0.65: the models with larger diurnal precipitation showed the precipitation maximum earlier, around noon. Next, we compared the sensitivity of precipitation to surface temperature and tropospheric humidity between 5 models with large diurnal precipitation (LDMs) and 5 models with small diurnal precipitation (SDMs). Precipitation in LDMs showed high sensitivity to surface temperature, indicating its close relationship with local instability. On the other hand, synoptic disturbances were more active in SDMs, with a dominant role of large-scale condensation, and precipitation in SDMs was more related to tropospheric moisture. Therefore, the relative importance of local instability and synoptic disturbances is suggested to be an important factor in determining the contribution and timing of the diurnal precipitation. Acknowledgment: This study is supported by the Green Network of Excellence (GRENE) Program of the Ministry of Education, Culture, Sports, Science and Technology
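    A minimal sketch of the diurnal-cycle diagnostics described above: averaging precipitation by local solar time, finding the hour of the maximum, and estimating the contribution of the diurnal cycle to the total. The data are synthetic hourly values, and the contribution metric used here (positive diurnal anomaly over total) is an assumption; the study's exact definition may differ:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    hours = np.arange(24 * 92)                       # one JJA season of hourly values
    lst = hours % 24
    precip = 0.1 + 0.05 * np.cos(2 * np.pi * (lst - 16) / 24) + rng.exponential(0.02, hours.size)

    diurnal = np.array([precip[lst == h].mean() for h in range(24)])   # mean at each LST
    peak_lst = int(diurnal.argmax())                                   # timing of the maximum
    contribution = (diurnal - diurnal.mean()).clip(min=0).sum() / diurnal.sum()
    print(f"precipitation maximum at {peak_lst} LST, diurnal contribution ~ {contribution:.0%}")
    ```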

  15. Reproducibility and consistency of proteomic experiments on natural populations of a non-model aquatic insect.

    Science.gov (United States)

    Hidalgo-Galiana, Amparo; Monge, Marta; Biron, David G; Canals, Francesc; Ribera, Ignacio; Cieslak, Alexandra

    2014-01-01

    Population proteomics has a great potential to address evolutionary and ecological questions, but its use in wild populations of non-model organisms is hampered by uncontrolled sources of variation. Here we compare the response to temperature extremes of two geographically distant populations of a diving beetle species (Agabus ramblae) using 2-D DIGE. After one week of acclimation in the laboratory under standard conditions, a third of the specimens of each population were placed at either 4 or 27°C for 12 h, with another third left as a control. We then compared the protein expression level of three replicated samples of 2-3 specimens for each treatment. Within each population, variation between replicated samples of the same treatment was always lower than variation between treatments, except for some control samples that retained a wider range of expression levels. The two populations had a similar response, without significant differences in the number of protein spots over- or under-expressed in the pairwise comparisons between treatments. We identified exemplary proteins among those differently expressed between treatments, which proved to be proteins known to be related to thermal response or stress. Overall, our results indicate that specimens collected in the wild are suitable for proteomic analyses, as the additional sources of variation were not enough to mask the consistency and reproducibility of the response to the temperature treatments.

  16. An improved cost-effective, reproducible method for evaluation of bone loss in a rodent model.

    Science.gov (United States)

    Fine, Daniel H; Schreiner, Helen; Nasri-Heir, Cibele; Greenberg, Barbara; Jiang, Shuying; Markowitz, Kenneth; Furgang, David

    2009-02-01

    This study was designed to investigate the utility of two "new" definitions for the assessment of bone loss in a rodent model of periodontitis. Eighteen rats were divided into three groups. Group 1 was infected by Aggregatibacter actinomycetemcomitans (Aa), group 2 was infected with an Aa leukotoxin knock-out, and group 3 received no Aa (controls). Microbial sampling and antibody titres were determined. Initially, two examiners measured the distance from the cemento-enamel junction to the alveolar bone crest using the following three methods: (1) total area of bone loss by radiograph, (2) linear bone loss by radiograph, (3) a direct visual measurement (DVM) of horizontal bone loss. Two "new" definitions were adopted: (1) any site in infected animals showing bone loss >2 standard deviations above the mean seen at that site in control animals was recorded as bone loss; (2) any animal with two or more sites in any quadrant affected by bone loss was considered diseased. Using the "new" definitions, both evaluators independently found that infected animals had significantly more disease than controls (DVM system; p<0.05). The DVM method provides a simple, cost-effective, and reproducible method for studying periodontal disease in rodents.
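    A minimal sketch of the two "new" definitions applied above, assuming site-level cemento-enamel junction to bone crest distances are arranged per quadrant for each animal; the thresholds follow the abstract, the array names and values are illustrative:

    ```python
    import numpy as np

    def site_bone_loss(infected_sites, control_mean, control_sd):
        """Definition 1: a site shows bone loss if it exceeds the control mean
        at that site by more than 2 standard deviations."""
        return infected_sites > control_mean + 2.0 * control_sd

    def animal_diseased(loss_flags_by_quadrant):
        """Definition 2: an animal is diseased if any quadrant has >= 2 affected sites."""
        return any(flags.sum() >= 2 for flags in loss_flags_by_quadrant)

    # toy example: one animal, two quadrants with three measured sites each (mm)
    control_mean = np.array([0.9, 1.0, 1.1])
    control_sd = np.array([0.1, 0.1, 0.1])
    quadrants = [np.array([1.3, 1.4, 1.0]), np.array([1.0, 1.0, 1.2])]
    flags = [site_bone_loss(q, control_mean, control_sd) for q in quadrants]
    print(animal_diseased(flags))   # True: the first quadrant has two affected sites
    ```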

  17. How Qualitative Methods Can be Used to Inform Model Development.

    Science.gov (United States)

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  18. Criminalisation of clients: reproducing vulnerabilities for violence and poor health among street-based sex workers in Canada—a qualitative study

    Science.gov (United States)

    Krüsi, A; Pacey, K; Bird, L; Taylor, C; Chettiar, J; Allan, S; Bennett, D; Montaner, J S; Kerr, T; Shannon, K

    2014-01-01

    Objectives To explore how criminalisation and policing of sex buyers (clients) rather than sex workers shapes sex workers’ working conditions and sexual transactions including risk of violence and HIV/sexually transmitted infections (STIs). Design Qualitative and ethnographic study triangulated with sex work-related violence prevalence data and publicly available police statistics. Setting Vancouver, Canada, provides a unique opportunity to evaluate the impact of policies that criminalise clients as the local police department adopted a sex work enforcement policy in January 2013 that prioritises sex workers’ safety over arrest, while continuing to target clients. Participants 26 cisgender and 5 transgender women who were street-based sex workers (n=31) participated in semistructured interviews about their working conditions. All had exchanged sex for money in the previous 30 days in Vancouver. Outcome measures Thematic analysis of interview transcripts and ethnographic field notes focused on how police enforcement of clients shaped sex workers’ working conditions and sexual transactions, including risk of violence and HIV/STIs, over an 11-month period postpolicy implementation (January–November 2013). Results Sex workers’ narratives and ethnographic observations indicated that while police sustained a high level of visibility, they eased charging or arresting sex workers and showed increased concern for their safety. However, participants’ accounts and police statistics indicated continued police enforcement of clients. This profoundly impacted the safety strategies sex workers employed. Sex workers continued to mistrust police, had to rush screening clients and were displaced to outlying areas with increased risks of violence, including being forced to engage in unprotected sex. Conclusions These findings suggest that criminalisation and policing strategies that target clients reproduce the harms created by the criminalisation of sex work, in

  19. Can Computational Sediment Transport Models Reproduce the Observed Variability of Channel Networks in Modern Deltas?

    Science.gov (United States)

    Nesvold, E.; Mukerji, T.

    2017-12-01

    River deltas display complex channel networks that can be characterized through the framework of graph theory, as shown by Tejedor et al. (2015). Deltaic patterns may also be useful in a Bayesian approach to uncertainty quantification of the subsurface, but this requires a prior distribution of the networks of ancient deltas. By considering subaerial deltas, one can at least obtain a snapshot in time of the channel network spectrum across deltas. In this study, the directed graph structure is semi-automatically extracted from satellite imagery using techniques from statistical processing and machine learning. Once the network is labeled with vertices and edges, spatial trends and width and sinuosity distributions can also be found easily. Since imagery is inherently 2D, computational sediment transport models can serve as a link between 2D network structure and 3D depositional elements; the numerous empirical rules and parameters built into such models makes it necessary to validate the output with field data. For this purpose we have used a set of 110 modern deltas, with average water discharge ranging from 10 - 200,000 m3/s, as a benchmark for natural variability. Both graph theoretic and more general distributions are established. A key question is whether it is possible to reproduce this deltaic network spectrum with computational models. Delft3D was used to solve the shallow water equations coupled with sediment transport. The experimental setup was relatively simple; incoming channelized flow onto a tilted plane, with varying wave and tidal energy, sediment types and grain size distributions, river discharge and a few other input parameters. Each realization was run until a delta had fully developed: between 50 and 500 years (with a morphology acceleration factor). It is shown that input parameters should not be sampled independently from the natural ranges, since this may result in deltaic output that falls well outside the natural spectrum. Since we are

  20. Modeling arson - An exercise in qualitative model building

    Science.gov (United States)

    Heineke, J. M.

    1975-01-01

    A detailed example is given of the role of von Neumann and Morgenstern's 1944 'expected utility theorem' (in the theory of games and economic behavior) in qualitative model building. Specifically, an arsonist's decision as to the amount of time to allocate to arson and related activities is modeled, and the responsiveness of this time allocation to changes in various policy parameters is examined. Both the activity modeled and the method of presentation are intended to provide an introduction to the scope and power of the expected utility theorem in modeling situations of 'choice under uncertainty'. The robustness of such a model is shown to vary inversely with the number of preference restrictions used in the analysis. The fewer the restrictions, the wider is the class of agents to which the model is applicable, and accordingly more confidence is put in the derived results. A methodological discussion on modeling human behavior is included.

  1. Validation of the 3D Skin Comet assay using full thickness skin models: Transferability and reproducibility.

    Science.gov (United States)

    Reisinger, Kerstin; Blatz, Veronika; Brinkmann, Joep; Downs, Thomas R; Fischer, Anja; Henkler, Frank; Hoffmann, Sebastian; Krul, Cyrille; Liebsch, Manfred; Luch, Andreas; Pirow, Ralph; Reus, Astrid A; Schulz, Markus; Pfuhler, Stefan

    2018-03-01

    Recently revised OECD Testing Guidelines highlight the importance of considering the first site-of-contact when investigating the genotoxic hazard. Thus far, only in vivo approaches are available to address the dermal route of exposure. The 3D Skin Comet and Reconstructed Skin Micronucleus (RSMN) assays intend to close this gap in the in vitro genotoxicity toolbox by investigating DNA damage after topical application. This represents the most relevant route of exposure for a variety of compounds found in household products, cosmetics, and industrial chemicals. The comet assay methodology is able to detect both chromosomal damage and DNA lesions that may give rise to gene mutations, thereby complementing the RSMN which detects only chromosomal damage. Here, the comet assay was adapted to two reconstructed full thickness human skin models: the EpiDerm™- and Phenion ® Full-Thickness Skin Models. First, tissue-specific protocols for the isolation of single cells and the general comet assay were transferred to European and US-American laboratories. After establishment of the assay, the protocol was then further optimized with appropriate cytotoxicity measurements and the use of aphidicolin, a DNA repair inhibitor, to improve the assay's sensitivity. In the first phase of an ongoing validation study eight chemicals were tested in three laboratories each using the Phenion ® Full-Thickness Skin Model, informing several validation modules. Ultimately, the 3D Skin Comet assay demonstrated a high predictive capacity and good intra- and inter-laboratory reproducibility with four laboratories reaching a 100% predictivity and the fifth yielding 70%. The data are intended to demonstrate the use of the 3D Skin Comet assay as a new in vitro tool for following up on positive findings from the standard in vitro genotoxicity test battery for dermally applied chemicals, ultimately helping to drive the regulatory acceptance of the assay. To expand the database, the validation will

  2. Interpretative intra- and interobserver reproducibility of stress/rest 99mTc-sestamibi myocardial perfusion SPECT using a semi-quantitative 20-segment model

    International Nuclear Information System (INIS)

    Fazeli, M.; Firoozi, F.

    2002-01-01

    It is well established that myocardial perfusion SPECT with 201Tl or 99mTc-sestamibi plays an important role in diagnosis and risk assessment in patients with known or suspected coronary artery disease. Both quantitative and qualitative methods are available for the interpretation of images. The use of a semi-quantitative scoring system, in which each of 20 segments is scored according to a five-point scheme, provides an approach to interpretation that is more systematic and reproducible than simple qualitative evaluation. Only a limited number of studies have dealt with the interpretive observer reproducibility of 99mTc-sestamibi myocardial perfusion imaging. The aim of this study was to assess the intra- and interobserver variability of semi-quantitative SPECT performed with this technique. Among 789 patients who underwent myocardial perfusion SPECT during the last year, 80 patients ultimately proceeded to coronary angiography as the gold standard. In this group of patients a semi-quantitative visual interpretation was carried out using short-axis and vertical long-axis myocardial tomograms and a 20-segment model. These segments were assigned to six evenly spaced regions in the apical, mid-ventricular, and basal short-axis views and two apical segments on the mid-ventricular long-axis slice. Uptake in each segment was graded on a 5-point scale (0=normal, 1=equivocal, 2=moderate, 3=severe, 4=absence of uptake). The sestamibi images were interpreted separately twice by two observers without knowledge of each other's findings or the results of angiography. A SPECT study was judged abnormal if there were two or more segments with a stress score equal to or greater than 2. We concluded that semi-quantitative visual analysis is a simple and reproducible method of interpretation
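    A minimal sketch of the scoring rule described above: 20 segments scored on the 0-4 scale, with a study judged abnormal when two or more segments have a stress score of 2 or more; the segment scores below are toy values:

    ```python
    def summed_stress_score(segment_scores):
        """Sum of the 0-4 scores over the 20-segment model."""
        assert len(segment_scores) == 20 and all(0 <= s <= 4 for s in segment_scores)
        return sum(segment_scores)

    def is_abnormal(segment_scores, min_segments=2, min_score=2):
        """Abnormal study: at least `min_segments` segments scored >= `min_score`."""
        return sum(s >= min_score for s in segment_scores) >= min_segments

    stress = [0] * 17 + [2, 3, 1]                             # toy stress scores, 20 segments
    print(summed_stress_score(stress), is_abnormal(stress))   # 6 True
    ```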

  3. Reproducing the Wechsler Intelligence Scale for Children-Fifth Edition: Factor Model Results

    Science.gov (United States)

    Beaujean, A. Alexander

    2016-01-01

    One of the ways to increase the reproducibility of research is for authors to provide a sufficient description of the data analytic procedures so that others can replicate the results. The publishers of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) do not follow these guidelines when reporting their confirmatory factor…

  4. Can a coupled meteorology–chemistry model reproduce the historical trend in aerosol direct radiative effects over the Northern Hemisphere?

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere h...

  5. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...

  6. Decision making in government tenders: A formalized qualitative model

    Directory of Open Access Journals (Sweden)

    Štěpán Veselý

    2012-01-01

    Full Text Available The paper presents a simple formalized qualitative model of government tenders (GTs). Qualitative models use just three values: Positive/Increasing, Zero/Constant and Negative/Decreasing. Such quantifiers of trends are the least information intensive. Qualitative models can be useful, since GT evaluation often includes such goals as e.g. efficiency of public purchasing, and such variables as e.g. availability of relevant information or subjectivity of judgment, that are difficult to quantify. Hence, a significant fraction of the available information about GTs is not of a numerical nature, e.g. if availability of relevant information is decreasing then efficiency of public purchasing is decreasing as well. Such equationless relations are studied in this paper. A qualitative model of the function F(Goals, Variables) is developed. The model has four goal functions, eight variables, and 39 equationless relations. The model is solved and seven solutions, i.e. scenarios, are obtained. All qualitative states, including first and second qualitative derivatives with respect to time, of all variables are specified for each scenario. Any unsteady-state behavior of the GT model is described by its transitional oriented graph. There are eight possible transitions among the seven scenarios. No a priori knowledge of qualitative modeling is required on the reader's part.

  7. Qualitative models of magnetic field accelerated propagation in a plasma due to the Hall effect

    International Nuclear Information System (INIS)

    Kukushkin, A.B.; Cherepanov, K.V.

    2000-01-01

    Two qualitatively new models of accelerated magnetic field propagation (relative to normal diffusion) in a plasma due to the Hall effect are developed within the framework of electron magnetohydrodynamics. The first model is based on a simple hydrodynamic approach which, in particular, reproduces a number of known theoretical results. The second one makes it possible to obtain an exact analytical description of the basic characteristics of accelerated magnetic field propagation in an inhomogeneous isothermal plasma, namely, the magnetic field front and its effective width.

  8. Assessment of the potential forecasting skill of a global hydrological model in reproducing the occurrence of monthly flow extremes

    Directory of Open Access Journals (Sweden)

    N. Candogan Yossef

    2012-11-01

    Full Text Available As an initial step in assessing the prospect of using global hydrological models (GHMs) for hydrological forecasting, this study investigates the skill of the GHM PCR-GLOBWB in reproducing the occurrence of past extremes in monthly discharge on a global scale. Global terrestrial hydrology from 1958 until 2001 is simulated by forcing PCR-GLOBWB with daily meteorological data obtained by downscaling the CRU dataset to daily fields using the ERA-40 reanalysis. Simulated discharge values are compared with observed monthly streamflow records for a selection of 20 large river basins that represent all continents and a wide range of climatic zones.

    We assess model skill in three ways, all of which contribute different information on the potential forecasting skill of a GHM. First, the general skill of the model in reproducing hydrographs is evaluated. Second, model skill in reproducing significantly higher and lower flows than the monthly normals is assessed in terms of skill scores used for forecasts of categorical events. Third, model skill in reproducing flood and drought events is assessed by constructing binary contingency tables for floods and droughts for each basin (see the sketch following this record). The skill is then compared to that of a simple estimation of discharge from the water balance (P − E).

    The results show that the model has skill in all three types of assessments. After bias correction the model skill in simulating hydrographs is improved considerably. For most basins it is higher than that of the climatology. The skill is highest in reproducing monthly anomalies. The model also has skill in reproducing floods and droughts, with a markedly higher skill in floods. The model skill far exceeds that of the water balance estimate. We conclude that the prospect for using PCR-GLOBWB for monthly and seasonal forecasting of the occurrence of hydrological extremes is positive. We argue that this conclusion applies equally to other similar GHMs and
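    A minimal sketch of the categorical-skill assessment referred to above: building a binary contingency table for observed versus simulated flood months and computing skill scores. The hit rate and the Heidke skill score are shown as assumptions, since the abstract does not name the specific scores; the event series are toy data:

    ```python
    import numpy as np

    def contingency(obs_event, sim_event):
        """2x2 table for binary events: hits, false alarms, misses, correct negatives."""
        obs, sim = np.asarray(obs_event, bool), np.asarray(sim_event, bool)
        a = int(np.sum(obs & sim))       # hits
        b = int(np.sum(~obs & sim))      # false alarms
        c = int(np.sum(obs & ~sim))      # misses
        d = int(np.sum(~obs & ~sim))     # correct negatives
        return a, b, c, d

    def heidke(a, b, c, d):
        n = a + b + c + d
        expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n   # agreement expected by chance
        return (a + d - expected) / (n - expected)

    obs = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]     # observed flood months (toy)
    sim = [1, 0, 1, 1, 0, 0, 0, 0, 0, 0]     # simulated flood months (toy)
    a, b, c, d = contingency(obs, sim)
    print("hit rate:", round(a / (a + c), 2), "Heidke skill score:", round(heidke(a, b, c, d), 2))
    ```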

  9. Two-Finger Tightness: What Is It? Measuring Torque and Reproducibility in a Simulated Model.

    Science.gov (United States)

    Acker, William B; Tai, Bruce L; Belmont, Barry; Shih, Albert J; Irwin, Todd A; Holmes, James R

    2016-05-01

    Residents in training are often directed to insert screws using "two-finger tightness" to impart adequate torque but minimize the chance of a screw stripping in bone. This study seeks to quantify and describe two-finger tightness and to assess the variability of its application by residents in training. Cortical bone was simulated using a polyurethane foam block (30-pcf density) that was prepared with predrilled holes for tightening 3.5 × 14-mm long cortical screws and mounted to a custom-built apparatus on a load cell to capture torque data. Thirty-three residents in training, ranging from the first through fifth years of residency, along with 8 staff members, were directed to tighten 6 screws to two-finger tightness in the test block, and peak torque values were recorded. The participants were blinded to their torque values. Stripping torque (2.73 ± 0.56 N·m) was determined from 36 trials and served as a threshold for failed screw placement. The average torques varied substantially with regard to absolute torque values, thus poorly defining two-finger tightness. Junior residents less consistently reproduced torque compared with other groups (0.29 and 0.32, respectively). These data quantify absolute values of two-finger tightness but demonstrate considerable variability in absolute torque values, percentage of stripping torque, and ability to consistently reproduce given torque levels. Increased years in training are weakly correlated with reproducibility, but experience does not seem to affect absolute torque levels. These results question the usefulness of two-finger tightness as a teaching tool and highlight the need for improvement in resident motor skill training and development within a teaching curriculum. Torque measuring devices may be a useful simulation tools for this purpose.
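    A minimal sketch of the summary statistics implied above: expressing a participant's peak insertion torques relative to the mean stripping torque and using the coefficient of variation as a simple consistency measure (the study's exact consistency metric is not stated, so the CV is an assumption; the torque values are toy data):

    ```python
    import numpy as np

    STRIPPING_TORQUE = 2.73                      # mean stripping torque in N·m (36 trials, from the abstract)

    def summarize(peak_torques):
        t = np.asarray(peak_torques, float)
        pct_of_stripping = 100.0 * t.mean() / STRIPPING_TORQUE
        cv = t.std(ddof=1) / t.mean()            # lower CV = more consistent "two-finger tightness"
        n_stripped = int(np.sum(t >= STRIPPING_TORQUE))
        return round(pct_of_stripping, 1), round(cv, 2), n_stripped

    torques = [0.9, 1.1, 1.0, 1.3, 0.8, 1.2]     # six toy screw insertions by one participant
    print(summarize(torques))                    # (% of stripping torque, CV, screws stripped)
    ```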

  10. Intestinal microdialysis--applicability, reproducibility and local tissue response in a pig model

    DEFF Research Database (Denmark)

    Emmertsen, K J; Wara, P; Sørensen, Flemming Brandt

    2005-01-01

    BACKGROUND AND AIMS: Microdialysis has been applied to the intestinal wall for the purpose of monitoring local ischemia. The aim of this study was to investigate the applicability, reproducibility and local response to microdialysis in the intestinal wall. MATERIALS AND METHODS: In 12 pigs two...... the probes were processed for histological examination. RESULTS: Large intra- and inter-group differences in the relative recovery were found between all locations. Absolute values of metabolites showed no significant changes during the study period. The lactate in blood was 25-30% of the intra-tissue values...

  11. Pharmacokinetic Modelling to Predict FVIII:C Response to Desmopressin and Its Reproducibility in Nonsevere Haemophilia A Patients.

    Science.gov (United States)

    Schütte, Lisette M; van Hest, Reinier M; Stoof, Sara C M; Leebeek, Frank W G; Cnossen, Marjon H; Kruip, Marieke J H A; Mathôt, Ron A A

    2018-04-01

     Nonsevere haemophilia A (HA) patients can be treated with desmopressin. The response of factor VIII activity (FVIII:C) differs between patients and is difficult to predict. Our aims were to describe the FVIII:C response after desmopressin and its reproducibility by population pharmacokinetic (PK) modelling. Retrospective data of 128 nonsevere HA patients (age 7-75 years) receiving an intravenous or intranasal dose of desmopressin were used. PK modelling of FVIII:C was performed by nonlinear mixed effect modelling. Reproducibility of the FVIII:C response was defined as less than 25% difference in peak FVIII:C between administrations. A total of 623 FVIII:C measurements from 142 desmopressin administrations were available; 14 patients had received two administrations on different occasions. The FVIII:C time profile was best described by a two-compartment model with first-order absorption and elimination. Interindividual variability of the estimated baseline FVIII:C, central volume of distribution and clearance was 37, 43 and 50%, respectively. The most recently measured FVIII:C (FVIII-recent) was significantly associated with the FVIII:C response to desmopressin, with an FVIII:C increase of 0.47 IU/mL (median; interquartile range: 0.32-0.65 IU/mL, n = 142). The FVIII:C response was reproducible in 6 out of 14 patients receiving two desmopressin administrations. The FVIII:C response to desmopressin in nonsevere HA patients was adequately described by a population PK model. Large variability in the FVIII:C response was observed, which could only partially be explained by FVIII-recent. The FVIII:C response was not reproducible in a small subset of patients. Therefore, monitoring FVIII:C around surgeries or bleeding might be considered; research is needed to study this further.
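    A minimal sketch of the structural model named above, a two-compartment model with first-order absorption and elimination, together with the less-than-25% criterion for a reproducible peak response. All parameter values, the volume term and the choice of denominator in the relative difference are illustrative assumptions, not the study's estimates:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def two_cmt_first_order(t, y, ka, ke, k12, k21):
        """y = [depot, central, peripheral] amounts; first-order absorption and elimination."""
        depot, central, peripheral = y
        return [-ka * depot,
                ka * depot - (ke + k12) * central + k21 * peripheral,
                k12 * central - k21 * peripheral]

    def peak_response(dose, params, baseline=0.2, vc=3.0):
        sol = solve_ivp(two_cmt_first_order, (0.0, 24.0), [dose, 0.0, 0.0],
                        args=params, dense_output=True)
        t = np.linspace(0.0, 24.0, 500)
        return baseline + sol.sol(t)[1].max() / vc        # FVIII:C-like peak (illustrative units)

    def reproducible(peak1, peak2):
        return abs(peak1 - peak2) / max(peak1, peak2) < 0.25   # <25% difference between occasions

    p1 = peak_response(1.0, (1.2, 0.15, 0.3, 0.2))
    p2 = peak_response(1.0, (1.2, 0.25, 0.3, 0.2))             # second occasion, higher elimination
    print(round(p1, 2), round(p2, 2), reproducible(p1, p2))
    ```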

  12. SDG and qualitative trend based model multiple scale validation

    Science.gov (United States)

    Gao, Dong; Xu, Xin; Yin, Jianjin; Zhang, Hongyu; Zhang, Beike

    2017-09-01

    Verification, Validation and Accreditation (VV&A) is a key technology of simulation and modelling. Traditional model validation methods suffer from weak completeness, are carried out at a single scale, and depend on human experience. An SDG (Signed Directed Graph) and qualitative-trend-based multiple-scale validation is therefore proposed. First, the SDG model is built and qualitative trends are added to it. Complete testing scenarios are then produced by positive inference. The multiple-scale validation is carried out by comparing the testing scenarios with the outputs of the simulation model at different scales. Finally, the effectiveness of the approach is demonstrated by validating a reactor model.

  13. Modelling qualitative knowledge for strategic river management

    NARCIS (Netherlands)

    Janssen, Judith

    2009-01-01

    In decision making processes on strategic river management, use of models is not as great as the research efforts in the field of model application might suggest they could be. Both the fact that the development of many models remains restricted to readily available data and pre-existing models,

  14. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
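    A minimal sketch of the traditional iterative-fitting (IF) baseline that the machine-learning approach is compared against: nonlinear least-squares fitting of a simple one-tissue compartment model to a noisy time-activity curve. The input function, model and parameter values are illustrative, and the convolution is a coarse numerical approximation:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0.5, 60.0, 30)                     # frame mid-times in minutes
    cp = 10.0 * np.exp(-0.3 * t) + 1.0                 # toy plasma input function

    def one_tissue(t, k1, k2):
        """C_T(t) = K1 * exp(-k2 t) convolved with C_p(t), evaluated on a uniform grid."""
        dt = t[1] - t[0]
        kernel = np.exp(-k2 * t)
        return k1 * np.convolve(cp, kernel)[:t.size] * dt

    rng = np.random.default_rng(2)
    tac = one_tissue(t, 0.4, 0.1) * (1.0 + rng.normal(0.0, 0.05, t.size))   # noisy measured TAC

    (k1, k2), _ = curve_fit(one_tissue, t, tac, p0=[0.2, 0.05], bounds=(0.0, np.inf))
    print(f"fitted K1 = {k1:.3f}, k2 = {k2:.3f}")
    ```

    The overfitting and reproducibility problems the abstract attributes to this per-curve fitting show up as the noise level rises; the ML method instead adjusts a model against a historical reference database.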

  15. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  16. Modeling with Young Students--Quantitative and Qualitative.

    Science.gov (United States)

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  17. Qualitative Analysis of Integration Adapter Modeling

    OpenAIRE

    Ritter, Daniel; Holzleitner, Manuel

    2015-01-01

    Integration Adapters are a fundamental part of an integration system, since they provide (business) applications access to its messaging channel. However, their modeling and configuration remain under-represented. In previous work, the integration control and data flow syntax and semantics have been expressed in the Business Process Model and Notation (BPMN) as a semantic model for message-based integration, while adapter and the related quality of service modeling were left for further studi...

  18. A qualitative evaluation approach for energy system modelling frameworks

    DEFF Research Database (Denmark)

    Wiese, Frauke; Hilpert, Simon; Kaldemeyer, Cord

    2018-01-01

    properties define how useful it is in regard to the existing challenges. For energy system models, evaluation methods exist, but we argue that many decisions upon properties are rather made on the model generator or framework level. Thus, this paper presents a qualitative approach to evaluate frameworks...

  19. The Use of Modelling for Theory Building in Qualitative Analysis

    Science.gov (United States)

    Briggs, Ann R. J.

    2007-01-01

    The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…

  20. Graphical means for inspecting qualitative models of system behaviour

    NARCIS (Netherlands)

    Bouwer, A.; Bredeweg, B.

    2010-01-01

    This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are

  1. Accelerating transition dynamics in city regions: A qualitative modeling perspective

    NARCIS (Netherlands)

    P.J. Valkering (Pieter); Yücel, G. (Gönenç); Gebetsroither-Geringer, E. (Ernst); Markvica, K. (Karin); Meynaerts, E. (Erika); N. Frantzeskaki (Niki)

    2017-01-01

    In this article, we take stock of the findings from conceptual and empirical work on the role of transition initiatives for accelerating transitions as input for modeling acceleration dynamics. We applied the qualitative modeling approach of causal loop diagrams to capture the dynamics

  2. Intra- and interobserver reliability and intra-catheter reproducibility using frequency domain optical coherence tomography for the evaluation of morphometric stent parameters and qualitative assessment of stent strut coverage

    International Nuclear Information System (INIS)

    Antonsen, Lisbeth; Thayssen, Per; Junker, Anders; Veien, Karsten Tange; Hansen, Henrik Steen; Hansen, Knud Nørregaard; Hougaard, Mikkel; Jensen, Lisette Okkels

    2015-01-01

    Purpose: Frequency-domain optical coherence tomography (FD-OCT) is a high-resolution imaging tool (~ 10–15 μm), which enables near-histological in-vivo images of the coronary vessel wall. The use of the technique is increasing, both for research and clinical purposes. This study sought to investigate the intra- and interobserver reliability, as well as the intra-catheter reproducibility of quantitative FD-OCT-assessment of morphometric stent parameters and qualitative FD-OCT-evaluation of strut coverage in 10 randomly selected 6-month follow-up Nobori® biolimus-eluting stents (N-BESs). Methods: Ten N-BESs (213 cross sectional areas (CSAs) and 1897 struts) imaged with OCT 6 months post-implantation were randomly selected and analyzed by 2 experienced analysts, and the same 10 N-BESs were analyzed by one of the analysts 3 months later. Further, 2 consecutive pullbacks randomly performed in another 10 N-BESs (219 CSAs and 1860 struts) were independently assessed by one of the analysts. Results: The intraobserver variability with regard to relative difference of mean luminal area and mean stent area at the CSA-level was very low: 0.1% ± 1.4% and 0.5% ± 3.2%. Interobserver variability also proved to be low: − 2.1% ± 3.3% and 2.1% ± 4.6%, and moreover, very restricted intra-catheter variation was observed: 0.02% ± 6.8% and − 0.18% ± 5.2%. The intraobserver-, interobserver- and intra-catheter reliability for the qualitative evaluation of strut coverage was found to be: kappa (κ) = 0.91 (95% confidence interval (CI): 0.88–0.93, p < 0.01), κ = 0.88 (95% CI: 0.85–0.91, p < 0.01), and κ = 0.73 (95% CI: 0.68–0.78, p < 0.01), respectively. Conclusions: FD-OCT is a reproducible and reliable imaging tool for quantitative evaluation of stented coronary segments, and for qualitative assessment of strut coverage. - Highlights: • Frequency-domain optical coherence tomography (FD-OCT) is increasingly adopted in the catheterization laboratories. • This

  3. Intra- and interobserver reliability and intra-catheter reproducibility using frequency domain optical coherence tomography for the evaluation of morphometric stent parameters and qualitative assessment of stent strut coverage

    Energy Technology Data Exchange (ETDEWEB)

    Antonsen, Lisbeth, E-mail: Lisbeth.antonsen@rsyd.dk; Thayssen, Per; Junker, Anders; Veien, Karsten Tange; Hansen, Henrik Steen; Hansen, Knud Nørregaard; Hougaard, Mikkel; Jensen, Lisette Okkels

    2015-12-15

    Purpose: Frequency-domain optical coherence tomography (FD-OCT) is a high-resolution imaging tool (~ 10–15 μm), which enables near-histological in-vivo images of the coronary vessel wall. The use of the technique is increasing, both for research and clinical purposes. This study sought to investigate the intra- and interobserver reliability, as well as the intra-catheter reproducibility of quantitative FD-OCT-assessment of morphometric stent parameters and qualitative FD-OCT-evaluation of strut coverage in 10 randomly selected 6-month follow-up Nobori® biolimus-eluting stents (N-BESs). Methods: Ten N-BESs (213 cross sectional areas (CSAs) and 1897 struts) imaged with OCT 6 months post-implantation were randomly selected and analyzed by 2 experienced analysts, and the same 10 N-BESs were analyzed by one of the analysts 3 months later. Further, 2 consecutive pullbacks randomly performed in another 10 N-BESs (219 CSAs and 1860 struts) were independently assessed by one of the analysts. Results: The intraobserver variability with regard to relative difference of mean luminal area and mean stent area at the CSA-level was very low: 0.1% ± 1.4% and 0.5% ± 3.2%. Interobserver variability also proved to be low: − 2.1% ± 3.3% and 2.1% ± 4.6%, and moreover, very restricted intra-catheter variation was observed: 0.02% ± 6.8% and − 0.18% ± 5.2%. The intraobserver-, interobserver- and intra-catheter reliability for the qualitative evaluation of strut coverage was found to be: kappa (κ) = 0.91 (95% confidence interval (CI): 0.88–0.93, p < 0.01), κ = 0.88 (95% CI: 0.85–0.91, p < 0.01), and κ = 0.73 (95% CI: 0.68–0.78, p < 0.01), respectively. Conclusions: FD-OCT is a reproducible and reliable imaging tool for quantitative evaluation of stented coronary segments, and for qualitative assessment of strut coverage. - Highlights: • Frequency-domain optical coherence tomography (FD-OCT) is increasingly adopted in the catheterization laboratories. • This

  4. Intra- and interobserver reliability and intra-catheter reproducibility using frequency domain optical coherence tomography for the evaluation of morphometric stent parameters and qualitative assessment of stent strut coverage

    DEFF Research Database (Denmark)

    Antonsen, Lisbeth; Thayssen, Per; Junker, Anders

    2015-01-01

    to investigate the intra- and interobserver reliability, as well as the intra-catheter reproducibility of quantitative FD-OCT-assessment of morphometric stent parameters and qualitative FD-OCT-evaluation of strut coverage in 10 randomly selected 6-month follow-up Nobori® biolimus-eluting stents (N-BESs). METHODS...... in another 10 N-BESs (219 CSAs and 1860 struts) were independently assessed by one of the analysts. RESULTS: The intraobserver variability with regard to relative difference of mean luminal area and mean stent area at the CSA-level was very low: 0.1%±1.4% and 0.5%±3.2%. Interobserver variability also proved...... (CI): 0.88-0.93, p < 0.01)...... stented coronary segments, and for qualitative assessment of strut coverage....
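
    For illustration, the inter-observer agreement statistic reported above (Cohen's kappa for the strut-coverage classification) could be computed along the following lines; the example labels and the use of scikit-learn are assumptions for this sketch, not the study's software.

```python
# Illustrative only: Cohen's kappa for two observers' strut-coverage
# calls (covered / uncovered), as in the reported inter-observer
# reliability. The labels below are made up.
from sklearn.metrics import cohen_kappa_score

observer_a = ["covered", "covered", "uncovered", "covered", "uncovered"]
observer_b = ["covered", "covered", "uncovered", "uncovered", "uncovered"]

kappa = cohen_kappa_score(observer_a, observer_b)
print(f"Cohen's kappa = {kappa:.2f}")
```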

  5. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    Science.gov (United States)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a pdf, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future landuse or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a-priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  6. Validation of EURO-CORDEX regional climate models in reproducing the variability of precipitation extremes in Romania

    Science.gov (United States)

    Dumitrescu, Alexandru; Busuioc, Aristita

    2016-04-01

    EURO-CORDEX is the European branch of the international CORDEX initiative that aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of the individual models in reproducing the variability of precipitation extremes in Romania. Here an ensemble of three EURO-CORDEX regional climate models (RCMs) (scenario RCP4.5) is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. Compared to previous studies, in which RCM validation for the Romanian climate has mainly been made on the mean state and at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to allow a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and observed grid point values has been made by calculating three extreme precipitation indices, recommended by the Expert Team on Climate Change Detection Indices (ETCCDI), for the 1976-2005 period: R10MM, the annual count of days when precipitation ≥ 10 mm; RX5DAY, the annual maximum 5-day precipitation; and R95P, the fraction of annual total precipitation due to daily precipitation above the 95th percentile. The RCMs' capability to reproduce the mean state of these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions). This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian
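
    The three ETCCDI indices named above can be computed from a daily precipitation series roughly as follows. This is a hedged sketch that assumes a pandas Series `pr` of daily precipitation in mm with a DatetimeIndex; it is not the CLIMHYDEX processing code.

```python
# Sketch of the three ETCCDI indices used in the comparison,
# for a pandas Series `pr` of daily precipitation (mm), DatetimeIndex.
import numpy as np
import pandas as pd

def r10mm(pr):
    """Annual count of days with precipitation >= 10 mm."""
    return pr.groupby(pr.index.year).apply(lambda x: int((x >= 10).sum()))

def rx5day(pr):
    """Annual maximum consecutive 5-day precipitation total."""
    rolling = pr.rolling(5).sum()
    return rolling.groupby(rolling.index.year).max()

def r95p_fraction(pr):
    """Fraction of annual wet-day precipitation from days above the
    95th percentile of wet days (>= 1 mm)."""
    thresh = pr[pr >= 1.0].quantile(0.95)
    def frac(x):
        wet = x[x >= 1.0]
        return wet[wet > thresh].sum() / wet.sum() if wet.sum() > 0 else np.nan
    return pr.groupby(pr.index.year).apply(frac)

# Demo with synthetic daily data for 1976-2005.
dates = pd.date_range("1976-01-01", "2005-12-31", freq="D")
pr = pd.Series(np.random.gamma(0.6, 5.0, len(dates)), index=dates)
print(r10mm(pr).head(), rx5day(pr).head(), r95p_fraction(pr).head())
```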

  7. Failure of Standard Optical Models to Reproduce Neutron Total Cross Section Difference in the W Isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, J D; Bauer, R W; Dietrich, F S; Grimes, S M; Finlay, R W; Abfalterer, W P; Bateman, F B; Haight, R C; Morgan, G L; Bauge, E; Delaroche, J P; Romain, P

    2001-11-01

    Recently, cross section differences among the isotopes ¹⁸²,¹⁸⁴,¹⁸⁶W have been measured as part of a study of total cross sections in the 5-560 MeV energy range. These measurements show oscillations of up to 150 mb between 5 and 100 MeV. Spherical and deformed phenomenological optical potentials with typical radial and isospin dependences show very small oscillations, in disagreement with the data. In a simple Ramsauer model, this discrepancy can be traced to a cancellation between radial and isospin effects. Understanding this problem requires a more detailed model that incorporates a realistic description of the neutron and proton density distributions. This has been done with results of Hartree-Fock-Bogolyubov calculations using the Gogny force, together with a microscopic folding model employing a modification of the JLM potential as an effective interaction. This treatment yields a satisfactory interpretation of the observed total cross section differences.

  8. A simple branching model that reproduces language family and language population distributions

    Science.gov (United States)

    Schwämmle, Veit; de Oliveira, Paulo Murilo Castro

    2009-07-01

    Human history leaves fingerprints in human languages. Little is known about language evolution, and its study is of great importance. Here we construct a simple stochastic model and compare its results to statistical data of real languages. The model is based on the recent finding that language changes occur independently of the population size. We find agreement with the data by additionally assuming that languages may be distinguished by having at least one of a finite, small number of different features. This finite set is also used to define the distance between two languages, similarly to the linguistics tradition since Swadesh.
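
    A toy sketch of the kind of stochastic branching model described: languages carry a small vector of discrete features, mutate independently of population size, and occasionally split, and family sizes are read off at the end. All rates and sizes below are invented for illustration, not the paper's values.

```python
# Toy branching model of language families (illustrative parameters only).
import random
from collections import Counter

N_FEATURES, N_VALUES = 8, 5
P_SPLIT, P_MUTATE, STEPS, N_ROOTS = 0.02, 0.1, 300, 10

# Each entry is (family_id, feature_vector); start from N_ROOTS ancestors.
languages = [(f, [0] * N_FEATURES) for f in range(N_ROOTS)]

for _ in range(STEPS):
    next_gen = []
    for fam, feats in languages:
        feats = feats.copy()
        if random.random() < P_MUTATE:                 # language change
            feats[random.randrange(N_FEATURES)] = random.randrange(N_VALUES)
        next_gen.append((fam, feats))
        if random.random() < P_SPLIT:                  # branching event
            next_gen.append((fam, feats.copy()))
    languages = next_gen

family_sizes = Counter(fam for fam, _ in languages)
print("languages per family:", sorted(family_sizes.values(), reverse=True))
```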

  9. Contrasting response to nutrient manipulation in Arctic mesocosms are reproduced by a minimum microbial food web model.

    Science.gov (United States)

    Larsen, Aud; Egge, Jorun K; Nejstgaard, Jens C; Di Capua, Iole; Thyrhaug, Runar; Bratbak, Gunnar; Thingstad, T Frede

    2015-03-01

    A minimum mathematical model of the marine pelagic microbial food web has previously been shown to reproduce central aspects of the observed system response to different bottom-up manipulations in the Microbial Ecosystem Dynamics (MEDEA) mesocosm experiment in Danish waters. In this study, we apply this model to two mesocosm experiments (Polar Aquatic Microbial Ecology, PAME-I and PAME-II) conducted at the Arctic location Kongsfjorden, Svalbard. The different responses of the microbial community to similar nutrient manipulation in the three mesocosm experiments may be described as diatom-dominated (MEDEA), bacteria-dominated (PAME-I), and flagellate-dominated (PAME-II). When ciliates are allowed to feed on small diatoms, the model describing the diatom-dominated MEDEA experiment gives a bacteria-dominated response as observed in PAME-I, in which the diatom community comprised almost exclusively small-sized cells. Introducing the high initial mesozooplankton stock observed in PAME-II, the model gives a flagellate-dominated response, in accordance with the observed response of this experiment. The ability of the model, originally developed for temperate waters, to reproduce population dynamics in a 10°C colder Arctic fjord does not support the existence of important shifts in population balances over this temperature range. Rather, it suggests a quite resilient microbial food web when adapted to in situ temperature. The sensitivity of the model response to its mesozooplankton component suggests, however, that the seasonal vertical migration of Arctic copepods may be a strong forcing factor on Arctic microbial food webs.

  10. The ability of a GCM-forced hydrological model to reproduce global discharge variability

    NARCIS (Netherlands)

    Sperna Weiland, F.C.; Beek, L.P.H. van; Kwadijk, J.C.J.; Bierkens, M.F.P.

    2010-01-01

    Data from General Circulation Models (GCMs) are often used to investigate hydrological impacts of climate change. However, GCM data are known to have large biases, especially for precipitation. In this study, the usefulness of GCM data for hydrological studies, with a focus on discharge variability

  11. Establishing a Reproducible Hypertrophic Scar following Thermal Injury: A Porcine Model

    Directory of Open Access Journals (Sweden)

    Scott J. Rapp, MD

    2015-02-01

    Conclusions: Deep partial-thickness thermal injury to the back of domestic swine produces an immature hypertrophic scar by 10 weeks following the burn, with thickness appearing to coincide with the location along the dorsal axis. With minimal pig-to-pig variation, we describe our technique to provide a testable immature scar model.

  12. Reproducibility of a novel model of murine asthma-like pulmonary inflammation.

    Science.gov (United States)

    McKinley, L; Kim, J; Bolgos, G L; Siddiqui, J; Remick, D G

    2004-05-01

    Sensitization to cockroach allergens (CRA) has been implicated as a major cause of asthma, especially among inner-city populations. Endotoxin from Gram-negative bacteria has also been investigated for its role in attenuating or exacerbating the asthmatic response. We have created a novel model utilizing house dust extract (HDE) containing high levels of both CRA and endotoxin to induce pulmonary inflammation (PI) and airway hyperresponsiveness (AHR). A potential drawback of this model is that the HDE is in limited supply and newly prepared HDE will not contain exactly the same components as the HDE used to define our model system. The present study involved testing HDEs collected from various homes for their ability to cause PI and AHR. Dust collected from five homes was extracted in phosphate buffered saline overnight. The levels of CRA and endotoxin in the supernatants ranged from 7.1 to 49.5 mg/ml for CRA and from 1.7 to 6 µg/ml for endotoxin. Following immunization and two pulmonary exposures to HDE, all five HDEs induced AHR, PI and plasma IgE levels substantially higher than those in normal mice. This study shows that HDE containing high levels of cockroach allergens and endotoxin collected from different sources can induce an asthma-like response in our murine model.

  13. Energy and nutrient deposition and excretion in the reproducing sow: model development and evaluation

    DEFF Research Database (Denmark)

    Hansen, A V; Strathe, A B; Theil, Peter Kappel

    2014-01-01

    requirements for maintenance, and fetal and maternal growth were described. In the lactating module, a factorial approach was used to estimate requirements for maintenance, milk production, and maternal growth. The priority for nutrient partitioning was assumed to be in the order of maintenance, milk...... production, and maternal growth with body tissue losses constrained within biological limits. Global sensitivity analysis showed that nonlinearity in the parameters was small. The model outputs considered were the total protein and fat deposition, average urinary and fecal N excretion, average methane...... emission, manure carbon excretion, and manure production. The model was evaluated using independent data sets from the literature using root mean square prediction error (RMSPE) and concordance correlation coefficients. The gestation module predicted body fat gain better than body protein gain, which...
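
    The two evaluation statistics named above, root mean square prediction error (RMSPE) and the concordance correlation coefficient, can be computed as in this sketch; the observed and predicted values are placeholders, not data from the study.

```python
# Sketch of RMSPE and Lin's concordance correlation coefficient (CCC)
# for observed vs. model-predicted values (placeholder numbers).
import numpy as np

def rmspe(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def concordance_cc(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    cov = np.mean((obs - obs.mean()) * (pred - pred.mean()))
    return float(2 * cov / (obs.var() + pred.var()
                            + (obs.mean() - pred.mean()) ** 2))

obs = [10.2, 11.5, 9.8, 12.0]      # e.g. observed fat deposition
pred = [10.0, 11.9, 9.5, 11.4]     # corresponding model predictions
print("RMSPE:", rmspe(obs, pred), " CCC:", concordance_cc(obs, pred))
```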

  14. Evaluation of Nitinol staples for the Lapidus arthrodesis in a reproducible biomechanical model

    Directory of Open Access Journals (Sweden)

    Nicholas Alexander Russell

    2015-12-01

    Full Text Available While the Lapidus procedure is a widely accepted technique for treatment of hallux valgus, the optimal fixation method to maintain joint stability remains controversial. The purpose of this study was to evaluate the biomechanical properties of new Shape Memory Alloy staples arranged in different configurations in a repeatable 1st Tarsometatarsal arthrodesis model. Ten sawbones models of the whole foot (n=5 per group) were reconstructed using a single dorsal staple or two staples in a delta configuration. Each construct was mechanically tested in dorsal four-point bending, medial four-point bending, dorsal three-point bending and plantar cantilever bending with the staples activated at 37°C. The peak load, stiffness and plantar gapping were determined for each test. Pressure sensors were used to measure the contact force and area of the joint footprint in each group. There was a significant (p < 0.05) increase in peak load in the two staple constructs compared to the single staple constructs for all testing modalities. Stiffness also increased significantly in all tests except dorsal four-point bending. Pressure sensor readings showed a significantly higher contact force at time zero and contact area following loading in the two staple constructs (p < 0.05). Both groups completely recovered any plantar gapping following unloading and restored their initial contact footprint. The biomechanical integrity and repeatability of the models was demonstrated with no construct failures due to hardware or model breakdown. Shape memory alloy staples provide fixation with the ability to dynamically apply and maintain compression across a simulated arthrodesis following a range of loading conditions.

  15. Evaluation of Nitinol Staples for the Lapidus Arthrodesis in a Reproducible Biomechanical Model.

    Science.gov (United States)

    Russell, Nicholas A; Regazzola, Gianmarco; Aiyer, Amiethab; Nomura, Tomohiro; Pelletier, Matthew H; Myerson, Mark; Walsh, William R

    2015-01-01

    While the Lapidus procedure is a widely accepted technique for treatment of hallux valgus, the optimal fixation method to maintain joint stability remains controversial. The purpose of this study is to evaluate the biomechanical properties of new shape memory alloy (SMA) staples arranged in different configurations in a repeatable first tarsometatarsal arthrodesis model. Ten sawbones models of the whole foot (n = 5 per group) were reconstructed using a single dorsal staple or two staples in a delta configuration. Each construct was mechanically tested non-destructively in dorsal four-point bending, medial four-point bending, dorsal three-point bending, and plantar cantilever bending with the staples activated at 37°C. The peak load (newton), stiffness (newton per millimeter), and plantar gapping (millimeter) were determined for each test. Pressure sensors were used to measure the contact force and area of the joint footprint in each group. There was a statistically significant increase in peak load in the two staple constructs compared to the single staple constructs for all testing modalities, with P values ranging from 0.016 to 0.000. Stiffness also increased significantly in all tests except dorsal four-point bending. Pressure sensor readings showed a significantly higher contact force at time zero (P = 0.037) and contact area following loading in the two staple constructs (P = 0.045). Both groups completely recovered any plantar gapping following unloading and restored their initial contact footprint. The biomechanical integrity and repeatability of the models was demonstrated with no construct failures due to hardware or model breakdown. SMA staples provide fixation with the ability to dynamically apply and maintain compression across a simulated arthrodesis following a range of loading conditions.

  16. Can lagrangian models reproduce the migration time of European eel obtained from otolith analysis?

    Science.gov (United States)

    Rodríguez-Díaz, L.; Gómez-Gesteira, M.

    2017-12-01

    European eel can be found in the Bay of Biscay after a long migration across the Atlantic. The duration of the migration, which takes place at the larval stage, is of primary importance for understanding eel ecology and, hence, its survival. This duration is still a controversial matter, since estimates range from 7 months to more than 4 years depending on the method used. The minimum migration duration estimated from our Lagrangian model is similar to the duration obtained from the microstructure of eel otoliths, which is typically on the order of 7-9 months. The Lagrangian model proved to be sensitive to different conditions such as spatial and temporal resolution, release depth, release area and initial distribution. In general, migration was faster at shallower release depths and higher model resolution. On average, the fastest migration was obtained when only advective horizontal movement was considered. However, even faster migration was obtained in some cases when locally oriented random migration was taken into account.
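
    A minimal sketch of Lagrangian particle tracking of virtual larvae: positions are advected through a velocity field with forward-Euler steps. The analytic gyre-like field and all parameters here are placeholders for the study's ocean-model currents.

```python
# Minimal Lagrangian advection sketch (toy velocity field, not the study's).
import numpy as np

def velocity(x, y):
    """Illustrative gyre-like velocity field (m/s)."""
    u = -0.3 * np.sin(np.pi * x / 1e6) * np.cos(np.pi * y / 1e6)
    v =  0.3 * np.cos(np.pi * x / 1e6) * np.sin(np.pi * y / 1e6)
    return u, v

def advect(x, y, dt=3600.0, n_steps=24 * 270):   # hourly steps, ~9 months
    for _ in range(n_steps):
        u, v = velocity(x, y)
        x, y = x + u * dt, y + v * dt            # forward-Euler step
    return x, y

# Release 1000 particles in a small box (positions in metres).
rng = np.random.default_rng(0)
x0 = rng.uniform(0, 2e5, 1000)
y0 = rng.uniform(0, 2e5, 1000)
xf, yf = advect(x0, y0)
print("mean displacement (km):",
      float(np.hypot(xf - x0, yf - y0).mean() / 1e3))
```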

  17. Reproducibility of the heat/capsaicin skin sensitization model in healthy volunteers

    Directory of Open Access Journals (Sweden)

    Cavallone LF

    2013-11-01

    Full Text Available Laura F Cavallone,1 Karen Frey,1 Michael C Montana,1 Jeremy Joyal,1 Karen J Regina,1 Karin L Petersen,2 Robert W Gereau IV11Department of Anesthesiology, Washington University in St Louis, School of Medicine, St Louis, MO, USA; 2California Pacific Medical Center Research Institute, San Francisco, CA, USAIntroduction: Heat/capsaicin skin sensitization is a well-characterized human experimental model to induce hyperalgesia and allodynia. Using this model, gabapentin, among other drugs, was shown to significantly reduce cutaneous hyperalgesia compared to placebo. Since the larger thermal probes used in the original studies to produce heat sensitization are now commercially unavailable, we decided to assess whether previous findings could be replicated with a currently available smaller probe (heated area 9 cm2 versus 12.5–15.7 cm2.Study design and methods: After Institutional Review Board approval, 15 adult healthy volunteers participated in two study sessions, scheduled 1 week apart (Part A. In both sessions, subjects were exposed to the heat/capsaicin cutaneous sensitization model. Areas of hypersensitivity to brush stroke and von Frey (VF filament stimulation were measured at baseline and after rekindling of skin sensitization. Another group of 15 volunteers was exposed to an identical schedule and set of sensitization procedures, but, in each session, received either gabapentin or placebo (Part B.Results: Unlike previous reports, a similar reduction of areas of hyperalgesia was observed in all groups/sessions. Fading of areas of hyperalgesia over time was observed in Part A. In Part B, there was no difference in area reduction after gabapentin compared to placebo.Conclusion: When using smaller thermal probes than originally proposed, modifications of other parameters of sensitization and/or rekindling process may be needed to allow the heat/capsaicin sensitization protocol to be used as initially intended. Standardization and validation of

  18. A computational model incorporating neural stem cell dynamics reproduces glioma incidence across the lifespan in the human population.

    Directory of Open Access Journals (Sweden)

    Roman Bauer

    Full Text Available Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way, and do not directly relate to empirical findings. To decrease this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically-determined estimates of neural stem cell number, cell division rate, mutation rate and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert differential susceptibility throughout the population. Overall, our model supports the hypothesis that glioma is caused by randomly-occurring oncogenic mutations within the neural stem cell population. Based on this model, we assess the influence of the (experimentally indicated) decrease in the number of neural stem cells and increase of cell division rate during aging. Our model provides multiple testable predictions, and suggests that different temporal sequences of oncogenic mutations can lead to tumorigenesis. Finally, we conclude that four or five oncogenic mutations are sufficient for the formation of glioma.
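
    A back-of-the-envelope sketch of the kind of multi-hit incidence calculation this abstract describes: the probability that at least one neural stem cell lineage has accumulated the required number of oncogenic mutations by a given age. The stem-cell number, division rate, mutation probability and hit threshold below are placeholders, and the fixed pool size ignores the age-dependent decline the authors model.

```python
# Illustrative multi-hit incidence curve (not the authors' implementation).
import numpy as np
from scipy.stats import binom

N_CELLS = 1e6          # neural stem cell pool size (placeholder)
DIV_PER_YEAR = 50      # divisions per stem cell per year (placeholder)
MU = 1e-5              # oncogenic mutation probability per division
K_HITS = 4             # mutations assumed necessary for tumorigenesis

ages = np.arange(0, 101)
divisions = DIV_PER_YEAR * ages
# P(one lineage has >= K_HITS mutations after n divisions)
p_cell = binom.sf(K_HITS - 1, divisions, MU)
# P(at least one of N_CELLS lineages crossed the threshold)
p_pop = 1.0 - np.exp(N_CELLS * np.log1p(-p_cell))
print(dict(zip(ages[::20].tolist(), np.round(p_pop[::20], 3).tolist())))
```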

  19. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and that apply data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  20. Global qualitative analysis of a quartic ecological model

    NARCIS (Netherlands)

    Broer, Hendrik; Gaiko, Valery A.

    2010-01-01

    In this paper, we complete the global qualitative analysis of a quartic ecological model. In particular, studying global bifurcations of singular points and limit cycles, we prove that the corresponding dynamical system has at most two limit cycles. (C) 2009 Elsevier Ltd. All rights reserved.

  1. Realizing the Living Paper using the ProvONE Model for Reproducible Research

    Science.gov (United States)

    Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.

    2015-12-01

    Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on prior work of colleagues. The open source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software giving proper credit, with less repetition, and with confidence in the relationship to original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise. The

  2. Qualitative mechanism models and the rationalization of procedures

    Science.gov (United States)

    Farley, Arthur M.

    1989-01-01

    A qualitative, cluster-based approach to the representation of hydraulic systems is described and its potential for generating and explaining procedures is demonstrated. Many ideas are formalized and implemented as part of an interactive, computer-based system. The system allows for designing, displaying, and reasoning about hydraulic systems. The interactive system has an interface consisting of three windows: a design/control window, a cluster window, and a diagnosis/plan window. A qualitative mechanism model for the ORS (Orbital Refueling System) is presented to coordinate with ongoing research on this system being conducted at NASA Ames Research Center.

  3. A discrete particle model reproducing collective dynamics of a bee swarm.

    Science.gov (United States)

    Bernardi, Sara; Colombi, Annachiara; Scianna, Marco

    2018-02-01

    In this article, we present a microscopic discrete mathematical model describing the collective dynamics of a bee swarm. More specifically, each bee is set to move according to individual strategies and social interactions, the former involving the desire to reach a target destination, the latter accounting for repulsive/attractive stimuli and for alignment processes. The insects in fact tend to remain sufficiently close to the rest of the population, while avoiding collisions, and they are able to track and synchronize their movement to the flight of a given set of neighbors within their visual field. The resulting collective behavior of the bee cloud therefore emerges from non-local short/long-range interactions. In contrast to similar approaches in the literature, we here test different alignment mechanisms (i.e., based either on a Euclidean or on a topological neighborhood metric), which also have an impact on the other social components characterizing insect behavior. A series of numerical realizations then shows the phenomenology of the swarm (in terms of pattern configuration, collective productive movement, and flight synchronization) in different regions of the space of free model parameters (i.e., strength of attractive/repulsive forces, extension of the interaction regions). In this respect, constraints on the possible variations of such coefficients are here given both by reasonable empirical observations and by analytical results on some stability characteristics of the defined pairwise interaction kernels, which have to ensure a realistic crystalline configuration of the swarm. An analysis of the effect of unconscious random fluctuations of bee dynamics is also provided. Copyright © 2018 Elsevier Ltd. All rights reserved.
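
    A boids-style sketch of the ingredients listed above (repulsion, alignment and attraction within a metric neighborhood, plus a target-seeking drive). Coefficients and interaction radii are illustrative, not the paper's calibration, and the topological-neighborhood variant is omitted.

```python
# Boids-style sketch of repulsion / alignment / attraction + target seeking.
import numpy as np

N, DT, STEPS = 100, 0.05, 200
R_REP, R_ALI, R_ATT = 1.0, 3.0, 6.0
W_REP, W_ALI, W_ATT, W_TGT = 2.0, 0.5, 0.3, 0.2
TARGET = np.array([50.0, 50.0])

rng = np.random.default_rng(1)
pos = rng.uniform(0, 10, (N, 2))
vel = rng.normal(0, 1, (N, 2))

for _ in range(STEPS):
    acc = np.zeros_like(pos)
    for i in range(N):
        d = pos - pos[i]                       # vectors to the other bees
        dist = np.linalg.norm(d, axis=1)
        rep = dist < R_REP
        rep[i] = False                         # exclude self
        ali = (dist >= R_REP) & (dist < R_ALI)
        att = (dist >= R_ALI) & (dist < R_ATT)
        if rep.any():
            acc[i] -= W_REP * d[rep].sum(axis=0)               # move apart
        if ali.any():
            acc[i] += W_ALI * (vel[ali].mean(axis=0) - vel[i])  # align
        if att.any():
            acc[i] += W_ATT * d[att].sum(axis=0)               # stay together
        acc[i] += W_TGT * (TARGET - pos[i])                    # target seeking
    vel += DT * acc
    pos += DT * vel

print("swarm centre:", pos.mean(axis=0))
```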

  4. Acute multi-sgRNA knockdown of KEOPS complex genes reproduces the microcephaly phenotype of the stable knockout zebrafish model.

    Directory of Open Access Journals (Sweden)

    Tilman Jobst-Schwan

    Full Text Available Until recently, morpholino oligonucleotides have been widely employed in zebrafish as an acute and efficient loss-of-function assay. However, off-target effects and reproducibility issues when compared to stable knockout lines have compromised their further use. Here we employed an acute CRISPR/Cas approach using multiple single guide RNAs targeting simultaneously different positions in two exemplar genes (osgep or tprkb) to increase the likelihood of generating mutations on both alleles in the injected F0 generation and to achieve a similar effect as morpholinos but with the reproducibility of stable lines. This multi single guide RNA approach resulted in median likelihoods for at least one mutation on each allele of >99% and sgRNA specific insertion/deletion profiles as revealed by deep-sequencing. Immunoblot showed a significant reduction for Osgep and Tprkb proteins. For both genes, the acute multi-sgRNA knockout recapitulated the microcephaly phenotype and reduction in survival that we observed previously in stable knockout lines, though milder in the acute multi-sgRNA knockout. Finally, we quantify the degree of mutagenesis by deep sequencing, and provide a mathematical model to quantitate the chance for a biallelic loss-of-function mutation. Our findings can be generalized to acute and stable CRISPR/Cas targeting for any zebrafish gene of interest.
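
    The "chance of a biallelic loss-of-function mutation" calculation can be sketched as below, assuming each sgRNA edits each allele independently; the per-sgRNA editing probabilities and the frameshift fraction are made-up numbers, not the paper's fitted values.

```python
# Back-of-the-envelope biallelic loss-of-function probability (toy numbers).
p_edit = [0.7, 0.6, 0.65, 0.5]   # indel probability per sgRNA, per allele
p_frameshift = 2.0 / 3.0         # fraction of indels that shift the frame

# Probability that one allele carries at least one frameshift indel.
p_allele = 1.0
for p in p_edit:
    p_allele *= 1.0 - p * p_frameshift
p_allele = 1.0 - p_allele

# Assuming the two alleles are edited independently.
p_biallelic = p_allele ** 2
print(f"P(loss-of-function on one allele) = {p_allele:.3f}")
print(f"P(biallelic loss-of-function)     = {p_biallelic:.3f}")
```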

  5. The diverse broad-band light-curves of Swift GRBs reproduced with the cannonball model

    CERN Document Server

    Dado, Shlomo; De Rújula, A

    2009-01-01

    Two radiation mechanisms, inverse Compton scattering (ICS) and synchrotron radiation (SR), suffice within the cannonball (CB) model of long gamma ray bursts (LGRBs) and X-ray flashes (XRFs) to provide a very simple and accurate description of their observed prompt emission and afterglows. Simple as they are, the two mechanisms and the burst environment generate the rich structure of the light curves at all frequencies and times. This is demonstrated for 33 selected Swift LGRBs and XRFs, which are well sampled from early time until late time and well represent the entire diversity of the broad band light curves of Swift LGRBs and XRFs. Their prompt gamma-ray and X-ray emission is dominated by ICS of glory light. During their fast decline phase, ICS is taken over by SR which dominates their broad band afterglow. The pulse shape and spectral evolution of the gamma-ray peaks and the early-time X-ray flares, and even the delayed optical `humps' in XRFs, are correctly predicted. The canonical and non-canonical X-ra...

  6. From qualitative reasoning models to Bayesian-based learner modeling

    NARCIS (Netherlands)

    Milošević, U.; Bredeweg, B.; de Kleer, J.; Forbus, K.D.

    2010-01-01

    Assessing the knowledge of a student is a fundamental part of intelligent learning environments. We present a Bayesian network based approach to dealing with uncertainty when estimating a learner’s state of knowledge in the context of Qualitative Reasoning (QR). A proposal for a global architecture

  7. Efficient and reproducible myogenic differentiation from human iPS cells: prospects for modeling Miyoshi Myopathy in vitro.

    Directory of Open Access Journals (Sweden)

    Akihito Tanaka

    Full Text Available The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70-90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs.

  8. Efficient and Reproducible Myogenic Differentiation from Human iPS Cells: Prospects for Modeling Miyoshi Myopathy In Vitro

    Science.gov (United States)

    Tanaka, Akihito; Woltjen, Knut; Miyake, Katsuya; Hotta, Akitsu; Ikeya, Makoto; Yamamoto, Takuya; Nishino, Tokiko; Shoji, Emi; Sehara-Fujisawa, Atsuko; Manabe, Yasuko; Fujii, Nobuharu; Hanaoka, Kazunori; Era, Takumi; Yamashita, Satoshi; Isobe, Ken-ichi; Kimura, En; Sakurai, Hidetoshi

    2013-01-01

    The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70–90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs. PMID:23626698

  9. TU-AB-BRC-05: Creation of a Monte Carlo TrueBeam Model by Reproducing Varian Phase Space Data

    International Nuclear Information System (INIS)

    O’Grady, K; Davis, S; Seuntjens, J

    2016-01-01

    Purpose: To create a Varian TrueBeam 6 MV FFF Monte Carlo model using BEAMnrc/EGSnrc that accurately reproduces the Varian representative dataset, followed by tuning the model’s source parameters to accurately reproduce in-house measurements. Methods: A BEAMnrc TrueBeam model for 6 MV FFF has been created by modifying a validated 6 MV Varian CL21EX model. Geometric dimensions and materials were adjusted in a trial and error approach to match the fluence and spectra of TrueBeam phase spaces output by the Varian VirtuaLinac. Once the model’s phase space matched Varian’s counterpart using the default source parameters, it was validated to match 10 × 10 cm² Varian representative data obtained with the IBA CC13. The source parameters were then tuned to match in-house 5 × 5 cm² PTW microDiamond measurements. All dose-to-water simulations included detector models to account for the effects of volume averaging and the non-water equivalence of the chamber materials, allowing for more accurate source parameter selection. Results: The Varian phase space spectra and fluence were matched with excellent agreement. The in-house model’s PDD agreement with CC13 TrueBeam representative data was within 0.9% local percent difference beyond the first 3 mm. Profile agreement at 10 cm depth was within 0.9% local percent difference and 1.3 mm distance-to-agreement in the central axis and penumbra regions, respectively. Once the source parameters were tuned, PDD agreement with microDiamond measurements was within 0.9% local percent difference beyond 2 mm. The microDiamond profile agreement at 10 cm depth was within 0.6% local percent difference and 0.4 mm distance-to-agreement in the central axis and penumbra regions, respectively. Conclusion: An accurate in-house Monte Carlo model of the Varian TrueBeam was achieved independently of the Varian phase space solution and was tuned to in-house measurements. KO acknowledges partial support by the CREATE Medical Physics Research

  10. Can CFMIP2 models reproduce the leading modes of cloud vertical structure in the CALIPSO-GOCCP observations?

    Science.gov (United States)

    Wang, Fang; Yang, Song

    2018-02-01

    Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP), i.e. the tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds in middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclonic regions and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models from phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by CFMIP2 models. The effects of CVS modes on relative cloud radiative forcing (RSCRF/RLCRF) (RSCRF being calculated at the surface and RLCRF at the top of the atmosphere) are studied by means of a principal component regression method. Results show that CFMIP2 models tend to overestimate (underestimate or simulate with the opposite sign) the RSCRF/RLCRF radiative effects (REs) of ECCM (THCM and SACM) per unit global mean OPC compared to observations. These RE biases may be attributed to two factors: underestimation (overestimation) of low/middle clouds (high clouds), equivalent to stronger (weaker) REs per unit low/middle (high) cloud, in simulated global mean cloud profiles; and eigenvector biases in the CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to the improvement of CVS, especially cloud parameterization associated with particular physical processes (e.g. downwelling regimes with the Hadley circulation, extratropical storm tracks and others), which
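
    The analysis chain named above, principal component (EOF) decomposition of cloud-fraction profiles followed by a principal component regression of cloud radiative forcing on the leading PCs, can be sketched as follows; the arrays are random placeholders for the GOCCP/CFMIP2 fields.

```python
# Sketch of PCA (EOF) of cloud profiles + principal component regression.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

n_samples, n_levels = 500, 40            # e.g. grid boxes x height levels
rng = np.random.default_rng(0)
cloud_profiles = rng.random((n_samples, n_levels))
crf = rng.normal(size=n_samples)         # relative cloud radiative forcing

pca = PCA(n_components=3)
pcs = pca.fit_transform(cloud_profiles)  # leading CVS modes (cf. THCM, SACM, ECCM)
print("explained variance:", pca.explained_variance_ratio_)

reg = LinearRegression().fit(pcs, crf)
print("radiative effect per unit PC:", reg.coef_)
```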

  11. Reproducing Electric Field Observations during Magnetic Storms by means of Rigorous 3-D Modelling and Distortion Matrix Co-estimation

    Science.gov (United States)

    Püthe, Christoph; Manoj, Chandrasekharan; Kuvshinov, Alexey

    2015-04-01

    Electric fields induced in the conducting Earth during magnetic storms drive currents in power transmission grids, telecommunication lines or buried pipelines. These geomagnetically induced currents (GIC) can cause severe service disruptions. The prediction of GIC is thus of great importance for the public and for industry. A key step in the prediction of the hazard to technological systems during magnetic storms is the calculation of the geoelectric field. To address this issue for mid-latitude regions, we developed a method that involves 3-D modelling of induction processes in a heterogeneous Earth and the construction of a model of the magnetospheric source. The latter is described by low-degree spherical harmonics; its temporal evolution is derived from observatory magnetic data. Time series of the electric field can be computed for every location on Earth's surface. The actual electric field, however, is known to be perturbed by galvanic effects, arising from very local near-surface heterogeneities or topography, which cannot be included in the conductivity model. Galvanic effects are commonly accounted for with a real-valued time-independent distortion matrix, which linearly relates measured and computed electric fields. Using data of various magnetic storms that occurred between 2000 and 2003, we estimated distortion matrices for observatory sites onshore and on the ocean bottom. Strong correlations between modelled and measured fields validate our method. The distortion matrix estimates prove to be reliable, as they are accurately reproduced for different magnetic storms. We further show that 3-D modelling is crucial for a correct separation of galvanic and inductive effects and a precise prediction of electric field time series during magnetic storms. Since the required computational resources are negligible, our approach is suitable for a real-time prediction of GIC. For this purpose, a reliable forecast of the source field, e.g. based on data from satellites
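
    A sketch of the distortion-matrix step: a real, time-independent 2x2 matrix D relating measured and modelled horizontal electric fields, E_meas(t) ≈ D E_model(t), estimated by least squares over a storm time series. The field time series here are synthetic placeholders, not observatory data.

```python
# Least-squares estimate of a real 2x2 distortion matrix D with
# E_measured(t) ≈ D @ E_modelled(t), using synthetic placeholder fields.
import numpy as np

rng = np.random.default_rng(0)
n_times = 5000
E_model = rng.normal(size=(n_times, 2))            # (Ex, Ey) modelled
D_true = np.array([[1.3, -0.2], [0.1, 0.8]])
E_meas = E_model @ D_true.T + 0.05 * rng.normal(size=(n_times, 2))

# Solve E_meas ≈ E_model @ D.T for D in the least-squares sense.
D_est, *_ = np.linalg.lstsq(E_model, E_meas, rcond=None)
print("estimated distortion matrix:\n", D_est.T)
```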

  12. A qualitatively validated mathematical-computational model of the immune response to the yellow fever vaccine.

    Science.gov (United States)

    Bonin, Carla R B; Fernandes, Guilherme C; Dos Santos, Rodrigo W; Lobosco, Marcelo

    2018-05-25

    Although a safe and effective yellow fever vaccine was developed more than 80 years ago, several issues regarding its use remain unclear. For example, what is the minimum dose that can provide immunity against the disease? A useful tool that can help researchers answer this and other related questions is a computational simulator that implements a mathematical model describing the human immune response to vaccination against yellow fever. This work uses a system of ten ordinary differential equations to represent a few important populations in the response process generated by the body after vaccination. The main populations include viruses, APCs, CD8+ T cells, short-lived and long-lived plasma cells, B cells and antibodies. In order to qualitatively validate our model, four experiments were carried out, and their computational results were compared to experimental data obtained from the literature. The four experiments were: a) simulation of a scenario in which an individual was vaccinated against yellow fever for the first time; b) simulation of a booster dose ten years after the first dose; c) simulation of the immune response to the yellow fever vaccine in individuals with different levels of naïve CD8+ T cells; and d) simulation of the immune response to distinct doses of the yellow fever vaccine. This work shows that the simulator was able to qualitatively reproduce some of the experimental results reported in the literature, such as the amount of antibodies and viremia throughout time, as well as to reproduce other behaviors of the immune response reported in the literature, such as those that occur after a booster dose of the vaccine.
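
    A much-reduced sketch (three equations instead of the paper's ten) of the style of ODE model described, tracking vaccine virus, plasma cells and antibodies; all rate constants are illustrative rather than fitted values.

```python
# Reduced toy version of the vaccination-response ODE system (3 equations).
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, r=1.0, kn=2.0, a=0.5, dp=0.1, s=3.0, da=0.05):
    V, P, A = y
    dVdt = r * V - kn * A * V            # viral replication vs. neutralization
    dPdt = a * V - dp * P                # plasma-cell induction and decay
    dAdt = s * P - da * A - kn * A * V   # antibody secretion and consumption
    return [dVdt, dPdt, dAdt]

sol = solve_ivp(rhs, (0, 60), [1e-3, 0.0, 0.0], dense_output=True, max_step=0.1)
t = np.linspace(0, 60, 300)
V, P, A = sol.sol(t)
print("peak viremia day:", float(t[np.argmax(V)]))
print("final antibody level:", float(A[-1]))
```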

  13. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle's Capacity to Generate Force.

    Science.gov (United States)

    Call, Jarrod A; Lowe, Dawn A

    2016-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allows researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury.

  14. Recruiting Transcultural Qualitative Research Participants: A Conceptual Model

    Directory of Open Access Journals (Sweden)

    Phyllis Eide

    2005-06-01

    Full Text Available Working with diverse populations poses many challenges to the qualitative researcher who is a member of the dominant culture. Traditional methods of recruitment and selection (such as flyers and advertisements) are often unproductive, leading to missed contributions from potential participants who were not recruited and researcher frustration. In this article, the authors explore recruitment issues related to the concept of personal knowing based on experiences with Aboriginal Hawai'ian and Micronesian populations, wherein knowing and being known are crucial to successful recruitment of participants. They present a conceptual model that incorporates key concepts of knowing the other, cultural context, and trust to guide other qualitative transcultural researchers. They also describe challenges, implications, and concrete suggestions for recruitment of participants.

  15. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    Directory of Open Access Journals (Sweden)

    Sonal eGoswami

    2012-06-01

    Full Text Available Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e. that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze. Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma, PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  16. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome.

    Science.gov (United States)

    Goswami, Sonal; Samuel, Sherin; Sierra, Olga R; Cascardi, Michele; Paré, Denis

    2012-01-01

    Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze (EPM). Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  17. Reproducing the organic matter model of anthropogenic dark earth of Amazonia and testing the ecotoxicity of functionalized charcoal compounds

    Directory of Open Access Journals (Sweden)

    Carolina Rodrigues Linhares

    2012-05-01

    Full Text Available The objective of this work was to obtain organic compounds similar to the ones found in the organic matter of anthropogenic dark earth of Amazonia (ADE) using a chemical functionalization procedure on activated charcoal, as well as to determine their ecotoxicity. Based on the study of the organic matter from ADE, an organic model was proposed and an attempt to reproduce it was described. Activated charcoal was oxidized with the use of sodium hypochlorite at different concentrations. Nuclear magnetic resonance was performed to verify if the spectra of the obtained products were similar to the ones of humic acids from ADE. The similarity between spectra indicated that the obtained products were polycondensed aromatic structures with carboxyl groups: a soil amendment that can contribute to soil fertility and to its sustainable use. An ecotoxicological test with Daphnia similis was performed on the more soluble fraction (fulvic acids) of the produced soil amendment. Aryl chloride was formed during the synthesis of the organic compounds from activated charcoal functionalization and partially removed through a purification process. However, it is probable that some aryl chloride remained in the final product, since the ecotoxicological test indicated that the chemically functionalized soil amendment is moderately toxic.

  18. Coupled RipCAS-DFLOW (CoRD) Software and Data Management System for Reproducible Floodplain Vegetation Succession Modeling

    Science.gov (United States)

    Turner, M. A.; Miller, S.; Gregory, A.; Cadol, D. D.; Stone, M. C.; Sheneman, L.

    2016-12-01

    We present the Coupled RipCAS-DFLOW (CoRD) modeling system created to encapsulate the workflow to analyze the effects of stream flooding on vegetation succession. CoRD provides an intuitive command-line and web interface to run DFLOW and RipCAS in succession over many years automatically, which is a challenge because, for our application, DFLOW must be run on a supercomputing cluster via the PBS job scheduler. RipCAS is a vegetation succession model, and DFLOW is a 2D open channel flow model. Data adaptors have been developed to seamlessly convert DFLOW output data into RipCAS inputs, and vice versa. CoRD provides automated statistical analysis and visualization, plus automatic syncing of input and output files and model run metadata to the hydrological data management system HydroShare using its Python REST client. This combination of technologies and data management techniques allows the results to be shared with collaborators and eventually published. Perhaps most importantly, it allows results to be easily reproduced via either the command-line or web user interface. This system is a result of collaboration between software developers and hydrologists participating in the Western Consortium for Watershed Analysis, Visualization, and Exploration (WC-WAVE). Because of the computing-intensive nature of this particular workflow, including automating job submission/monitoring and data adaptors, software engineering expertise is required. However, the hydrologists provide the software developers with a purpose and ensure a useful, intuitive tool is developed. Our hydrologists contribute software, too: RipCAS was developed from scratch by hydrologists on the team as a specialized, open-source version of the Computer Aided Simulation Model for Instream Flow and Riparia (CASiMiR) vegetation model; our hydrologists running DFLOW provided numerous examples and help with the supercomputing system. This project is written in Python, a popular language in the
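
    To make the coupled workflow described above concrete, the following is a minimal, hypothetical sketch of a CoRD-style yearly loop in Python (the record notes the project is written in Python). The function names, file names, and the stubbed model calls are assumptions for illustration; CoRD's actual command-line interface, PBS job submission, and HydroShare REST client are not reproduced here.

    import json
    from pathlib import Path

    def run_year(year: int, workdir: Path) -> dict:
        """One DFLOW -> adaptor -> RipCAS cycle; returns run metadata."""
        hydro_out = workdir / f"dflow_{year}.nc"     # hypothetical DFLOW output file
        veg_in = workdir / f"ripcas_in_{year}.asc"   # hypothetical RipCAS input file
        # 1. Submit the hydraulic model run (on a cluster this would be a PBS job).
        print(f"submitting DFLOW job for {year} -> {hydro_out.name}")
        hydro_out.write_text("stub DFLOW result\n")
        # 2. Data adaptor: translate DFLOW output into RipCAS input (stubbed).
        veg_in.write_text(f"converted from {hydro_out.name}\n")
        # 3. Run the vegetation succession model (stubbed).
        print(f"running RipCAS for {year} on {veg_in.name}")
        # 4. Record metadata so the run can be archived (e.g. on HydroShare) and reproduced.
        return {"year": year, "dflow_output": hydro_out.name, "ripcas_input": veg_in.name}

    if __name__ == "__main__":
        workdir = Path("cord_demo")
        workdir.mkdir(exist_ok=True)
        runs = [run_year(y, workdir) for y in range(2000, 2005)]
        (workdir / "run_metadata.json").write_text(json.dumps(runs, indent=2))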

  19. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  20. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models

  1. A computational model for histone mark propagation reproduces the distribution of heterochromatin in different human cell types.

    Science.gov (United States)

    Schwämmle, Veit; Jensen, Ole Nørregaard

    2013-01-01

    Chromatin is a highly compact and dynamic nuclear structure that consists of DNA and associated proteins. The main organizational unit is the nucleosome, which consists of a histone octamer with DNA wrapped around it. Histone proteins are implicated in the regulation of eukaryote genes and they carry numerous reversible post-translational modifications that control DNA-protein interactions and the recruitment of chromatin binding proteins. Heterochromatin, the transcriptionally inactive part of the genome, is densely packed and contains histone H3 that is methylated at Lys 9 (H3K9me). The propagation of H3K9me in nucleosomes along the DNA in chromatin is antagonized by methylation of H3 Lysine 4 (H3K4me) and acetylation of several lysines, which are related to euchromatin and active genes. We show that the related histone modifications form antagonized domains on a coarse scale. These histone marks are assumed to be initiated within distinct nucleation sites in the DNA and to propagate bi-directionally. We propose a simple computer model that simulates the distribution of heterochromatin in human chromosomes. The simulations are in agreement with previously reported experimental observations from two different human cell lines. We reproduced different types of barriers between heterochromatin and euchromatin, providing a unified model for their function. The effects of changes in the nucleation site distribution and in the propagation rates were studied. The former occurs mainly with the aim of (de-)activation of single genes or gene groups and the latter has the power of controlling the transcriptional programs of entire chromosomes. Generally, the regulatory program of gene transcription is controlled by the distribution of nucleation sites along the DNA string.
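
    The core mechanism described above (marks seeded at nucleation sites, spreading bi-directionally, and antagonizing each other) can be illustrated with a toy one-dimensional simulation. The sketch below uses parameters and an update rule of my own choosing; it is a simplified illustration, not the published model.

    import numpy as np

    rng = np.random.default_rng(0)
    N_NUC = 2000            # nucleosomes along the fibre
    P_SPREAD = 0.4          # chance per attempt that a mark copies to a neighbour
    STEPS = 500

    state = np.zeros(N_NUC, dtype=int)   # -1 = H3K9me-like, +1 = H3K4me/acetyl-like, 0 = unmarked
    hetero_sites = rng.choice(N_NUC, 20, replace=False)   # assumed nucleation sites
    eu_sites = rng.choice(N_NUC, 20, replace=False)

    for _ in range(STEPS):
        state[hetero_sites] = -1          # nucleation sites re-seed their mark each step
        state[eu_sites] = 1
        src = rng.integers(0, N_NUC, size=N_NUC // 10)      # random candidate source nucleosomes
        step = rng.choice([-1, 1], size=src.size)            # spread left or right
        dst = np.clip(src + step, 0, N_NUC - 1)
        spread = (state[src] != 0) & (rng.random(src.size) < P_SPREAD)
        free = state[dst] == 0            # antagonism: only unmarked neighbours are converted
        ok = spread & free
        state[dst[ok]] = state[src[ok]]

    print(f"fraction of heterochromatin-like nucleosomes: {np.mean(state == -1):.2f}")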

  2. Reproducing multi-model ensemble average with Ensemble-averaged Reconstructed Forcings (ERF) in regional climate modeling

    Science.gov (United States)

    Erfanian, A.; Fomenko, L.; Wang, G.

    2016-12-01

    The multi-model ensemble (MME) average is considered the most reliable approach for simulating both present-day and future climates. It has been a primary reference for drawing conclusions in major coordinated studies such as the IPCC Assessment Reports and CORDEX. The biases of individual models cancel each other out in the MME average, enabling the ensemble mean to outperform individual members in simulating the mean climate. This enhancement, however, comes with a tremendous computational cost, which is especially inhibiting for regional climate modeling, as model uncertainties can originate from both the RCMs and the driving GCMs. Here we propose the Ensemble-based Reconstructed Forcings (ERF) approach to regional climate modeling that achieves a similar level of bias reduction at a fraction of the cost compared with the conventional MME approach. The new method constructs a single set of initial and boundary conditions (IBCs) by averaging the IBCs of multiple GCMs, and drives the RCM with this ensemble average of IBCs to conduct a single run. Using a regional climate model (RegCM4.3.4-CLM4.5), we tested the method over West Africa for multiple combinations of (up to six) GCMs. Our results indicate that the performance of the ERF method is comparable to that of the MME average in simulating the mean climate. The bias reduction seen in ERF simulations is achieved by using more realistic IBCs in solving the system of equations underlying the RCM physics and dynamics. This endows the new method with a theoretical advantage in addition to reducing computational cost. The ERF output is an unaltered solution of the RCM as opposed to a climate state that might not be physically plausible due to the averaging of multiple solutions with the conventional MME approach. The ERF approach should be considered for use in major international efforts such as CORDEX. Key words: Multi-model ensemble, ensemble analysis, ERF, regional climate modeling
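
    The central operation of the ERF approach, averaging the driving fields across GCMs before running a single RCM simulation, is simple to sketch. The code below uses synthetic arrays in place of real GCM lateral-boundary files; the array shapes and variable names are assumptions made only for illustration.

    import numpy as np

    n_gcms, n_times, n_lev, n_boundary_pts = 6, 8, 18, 120
    rng = np.random.default_rng(42)

    # One temperature field per GCM on a common boundary grid: (time, level, point).
    gcm_fields = [280.0 + rng.normal(0.0, 2.0, (n_times, n_lev, n_boundary_pts))
                  for _ in range(n_gcms)]

    # ERF: a single set of boundary conditions is the plain multi-GCM mean,
    # computed before the RCM is run, so only one RCM simulation is needed.
    erf_boundary = np.mean(np.stack(gcm_fields), axis=0)

    print(erf_boundary.shape)           # (8, 18, 120): one driving dataset
    print(float(erf_boundary.mean()))   # close to the 280 K ensemble centre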

  3. Reproducibility of image quality for moving objects using respiratory-gated computed tomography. A study using a phantom model

    International Nuclear Information System (INIS)

    Fukumitsu, Nobuyoshi; Ishida, Masaya; Terunuma, Toshiyuki

    2012-01-01

    Investigating the reproducibility of computed tomography (CT) image quality in respiratory-gated radiation treatment planning is essential in the radiotherapy of movable tumors. Seven series of regular and six series of irregular respiratory motions were performed using a thorax dynamic phantom. For the regular respiratory motions, the respiratory cycle was changed from 2.5 to 4 s and the amplitude was changed from 4 to 10 mm. For the irregular respiratory motions, a cycle of 2.5 to 4 s or an amplitude of 4 to 10 mm was added to the base data (i.e., 3.5-s cycle, 6-mm amplitude) every three cycles. Images of the object were acquired six times using respiratory-gated data acquisition. The volume of the object was calculated, and the reproducibility of the volume was assessed based on its variability. The registered images of the object were added, and the reproducibility of the shape was assessed based on the degree of overlap of the objects. The variability in volume and shape differed significantly as the respiratory cycle changed in the regular respiratory motions. In irregular respiratory motion, shape reproducibility was even poorer, and the percentage of overlap among the six images was 35.26% in the 2.5- and 3.5-s cycle mixed group. Amplitude changes did not produce significant differences in the variability of the volumes and shapes. Respiratory cycle changes reduced the reproducibility of the image quality in respiratory-gated CT. (author)
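
    The two reproducibility measures used above, variability of the segmented volume across acquisitions and the degree of overlap among registered images, can be sketched as follows. The synthetic spherical masks, voxel size, and the particular overlap definition (voxels common to all six images divided by voxels present in any of them) are assumptions for illustration, not the study's exact procedure.

    import numpy as np

    def sphere_mask(shape, centre, radius):
        z, y, x = np.indices(shape)
        return (z - centre[0])**2 + (y - centre[1])**2 + (x - centre[2])**2 <= radius**2

    shape = (64, 64, 64)
    voxel_volume = 1.5 * 0.8 * 0.8          # mm^3 per voxel (assumed reconstruction grid)
    rng = np.random.default_rng(1)

    # Six acquisitions of the same object with small positional jitter between scans.
    masks = [sphere_mask(shape, (32 + rng.normal(0, 0.8), 32, 32), 5.0) for _ in range(6)]

    volumes = np.array([m.sum() * voxel_volume for m in masks])
    cv = volumes.std(ddof=1) / volumes.mean() * 100.0                     # volume variability (%)
    overlap = np.logical_and.reduce(masks).sum() / np.logical_or.reduce(masks).sum() * 100.0

    print(f"volume CV: {cv:.1f} %   voxels common to all six images: {overlap:.1f} %")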

  4. Universal free school breakfast: a qualitative model for breakfast behaviors

    Directory of Open Access Journals (Sweden)

    Louise eHarvey-Golding

    2015-06-01

    Full Text Available In recent years, the provision of school breakfast has increased significantly in the UK. However, research examining the effectiveness of school breakfast is still in its relative infancy, and findings to date have been rather mixed. Moreover, previous evaluations of school breakfast schemes have been predominantly quantitative in their methodologies. Presently there are few qualitative studies examining the subjective perceptions and experiences of stakeholders, and thereby an absence of knowledge regarding the sociocultural impacts of school breakfast. The purpose of this study was to investigate the beliefs, views and attitudes, and breakfast consumption behaviors, among key stakeholders, served by a council-wide universal free school breakfast initiative, within the North West of England, UK. A sample of children, parents and school staff were recruited from three primary schools, participating in the universal free school breakfast scheme, to partake in semi-structured interviews and small focus groups. A Grounded Theory analysis of the data collected identified a theoretical model of breakfast behaviors, underpinned by the subjective perceptions and experiences of these key stakeholders. The model comprises three domains relating to breakfast behaviors, and the internal and external factors that are perceived to influence breakfast behaviors, among children, parents and school staff. Findings were validated using triangulation methods, member checks and inter-rater reliability measures. In presenting this theoretically grounded model for breakfast behaviors, this paper provides a unique qualitative insight into the breakfast consumption behaviors and barriers to breakfast consumption, within a socioeconomically deprived community, participating in a universal free school breakfast intervention program.

  5. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Science.gov (United States)

    Sanchez-Gomez, Emilia; Somot, S.; Déqué, M.

    2009-10-01

    One of the main concerns in regional climate modeling is the extent to which limited-area regional climate models (RCMs) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the time period 1961-2000. Two sets of experiments have been completed with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the weather regime behavior in terms of composite pattern, mean frequency of occurrence and persistence reasonably well. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models, which is stronger in summer than in winter. This spread is due to two reasons: (1) we are dealing with different models and (2) each RCM produces its own internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At the daily time scale, the model spread also has a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large scales of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not significantly affect the model performance for the large-scale circulation.

  6. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Energy Technology Data Exchange (ETDEWEB)

    Somot, S.; Deque, M. [Meteo-France CNRM/GMGEC CNRS/GAME, Toulouse (France); Sanchez-Gomez, Emilia

    2009-10-15

    One of the main concerns in regional climate modeling is the extent to which limited-area regional climate models (RCMs) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the time period 1961-2000. Two sets of experiments have been completed with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the weather regime behavior in terms of composite pattern, mean frequency of occurrence and persistence reasonably well. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models, which is stronger in summer than in winter. This spread is due to two reasons: (1) we are dealing with different models and (2) each RCM produces its own internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At the daily time scale, the model spread also has a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large scales of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not significantly affect the model performance for the large-scale circulation. (orig.)
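
    Two of the regime diagnostics mentioned in the preceding two records, the mean frequency of occurrence and the persistence of each weather regime, are easy to compute once each day has been assigned a regime label. The sketch below uses a synthetic random label sequence and an assumed set of four regime names; it does not reproduce the regime classification itself.

    import numpy as np

    rng = np.random.default_rng(0)
    regimes = ["NAO+", "NAO-", "Blocking", "Atlantic Ridge"]    # assumed regime names
    daily = rng.choice(len(regimes), size=365 * 40, p=[0.3, 0.25, 0.25, 0.2])

    def run_lengths(labels, k):
        """Lengths of consecutive runs of regime k."""
        is_k = np.concatenate(([0], (labels == k).astype(int), [0]))
        edges = np.flatnonzero(np.diff(is_k))
        starts, ends = edges[::2], edges[1::2]
        return ends - starts

    for k, name in enumerate(regimes):
        freq = np.mean(daily == k) * 100.0
        persistence = run_lengths(daily, k).mean()
        print(f"{name:15s} frequency {freq:4.1f} %   mean persistence {persistence:.1f} days")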

  7. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination or reproducibility of results is very poor. On the other hand, a defect can be detected on each of the subsequent examinations, for higher reliability, and still have poor reproducibility of results.

  8. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise reproducible computation is possible, whether computational research in DOE should improve its publication process, and whether reproducible results can be achieved apart from the peer review process.

  9. Measurement of cerebral blood flow by intravenous xenon-133 technique and a mobile system. Reproducibility using the Obrist model compared to total curve analysis

    DEFF Research Database (Denmark)

    Schroeder, T; Holstein, P; Lassen, N A

    1986-01-01

    and side-to-side asymmetry. Data were analysed according to the Obrist model and the results compared with those obtained using a model correcting for the air passage artifact. Reproducibility was of the same order of magnitude as reported using stationary equipment. The side-to-side CBF asymmetry...... was considerably more reproducible than CBF level. Using a single detector instead of five regional values averaged as the hemispheric flow increased standard deviation of CBF level by 10-20%, while the variation in asymmetry was doubled. In optimal measuring conditions the two models revealed no significant...... differences, but in low flow situations the artifact model yielded significantly more stable results. The present apparatus, equipped with 3-5 detectors covering each hemisphere, offers the opportunity of performing serial CBF measurements in situations not otherwise feasible....

  10. Development of a Three-Dimensional Hand Model Using Three-Dimensional Stereophotogrammetry: Assessment of Image Reproducibility.

    Directory of Open Access Journals (Sweden)

    Inge A Hoevenaren

    Full Text Available Using three-dimensional (3D) stereophotogrammetry, precise images and reconstructions of the human body can be produced. Over the last few years, this technique has mainly been developed in the field of maxillofacial reconstructive surgery, creating fusion images with computed tomography (CT) data for precise planning and prediction of treatment outcome. However, in hand surgery 3D stereophotogrammetry is not yet used in clinical settings. A total of 34 three-dimensional hand photographs were analyzed to investigate the reproducibility. For every individual, 3D photographs were captured at two different time points (baseline, T0, and one week later, T1). Using two different registration methods, the reproducibility of the methods was analyzed. Furthermore, the differences between 3D photos of men and women were compared in a distance map as a first clinical pilot testing our registration method. The absolute mean registration error for the complete hand was 1.46 mm. This reduced to an error of 0.56 mm when the region was isolated to the palm of the hand. When comparing hands of both sexes, it was seen that the male hand was larger (broader base and longer fingers) than the female hand. This study shows that 3D stereophotogrammetry can produce reproducible images of the hand without harmful side effects for the patient, so proving to be a reliable method for soft tissue analysis. Its potential use in everyday practice of hand surgery needs to be further explored.
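
    The absolute mean registration error reported above is, in essence, an average point-to-surface distance between two registered 3D photographs. The sketch below computes a mean nearest-neighbour distance between two synthetic point clouds; the noise level and point counts are assumptions, and this is not the registration method used in the study.

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(3)
    reference = rng.uniform(0.0, 100.0, size=(5000, 3))                  # T0 surface points (mm)
    follow_up = reference + rng.normal(0.0, 1.0, size=reference.shape)   # T1 surface after registration

    tree = cKDTree(reference)
    distances, _ = tree.query(follow_up)   # nearest reference point for each follow-up point
    print(f"absolute mean registration error: {distances.mean():.2f} mm")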

  11. Learning about Ecological Systems by Constructing Qualitative Models with DynaLearn

    Science.gov (United States)

    Leiba, Moshe; Zuzovsky, Ruth; Mioduser, David; Benayahu, Yehuda; Nachmias, Rafi

    2012-01-01

    A qualitative model of a system is an abstraction that captures ordinal knowledge and predicts the set of qualitatively possible behaviours of the system, given a qualitative description of its structure and initial state. This paper examines an innovative approach to science education using an interactive learning environment that supports…

  12. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, have been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  13. Reproducibility study of [¹⁸F]FPP(RGD)₂ uptake in murine models of human tumor xenografts

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Edwin; Liu, Shuangdong; Chin, Frederick; Cheng, Zhen [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Gowrishankar, Gayatri; Yaghoubi, Shahriar [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Wedgeworth, James Patrick [Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Berndorff, Dietmar; Gekeler, Volker [Bayer Schering Pharma AG, Global Drug Discovery, Berlin (Germany); Gambhir, Sanjiv S. [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Canary Center at Stanford for Cancer Early Detection, Nuclear Medicine, Departments of Radiology and Bioengineering, Molecular Imaging Program at Stanford, Stanford, CA (United States)

    2011-04-15

    An ¹⁸F-labeled PEGylated arginine-glycine-aspartic acid (RGD) dimer [¹⁸F]FPP(RGD)₂ has been used to image tumor αvβ₃ integrin levels in preclinical and clinical studies. Serial positron emission tomography (PET) studies may be useful for monitoring antiangiogenic therapy response or for drug screening; however, the reproducibility of serial scans has not been determined for this PET probe. The purpose of this study was to determine the reproducibility of the integrin αvβ₃-targeted PET probe, [¹⁸F]FPP(RGD)₂ using small animal PET. Human HCT116 colon cancer xenografts were implanted into nude mice (n = 12) in the breast and scapular region and grown to mean diameters of 5-15 mm for approximately 2.5 weeks. A 3-min acquisition was performed on a small animal PET scanner approximately 1 h after administration of [¹⁸F]FPP(RGD)₂ (1.9-3.8 MBq, 50-100 μCi) via the tail vein. A second small animal PET scan was performed approximately 6 h later after reinjection of the probe to assess for reproducibility. Images were analyzed by drawing an ellipsoidal region of interest (ROI) around the tumor xenograft activity. Percentage injected dose per gram (%ID/g) values were calculated from the mean or maximum activity in the ROIs. Coefficients of variation and differences in %ID/g values between studies from the same day were calculated to determine the reproducibility. The coefficient of variation (mean ± SD) for %IDmean/g and %IDmax/g values between [¹⁸F]FPP(RGD)₂ small animal PET scans performed 6 h apart on the same day were 11.1 ± 7.6% and 10.4 ± 9.3%, respectively. The corresponding differences in %IDmean/g and %IDmax/g values between scans were -0.025 ± 0.067 and -0.039 ± 0.426. Immunofluorescence studies revealed a direct relationship between extent of αvβ₃ integrin expression in tumors and tumor vasculature
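
    The test-retest statistics reported above are straightforward to compute from paired scan values. The sketch below generates synthetic %ID/g values for two same-day scans and derives the per-tumour coefficient of variation and the scan-to-scan difference; the numbers are invented for illustration and are not the study data.

    import numpy as np

    rng = np.random.default_rng(7)
    scan1 = rng.uniform(2.0, 5.0, size=12)                  # %ID/g, first scan of each tumour
    scan2 = scan1 * (1.0 + rng.normal(0.0, 0.1, size=12))   # ~10% assumed test-retest noise

    pairs = np.stack([scan1, scan2])                        # shape (2, n_tumours)
    cv_per_tumour = pairs.std(axis=0, ddof=1) / pairs.mean(axis=0) * 100.0
    diff = scan2 - scan1

    print(f"mean CV: {cv_per_tumour.mean():.1f} +/- {cv_per_tumour.std(ddof=1):.1f} %")
    print(f"mean scan-to-scan difference: {diff.mean():+.3f} %ID/g")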

  14. 3D-modeling of the spine using EOS imaging system: Inter-reader reproducibility and reliability.

    Directory of Open Access Journals (Sweden)

    Johannes Rehm

    Full Text Available To retrospectively assess the interreader reproducibility and reliability of EOS 3D full spine reconstructions in patients with adolescent idiopathic scoliosis (AIS). 73 patients with a mean age of 17 years and moderate AIS (median Cobb angle 18.2°) obtained low-dose standing biplanar radiographs with EOS. Two independent readers performed "full spine" 3D reconstructions of the spine with the "full-spine" method, adjusting the bone contour of every thoracic and lumbar vertebra (Th1-L5). Interreader reproducibility was assessed regarding rotation of every single vertebra in the coronal (i.e., frontal), sagittal (i.e., lateral), and axial planes, T1/T12 kyphosis, T4/T12 kyphosis, L1/L5 lordosis, L1/S1 lordosis and pelvic parameters. Radiation exposure, scan time and 3D reconstruction time were recorded. Intraclass correlation (ICC) ranged between 0.83 and 0.98 for frontal vertebral rotation, between 0.94 and 0.99 for lateral vertebral rotation and between 0.51 and 0.88 for axial vertebral rotation. ICC was 0.92 for T1/T12 kyphosis, 0.95 for T4/T12 kyphosis, 0.90 for L1/L5 lordosis, 0.85 for L1/S1 lordosis, 0.97 for pelvic incidence, 0.96 for sacral slope, 0.98 for sagittal pelvic tilt and 0.94 for lateral pelvic tilt. The mean time for reconstruction was 14.9 minutes (reader 1: 14.6 minutes, reader 2: 15.2 minutes, p<0.0001). The mean total absorbed dose was 593.4 μGy ± 212.3 per patient. EOS "full spine" 3D angle measurement of vertebral rotation proved to be reliable and was performed in an acceptable reconstruction time. Interreader reproducibility of axial rotation was limited to some degree in the upper and middle thoracic spine due to the obtuse angulation of the pedicles and the processi spinosi in the frontal view, which somewhat complicates their delineation.
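
    An intraclass correlation of the kind reported above can be sketched as follows. For brevity this uses the one-way random-effects ICC(1,1) formula and synthetic angle measurements from two readers; the published study may have used a different ICC variant, so treat the choice of formula as an assumption.

    import numpy as np

    def icc_oneway(ratings):
        """One-way random-effects ICC(1,1); ratings has shape (n_subjects, k_readers)."""
        n, k = ratings.shape
        grand_mean = ratings.mean()
        subject_means = ratings.mean(axis=1)
        ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
        ms_within = np.sum((ratings - subject_means[:, None]) ** 2) / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    rng = np.random.default_rng(11)
    true_angle = rng.uniform(5.0, 40.0, size=73)     # one "true" angle per patient (synthetic)
    readers = np.stack([true_angle + rng.normal(0.0, 1.5, 73) for _ in range(2)], axis=1)
    print(f"ICC(1,1) between the two readers: {icc_oneway(readers):.2f}")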

  15. Respiratory-Gated Helical Computed Tomography of Lung: Reproducibility of Small Volumes in an Ex Vivo Model

    International Nuclear Information System (INIS)

    Biederer, Juergen; Dinkel, Julien; Bolte, Hendrik; Welzel, Thomas; Hoffmann, Beata M.Sc.; Thierfelder, Carsten; Mende, Ulrich; Debus, Juergen; Heller, Martin; Kauczor, Hans-Ulrich

    2007-01-01

    Purpose: Motion-adapted radiotherapy with gated irradiation or tracking of tumor positions requires dedicated imaging techniques such as four-dimensional (4D) helical computed tomography (CT) for patient selection and treatment planning. The objective was to evaluate the reproducibility of spatial information for small objects on respiratory-gated 4D helical CT using computer-assisted volumetry of lung nodules in a ventilated ex vivo system. Methods and Materials: Five porcine lungs were inflated inside a chest phantom and prepared with 55 artificial nodules (mean diameter, 8.4 mm ± 1.8). The lungs were respirated by a flexible diaphragm and scanned with 40-row detector CT (collimation, 24 x 1.2 mm; pitch, 0.1; rotation time, 1 s; slice thickness, 1.5 mm; increment, 0.8 mm). The 4D-CT scans acquired during respiration (eight per minute) and reconstructed at 0-100% inspiration and equivalent static scans were scored for motion-related artifacts (0 or absent to 3 or relevant). The reproducibility of nodule volumetry (three readers) was assessed using the variation coefficient (VC). Results: The mean volumes from the static and dynamic inspiratory scans were equal (364.9 and 360.8 mm³, respectively, p = 0.24). The static and dynamic end-expiratory volumes were slightly greater (371.9 and 369.7 mm³, respectively, p = 0.019). The VC for volumetry (static) was 3.1%, with no significant difference between 20 apical and 20 caudal nodules (2.6% and 3.5%, p = 0.25). In dynamic scans, the VC was greater (3.9%, p = 0.004; apical and caudal, 2.6% and 4.9%; p = 0.004), with a significant difference between static and dynamic in the 20 caudal nodules (3.5% and 4.9%, p = 0.015). This was consistent with greater motion-related artifacts and image noise at the diaphragm (p < 0.05). The VC for interobserver variability was 0.6%. Conclusion: Residual motion-related artifacts had only minimal influence on volumetry of small solid lesions. This indicates a high reproducibility of

  16. Qualitative modeling of the decision-making process using electrooculography.

    Science.gov (United States)

    Zargari Marandi, Ramtin; Sabzpoushan, S H

    2015-12-01

    A novel method based on electrooculography (EOG) has been introduced in this work to study the decision-making process. An experiment was designed and implemented wherein subjects were asked to choose between two items from the same category that were presented within a limited time. The EOG and voice signals of the subjects were recorded during the experiment. A calibration task was performed to map the EOG signals to their corresponding gaze positions on the screen by using an artificial neural network. To analyze the data, 16 parameters were extracted from the response time and EOG signals of the subjects. Evaluation and comparison of the parameters, together with subjects' choices, revealed functional information. On the basis of this information, subjects switched their eye gazes between items about three times on average. We also found, according to statistical hypothesis testing (a t test: t(10) = 71.62, SE = 1.25, p < .0001), that the correspondence rate of a subject's gaze at the moment of selection with the selected item was significant. Ultimately, on the basis of these results, we propose a qualitative choice model for the decision-making task.

  17. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.
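
    The idea behind the example, computing a result and archiving descriptive metadata next to it so the run can be reproduced later, can be sketched in a few lines. Note that this sketch deliberately uses only NumPy and the standard library; it does not call the actual magni.reproducibility API.

    import json
    import platform
    from datetime import datetime, timezone

    import numpy as np

    def mandelbrot(width=200, height=200, max_iter=50):
        """Escape-time counts for the Mandelbrot set on a small grid."""
        x = np.linspace(-2.0, 1.0, width)
        y = np.linspace(-1.5, 1.5, height)
        c = x[None, :] + 1j * y[:, None]
        z = np.zeros_like(c)
        escape = np.zeros(c.shape, dtype=int)
        for i in range(max_iter):
            mask = np.abs(z) <= 2.0
            z[mask] = z[mask] ** 2 + c[mask]
            escape[mask] = i
        return escape

    result = mandelbrot()
    np.save("mandelbrot.npy", result)

    metadata = {
        "created": datetime.now(timezone.utc).isoformat(),
        "python": platform.python_version(),
        "numpy": np.__version__,
        "parameters": {"width": 200, "height": 200, "max_iter": 50},
    }
    with open("mandelbrot_metadata.json", "w") as fh:
        json.dump(metadata, fh, indent=2)
    print("stored result and metadata for later reproduction")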

  18. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  19. Environmental Consequences of Wildlife Tourism: The Use of Formalised Qualitative Models

    Directory of Open Access Journals (Sweden)

    Veselý Štěpán

    2015-09-01

    Full Text Available The paper presents a simple qualitative model of environmental consequences of wildlife tourism. Qualitative models use just three values: Positive/Increasing, Zero/Constant and Negative/Decreasing. Such quantifiers of trends are the least information intensive. Qualitative models can be useful, since models of wildlife tourism include such variables as, for example, Biodiversity (BIO), Animals' habituation to tourists (HAB) or Plant composition change (PLA) that are sometimes difficult or costly to quantify. Hence, a significant fraction of available information about wildlife tourism and its consequences is not of a numerical nature: for example, if HAB is increasing then BIO is decreasing. Such equationless relations are studied in this paper. The model has 10 variables and 20 equationless pairwise interrelations among them. The model is solved and 15 solutions, that is, scenarios, are obtained. All qualitative states of all variables, including the first and second qualitative derivatives with respect to time, are specified for each scenario.
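
    Equationless, trend-only reasoning of this kind can be sketched by enumerating assignments of {+, 0, -} to the variables and discarding those that violate the pairwise rules. The variables and rules below are a reduced, hypothetical subset chosen for illustration; the paper's full model has 10 variables, 20 interrelations, and also tracks qualitative derivatives, none of which is reproduced here.

    from itertools import product

    TRENDS = ("+", "0", "-")
    variables = ["TOURISTS", "HAB", "BIO", "PLA"]

    # Each rule reads: if the first variable has the given trend, the second must match.
    rules = [
        ("TOURISTS", "+", "HAB", "+"),   # more tourists -> animals habituate
        ("HAB", "+", "BIO", "-"),        # habituation increasing -> biodiversity decreasing
        ("TOURISTS", "+", "PLA", "+"),   # more tourists -> plant composition changes
    ]

    def consistent(assignment):
        return all(assignment[a] != ta or assignment[b] == tb for a, ta, b, tb in rules)

    scenarios = [dict(zip(variables, combo))
                 for combo in product(TRENDS, repeat=len(variables))
                 if consistent(dict(zip(variables, combo)))]

    print(f"{len(scenarios)} qualitative scenarios satisfy the rules")
    for s in scenarios[:5]:
        print(s)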

  20. A qualitative reasoning model of algal bloom in the Danube Delta Biosphere Reserve (DDBR)

    NARCIS (Netherlands)

    Cioaca, E.; Linnebank, F.E.; Bredeweg, B.; Salles, P.

    2009-01-01

    This paper presents a Qualitative Reasoning model of the algal bloom phenomenon and its effects in the Danube Delta Biosphere Reserve (DDBR) in Romania. Qualitative Reasoning models represent processes and their cause-effect relationships in a flexible and conceptually rich manner and as such can be

  1. Reproducibility of Carbon and Water Cycle by an Ecosystem Process Based Model Using a Weather Generator and Effect of Temporal Concentration of Precipitation on Model Outputs

    Science.gov (United States)

    Miyauchi, T.; Machimura, T.

    2014-12-01

    GCM output is generally used to produce input weather data for the simulation of the carbon and water cycle by ecosystem process based models under climate change; however, its temporal resolution is sometimes incompatible with model requirements. A weather generator (WG) is used for temporal downscaling of input weather data for models, where the effect of WG algorithms on the reproducibility of ecosystem model outputs must be assessed. In this study, the carbon and water cycles simulated by the Biome-BGC model using weather data measured and generated by the CLIMGEN weather generator were compared. The measured weather data (daily precipitation, maximum, minimum air temperature) at a few sites for 30 years were collected from NNDC Online weather data. The generated weather data were produced by CLIMGEN parameterized using the measured weather data. NPP, heterotrophic respiration (HR), NEE and water outflow were simulated by Biome-BGC using measured and generated weather data. In the case of deciduous broad leaf forest in Lushi, Henan Province, China, the 30-year average monthly NPP by WG was 10% larger than that by measured weather in the growing season. HR by WG was larger than that by measured weather in all months by 15% on average. NEE by WG was more negative in winter and was close to that by measured weather in summer. These differences in the carbon cycle arose because the soil water content by WG was larger than that by measured weather. The difference between monthly water outflow by WG and by measured weather was large and variable, and annual outflow by WG was 50% of that by measured weather. The inconsistency in the carbon and water cycle by WG and measured weather was suggested to be affected by the difference in the temporal concentration of precipitation, which was assessed.

  2. A qualitative model construction method of nuclear power plants for effective diagnostic knowledge generation

    International Nuclear Information System (INIS)

    Yoshikawa, Shinji; Endou, Akira; Kitamura, Yoshinobu; Sasajima, Munehiko; Ikeda, Mitsuru; Mizoguchi, Riichiro.

    1994-01-01

    This paper discusses a method to construct a qualitative model of a nuclear power plant in order to generate effective diagnostic knowledge. The proposed method is to prepare deep knowledge to be provided to a knowledge compiler based upon qualitative reasoning (QR). The necessity of knowledge compilation for nuclear plant diagnosis is explained first; conventionally experienced problems in qualitative reasoning and a proposed method to overcome them are described next; then a sample procedure to build a qualitative nuclear plant model is demonstrated. (author)

  3. Statistical Methods for the Qualitative Assessment of Dynamic Models with Time Delay (R Package qualV)

    Directory of Open Access Journals (Sweden)

    Stefanie Jachner

    2007-06-01

    Full Text Available Results of ecological models differ, to some extent, more from measured data than from empirical knowledge. Existing techniques for validation based on quantitative assessments sometimes cause an underestimation of the performance of models due to time shifts, accelerations and delays or systematic differences between measurement and simulation. However, for the application of such models it is often more important to reproduce essential patterns instead of seemingly exact numerical values. This paper presents techniques to identify patterns and numerical methods to measure the consistency of patterns between observations and model results. An orthogonal set of deviance measures for absolute, relative and ordinal scale was compiled to provide information about the type of difference. Furthermore, two different approaches accounting for time shifts are presented. The first one transforms the time to take time delays and speed differences into account. The second one describes known qualitative criteria dividing time series into interval units in accordance with their main features. The methods differ in their basic concepts and in the form of the resulting criteria. Both approaches and the deviance measures discussed are implemented in an R package. All methods are demonstrated by means of water quality measurements and simulation data. The proposed quality criteria allow one to recognize systematic differences and time shifts between time series and to conclude about the quantitative and qualitative similarity of patterns.
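
    The two ideas above, scale-specific deviance measures and a time transformation that keeps a phase-shifted but otherwise correct simulation from being over-penalised, can be sketched outside R as well. The following Python sketch implements a plain mean absolute error and a crude best-lag variant; it is an illustration of the concept, not a port of the qualV package.

    import numpy as np

    def mae(obs, sim):
        return float(np.mean(np.abs(obs - sim)))

    def mae_with_shift(obs, sim, max_lag=20):
        """Minimum MAE over integer time shifts of the simulated series."""
        best = []
        for lag in range(-max_lag, max_lag + 1):
            shifted = np.roll(sim, lag)
            # trim the edges so wrapped-around values are not compared
            best.append((mae(obs[max_lag:-max_lag], shifted[max_lag:-max_lag]), lag))
        return min(best)

    t = np.arange(0, 365)
    observed = 10 + 5 * np.sin(2 * np.pi * t / 365)
    simulated = 10 + 5 * np.sin(2 * np.pi * (t - 12) / 365)   # same pattern, 12-day delay

    print(f"plain MAE:        {mae(observed, simulated):.2f}")
    best_err, best_lag = mae_with_shift(observed, simulated)
    print(f"best shifted MAE: {best_err:.2f} at lag {best_lag} days")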

  4. Disaster Reintegration Model: A Qualitative Analysis on Developing Korean Disaster Mental Health Support Model

    Directory of Open Access Journals (Sweden)

    Yun-Jung Choi

    2018-02-01

    Full Text Available This study sought to describe the mental health problems experienced by Korean disaster survivors, using a qualitative research method to provide empirical resources for effective disaster mental health support in Korea. Participants were 16 adults or elderly adults who had experienced one or more disasters at least 12 months earlier, recruited via theoretical sampling. Participants underwent in-depth individual interviews on their disaster experiences, which were recorded and transcribed for qualitative analysis following Strauss and Corbin's (1998) grounded theory. After open coding, participants' experiences were categorized into 130 codes, 43 sub-categories and 17 categories. The categories were further analyzed in a paradigm model, conditional model and the Disaster Reintegration Model, which proposed potentially effective mental health recovery strategies for disaster survivors, health providers and administrators. To provide effective assistance for mental health recovery of disaster survivors, both personal and public resilience should be promoted while considering both cultural and spiritual elements.

  5. The use of real-time cell analyzer technology in drug discovery: defining optimal cell culture conditions and assay reproducibility with different adherent cellular models.

    Science.gov (United States)

    Atienzar, Franck A; Tilmant, Karen; Gerets, Helga H; Toussaint, Gaelle; Speeckaert, Sebastien; Hanon, Etienne; Depelchin, Olympe; Dhalluin, Stephane

    2011-07-01

    The use of impedance-based label-free technology applied to drug discovery is nowadays receiving more and more attention. Indeed, such a simple and noninvasive assay that interferes minimally with cell morphology and function allows one to perform kinetic measurements and to obtain information on proliferation, migration, cytotoxicity, and receptor-mediated signaling. The objective of the study was to further assess the usefulness of a real-time cell analyzer (RTCA) platform based on impedance in the context of quality control and data reproducibility. The data indicate that this technology is useful to determine the best coating and cellular density conditions for different adherent cellular models including hepatocytes, cardiomyocytes, fibroblasts, and hybrid neuroblastoma/neuronal cells. Based on 31 independent experiments, the reproducibility of cell index data generated from HepG2 cells exposed to DMSO and to Triton X-100 was satisfactory, with a coefficient of variation close to 10%. Cell index data were also well reproduced when cardiomyocytes and fibroblasts were exposed to 21 compounds three times (correlation >0.91). This technology appears to be a powerful and reliable tool in drug discovery because of the reasonable throughput, rapid and efficient performance, technical optimization, and cell quality control.

  6. Three-dimensional surgical modelling with an open-source software protocol: study of precision and reproducibility in mandibular reconstruction with the fibula free flap.

    Science.gov (United States)

    Ganry, L; Quilichini, J; Bandini, C M; Leyder, P; Hersant, B; Meningaud, J P

    2017-08-01

    Very few surgical teams currently use totally independent and free solutions to perform three-dimensional (3D) surgical modelling for osseous free flaps in reconstructive surgery. This study assessed the precision and technical reproducibility of a 3D surgical modelling protocol using free open-source software in mandibular reconstruction with fibula free flaps and surgical guides. Precision was assessed through comparisons of the 3D surgical guide to the sterilized 3D-printed guide, determining accuracy to the millimetre level. Reproducibility was assessed in three surgical cases by volumetric comparison to the millimetre level. For the 3D surgical modelling, a difference of less than 0.1 mm was observed, with almost no deformations. The precision of the free flap modelling was between 0.1 mm and 0.4 mm, and the average precision of the complete reconstructed mandible was less than 1 mm. The open-source software protocol demonstrated high accuracy without complications. However, the precision of the surgical case depends on the surgeon's 3D surgical modelling. Therefore, surgeons need training on the use of this protocol before applying it to surgical cases; this constitutes a limitation. Further studies should address the transfer of expertise. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  7. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Full Text Available Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  8. QML-AiNet: An immune network approach to learning qualitative differential equation models.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2015-02-01

    In this paper, we explore the application of Opt-AiNet, an immune network approach for search and optimisation problems, to learning qualitative models in the form of qualitative differential equations. The Opt-AiNet algorithm is adapted to qualitative model learning problems, resulting in the proposed system QML-AiNet. The potential of QML-AiNet to address the scalability and multimodal search space issues of qualitative model learning has been investigated. More importantly, to further improve the efficiency of QML-AiNet, we also modify the mutation operator according to the features of discrete qualitative model space. Experimental results show that the performance of QML-AiNet is comparable to QML-CLONALG, a QML system using the clonal selection algorithm (CLONALG). More importantly, QML-AiNet with the modified mutation operator can significantly improve the scalability of QML and is much more efficient than QML-CLONALG.

  9. Qualitative and numerical study of Bianchi IX Models

    International Nuclear Information System (INIS)

    Francisco, G.; Matsas, G.E.A.

    1987-01-01

    The qualitative behaviour of trajectories in the Mixmaster universe is studied. The Lyapunov exponents computed directly from the differential equations and from the Poincare map are shown to be different. A detailed discussion of the role of these exponents in analysing the effect of chaos on trajectories is presented. (Author)

  10. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Science.gov (United States)

    Nada, Rania M; Maal, Thomas J J; Breuning, K Hero; Bergé, Stefaan J; Mostafa, Yehya A; Kuijpers-Jagtman, Anne Marie

    2011-02-09

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration were tested in this study. 16 pairs of 3D CBCT models were constructed from pre- and post-treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged from 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered an accurate and reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.

  11. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    OpenAIRE

    Goswami, Sonal; Samuel, Sherin; Sierra, Olga R.; Cascardi, Michele; Paré, Denis

    2012-01-01

    Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situ...

  12. Skills of General Circulation and Earth System Models in reproducing streamflow to the ocean: the case of Congo river

    Science.gov (United States)

    Santini, M.; Caporaso, L.

    2017-12-01

    Despite the importance of water resources in the context of climate change, it is still difficult to correctly simulate the freshwater cycle over land via General Circulation and Earth System Models (GCMs and ESMs). Existing efforts from the Coupled Model Intercomparison Project Phase 5 (CMIP5) were mainly devoted to the validation of atmospheric variables such as temperature and precipitation, with little attention to discharge. Here we investigate the present-day performance of GCMs and ESMs participating in CMIP5 in simulating the discharge of the Congo River to the sea, thanks to: i) the long-term availability of discharge data for the Kinshasa hydrological station, representative of more than 95% of the water flowing in the whole catchment; and ii) the river's still low degree of human intervention, which enables comparison with the (mostly) natural streamflow simulated within CMIP5. Our findings suggest that most models overestimate the streamflow in terms of the seasonal cycle, especially in late winter and spring, while overestimation and variability across models are lower in late summer. Weighted ensemble means are also calculated, based on the simulations' performance according to several metrics, showing some improvement of the results. Although simulated inter-monthly and inter-annual percent anomalies do not appear significantly different from those in observed data, when translated into well-consolidated indicators of drought attributes (frequency, magnitude, timing, duration), usually adopted for more immediate communication to stakeholders and decision makers, such anomalies can be misleading. These inconsistencies produce incorrect assessments for water management planning and infrastructure (e.g. dams or irrigated areas), especially if models are used instead of measurements, as in the case of ungauged basins or basins with insufficient data, as well as when relying on models for future estimates without a preliminary quantification of model biases.
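
    A skill-weighted multi-model mean of the kind mentioned above can be sketched as follows. The weights here are simply the inverse RMSE of each model against observations, applied to synthetic monthly streamflow; the study's actual metrics, weighting scheme, and data are not reproduced.

    import numpy as np

    rng = np.random.default_rng(5)
    months = 12
    observed = 40000 + 15000 * np.sin(2 * np.pi * (np.arange(months) - 3) / 12)   # m3/s, synthetic

    # Five "models", each with its own multiplicative bias and noise.
    models = np.stack([observed * rng.uniform(0.8, 1.4) + rng.normal(0, 3000, months)
                       for _ in range(5)])

    rmse = np.sqrt(np.mean((models - observed) ** 2, axis=1))
    weights = (1.0 / rmse) / np.sum(1.0 / rmse)

    plain_mean = models.mean(axis=0)
    weighted_mean = np.average(models, axis=0, weights=weights)

    def bias(sim):
        return float(np.mean(sim - observed))

    print(f"plain ensemble-mean bias:    {bias(plain_mean):9.1f} m3/s")
    print(f"weighted ensemble-mean bias: {bias(weighted_mean):9.1f} m3/s")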

  13. Assessment of a numerical model to reproduce event‐scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Measures, R.; Hicks, D. M.; Brasington, J.

    2016-01-01

    Abstract Numerical morphological modeling of braided rivers, using a physics‐based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth‐averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high‐flow event. Evaluation of model performance primarily focused upon using high‐resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach‐scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers. PMID:27708477

  14. Assessment of a numerical model to reproduce event-scale erosion and deposition distributions in a braided river.

    Science.gov (United States)

    Williams, R D; Measures, R; Hicks, D M; Brasington, J

    2016-08-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers.

  15. The Systems Biology Markup Language (SBML) Level 3 Package: Qualitative Models, Version 1, Release 1.

    Science.gov (United States)

    Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš

    2015-09-04

    Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.
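
    As a rough illustration of the kind of logical model the qual package is designed to encode (not of the SBML/XML syntax itself), the following plain-Python sketch updates a toy Boolean regulatory graph synchronously; the species names and regulatory rules are invented for illustration only.

```python
def step(state):
    """One synchronous update of a toy Boolean regulatory graph."""
    return {
        "GeneA": int(state["GeneB"] and not state["GeneC"]),  # B activates A, C represses A
        "GeneB": int(state["GeneA"]),                         # A activates B
        "GeneC": int(state["GeneC"]),                         # C is self-sustaining once on
    }

state = {"GeneA": 1, "GeneB": 0, "GeneC": 0}
for t in range(5):
    print(t, state)
    state = step(state)
```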

  16. Attempting to train a digital human model to reproduce human subject reach capabilities in an ejection seat aircraft

    NARCIS (Netherlands)

    Zehner, G.F.; Hudson, J.A.; Oudenhuijzen, A.

    2006-01-01

    From 1997 through 2002, the Air Force Research Lab and TNO Defence, Security and Safety (Business Unit Human Factors) were involved in a series of tests to quantify the accuracy of five Human Modeling Systems (HMSs) in determining accommodation limits of ejection seat aircraft. The results of these

  17. A Reliable and Reproducible Model for Assessing the Effect of Different Concentrations of α-Solanine on Rat Bone Marrow Mesenchymal Stem Cells

    Directory of Open Access Journals (Sweden)

    Adriana Ordóñez-Vásquez

    2017-01-01

    Full Text Available Alpha-solanine (α-solanine) is a glycoalkaloid present in potato (Solanum tuberosum). It has been of particular interest because of its toxicity and potential teratogenic effects that include abnormalities of the central nervous system, such as exencephaly, encephalocele, and anophthalmia. Various types of cell culture have been used as experimental models to determine the effect of α-solanine on cell physiology. The morphological changes in the mesenchymal stem cell upon exposure to α-solanine have not been established. This study aimed to describe a reliable and reproducible model for assessing the structural changes induced by exposure of mouse bone marrow mesenchymal stem cells (MSCs) to different concentrations of α-solanine for 24 h. The results demonstrate that nonlethal concentrations of α-solanine (2–6 μM) changed the morphology of the cells, including an increase in the number of nucleoli, suggesting elevated protein synthesis, and the formation of spicules. In addition, treatment with α-solanine reduced the number of adherent cells and the formation of colonies in culture. Immunophenotypic characterization and staining of MSCs are proposed as a reproducible method that allows description of cells exposed to the glycoalkaloid, α-solanine.

  18. Developing a Collection of Composable Data Translation Software Units to Improve Efficiency and Reproducibility in Ecohydrologic Modeling Workflows

    Science.gov (United States)

    Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.

    2017-12-01

    Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane, but critical step in the application workflows. Translation steps can introduce errors, misrepresentations of data, slow execution time, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the Parflow integrated hydrologic model to demonstrate the impact translation tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community approved collection of data transformation components. The components should be self-contained composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement. Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of
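
    As a minimal sketch of the composable, streaming translation units argued for above, the following Python generators each perform one small task (temporal subsetting, unit conversion) and are composed into a single streaming pipeline with no intermediate files; the record layout and the Kelvin-to-Celsius step are illustrative assumptions, not the project's actual components.

```python
def subset_time(records, start, end):
    """Translation unit 1: keep only records whose timestamp falls in [start, end)."""
    for rec in records:
        if start <= rec["time"] < end:
            yield rec

def kelvin_to_celsius(records, var="temperature"):
    """Translation unit 2: convert one variable's units as records stream through."""
    for rec in records:
        yield dict(rec, **{var: rec[var] - 273.15})

# Compose the self-contained units into one streaming process: minimal data movement,
# and each unit is small enough for effective unit and regression testing.
raw = ({"time": t, "temperature": 270.0 + t} for t in range(10))
for rec in kelvin_to_celsius(subset_time(raw, 2, 6)):
    print(rec)
```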

  19. Synchronized mammalian cell culture: part II--population ensemble modeling and analysis for development of reproducible processes.

    Science.gov (United States)

    Jandt, Uwe; Barradas, Oscar Platas; Pörtner, Ralf; Zeng, An-Ping

    2015-01-01

    The consideration of inherent population inhomogeneities of mammalian cell cultures becomes increasingly important for systems biology study and for developing more stable and efficient processes. However, variations of cellular properties belonging to different sub-populations and their potential effects on cellular physiology and kinetics of culture productivity under bioproduction conditions have not yet been much in the focus of research. Culture heterogeneity is strongly determined by the advance of the cell cycle. The assignment of cell-cycle specific cellular variations to large-scale process conditions can be optimally determined based on the combination of (partially) synchronized cultivation under otherwise physiological conditions and subsequent population-resolved model adaptation. The first step has been achieved using the physical selection method of countercurrent flow centrifugal elutriation, recently established in our group for different mammalian cell lines which is presented in Part I of this paper series. In this second part, we demonstrate the successful adaptation and application of a cell-cycle dependent population balance ensemble model to describe and understand synchronized bioreactor cultivations performed with two model mammalian cell lines, AGE1.HNAAT and CHO-K1. Numerical adaptation of the model to experimental data allows for detection of phase-specific parameters and for determination of significant variations between different phases and different cell lines. It shows that special care must be taken with regard to the sampling frequency in such oscillation cultures to minimize phase shift (jitter) artifacts. Based on predictions of long-term oscillation behavior of a culture depending on its start conditions, optimal elutriation setup trade-offs between high cell yields and high synchronization efficiency are proposed. © 2014 American Institute of Chemical Engineers.

  20. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Vol. 539, No. 7628 (2016), p. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords : reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  1. A conceptual framework to model long-run qualitative change in the energy system

    OpenAIRE

    Ebersberger, Bernd

    2004-01-01

    A conceptual framework to model long-run qualitative change in the energy system / A. Pyka, B. Ebersberger, H. Hanusch. - In: Evolution and economic complexity / ed. J. Stanley Metcalfe ... - Cheltenham [etc.] : Elgar, 2004. - pp. 191-213

  2. A CRPS-IgG-transfer-trauma model reproducing inflammatory and positive sensory signs associated with complex regional pain syndrome.

    Science.gov (United States)

    Tékus, Valéria; Hajna, Zsófia; Borbély, Éva; Markovics, Adrienn; Bagoly, Teréz; Szolcsányi, János; Thompson, Victoria; Kemény, Ágnes; Helyes, Zsuzsanna; Goebel, Andreas

    2014-02-01

    The aetiology of complex regional pain syndrome (CRPS), a highly painful, usually post-traumatic condition affecting the limbs, is unknown, but recent results have suggested an autoimmune contribution. To confirm a role for pathogenic autoantibodies, we established a passive-transfer trauma model. Prior to undergoing incision of hind limb plantar skin and muscle, mice were injected either with serum IgG obtained from chronic CRPS patients or matched healthy volunteers, or with saline. Unilateral hind limb plantar skin and muscle incision was performed to induce typical, mild tissue injury. Mechanical hyperalgesia, paw swelling, heat and cold sensitivity, weight-bearing ability, locomotor activity, motor coordination, paw temperature, and body weight were investigated for 8 days. After sacrifice, proinflammatory sensory neuropeptides and cytokines were measured in paw tissues. CRPS patient IgG treatment significantly increased hind limb mechanical hyperalgesia and oedema in the incised paw compared with IgG from healthy subjects or saline. Plantar incision induced a remarkable elevation of substance P immunoreactivity on day 8, which was significantly increased by CRPS-IgG. In this IgG-transfer-trauma model for CRPS, serum IgG from chronic CRPS patients induced clinical and laboratory features resembling the human disease. These results support the hypothesis that autoantibodies may contribute to the pathophysiology of CRPS, and that autoantibody-removing therapies may be effective treatments for long-standing CRPS. Copyright © 2013 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  3. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework makes it feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can then perform further experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
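
    A hedged sketch of the quantitative refinement step described above: once a candidate model structure is fixed, simulated annealing perturbs a kinetic rate so that the model better reproduces a target time course. The one-parameter decay "model" and the target curve below are invented placeholders, not the framework's actual components.

```python
import math, random

target = [math.exp(-0.5 * t) for t in range(10)]        # "observed" decay curve

def simulate(k):
    return [math.exp(-k * t) for t in range(10)]        # candidate model output

def cost(k):
    return sum((s - o) ** 2 for s, o in zip(simulate(k), target))

random.seed(0)
k, temp = 2.0, 1.0
while temp > 1e-3:
    k_new = k + random.gauss(0.0, 0.1)                  # propose a perturbed rate
    delta = cost(k_new) - cost(k)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        k = k_new                                       # accept improving (or occasionally worse) moves
    temp *= 0.99                                        # geometric cooling schedule
print("estimated kinetic rate:", round(k, 3))
```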

  4. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  5. CONFIG - Adapting qualitative modeling and discrete event simulation for design of fault management systems

    Science.gov (United States)

    Malin, Jane T.; Basham, Bryan D.

    1989-01-01

    CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.

  6. Oxygen distribution in tumors: A qualitative analysis and modeling study providing a novel Monte Carlo approach

    International Nuclear Information System (INIS)

    Lagerlöf, Jakob H.; Kindblom, Jon; Bernhardt, Peter

    2014-01-01

    Purpose: To construct a Monte Carlo (MC)-based simulation model for analyzing the dependence of tumor oxygen distribution on different variables related to tumor vasculature [blood velocity, vessel-to-vessel proximity (vessel proximity), and inflowing oxygen partial pressure (pO2)]. Methods: A voxel-based tissue model containing parallel capillaries with square cross-sections (sides of 10 μm) was constructed. Green's function was used for diffusion calculations and Michaelis-Menten's kinetics to manage oxygen consumption. The model was tuned to approximately reproduce the oxygenational status of a renal carcinoma; the depth oxygenation curves (DOC) were fitted with an analytical expression to facilitate rapid MC simulations of tumor oxygen distribution. DOCs were simulated with three variables at three settings each (blood velocity, vessel proximity, and inflowing pO2), which resulted in 27 combinations of conditions. To create a model that simulated variable oxygen distributions, the oxygen tension at a specific point was randomly sampled with trilinear interpolation in the dataset from the first simulation. Six correlations between blood velocity, vessel proximity, and inflowing pO2 were hypothesized. Variable models with correlated parameters were compared to each other and to a nonvariable, DOC-based model to evaluate the differences in simulated oxygen distributions and tumor radiosensitivities for different tumor sizes. Results: For tumors with radii ranging from 5 to 30 mm, the nonvariable DOC model tended to generate normal or log-normal oxygen distributions, with a cut-off at zero. The pO2 distributions simulated with the six-variable DOC models were quite different from the distributions generated with the nonvariable DOC model; in the former case the variable models simulated oxygen distributions that were more similar to in vivo results found in the literature. For larger tumors, the oxygen distributions became truncated in the lower
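
    The Monte Carlo sampling idea can be sketched roughly as follows: many tissue points are drawn at random, and an assumed analytical depth-oxygenation curve converts their distance from a vessel into an oxygen tension, yielding a simulated pO2 distribution. The exponential curve, its parameters and the hypoxia threshold below are illustrative assumptions, not the fitted expression from the study.

```python
import math, random

def doc_curve(depth_um, p_in=40.0, scale_um=60.0):
    """Assumed analytical depth-oxygenation curve: pO2 (mmHg) decays away from a vessel."""
    return p_in * math.exp(-depth_um / scale_um)

random.seed(1)
samples = [doc_curve(random.uniform(0.0, 200.0)) for _ in range(100_000)]

hypoxic_fraction = sum(p < 2.5 for p in samples) / len(samples)
print(f"mean pO2 = {sum(samples) / len(samples):.1f} mmHg, "
      f"fraction below 2.5 mmHg = {hypoxic_fraction:.2%}")
```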

  7. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude quotations for both artificial and natural reflectors was studied for several combinations of instrument/search unit, all being of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (confidence interval of 95%). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study.

  8. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Full Text Available Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.
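
    The evaluation metric reported above, the mean absolute distance between two superimposed surface models, can be sketched with SciPy's KD-tree as below; the random point clouds are stand-ins for the registered CBCT surface models, not actual scan data.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
model_a = rng.random((5000, 3)) * 100.0                   # surface points of scan 1 (mm)
model_b = model_a + rng.normal(0.0, 0.2, size=(5000, 3))  # scan 2 after registration, with residual error

dist, _ = cKDTree(model_b).query(model_a)                 # nearest neighbour in B for each point of A
print(f"mean absolute distance: {dist.mean():.2f} mm (SD {dist.std():.2f} mm)")
```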

  9. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    Science.gov (United States)

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies in consistently quantifying the performance of three commercial intracranial stents, and contribute to reinforce the confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and

  10. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.

  11. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  12. Evaluating quantitative and qualitative models: an application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  13. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes......, as well as overall preference, was based on consistency tests of binary paired-comparison judgments and on modeling the choice frequencies using probabilistic choice models. As a result, the preferences of non-expert listeners could be measured reliably at a ratio scale level. Principal components derived...

  14. Discriminating Between Models of Ambiguity Attitude : A Qualitative Test

    NARCIS (Netherlands)

    Cubitt, Robin; van de Kuilen, Gijs; Mukerji, Sujoy

    The exchange between Epstein (2010) and Klibanoff et al. (2012) identified a behavioral issue that sharply distinguishes between two classes of models of ambiguity sensitivity, exemplified by the α-MEU model and the smooth ambiguity model, respectively. The issue in question is whether a subject’s

  15. AI/OR computational model for integrating qualitative and quantitative design methods

    Science.gov (United States)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  16. Qualitative mathematics for the social sciences mathematical models for research on cultural dynamics

    CERN Document Server

    Rudolph, Lee

    2012-01-01

    In this book Lee Rudolph brings together international contributors who combine psychological and mathematical perspectives to analyse how qualitative mathematics can be used to create models of social and psychological processes. Bridging the gap between the fields with an imaginative and stimulating collection of contributed chapters, the volume updates the current research on the subject, which until now has been rather limited, focussing largely on the use of statistics. Qualitative Mathematics for the Social Sciences contains a variety of useful illustrative figures, in

  17. [Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].

    Science.gov (United States)

    Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping

    2014-06-01

    The stability and adaptability of near-infrared spectra qualitative analysis models were studied. Separate modeling can significantly improve model stability and adaptability, but its ability to improve adaptability is limited. Joint modeling can improve not only the adaptability but also the stability of the model; at the same time, compared with separate modeling, it can shorten the modeling time, reduce the modeling workload, extend the term of validity of the model, and improve modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet application requirements, whereas the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of the joint model are better than those of the separate model, and the approach has good application value.

  18. Magnet stability and reproducibility

    CERN Document Server

    Marks, N

    2010-01-01

    Magnet stability and reproducibility have become increasingly important as greater precision and beams with smaller dimensions are required for research, medical and other purposes. The observed causes of mechanical and electrical instability are introduced and the engineering arrangements needed to minimize these problems are discussed; the resulting performance of a state-of-the-art synchrotron source (Diamond) is then presented. The need for orbit feedback to obtain the best possible beam stability is briefly introduced, but omitting any details of the necessary technical equipment, which is outside the scope of the presentation.

  19. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  20. Examination of reproducibility in microbiological degradation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy...... source. Toluene was degraded under aerobic conditions at a constant temperature of 28°C. The experiments were modelled by a Monod model - extended to meet the air/liquid system, and the parameter values were estimated using a statistical nonlinear estimation procedure. Model reduction analysis...... resulted in a simpler model without the biomass decay term. In order to test for model reduction and reproducibility of parameter estimates, a likelihood ratio test was employed. The limited reproducibility for these experiments implied that all 9 batch experiments could not be described by the same set
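
    A hedged sketch of the modelling and testing approach described above: a Monod growth model with and without a biomass decay term is fitted to synthetic toluene data, and the two nested models are compared with a likelihood ratio test. The parameter values, noise level and synthetic data are illustrative assumptions only, not the study's measurements.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import minimize
from scipy.stats import chi2

def monod(y, t, mu_max, Ks, Y, b):
    """Monod growth on substrate S (toluene) with optional first-order biomass (X) decay."""
    S, X = y
    growth = mu_max * S / (Ks + S) * X
    return [-growth / Y, growth - b * X]

t = np.linspace(0.0, 10.0, 25)
rng = np.random.default_rng(0)
data = odeint(monod, [5.0, 0.1], t, args=(0.6, 0.5, 0.7, 0.0))
data = data + rng.normal(0.0, 0.05, data.shape)          # synthetic noisy observations

def rss(theta, with_decay):
    mu_max, Ks, Y = np.abs(theta[:3])                    # keep parameters positive
    b = abs(theta[3]) if with_decay else 0.0
    sim = odeint(monod, [5.0, 0.1], t, args=(mu_max, Ks, Y, b))
    return float(((sim - data) ** 2).sum())

full = minimize(rss, x0=[0.5, 0.4, 0.6, 0.05], args=(True,), method="Nelder-Mead")
reduced = minimize(rss, x0=[0.5, 0.4, 0.6], args=(False,), method="Nelder-Mead")

# With Gaussian errors the likelihood ratio statistic reduces to n*ln(RSS_reduced/RSS_full);
# a large p-value suggests the decay term can be dropped (model reduction).
n = data.size
lr = n * np.log(reduced.fun / full.fun)
print(f"LR = {lr:.2f}, p = {chi2.sf(lr, df=1):.3f}")
```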

  1. Structuring Qualitative Data for Agent-Based Modelling

    NARCIS (Netherlands)

    Ghorbani, Amineh; Dijkema, Gerard P.J.; Schrauwen, Noortje

    2015-01-01

    Using ethnography to build agent-based models may result in more empirically grounded simulations. Our study on innovation practice and culture in the Westland horticulture sector served to explore what information and data from ethnographic analysis could be used in models and how. MAIA, a

  2. Modelling impacts of performance on the probability of reproducing, and thereby on productive lifespan, allow prediction of lifetime efficiency in dairy cows.

    Science.gov (United States)

    Phuong, H N; Blavy, P; Martin, O; Schmidely, P; Friggens, N C

    2016-01-01

    Reproductive success is a key component of lifetime efficiency, which is the ratio of energy in milk (MJ) to energy intake (MJ) over the lifespan of a cow. At the animal level, breeding and feeding management can substantially impact milk yield, body condition and energy balance of cows, which are known to be major contributors to reproductive failure in dairy cattle. This study extended an existing lifetime performance model to incorporate the impacts that performance changes due to changing breeding and feeding strategies have on the probability of reproducing and thereby on the productive lifespan, and thus allow the prediction of a cow's lifetime efficiency. The model is dynamic and stochastic, with an individual cow being the unit modelled and one day being the unit of time. To evaluate the model, data from a French study including Holstein and Normande cows fed high-concentrate diets and data from a Scottish study including Holstein cows selected for high and average genetic merit for fat plus protein that were fed high- v. low-concentrate diets were used. Generally, the model consistently simulated productive and reproductive performance of various genotypes of cows across feeding systems. In the French data, the model adequately simulated the reproductive performance of Holsteins but significantly under-predicted that of Normande cows. In the Scottish data, conception to first service was comparably simulated, whereas interval traits were slightly under-predicted. Selection for greater milk production impaired the reproductive performance and lifespan but not lifetime efficiency. The definition of lifetime efficiency used in this model did not include associated costs or herd-level effects. Further work should include such economic indicators to allow more accurate simulation of lifetime profitability in different production scenarios.

  3. Qualitative dynamical analysis of chaotic plasma perturbations model

    Science.gov (United States)

    Elsadany, A. A.; Elsonbaty, Amr; Agiza, H. N.

    2018-06-01

    In this work, an analytical framework for understanding the nonlinear dynamics of a plasma perturbations model is introduced. In particular, we analyze the model presented by Constantinescu et al. [20], which consists of three coupled ODEs and contains three parameters. The basic dynamical properties of the system are first investigated by means of bifurcation diagrams, phase portraits and Lyapunov exponents. Then, the normal form technique and perturbation methods are applied to investigate the different types of bifurcations that exist in the model. It is proved that pitchfork, Bogdanov-Takens, Andronov-Hopf, degenerate Hopf and homoclinic bifurcations can occur in the phase space of the model. Also, the model can exhibit quasiperiodicity and chaotic behavior. Numerical simulations confirm our theoretical analytical results.
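
    One of the diagnostics mentioned above, the largest Lyapunov exponent of a three-variable ODE system, can be estimated numerically from the divergence of two nearby trajectories. Because the plasma perturbation equations are not reproduced in the abstract, the Lorenz system is used below as a generic stand-in; the integration settings are illustrative choices.

```python
import numpy as np
from scipy.integrate import odeint

def lorenz(y, t, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Generic three-variable chaotic system used as a stand-in."""
    x, yy, z = y
    return [sigma * (yy - x), x * (rho - z) - yy, x * yy - beta * z]

dt, steps, d0 = 0.01, 3000, 1e-8
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([d0, 0.0, 0.0])
lyap_sum = 0.0
for _ in range(steps):
    a = odeint(lorenz, a, [0.0, dt])[-1]
    b = odeint(lorenz, b, [0.0, dt])[-1]
    d = np.linalg.norm(b - a)
    lyap_sum += np.log(d / d0)
    b = a + (b - a) * (d0 / d)          # renormalise the separation after each step
print("largest Lyapunov exponent ~", lyap_sum / (steps * dt))
```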

  4. Retrospective Correction of Physiological Noise: Impact on Sensitivity, Specificity, and Reproducibility of Resting-State Functional Connectivity in a Reading Network Model.

    Science.gov (United States)

    Krishnamurthy, Venkatagiri; Krishnamurthy, Lisa C; Schwam, Dina M; Ealey, Ashley; Shin, Jaemin; Greenberg, Daphne; Morris, Robin D

    2018-03-01

    It is well accepted that physiological noise (PN) obscures the detection of neural fluctuations in resting-state functional connectivity (rsFC) magnetic resonance imaging. However, a clear consensus for an optimal PN correction (PNC) methodology and how it can impact the rsFC signal characteristics is still lacking. In this study, we probe the impact of three PNC methods: RETROICOR (Glover et al., 2000), ANATICOR (Jo et al., 2010), and RVTMBPM (Bianciardi et al., 2009). Using a reading network model, we systematically explore the effects of PNC optimization on sensitivity, specificity, and reproducibility of rsFC signals. In terms of specificity, ANATICOR was found to be effective in removing local white matter (WM) fluctuations and also resulted in aggressive removal of expected cortical-to-subcortical functional connections. The ability of RETROICOR to remove PN was equivalent to removal of simulated random PN such that it artificially inflated the connection strength, thereby decreasing sensitivity. RVTMBPM maintained specificity and sensitivity by balanced removal of vasodilatory PN and local WM nuisance edges. Another aspect of this work was exploring the effects of PNC on identifying reading group differences. Most PNC methods accounted for between-subject PN variability resulting in reduced intersession reproducibility. This effect facilitated the detection of the most consistent group differences. RVTMBPM was most effective in detecting significant group differences due to its inherent sensitivity to removing spatially structured and temporally repeating PN arising from dense vasculature. Finally, results suggest that combining all three PNC methods resulted in "overcorrection" by removing signal along with noise.
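
    The common core shared by these PNC approaches, building nuisance regressors and regressing them out of each voxel time series before computing connectivity, can be sketched generically as follows; the synthetic "neural" and "respiratory" signals are invented for illustration, and the sketch is not any of the three specific methods.

```python
import numpy as np

rng = np.random.default_rng(0)
n_t = 300
neural = np.sin(np.linspace(0.0, 20.0, n_t))             # shared "neural" fluctuation
physio = np.cos(np.linspace(0.0, 90.0, n_t))             # e.g. a respiratory nuisance regressor
roi_a = neural + 0.8 * physio + 0.3 * rng.normal(size=n_t)
roi_b = neural + 0.8 * physio + 0.3 * rng.normal(size=n_t)

def regress_out(ts, nuisance):
    """Remove nuisance regressors from a time series by ordinary least squares."""
    X = np.column_stack([np.ones(len(ts)), nuisance])
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)
    return ts - X @ beta

raw_fc = np.corrcoef(roi_a, roi_b)[0, 1]
clean_fc = np.corrcoef(regress_out(roi_a, physio), regress_out(roi_b, physio))[0, 1]
print(f"connectivity before correction: {raw_fc:.2f}, after correction: {clean_fc:.2f}")
```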

  5. Microbial community development in a dynamic gut model is reproducible, colon region specific, and selective for Bacteroidetes and Clostridium cluster IX.

    Science.gov (United States)

    Van den Abbeele, Pieter; Grootaert, Charlotte; Marzorati, Massimo; Possemiers, Sam; Verstraete, Willy; Gérard, Philippe; Rabot, Sylvie; Bruneau, Aurélia; El Aidy, Sahar; Derrien, Muriel; Zoetendal, Erwin; Kleerebezem, Michiel; Smidt, Hauke; Van de Wiele, Tom

    2010-08-01

    Dynamic, multicompartment in vitro gastrointestinal simulators are often used to monitor gut microbial dynamics and activity. These reactors need to harbor a microbial community that is stable upon inoculation, colon region specific, and relevant to in vivo conditions. Together with the reproducibility of the colonization process, these criteria are often overlooked when the modulatory properties from different treatments are compared. We therefore investigated the microbial colonization process in two identical simulators of the human intestinal microbial ecosystem (SHIME), simultaneously inoculated with the same human fecal microbiota with a high-resolution phylogenetic microarray: the human intestinal tract chip (HITChip). Following inoculation of the in vitro colon compartments, microbial community composition reached steady state after 2 weeks, whereas 3 weeks were required to reach functional stability. This dynamic colonization process was reproducible in both SHIME units and resulted in highly diverse microbial communities which were colon region specific, with the proximal regions harboring saccharolytic microbes (e.g., Bacteroides spp. and Eubacterium spp.) and the distal regions harboring mucin-degrading microbes (e.g., Akkermansia spp.). Importantly, the shift from an in vivo to an in vitro environment resulted in an increased Bacteroidetes/Firmicutes ratio, whereas Clostridium cluster IX (propionate producers) was enriched compared to clusters IV and XIVa (butyrate producers). This was supported by proportionally higher in vitro propionate concentrations. In conclusion, high-resolution analysis of in vitro-cultured gut microbiota offers new insight on the microbial colonization process and indicates the importance of digestive parameters that may be crucial in the development of new in vitro models.

  6. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, and evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produces published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which are relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  7. Learning to Act: Qualitative Learning of Deterministic Action Models

    DEFF Research Database (Denmark)

    Bolander, Thomas; Gierasimczuk, Nina

    2017-01-01

    In this article we study learnability of fully observable, universally applicable action models of dynamic epistemic logic. We introduce a framework for actions seen as sets of transitions between propositional states and we relate them to their dynamic epistemic logic representations as action...... in the limit (inconclusive convergence to the right action model). We show that deterministic actions are finitely identifiable, while arbitrary (non-deterministic) actions require more learning power—they are identifiable in the limit. We then move on to a particular learning method, i.e. learning via update......, which proceeds via restriction of a space of events within a learning-specific action model. We show how this method can be adapted to learn conditional and unconditional deterministic action models. We propose update learning mechanisms for the afore mentioned classes of actions and analyse...
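
    Under the article's view of actions as sets of transitions between propositional states, a minimal sketch of learning a deterministic action is to accumulate a partial transition function and flag any observation that contradicts determinism; the states and example observation stream below are invented for illustration.

```python
def learn_deterministic_action(observations):
    """Accumulate a partial transition function; reject non-deterministic evidence."""
    model = {}
    for before, after in observations:
        if model.get(before, after) != after:
            raise ValueError(f"action is not deterministic at state {set(before)}")
        model[before] = after
    return model

# Propositional states are represented as frozensets of true atoms (invented example).
obs = [
    (frozenset({"door_closed"}), frozenset({"door_open"})),
    (frozenset({"door_open"}), frozenset({"door_open"})),
    (frozenset({"door_closed"}), frozenset({"door_open"})),   # consistent repeat observation
]
print(learn_deterministic_action(obs))
```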

  8. Representing and managing uncertainty in qualitative ecological models

    NARCIS (Netherlands)

    Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.

    2009-01-01

    Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete

  9. A qualitative model of the salmon life cycle in the context of river rehabilitation

    NARCIS (Netherlands)

    Noble, R.A.A.; Bredeweg, B.; Linnebank, F.; Salles, P.; Cowx, I.G.; Žabkar, J.; Bratko, I.

    2009-01-01

    A qualitative model was developed in Garp3 to capture and formalise knowledge about river rehabilitation and the management of an Atlantic salmon population. The model integrates information about the ecology of the salmon life cycle, the environmental factors that may limit the survival of key life

  10. A Proposed Model of Retransformed Qualitative Data within a Mixed Methods Research Design

    Science.gov (United States)

    Palladino, John M.

    2009-01-01

    Most models of mixed methods research design provide equal emphasis of qualitative and quantitative data analyses and interpretation. Other models stress one method more than the other. The present article is a discourse about the investigator's decision to employ a mixed method design to examine special education teachers' advocacy and…

  11. The operator model as a framework of research on errors and temporal, qualitative and analogical reasoning

    International Nuclear Information System (INIS)

    Decortis, F.; Drozdowicz, B.; Masson, M.

    1990-01-01

    In this paper the needs and requirements for developing a cognitive model of a human operator are discussed and the computer architecture, currently being developed, is described. Given the approach taken, namely the division of the problem into specialised tasks within an area and using the architecture chosen, it is possible to build independently several cognitive and psychological models such as errors and stress models, as well as models of temporal, qualitative and analogical reasoning. (author)

  12. The spruce budworm and forest: a qualitative comparison of ODE and Boolean models

    Directory of Open Access Journals (Sweden)

    Raina Robeva

    2016-01-01

    Full Text Available Boolean and polynomial models of biological systems have emerged recently as viable companions to differential equations models. It is not immediately clear however whether such models are capable of capturing the multi-stable behaviour of certain biological systems: this behaviour is often sensitive to changes in the values of the model parameters, while Boolean and polynomial models are qualitative in nature. In the past few years, Boolean models of gene regulatory systems have been shown to capture multi-stability at the molecular level, confirming that such models can be used to obtain information about the system’s qualitative dynamics when precise information regarding its parameters may not be available. In this paper, we examine Boolean approximations of a classical ODE model of budworm outbreaks in a forest and show that these models exhibit a qualitative behaviour consistent with that derived from the ODE models. In particular, we demonstrate that these models can capture the bistable nature of insect population outbreaks, thus showing that Boolean models can be successfully utilized beyond the molecular level.
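
    The ODE side of the comparison is the classical budworm model with logistic growth and saturating predation; the short sketch below integrates it from two initial densities to show the bistability (refuge versus outbreak equilibria) that a Boolean approximation is meant to capture. The parameter values are illustrative choices within the bistable regime, not those used in the paper.

```python
import numpy as np
from scipy.integrate import odeint

def budworm(n, t, r=0.55, K=10.0, A=1.0, B=1.0):
    """Logistic growth of the budworm population minus saturating predation."""
    return r * n * (1.0 - n / K) - B * n**2 / (A**2 + n**2)

t = np.linspace(0.0, 100.0, 1000)
low = odeint(budworm, 0.5, t).ravel()     # starts below the outbreak threshold
high = odeint(budworm, 5.0, t).ravel()    # starts above it
print("low start settles near: ", round(float(low[-1]), 2))    # refuge equilibrium
print("high start settles near:", round(float(high[-1]), 2))   # outbreak equilibrium
```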

  13. Unicriterion Model: A Qualitative Decision Making Method That Promotes Ethics

    Directory of Open Access Journals (Sweden)

    Fernando Guilherme Silvano Lobo Pimentel

    2011-06-01

    Full Text Available Management decision making methods frequently adopt quantitative models of several criteria that bypass the question of why some criteria are considered more important than others, which makes more difficult the task of delivering a transparent view of preference structure priorities that might promote ethics and learning and serve as a basis for future decisions. To tackle this particular shortcoming of usual methods, an alternative qualitative methodology of aggregating preferences based on the ranking of criteria is proposed. Such an approach delivers a simple and transparent model for the solution of each preference conflict faced during the management decision making process. The method proceeds by breaking the decision problem into ‘two criteria – two alternatives’ scenarios, and translating the problem of choice between alternatives to a problem of choice between criteria whenever appropriate. The unicriterion model method is illustrated by its application in a car purchase and a house purchase decision problem.
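
    Read lexicographically, the idea of translating a choice between alternatives into a choice between criteria can be sketched as below: whenever two alternatives conflict, the highest-ranked criterion on which they differ decides. This is one hedged reading of the approach, with an invented criteria ranking and invented scores, not the paper's exact procedure.

```python
def choose(alt_a, alt_b, scores, criteria_ranked):
    """Resolve a two-alternative conflict by the best-ranked criterion on which they differ."""
    for criterion in criteria_ranked:
        a, b = scores[alt_a][criterion], scores[alt_b][criterion]
        if a != b:
            return alt_a if a > b else alt_b
    return alt_a  # indifferent on every criterion

criteria_ranked = ["safety", "price", "comfort"]      # explicit, transparent priority order
scores = {
    "car_x": {"safety": 4, "price": 2, "comfort": 5},
    "car_y": {"safety": 4, "price": 5, "comfort": 3},
}
print(choose("car_x", "car_y", scores, criteria_ranked))   # tie on safety, price decides: car_y
```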

  14. ARCHITECTURES AND ALGORITHMS FOR COGNITIVE NETWORKS ENABLED BY QUALITATIVE MODELS

    DEFF Research Database (Denmark)

    Balamuralidhar, P.

    2013-01-01

    traditional limitations and potentially achieving better performance. The vision is that, networks should be able to monitor themselves, reason upon changes in self and environment, act towards the achievement of specific goals and learn from experience. The concept of a Cognitive Engine (CE) supporting...... cognitive functions, as part of network elements, enabling above said autonomic capabilities is gathering attention. Awareness of the self and the world is an important aspect of the cognitive engine to be autonomic. This is achieved through embedding their models in the engine, but the complexity...... of the cognitive engine that incorporates a context space based information structure to its knowledge model. I propose a set of guiding principles behind a cognitive system to be autonomic and use them with additional requirements to build a detailed architecture for the cognitive engine. I define a context space...

  15. Reproducing early Martian atmospheric carbon dioxide partial pressure by modeling the formation of Mg-Fe-Ca carbonate identified in the Comanche rock outcrops on Mars

    Science.gov (United States)

    Berk, Wolfgang; Fu, Yunjiao; Ilger, Jan-Michael

    2012-10-01

    The well defined composition of the Comanche rock's carbonate (Magnesite0.62 Siderite0.25 Calcite0.11 Rhodochrosite0.02) and its host rock's composition, dominated by Mg-rich olivine, enable us to reproduce the atmospheric CO2 partial pressure that may have triggered the formation of these carbonates. Hydrogeochemical one-dimensional transport modeling reveals that similar aqueous rock alteration conditions (including CO2 partial pressure) may have led to the formation of Mg-Fe-Ca carbonate identified in the Comanche rock outcrops (Gusev Crater) and also in the ultramafic rocks exposed in the Nili Fossae region. Hydrogeochemical conditions enabling the formation of Mg-rich solid solution carbonate result from equilibrium species distributions involving (1) ultramafic rocks (ca. 32 wt% olivine; Fo0.72 Fa0.28), (2) pure water, and (3) CO2 partial pressures of ca. 0.5 to 2.0 bar at water-to-rock ratios of ca. 500 mol H2O per mol rock and ca. 5°C (278 K). Our modeled carbonate composition (Magnesite0.64 Siderite0.28 Calcite0.08) matches the measured composition of carbonates preserved in the Comanche rocks. Considerably different carbonate compositions are achieved at (1) higher temperature (85°C), (2) water-to-rock ratios considerably higher and lower than 500 mol mol-1 and (3) CO2 partial pressures differing from 1.0 bar in the model set up. The Comanche rocks, hosting the carbonate, may have been subjected to long-lasting (>10^4 to 10^5 years) aqueous alteration processes triggered by atmospheric CO2 partial pressures of ca. 1.0 bar at low temperature. Their outcrop may represent a fragment of the upper layers of an altered olivine-rich rock column, which is characterized by newly formed Mg-Fe-Ca solid solution carbonate, and phyllosilicate-rich alteration assemblages within deeper (unexposed) units.

  16. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  17. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  18. Gravitational wave background from Standard Model physics: qualitative features

    International Nuclear Information System (INIS)

    Ghiglieri, J.; Laine, M.

    2015-01-01

    Because of physical processes ranging from microscopic particle collisions to macroscopic hydrodynamic fluctuations, any plasma in thermal equilibrium emits gravitational waves. For the largest wavelengths the emission rate is proportional to the shear viscosity of the plasma. In the Standard Model at T > 160 GeV, the shear viscosity is dominated by the most weakly interacting particles, right-handed leptons, and is relatively large. We estimate the order of magnitude of the corresponding spectrum of gravitational waves. Even though at small frequencies (corresponding to the sub-Hz range relevant for planned observatories such as eLISA) this background is tiny compared with that from non-equilibrium sources, the total energy carried by the high-frequency part of the spectrum is non-negligible if the production continues for a long time. We suggest that this may constrain (weakly) the highest temperature of the radiation epoch. Observing the high-frequency part directly sets a very ambitious goal for future generations of GHz-range detectors.

  19. Teaching Qualitative Research for Human Services Students: A Three-Phase Model

    Science.gov (United States)

    Goussinsky, Ruhama; Reshef, Arie; Yanay-Ventura, Galit; Yassour-Borochowitz, Dalit

    2011-01-01

    Qualitative research is an inherent part of the human services profession, since it emphasizes the great and multifaceted complexity characterizing human experience and the sociocultural context in which humans act. In the department of human services at Emek Yezreel College, Israel, we have developed a three-phase model to ensure a relatively…

  20. Disease management projects and the Chronic Care Model in action: Baseline qualitative research

    NARCIS (Netherlands)

    B.J. Hipple Walters (Bethany); S.A. Adams (Samantha); A.P. Nieboer (Anna); R.A. Bal (Roland)

    2012-01-01

    Background: Disease management programs, especially those based on the Chronic Care Model (CCM), are increasingly common in the Netherlands. While disease management programs have been well-researched quantitatively and economically, less qualitative research has been done. The overall aim

  1. Towards a structured approach to building qualitative reasoning models and simulations

    NARCIS (Netherlands)

    Bredeweg, B.; Salles, P.; Bouwer, A.; Liem, J.; Nuttle, T.; Cioca, E.; Nakova, E.; Noble, R.; Caldas, A.L.R.; Uzunov, Y.; Varadinova, E.; Zitek, A.

    2008-01-01

    Successful transfer and uptake of qualitative reasoning technology for modelling and simulation in a variety of domains has been hampered by the lack of a structured methodology to support formalisation of ideas. We present a framework that structures and supports the capture of conceptual knowledge

  2. A qualitative model of limiting factors for a salmon life cycle in the context of river rehabilitation

    NARCIS (Netherlands)

    Noble, R.A.A.; Bredeweg, B.; Linnebank, F.; Salles, P.; Cowx, I.G.

    2009-01-01

    Qualitative Reasoning modelling has been promoted as a tool for formalising, integrating and exploring conceptual knowledge in ecological systems, such as river rehabilitation, which draw different information from multiple domains. A qualitative model was developed in Garp3 to capture and formalise

  3. A CFBPN Artificial Neural Network Model for Educational Qualitative Data Analyses: Example of Students' Attitudes Based on Kellerts' Typologies

    Science.gov (United States)

    Yorek, Nurettin; Ugulu, Ilker

    2015-01-01

    In this study, artificial neural networks are suggested as a model that can be "trained" to yield qualitative results out of a huge amount of categorical data. It can be said that this is a new approach applied in educational qualitative data analysis. In this direction, a cascade-forward back-propagation neural network (CFBPN) model was…
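
    The record above does not include implementation details; as a rough, hypothetical sketch of the cascade-forward idea (the input layer feeding both the hidden layer and the output layer directly), one might encode the categorical survey answers numerically and train with ordinary back-propagation, as below. The layer sizes and the toy data are assumptions for illustration, not values from the study.

        import torch
        import torch.nn as nn

        class CascadeForwardNet(nn.Module):
            """Cascade-forward network: the input feeds both the hidden and the output layer."""

            def __init__(self, n_in, n_hidden, n_out):
                super().__init__()
                self.hidden = nn.Linear(n_in, n_hidden)
                self.out_from_hidden = nn.Linear(n_hidden, n_out)
                self.out_from_input = nn.Linear(n_in, n_out, bias=False)  # the cascade connection

            def forward(self, x):
                h = torch.tanh(self.hidden(x))
                return self.out_from_hidden(h) + self.out_from_input(x)

        net = CascadeForwardNet(n_in=12, n_hidden=8, n_out=3)
        x = torch.rand(64, 12)   # 64 fictitious respondents, 12 numerically encoded answers
        y = torch.rand(64, 3)    # fictitious target scores for three attitude typologies
        optimiser = torch.optim.Adam(net.parameters(), lr=1e-2)
        loss_fn = nn.MSELoss()
        for _ in range(200):     # plain back-propagation training loop
            optimiser.zero_grad()
            loss = loss_fn(net(x), y)
            loss.backward()
            optimiser.step()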

  4. Biodiversity and soil quality in agroecosystems: the use of a qualitative multi-attribute model

    DEFF Research Database (Denmark)

    Cortet, J.; Bohanec, M.; Griffiths, B.

    2009-01-01

    In ecological impact assessment, special emphasis is put on soil biology and estimating soil quality from the observed biological parameters. The aim of this study is to propose an easy-to-use tool for scientists and decision makers to assess agroecosystem soil quality using these biological...... parameters. This tool was developed as a collaboration between ECOGEN (www.ecogen.dk) soil experts and decision analysts. Methodologically, we have addressed this goal using model-based Decision Support Systems (DSS), taking the approach of qualitative multi-attribute modelling. The approach is based...... on developing various hierarchical multiattribute models that consist of qualitative attributes and utility (aggregation) functions, represented by decision rules. The assessment of soil quality is based on two main indicators: (1) soil diversity (assessed through microfauna, mesofauna and macrofauna richness...
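
    As an illustration of what a qualitative multi-attribute model with decision-rule aggregation can look like in code, the sketch below builds two small rule tables and chains them into a hierarchy. The attribute names, scales and rules are invented for the example and are not the ECOGEN model.

        # Qualitative decision-rule tables (lookup tables acting as utility/aggregation functions).
        diversity_rules = {     # (microfauna, mesofauna, macrofauna) -> soil biodiversity class
            ("high", "high", "high"): "high",
            ("high", "medium", "high"): "high",
            ("medium", "medium", "medium"): "medium",
            ("low", "low", "low"): "low",
        }
        soil_quality_rules = {  # (soil biodiversity, soil function) -> soil quality class
            ("high", "high"): "high",
            ("high", "medium"): "medium",
            ("medium", "medium"): "medium",
            ("low", "low"): "low",
        }

        def aggregate(rules, *values, default="medium"):
            """Apply a qualitative decision-rule table, falling back to a default class."""
            return rules.get(tuple(values), default)

        biodiversity = aggregate(diversity_rules, "high", "medium", "high")
        soil_quality = aggregate(soil_quality_rules, biodiversity, "medium")
        print(biodiversity, soil_quality)   # -> high medium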

  5. COGNITIVE MODELING AS A METHOD OF QUALITATIVE ANALYSIS OF IT PROJECTS

    Directory of Open Access Journals (Sweden)

    Інна Ігорівна ОНИЩЕНКО

    2016-03-01

    Using an example project implementing an automated CRM system, the possibilities and features of cognitive modeling in the qualitative analysis of project risks are demonstrated. The construction of cognitive models of IT project risks is proposed within qualitative risk analysis, with additional assessments as a method of ranking risks and characterizing the relationships between them. The proposed cognitive model reflects the relationships between the risks of an IT project and is used to assess the negative and positive impact of individual risks on the remaining risks of implementing the automated CRM system. The fact that a risk can influence other project risks may increase the priority of a risk with low direct impact on results, because of its relationships with the other project risks.
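
    A cognitive model of interrelated risks is often represented as a signed, weighted digraph whose node activations are iterated to a steady state. The sketch below shows that mechanic on a hypothetical four-risk map; the risk names, weights and initial activations are assumptions, not data from the paper.

        import numpy as np

        risks = ["scope creep", "staff turnover", "integration failure", "schedule slip"]
        # W[i, j] = signed influence of risk i on risk j (rows influence columns).
        W = np.array([
            [0.0, 0.0, 0.4, 0.6],
            [0.0, 0.0, 0.3, 0.5],
            [0.0, 0.0, 0.0, 0.7],
            [0.2, 0.3, 0.0, 0.0],
        ])

        def squash(x):
            return 1.0 / (1.0 + np.exp(-x))    # keep activations in (0, 1)

        state = np.array([0.8, 0.2, 0.1, 0.1])  # initial qualitative risk assessment
        for _ in range(25):                      # iterate towards a steady state
            state = squash(state + state @ W)

        for name, value in sorted(zip(risks, state), key=lambda pair: -pair[1]):
            print(f"{name:>20s}: {value:.2f}")   # re-ranked after accounting for mutual influence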

  6. Global Qualitative Flow-Path Modeling for Local State Determination in Simulation and Analysis

    Science.gov (United States)

    Malin, Jane T. (Inventor); Fleming, Land D. (Inventor)

    1998-01-01

    For qualitative modeling and analysis, a general qualitative abstraction of power transmission variables (flow and effort) for elements of flow paths is discussed; it includes information on resistance, net flow, permissible directions of flow, and qualitative potential. Each type of component model has flow-related variables and an associated internal flow map, connected into an overall flow network of the system. For storage devices, the implicit power transfer to the environment is represented by "virtual" circuits that include an environmental junction. A heterogeneous aggregation method simplifies the path structure. A method determines global flow-path changes during dynamic simulation and analysis, and identifies corresponding local flow state changes that are effects of global configuration changes. Flow-path determination is triggered by any change in a flow-related device variable in a simulation or analysis. Components (path elements) that may be affected are identified, and flow-related attributes favoring flow in the two possible directions are collected for each of them. Next, flow-related attributes are determined for each affected path element, based on possibly conflicting indications of flow direction. Spurious qualitative ambiguities are minimized by using relative magnitudes and permissible directions of flow, and by favoring flow sources over effort sources when comparing flow tendencies. The results are output to local flow states of affected components.

  7. Reproducibility in a multiprocessor system

    Science.gov (United States)

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system and a scan of the system state.

  8. The development of a qualitative dynamic attribute value model for healthcare institutes.

    Science.gov (United States)

    Lee, Wan-I

    2010-01-01

    Understanding customers has become an urgent topic for increasing competitiveness. The purpose of the study was to develop a qualitative dynamic attribute value model which provides insight into customers' value for healthcare institute managers, using an initial open-ended questionnaire survey to select participants purposefully. A total of 427 questionnaires were administered in two hospitals in Taiwan (one district hospital with 635 beds and one academic hospital with 2495 beds) and 419 questionnaires were received in nine weeks. Qualitative in-depth interviews were then applied to explore customers' perspectives of value for building a model of partial differential equations. This study identifies nine categories of value, including cost, equipment, physician background, physician care, environment, timing arrangement, relationship, brand image and additional value, to construct an objective network for customer value and a qualitative dynamic attribute value model, where the network shows the value process of loyalty development via its effect on customer satisfaction, customer relationship, customer loyalty and healthcare service. One set predicts the customer relationship based on commitment, including service quality, communication and empathy. At the same time, customer loyalty, based on trust, involves buzz marketing, brand and image. Customer value of the current instance is useful for traversing original customer attributes and identifying customers on different service shares.

  9. Production process reproducibility and product quality consistency of transient gene expression in HEK293 cells with anti-PD1 antibody as the model protein.

    Science.gov (United States)

    Ding, Kai; Han, Lei; Zong, Huifang; Chen, Junsheng; Zhang, Baohong; Zhu, Jianwei

    2017-03-01

    Demonstration of reproducibility and consistency of process and product quality is one of the most crucial issues in using transient gene expression (TGE) technology for biopharmaceutical development. In this study, we challenged the production consistency of TGE by expressing nine batches of recombinant IgG antibody in human embryonic kidney 293 cells to evaluate reproducibility including viable cell density, viability, apoptotic status, and antibody yield in cell culture supernatant. Product quality including isoelectric point, binding affinity, secondary structure, and thermal stability was assessed as well. In addition, major glycan forms of antibody from different batches of production were compared to demonstrate glycosylation consistency. Glycan compositions of the antibody harvested at different time periods were also measured to illustrate N-glycan distribution over the culture time. From the results, it has been demonstrated that different TGE batches are reproducible from lot to lot in overall cell growth, product yield, and product qualities including isoelectric point, binding affinity, secondary structure, and thermal stability. Furthermore, major N-glycan compositions are consistent among different TGE batches and conserved during cell culture time.

  10. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.
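
    The adjusted analysis described above can be sketched as a logistic regression of replication success on contextual sensitivity while controlling for methodological covariates. The code below uses simulated stand-in data (the Reproducibility Project codings are not reproduced here), so only the shape of the analysis, not the result, is meaningful.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 100
        context_sensitivity = rng.integers(1, 6, size=n)        # 1-5 coding per original study
        statistical_power = rng.uniform(0.2, 0.95, size=n)
        effect_size = rng.uniform(0.05, 0.8, size=n)
        # Fictitious outcome in which high contextual sensitivity lowers replication odds.
        linpred = 0.5 + 2.0 * effect_size + 1.0 * statistical_power - 0.6 * context_sensitivity
        replicated = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

        X = sm.add_constant(np.column_stack([context_sensitivity,
                                             statistical_power,
                                             effect_size]))
        fit = sm.Logit(replicated, X).fit(disp=False)
        print(fit.params)    # the sign of the sensitivity coefficient is the quantity of interest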

  11. Qualitative and quantitative guidelines for the comparison of environmental model predictions

    International Nuclear Information System (INIS)

    Scott, M.

    1995-03-01

    The question of how to assess or compare predictions from a number of models is one of concern in the validation of models, in understanding the effects of different models and model parameterizations on model output, and ultimately in assessing model reliability. Comparison of model predictions with observed data is the basic tool of model validation while comparison of predictions amongst different models provides one measure of model credibility. The guidance provided here is intended to provide qualitative and quantitative approaches (including graphical and statistical techniques) to such comparisons for use within the BIOMOVS II project. It is hoped that others may find it useful. It contains little technical information on the actual methods but several references are provided for the interested reader. The guidelines are illustrated on data from the VAMP CB scenario. Unfortunately, these data do not permit all of the possible approaches to be demonstrated since predicted uncertainties were not provided. The questions considered are concerned with a) intercomparison of model predictions and b) comparison of model predictions with the observed data. A series of examples illustrating some of the different types of data structure and some possible analyses have been constructed. A bibliography of references on model validation is provided. It is important to note that the results of the various techniques discussed here, whether qualitative or quantitative, should not be considered in isolation. Overall model performance must also include an evaluation of model structure and formulation, i.e. conceptual model uncertainties, and results for performance measures must be interpreted in this context. Consider a number of models which are used to provide predictions of a number of quantities at a number of time points. In the case of the VAMP CB scenario, the results include predictions of total deposition of Cs-137 and time dependent concentrations in various

  12. Learning Qualitative Differential Equation models: a survey of algorithms and applications.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2010-03-01

    Over the last two decades, qualitative reasoning (QR) has become an important domain in Artificial Intelligence. QDE (Qualitative Differential Equation) model learning (QML), as a branch of QR, has also received an increasing amount of attention; many systems have been proposed to solve various significant problems in this field. QML has been applied to a wide range of fields, including physics, biology and medical science. In this paper, we first identify the scope of this review by distinguishing QML from other QML systems, and then review all the noteworthy QML systems within this scope. The applications of QML in several application domains are also introduced briefly. Finally, the future directions of QML are explored from different perspectives.

  13. Qualitative models to predict impacts of human interventions in a wetland ecosystem

    Directory of Open Access Journals (Sweden)

    S. Loiselle

    2002-07-01

    The large shallow wetlands that dominate much of the South American continent are rich in biodiversity and complexity. Many of these undamaged ecosystems are presently being examined for their potential economic utility, putting pressure on local authorities and the conservation community to find ways of correctly utilising the available natural resources without compromising the ecosystem functioning and overall integrity. Contrary to many northern hemisphere ecosystems, there have been few long-term ecological studies of these systems, leading to a lack of quantitative data on which to construct ecological or resource use models. As a result, decision makers, even well meaning ones, have difficulty in determining if particular economic activities can potentially cause significant damage to the ecosystem and how one should go about monitoring the impacts of such activities. While the direct impact of many activities is often known, the secondary indirect impacts are usually less clear and can depend on local ecological conditions.

    Qualitative models are a helpful tool for highlighting potential feedback mechanisms and secondary effects of management action on ecosystem integrity. The harvesting of a single, apparently abundant, species can have indirect secondary effects on key trophic and abiotic compartments. In this paper, loop model analysis is used to qualitatively examine secondary effects of potential economic activities in a large wetland area in northeast Argentina, the Esteros del Ibera. Based on interaction with local actors together with observed ecological information, loop models were constructed to reflect relationships between biotic and abiotic compartments. A series of analyses were made to study the effect of different economic scenarios on key ecosystem compartments. Important impacts on key biotic compartments (phytoplankton, zooplankton, ichthyofauna, aquatic macrophytes and on the abiotic environment
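
    Loop (community-matrix) analysis of a press perturbation is commonly summarised by the signs of the negative inverse of the interaction matrix. The sketch below applies that idea to a hypothetical four-compartment web; the compartments and interaction signs are illustrative and are not the Esteros del Ibera model.

        import numpy as np

        compartments = ["macrophytes", "phytoplankton", "zooplankton", "fish"]
        # A[i, j] = effect of compartment j on the growth of compartment i (signs illustrative).
        A = np.array([
            [-1.0,  0.0,  0.0, -0.2],   # macrophytes: self-limited, grazed by fish
            [ 0.0, -1.0, -0.5,  0.0],   # phytoplankton: self-limited, eaten by zooplankton
            [ 0.0,  0.5, -1.0, -0.5],   # zooplankton: eats phytoplankton, eaten by fish
            [ 0.2,  0.0,  0.5, -1.0],   # fish: benefit from macrophytes and zooplankton
        ])

        # Column j of -inv(A) gives the long-term response of every compartment to a
        # sustained (press) input into compartment j; only the signs are interpreted.
        response = -np.linalg.inv(A)
        harvest_fish = -response[:, 3]          # harvesting = a negative press on fish
        for name, r in zip(compartments, harvest_fish):
            print(f"{name:>13s}: {'+' if r > 0 else '-' if r < 0 else '0'}")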

  14. Shear wave elastography for breast masses is highly reproducible.

    Science.gov (United States)

    Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude

    2012-05-01

    To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound. 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For the interobserver reproducibility, a blinded observer reviewed images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect while interobserver reproducibility of SWE proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.
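
    For readers unfamiliar with the two agreement statistics used above, the sketch below computes a linearly weighted kappa for ordinal feature scores and a two-way mixed, single-measurement ICC, i.e. ICC(3,1), for repeated quantitative measurements. The ratings are made up for illustration.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        # Ordinal colour scores for the same masses from two observers (invented values).
        observer1 = [2, 3, 1, 2, 3, 1, 2, 2, 3, 1]
        observer2 = [2, 3, 2, 2, 3, 1, 1, 2, 3, 1]
        kappa = cohen_kappa_score(observer1, observer2, weights="linear")

        def icc_3_1(ratings):
            """ICC(3,1) from an (n subjects x k repeated measurements) array."""
            ratings = np.asarray(ratings, dtype=float)
            n, k = ratings.shape
            grand = ratings.mean()
            ms_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            residual = (ratings - ratings.mean(axis=1, keepdims=True)
                        - ratings.mean(axis=0, keepdims=True) + grand)
            ms_err = (residual ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

        # Three repeated elasticity measurements (kPa) for four masses (invented values).
        elasticity = np.array([[55, 57, 54], [120, 118, 122], [80, 83, 79], [40, 41, 42]])
        print(round(kappa, 2), round(icc_3_1(elasticity), 2))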

  15. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
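
    The POI statistic itself is simply the proportion of replicates reported as identified, usually accompanied by a binomial confidence interval. A minimal sketch with invented replicate counts:

        from statsmodels.stats.proportion import proportion_confint

        # Hypothetical validation data: 27 of 30 target-material replicates identified.
        identified, replicates = 27, 30
        poi = identified / replicates
        low, high = proportion_confint(identified, replicates, alpha=0.05, method="wilson")
        print(f"POI = {poi:.2f}  (95% CI {low:.2f}-{high:.2f})")

        # The same statistic on a nontarget (undesirable) material estimates the
        # probability of misidentification, i.e. the false-positive rate.
        false_positives, n_nontarget = 1, 30
        print(f"P(misidentification) = {false_positives / n_nontarget:.2f}")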

  16. Qualitative models and experimental investigation of chaotic NOR gates and set/reset flip-flops

    Science.gov (United States)

    Rahman, Aminur; Jordan, Ian; Blackmore, Denis

    2018-01-01

    It has been observed through experiments and SPICE simulations that logical circuits based upon Chua's circuit exhibit complex dynamical behaviour. This behaviour can be used to design analogues of more complex logic families and some properties can be exploited for electronics applications. Some of these circuits have been modelled as systems of ordinary differential equations. However, as the number of components in newer circuits increases so does the complexity. This renders continuous dynamical systems models impractical and necessitates new modelling techniques. In recent years, some discrete dynamical models have been developed using various simplifying assumptions. To create a robust modelling framework for chaotic logical circuits, we developed both deterministic and stochastic discrete dynamical models, which exploit the natural recurrence behaviour, for two chaotic NOR gates and a chaotic set/reset flip-flop. This work presents a complete applied mathematical investigation of logical circuits. Experiments on our own designs of the above circuits are modelled and the models are rigorously analysed and simulated showing surprisingly close qualitative agreement with the experiments. Furthermore, the models are designed to accommodate dynamics of similarly designed circuits. This will allow researchers to develop ever more complex chaotic logical circuits with a simple modelling framework.

  17. Contextual sensitivity in scientific reproducibility

    Science.gov (United States)

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  18. Measurements of boat motion in waves at Durban harbour for qualitative validation of motion model

    CSIR Research Space (South Africa)

    Mosikare, OR

    2010-09-01

    Measurements of Boat Motion in Waves at Durban Harbour for Qualitative Validation of Motion Model. O.R. Mosikare (1,2), N.J. Theron (1), W. Van der Molen (1); (1) University of Pretoria, South Africa, 0001; (2) Council for Scientific and Industrial Research, Meiring Naude Rd, Brummeria, 0001.

  19. Reproducibility of somatosensory spatial perceptual maps.

    Science.gov (United States)

    Steenbergen, Peter; Buitenweg, Jan R; Trojan, Jörg; Veltink, Peter H

    2013-02-01

    Various studies have shown subjects to mislocalize cutaneous stimuli in an idiosyncratic manner. Spatial properties of individual localization behavior can be represented in the form of perceptual maps. Individual differences in these maps may reflect properties of internal body representations, and perceptual maps may therefore be a useful method for studying these representations. For this to be the case, individual perceptual maps need to be reproducible, which has not yet been demonstrated. We assessed the reproducibility of localizations measured twice on subsequent days. Ten subjects participated in the experiments. Non-painful electrocutaneous stimuli were applied at seven sites on the lower arm. Subjects localized the stimuli on a photograph of their own arm, which was presented on a tablet screen overlaying the real arm. Reproducibility was assessed by calculating intraclass correlation coefficients (ICC) for the mean localizations of each electrode site and the slope and offset of regression models of the localizations, which represent scaling and displacement of perceptual maps relative to the stimulated sites. The ICCs of the mean localizations ranged from 0.68 to 0.93; the ICCs of the regression parameters were 0.88 for the intercept and 0.92 for the slope. These results indicate a high degree of reproducibility. We conclude that localization patterns of non-painful electrocutaneous stimuli on the arm are reproducible on subsequent days. Reproducibility is a necessary property of perceptual maps for these to reflect properties of a subject's internal body representations. Perceptual maps are therefore a promising method for studying body representations.

  20. Integrated decision-making about housing, energy and wellbeing: a qualitative system dynamics model.

    Science.gov (United States)

    Macmillan, Alexandra; Davies, Michael; Shrubsole, Clive; Luxford, Naomi; May, Neil; Chiu, Lai Fong; Trutnevyte, Evelina; Bobrova, Yekatherina; Chalabi, Zaid

    2016-03-08

    The UK government has an ambitious goal to reduce carbon emissions from the housing stock through energy efficiency improvements. This single policy goal is a strong driver for change in the housing system, but comes with positive and negative "unintended consequences" across a broad range of outcomes for health, equity and environmental sustainability. The resulting policies are also already experiencing under-performance through a failure to consider housing as a complex system. This research aimed to move from considering disparate objectives of housing policies in isolation to mapping the links between environmental, economic, social and health outcomes as a complex system. We aimed to support a broad range of housing policy stakeholders to improve their understanding of housing as a complex system through a collaborative learning process. We used participatory system dynamics modelling to develop a qualitative causal theory linking housing, energy and wellbeing. Qualitative interviews were followed by two interactive workshops to develop the model, involving representatives from national and local government, housing industries, non-government organisations, communities and academia. More than 50 stakeholders from 37 organisations participated. The process resulted in a shared understanding of wellbeing as it relates to housing; an agreed set of criteria against which to assess to future policy options; and a comprehensive set of causal loop diagrams describing the housing, energy and wellbeing system. The causal loop diagrams cover seven interconnected themes: community connection and quality of neighbourhoods; energy efficiency and climate change; fuel poverty and indoor temperature; household crowding; housing affordability; land ownership, value and development patterns; and ventilation and indoor air pollution. The collaborative learning process and the model have been useful for shifting the thinking of a wide range of housing stakeholders towards a more

  1. Qualitative Fault Isolation of Hybrid Systems: A Structural Model Decomposition-Based Approach

    Science.gov (United States)

    Bregon, Anibal; Daigle, Matthew; Roychoudhury, Indranil

    2016-01-01

    Quick and robust fault diagnosis is critical to ensuring safe operation of complex engineering systems. A large number of techniques are available to provide fault diagnosis in systems with continuous dynamics. However, many systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete behavioral modes, each with its own continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task computationally more complex due to the large number of possible system modes and the existence of autonomous mode transitions. This paper presents a qualitative fault isolation framework for hybrid systems based on structural model decomposition. The fault isolation is performed by analyzing the qualitative information of the residual deviations. However, in hybrid systems this process becomes complex due to the possible existence of observation delays, which can cause observed deviations to be inconsistent with the expected deviations for the current mode in the system. The great advantage of structural model decomposition is that (i) it allows residuals to be designed that respond to only a subset of the faults, and (ii) every time a mode change occurs, only a subset of the residuals will need to be reconfigured, thus reducing the complexity of the reasoning process for isolation purposes. To demonstrate and test the validity of our approach, we use an electric circuit simulation as the case study.
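
    Qualitative fault isolation of this kind can be sketched as matching observed residual deviation symbols against expected fault signatures and retaining the consistent candidates. The residual names, signatures and observations below are hypothetical, and the paper's handling of observation delays and mode changes is not modelled.

        # Expected qualitative residual deviations (+, -, 0) for each hypothetical fault.
        signatures = {
            "capacitor degradation": {"r1": "+", "r2": "0", "r3": "-"},
            "resistor drift":        {"r1": "+", "r2": "+", "r3": "0"},
            "sensor bias":           {"r1": "0", "r2": "+", "r3": "0"},
        }

        def consistent(observed, signature):
            """A fault stays a candidate if no observed deviation contradicts its signature."""
            return all(signature.get(residual, "0") == deviation
                       for residual, deviation in observed.items())

        observed = {"r1": "+", "r2": "+"}   # r3 has not deviated (or has not been observed yet)
        candidates = [fault for fault, sig in signatures.items() if consistent(observed, sig)]
        print(candidates)                    # -> ['resistor drift']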

  2. Involving mental health service users in suicide-related research: a qualitative inquiry model.

    Science.gov (United States)

    Lees, David; Procter, Nicholas; Fassett, Denise; Handley, Christine

    2016-03-01

    To describe the research model developed and successfully deployed as part of a multi-method qualitative study investigating suicidal service-users' experiences of mental health nursing care. Quality mental health care is essential to limiting the occurrence and burden of suicide; however, there is a lack of relevant research informing practice in this context. Research utilising first-person accounts of suicidality is of particular importance to expanding the existing evidence base. However, conducting ethical research to support this imperative is challenging. The model discussed here illustrates specific and more generally applicable principles for qualitative research regarding sensitive topics and involving potentially vulnerable service-users. Research with mental health service users who have first-person experience of suicidality requires stakeholder and institutional support, researcher competency, and participant recruitment, consent, confidentiality, support and protection. Research with service users into their experiences of sensitive issues such as suicidality can result in rich and valuable data, and may also provide positive experiences of collaboration and inclusivity. If challenges are not met, objectification and marginalisation of service-users may be reinforced, and limitations in the evidence base and service provision may be perpetuated.

  3. Developing a change model for peer worker interventions in mental health services: a qualitative research study.

    Science.gov (United States)

    Gillard, S; Gibson, S L; Holley, J; Lucock, M

    2015-10-01

    A range of peer worker roles are being introduced into mental health services internationally. There is some evidence that attests to the benefits of peer workers for the people they support but formal trial evidence is inconclusive, in part because the change model underpinning peer support-based interventions is underdeveloped. Complex intervention evaluation guidance suggests that understandings of how an intervention is associated with change in outcomes should be modelled, theoretically and empirically, before the intervention can be robustly evaluated. This paper aims to model the change mechanisms underlying peer worker interventions. In a qualitative, comparative case study of ten peer worker initiatives in statutory and voluntary sector mental health services in England, in-depth interviews were carried out with 71 peer workers, service users, staff and managers, exploring their experiences of peer working. Using a Grounded Theory approach we identified core processes within the peer worker role that were productive of change for service users supported by peer workers. Key change mechanisms were: (i) building trusting relationships based on shared lived experience; (ii) role-modelling individual recovery and living well with mental health problems; (iii) engaging service users with mental health services and the community. Mechanisms could be further explained by theoretical literature on role-modelling and relationship in mental health services. We were able to model process and downstream outcomes potentially associated with peer worker interventions. An empirically and theoretically grounded change model can be articulated that usefully informs the development, evaluation and planning of peer worker interventions.

  4. Qualitative Validation of the IMM Model for ISS and STS Programs

    Science.gov (United States)

    Kerstman, E.; Walton, M.; Reyes, D.; Boley, L.; Saile, L.; Young, M.; Arellano, J.; Garcia, Y.; Myers, J. G.

    2016-01-01

    To validate and further improve the Integrated Medical Model (IMM), medical event data were obtained from 32 ISS and 122 STS person-missions. Using the crew characteristics from these observed missions, IMM v4.0 was used to forecast medical events and medical resource utilization. The IMM medical condition incidence values were compared to the actual observed medical event incidence values, and the IMM forecasted medical resource utilization was compared to actual observed medical resource utilization. Qualitative comparisons of these parameters were conducted for both the ISS and STS programs. The results of these analyses will provide validation of IMM v4.0 and reveal areas of the model requiring adjustments to improve the overall accuracy of IMM outputs. This validation effort should result in enhanced credibility of the IMM and improved confidence in the use of IMM as a decision support tool for human space flight.

  5. A toy model that predicts the qualitative role of bar bend in a push jerk.

    Science.gov (United States)

    Santos, Aaron; Meltzer, Norman E

    2009-11-01

    In this work, we describe a simple coarse-grained model of a barbell that can be used to determine the qualitative role of bar bend during a jerk. In simulations of this model, we observed a narrow time window during which the lifter can leverage the elasticity of the bar in order to lift the weight to a maximal height. This time window shifted to later times as the weight was increased. In addition, we found that the optimal time to initiate the drive was strongly correlated with the time at which the bar had reached a maximum upward velocity after recoiling. By isolating the effect of the bar, we obtained a generalized strategy for lifting heavy weight in the jerk.
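
    The paper's model is not reproduced here, but the basic mechanism (bar ends recoiling like a damped mass-spring system after the dip, with a velocity peak that suggests when to start the drive) can be illustrated with a few lines of numerical integration. The mass, stiffness, damping and dip depth are invented round numbers.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Invented round numbers: plate mass (kg), bar stiffness (N/m), damping (N*s/m), dip (m).
        m, k, c, dip = 90.0, 6.0e3, 60.0, 0.06

        def bar(t, y):        # y = [bar-end displacement, velocity] relative to the hands
            x, v = y
            return [v, (-k * x - c * v) / m]

        # The dip leaves the loaded bar ends deflected downward; integrate the recoil.
        sol = solve_ivp(bar, (0.0, 1.0), [-dip, 0.0], max_step=1e-3)
        t_peak = sol.t[np.argmax(sol.y[1])]
        print(f"bar recoil velocity peaks about {t_peak * 1000:.0f} ms after the dip stops")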

  6. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  7. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm² and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced.
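
    Whole-brain ADC histogram parameters of the kind compared above (peak location, peak height, mean ADC) reduce to a few lines of array code once an ADC map and brain mask are available; the synthetic values below stand in for real data.

        import numpy as np

        rng = np.random.default_rng(1)
        # Fictitious masked whole-brain ADC values (mm^2/s) standing in for a real map.
        adc = rng.normal(0.80e-3, 0.15e-3, size=50_000)
        adc = adc[adc > 0]

        counts, edges = np.histogram(adc, bins=128, density=True)
        centres = 0.5 * (edges[:-1] + edges[1:])
        peak = counts.argmax()

        print(f"peak location: {centres[peak]:.2e} mm^2/s")
        print(f"peak height:   {counts[peak]:.3g}")
        print(f"mean ADC:      {adc.mean():.2e} mm^2/s")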

  8. Use of Game Theory to model patient engagement after surgery: a qualitative analysis.

    Science.gov (United States)

    Castellanos, Stephen A; Buentello, Gerardo; Gutierrez-Meza, Diana; Forgues, Angela; Haubert, Lisa; Artinyan, Avo; Macdonald, Cameron L; Suliburk, James W

    2018-01-01

    Patient engagement is challenging to define and operationalize. Qualitative analysis allows us to explore patient perspectives on this topic and establish themes. A game theoretic signaling model also provides a framework through which to further explore engagement. Over a 6-mo period, thirty-eight interviews were conducted within 6 wk of discharge in patients undergoing thyroid, parathyroid, or colorectal surgery. Interviews were transcribed, anonymized, and analyzed using the NVivo 11 platform. A signaling model was then developed depicting the doctor-patient interaction surrounding the patient's choice to reach out to their physician with postoperative concerns based upon the patient's perspective of the doctor's availability. This was defined as "engagement". We applied the model to the qualitative data to determine possible causations for a patient's engagement or lack thereof. A private hospital's and a safety net hospital's populations were contrasted. The private patient population was more likely to engage than their safety-net counterparts. Using our model in conjunction with patient data, we determined possible etiologies for this engagement to be due to the private patient's perceived probability of dealing with an available doctor and apparent signals from the doctor indicating so. For the safety-net population, decreased access to care caused them to be less willing to engage with a doctor perceived as possibly unavailable. A physician who understands these Game Theory concepts may be able to alter their interactions with their patients, tailoring responses and demeanor to fit the patient's circumstances and possible barriers to engagement. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, behavior testing associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in investigating and reporting of behavioral phenotypes.

  10. Analysis of Water Conflicts across Natural and Societal Boundaries: Integration of Quantitative Modeling and Qualitative Reasoning

    Science.gov (United States)

    Gao, Y.; Balaram, P.; Islam, S.

    2009-12-01

    , the knowledge generated from these studies cannot be easily generalized or transferred to other basins. Here, we present an approach to integrate the quantitative and qualitative methods to study water issues and capture the contextual knowledge of water management, by combining the NSSs framework and an area of artificial intelligence called qualitative reasoning. Using the Apalachicola-Chattahoochee-Flint (ACF) River Basin dispute as an example, we demonstrate how quantitative modeling and qualitative reasoning can be integrated to examine the impact of over-abstraction of water from the river on the ecosystem and the role of governance in shaping the evolution of the ACF water dispute.

  11. Configurational Model for Conductivity of Stabilized Fluorite Structure Oxides

    DEFF Research Database (Denmark)

    Poulsen, Finn Willy

    1981-01-01

    The formalism developed here furnishes means by which ionic configurations, solid solution limits, and conductivity mechanisms in doped fluorite structures can be described. The present model differs markedly from previous models but qualitatively reproduces reality. The analysis reported...

  12. A fuzzy-logic-based approach to qualitative safety modelling for marine systems

    International Nuclear Information System (INIS)

    Sii, H.S.; Ruxton, Tom; Wang Jin

    2001-01-01

    Safety assessment based on conventional tools (e.g. probability risk assessment (PRA)) may not be well suited for dealing with systems having a high level of uncertainty, particularly in the feasibility and concept design stages of a maritime or offshore system. By contrast, a safety model using a fuzzy logic approach employing fuzzy IF-THEN rules can model the qualitative aspects of human knowledge and reasoning processes without employing precise quantitative analyses. A fuzzy-logic-based approach may be more appropriately used to carry out risk analysis in the initial design stages. This provides a tool for working directly with the linguistic terms commonly used in carrying out safety assessment. This research focuses on the development and representation of linguistic variables to model risk levels subjectively. These variables are then quantified using fuzzy sets. In this paper, the development of a safety model using a fuzzy logic approach for modelling various design variables for maritime and offshore safety-based decision making in the concept design stage is presented. An example is used to illustrate the proposed approach.
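
    A minimal Mamdani-style sketch of what a fuzzy IF-THEN safety rule base can look like is given below: two linguistic inputs (failure likelihood and consequence severity) are combined by four rules into a qualitative risk level, which is then defuzzified. The membership functions and rules are invented for illustration and are not those of the cited model.

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with breakpoints a < b < c."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        # Output ("risk level") fuzzy sets on a 0-10 scale; the outer breakpoints sit just
        # outside the scale so that no division by zero occurs at the shoulders.
        risk_axis = np.linspace(0.0, 10.0, 201)
        risk_sets = {"low": tri(risk_axis, -1, 0, 5),
                     "moderate": tri(risk_axis, 2, 5, 8),
                     "high": tri(risk_axis, 5, 10, 11)}

        def assess(likelihood, severity):   # both linguistic inputs given on a 0-10 scale
            mu_like = {"low": tri(likelihood, -1, 0, 5), "high": tri(likelihood, 5, 10, 11)}
            mu_sev = {"minor": tri(severity, -1, 0, 5), "major": tri(severity, 5, 10, 11)}
            # Rule base, e.g. IF likelihood is high AND severity is major THEN risk is high.
            rules = [(min(mu_like["high"], mu_sev["major"]), "high"),
                     (min(mu_like["high"], mu_sev["minor"]), "moderate"),
                     (min(mu_like["low"], mu_sev["major"]), "moderate"),
                     (min(mu_like["low"], mu_sev["minor"]), "low")]
            aggregated = np.zeros_like(risk_axis)
            for strength, label in rules:   # clip each consequent and aggregate with max
                aggregated = np.maximum(aggregated, np.minimum(strength, risk_sets[label]))
            return np.sum(aggregated * risk_axis) / np.sum(aggregated)   # centroid defuzzification

        print(round(float(assess(likelihood=7.0, severity=8.0)), 2))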

  13. Being reflexive in qualitative grounded theory: discussion and application of a model of reflexivity.

    Science.gov (United States)

    Engward, Hilary; Davis, Geraldine

    2015-07-01

    A discussion of the meaning of reflexivity in research with the presentation of examples of how a model of reflexivity was used in a grounded theory research project. Reflexivity requires the researcher to make transparent the decisions they make in the research process and is therefore important in developing quality in nursing research. The importance of being reflexive is highlighted in the literature in relation to nursing research; however, practical guidance as to how to go about doing research reflexively is not always clearly articulated. This is a discussion paper. The concept of reflexivity in research is explored using the Alvesson and Skoldberg model of reflexivity and practical examples of how a researcher developed reflexivity in a grounded theory project are presented. Nurse researchers are encouraged to explore and apply the concept of reflexivity in their research practices to develop transparency in the research process and to increase robustness in their research. The Alvesson and Skoldberg model is of value in applying reflexivity in qualitative nursing research, particularly in grounded theory research. Being reflexive requires the researcher to be completely open about decisions that are made in the research process. The Alvesson and Skoldberg model of reflexivity is a useful model that can enhance reflexivity in the research process. It can be a useful practical tool to develop reflexivity in grounded theory research. © 2015 John Wiley & Sons Ltd.

  14. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
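
    Stripped of the library, model-construction and analysis modules, the core of a discrete event simulator of this kind is a time-ordered event queue plus handlers that schedule further events. The sketch below shows only that core; the two-event "valve" process is invented.

        import heapq

        class Simulator:
            """Bare-bones discrete event simulator: a time-ordered queue of events."""

            def __init__(self):
                self.queue = []     # entries are (time, sequence number, handler, payload)
                self.now = 0.0
                self._seq = 0       # tie-breaker keeps event ordering deterministic

            def schedule(self, delay, handler, payload=None):
                heapq.heappush(self.queue, (self.now + delay, self._seq, handler, payload))
                self._seq += 1

            def run(self):
                while self.queue:   # process events in time order until the queue empties
                    self.now, _, handler, payload = heapq.heappop(self.queue)
                    handler(self, payload)

        def open_valve(sim, _):
            print(f"t={sim.now:4.1f}: valve opens, flow starts")
            sim.schedule(3.0, close_valve)   # mode-dependent behaviour: stay open for 3 s

        def close_valve(sim, _):
            print(f"t={sim.now:4.1f}: valve closes, flow stops")

        sim = Simulator()
        sim.schedule(1.0, open_valve)
        sim.run()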

  15. Qualitative and quantitative examination of the performance of regional air quality models representing different modeling approaches

    International Nuclear Information System (INIS)

    Bhumralkar, C.M.; Ludwig, F.L.; Shannon, J.D.; McNaughton, D.

    1985-04-01

    The calculations of three different air quality models were compared with the best available observations. The comparisons were made without calibrating the models to improve agreement with the observations. Model performance was poor for short averaging times (less than 24 hours). Some of the poor performance can be traced to errors in the input meteorological fields, but errors exist at all levels. It should be noted that these models were not originally designed for treating short-term episodes. For short-term episodes, much of the variance in the data can arise from small spatial scale features that tend to be averaged out over longer periods. These small spatial scale features cannot be resolved with the coarse grids that are used for the meteorological and emissions inputs. Thus, it is not surprising that the models performed better for the longer averaging times. The models compared were RTM-II, ENAMAP-2 and ACID. (17 refs., 5 figs., 4 tabs.)

  16. Model-Based Analysis for Qualitative Data: An Application in Drosophila Germline Stem Cell Regulation

    Science.gov (United States)

    Pargett, Michael; Rundell, Ann E.; Buzzard, Gregery T.; Umulis, David M.

    2014-01-01

    Discovery in developmental biology is often driven by intuition that relies on the integration of multiple types of data such as fluorescent images, phenotypes, and the outcomes of biochemical assays. Mathematical modeling helps elucidate the biological mechanisms at play as the networks become increasingly large and complex. However, the available data is frequently under-utilized due to incompatibility with quantitative model tuning techniques. This is the case for stem cell regulation mechanisms explored in the Drosophila germarium through fluorescent immunohistochemistry. To enable better integration of biological data with modeling in this and similar situations, we have developed a general parameter estimation process to quantitatively optimize models with qualitative data. The process employs a modified version of the Optimal Scaling method from social and behavioral sciences, and multi-objective optimization to evaluate the trade-off between fitting different datasets (e.g. wild type vs. mutant). Using only published imaging data in the germarium, we first evaluated support for a published intracellular regulatory network by considering alternative connections of the same regulatory players. Simply screening networks against wild type data identified hundreds of feasible alternatives. Of these, five parsimonious variants were found and compared by multi-objective analysis including mutant data and dynamic constraints. With these data, the current model is supported over the alternatives, but support for a biochemically observed feedback element is weak (i.e. these data do not measure the feedback effect well). When also comparing new hypothetical models, the available data do not discriminate. To begin addressing the limitations in data, we performed a model-based experiment design and provide recommendations for experiments to refine model parameters and discriminate increasingly complex hypotheses. PMID:24626201
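
    The modified Optimal Scaling step can be sketched as follows: because the qualitative observations carry only an ordering, they are given the monotone quantification that best matches a candidate model's output (isotonic regression), and the remaining discrepancy serves as that model's fitting cost. The ranks and model outputs below are invented, and this is a simplified stand-in for the published procedure.

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        # Invented example: ordinal intensity classes and one candidate model's output.
        ordinal_rank = np.array([1, 1, 2, 2, 3, 3, 4, 4])
        model_output = np.array([0.10, 0.15, 0.22, 0.30, 0.28, 0.55, 0.60, 0.95])

        def optimal_scaling_cost(ranks, prediction):
            """SSE between a prediction and the best monotone quantification of the ranks."""
            iso = IsotonicRegression(increasing=True)
            scaled = iso.fit(ranks, prediction).predict(ranks)
            return float(np.sum((prediction - scaled) ** 2))

        # Lower cost means the model output can be ordered consistently with the qualitative
        # data; comparing costs across candidate networks feeds the multi-objective analysis.
        print(optimal_scaling_cost(ordinal_rank, model_output))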

  17. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  18. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built–up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....

  19. A method to identify energy efficiency measures for factory systems based on qualitative modeling

    CERN Document Server

    Krones, Manuela

    2017-01-01

    Manuela Krones develops a method that supports factory planners in generating energy-efficient planning solutions. The method provides qualitative description concepts for factory planning tasks and energy efficiency knowledge as well as an algorithm-based linkage between these measures and the respective planning tasks. Its application is guided by a procedure model which allows general applicability in the manufacturing sector. The results contain energy efficiency measures that are suitable for a specific planning task and reveal the roles of various actors for the measures’ implementation. Contents: Driving Concerns for and Barriers against Energy Efficiency; Approaches to Increase Energy Efficiency in Factories; Socio-Technical Description of Factory Planning Tasks; Description of Energy Efficiency Measures; Case Studies on Welding Processes and Logistics Systems. Target Groups: Lecturers and Students of Industrial Engineering, Production Engineering, Environmental Engineering, Mechanical Engineering; Practi...

  20. Applying the chronic care model to an employee benefits program: a qualitative inquiry.

    Science.gov (United States)

    Schauer, Gillian L; Wilson, Mark; Barrett, Barbara; Honeycutt, Sally; Hermstad, April K; Kegler, Michelle C

    2013-12-01

    To assess how employee benefits programs may strengthen and/or complement elements of the chronic care model (CCM), a framework used by health systems to improve chronic illness care. A qualitative inquiry consisting of semi-structured interviews with employee benefit administrators and partners from a self-insured, self-administered employee health benefits program was conducted at a large family-owned business in southwest Georgia. Results indicate that the employer adapted and used many health system-related elements of the CCM in the design of their benefit program. Data also suggest that the employee benefits program contributed to self-management skills and to informing and activating patients to interact with the health system. Findings suggest that employee benefits programs can use aspects of the CCM in their own benefit design, and can structure their benefits to contribute to patient-related elements from the CCM.

  1. Evaluation of Land Surface Models in Reproducing Satellite-Derived LAI over the High-Latitude Northern Hemisphere. Part I: Uncoupled DGVMs

    Directory of Open Access Journals (Sweden)

    Ning Zeng

    2013-10-01

    Full Text Available Leaf Area Index (LAI represents the total surface area of leaves above a unit area of ground and is a key variable in any vegetation model, as well as in climate models. New high resolution LAI satellite data is now available covering a period of several decades. This provides a unique opportunity to validate LAI estimates from multiple vegetation models. The objective of this paper is to compare new, satellite-derived LAI measurements with modeled output for the Northern Hemisphere. We compare monthly LAI output from eight land surface models from the TRENDY compendium with satellite data from an Artificial Neural Network (ANN from the latest version (third generation of GIMMS AVHRR NDVI data over the period 1986–2005. Our results show that all the models overestimate the mean LAI, particularly over the boreal forest. We also find that seven out of the eight models overestimate the length of the active vegetation-growing season, mostly due to a late dormancy as a result of a late summer phenology. Finally, we find that the models report a much larger positive trend in LAI over this period than the satellite observations suggest, which translates into a higher trend in the growing season length. These results highlight the need to incorporate a larger number of more accurate plant functional types in all models and, in particular, to improve the phenology of deciduous trees.
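    As a simple illustration of the kind of model-satellite comparison described above (mean LAI bias and linear trend over 1986–2005), the sketch below uses synthetic monthly series standing in for one model and the satellite product; all numbers are placeholders, not data from the study.

```python
import numpy as np

# Illustrative comparison of a modelled and a satellite-derived monthly LAI
# series (240 months); the arrays here are synthetic placeholders.
rng = np.random.default_rng(0)
months = np.arange(240)
lai_satellite = 2.0 + 1.5 * np.sin(2 * np.pi * months / 12) + 0.001 * months \
                + rng.normal(0, 0.1, months.size)
lai_model = 2.6 + 1.5 * np.sin(2 * np.pi * (months - 1) / 12) + 0.004 * months \
            + rng.normal(0, 0.1, months.size)

# Mean bias (model minus satellite), the quantity used to flag LAI overestimation.
bias = np.mean(lai_model - lai_satellite)

# Linear trends (LAI units per decade) from a least-squares fit.
trend_sat = np.polyfit(months, lai_satellite, 1)[0] * 120
trend_mod = np.polyfit(months, lai_model, 1)[0] * 120

print(f"mean bias = {bias:.2f}, satellite trend = {trend_sat:.2f}/decade, "
      f"model trend = {trend_mod:.2f}/decade")
```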

  2. An Early Model for Value and Sustainability in Health Information Exchanges: Qualitative Study

    Science.gov (United States)

    2018-01-01

    Background The primary value relative to health information exchange has been seen in terms of cost savings relative to laboratory and radiology testing, emergency department expenditures, and admissions. However, models are needed to statistically quantify value and sustainability and better understand the dependent and mediating factors that contribute to value and sustainability. Objective The purpose of this study was to provide a basis for early model development for health information exchange value and sustainability. Methods A qualitative study was conducted with 21 interviews of eHealth Exchange participants across 10 organizations. Using a grounded theory approach and 3.0 as a relative frequency threshold, 5 main categories and 16 subcategories emerged. Results This study identifies 3 core current perceived value factors and 5 potential perceived value factors—how interviewees predict health information exchanges may evolve as there are more participants. These value factors were used as the foundation for early model development for sustainability of health information exchange. Conclusions Using the value factors from the interviews, the study provides the basis for early model development for health information exchange value and sustainability. This basis includes factors from the research: fostering consumer engagement; establishing a provider directory; quantifying use, cost, and clinical outcomes; ensuring data integrity through patient matching; and increasing awareness, usefulness, interoperability, and sustainability of eHealth Exchange. PMID:29712623

  3. Autobiography and Anorexia: A Qualitative Alternative to Prochaska and DiClemente's Stages of Change Model

    Directory of Open Access Journals (Sweden)

    Félix Díaz

    2012-11-01

    Full Text Available In this article, we propose a qualitative approach to the study of the ways in which people face good and poor health issues. During the last 30 years, Prochaska and DiClemente's "trans-theoretical model" (1982, 1983, 1984, 1986, 1992) has gained relevance as a model to assess disposition for change in patients. We review the features of the model and its common techniques for assessing stages of change, underlining its methodological and conceptual problems. Particularly, we discuss the paradoxes raised by "pre-contemplation" as a concept; the exogenous definition of human problems in terms of institutional and clinical criteria; and the ambiguity of the model, where the purpose of accompanying and orienting the patient contrasts with the imposition of problem definitions and solution strategies. We propose a narrative analysis of autobiographies of patients as an alternative that recasts their own notions of "change," "problem," and "vital trajectory." We illustrate this possibility with the analysis of an autobiographic interview with a woman who has a history of anorexia. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1203209

  4. An Early Model for Value and Sustainability in Health Information Exchanges: Qualitative Study.

    Science.gov (United States)

    Feldman, Sue S

    2018-04-30

    The primary value relative to health information exchange has been seen in terms of cost savings relative to laboratory and radiology testing, emergency department expenditures, and admissions. However, models are needed to statistically quantify value and sustainability and better understand the dependent and mediating factors that contribute to value and sustainability. The purpose of this study was to provide a basis for early model development for health information exchange value and sustainability. A qualitative study was conducted with 21 interviews of eHealth Exchange participants across 10 organizations. Using a grounded theory approach and 3.0 as a relative frequency threshold, 5 main categories and 16 subcategories emerged. This study identifies 3 core current perceived value factors and 5 potential perceived value factors-how interviewees predict health information exchanges may evolve as there are more participants. These value factors were used as the foundation for early model development for sustainability of health information exchange. Using the value factors from the interviews, the study provides the basis for early model development for health information exchange value and sustainability. This basis includes factors from the research: fostering consumer engagement; establishing a provider directory; quantifying use, cost, and clinical outcomes; ensuring data integrity through patient matching; and increasing awareness, usefulness, interoperability, and sustainability of eHealth Exchange. ©Sue S Feldman. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 30.04.2018.

  5. Fault Tree Analysis with Temporal Gates and Model Checking Technique for Qualitative System Safety Analysis

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2010-01-01

    Although fault tree analysis (FTA) has been one of the most widely used safety analysis techniques in the nuclear industry, it suffers from several drawbacks: it uses only static gates and hence cannot precisely capture dynamic behaviors of complex systems, it lacks rigorous semantics, and its reasoning process (checking whether basic events really cause the top event) is done manually and is therefore labor-intensive and time-consuming for complex systems. Although several attempts have been made to overcome these problems, they still cannot model absolute (actual) time because they adopt a relative time concept and can capture only sequential behaviors of the system. In this work, to resolve these problems, FTA and model checking are integrated to provide formal, automated and qualitative assistance to informal and/or quantitative safety analysis. Our approach proposes to build a formal model of the system together with fault trees. We introduce several temporal gates based on timed computation tree logic (TCTL) to capture absolute-time behaviors of the system and to give concrete semantics to fault tree gates, reducing errors during the analysis, and we use model checking to automate the reasoning process of FTA
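    To illustrate, outside the authors' TCTL-based formalization, why purely static gates lose the ordering information that temporal gates are meant to capture, a minimal sketch with hypothetical events and timestamps:

```python
# Minimal illustration (not the authors' method) of why static fault tree gates
# lose ordering information. Event names and failure times are hypothetical.

def and_gate(*events):          # static AND: all basic events occurred
    return all(events)

def priority_and(t_first, t_second):
    """Sequence-dependent gate: true only if the first event precedes the second."""
    return t_first is not None and t_second is not None and t_first < t_second

# Two scenarios with the same set of occurred events but different ordering.
pump_fail_t, valve_fail_t = 10.0, 25.0       # pump fails before valve
static_top = and_gate(pump_fail_t is not None, valve_fail_t is not None)
temporal_top = priority_and(pump_fail_t, valve_fail_t)
print(static_top, temporal_top)              # True True

pump_fail_t, valve_fail_t = 25.0, 10.0       # valve fails first
static_top = and_gate(pump_fail_t is not None, valve_fail_t is not None)
temporal_top = priority_and(pump_fail_t, valve_fail_t)
print(static_top, temporal_top)              # True False -- ordering matters
```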

  6. An examination of qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations

    International Nuclear Information System (INIS)

    Herbert, M.; Williams, G.

    1986-01-01

    New qualitative techniques for representing the behaviour of physical systems have recently been developed. These allow a qualitative representation to be formally derived from a quantitative plant model. One such technique, Incremental Qualitative Analysis, is based on manipulating qualitative differential equations, called confluences, using sign algebra. This is described and its potential for reducing the amount of information presented to the reactor operator is discussed. In order to illustrate the technique, a specific example relating to the influence of failures associated with a pressurized water reactor pressuriser is presented. It is shown that, although failures cannot necessarily be diagnosed unambiguously, the number of possible failures inferred is low. Techniques for discriminating between these possible failures are discussed. (author)

  7. A qualitative assessment of a community antiretroviral therapy group model in Tete, Mozambique.

    Directory of Open Access Journals (Sweden)

    Freya Rasschaert

    Full Text Available BACKGROUND: To improve retention on ART, Médecins Sans Frontières, the Ministry of Health and patients piloted a community-based antiretroviral distribution and adherence monitoring model through Community ART Groups (CAG) in Tete, Mozambique. By December 2012, almost 6000 patients on ART had formed groups of whom 95.7% were retained in care. We conducted a qualitative study to evaluate the relevance, dynamic and impact of the CAG model on patients, their communities and the healthcare system. METHODS: Between October 2011 and May 2012, we conducted 16 focus group discussions and 24 in-depth interviews with the major stakeholders involved in the CAG model. Audio-recorded data were transcribed verbatim and analysed using a grounded theory approach. RESULTS: Six key themes emerged from the data: (1) Barriers to access HIV care, (2) CAG functioning and actors involved, (3) Benefits for CAG members, (4) Impacts of CAG beyond the group members, (5) Setbacks, and (6) Acceptance and future expectations of the CAG model. The model provides cost and time savings, certainty of ART access and mutual peer support resulting in better adherence to treatment. Through the active role of patients, HIV information could be conveyed to the broader community, leading to an increased uptake of services and positive transformation of the identity of people living with HIV. Potential pitfalls included limited access to CAG for those most vulnerable to defaulting, some inequity to patients in individual ART care and a high dependency on counsellors. CONCLUSION: The CAG model resulted in active patient involvement and empowerment, and the creation of a supportive environment improving the ART retention. It also sparked a reorientation of healthcare services towards the community and strengthened community actions. Successful implementation and scalability requires (a) the acceptance of patients as partners in health, (b) adequate resources, and (c) a well-functioning monitoring and

  8. Assessing parameter importance of the Common Land Model based on qualitative and quantitative sensitivity analysis

    Directory of Open Access Journals (Sweden)

    J. Li

    2013-08-01

    Full Text Available Proper specification of model parameters is critical to the performance of land surface models (LSMs). Due to high dimensionality and parameter interaction, estimating parameters of an LSM is a challenging task. Sensitivity analysis (SA) is a tool that can screen out the most influential parameters on model outputs. In this study, we conducted parameter screening for six output fluxes for the Common Land Model: sensible heat, latent heat, upward longwave radiation, net radiation, soil temperature and soil moisture. A total of 40 adjustable parameters were considered. Five qualitative SA methods, including local, sum-of-trees, multivariate adaptive regression splines, delta test and Morris methods, were compared. The proper sampling design and sufficient sample size necessary to effectively screen out the sensitive parameters were examined. We found that there are 2–8 sensitive parameters, depending on the output type, and about 400 samples are adequate to reliably identify the most sensitive parameters. We also employed a revised Sobol' sensitivity method to quantify the importance of all parameters. The total effects of the parameters were used to assess the contribution of each parameter to the total variances of the model outputs. The results confirmed that global SA methods can generally identify the most sensitive parameters effectively, while local SA methods result in type I errors (i.e., sensitive parameters labeled as insensitive) or type II errors (i.e., insensitive parameters labeled as sensitive). Finally, we evaluated and confirmed the screening results for their consistency with the physical interpretation of the model parameters.
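    A screening workflow of this kind can be reproduced in spirit with the open-source SALib package; the sketch below is only an illustration on a toy three-parameter stand-in for one model output flux (the parameter names, bounds, and toy model are assumptions, and SALib with its Morris sampler/analyzer is assumed to be installed), not the study's own code.

```python
# Illustrative Morris screening with SALib on a toy 3-parameter model that
# stands in for an LSM output flux; names, bounds and the model are hypothetical.
import numpy as np
from SALib.sample.morris import sample as morris_sample
from SALib.analyze.morris import analyze as morris_analyze

problem = {
    "num_vars": 3,
    "names": ["porosity", "roughness_length", "stomatal_resistance"],
    "bounds": [[0.3, 0.6], [0.01, 0.1], [50.0, 300.0]],
}

def toy_model(x):
    # Stand-in for one output flux; only the first two inputs matter much.
    return 5.0 * x[0] + 20.0 * x[1] + 0.001 * x[2]

X = morris_sample(problem, 100, num_levels=4)     # 100 Morris trajectories
Y = np.apply_along_axis(toy_model, 1, X)
Si = morris_analyze(problem, X, Y, num_levels=4)

for name, mu_star in zip(problem["names"], Si["mu_star"]):
    print(f"{name}: mu* = {mu_star:.3f}")         # larger mu* => more influential
```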

  9. Religious views of the 'medical' rehabilitation model: a pilot qualitative study.

    Science.gov (United States)

    Yamey, Gavin; Greenwood, Richard

    2004-04-22

    To explore the religious beliefs that patients may bring to the rehabilitation process, and the hypothesis that these beliefs may diverge from the medical model of rehabilitation. Qualitative semi-structured interviews with representatives of six major religions--Islam, Buddhism, Christianity, Judaism, Sikhism, and Hinduism. Representatives were either health care professionals or religious leaders, all with an interest in how their religion approached health issues. There were three recurrent themes in the interviews: religious explanations for injury and illness; beliefs about recovery; religious duties of care towards family members. The Buddhist, Sikh, and Hindu interviewees described beliefs about karma--unfortunate events happening due to a person's former deeds. Fatalistic ideas, involving God having control over an individual's recovery, were expressed by the Muslim, Jewish, and Christian interviewees. All interviewees expressed the fundamental importance of a family's religious duty of care towards ill or injured relatives, and all expressed some views that were compatible with the medical model of rehabilitation. Religious beliefs may both diverge from and resonate with the medical rehabilitation model. Understanding these beliefs may be valuable in facilitating the rehabilitation of diverse religious groups.

  10. Health literacy and the social determinants of health: a qualitative model from adult learners.

    Science.gov (United States)

    Rowlands, Gillian; Shaw, Adrienne; Jaswal, Sabrena; Smith, Sian; Harpham, Trudy

    2017-02-01

    Health literacy, ‘the personal characteristics and social resources needed for individuals and communities to access, understand, appraise and use information and services to make decisions about health’, is key to improving people’s control over modifiable social determinants of health (SDH). This study listened to adult learners to understand their perspectives on gathering, understanding and using information for health. This qualitative project recruited participants from community skills courses to identify relevant ‘health information’ factors. Subsequently, different learners put these together to develop a model of their ‘Journey to health’. Twenty-seven participants were recruited; twenty from community health literacy courses and seven from an adult basic literacy and numeracy course. Participants described health as a ‘journey’ starting from an individual's family, ethnicity and culture. Basic (functional) health literacy skills were needed to gather and understand information. More complex interactive health literacy skills were needed to evaluate the importance and relevance of information in context, and make health decisions. Critical health literacy skills could be used to adapt negative external factors that might inhibit health-promotion. Our model is an iterative linear one moving from ethnicity, community and culture, through lifestyle, to health, with learning revisited in the context of different sources of support. It builds on existing models by highlighting the importance of SDH in the translation of new health knowledge into healthy behaviours, and the importance of health literacy in enabling people to overcome barriers to health.

  11. Comparing Reasons for Quitting Substance Abuse with the Constructs of Behavioral Models: A Qualitative Study

    Directory of Open Access Journals (Sweden)

    Hamid Tavakoli Ghouchani

    2015-03-01

    Full Text Available Background and Objectives: The world population has reached over seven billion people. Of these, 230 million individuals abuse substances. Therefore, substance abuse prevention and treatment programs have received increasing attention during the past two decades. Understanding people’s motivations for quitting drug abuse is essential to the success of treatment. This study hence sought to identify major motivations for quitting and to compare them with the constructs of health education models. Materials and Methods: In the present study, qualitative content analysis was used to determine the main motivations for quitting substance abuse. Overall, 22 patients, physicians, and psychotherapists were selected from several addiction treatment clinics in Bojnord (Iran during 2014. Purposeful sampling method was applied and continued until data saturation was achieved. Data were collected through semi-structured, face-to-face interviews and field notes. All interviews were recorded and transcribed. Results: Content analysis revealed 33 sub-categories and nine categories including economic problems, drug-related concerns, individual problems, family and social problems, family expectations, attention to social status, beliefs about drug addiction, and valuing the quitting behavior. Accordingly, four themes, i.e. perceived threat, perceived barriers, attitude toward the behavior, and subjective norms, were extracted. Conclusion: Reasons for quitting substance abuse match the constructs of different behavioral models (e.g. the health belief model and the theory of planned behavior.

  12. Nursing students' perceptions of a collaborative clinical placement model: A qualitative descriptive study.

    Science.gov (United States)

    van der Riet, Pamela; Levett-Jones, Tracy; Courtney-Pratt, Helen

    2018-03-01

    Clinical placements are specifically designed to facilitate authentic learning opportunities and are an integral component of undergraduate nursing programs. However, as academics and clinicians frequently point out, clinical placements are fraught with problems that are long-standing and multidimensional in nature. Collaborative placement models, grounded in a tripartite relationship between students, university staff and clinical partners, and designed to foster students' sense of belonging, have recently been implemented to address many of the challenges associated with clinical placements. In this study, a qualitative descriptive design was used with the aim of exploring 14 third-year nursing students' perceptions of a collaborative clinical placement model undertaken at an Australian university. Students participated in audio recorded focus groups following their final clinical placement. Thematic analysis of the interview data resulted in identification of six main themes: Convenience and Camaraderie, Familiarity and Confidence, Welcomed and Wanted, Belongingness and Support, Employment, and The Need for Broader Clinical Experiences. The clinical collaborative model fostered a sense of familiarity for many of the participants and this led to belongingness, acceptance, confidence and meaningful learning experiences. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. A Conceptual Model of Irritability Following Traumatic Brain Injury: A Qualitative, Participatory Research Study.

    Science.gov (United States)

    Hammond, Flora M; Davis, Christine; Cook, James R; Philbrick, Peggy; Hirsch, Mark A

    2016-01-01

    Individuals with a history of traumatic brain injury (TBI) may have chronic problems with irritability, which can negatively affect their lives. (1) To describe the experience (thoughts and feelings) of irritability from the perspectives of multiple people living with or affected by the problem, and (2) to develop a conceptual model of irritability. Qualitative, participatory research. Forty-four stakeholders (individuals with a history of TBI, family members, community professionals, healthcare providers, and researchers) divided into 5 focus groups. Each group met 10 times to discuss the experience of irritability following TBI. Data were coded using grounded theory to develop themes, metacodes, and theories. Not applicable. A conceptual model emerged in which irritability has 5 dimensions: affective (related to moods and feelings); behavioral (especially in areas of self-regulation, impulse control, and time management); cognitive-perceptual (self-talk and ways of seeing the world); relational issues (interpersonal and family dynamics); and environmental (including environmental stimuli, change, disruptions in routine, and cultural expectations). This multidimensional model provides a framework for assessment, treatment, and future research aimed at better understanding irritability, as well as the development of assessment tools and treatment interventions.

  14. Selection mechanisms underlying high impact biomedical research--a qualitative analysis and causal model.

    Directory of Open Access Journals (Sweden)

    Hilary Zelko

    Full Text Available BACKGROUND: Although scientific innovation has been a long-standing topic of interest for historians, philosophers and cognitive scientists, few studies in biomedical research have examined from researchers' perspectives how high impact publications are developed and why they are consistently produced by a small group of researchers. Our objective was therefore to interview a group of researchers with a track record of high impact publications to explore what mechanisms they believe contribute to the generation of high impact publications. METHODOLOGY/PRINCIPAL FINDINGS: Researchers were located in universities all over the globe and interviews were conducted by phone. All interviews were transcribed using standard qualitative methods. A Grounded Theory approach was used to code each transcript, later aggregating concepts and categories into an overarching explanation model. The model was then translated into a System Dynamics mathematical model to represent its structure and behavior. Five emerging themes were found in our study. First, researchers used heuristics or rules of thumb that came naturally to them. Second, these heuristics were reinforced by positive feedback from their peers and mentors. Third, good communication skills allowed researchers to provide feedback to their peers, thus closing a positive feedback loop. Fourth, researchers exhibited a number of psychological attributes such as curiosity or open-mindedness that constantly motivated them, even when faced with discouraging situations. Fifth, the system is dominated by randomness and serendipity and is far from a linear and predictable environment. Some researchers, however, took advantage of this randomness by incorporating mechanisms that would allow them to benefit from random findings. The aggregation of these themes into a policy model represented the overall expected behavior of publications and their impact achieved by high impact researchers. CONCLUSIONS: The proposed

  15. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low energy mass spectrometry. Although uncertainties for accelerator measured isotope ratios as low as 1% have been obtained under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics and generally the reproducibility is unknown since the samples are only measured once
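    A back-of-the-envelope illustration (not taken from the record above) of why counting statistics dominate: for an isotope ratio formed from independent Poisson-distributed counts, standard error propagation gives a relative uncertainty of sqrt(1/N_rare + 1/N_norm). The count values below are hypothetical.

```python
import math

# Counting-statistics estimate of the relative uncertainty of an isotope ratio
# R = N_rare / N_norm, assuming independent Poisson-distributed counts.
def ratio_relative_uncertainty(n_rare, n_norm):
    return math.sqrt(1.0 / n_rare + 1.0 / n_norm)

# With only ~100 rare-isotope counts the statistical error alone is ~10%,
# in the same range as the 5 to 20% quoted for natural samples.
for n_rare in (25, 100, 400, 10000):
    rel = ratio_relative_uncertainty(n_rare, 1e9)
    print(f"N_rare = {n_rare:6d}: statistical uncertainty = {rel * 100:.1f}%")
```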

  16. Adjustments in the Almod 3W2 code models for reproducing the net load trip test in Angra I nuclear power plant

    International Nuclear Information System (INIS)

    Camargo, C.T.M.; Madeira, A.A.; Pontedeiro, A.C.; Dominguez, L.

    1986-09-01

    The recorded traces obtained from the net load trip test at Angra I NPP yielded the opportunity to make fine adjustments to the ALMOD 3W2 code models. The changes are described and the results are compared against real plant data. (Author) [pt

  17. Preliminary clinical nursing leadership competency model: a qualitative study from Thailand.

    Science.gov (United States)

    Supamanee, Treeyaphan; Krairiksh, Marisa; Singhakhumfu, Laddawan; Turale, Sue

    2011-12-01

    This qualitative study explored the clinical nursing leadership competency perspectives of Thai nurses working in a university hospital. To collect data, in-depth interviews were undertaken with 23 nurse administrators, and focus groups were used with 31 registered nurses. Data were analyzed using content analysis, and theory development was guided by the Iceberg model. Nurses' clinical leadership competencies emerged, comprising hidden characteristics and surface characteristics. The hidden characteristics comprised three elements: motive (respect from the nursing and healthcare team and being secure in life), self-concept (representing positive attitudes and values), and traits (personal qualities necessary for leadership). The surface characteristics comprised specific knowledge of nurse leaders about clinical leadership, management and nursing informatics, and clinical skills, such as coordination, effective communication, problem solving, and clinical decision-making. The study findings help nursing to gain greater knowledge of the essence of clinical nursing leadership competencies, a matter critical for theory development in leadership. This study's results later led to the instigation of a training program for registered nurse leaders at the study site, and the formation of a preliminary clinical nursing leadership competency model. © 2011 Blackwell Publishing Asia Pty Ltd.

  18. A Conversational Model for Qualitative Research: A Case Study of Clergy and Religious Knowledge

    Science.gov (United States)

    Roland, Daniel; Wicks, Don A.

    2009-01-01

    This paper describes the qualitative research interview as a conversation designed to gain understanding of the world of research informants. It illustrates the potential of the qualitative research interview when the researcher is able to enter into and maintain a conversation with the research informant as an insider in the latter's community.…

  19. Reproducing American Sign Language Sentences: Cognitive Scaffolding in Working Memory

    Directory of Open Access Journals (Sweden)

    Ted eSupalla

    2014-08-01

    Full Text Available The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analyses of such tasks provide insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and bias across groups. Subjects’ recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies in the absence of linguistic knowledge. A qualitative error analysis allows us to capture generalizations about the relationship between error pattern and the cognitive scaffolding, which governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with equivalent meaning to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are

  20. Motives and preferences of general practitioners for new collaboration models with medical specialists: a qualitative study

    Directory of Open Access Journals (Sweden)

    Klazinga Niek S

    2007-01-01

    Full Text Available Abstract Background Collaboration between general practitioners (GPs and specialists has been the focus of many collaborative care projects during the past decade. Unfortunately, quite a number of these projects failed. This raises the question of what motivates GPs to initiate and continue participating with medical specialists in new collaborative care models. The following two questions are addressed in this study: What motivates GPs to initiate and sustain new models for collaborating with medical specialists? What kind of new collaboration models do GPs suggest? Methods A qualitative study design was used. Starting in 2003 and finishing in 2005, we conducted semi-structured interviews with a purposive sample of 21 Dutch GPs. The sampling criteria were age, gender, type of practice, and practice site. The interviews were recorded, fully transcribed, and analysed by two researchers working independently. The resulting motivational factors and preferences were grouped into categories. Results 'Developing personal relationships' and 'gaining mutual respect' appeared to dominate when the motivational factors were considered. Besides developing personal relationships with specialists, the GPs were also interested in familiarizing specialists with the competencies attached to the profession of family medicine. Additionally, they were eager to increase their medical knowledge to the benefit of their patients. The GPs stated a variety of preferences with respect to the design of new models of collaboration. Conclusion Developing personal relationships with specialists appeared to be one of the dominant motives for increased collaboration. Once the relationships have been formed, an informal network with occasional professional contact seemed sufficient. Although GPs are interested in increasing their knowledge, once they have reached a certain level of expertise, they shift their focus to another specialty. The preferences for new collaboration

  1. The Maudsley Model of Family-Based Treatment for Anorexia Nervosa: A Qualitative Evaluation of Parent-to-Parent Consultation

    Science.gov (United States)

    Rhodes, Paul; Brown, Jac; Madden, Sloane

    2009-01-01

    This article describes the qualitative analysis of a randomized control trial that explores the use of parent-to-parent consultations as an augmentation to the Maudsley model of family-based treatment for anorexia. Twenty families were randomized into two groups, 10 receiving standard treatment and 10 receiving an additional parent-to-parent…

  2. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

    Using BLACKBOARD architecture and a qualitative model, an expert system was developed to assist the user in defining computers for High Energy Physics computing. The COMEX system requires an IBM AT personal computer or compatible with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs.
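    As a generic illustration of the blackboard pattern named in the record (not the COMEX system itself; the knowledge sources, rules, and quantities below are all hypothetical), independent knowledge sources read and extend a shared data structure until a configuration recommendation emerges:

```python
# Generic blackboard-pattern sketch: knowledge sources inspect a shared
# blackboard and contribute partial conclusions until nothing new can be added.
blackboard = {"jobs_per_day": 5000, "event_size_mb": 2.0}

def estimate_storage(bb):
    if "storage_gb" not in bb and {"jobs_per_day", "event_size_mb"} <= bb.keys():
        bb["storage_gb"] = bb["jobs_per_day"] * bb["event_size_mb"] / 1024.0

def recommend_cpu(bb):
    if "cpu_class" not in bb and "jobs_per_day" in bb:
        bb["cpu_class"] = "batch farm" if bb["jobs_per_day"] > 1000 else "workstation"

knowledge_sources = [estimate_storage, recommend_cpu]

changed = True
while changed:                        # control loop: fire sources until quiescent
    before = dict(blackboard)
    for source in knowledge_sources:
        source(blackboard)
    changed = blackboard != before

print(blackboard)
```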

  3. A Mouse Model That Reproduces the Developmental Pathways and Site Specificity of the Cancers Associated With the Human BRCA1 Mutation Carrier State

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2015-10-01

    Full Text Available Predisposition to breast and extrauterine Müllerian carcinomas in BRCA1 mutation carriers is due to a combination of cell-autonomous consequences of BRCA1 inactivation on cell cycle homeostasis superimposed on cell-nonautonomous hormonal factors magnified by the effects of BRCA1 mutations on hormonal changes associated with the menstrual cycle. We used the Müllerian inhibiting substance type 2 receptor (Mis2r) promoter and a truncated form of the Follicle stimulating hormone receptor (Fshr) promoter to introduce conditional knockouts of Brca1 and p53 not only in mouse mammary and Müllerian epithelia, but also in organs that control the estrous cycle. Sixty percent of the double mutant mice developed invasive Müllerian and mammary carcinomas. Mice carrying heterozygous mutations in Brca1 and p53 also developed invasive tumors, albeit at a lesser (30%) rate, in which the wild type alleles were no longer present due to loss of heterozygosity. While mice carrying heterozygous mutations in both genes developed mammary tumors, none of the mice carrying only a heterozygous p53 mutation developed such tumors (P < 0.0001), attesting to a role for Brca1 mutations in tumor development. This mouse model is attractive to investigate cell-nonautonomous mechanisms associated with cancer predisposition in BRCA1 mutation carriers and to investigate the merit of chemo-preventive drugs targeting such mechanisms.

  4. Effect of Initial Conditions on Reproducibility of Scientific Research

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered as explanations for failure to reproduce scientific research findings, from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependence on initial conditions, by which small changes can result in large differences in research findings when reproduction is attempted at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model stability of the results in repeated experiments over time. We illustrate the approach by modeling effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the rate of change in the baseline conditions varied between about 3.5 and 4 from one experiment to the next, no research findings could be reproduced. However, when the rate of change between experiments was ≤2.5, the results became highly predictable from one experiment to the next. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between experiments. Better control of the baseline conditions between experiments may help improve the reproducibility of scientific findings. PMID:25132705
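    The map portion of the mechanism described above can be illustrated on its own; the sketch below (initial values and step count are arbitrary assumptions, and it omits the logistic-regression stage) contrasts a rate of change of 2.5 with one near 4.

```python
import numpy as np

# Illustrative logistic-map experiment (not the authors' exact model): compare
# two trajectories started from slightly different initial "study findings"
# for a stable rate of change (r = 2.5) and a chaotic one (r = 3.9).
def logistic_trajectory(r, x0, n_steps=50):
    x = np.empty(n_steps)
    x[0] = x0
    for i in range(1, n_steps):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

for r in (2.5, 3.9):
    a = logistic_trajectory(r, 0.30)
    b = logistic_trajectory(r, 0.30001)          # tiny change in initial conditions
    print(f"r = {r}: final values {a[-1]:.4f} vs {b[-1]:.4f}, "
          f"max divergence = {np.max(np.abs(a - b)):.4f}")
```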

  5. Experienced Practitioners’ Beliefs Utilized to Create a Successful Massage Therapist Conceptual Model: a Qualitative Investigation

    Science.gov (United States)

    Kennedy, Anne B.; Munk, Niki

    2017-01-01

    Background The massage therapy profession in the United States has grown exponentially, with 35% of the profession’s practitioners in practice for three years or less. Investigating personal and social factors with regard to the massage therapy profession could help to identify constructs needed to be successful in the field. Purpose This data-gathering exercise explores massage therapists’ perceptions of what makes a successful massage therapist, with the aim of providing guidance for future research. Success is defined as supporting oneself and practice solely through massage therapy and related, revenue-generating field activity. Participants and Setting Ten successful massage therapy practitioners from around the United States who have a minimum of five years of experience. Research Design Semistructured qualitative interviews were used in an analytic induction framework; index cards with preidentified concepts printed on them were utilized to enhance conversation. An iterative process of interview coding and analysis was used to determine themes and subthemes. Results Based on the participants’ input, the categories in which therapists needed to be successful were organized into four main themes: effectively establish therapeutic relationships, develop massage therapy business acumen, seek valuable learning environments and opportunities, and cultivate strong social ties and networks. The four themes operate within specific contexts (e.g., regulation and licensing requirements in the therapists’ state), which may also influence the success of the massage therapist. Conclusions The model needs to be tested to explore which constructs explain variability in success and attrition rate. Limitations and future research implications are discussed. PMID:28690704

  6. Designing a Qualitative Model of Doping Phenomenon Effect on Sport Marketing in Iran

    Directory of Open Access Journals (Sweden)

    Jasem Manouchehri

    2016-10-01

    Full Text Available There are a number of factors affecting consumers' purchase behavior. It is believed that celebrities can affect sales positively by transferring their popular image to the endorsed product. However, much is heard these days about banned behaviors in the world of sport. Disclosure of the recent doping affair relating to Lance Armstrong's seven wins in the Tour de France is just one of many spectacular and negative cases. The main aim of the present paper was to explore the effect of the doping phenomenon on sport marketing. In-depth interview data were analyzed in three phases: open coding, axial coding, and selective coding. From 18 interviews, 297 open codes were obtained. After grouping and comparing the axial codes, the codes were divided into five groups: brand image (athlete and endorsed product brand images), moral reasoning (moral coupling, moral decoupling, and moral rationalization), consumer behavioral consequences (word of mouth, purchasing intention, and brand loyalty), attitude change (attitude change toward athlete and brand), and moral emotions (moral evaluation, contempt, anger, disgust, and sympathy). The proposed qualitative model for the effect of the doping phenomenon on sport marketing in Iran illustrated that moral emotions and product brand image are affected by the doped athlete's brand image, resulting in attitude change toward the endorsing athlete and the endorsed brand and in negative consumer behavioral consequences; however, moral reasoning strategies arising from cognitive dissonance may protect consumer behavior from these negative effects.

  7. Business Models, Vaccination Services, and Public Health Relationships of Retail Clinics: A Qualitative Study.

    Science.gov (United States)

    Arthur, Bayo C; Fisher, Allison Kennedy; Shoemaker, Sarah J; Pozniak, Alyssa; Stokley, Shannon

    2015-01-01

    Despite the rapid growth of retail clinics (RCs), literature is limited in terms of how these facilities offer preventive services, particularly vaccination services. The purpose of this study was to obtain an in-depth understanding of the RC business model pertaining to vaccine offerings, profitability, and decision making. From March to June 2009, we conducted 15 interviews with key individuals from three types of organizations: 12 representatives of RC corporations, 2 representatives of retail hosts (i.e., stores in which the RCs are located), and 1 representative of an industry association. We analyzed interview transcripts qualitatively. Our results indicate that consumer demand and profitability were the main drivers in offering vaccinations. RCs in this sample primarily offered vaccinations to adults and adolescents, and they were not well integrated with local public health and immunization registries. Our findings demonstrate the potential for stronger linkages with public health in these settings. The findings also may help inform future research to increase patient access to vaccination services at RCs.

  8. Experienced Practitioners' Beliefs Utilized to Create a Successful Massage Therapist Conceptual Model: a Qualitative Investigation.

    Science.gov (United States)

    Kennedy, Anne B; Munk, Niki

    2017-06-01

    The massage therapy profession in the United States has grown exponentially, with 35% of the profession's practitioners in practice for three years or less. Investigating personal and social factors with regard to the massage therapy profession could help to identify constructs needed to be successful in the field. This data-gathering exercise explores massage therapists' perceptions of what makes a successful massage therapist, with the aim of providing guidance for future research. Success is defined as supporting oneself and practice solely through massage therapy and related, revenue-generating field activity. Ten successful massage therapy practitioners from around the United States who have a minimum of five years of experience. Semistructured qualitative interviews were used in an analytic induction framework; index cards with preidentified concepts printed on them were utilized to enhance conversation. An iterative process of interview coding and analysis was used to determine themes and subthemes. Based on the participants' input, the categories in which therapists needed to be successful were organized into four main themes: effectively establish therapeutic relationships, develop massage therapy business acumen, seek valuable learning environments and opportunities, and cultivate strong social ties and networks. The four themes operate within specific contexts (e.g., regulation and licensing requirements in the therapists' state), which may also influence the success of the massage therapist. The model needs to be tested to explore which constructs explain variability in success and attrition rate. Limitations and future research implications are discussed.

  9. Evaluation of single- and dual-porosity models for reproducing the release of external and internal tracers from heterogeneous waste-rock piles.

    Science.gov (United States)

    Blackmore, S; Pedretti, D; Mayer, K U; Smith, L; Beckie, R D

    2018-05-30

    Accurate predictions of solute release from waste-rock piles (WRPs) are paramount for decision making in mining-related environmental processes. Tracers provide information that can be used to estimate effective transport parameters and understand mechanisms controlling the hydraulic and geochemical behavior of WRPs. It is shown that internal tracers (i.e. initially present) together with external (i.e. applied) tracers provide complementary and quantitative information to identify transport mechanisms. The analysis focuses on two experimental WRPs, Pile 4 and Pile 5 at the Antamina Mine site (Peru), where both an internal chloride tracer and an externally applied bromide tracer were monitored in discharge over three years. The results suggest that external tracers provide insight into transport associated with relatively fast flow regions that are activated during higher-rate recharge events. In contrast, internal tracers provide insight into mechanisms controlling solute release from lower-permeability zones within the piles. Rate-limited diffusive processes, which can be mimicked by nonlocal mass-transfer models, affect both internal and external tracers. The sensitivity of the mass-transfer parameters to heterogeneity is higher for external tracers than for internal tracers, as indicated by the different mean residence times characterizing the flow paths associated with each tracer. The joint use of internal and external tracers provides a more comprehensive understanding of the transport mechanisms in WRPs. In particular, the tracer tests support the notion that a multi-porosity conceptualization of WRPs is more adequate for capturing key mechanisms than a dual-porosity conceptualization. Copyright © 2018 Elsevier B.V. All rights reserved.
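    As a caricature of the nonlocal mass-transfer behavior discussed above (not the calibrated model of Pile 4 or Pile 5; every parameter below is hypothetical), a lumped mobile-immobile (dual-porosity) formulation already reproduces the slow, tailing release of an initially present internal tracer:

```python
import numpy as np

# Lumped mobile-immobile (dual-porosity) sketch with first-order mass transfer.
# c_m: mobile-zone concentration, c_im: immobile-zone concentration.
dt, n_steps = 0.1, 2000          # time step [days], number of steps
q_over_vm = 0.05                 # flushing rate of the mobile zone [1/day]
alpha = 0.01                     # first-order mass-transfer coefficient [1/day]
beta = 3.0                       # immobile-to-mobile volume ratio

c_m, c_im = 1.0, 1.0             # internal tracer initially present in both zones
history = []
for _ in range(n_steps):         # explicit Euler integration
    transfer = alpha * (c_im - c_m)            # exchange with the immobile zone
    dc_m = -q_over_vm * c_m + transfer
    dc_im = -transfer / beta
    c_m += dt * dc_m
    c_im += dt * dc_im
    history.append(c_m)

print(f"mobile concentration after 50 d: {history[499]:.3f}, "
      f"after 200 d: {history[-1]:.3f} (tailing sustained by the immobile zone)")
```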

  10. Qualitative evaluation of adherence therapy in Parkinson’s disease: a multidirectional model

    Directory of Open Access Journals (Sweden)

    Daley DJ

    2015-07-01

    Full Text Available Background: Medication can control the symptoms of Parkinson’s disease (PD). Despite this, non-adherence with medication is prevalent in PD. Treatments for improving adherence with medication have been investigated in many chronic conditions, including PD. However, few researchers have evaluated their interventions qualitatively. We investigated the acceptability and potential mechanism of action of adherence therapy (AT) in PD patients and their spouse/carers who received the intervention as part of a randomized controlled trial. Methods: Sixteen participants (ten patients and six spouses/carers) who had recently completed the trial were purposely selected in order to cover a range of ages and disease severity. Semi-structured interviews were conducted in the participants’ homes. Data were transcribed and analyzed using a thematic approach. A second researcher, naïve to PD and AT, analyzed the data independently to limit bias. Results: The trial showed that AT significantly improved both medication adherence and quality of life in people with PD. Specifically, patients who received AT reported improvements in mobility, activities of daily living, emotional wellbeing, cognition, communication, and body discomfort. General beliefs about medication also significantly improved in those who received AT compared with controls. In the current qualitative evaluation, a

  11. Qualitatively Modeling solute fate and transport across scales in an agricultural catchment with diverse lithology

    Science.gov (United States)

    Wayman, C. R.; Russo, T. A.; Li, L.; Forsythe, B.; Hoagland, B.

    2017-12-01

    As part of the Susquehanna Shale Hills Critical Zone Observatory (SSHCZO) project, we have collected geochemical and hydrological data from several subcatchments and four monitoring sites on the main stem of Shaver's Creek, in Huntingdon County, Pennsylvania. One subcatchment (0.43 km2) is under agricultural land use, and the monitoring locations on the larger Shaver's Creek (up to 163 km2) drain watersheds with 0 to 25% agricultural area. These two scales of investigation, coupled with advances made across the SSHCZO on multiple lithologies, allow us to extrapolate from the subcatchment to the larger watershed. We use geochemical surface and groundwater data to estimate the solute and water transport regimes within the catchment, and to show how lithology and land use are major controls on ground and surface water quality. One area of investigation includes the transport of nutrients between interflow and regional groundwater, and how that connectivity may be reflected in local surface waters. Water and nutrient (nitrogen) isotopes will be used to better understand the relative contributions of local and regional groundwater and interflow fluxes into nearby streams. Following initial qualitative modeling, multiple hydrologic and nutrient transport models (e.g. SWAT and CYCLES/PIHM) will be evaluated from the subcatchment to large watershed scales. We will evaluate the ability to simulate the contributions of regional groundwater versus local groundwater, and also impacts of agricultural land management on surface water quality. Improving estimations of groundwater contributions to stream discharge will provide insight into how much agricultural development can impact stream quality and nutrient loading.
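    For the isotope-based partitioning mentioned above, the standard two-component end-member mixing calculation is sketched below with made-up δ18O values (these are illustrative numbers, not measurements from Shaver's Creek):

```python
# Two-component end-member mixing: fraction of streamflow supplied by regional
# groundwater, inferred from delta-18O of the stream and of the two end members.
delta_stream = -8.2        # per mil, hypothetical stream sample
delta_groundwater = -9.5   # per mil, hypothetical regional-groundwater end member
delta_interflow = -6.8     # per mil, hypothetical interflow / soil-water end member

f_groundwater = (delta_stream - delta_interflow) / (delta_groundwater - delta_interflow)
print(f"groundwater fraction of streamflow ~= {f_groundwater:.2f}")  # ~0.52
```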

  12. Simplified Qualitative Discrete Numerical Model to Determine Cracking Pattern in Brittle Materials by Means of Finite Element Method

    OpenAIRE

    Ochoa-Avendaño, J.; Garzon-Alvarado, D. A.; Linero, Dorian L.; Cerrolaza, M.

    2017-01-01

    This paper presents the formulation, implementation, and validation of a simplified qualitative model to determine the crack path of solids considering static loads, infinitesimal strain, and plane stress condition. This model is based on finite element method with a special meshing technique, where nonlinear link elements are included between the faces of the linear triangular elements. The stiffness loss of some link elements represents the crack opening. Three experimental tests of bending...

  13. An integrated qualitative and quantitative modeling framework for computer‐assisted HAZOP studies

    DEFF Research Database (Denmark)

    Wu, Jing; Zhang, Laibin; Hu, Jinqiu

    2014-01-01

    safety critical operations, its causes and consequences. The outcome is a qualitative hazard analysis of selected process deviations from normal operations and their consequences as input to a traditional HAZOP table. The list of unacceptable high risk deviations identified by the qualitative HAZOP......‐assisted HAZOP studies introduced in this article allows the HAZOP team to devote more attention to high consequence hazards. © 2014 American Institute of Chemical Engineers AIChE J 60: 4150–4173, 2014...
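    Independently of the integrated framework in the record (only partially reproduced above), the combinatorial core of a computer-assisted HAZOP, crossing guide words with process parameters at each study node to enumerate candidate deviations, can be sketched as follows; the nodes, parameters, and guide words are generic textbook examples, not the article's case study.

```python
from itertools import product

# Generic HAZOP deviation enumeration: guide words are crossed with process
# parameters for each study node to propose candidate deviations for screening.
guide_words = ["NO", "MORE", "LESS", "REVERSE", "AS WELL AS", "OTHER THAN"]
nodes = {
    "feed pump discharge": ["flow", "pressure"],
    "separator vessel": ["level", "temperature"],
}

deviations = [
    (node, f"{guide_word} {parameter}")
    for node, parameters in nodes.items()
    for guide_word, parameter in product(guide_words, parameters)
]

for node, deviation in deviations[:6]:
    print(f"{node}: {deviation}")
print(f"... {len(deviations)} candidate deviations to screen")
```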

  14. Reproducible research: a minority opinion

    Science.gov (United States)

    Drummond, Chris

    2018-01-01

    Reproducible research, a growing movement within many scientific fields, including machine learning, would require the code, used to generate the experimental results, be published along with any paper. Probably the most compelling argument for this is that it is simply following good scientific practice, established over the years by the greats of science. The implication is that failure to follow such a practice is unscientific, not a label any machine learning researchers would like to carry. It is further claimed that misconduct is causing a growing crisis of confidence in science. That, without this practice being enforced, science would inevitably fall into disrepute. This viewpoint is becoming ubiquitous but here I offer a differing opinion. I argue that far from being central to science, what is being promulgated is a narrow interpretation of how science works. I contend that the consequences are somewhat overstated. I would also contend that the effort necessary to meet the movement's aims, and the general attitude it engenders would not serve well any of the research disciplines, including our own.

  15. Disease management projects and the Chronic Care Model in action: baseline qualitative research

    Science.gov (United States)

    2012-01-01

    Background Disease management programs, especially those based on the Chronic Care Model (CCM), are increasingly common in the Netherlands. While disease management programs have been well-researched quantitatively and economically, less qualitative research has been done. The overall aim of the study is to explore how disease management programs are implemented within primary care settings in the Netherlands; this paper focuses on the early development and implementation stages of five disease management programs in the primary care setting, based on interviews with project leadership teams. Methods Eleven semi-structured interviews were conducted at the five selected sites with sixteen professionals interviewed; all project directors and managers were interviewed. The interviews focused on each project’s chosen chronic illness (diabetes, eating disorders, COPD, multi-morbidity, CVRM) and project plan, barriers to development and implementation, the project leaders’ action and reactions, as well as their roles and responsibilities, and disease management strategies. Analysis was inductive and interpretive, based on the content of the interviews. After analysis, the results of this research on disease management programs and the Chronic Care Model are viewed from a traveling technology framework. Results This analysis uncovered four themes that can be mapped to disease management and the Chronic Care Model: (1) changing the health care system, (2) patient-centered care, (3) technological systems and barriers, and (4) integrating projects into the larger system. Project leaders discussed the paths, both direct and indirect, for transforming the health care system to one that addresses chronic illness. Patient-centered care was highlighted as needed and a paradigm shift for many. Challenges with technological systems were pervasive. Project leaders managed the expenses of a traveling technology, including the social, financial, and administration involved

  16. Disease management projects and the Chronic Care Model in action: baseline qualitative research.

    Science.gov (United States)

    Walters, Bethany Hipple; Adams, Samantha A; Nieboer, Anna P; Bal, Roland

    2012-05-11

    Disease management programs, especially those based on the Chronic Care Model (CCM), are increasingly common in The Netherlands. While disease management programs have been well-researched quantitatively and economically, less qualitative research has been done. The overall aim of the study is to explore how disease management programs are implemented within primary care settings in The Netherlands; this paper focuses on the early development and implementation stages of five disease management programs in the primary care setting, based on interviews with project leadership teams. Eleven semi-structured interviews were conducted at the five selected sites with sixteen professionals interviewed; all project directors and managers were interviewed. The interviews focused on each project's chosen chronic illness (diabetes, eating disorders, COPD, multi-morbidity, CVRM) and project plan, barriers to development and implementation, the project leaders' action and reactions, as well as their roles and responsibilities, and disease management strategies. Analysis was inductive and interpretive, based on the content of the interviews. After analysis, the results of this research on disease management programs and the Chronic Care Model are viewed from a traveling technology framework. This analysis uncovered four themes that can be mapped to disease management and the Chronic Care Model: (1) changing the health care system, (2) patient-centered care, (3) technological systems and barriers, and (4) integrating projects into the larger system. Project leaders discussed the paths, both direct and indirect, for transforming the health care system to one that addresses chronic illness. Patient-centered care was highlighted as needed and a paradigm shift for many. Challenges with technological systems were pervasive. Project leaders managed the expenses of a traveling technology, including the social, financial, and administration involved. At the sites, project leaders served

  17. Navigating the complexities of qualitative comparative analysis: case numbers, necessity relations, and model ambiguities.

    Science.gov (United States)

    Thiem, Alrik

    2014-12-01

    In recent years, the method of Qualitative Comparative Analysis (QCA) has been enjoying increasing levels of popularity in evaluation and directly neighboring fields. Its holistic approach to causal data analysis resonates with researchers whose theories posit complex conjunctions of conditions and events. However, due to QCA's relative immaturity, some of its technicalities and objectives have not yet been well understood. In this article, I seek to raise awareness of six pitfalls of employing QCA with regard to the following three central aspects: case numbers, necessity relations, and model ambiguities. Most importantly, I argue that case numbers are irrelevant to the methodological choice of QCA or any of its variants, that necessity is not as simple a concept as it has been suggested by many methodologists, and that doubt must be cast on the determinacy of virtually all results presented in past QCA research. By means of empirical examples from published articles, I explain the background of these pitfalls and introduce appropriate procedures, partly with reference to current software, that help avoid them. QCA carries great potential for scholars in evaluation and directly neighboring areas interested in the analysis of complex dependencies in configurational data. If users beware of the pitfalls introduced in this article, and if they avoid mechanistic adherence to doubtful "standards of good practice" at this stage of development, then research with QCA will gain in quality, as a result of which a more solid foundation for cumulative knowledge generation and well-informed policy decisions will also be created. © The Author(s) 2014.

  18. The explanatory models of depression and anxiety in primary care: a qualitative study from India

    Directory of Open Access Journals (Sweden)

    Andrew Gracy

    2012-09-01

    Full Text Available Abstract Background The biggest barrier to treatment of common mental disorders in primary care settings is low recognition among health care providers. This study attempts to explore the explanatory models of common mental disorders (CMD) with the goal of identifying how they could help in improving recognition, leading to effective treatment in primary care. Results The paper describes findings of a cross-sectional qualitative study nested within a large randomized controlled trial (the Manas trial). Semi-structured interviews were conducted with 117 primary health care attendees (30 males and 87 females) suffering from CMD. Main findings of the study are that somatic phenomena were by far the most frequent presenting problems; however, psychological phenomena were relatively easily elicited on probing. Somatic phenomena were located within a biopsychosocial framework, and a substantial proportion of informants used the psychological construct of ‘tension’ or ‘worry’ to label their illness, but did not consider themselves as suffering from a ‘mental disorder’. Very few gender differences were observed in the descriptions of symptoms, but the pattern of adverse life events and social difficulties varied across gender. Conclusion Our study demonstrates how people present their illness through somatic complaints but clearly link their illness to their psychosocial world. However, they do not associate their illness with a ‘mental disorder’, and this is an important phenomenon that needs to be recognized in the management of CMD in primary care settings. Our study also elicits important gender differences in the experience of CMD.

  19. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

    Full Text Available The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking, and the lack of open, transparent and fair benchmark sets present another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  20. Qualitative analysis of cosmological models in Brans-Dicke theory, solutions from non-minimal coupling and viscous universe

    International Nuclear Information System (INIS)

    Romero Filho, C.A.

    1988-01-01

    Using dynamical system theory we investigate homogeneous and isotropic models in Brans-Dicke theory for perfect fluids with a general equation of state and arbitrary ω. Phase diagrams are drawn on the Poincaré sphere, which permits a qualitative analysis of the models. Based on this analysis we construct a method for generating classes of solutions in Brans-Dicke theory. The same technique is used for studying models arising from non-minimal coupling of electromagnetism with gravity. In addition, viscous fluids are considered and non-singular solutions with bulk viscosity are found. (author)

  1. Qualitative Analysis of a Diffusive Ratio-Dependent Holling-Tanner Predator-Prey Model with Smith Growth

    Directory of Open Access Journals (Sweden)

    Zongmin Yue

    2013-01-01

    Full Text Available We investigated the dynamics of a diffusive ratio-dependent Holling-Tanner predator-prey model with Smith growth subject to a zero-flux boundary condition. Some qualitative properties, including dissipation, persistence, and the local and global stability of the positive constant solution, are discussed. Moreover, we give refined a priori estimates of positive solutions and derive some results on the existence and nonexistence of nonconstant positive steady states.
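
    For readers who want to experiment with the kinetics behind such a system, a minimal Python sketch of the non-spatial (ODE) part is given below. The functional forms and parameter values are illustrative assumptions only (one common way of writing ratio-dependent Holling-Tanner kinetics with Smith growth), not the exact system studied in the record above, which additionally includes diffusion and zero-flux boundary conditions.

    ```python
    # Illustrative (assumed) ratio-dependent Holling-Tanner kinetics with Smith growth.
    # The diffusive model in the paper adds Laplacian terms and zero-flux boundaries.
    from scipy.integrate import solve_ivp

    r, K, c = 1.0, 1.0, 0.5   # prey growth (Smith) parameters -- assumed values
    q, a = 0.5, 1.0           # ratio-dependent functional response -- assumed values
    s, h = 0.3, 1.5           # predator growth parameters -- assumed values

    def rhs(t, y):
        u, v = y  # prey and predator densities
        du = r * u * (K - u) / (K + c * u) - q * u * v / (u + a * v)
        dv = s * v * (1.0 - h * v / u)
        return [du, dv]

    sol = solve_ivp(rhs, (0.0, 200.0), [0.4, 0.2], dense_output=True)
    print("final state (u, v):", sol.y[:, -1])
    ```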

  2. Effective Form of Reproducing the Total Financial Potential of Ukraine

    Directory of Open Access Journals (Sweden)

    Portna Oksana V.

    2015-03-01

    Full Text Available Development of scientific principles of reproducing the total financial potential of the country and its effective form is an urgent problem both in theoretical and practical aspects of the study, the solution of which is intended to ensure the active mobilization and effective use of the total financial potential of Ukraine, and as a result — its expanded reproduction as well, which would contribute to realization of the internal capacities for stabilization of the national economy. The purpose of the article is disclosing the essence of the effective form of reproducing the total financial potential of the country, analyzing the results of reproducing the total financial potential of Ukraine. It has been proved that the basis for the effective form of reproducing the total financial potential of the country is the volume and flow of resources, which are associated with the «real» economy, affect the dynamics of GDP and define it, i.e. resource and process forms of reproducing the total financial potential of Ukraine (which precede the effective one). The analysis of reproducing the total financial potential of Ukraine has shown that in the analyzed period there was an increase in the financial possibilities of the country, but steady dynamics of reduction of the total financial potential was observed. If we consider the amount of resources involved in production, creating a net value added and GDP, it occurs on a restricted basis. Growth of the total financial potential of Ukraine is connected only with extensive quantitative factors rather than intensive qualitative changes.

  3. Co-Designing and Co-Teaching Graduate Qualitative Methods: An Innovative Ethnographic Workshop Model

    Science.gov (United States)

    Cordner, Alissa; Klein, Peter T.; Baiocchi, Gianpaolo

    2012-01-01

    This article describes an innovative collaboration between graduate students and a faculty member to co-design and co-teach a graduate-level workshop-style qualitative methods course. The goal of co-designing and co-teaching the course was to involve advanced graduate students in all aspects of designing a syllabus and leading class discussions in…

  4. Probability of identification (POI): a statistical model for the validation of qualitative botanical identification methods

    Science.gov (United States)

    A qualitative botanical identification method (BIM) is an analytical procedure which returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) mate...

  5. A Model to Reproduce the Response of the Gaseous Fission Product Monitor (GFPM) in a CANDU® 6 Reactor (An Estimate of Tramp Uranium Mass in a Candu Core)

    Energy Technology Data Exchange (ETDEWEB)

    Mostofian, Sara; Boss, Charles [AECL Atomic Energy of Canada Limited, 2251 Speakman Drive, Mississauga Ontario L5K 1B2 (Canada)

    2008-07-01

    In a Canada Deuterium Uranium (Candu) reactor, the fuel bundles produce gaseous and volatile fission products that are contained within the fuel matrix and the welded zircaloy sheath. Sometimes a fuel sheath can develop a defect and release the fission products into the circulating coolant. To detect fuel defects, a Gaseous Fission Product Monitoring (GFPM) system is provided in Candu reactors. The (GFPM) is a gamma ray spectrometer that measures fission products in the coolant and alerts the operator to the presence of defected fuel through an increase in measured fission product concentration. A background fission product concentration in the coolant also arises from tramp uranium. The sources of the tramp uranium are small quantities of uranium contamination on the surfaces of fuel bundles and traces of uranium on the pressure tubes, arising from the rare defected fuel element that released uranium into the core. This paper presents a dynamic model that reproduces the behaviour of a GFPM in a Candu 6 plant. The model predicts the fission product concentrations in the coolant from the chronic concentration of tramp uranium on the inner surface of the pressure tubes (PT) and the surface of the fuel bundles (FB) taking into account the on-power refuelling system. (authors)
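
    The record does not give the model equations. Purely as an illustration of the kind of source-loss balance such a monitor model rests on, the sketch below tracks one fission-product species in the coolant, produced at a constant rate by tramp uranium and removed by decay and coolant purification; every symbol and value is an assumption, not part of the AECL model.

    ```python
    # Assumed toy balance for a single fission-product species in the coolant:
    #   dN/dt = S_tramp - (lambda_decay + beta_purification) * N
    import math

    S_tramp = 1.0e9                            # assumed production rate, atoms/s
    lam = math.log(2) / (8.02 * 24 * 3600)     # decay constant for an I-131-like nuclide
    beta = 1.0e-4                              # assumed purification removal constant, 1/s

    def coolant_inventory(t_seconds, n0=0.0):
        """Analytic solution of the linear balance, starting from n0 atoms."""
        k = lam + beta
        return S_tramp / k * (1.0 - math.exp(-k * t_seconds)) + n0 * math.exp(-k * t_seconds)

    print(f"steady-state inventory: {S_tramp / (lam + beta):.3e} atoms")
    print(f"activity after 1 day  : {lam * coolant_inventory(86400.0):.3e} Bq")
    ```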

  6. Qualitative and quantitative combined nonlinear dynamics model and its application in analysis of price, supply–demand ratio and selling rate

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    The qualitative and quantitative combined nonlinear dynamics model proposed in this paper fills a gap in nonlinear dynamics modeling in terms of qualitative and quantitative combined methods, allowing the qualitative model and quantitative model to perfectly combine and overcome their weaknesses by learning from each other. These two types of models use their strengths to make up for the other’s deficiencies. The qualitative and quantitative combined models can surmount the weakness that the qualitative model cannot be applied and verified in a quantitative manner, as well as the high costs and long time of multiple construction and verification of the quantitative model. The combined model is more practical and efficient, which is of great significance for nonlinear dynamics. The qualitative and quantitative combined modeling and model analytical method raised in this paper is not only applicable to nonlinear dynamics, but can be adopted and drawn on in the modeling and model analysis of other fields. Additionally, the analytical method of the qualitative and quantitative combined nonlinear dynamics model proposed in this paper can satisfactorily resolve the problems with the price system’s existing nonlinear dynamics model analytical method. The three-dimensional dynamics model of price, supply–demand ratio and selling rate established in this paper makes estimates of the best commodity prices using the model results, thereby providing a theoretical basis for the government’s macro-control of prices. Meanwhile, this model also offers theoretical guidance on how to enhance people’s purchasing power and consumption levels through price regulation and hence improve people’s living standards.

  7. Determining the optimal number of independent components for reproducible transcriptomic data analysis.

    Science.gov (United States)

    Kairov, Ulykbek; Cantini, Laura; Greco, Alessandro; Molkenov, Askhat; Czerwinska, Urszula; Barillot, Emmanuel; Zinovyev, Andrei

    2017-09-11

    Independent Component Analysis (ICA) is a method that models gene expression data as an action of a set of statistically independent hidden factors. The output of ICA depends on a fundamental parameter: the number of components (factors) to compute. The optimal choice of this parameter, related to determining the effective data dimension, remains an open question in the application of blind source separation techniques to transcriptomic data. Here we address the question of optimizing the number of statistically independent components in the analysis of transcriptomic data for reproducibility of the components in multiple runs of ICA (within the same or within varying effective dimensions) and in multiple independent datasets. To this end, we introduce ranking of independent components based on their stability in multiple ICA computation runs and define a distinguished number of components (Most Stable Transcriptome Dimension, MSTD) corresponding to the point of the qualitative change of the stability profile. Based on a large body of data, we demonstrate that a sufficient number of dimensions is required for biological interpretability of the ICA decomposition and that the most stable components with ranks below MSTD have more chances to be reproduced in independent studies compared to the less stable ones. At the same time, we show that a transcriptomics dataset can be reduced to a relatively high number of dimensions without losing the interpretability of ICA, even though higher dimensions give rise to components driven by small gene sets. We suggest a protocol of ICA application to transcriptomics data with a possibility of prioritizing components with respect to their reproducibility that strengthens the biological interpretation. Computing too few components (much less than MSTD) is not optimal for interpretability of the results. The components ranked within MSTD range have more chances to be reproduced in independent studies.
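
    A rough Python sketch of the stability-ranking idea (repeated ICA runs with different seeds, components matched across runs by absolute correlation) is shown below. It is a simplified stand-in for the authors' procedure, using scikit-learn's FastICA on placeholder data; the matching rule and all sizes are assumptions.

    ```python
    # Simplified illustration of ranking ICA components by run-to-run stability.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 1000))   # placeholder "samples x genes" matrix

    def ica_components(X, k, seed):
        ica = FastICA(n_components=k, random_state=seed, max_iter=1000)
        ica.fit(X)
        return ica.components_          # (k, n_genes): components in gene space

    def stability_scores(X, k, n_runs=10):
        ref = ica_components(X, k, seed=0)
        scores = np.zeros(k)
        for seed in range(1, n_runs):
            comp = ica_components(X, k, seed=seed)
            corr = np.abs(np.corrcoef(ref, comp)[:k, k:])  # |r| between reference and run components
            scores += corr.max(axis=1)                     # best match for each reference component
        return scores / (n_runs - 1)

    scores = stability_scores(X, k=10)
    print("components ranked by stability:", np.argsort(scores)[::-1])
    ```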

  8. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...

  9. REPRODUCING THE OBSERVED ABUNDANCES IN RCB AND HdC STARS WITH POST-DOUBLE-DEGENERATE MERGER MODELS-CONSTRAINTS ON MERGER AND POST-MERGER SIMULATIONS AND PHYSICS PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    Menon, Athira; Herwig, Falk; Denissenkov, Pavel A. [Department of Physics and Astronomy, University of Victoria, Victoria, BC V8P5C2 (Canada); Clayton, Geoffrey C.; Staff, Jan [Department of Physics and Astronomy, Louisiana State University, 202 Nicholson Hall, Tower Dr., Baton Rouge, LA 70803-4001 (United States); Pignatari, Marco [Department of Physics, University of Basel, Klingelbergstrasse 82, CH-4056 Basel (Switzerland); Paxton, Bill [Kavli Institute for Theoretical Physics and Department of Physics, Kohn Hall, University of California, Santa Barbara, CA 93106 (United States)

    2013-07-20

    The R Coronae Borealis (RCB) stars are hydrogen-deficient, variable stars that are most likely the result of He-CO WD mergers. They display extremely low oxygen isotopic ratios, ¹⁶O/¹⁸O ≈ 1-10, ¹²C/¹³C ≥ 100, and enhancements up to 2.6 dex in F and in s-process elements from Zn to La, compared to solar. These abundances provide stringent constraints on the physical processes during and after the double-degenerate merger. As shown previously, O-isotopic ratios observed in RCB stars cannot result from the dynamic double-degenerate merger phase, and we now investigate the role of the long-term one-dimensional spherical post-merger evolution and nucleosynthesis based on realistic hydrodynamic merger progenitor models. We adopt a model for extra envelope mixing to represent processes driven by rotation originating in the dynamical merger. Comprehensive nucleosynthesis post-processing simulations for these stellar evolution models reproduce, for the first time, the full range of the observed abundances for almost all the elements measured in RCB stars: ¹⁶O/¹⁸O ratios between 9 and 15, C-isotopic ratios above 100, and ≈1.4-2.35 dex F enhancements, along with enrichments in s-process elements. The nucleosynthesis processes in our models constrain the length and temperature in the dynamic merger shell-of-fire feature as well as the envelope mixing in the post-merger phase. s-process elements originate either in the shell-of-fire merger feature or during the post-merger evolution, but the contribution from the asymptotic giant branch progenitors is negligible. The post-merger envelope mixing must eventually cease ≈10⁶ yr after the dynamic merger phase before the star enters the RCB phase.

  10. A Qualitative Application of the Belsky Model to Explore Early Care and Education Teachers' Mealtime History, Beliefs, and Interactions.

    Science.gov (United States)

    Swindle, Taren M; Patterson, Zachary; Boden, Carrie J

    Studies on factors associated with nutrition practices in early care and education settings often focus on sociodemographic and programmatic characteristics. This qualitative study adapted and applied Belsky's determinants of parenting model to inform a broader exploration of Early Care and Education Teachers (ECETs) practices. Qualitative cross-sectional study with ECETs. The researchers interviewed ECETs in their communities across a Southern state. Purposive sampling was employed to recruit ECETs (n = 28) from Head Start or state-funded centers serving low-income families. Developmental histories of ECETs regarding food and nutrition, beliefs about child nutrition, and teaching interactions related to food. Qualitative interviews were coded using a deductive content analysis approach. Three distinct interrelationships were observed across the themes. First, rules and routines regarding food and mealtime in the educators' childhood often aligned with educator beliefs and behaviors at meals in their classroom. Second, some ECETs described motivations to leave a healthy food legacy for children in their class. Finally, an experience of food insecurity appeared in narratives that also emphasized making sure children got enough through various strategies. The influence of ECET developmental histories and their related beliefs can be addressed through professional development and ongoing support. Future study should quantify model constructs in a larger sample and study their relationships over time. Copyright © 2017 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  11. Using a nursing theory or a model in nursing PhD dissertations: a qualitative study from Turkey.

    Science.gov (United States)

    Mete, Samiye; Gokçe İsbir, Gozde

    2015-04-01

    The aim of this study was to reveal the experiences of nursing students and their advisors in using theories and models in their PhD dissertations. The study adopted a descriptive qualitative approach and was performed with 10 PhD candidates and their five advisors from a nursing faculty. The results were categorized into four themes: reasons for using a theory/model in a PhD dissertation, reasons for preferring a given model, causes of difficulties in using models in PhD dissertations, and factors facilitating the use of theories and models in PhD dissertations. Using a theory or model was also reported to contribute to the research methodology and to the professional development of the students and advisors. © 2014 NANDA International, Inc.

  12. Systemic thioridazine in combination with dicloxacillin against early aortic graft infections caused by Staphylococcus aureus in a porcine model: In vivo results do not reproduce the in vitro synergistic activity.

    Directory of Open Access Journals (Sweden)

    Michael Stenger

    Full Text Available Conservative treatment solutions against aortic prosthetic vascular graft infection (APVGI) for inoperable patients are limited. The combination of antibiotics with antibacterial helper compounds, such as the neuroleptic drug thioridazine (TDZ), should be explored. To investigate the efficacy of conservative systemic treatment with dicloxacillin (DCX) in combination with TDZ (DCX+TDZ), compared to DCX alone, against early APVGI caused by methicillin-sensitive Staphylococcus aureus (MSSA) in a porcine model. The synergism of DCX+TDZ against MSSA was initially assessed in vitro by viability assay. Thereafter, thirty-two pigs had polyester grafts implanted in the infrarenal aorta, followed by inoculation with 10⁶ CFU of MSSA, and were randomly administered oral systemic treatment with either (1) DCX or (2) DCX+TDZ. Treatment was initiated one week postoperatively and continued for a further 21 days. Weight, temperature, and blood samples were collected at predefined intervals. By termination, bacterial quantities from the graft surface, graft material, and perigraft tissue were obtained. Despite in vitro synergism, the porcine experiment revealed no statistical differences for bacteriological endpoints between the two treatment groups, and none of the treatments eradicated the APVGI. Accordingly, the mixed model analyses of weight, temperature, and blood samples revealed no statistical differences. Conservative systemic treatment with DCX+TDZ did not reproduce in vitro results against APVGI caused by MSSA in this porcine model. However, unexpected severe adverse effects related to the planned dose of TDZ required a considerable reduction to the administered dose of TDZ, which may have compromised the results.

  13. Qualitative cosmology

    International Nuclear Information System (INIS)

    Khalatnikov, I.M.; Belinskij, V.A.

    1984-01-01

    The application of the qualitative theory of dynamical systems to the analysis of homogeneous cosmological models is described. Together with the well-known cases involving an ideal fluid, the properties of the cosmological evolution of matter with dissipative processes due to viscosity are considered. New cosmological effects occur when the viscosity terms are of the same order as the other terms in the gravitational equations, or even exceed them. In these cases, describing the dissipative process by means of only two viscosity coefficients (bulk and shear) may become inapplicable, because all the remaining terms in the expansion of the dissipative contribution to the energy-momentum tensor in velocity gradients can be large; the application of equations with hydrodynamic viscosity should then be regarded as a model of dissipative effects in cosmology.

  14. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  15. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  16. The persistence of subsistence: qualitative social-ecological modeling of indigenous aquatic hunting and gathering in tropical Australia

    Directory of Open Access Journals (Sweden)

    Marcus Barber

    2015-03-01

    Full Text Available Subsistence remains critical to indigenous people in settler-colonial states such as Australia, providing key foundations for indigenous identities and for wider state recognition. However, the drivers of contemporary subsistence are rarely fully articulated and analyzed in terms of likely changing conditions. Our interdisciplinary team combined past research experience gained from multiple sites with published literature to create two generalized qualitative models of the socio-cultural and environmental influences on indigenous aquatic subsistence in northern Australia. One model focused on the longer term (inter-year to generational) persistence of subsistence at the community scale, the other model on shorter term (day to season) drivers of effort by active individuals. The specification of driver definitions and relationships demonstrates the complexities of even generalized and materialist models of contemporary subsistence practices. The qualitative models were analyzed for emergent properties and for responses to plausible changes in key variables: access, habitat degradation, social security availability, and community dysfunction. Positive human community condition is shown to be critical to the long-term persistence of subsistence, but complex interactions of negative and positive drivers shape subsistence effort expended at the individual scale and within shorter time frames. Such models enable motivations, complexities, and the potential management and policy levers of significance to be identified, defined, causally related, and debated. The models can be used to augment future models of human-natural systems, be tested against case-specific field conditions and/or indigenous perspectives, and aid preliminary assessments of the effects on subsistence of changes in social and environmental conditions, including policy settings.
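
    As a generic illustration of how signed qualitative models of this kind can be interrogated (not the authors' actual model), the sketch below encodes a small hypothetical signed community matrix and uses the standard loop-analysis result that press-perturbation responses are proportional to the negative inverse of that matrix. The variable names and signs are invented for illustration.

    ```python
    # Hypothetical 4-variable signed qualitative model (loop-analysis style).
    import numpy as np

    variables = ["community_condition", "access", "subsistence_effort", "resource_state"]
    A = np.array([
        [-1,  0,  1,  0],   # community condition: self-limited, supported by effort
        [ 0, -1,  0,  0],   # access: self-limited
        [ 1,  1, -1,  1],   # effort: boosted by condition, access and resources
        [ 0,  0, -1, -1],   # resource state: depleted by effort, self-limited
    ], dtype=float)

    responses = -np.linalg.inv(A)   # column j = response of every variable to a
                                    # sustained positive press on variable j
    for j, name in enumerate(variables):
        print(f"press on {name:22s} -> response signs {np.sign(responses[:, j]).astype(int)}")
    ```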

  17. Qualitative Content Analysis

    OpenAIRE

    Philipp Mayring

    2000-01-01

    The article describes an approach of systematic, rule guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...

  18. Qualitative research.

    Science.gov (United States)

    Gelling, Leslie

    2015-03-25

    Qualitative research has an important role in helping nurses and other healthcare professionals understand patient experiences of health and illness. Qualitative researchers have a large number of methodological options and therefore should take care in planning and conducting their research. This article offers a brief overview of some of the key issues qualitative researchers should consider.

  19. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    Science.gov (United States)

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  20. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  1. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.
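
    The headline figure can be reconstructed with simple arithmetic. Assuming roughly US$56B of annual US preclinical research spending (a ballpark input; the paper's own estimates should be consulted for the exact numbers) and the >50% irreproducibility prevalence quoted above:

    ```python
    # Back-of-the-envelope reconstruction of the US$28B/year figure.
    annual_us_preclinical_spend = 56.0e9   # assumed ~US$56B per year, not taken from the paper's tables
    irreproducible_fraction = 0.50         # ">50%" prevalence quoted in the abstract

    wasted = annual_us_preclinical_spend * irreproducible_fraction
    print(f"Estimated irreproducible spend: ${wasted / 1e9:.0f}B per year")  # ~US$28B
    ```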

  2. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  3. Extending a model of precarious employment: A qualitative study of immigrant workers in Spain.

    Science.gov (United States)

    Porthé, Victoria; Ahonen, Emily; Vázquez, M Luisa; Pope, Catherine; Agudelo, Andrés Alonso; García, Ana M; Amable, Marcelo; Benavides, Fernando G; Benach, Joan

    2010-04-01

    Since the 1980s, changes in the labor market have modified power relations between capital and labor, leading to greater levels of precarious employment among workers. Globalization has led to a growth in migration, as people leave their countries in search of work. We aimed to describe the dimensions of precarious employment for immigrant workers in Spain. Qualitative study using analytic induction. Criterion sampling was used to recruit 129 immigrant workers in Spain with documented and undocumented administrative status. Data quality was ensured by triangulation. Immigrant workers reported that precarious employment is characterized by high job instability, a lack of power for negotiating employment conditions, and defenselessness against high labor demands. They described insufficient wages, long working hours, limited social benefits, and difficulty in exercising their rights. Undocumented workers reported greater defenselessness and worse employment conditions. This study allowed us to describe the dimensions of precarious employment in immigrant workers. (c) 2010 Wiley-Liss, Inc.

  4. Methods for partial differential equations qualitative properties of solutions, phase space analysis, semilinear models

    CERN Document Server

    Ebert, Marcelo R

    2018-01-01

    This book provides an overview of different topics related to the theory of partial differential equations. Selected exercises are included at the end of each chapter to prepare readers for the “research project for beginners” proposed at the end of the book. It is a valuable resource for advanced graduate and undergraduate students who are interested in specializing in this area. The book is organized in five parts: In Part 1 the authors review the basics and the mathematical prerequisites, presenting two of the most fundamental results in the theory of partial differential equations: the Cauchy-Kovalevskaja theorem and Holmgren's uniqueness theorem in its classical and abstract form. It also introduces the method of characteristics in detail and applies this method to the study of Burgers' equation. Part 2 focuses on qualitative properties of solutions to basic partial differential equations, explaining the usual properties of solutions to elliptic, parabolic and hyperbolic equations for the archetypes...

  5. Governance arrangements for IT project portfolio management qualitative insights and a quantitative modeling approach

    CERN Document Server

    Frey, Thorsten

    2014-01-01

    Due to the growing importance of IT-based innovations, contemporary firms face an excessive number of proposals for IT projects. As typically only a fraction of these projects can be implemented with the given capacity, IT project portfolio management as a relatively new discipline has received growing attention in research and practice in recent years. Thorsten Frey demonstrates how companies are struggling to find the right balance between local autonomy and central overview about all projects in the organization. In this context, impacts of different contextual factors on the design of governance arrangements for IT project portfolio management are demonstrated. Moreover, consequences of the use of different organizational designs are analyzed. The author presents insights from a qualitative empirical study as well as a simulative approach.

  6. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    Science.gov (United States)

    2017-08-09

    project.org/) and SPSS (IBM Corp., Armonk, NY) for data analysis. Mean and confidence intervals for each measure are found in Tables 1–7. To assess...visits, and was calculated using a two-way mixed model in SPSS. MCV and MRD values closer to 0 are considered to be the most reproducible, and ICC
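
    Although the abstract above is truncated, the reproducibility indices it names are standard. As an illustration only, the sketch below computes a two-way mixed, single-measure ICC (ICC(3,1)) and a within-subject coefficient of variation from a small synthetic matrix of repeated measurements; it is not the study's SPSS analysis.

    ```python
    # Illustrative reproducibility indices for repeated measurements (subjects x visits).
    import numpy as np

    rng = np.random.default_rng(1)
    truth = rng.normal(100.0, 10.0, size=(20, 1))          # 20 subjects
    data = truth + rng.normal(0.0, 2.0, size=(20, 3))      # 3 visits with measurement noise

    n, k = data.shape
    grand = data.mean()
    ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()  # between-subject sum of squares
    ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()  # between-visit sum of squares
    ss_total = ((data - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))

    icc_3_1 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)
    wcv = np.mean(data.std(axis=1, ddof=1) / data.mean(axis=1))
    print(f"ICC(3,1) = {icc_3_1:.3f}, within-subject CV = {wcv:.2%}")
    ```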

  7. "Personified as Paragon of Suffering...... Optimistic Being of Achieving Normalcy:" A Conceptual Model Derived from Qualitative Research

    Science.gov (United States)

    Nayak, Shalini G; Pai, Mamatha Shivananda; George, Linu Sara

    2018-01-01

    Background: Conceptual models developed through qualitative research are based on the unique experiences of suffering and individuals’ adoptions of each participant. A wide array of problems are faced by head-and-neck cancer (HNC) patients due to disease pathology and treatment modalities, which are sufficient to influence the quality of life (QOL). Men possess greater self-acceptance and are better equipped with intrapersonal strength to cope with stress and adequacy compared to women. Methodology: A qualitative phenomenology study was conducted among seven women suffering from HNC, with the objective to understand their experiences of suffering and to describe the phenomenon. Data were collected by face-to-face, in-depth, open-ended interviews. Data were analyzed using Open Code software (OPC 4.0) by following the steps of the Colaizzi process. Results: The phenomenon that emerged out of the lived experiences of HNC women was "Personified as paragon of suffering...... optimistic being of achieving normalcy," with five major themes and 13 subthemes. Conclusion: The conceptual model developed with the phenomenological approach is very specific to the women suffering from HNC, which will contribute to developing strategies to improve the QOL of women. PMID:29440812

  8. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  9. Relevant principal factors affecting the reproducibility of insect primary culture.

    Science.gov (United States)

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.

  10. A reproducible accelerated in vitro release testing method for PLGA microspheres.

    Science.gov (United States)

    Shen, Jie; Lee, Kyulim; Choi, Stephanie; Qu, Wen; Wang, Yan; Burgess, Diane J

    2016-02-10

    The objective of the present study was to develop a discriminatory and reproducible accelerated in vitro release method for long-acting PLGA microspheres with inner structure/porosity differences. Risperidone was chosen as a model drug. Qualitatively and quantitatively equivalent PLGA microspheres with different inner structure/porosity were obtained using different manufacturing processes. Physicochemical properties as well as degradation profiles of the prepared microspheres were investigated. Furthermore, in vitro release testing of the prepared risperidone microspheres was performed using the most common in vitro release methods (i.e., sample-and-separate and flow through) for this type of product. The obtained compositionally equivalent risperidone microspheres had similar drug loading but different inner structure/porosity. When microsphere particle size appeared similar, porous risperidone microspheres showed faster microsphere degradation and drug release compared with less porous microspheres. Both in vitro release methods investigated were able to differentiate risperidone microsphere formulations with differences in porosity under real-time (37 °C) and accelerated (45 °C) testing conditions. Notably, only the accelerated USP apparatus 4 method showed good reproducibility for highly porous risperidone microspheres. These results indicated that the accelerated USP apparatus 4 method is an appropriate fast quality control tool for long-acting PLGA microspheres (even with porous structures). Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Perspectives on econometric modelling to inform policy: a UK qualitative case study of minimum unit pricing of alcohol.

    Science.gov (United States)

    Katikireddi, Srinivasa V; Bond, Lyndal; Hilton, Shona

    2014-06-01

    Novel policy interventions may lack evaluation-based evidence. Considerations to introduce minimum unit pricing (MUP) of alcohol in the UK were informed by econometric modelling (the 'Sheffield model'). We aim to investigate policy stakeholders' views of the utility of modelling studies for public health policy. In-depth qualitative interviews with 36 individuals involved in MUP policy debates (purposively sampled to include civil servants, politicians, academics, advocates and industry-related actors) were conducted and thematically analysed. Interviewees felt familiar with modelling studies and often displayed detailed understandings of the Sheffield model. Despite this, many were uneasy about the extent to which the Sheffield model could be relied on for informing policymaking and preferred traditional evaluations. A tension was identified between this preference for post hoc evaluations and a desire for evidence derived from local data, with modelling seen to offer high external validity. MUP critics expressed concern that the Sheffield model did not adequately capture the 'real life' world of the alcohol market, which was conceptualized as a complex and, to some extent, inherently unpredictable system. Communication of modelling results was considered intrinsically difficult but presenting an appropriate picture of the uncertainties inherent in modelling was viewed as desirable. There was general enthusiasm for increased use of econometric modelling to inform future policymaking but an appreciation that such evidence should only form one input into the process. Modelling studies are valued by policymakers as they provide contextually relevant evidence for novel policies, but tensions exist with views of traditional evaluation-based evidence. © The Author 2013. Published by Oxford University Press on behalf of the European Public Health Association.

  12. Numerical and Qualitative Contrasts of Two Statistical Models for Water Quality Change in Tidal Waters

    Science.gov (United States)

    Two statistical approaches, weighted regression on time, discharge, and season and generalized additive models, have recently been used to evaluate water quality trends in estuaries. Both models have been used in similar contexts despite differences in statistical foundations and...

  13. Qualitative validation of humanoid robot models through balance recovery side-stepping experiments

    NARCIS (Netherlands)

    Assman, T.M.; Zutven, van P.W.M.; Nijmeijer, H.

    2013-01-01

    Different models are used in the literature to approximate the complex dynamics of a humanoid robot. Many models use strongly varying model assumptions that neglect the influence of the feet, discontinuous ground impact, internal dynamics, and coupling between the 3D coronal and sagittal plane dynamics.

  14. Reproducing Epidemiologic Research and Ensuring Transparency.

    Science.gov (United States)

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Women's maternity care needs and related service models in rural areas: A comprehensive systematic review of qualitative evidence.

    Science.gov (United States)

    Hoang, Ha; Le, Quynh; Ogden, Kathryn

    2014-12-01

    Understanding the needs of rural women in maternity care and service models available to them is significant for the development of effective policies and the sustainability of rural communities. Nevertheless, no systematic review of studies addressing these needs has been conducted. To synthesise the best available evidence on the experiences of women's needs in maternity care and existing service models in rural areas. Literature search of ten electronic databases, digital theses, and reference lists of relevant studies applying inclusion/exclusion criteria was conducted. Selected papers were assessed using standardised critical appraisal instruments from JBI-QARI. Data extracted from these studies were synthesised using thematic synthesis. 12 studies met the inclusion criteria. There were three main themes and several sub-themes identified. A comprehensive set of the maternity care expectations of rural women was reported in this review including safety (7), continuity of care (6) and quality of care (6), and informed choices needs (4). In addition, challenges in accessing maternity services also emerged from the literature such as access (6), risk of travelling (9) and associated cost of travel (9). Four models of maternity care examined in the literature were medically led care (5), GP-led care (4), midwifery-led care (7) and home birth (6). The systematic review demonstrates the importance of including well-conducted qualitative studies in informing the development of evidence-based policies to address women's maternity care needs and inform service models. Synthesising the findings from qualitative studies offers important insight for informing effective public health policy. Copyright © 2014 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  16. Angiographic core laboratory reproducibility analyses: implications for planning clinical trials using coronary angiography and left ventriculography end-points.

    Science.gov (United States)

    Steigen, Terje K; Claudio, Cheryl; Abbott, David; Schulzer, Michael; Burton, Jeff; Tymchak, Wayne; Buller, Christopher E; John Mancini, G B

    2008-06-01

    To assess reproducibility of core laboratory performance and impact on sample size calculations. Little information exists about overall reproducibility of core laboratories in contradistinction to performance of individual technicians. Also, qualitative parameters are being adjudicated increasingly as either primary or secondary end-points. The comparative impact of using diverse indexes on sample sizes has not been previously reported. We compared initial and repeat assessments of five quantitative parameters [e.g., minimum lumen diameter (MLD), ejection fraction (EF), etc.] and six qualitative parameters [e.g., TIMI myocardial perfusion grade (TMPG) or thrombus grade (TTG), etc.], as performed by differing technicians and separated by a year or more. Sample sizes were calculated from these results. TMPG and TTG were also adjudicated by a second core laboratory. MLD and EF were the most reproducible, yielding the smallest sample size calculations, whereas percent diameter stenosis and centerline wall motion require substantially larger trials. Of the qualitative parameters, all except TIMI flow grade gave reproducibility characteristics yielding sample sizes of many 100's of patients. Reproducibility of TMPG and TTG was only moderately good both within and between core laboratories, underscoring an intrinsic difficulty in assessing these. Core laboratories can be shown to provide reproducibility performance that is comparable to performance commonly ascribed to individual technicians. The differences in reproducibility yield huge differences in sample size when comparing quantitative and qualitative parameters. TMPG and TTG are intrinsically difficult to assess and conclusions based on these parameters should arise only from very large trials.
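
    As a worked illustration of why reproducibility drives trial size (with purely hypothetical numbers, not the core laboratory's data), the standard two-sample formula n = 2·σ²·(z₁₋α/₂ + z₁₋β)²/Δ² shows how the required patients per arm scale with the measurement standard deviation:

    ```python
    # n per arm for a two-sided, two-sample comparison of means.
    # The sigma values are hypothetical, chosen only to contrast a reproducible
    # endpoint (e.g. MLD) with a noisier one (e.g. centerline wall motion).
    from math import ceil
    from statistics import NormalDist

    def n_per_arm(sigma, delta, alpha=0.05, power=0.80):
        z = NormalDist().inv_cdf
        return ceil(2 * sigma ** 2 * (z(1 - alpha / 2) + z(power)) ** 2 / delta ** 2)

    print("precise endpoint:", n_per_arm(sigma=0.40, delta=0.20), "patients per arm")
    print("noisy endpoint  :", n_per_arm(sigma=1.20, delta=0.20), "patients per arm")
    ```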

  17. Qualitative analysis of nonlinear incidence rate upon the behaviour of an epidemiological model

    International Nuclear Information System (INIS)

    Li Xiaogui.

    1988-12-01

    Two theorems concerning the solutions of the system of differential equations describing an epidemiological model with nonlinear incidence rate per infective individual are demonstrated. 2 refs, 1 fig
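
    The record does not reproduce the equations, but the sketch below shows one commonly used system of this type: an SIRS model whose incidence rate per infective individual saturates nonlinearly in I. The functional form and every parameter value here are assumptions for illustration only.

    ```python
    # Illustrative SIRS model with nonlinear incidence beta*S*I**p / (1 + alpha*I**q).
    from scipy.integrate import solve_ivp

    beta, alpha, p, q = 0.5, 10.0, 1.0, 2.0
    gamma, mu, delta = 0.1, 0.01, 0.02      # recovery, demography, loss of immunity

    def rhs(t, y):
        S, I, R = y
        incidence = beta * S * I ** p / (1.0 + alpha * I ** q)
        dS = mu * (1.0 - S) - incidence + delta * R
        dI = incidence - (gamma + mu) * I
        dR = gamma * I - (mu + delta) * R
        return [dS, dI, dR]

    sol = solve_ivp(rhs, (0.0, 400.0), [0.99, 0.01, 0.0], max_step=1.0)
    print("endemic infective fraction:", sol.y[1, -1])
    ```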

  18. Model endophenotype for bipolar disorder: Qualitative Analysis, etiological factors, and research areas

    Directory of Open Access Journals (Sweden)

    Naraiana de Oliveira Tavares

    2014-12-01

    Full Text Available The aim of this study is to present an updated view of the writings on the endophenotype model for bipolar disorder using analytical methodologies. A review and analysis of networks was performed through descriptors and keywords that characterize the composition of the endophenotype model as a model of health. Information was collected from between 1992 and 2014, and the main thematic areas covered in the articles were identified. We discuss the results and question their cohesion, emphasizing the need to strengthen and identify the points of connection between etiological factors and characteristics that make up the model of endophenotypes for bipolar disorder.

  19. Qualitative and Computational Analysis of a Mathematical Model for Tumor-Immune Interactions

    Directory of Open Access Journals (Sweden)

    F. A. Rihan

    2012-01-01

    Full Text Available We provide a family of ordinary and delay differential equations to model the dynamics of tumor-growth and immunotherapy interactions. We explore the effects of adoptive cellular immunotherapy on the model and describe under what circumstances the tumor can be eliminated. The possibility of clearing the tumor, with a strategy, is based on two parameters in the model: the rate of influx of the effector cells and the rate of influx of IL-2. The critical tumor-growth rate, below which endemic tumor does not exist, has been found. One can use the model to make predictions about tumor dormancy.
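
    A runnable sketch in the same spirit is given below. It follows the well-known Kirschner-Panetta structure for effector cells, tumour cells and IL-2, with the two treatment terms the abstract highlights (influx of effector cells, s1, and influx of IL-2, s2). The exact equations and parameter values of the paper are not given in the record, so everything below is an assumption, and the delay terms are omitted.

    ```python
    # Kirschner-Panetta-type tumor-immune-IL-2 kinetics (illustrative values only).
    # E: effector cells, T: tumor cells, I: IL-2; s1, s2 are the two influx rates.
    from scipy.integrate import solve_ivp

    c, mu2, p1, g1 = 0.02, 0.03, 0.12, 2.0e7
    r2, b, a, g2 = 0.18, 1.0e-9, 1.0, 1.0e5
    p2, g3, mu3 = 5.0, 1.0e3, 10.0
    s1, s2 = 1.0e3, 1.0e3            # assumed treatment terms (effector and IL-2 influx)

    def rhs(t, y):
        E, T, I = y
        dE = c * T - mu2 * E + p1 * E * I / (g1 + I) + s1
        dT = r2 * T * (1.0 - b * T) - a * E * T / (g2 + T)
        dI = p2 * E * T / (g3 + T) - mu3 * I + s2
        return [dE, dT, dI]

    sol = solve_ivp(rhs, (0.0, 100.0), [1.0e4, 1.0e5, 1.0e3], max_step=0.1)
    print("tumor burden at t = 100:", sol.y[1, -1])
    ```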

  20. Induction of a chloracne phenotype in an epidermal equivalent model by 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) is dependent on aryl hydrocarbon receptor activation and is not reproduced by aryl hydrocarbon receptor knock down.

    Science.gov (United States)

    Forrester, Alison R; Elias, Martina S; Woodward, Emma L; Graham, Mark; Williams, Faith M; Reynolds, Nick J

    2014-01-01

    2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) is a potent activator of the aryl hydrocarbon receptor (AhR) and causes chloracne in humans. The pathogenesis and role of AhR in chloracne remains incompletely understood. To elucidate the mechanisms contributing to the development of the chloracne-like phenotype in a human epidermal equivalent model and identify potential biomarkers. Using primary normal human epidermal keratinocytes (NHEK), we studied AhR activation by XRE-luciferase, AhR degradation and CYP1A1 induction. We treated epidermal equivalents with high affinity TCDD or two non-chloracnegens: β-naphthoflavone (β-NF) and 2-(1'H-indole-3'-carbonyl)-thiazole-4-carboxylic acid methyl ester (ITE). Using Western blotting and immunochemistry for filaggrin (FLG), involucrin (INV) and transglutaminase-1 (TGM-1), we compared the effects of the ligands on keratinocyte differentiation and development of the chloracne-like phenotype by H&E. In NHEKs, activation of an XRE-luciferase and CYP1A1 protein induction correlated with ligand binding affinity: TCDD>β-NF>ITE. AhR degradation was induced by all ligands. In epidermal equivalents, TCDD induced a chloracne-like phenotype, whereas β-NF or ITE did not. All three ligands induced involucrin and TGM-1 protein expression in epidermal equivalents whereas FLG protein expression decreased following treatment with TCDD and β-NF. Inhibition of AhR by α-NF blocked TCDD-induced AhR activation in NHEKs and blocked phenotypic changes in epidermal equivalents; however, AhR knock down did not reproduce the phenotype. Ligand-induced CYP1A1 and AhR degradation did not correlate with their chloracnegenic potential, indicating that neither CYP1A1 nor AhR are suitable biomarkers. Mechanistic studies showed that the TCDD-induced chloracne-like phenotype depends on AhR activation whereas AhR knock down did not appear sufficient to induce the phenotype. Copyright © 2013 Japanese Society for Investigative Dermatology. Published by Elsevier

  1. Modeling grain boundaries in polycrystals using cohesive elements: Qualitative and quantitative analysis

    Energy Technology Data Exchange (ETDEWEB)

    El Shawish, Samir, E-mail: Samir.ElShawish@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Cizelj, Leon [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Simonovski, Igor [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands)

    2013-08-15

    Highlights: ► We estimate the performance of cohesive elements for modeling grain boundaries. ► We compare the computed stresses in the ABAQUS finite element solver. ► Tests are performed in analytical and realistic models of polycrystals. ► Most severe issue is found within the plastic grain response. ► Other identified issues are related to topological constraints in modeling space. -- Abstract: We propose and demonstrate several tests to estimate the performance of the cohesive elements in ABAQUS for modeling grain boundaries in complex spatial structures such as polycrystalline aggregates. The performance of the cohesive elements is checked by comparing the computed stresses with the theoretically predicted values for a homogeneous material under uniaxial tensile loading. Statistical analyses are performed under different loading conditions for two elasto-plastic models of the grains: isotropic elasticity with isotropic hardening plasticity and anisotropic elasticity with crystal plasticity. Tests are conducted on an analytical finite element model generated from Voronoi tessellation as well as on a realistic finite element model of a stainless steel wire. The results of the analyses highlight several issues related to the computation of normal and shear stresses. The most severe issue is found within the plastic grain response, where the computed normal stresses on particularly oriented cohesive elements are significantly underestimated. Other issues are found to be related to topological constraints in the modeling space and result in increased scatter of the computed stresses.
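
    The analytical benchmark mentioned (cohesive stresses in a homogeneous material under uniaxial tension) can be summarized with the standard traction relations, sketched below; sigma and the element orientations are arbitrary illustrative values.

```python
# For a homogeneous body under uniaxial tension, the normal and shear tractions on
# an interface depend only on its orientation; finite-element cohesive-element
# stresses can be compared against these closed-form values.
import numpy as np

def tractions_uniaxial(sigma, theta):
    """Normal and shear traction on a plane whose normal is at angle theta (rad)
    to the loading axis, for uniaxial stress sigma."""
    t_n = sigma * np.cos(theta) ** 2
    t_s = sigma * np.sin(theta) * np.cos(theta)
    return t_n, t_s

sigma = 100.0                       # MPa, illustrative applied stress
for deg in (0, 30, 45, 60, 90):
    t_n, t_s = tractions_uniaxial(sigma, np.radians(deg))
    print(f"theta={deg:3d} deg: normal={t_n:6.1f} MPa, shear={t_s:6.1f} MPa")
```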

  2. Qualitative evaluation of various models for mechanical analysis of nuclear wastes storage in brittle rocks

    International Nuclear Information System (INIS)

    Millard, A.

    1994-01-01

    In order to appraise the large-scale behaviour of high-level nuclear waste underground repositories in brittle rocks, basic models are presented and evaluated for generic repository configurations. The predictive capabilities of the models are briefly discussed. 7 figs

  3. Simplified Qualitative Discrete Numerical Model to Determine Cracking Pattern in Brittle Materials by Means of Finite Element Method

    Directory of Open Access Journals (Sweden)

    J. Ochoa-Avendaño

    2017-01-01

    Full Text Available This paper presents the formulation, implementation, and validation of a simplified qualitative model to determine the crack path of solids considering static loads, infinitesimal strain, and a plane stress condition. The model is based on the finite element method with a special meshing technique, in which nonlinear link elements are included between the faces of the linear triangular elements. The stiffness loss of some link elements represents the crack opening. Three experimental tests of bending beams are simulated, and the cracking pattern calculated with the proposed numerical model is similar to the experimental results. The advantages of the proposed model over discrete crack approaches with interface elements are its implementation simplicity, numerical stability, and very low computational cost. Using larger values of the initial stiffness of the link elements does not affect the discontinuity path or the stability of the numerical solution. The exploded mesh procedure presented in this model avoids a complex nonlinear analysis and regenerative or adaptive meshes.
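
    A toy version of the link-element idea, in which an interface spring loses its stiffness once its strength is exceeded, is sketched below; the stiffness, strength, and residual values are made up for illustration and do not reproduce the paper's formulation.

```python
# Minimal sketch of a nonlinear link element of the kind described: a penalty spring
# between element faces whose stiffness is lost once its strength is exceeded, which
# is one way a crack opening can be represented. All values are illustrative.
class LinkElement:
    def __init__(self, k0=1e9, strength=2.0e6, residual=1e-6):
        self.k = k0                  # initial (high) stiffness
        self.k0 = k0
        self.strength = strength     # force at which the link "cracks"
        self.residual = residual     # fraction of stiffness kept after cracking
        self.cracked = False

    def update(self, opening):
        """Return the force transmitted for a given relative opening displacement."""
        force = self.k * opening
        if not self.cracked and abs(force) > self.strength:
            self.cracked = True
            self.k = self.residual * self.k0   # stiffness loss represents crack opening
            force = self.k * opening
        return force

link = LinkElement()
for u in (1e-4, 1e-3, 5e-3):
    print(u, link.update(u), link.cracked)
```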

  4. Reproducibility of central lumbar vertebral BMD

    International Nuclear Information System (INIS)

    Chan, F.; Pocock, N.; Griffiths, M.; Majerovic, Y.; Freund, J.

    1997-01-01

    Full text: Lumbar vertebral bone mineral density (BMD) using dual X-ray absorptiometry (DXA) has generally been calculated from a region of interest which includes the entire vertebral body. Although this region excludes part of the transverse processes, it does include the outer cortical shell of the vertebra. Recent software has been devised to calculate BMD in a central vertebral region of interest which excludes the outer cortical envelope. Theoretically this area may be more sensitive for detecting osteoporosis, which affects trabecular bone to a greater extent than cortical bone. Apart from the sensitivity of BMD estimation, the reproducibility of any measurement is important owing to the slow rate of change of bone mass. We have evaluated the reproducibility of this new vertebral region of interest in 23 women who had duplicate lumbar spine DXA scans performed on the same day. The patients were repositioned between each measurement. Central vertebral analysis was performed for L2-L4 and the reproducibility of area, bone mineral content (BMC) and BMD calculated as the coefficient of variation; these values were compared with those from conventional analysis. We found that the reproducibility of the central BMD is comparable to that of the conventional analysis, which is essential if this technique is to provide any additional clinical data. The reasons for the decrease in reproducibility of the area, and hence BMC, require further investigation
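
    A short-term precision figure of the kind reported for duplicate scans is commonly expressed as a root-mean-square coefficient of variation; a sketch with made-up BMD values is shown below (the formula, not the clinical data, is the point).

```python
# Sketch of a short-term precision calculation from duplicate scans, expressed as
# a root-mean-square coefficient of variation (RMS-CV). The data are made up.
import numpy as np

def rms_cv(scan1, scan2):
    scan1, scan2 = np.asarray(scan1, float), np.asarray(scan2, float)
    pair_sd = np.abs(scan1 - scan2) / np.sqrt(2.0)   # SD of each duplicate pair
    pair_mean = (scan1 + scan2) / 2.0
    return 100.0 * np.sqrt(np.mean((pair_sd / pair_mean) ** 2))

bmd_a = [0.95, 1.02, 0.88, 1.10, 0.99]               # first scan (g/cm^2), illustrative
bmd_b = [0.96, 1.00, 0.90, 1.09, 0.98]               # repeat scan after repositioning
print(f"precision (RMS-CV) = {rms_cv(bmd_a, bmd_b):.2f}%")
```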

  5. Detailed qualitative dynamic knowledge representation using a BioNetGen model of TLR-4 signaling and preconditioning.

    Science.gov (United States)

    An, Gary C; Faeder, James R

    2009-01-01

    Intracellular signaling/synthetic pathways are being characterized in increasing detail. However, while these pathways can be displayed in static diagrams, in reality they exist with a degree of dynamic complexity that is responsible for heterogeneous cellular behavior. Multiple parallel pathways exist and interact concurrently, limiting the ability to integrate the various identified mechanisms into a cohesive whole. Computational methods have been suggested as a means of concatenating this knowledge to aid in the understanding of overall system dynamics. Since the eventual goal of biomedical research is the identification and development of therapeutic modalities, computational representation must have sufficient detail to facilitate this 'engineering' process. Adding to the challenge, this type of representation must occur in a perpetual state of incomplete knowledge. We present a modeling approach to address this challenge that is both detailed and qualitative. This approach is termed 'dynamic knowledge representation,' and is intended to be an integrated component of the iterative cycle of scientific discovery. BioNetGen (BNG), a software platform for modeling intracellular signaling pathways, was used to model the toll-like receptor 4 (TLR-4) signal transduction cascade. The informational basis of the model was a series of reference papers on modulation of TLR-4 signaling, and some specific primary research papers to aid in the characterization of specific mechanistic steps in the pathway. This model was detailed with respect to the components of the pathway represented, but qualitative with respect to the specific reaction coefficients utilized to execute the reactions. Responsiveness to simulated lipopolysaccharide (LPS) administration was measured by tumor necrosis factor (TNF) production. Simulation runs included evaluation of initial dose-dependent response to LPS administration at 10, 100, 1000 and 10,000, and a subsequent examination of

  6. Hierarchy of models: From qualitative to quantitative analysis of circadian rhythms in cyanobacteria

    Science.gov (United States)

    Chaves, M.; Preto, M.

    2013-06-01

    A hierarchy of models, ranging from high to lower levels of abstraction, is proposed to construct "minimal" but predictive and explanatory models of biological systems. Three hierarchical levels will be considered: Boolean networks, piecewise affine differential (PWA) equations, and a class of continuous ordinary differential equation models derived from the PWA model. This hierarchy provides different levels of approximation of the biological system and, crucially, allows the use of theoretical tools to analyze and understand the mechanisms of the system more precisely. The KaiABC oscillator, which is at the core of the cyanobacterial circadian rhythm, is analyzed as a case study, showing how several fundamental properties (order of oscillations, synchronization when mixing oscillating samples, structural robustness, and entrainment by external cues) can be obtained from basic mechanisms.
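
    The most abstract level of such a hierarchy can be illustrated with a small synchronous Boolean network; the three-node repression loop below is a generic toy that already oscillates, not the KaiABC model itself.

```python
# Sketch of the top (most abstract) level of such a hierarchy: a synchronous Boolean
# network with a three-node negative-feedback loop, which already produces sustained
# oscillations. This is a generic toy, not the KaiABC model.
def step(state):
    a, b, c = state
    return (int(not c), int(not a), int(not b))   # each node represses the next

state = (1, 0, 0)
trajectory = []
for t in range(12):
    trajectory.append(state)
    state = step(state)
print(trajectory)   # the trajectory cycles with period 6: a qualitative "rhythm"
```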

  7. Supporting conceptual modelling of dynamic systems: A knowledge engineering perspective on qualitative reasoning

    NARCIS (Netherlands)

    Liem, J.

    2013-01-01

    Research has shown that even students educated in science at prestigious universities have misconceptions about the systems underlying climate change, sustainability and government spending. Interactive conceptual modelling and simulation tools, which are based on Artificial Intelligence techniques,

  8. Method of modelization assistance with bond graphs and application to qualitative diagnosis of physical systems

    International Nuclear Information System (INIS)

    Lucas, B.

    1994-05-01

    After recalling the usual diagnosis techniques (failure index, decision tree) and those based on an artificial intelligence approach, the author reports research aimed at exploring knowledge and model generation techniques. He focuses on the design of a model-generation aid and a diagnosis aid. The bond graph technique is shown to be well suited to aiding model generation and is then adapted to aiding diagnosis. The developed tool is applied to three projects: DIADEME (a diagnosis system based on a physical model), the improvement of the SEXTANT diagnosis system (an expert system for transient analysis), and the investigation of an Ariane 5 launcher component. Notably, the author uses the Reiter and Greiner algorithm

  9. A qualitative comparison of fire spread models incorporating wind and slope effects

    Science.gov (United States)

    David R. Weise; Gregory S. Biging

    1997-01-01

    Wind velocity and slope are two critical variables that affect wildland fire rate of spread. The effects of these variables on rate of spread are often combined in rate-of-spread models using vector addition. The various methods used to combine wind and slope effects have seldom been validated or compared due to differences in the models or to lack of data. In this...
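
    One simple way to combine the two effects, treating the wind and slope contributions to rate of spread as vectors and summing them, is sketched below; the factors and directions are arbitrary illustrative inputs, not a validated spread model.

```python
# Sketch of the vector-addition idea discussed: the wind effect and the slope effect
# on rate of spread are each treated as vectors and summed with the no-wind, no-slope
# spread to give a resultant direction and magnitude. Coefficients are made up.
import numpy as np

def combine(ros0, wind_factor, wind_dir_deg, slope_factor, upslope_dir_deg):
    """Return resultant rate-of-spread magnitude and heading (degrees)."""
    w = ros0 * wind_factor * np.array([np.cos(np.radians(wind_dir_deg)),
                                       np.sin(np.radians(wind_dir_deg))])
    s = ros0 * slope_factor * np.array([np.cos(np.radians(upslope_dir_deg)),
                                        np.sin(np.radians(upslope_dir_deg))])
    r = np.array([ros0, 0.0]) + w + s          # base spread assumed along the x axis
    return np.hypot(*r), np.degrees(np.arctan2(r[1], r[0]))

mag, heading = combine(ros0=1.0, wind_factor=2.5, wind_dir_deg=0.0,
                       slope_factor=1.2, upslope_dir_deg=60.0)
print(f"resultant ROS = {mag:.2f} m/min toward {heading:.1f} deg")
```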

  10. The DEPICT model for participatory qualitative health promotion research analysis piloted in Canada, Zambia and South Africa.

    Science.gov (United States)

    Flicker, Sarah; Nixon, Stephanie A

    2015-09-01

    Health promotion researchers are increasingly conducting Community-Based Participatory Research in an effort to reduce health disparities. Despite efforts towards greater inclusion, research teams continue to regularly exclude diverse representation from data analysis efforts. The DEPICT model for collaborative qualitative analysis is a democratic approach to enhancing rigour through inclusion of diverse stakeholders. It is broken down into six sequential steps. Strong leadership, coordination and facilitation skills are needed; however, the process is flexible enough to adapt to most environments and varying levels of expertise. Including diverse stakeholders on an analysis team can enrich data analysis and provide more nuanced understandings of complicated health problems. © The Author (2014). Published by Oxford University Press.

  11. Conceptual model for dietary behaviour change at household level: a 'best-fit' qualitative study using primary data.

    Science.gov (United States)

    Daivadanam, Meena; Wahlström, Rolf; Ravindran, T K Sundari; Thankappan, K R; Ramanathan, Mala

    2014-06-09

    Interventions having a strong theoretical basis are more efficacious, providing a strong argument for incorporating theory into intervention planning. The objective of this study was to develop a conceptual model to facilitate the planning of dietary intervention strategies at the household level in rural Kerala. Three focus group discussions and 17 individual interviews were conducted among men and women, aged between 23 and 75 years. An interview guide facilitated the process to understand: 1) feasibility and acceptability of a proposed dietary behaviour change intervention; 2) beliefs about foods, particularly fruits and vegetables; 3) decision-making in households with reference to food choices and access; and 4) to gain insights into the kind of intervention strategies that may be practical at community and household level. The data were analysed using a modified form of qualitative framework analysis, which combined both deductive and inductive reasoning. A priori themes were identified from relevant behaviour change theories using construct definitions, and used to index the meaning units identified from the primary qualitative data. In addition, new themes emerging from the data were included. The associations between the themes were mapped into four main factors and its components, which contributed to construction of the conceptual model. Thirteen of the a priori themes from three behaviour change theories (Trans-theoretical model, Health Belief model and Theory of Planned Behaviour) were confirmed or slightly modified, while four new themes emerged from the data. The conceptual model had four main factors and its components: impact factors (decisional balance, risk perception, attitude); change processes (action-oriented, cognitive); background factors (personal modifiers, societal norms); and overarching factors (accessibility, perceived needs and preferences), built around a three-stage change spiral (pre-contemplation, intention, action). Decisional

  12. Conceptual model for dietary behaviour change at household level: a ‘best-fit’ qualitative study using primary data

    Science.gov (United States)

    2014-01-01

    Background Interventions having a strong theoretical basis are more efficacious, providing a strong argument for incorporating theory into intervention planning. The objective of this study was to develop a conceptual model to facilitate the planning of dietary intervention strategies at the household level in rural Kerala. Methods Three focus group discussions and 17 individual interviews were conducted among men and women, aged between 23 and 75 years. An interview guide facilitated the process to understand: 1) feasibility and acceptability of a proposed dietary behaviour change intervention; 2) beliefs about foods, particularly fruits and vegetables; 3) decision-making in households with reference to food choices and access; and 4) to gain insights into the kind of intervention strategies that may be practical at community and household level. The data were analysed using a modified form of qualitative framework analysis, which combined both deductive and inductive reasoning. A priori themes were identified from relevant behaviour change theories using construct definitions, and used to index the meaning units identified from the primary qualitative data. In addition, new themes emerging from the data were included. The associations between the themes were mapped into four main factors and its components, which contributed to construction of the conceptual model. Results Thirteen of the a priori themes from three behaviour change theories (Trans-theoretical model, Health Belief model and Theory of Planned Behaviour) were confirmed or slightly modified, while four new themes emerged from the data. The conceptual model had four main factors and its components: impact factors (decisional balance, risk perception, attitude); change processes (action-oriented, cognitive); background factors (personal modifiers, societal norms); and overarching factors (accessibility, perceived needs and preferences), built around a three-stage change spiral (pre

  13. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  14. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments will significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  15. Reproducibility, controllability, and optimization of LENR experiments

    International Nuclear Information System (INIS)

    Nagel, David J.

    2006-01-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments will significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR

  16. Undefined cellulase formulations hinder scientific reproducibility.

    Science.gov (United States)

    Himmel, Michael E; Abbas, Charles A; Baker, John O; Bayer, Edward A; Bomble, Yannick J; Brunecky, Roman; Chen, Xiaowen; Felby, Claus; Jeoh, Tina; Kumar, Rajeev; McCleary, Barry V; Pletschke, Brett I; Tucker, Melvin P; Wyman, Charles E; Decker, Stephen R

    2017-01-01

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Therefore, being a critical tenet of science publishing, experimental reproducibility is endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  17. Proposing a Qualitative Approach for Corporate Competitive Capability Modeling in High-Tech Business (Case study: Software Industry

    Directory of Open Access Journals (Sweden)

    Mahmoud Saremi Saremi

    2010-09-01

    Full Text Available The evolution of global business trends for ICT-based products in recent decades shows the intensive activity of pioneering developing countries to gain a powerful competitive position in the global software industry. In this research, given the importance of competition for top managers of Iranian software companies, a conceptual model of corporate competitive capability has been developed. First, after describing the research problem, we present a comparative review of recent theories of the firm and of competition that have been applied by different researchers in the high-tech and knowledge-intensive organization field. Afterwards, based on a detailed review of the literature and previous research papers, an initial research framework and the applied research method are proposed. The final section of the paper describes the results of the different steps of the qualitative modeling process. The agreed concepts related to corporate competitive capability, the elicited and analyzed expert cause maps, the elicited collective causal maps, and the final proposed model for the software industry are the modeling results of this paper.

  18. Synthesizing diverse evidence: the use of primary qualitative data analysis methods and logic models in public health reviews.

    Science.gov (United States)

    Baxter, S; Killoran, A; Kelly, M P; Goyder, E

    2010-02-01

    The nature of public health evidence presents challenges for conventional systematic review processes, with increasing recognition of the need to include a broader range of work including observational studies and qualitative research, yet with methods to combine diverse sources remaining underdeveloped. The objective of this paper is to report the application of a new approach for review of evidence in the public health sphere. The method enables a diverse range of evidence types to be synthesized in order to examine potential relationships between a public health environment and outcomes. The study drew on previous work by the National Institute for Health and Clinical Excellence on conceptual frameworks. It applied and further extended this work to the synthesis of evidence relating to one particular public health area: the enhancement of employee mental well-being in the workplace. The approach utilized thematic analysis techniques from primary research, together with conceptual modelling, to explore potential relationships between factors and outcomes. The method enabled a logic framework to be built from a diverse document set that illustrates how elements and associations between elements may impact on the well-being of employees. Whilst recognizing potential criticisms of the approach, it is suggested that logic models can be a useful way of examining the complexity of relationships between factors and outcomes in public health, and of highlighting potential areas for interventions and further research. The use of techniques from primary qualitative research may also be helpful in synthesizing diverse document types. Copyright 2010 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  19. The co-operative model as a means of stakeholder management: An exploratory qualitative analysis

    Directory of Open Access Journals (Sweden)

    Darrell Hammond

    2016-11-01

    Full Text Available The South African economy has for some time been characterised by high unemployment, income inequality and a skills mismatch, all of which have contributed to conflict between business, government and labour. The co-operative model of stakeholder management is examined as a possible mitigating organisational form in this high-conflict environment. International experience indicates some success with co-operative models but they are not easy to implement effectively and face severe obstacles. Trust and knowledge sharing are critical for enabling a co-operative model of stakeholder management, which requires strong governance and adherence to strict rules. The model must balance the tension between optimisation of governance structures and responsiveness to members' needs. Furthermore, support from social and political institutions is necessary. We find barriers to scalability which manifest in the lack of depth of business skills, negative perception of the co-operative model by external stakeholders, government ambivalence, and a lack of willingness on the part of workers to co-operate for mutual benefit.

  20. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility of natural head position with time on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies confirmed this reproducibility with time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  1. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    Ag nanocubes could be synthesized highly reproducibly by conducting the polyol synthesis with an HCl etchant in the dark, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth were promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was carried out under light, owing to the photoreduction of AgCl to Ag.

  2. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making statistical analyses reproducible. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how the components are used together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  3. The Time Delays’ Effects on the Qualitative Behavior of an Economic Growth Model

    Directory of Open Access Journals (Sweden)

    Carlo Bianca

    2013-01-01

    Full Text Available A further generalization of an economic growth model is the main topic of this paper. The paper specifically analyzes the effects on the asymptotic dynamics of the Solow model when two time delays are inserted: the time needed for capital to be used in production and the time needed for capital to depreciate. The existence of a unique nontrivial positive steady state of the generalized model is proved, and sufficient conditions for asymptotic stability are established. Moreover, the existence of a Hopf bifurcation is proved and, by using normal form theory and a center manifold argument, explicit formulas that determine the stability, direction, and period of the bifurcating periodic solutions are obtained. Finally, numerical simulations are performed to support the analytical results.
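
    A delayed Solow-type capital equation of this kind can be explored numerically with a plain Euler scheme and a history buffer, as sketched below; the production function, parameters, and delays are illustrative assumptions, not the paper's specification.

```python
# Sketch of a delayed Solow-type capital equation, integrated with a plain Euler
# scheme and a history buffer. The two lags stand for the production delay and the
# depreciation delay; all values are illustrative.
import numpy as np

s, delta, alpha = 0.3, 0.05, 0.33     # saving rate, depreciation rate, output elasticity
tau1, tau2 = 4.0, 2.0                 # production delay, depreciation delay
dt, T = 0.01, 400.0
n = int(T / dt)
lag1, lag2 = int(tau1 / dt), int(tau2 / dt)

k = np.empty(n)
k[: max(lag1, lag2) + 1] = 1.0        # constant history before t = 0
for i in range(max(lag1, lag2), n - 1):
    k[i + 1] = k[i] + dt * (s * k[i - lag1] ** alpha - delta * k[i - lag2])

print("capital after transient:", k[-1])  # settles near, or oscillates around, k*
```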

  4. GMM - a general microstructural model for qualitative and quantitative studies of smectite clays

    International Nuclear Information System (INIS)

    Pusch, R.; Karnland, O.; Hoekmark, H.

    1990-12-01

    A few years ago an attempt was made to bring together, in an integrated microstructural model, a number of basic ideas on the fabric and interparticle forces assumed to be valid in montmorillonite clay; this resulted in an SKB report on 'Outlines of models of water and gas flow through smectite clay buffers'. That model gave reasonable agreement between predicted hydraulic conductivity values and actually recorded ones for room temperature and porewater poor in electrolytes. The present report describes an improved model that also accounts for effects generated by salt porewater and heating, and that provides a basis both for quantitative determination of transport capacities in a more general way and for analysis and prediction of bulk rheological behaviour. Investigators in this field understood very early that a full understanding of the physical state of the porewater is required before models of clay particle interaction can be developed. In particular, deep insight into the nature of the interlamellar water, the hydration mechanisms leading to an equilibrium state between the two types of water, and the force fields in matured smectite clay requires highly qualified multidisciplinary research, and the senior author has attempted to initiate and coordinate such work over the last 30 years. Despite this effort it has not been possible to reach a unanimous understanding of these matters, but a number of major features have become clearer through the work carried out in the current SKB research programme. Thus, NMR studies and precision measurements of porewater density, as well as comprehensive electron microscopy and rheological testing combined with the application of stochastic mechanics, have led to the hypothetical microstructural model - the GMM - presented in this report. (au)

  5. Qualitative analysis of a stochastic epidemic model with specific functional response and temporary immunity

    Science.gov (United States)

    Hattaf, Khalid; Mahrouf, Marouane; Adnani, Jihad; Yousfi, Noura

    2018-01-01

    In this paper, we propose a stochastic delayed epidemic model with a specific functional response. The time delay represents the temporary immunity period, i.e., the time from recovery to becoming susceptible again. We first show that the proposed model is mathematically and biologically well posed. Moreover, extinction of the disease and persistence in the mean are established in terms of a threshold value R0S, which is smaller than the basic reproduction number R0 of the corresponding deterministic system.
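
    The flavor of such a model can be sketched with an Euler-Maruyama simulation of a stochastic SIRS-type system in which recovered individuals lose immunity after a fixed delay; the incidence term and noise structure below are assumptions for illustration, not the paper's specific functional response.

```python
# Euler-Maruyama sketch of a stochastic SIRS-type model with a fixed temporary-
# immunity delay (recovered individuals become susceptible again after tau).
# Incidence term, noise structure and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
beta, gamma, sigma, tau = 0.4, 0.1, 0.05, 30.0
dt, T = 0.1, 400.0
n, lag = int(T / dt), int(tau / dt)

S, I, R = np.empty(n), np.empty(n), np.empty(n)
S[0], I[0], R[0] = 0.99, 0.01, 0.0
recov = np.zeros(n)                       # recovery flux history, for the delay term
for t in range(n - 1):
    inc = beta * S[t] * I[t] / (1.0 + I[t])          # saturating incidence (assumed)
    recov[t] = gamma * I[t]
    back = recov[t - lag] if t >= lag else 0.0       # loss of immunity after tau
    noise = sigma * I[t] * np.sqrt(dt) * rng.standard_normal()
    S[t + 1] = S[t] + dt * (-inc + back)
    I[t + 1] = max(I[t] + dt * (inc - gamma * I[t]) + noise, 0.0)
    R[t + 1] = R[t] + dt * (recov[t] - back)

print("mean infectious fraction over last half:", I[n // 2:].mean())
```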

  6. Qualitative properties and hopf bifurcation in haematopoietic disease model with chemotherapy

    Directory of Open Access Journals (Sweden)

    Yafia R.

    2014-01-01

    Full Text Available In this paper, we consider a model describing the dynamics of Hematopoietic Stem Cell (HSC) disease with chemotherapy. The model is given by a system of three ordinary differential equations with a discrete delay. Its dynamics are studied in terms of the local stability of the possible steady states for the case without a drug intervention term. We prove the existence of periodic oscillations in each case when the delay passes through critical values. Finally, we illustrate our results with numerical simulations.

  7. Qualitative analysis of an integro-differential equation model of periodic chemotherapy

    KAUST Repository

    Jain, Harsh Vardhan; Byrne, Helen M.

    2012-01-01

    An existing model of tumor growth that accounts for cell cycle arrest and cell death induced by chemotherapy is extended to simulate the response to treatment of a tumor growing in vivo. The tumor is assumed to undergo logistic growth in the absence

  8. Family Adaptation to Stroke: A Metasynthesis of Qualitative Research based on Double ABCX Model

    Directory of Open Access Journals (Sweden)

    Ali Hesamzadeh, RN, PhD Student of Nursing

    2015-09-01

    Conclusions: The results of the study are in conformity with the tenets of the Double ABCX Model. Family adaptation is a dynamic process and the present study findings provide rich information on proper assessment and intervention to the practitioners working with families of stroke survivors.

  9. How Model Can Help Inquiry--A Qualitative Study of Model Based Inquiry Learning (Mobile) in Engineering Education

    Science.gov (United States)

    Gong, Yu

    2017-01-01

    This study investigates how students can use "interactive example models" in inquiry activities to develop their conceptual knowledge about an engineering phenomenon like electromagnetic fields and waves. An interactive model, for example a computational model, could be used to develop and teach principles of dynamic complex systems, and…

  10. Models for integrating medical acupuncture into practice: an exploratory qualitative study of physicians' experiences.

    Science.gov (United States)

    Crumley, Ellen T

    2016-08-01

    Internationally, physicians are integrating medical acupuncture into their practice. Although there are some informative surveys and reviews, there are few international, exploratory studies detailing how physicians have accommodated medical acupuncture (eg, by modifying schedules, space and processes). To examine how physicians integrate medical acupuncture into their practice. Semi-structured interviews and participant observations of physicians practising medical acupuncture were conducted using convenience and snowball sampling. Data were analysed in NVivo and themes were developed. Despite variation, three principal models were developed to summarise the different ways that physicians integrated medical acupuncture into their practice, using the core concept of 'helping'. Quotes were used to illustrate each model and its corresponding themes. There were 25 participants from 11 countries: 21 agreed to be interviewed and four engaged in participant observations. Seventy-two per cent were general practitioners. The three models were: (1) appointments (44%); (2) clinics (44%); and (3) full-time practice (24%). Some physicians held both appointments and regular clinics (models 1 and 2). Most full-time physicians initially tried appointments and/or clinics. Some physicians charged to offset administration costs or compensate for their time. Despite variation within each category, the three models encapsulated how physicians described their integration of medical acupuncture. Physicians varied in how often they administered medical acupuncture and the amount of time they spent with patients. Although 24% of physicians surveyed administered medical acupuncture full-time, most practised it part-time. Each individual physician incorporated medical acupuncture in the way that worked best for their practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  11. The development of a model of dignity in illness based on qualitative interviews with seriously ill patients.

    Science.gov (United States)

    van Gennip, Isis E; Pasman, H Roeline W; Oosterveld-Vlug, Mariska G; Willems, Dick L; Onwuteaka-Philipsen, Bregje D

    2013-08-01

    While knowledge of the factors affecting the personal dignity of patients nearing death is quite substantial, far less is known about how patients living with a serious disease understand dignity. The aim was to develop a conceptual model of dignity that illuminates the process by which serious illness can undermine patients' dignity, and that is applicable to a wide patient population. Qualitative interview study. 34 patients with either cancer, early stage dementia, or a severe chronic illness were selected from an extensive cohort study into advance directives. In-depth interviews were carried out exploring the experiences of seriously ill patients with regard to their personal dignity. The interview transcripts were analyzed using thematic analysis and a conceptual model was constructed based on the resulting themes. We developed a two-step dignity model of illness. According to this model, illness-related conditions do not affect patients' dignity directly but indirectly, by affecting the way patients perceive themselves. We identified three components shaping self-perception: (a) the individual self: the subjective experiences and internally held qualities of the patient; (b) the relational self: the self within reciprocal interaction with others; and (c) the societal self: the self as a social object in the eyes of others. The merits of the model are twofold. First, it offers an organizing framework for further research into patients' dignity. Second, the model can serve to facilitate care for seriously ill patients in practice by providing insight into illness and dignity at the level of the individual patient, where intervention can be effectively targeted. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Using satellite imagery for qualitative evaluation of plume transport in modeling the effects of the Kuwait oil fire smoke plumes

    International Nuclear Information System (INIS)

    Bass, A.; Janota, P.

    1992-01-01

    To forecast the behavior of the Kuwait oil fire smoke plumes and their possible acute or chronic health effects over the Arabian Gulf region, TASC created a comprehensive health and environmental impacts modeling system. A specially-adapted Lagrangian puff transport model was used to create (a) short-term (multiday) forecasts of plume transport and ground-level concentrations of soot and SO2; and (b) long-term (seasonal and longer) estimates of average surface concentrations and depositions. EPA-approved algorithms were used to transform exposures to SO2 and soot (as PAH/BaP) into morbidity, mortality and crop damage risks. Absent any ground truth, satellite imagery from the NOAA Polar Orbiter and the ESA Geostationary Meteosat offered the only opportunity for timely qualitative evaluation of the long-range plume transport and diffusion predictions. This paper shows the use of actual satellite images (including animated loops of hourly Meteosat images) to evaluate plume forecasts in near-real-time, and to sanity-check the meso- and long-range plume transport projections for the long-term estimates. Example modeled concentrations, depositions and health effects are shown

  13. Rethinking work-health models for the new global economy: a qualitative analysis of emerging dimensions of work.

    Science.gov (United States)

    Polanyi, Michael; Tompa, Emile

    2004-01-01

    Technology change, rising international trade and investment, and increased competition are changing the organization, distribution and nature of work in industrialized countries. To enhance productivity, employers are striving to increase innovation while minimizing costs. This is leading to an intensification of work demands on core employees and the outsourcing or casualization of more marginal tasks, often to contingent workers. The two prevailing models of work and health - demand-control and effort-reward imbalance - may not capture the full range of experiences of workers in today's increasingly flexible and competitive economies. To explore this proposition, we conducted a secondary qualitative analysis of interviews with 120 American workers [6]. Our analysis identifies aspects of work affecting the quality of workers' experiences that are largely overlooked by popular work-health models: the nature of social interactions with customers and clients; workers' belief in, and perception of, the importance of the product of their work. We suggest that the quality of work experiences is partly determined by the objective characteristics of the work environment, but also by the fit of the work environment with the worker's needs, interests, desires and personality, something not adequately captured in current models.

  14. Prediction of qualitative parameters of slab steel ingot using numerical modelling

    Directory of Open Access Journals (Sweden)

    M. Tkadlečková

    2016-07-01

    Full Text Available The paper describes the verification of casting and solidification of heavy slab ingot weighing 40 t from tool steel by means of numerical modelling with use of a finite element method. The pre-processing, processing and post-processing phases of numerical modelling are outlined. Also, the problems with determination of the thermodynamic properties of materials and with determination of the heat transfer between the individual parts of the casting system are discussed. The final porosity, macrosegregation and the risk of cracks were predicted. The results allowed us to use the slab ingot instead of the conventional heavy steel ingot and to improve the ratio, the chamfer and the external shape of the wall of the new design of the slab ingot.

  15. Disentangling the Complexity of HGF Signaling by Combining Qualitative and Quantitative Modeling.

    Directory of Open Access Journals (Sweden)

    Lorenza A D'Alessandro

    2015-04-01

    Full Text Available Signaling pathways are characterized by crosstalk, feedback and feedforward mechanisms giving rise to highly complex and cell-context specific signaling networks. Dissecting the underlying relations is crucial to predict the impact of targeted perturbations. However, a major challenge in identifying cell-context specific signaling networks is the enormous number of potentially possible interactions. Here, we report a novel hybrid mathematical modeling strategy to systematically unravel hepatocyte growth factor (HGF)-stimulated phosphoinositide-3-kinase (PI3K) and mitogen activated protein kinase (MAPK) signaling, which critically contribute to liver regeneration. By combining time-resolved quantitative experimental data generated in primary mouse hepatocytes with interaction graph and ordinary differential equation modeling, we identify and experimentally validate a network structure that represents the experimental data best and indicates specific crosstalk mechanisms. Whereas the identified network is robust against single perturbations, combinatorial inhibition strategies are predicted that result in strong reduction of Akt and ERK activation. Thus, by capitalizing on the advantages of the two modeling approaches, we reduce the high combinatorial complexity and identify cell-context specific signaling networks.

  16. Flexibility in community pharmacy: a qualitative study of business models and cognitive services.

    Science.gov (United States)

    Feletto, Eleonora; Wilson, Laura K; Roberts, Alison S; Benrimoj, Shalom I

    2010-04-01

    To identify the capacity of current pharmacy business models, and the dimensions of organisational flexibility within them, to integrate products and services as well as the perceptions of viability of these models. Fifty-seven semi-structured interviews were conducted with community pharmacy owners or managers and support staff in 30 pharmacies across Australia. A framework of organisational flexibility was used to analyse their capacity to integrate services and perceptions of viability. Data were analysed using the method of constant comparison by two independent researchers. The study found that Australian community pharmacies have used the four types of flexibility to build capacity in distinct ways and react to changes in the local environment. This capacity building was manifested in four emerging business models which integrate services to varying degrees: classic community pharmacy, retail destination pharmacy, health care solution pharmacy and networked pharmacy. The perception of viability is less focused on dispensing medications and more focused on differentiating pharmacies through either a retail or services focus. Strategic flexibility appeared to offer pharmacies the ability to integrate and sustainably deliver services more successfully than other types, as exhibited by health care solution and networked pharmacies. Active support and encouragement to transition from being dependent on dispensing to implementing services is needed. The study showed that pharmacies where services were implemented and showed success are those strategically differentiating their businesses to become focused health care providers. This holistic approach should inevitably influence the sustainability of services.

  17. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described
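
    The central objects (a reproducing kernel, its Gram matrix, and a draw from the corresponding Gaussian process prior) can be sketched numerically as below; the squared-exponential kernel and length scale are arbitrary illustrative choices.

```python
# Sketch of the objects involved: a reproducing kernel, its Gram matrix on a grid,
# and a sample path from the corresponding Gaussian process prior. The kernel choice
# and length scale are illustrative, not tied to the paper.
import numpy as np

def sq_exp_kernel(x, y, ell=0.2):
    return np.exp(-((x[:, None] - y[None, :]) ** 2) / (2.0 * ell ** 2))

x = np.linspace(0.0, 1.0, 200)
K = sq_exp_kernel(x, x)                       # Gram matrix of the reproducing kernel
rng = np.random.default_rng(1)
prior_draw = rng.multivariate_normal(np.zeros(len(x)), K + 1e-10 * np.eye(len(x)))
print("prior draw range:", prior_draw.min(), prior_draw.max())
```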

  18. Reproducibility of the results in ultrasonic testing

    International Nuclear Information System (INIS)

    Chalaye, M.; Launay, J.P.; Thomas, A.

    1980-12-01

    This memorandum reports on the conclusions of the tests carried out in order to evaluate the reproducibility of ultrasonic tests made on welded joints. FRAMATOME have started a study to assess the dispersion of results afforded by the test line and to characterize its behaviour. The tests covered sensors and ultrasonic generators said to be identical to each other (same commercial batch) [fr

  19. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  20. Reproducibility, Controllability, and Optimization of Lenr Experiments

    Science.gov (United States)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  1. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. 
Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel|info:eu-repo/dai/nl/407629971; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  2. Size Control of Sessile Microbubbles for Reproducibly Driven Acoustic Streaming

    Science.gov (United States)

    Volk, Andreas; Kähler, Christian J.

    2018-05-01

    Acoustically actuated bubbles are receiving growing interest in microfluidic applications, as they induce a streaming field that can be used for particle sorting and fluid mixing. An essential but often unspoken challenge in such applications is to maintain a constant bubble size to achieve reproducible conditions. We present an automated system for the size control of a cylindrical bubble that is formed at a blind side pit of a polydimethylsiloxane microchannel. Using a pressure control system, we adapt the protrusion depth of the bubble into the microchannel to a precision of approximately 0.5 μm on a timescale of seconds. By comparing the streaming field generated by bubbles of width 80 μm with a protrusion depth between -12 and 60 μm, we find that the mean velocity of the induced streaming fields varies by more than a factor of 4. We also find a qualitative change in the topology of the streaming field. Both observations confirm the importance of bubble size control for achieving reproducible and reliable bubble-driven streaming experiments.

  3. A Bayesian Perspective on the Reproducibility Project: Psychology.

    Science.gov (United States)

    Etz, Alexander; Vandekerckhove, Joachim

    2016-01-01

    We revisit the results of the recent Reproducibility Project: Psychology by the Open Science Collaboration. We compute Bayes factors (a quantity that can be used to express comparative evidence for a hypothesis as well as for the null hypothesis) for a large subset (N = 72) of the original papers and their corresponding replication attempts. In our computation, we take into account the likely scenario that publication bias had distorted the originally published results. Overall, 75% of studies gave qualitatively similar results in terms of the amount of evidence provided. However, the evidence was often weak (i.e., a Bayes factor below 10): most studies (64%) did not provide strong evidence for either the null or the alternative hypothesis in either the original or the replication, and no replication attempts provided strong evidence in favor of the null. In all cases where the original paper provided strong evidence but the replication did not (15%), the sample size in the replication was smaller than the original. Where the replication provided strong evidence but the original did not (10%), the replication sample size was larger. We conclude that the apparent failure of the Reproducibility Project to replicate many target effects can be adequately explained by overestimation of effect sizes (or overestimation of evidence against the null hypothesis) due to small sample sizes and publication bias in the psychological literature. We further conclude that traditional sample sizes are insufficient and that a more widespread adoption of Bayesian methods is desirable.
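
    For readers unfamiliar with the quantity, one simple (and rough) way to obtain a Bayes factor for a two-group comparison is the BIC approximation sketched below; this is not the publication-bias-adjusted computation used in the study, and the data are simulated.

```python
# Rough sketch of one way to express comparative evidence as a Bayes factor: the BIC
# approximation BF01 ~= exp((BIC1 - BIC0) / 2) for a two-sample mean difference.
# This is NOT the bias-mitigated method of the paper; data and method are illustrative.
import numpy as np

def bic_bayes_factor(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x) + len(y)
    pooled = np.concatenate([x, y])
    # H0: common mean; H1: separate group means (Gaussian errors, ML variance)
    rss0 = np.sum((pooled - pooled.mean()) ** 2)
    rss1 = np.sum((x - x.mean()) ** 2) + np.sum((y - y.mean()) ** 2)
    bic0 = n * np.log(rss0 / n) + 1 * np.log(n)
    bic1 = n * np.log(rss1 / n) + 2 * np.log(n)
    return np.exp((bic1 - bic0) / 2.0)        # BF01: evidence for the null

rng = np.random.default_rng(2)
bf01 = bic_bayes_factor(rng.normal(0.0, 1.0, 30), rng.normal(0.3, 1.0, 30))
print(f"BF01 = {bf01:.2f} (values near 1 mean weak evidence either way)")
```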

  4. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. Such tools will empower researchers and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, is an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitated its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow resulting from the distributed peer code review system, was high at 0.46.

  5. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who completed the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories it was 0.94 (0.89-0.97) for 8-10 years, 0.98 (0.96-0.99) for 11-13 years, and 0.95 (0.91-0.98) for 14-16 years. The ICC between the mean TEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16, the values were 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating physical activity (PA) and showed a high correlation with Peak VO2.
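
    The reproducibility statistics quoted above (Bland-Altman agreement and the ICC) are straightforward to compute. A minimal sketch on simulated test-retest DEE values follows; the one-way ICC(1,1) form and the synthetic numbers are illustrative assumptions and need not match the exact ICC variant used in the study.

      import numpy as np

      def bland_altman_limits(test, retest):
          """Mean bias and 95% limits of agreement between two administrations."""
          diff = np.asarray(test) - np.asarray(retest)
          bias = diff.mean()
          half_width = 1.96 * diff.std(ddof=1)
          return bias, bias - half_width, bias + half_width

      def icc_oneway(test, retest):
          """One-way random-effects ICC(1,1) for test-retest agreement."""
          scores = np.column_stack([test, retest])        # n subjects x k = 2 occasions
          n, k = scores.shape
          subj_means = scores.mean(axis=1)
          grand = scores.mean()
          msb = k * np.sum((subj_means - grand) ** 2) / (n - 1)              # between subjects
          msw = np.sum((scores - subj_means[:, None]) ** 2) / (n * (k - 1))  # within subjects
          return (msb - msw) / (msb + (k - 1) * msw)

      rng = np.random.default_rng(0)
      true_dee = rng.normal(2200, 300, 162)      # hypothetical daily energy expenditure (kcal)
      t1 = true_dee + rng.normal(0, 60, 162)     # first administration
      t2 = true_dee + rng.normal(0, 60, 162)     # second administration
      print(icc_oneway(t1, t2), bland_altman_limits(t1, t2))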

  6. A qualitative readiness-requirements assessment model for enterprise big-data infrastructure investment

    Science.gov (United States)

    Olama, Mohammed M.; McNair, Allen W.; Sukumar, Sreenivas R.; Nutaro, James J.

    2014-05-01

    In the last three decades, there has been exponential growth in the area of information technology providing for the information processing needs of data-driven businesses in government, science, and private industry in the form of capturing, staging, integrating, conveying, analyzing, and transferring data that will help knowledge workers and decision makers make sound business decisions. Data integration across enterprise warehouses is one of the most challenging steps in the big data analytics strategy. Several levels of data integration have been identified across enterprise warehouses: data accessibility, common data platform, and consolidated data model. Each level of integration has its own set of complexities that requires a certain amount of time, budget, and resources to implement. Such levels of integration are designed to address the technical challenges inherent in consolidating the disparate data sources. In this paper, we present a methodology based on industry best practices to measure the readiness of an organization and its data sets against the different levels of data integration. We introduce a new Integration Level Model (ILM) tool, which is used for quantifying an organization and data system's readiness to share data at a certain level of data integration. It is based largely on the established and accepted framework provided in the Data Management Association's Data Management Body of Knowledge (DAMA-DMBOK). It comprises several key data management functions and supporting activities, together with several environmental elements that describe and apply to each function. The proposed model scores the maturity of a system's data governance processes and provides a pragmatic methodology for evaluating integration risks. The higher the computed scores, the better managed the source data system and the greater the likelihood that the data system can be brought in at a higher level of integration.
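
    The abstract describes the ILM as scoring data management functions against the three integration levels but does not give the scoring formula, so the sketch below is a purely hypothetical illustration of such a readiness score; the function names, weights, ratings and cut-offs are all invented.

      # Hypothetical ILM-style readiness score: each DAMA-DMBOK-like function gets a
      # 1-5 maturity rating and a weight; the weighted average is mapped onto the
      # three integration levels named in the abstract.
      FUNCTION_WEIGHTS = {
          "data governance": 0.25,
          "data architecture": 0.20,
          "data quality": 0.20,
          "metadata management": 0.15,
          "security and access": 0.10,
          "operations and support": 0.10,
      }
      LEVEL_CUTOFFS = [(2.0, "data accessibility"),
                       (3.5, "common data platform"),
                       (5.0, "consolidated data model")]

      def ilm_readiness(ratings):
          score = sum(FUNCTION_WEIGHTS[f] * ratings[f] for f in FUNCTION_WEIGHTS)
          for cutoff, level in LEVEL_CUTOFFS:
              if score <= cutoff:
                  return score, level
          return score, LEVEL_CUTOFFS[-1][1]

      print(ilm_readiness({"data governance": 3, "data architecture": 4, "data quality": 3,
                           "metadata management": 2, "security and access": 4,
                           "operations and support": 3}))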

  7. Diffusion of a collaborative care model in primary care: a longitudinal qualitative study

    Directory of Open Access Journals (Sweden)

    Vedel Isabelle

    2013-01-01

    Background: Although collaborative team models (CTM) improve care processes and health outcomes, their diffusion poses challenges related to difficulties in securing their adoption by primary care clinicians (PCPs). The objectives of this study are to understand: (1) how the perceived characteristics of a CTM influenced clinicians' decision to adopt, or not, the model; and (2) the model's diffusion process. Methods: We conducted a longitudinal case study based on the Diffusion of Innovations Theory. First, diffusion curves were developed for all 175 PCPs and 59 nurses practicing in one borough of Paris. Second, semi-structured interviews were conducted with a representative sample of 40 PCPs and 15 nurses to better understand the implementation dynamics. Results: Diffusion curves showed that 3.5 years after the start of the implementation, 100% of nurses and over 80% of PCPs had adopted the CTM. The dynamics of the CTM's diffusion were different between the PCPs and the nurses. The slopes of the two curves are also distinctly different. Among the nurses, the critical mass of adopters was attained faster, since they adopted the CTM earlier and more quickly than the PCPs. Results of the semi-structured interviews showed that these differences in diffusion dynamics were mostly rooted in differences between the PCPs' and the nurses' perceptions of the CTM's compatibility with norms, values and practices and of its relative advantage (impact on patient management and work practices). Opinion leaders played a key role in the diffusion of the CTM among PCPs. Conclusion: CTM diffusion is a social phenomenon that requires a major commitment by clinicians and a willingness to take risks; the role of opinion leaders is key. Paying attention to the notion of a critical mass of adopters is essential to developing implementation strategies that will accelerate the adoption process by clinicians.

  8. Modeling bistable cell-fate choices in the Drosophila eye: qualitative and quantitative perspectives

    Science.gov (United States)

    Graham, Thomas G. W.; Tabei, S. M. Ali; Dinner, Aaron R.; Rebay, Ilaria

    2010-01-01

    A major goal of developmental biology is to understand the molecular mechanisms whereby genetic signaling networks establish and maintain distinct cell types within multicellular organisms. Here, we review cell-fate decisions in the developing eye of Drosophila melanogaster and the experimental results that have revealed the topology of the underlying signaling circuitries. We then propose that switch-like network motifs based on positive feedback play a central role in cell-fate choice, and discuss how mathematical modeling can be used to understand and predict the bistable or multistable behavior of such networks. PMID:20570936

  9. Qualitative analysis of an integro-differential equation model of periodic chemotherapy

    KAUST Repository

    Jain, Harsh Vardhan

    2012-12-01

    An existing model of tumor growth that accounts for cell cycle arrest and cell death induced by chemotherapy is extended to simulate the response to treatment of a tumor growing in vivo. The tumor is assumed to undergo logistic growth in the absence of therapy, and treatment is administered periodically rather than continuously. Necessary and sufficient conditions for the global stability of the cancer-free equilibrium are derived and conditions under which the system evolves to periodic solutions are determined. © 2012 Elsevier Ltd. All rights reserved.
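
    As a loose illustration of the kind of dynamics studied (logistic tumour growth with periodically applied therapy), the sketch below integrates a much-simplified ODE caricature; the growth, kill and dosing parameters are invented, and cell-cycle arrest, the defining feature of the paper's integro-differential model, is omitted.

      from scipy.integrate import solve_ivp

      r, K = 0.2, 1.0                        # growth rate (1/day) and scaled carrying capacity
      kill, period, t_on = 0.6, 7.0, 2.0     # extra death rate, dosing period, infusion length (days)

      def rhs(t, y):
          dosing = kill if (t % period) < t_on else 0.0   # drug switched on periodically
          return r * y * (1.0 - y / K) - dosing * y

      sol = solve_ivp(rhs, (0.0, 120.0), [0.2], max_step=0.1)
      print("scaled tumour burden after 120 days:", sol.y[0, -1])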

  10. Mathematical-statistical models and qualitative theories for economic and social sciences

    CERN Document Server

    Maturo, Fabrizio; Kacprzyk, Janusz

    2017-01-01

    This book presents a broad spectrum of problems related to statistics, mathematics, teaching, social science, and economics, as well as a range of tools and techniques that can be used to solve these problems. It is the result of a scientific collaboration between experts in the field of economic and social systems from the University of Defence in Brno (Czech Republic), G. d’Annunzio University of Chieti-Pescara (Italy), Pablo de Olavide University of Sevilla (Spain), and Ovidius University in Constanţa (Romania). The studies included were selected using a peer-review process and reflect the heterogeneity and complexity of economic and social phenomena. They present interesting empirical research from around the globe and from several research fields, such as statistics, decision making, mathematics, complexity, psychology, sociology and economics. The volume is divided into two parts. The first part, “Recent trends in mathematical and statistical models for economic and social sciences”, collects pap...

  11. Conceptual model of acid attacks based on survivor's experiences: Lessons from a qualitative exploration.

    Science.gov (United States)

    Sabzi Khoshnami, Mohammad; Mohammadi, Elham; Addelyan Rasi, Hamideh; Khankeh, Hamid Reza; Arshi, Maliheh

    2017-05-01

    Acid attacks, a worldwide phenomenon, have been increasing in recent years. In addition to severe injuries to the face and body, such violence leads to psychological and social problems that affect the survivors' quality of life. The present study provides a more in-depth understanding of this phenomenon and explores the nature and dimensions of acid attacks based on survivors' experiences. A grounded theory study using semi-structured, recorded interviews and applying purposeful theoretical sampling was conducted with 12 acid attack survivors in Iran. Data were analysed using constant comparison in open, axial and selective coding stages. A conceptual model was developed to explain the relationships among the main categories extracted through the grounded theory study. Physical and psychological wounds emerged as a core category. A traditional context and the extreme value placed on beauty in society acted as the context of the physical and psychological wounds experienced. Living with a drug abuser with behavioural disorders and a lack of problem-solving skills in interpersonal conflict were found to be causal conditions. Action strategies to deal with this experience were found to operate at individual, interpersonal and structural levels. Education, and the percentage and location of burns, acted as intervening conditions that influenced survivors' strategies. Finally, the adverse consequences of social deprivation and of feeling helpless and hindered were found to have an important impact. Acid attacks lead to physical and psychological wounds in survivors. This is a multi-dimensional phenomenon involving illness, disability, and victimization, and it requires a wide range of strategies at different levels. The conceptual model derived through this study can serve as a good basis for intervention programs. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.

  12. Does systematic variation improve the reproducibility of animal experiments?

    NARCIS (Netherlands)

    Jonker, R.M.; Guenther, A.; Engqvist, L.; Schmoll, T.

    2013-01-01

    Reproducibility of results is a fundamental tenet of science. In this journal, Richter et al. [1] tested whether systematic variation in experimental conditions (heterogenization) affects the reproducibility of results. Comparing this approach with the current standard of ensuring reproducibility

  13. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  14. Global energy supply the day before yesterday, the day after tomorrow, today, tomorrow - a qualitative modeling approach

    International Nuclear Information System (INIS)

    Herrmann, D.

    2004-01-01

    Current developments and peak world market prices of oil, which also push up the prices of natural gas and other energy resources, give rise to the question of whether there is any reason to expect fundamental changes and trend reversals in energy prices and on energy markets on a medium- to long-term basis. Attempts to answer such questions about the future can be helped by looking back over the more than three hundred years of global history of the development of modern industrial-scale power supply. Over that period of time there have repeatedly been changes of boundary conditions and reversals of trends; step by step, by trial and error, a structural change has taken place from the use mainly of renewable energy resources to the primary use of fossil fuels supplemented by nuclear power. A model is presented which is able not only to describe and explain, in a consistent and plausible way, the global qualitative development of industrial-scale energy supply over the three different development periods between 1700 and 2100, but also allows a higher resolution to be achieved in terms of both content and time. The modeling approach is applied to the entire era of energy supply on an industrial scale, and should be seen as a representation of the specific perspective of this approach, offered for further discussion. (orig.)

  15. How are pharmacists in Ontario adapting to practice change? Results of a qualitative analysis using Kotter's change management model.

    Science.gov (United States)

    Teixeira, Beatriz; Gregory, Paul A M; Austin, Zubin

    2017-01-01

    The pace of practice change in community pharmacy over the past decade has been significant, yet there is little evidence documenting implementation of change in the profession. Kotter's change management model was selected as a theoretical framework for this exploratory qualitative study. Community pharmacists were interviewed using a semistructured protocol based on Kotter's model. Data were analyzed and coded using a constant-comparative iterative method aligned with the stages of change management outlined by Kotter. Twelve community pharmacists were interviewed. Three key themes emerged: 1) the profession has successfully established the urgency to, and created a climate conducive for, change; 2) the profession has been less successful in engaging and enabling the profession to actually implement change; and 3) legislative changes (for example, expansion of pharmacists' scope of practice) may have occurred prematurely, prior to other earlier stages of the change process being consolidated. As noted by most participants, allowing change is not implementing change: pharmacists reported feeling underprepared and lacking confidence to actually make change in their practices and believe that more emphasis on practical, specific implementation tactics is needed. Change management is complex and time and resource intensive. There is a need to provide personalized, detailed, context-specific implementation strategies to pharmacists to allow them to take full advantage of expanded scope of practice.

  16. Applying the information-motivation-behavioral skills model in medication adherence among Thai youth living with HIV: a qualitative study.

    Science.gov (United States)

    Rongkavilit, Chokechai; Naar-King, Sylvie; Kaljee, Linda M; Panthong, Apirudee; Koken, Juline A; Bunupuradah, Torsak; Parsons, Jeffrey T

    2010-12-01

    With disproportionately higher rates of HIV/AIDS among youth and increasing access to antiretroviral therapy (ART) in Thailand, there is a growing urgency in understanding the challenges to medication adherence confronting this population and in developing theory-based interventions to address these challenges. One potentially relevant model, the information-motivation-behavioral skills (IMB) model of adherence, was developed in Western settings characterized by a more individualistic culture in contrast to the more collectivistic culture of Thailand. We explored the application and adaptability of IMB on ART adherence among HIV-positive Thai youth through the analysis of qualitative data from a pilot motivational interviewing study. Twenty-two interview sessions from 10 HIV-positive Thai youth (17-24 years) were analyzed; 6 youth were on ART. Data support the utility of IMB as a potential framework for understanding ART adherence in this population. However, data indicate a consideration to expand the motivation construct of IMB to incorporate youths' perceived familial and social responsibilities and the need to adhere to medications for short- and long-term well-being of self, family, and society in a context of Buddhist values. These modifications to IMB could be relevant in other cultural settings with more collectivistic worldviews.

  17. Applying the Information-Motivation-Behavioral Skills Model in Medication Adherence Among Thai Youth Living with HIV: A Qualitative Study

    Science.gov (United States)

    Naar-King, Sylvie; Kaljee, Linda M.; Panthong, Apirudee; Koken, Juline A.; Bunupuradah, Torsak; Parsons, Jeffrey T.

    2010-01-01

    Abstract With disproportionately higher rates of HIV/AIDS among youth and increasing access to antiretroviral therapy (ART) in Thailand, there is a growing urgency in understanding the challenges to medication adherence confronting this population and in developing theory-based interventions to address these challenges. One potentially relevant model, the information-motivation-behavioral skills (IMB) model of adherence, was developed in Western settings characterized by a more individualistic culture in contrast to the more collectivistic culture of Thailand. We explored the application and adaptability of IMB on ART adherence among HIV-positive Thai youth through the analysis of qualitative data from a pilot motivational interviewing study. Twenty-two interview sessions from 10 HIV-positive Thai youth (17–24 years) were analyzed; 6 youth were on ART. Data support the utility of IMB as a potential framework for understanding ART adherence in this population. However, data indicate a consideration to expand the motivation construct of IMB to incorporate youths' perceived familial and social responsibilities and the need to adhere to medications for short- and long-term well-being of self, family, and society in a context of Buddhist values. These modifications to IMB could be relevant in other cultural settings with more collectivistic worldviews. PMID:21091238

  18. A Reduced Duty Hours Model for Senior Internal Medicine Residents: A Qualitative Analysis of Residents' Experiences and Perceptions.

    Science.gov (United States)

    Mathew, Rebecca; Gundy, Serena; Ulic, Diana; Haider, Shariq; Wasi, Parveen

    2016-09-01

    To assess senior internal medicine residents' experience of the implementation of a reduced duty hours model with night float, the transition from the prior 26-hour call system, and the new model's effects on resident quality of life and perceived patient safety in the emergency department and clinical teaching unit at McMaster University. Qualitative data were collected during May 2013-July 2014, through resident focus groups held prior to implementation of a reduced duty hours model and 10 to 12 months postimplementation. Data analysis was guided by a constructivist grounded theory based in a relativist paradigm. Transcripts were coded; codes were collapsed into themes. Thematic analysis revealed five themes. Residents described reduced fatigue in the early morning, counterbalanced with worsened long-term fatigue on night float blocks; anticipation of negative impacts of the loss of distributed on-call experience and on-call shift volume; an urgency to sleep postcall in anticipation of consecutive night float shifts accompanied by conflicting role demands to stay postcall for care continuity; increased handover frequency accompanied by inaccurate/incomplete communication of patients' issues; and improvement in the senior resident experience on the clinical teaching unit, with increased ownership over patient care and improved relationships with junior housestaff. A reduced duty hours model with night float has potential to improve residents' perceived fatigue on call and care continuity on the clinical teaching unit. This must be weighed against increased handover frequency and loss of the postcall day, which may negatively affect patient care and resident quality of life.

  19. Adjustment modes in the trajectory of progressive multiple sclerosis: a qualitative study and conceptual model.

    Science.gov (United States)

    Bogosian, Angeliki; Morgan, Myfanwy; Bishop, Felicity L; Day, Fern; Moss-Morris, Rona

    2017-03-01

    We examined cognitive and behavioural challenges and adaptations for people with progressive multiple sclerosis (MS) and developed a preliminary conceptual model of changes in adjustment over time. Using theoretical sampling, 34 semi-structured interviews were conducted with people with MS. Participants were between 41 and 77 years of age. Thirteen were diagnosed with primary progressive MS and 21 with secondary progressive MS. Data were analysed using a grounded theory approach. Participants described initially bracketing the illness off and carrying on their usual activities but this became problematic as the condition progressed and they employed different adjustment modes to cope with increased disabilities. Some scaled back their activities to live a more comfortable life, others identified new activities or adapted old ones, whereas at times, people disengaged from the adjustment process altogether and resigned to their condition. Relationships with partners, emotional reactions, environment and perception of the environment influenced adjustment, while people were often flexible and shifted among modes. Adjusting to a progressive condition is a fluid process. Future interventions can be tailored to address modifiable factors at different stages of the condition and may involve addressing emotional reactions concealing/revealing the condition and perceptions of the environment.

  20. Qualitative and quantitative changes in phospholipids and proteins investigated by spectroscopic techniques in animal depression model

    Science.gov (United States)

    Depciuch, J.; Sowa-Kucma, M.; Nowak, G.; Papp, M.; Gruca, P.; Misztak, P.; Parlinska-Wojtan, M.

    2017-04-01

    Depression is nowadays a civilization disease with high mortality, and chronic stress is one of its major causes. Raman, Fourier Transform Infrared (FTIR) and Ultraviolet-Visible (UV-vis) spectroscopies were used to determine the changes in the quantity and structure of phospholipids and proteins in the blood serum of rats subjected to chronic mild stress, which is a common animal depression model. Moreover, the efficiency of imipramine treatment was evaluated. It was found that chronic mild stress not only damages the structure of the phospholipids and proteins, but also decreases their level in the blood serum. A 5-week imipramine treatment slightly increased the quantity of proteins, leaving the damaged phospholipids unchanged. Structural information on phospholipids and proteins was obtained by UV-vis spectroscopy combined with the second derivative of the FTIR spectra. Indeed, the structure of proteins in the blood serum of stressed rats was normalized after imipramine therapy, while the impaired structure of phospholipids remained unaffected. These findings strongly suggest that the depression factor, chronic mild stress, may induce permanent (irreversible) damage to the phospholipid structure, identified as shortened carbon chains. This study shows a possible new application of spectroscopic techniques in the diagnosis and therapy monitoring of depression.
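
    Second-derivative FTIR analysis of the kind mentioned above is commonly performed with Savitzky-Golay filtering; the sketch below applies it to a synthetic amide I band, with the window length, polynomial order and band positions chosen purely for illustration.

      import numpy as np
      from scipy.signal import savgol_filter

      wavenumber = np.linspace(1600, 1700, 501)                    # cm^-1, amide I region
      spectrum = (0.8 * np.exp(-((wavenumber - 1655) / 8) ** 2)    # synthetic alpha-helix band
                  + 0.5 * np.exp(-((wavenumber - 1630) / 6) ** 2)  # synthetic beta-sheet band
                  + 0.01 * np.random.default_rng(0).normal(size=wavenumber.size))

      delta = wavenumber[1] - wavenumber[0]
      second_deriv = savgol_filter(spectrum, window_length=21, polyorder=3,
                                   deriv=2, delta=delta)
      # Band positions show up as minima of the second derivative:
      print("deepest second-derivative minimum near",
            round(float(wavenumber[np.argmin(second_deriv)]), 1), "cm^-1")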

  1. A qualitative model for strategic analysis of organizations. Application and alternative proposal on a study case

    Directory of Open Access Journals (Sweden)

    Santiago Ferro Moreno

    2015-12-01

    The strategic analysis of organizations is based on the internal and external environments, in order to identify positive and negative variables and factors. The interrelation and timing of these strategic forces are essential to create alternative solutions that help achieve the organizational objectives. The normative prospective approach has theoretical and methodological foundations for creating a desired future and, from it, identifying impelling and restraining forces that influence the particular problematic situation (moving from the current situation to a better one in a certain time). The aim of this article is to analyze, in a strategic way, a real case with a normative-prospective model that considers the temporal dynamics of the impact of factors and variables over time, allowing alternative solutions to be suggested. Semi-structured interviews were performed with all the employees in this case, along with structured observations and workshops with the commercial and general management. In consequence, with the results, the desired, current and improved situations were built. Additionally, forces were identified, classified and appraised, and lastly solutions were suggested. With the proposed prospective method, alternative solutions could be constructed in order to set temporary organizational objectives. No constraints were found to using the method in other cases. Keywords: Strategic forces, Normative prospective, Problematic situations, Strategies

  2. A Critical Review of Qualitative Research Methods in Evaluating Nursing Curriculum Models: Implication for Nursing Education in the Arab World

    Science.gov (United States)

    Devadas, Briliya

    2016-01-01

    Aim: The purpose of this critical literature review was to examine qualitative studies done on innovative nursing curriculums in order to determine which qualitative methods have been most effective in investigating the effectiveness of the curriculum and which would be most appropriate in an Arab Islamic country. Data Sources: At least 25 studies…

  3. Method of asymptotic expansions and qualitative analysis of finite-dimensional models in the nonlinear field theory

    International Nuclear Information System (INIS)

    Eleonskij, V.M.; Kulagin, N.E.; Novozhilova, N.S.; Silin, V.P.

    1984-01-01

    The reasons which prevent the existence of solutions of the nonlinear wave equation □u = F(u) that are periodic in time and self-localised in space are determined by the methods of the qualitative theory of dynamical systems. The correspondence between the qualitative behaviour of special (separatrix) trajectories in the phase space and asymptotic solutions of the nonlinear wave equation is analysed

  4. MRSA model of learning and adaptation: a qualitative study among the general public

    Science.gov (United States)

    2012-01-01

    Background More people in the US now die from Methicillin Resistant Staphylococcus aureus (MRSA) infections than from HIV/AIDS. Often acquired in healthcare facilities or during healthcare procedures, the extremely high incidence of MRSA infections and the dangerously low levels of literacy regarding antibiotic resistance in the general public are on a collision course. Traditional medical approaches to infection control and the conventional attitude healthcare practitioners adopt toward public education are no longer adequate to avoid this collision. This study helps us understand how people acquire and process new information and then adapt behaviours based on learning. Methods Using constructivist theory, semi-structured face-to-face and phone interviews were conducted to gather pertinent data. This allowed participants to tell their stories so their experiences could deepen our understanding of this crucial health issue. Interview transcripts were analysed using grounded theory and sensitizing concepts. Results Our findings were classified into two main categories, each of which in turn included three subthemes. First, in the category of Learning, we identified how individuals used their Experiences with MRSA, to answer the questions: What was learned? and, How did learning occur? The second category, Adaptation gave us insights into Self-reliance, Reliance on others, and Reflections on the MRSA journey. Conclusions This study underscores the critical importance of educational programs for patients, and improved continuing education for healthcare providers. Five specific results of this study can reduce the vacuum that currently exists between the knowledge and information available to healthcare professionals, and how that information is conveyed to the public. These points include: 1) a common model of MRSA learning and adaptation; 2) the self-directed nature of adult learning; 3) the focus on general MRSA information, care and prevention, and antibiotic

  5. MRSA model of learning and adaptation: a qualitative study among the general public

    Directory of Open Access Journals (Sweden)

    Rohde Rodney E

    2012-04-01

    Background More people in the US now die from Methicillin Resistant Staphylococcus aureus (MRSA) infections than from HIV/AIDS. Often acquired in healthcare facilities or during healthcare procedures, the extremely high incidence of MRSA infections and the dangerously low levels of literacy regarding antibiotic resistance in the general public are on a collision course. Traditional medical approaches to infection control and the conventional attitude healthcare practitioners adopt toward public education are no longer adequate to avoid this collision. This study helps us understand how people acquire and process new information and then adapt behaviours based on learning. Methods Using constructivist theory, semi-structured face-to-face and phone interviews were conducted to gather pertinent data. This allowed participants to tell their stories so their experiences could deepen our understanding of this crucial health issue. Interview transcripts were analysed using grounded theory and sensitizing concepts. Results Our findings were classified into two main categories, each of which in turn included three subthemes. First, in the category of Learning, we identified how individuals used their Experiences with MRSA, to answer the questions: What was learned? and, How did learning occur? The second category, Adaptation gave us insights into Self-reliance, Reliance on others, and Reflections on the MRSA journey. Conclusions This study underscores the critical importance of educational programs for patients, and improved continuing education for healthcare providers. Five specific results of this study can reduce the vacuum that currently exists between the knowledge and information available to healthcare professionals, and how that information is conveyed to the public. These points include: 1) a common model of MRSA learning and adaptation; 2) the self-directed nature of adult learning; 3) the focus on general MRSA information, care and

  6. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests

  7. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  8. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer. ... The test procedure combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin irritant substance or product is included in each test as a positive control ... high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated twice, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility.

  9. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    The present study deals with the properties of five different metals/alloys (Al-12Si, Cu-10Sn and 316L—face centered cubic structure; CoCrMo and commercially pure Ti (CP-Ti)—hexagonal closed packed structure) fabricated by selective laser melting. The room temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  10. Reproducibility in cyclostratigraphy: initiating an intercomparison project

    Science.gov (United States)

    Sinnesael, Matthias; De Vleeschouwer, David; Zeeden, Christian; Claeys, Philippe

    2017-04-01

    The study of astronomical climate forcing and the application of cyclostratigraphy have experienced spectacular growth over the last decades. In the field of cyclostratigraphy a broad range of methodological approaches exists. However, comparative studies between the different approaches are lacking. Different cases demand different approaches, but with the growing importance of the field, questions arise about reproducibility, uncertainties and standardization of results. The radioisotopic dating community, in particular, has made far-reaching efforts to improve the reproducibility and intercomparison of radioisotopic dates and their errors. To satisfy this need in cyclostratigraphy, we initiate a comparable framework for the community. The aims are to investigate and quantify the reproducibility of, and uncertainties related to, cyclostratigraphic studies and to provide a platform to discuss the merits and pitfalls of different methodologies, and their applicability. With this poster, we ask for feedback from the community on how to design this comparative framework in a useful, meaningful and productive manner. In parallel, we would like to discuss how reproducibility should be tested and what uncertainties should stand for in cyclostratigraphy. On the other hand, we intend to trigger interest in a cyclostratigraphic intercomparison project. This intercomparison project would imply the analysis of artificial and genuine geological records by individual researchers. All participants would be free to choose their method. However, a handful of criteria will be required for the outcomes to be comparable. The different results would be compared (e.g. during a workshop or a special session), and the lessons learned from the comparison could potentially be reported in a review paper. The aim of an intercomparison project is not to rank the different methods according to their merits, but to get insight into which specific methods are most suitable for which

  11. A how to guide to reproducible research

    OpenAIRE

    Whitaker, Kirstie

    2018-01-01

    This talk will discuss the perceived and actual barriers experienced by researchers attempting to do reproducible research, and give practical guidance on how they can be overcome. It will include suggestions on how to make your code and data available and usable for others (including a strong suggestion to document both clearly so you don't have to reply to lots of email questions from future users). Specifically it will include a brief guide to version control, collaboration and disseminati...

  12. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
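
    Plain global histogram equalization, the first of the enhancement steps named above, takes only a few lines of NumPy; the low-contrast test array below is synthetic, and neither the adaptive variant nor the fractional-diffusion smoothing is reproduced here.

      import numpy as np

      def equalize_histogram(image, levels=256):
          """Global histogram equalization of an 8-bit grayscale array."""
          hist, _ = np.histogram(image.ravel(), bins=levels, range=(0, levels))
          cdf = hist.cumsum().astype(float)
          cdf_min = cdf[cdf > 0][0]                          # first occupied grey level
          scale = (cdf - cdf_min) / max(cdf[-1] - cdf_min, 1)
          lut = np.clip(np.round(scale * (levels - 1)), 0, levels - 1).astype(np.uint8)
          return lut[image]                                  # apply the mapping table

      rng = np.random.default_rng(2)
      latent = rng.integers(90, 140, size=(64, 64), dtype=np.uint8)   # low-contrast stand-in image
      print(latent.std(), equalize_histogram(latent).std())           # spread of grey values increases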

  13. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  14. On the origin of reproducible sequential activity in neural circuits

    Science.gov (United States)

    Afraimovich, V. S.; Zhigulin, V. P.; Rabinovich, M. I.

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses is an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model, that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in presence of noise in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.
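
    A generalized Lotka-Volterra network with asymmetric inhibition of the type described above can be simulated directly. The sketch below uses an invented five-unit connection matrix and noise level to show activity visiting the units in a fixed, reproducible order; it is only a toy version of the setting analysed in the paper.

      import numpy as np

      N, steps, dt = 5, 10000, 0.01
      rho = np.full((N, N), 2.0)           # strong mutual inhibition
      np.fill_diagonal(rho, 1.0)           # self-interaction
      for i in range(N - 1):
          rho[i + 1, i] = 0.5              # weak inhibition onto the "next" unit -> chain i -> i+1

      rng = np.random.default_rng(0)
      a = np.full(N, 0.01)
      a[0] = 0.2                           # start near the first saddle
      trace = np.empty((steps, N))
      for t in range(steps):
          drift = a * (1.0 - rho @ a)      # Lotka-Volterra firing-rate dynamics
          a = np.clip(a + dt * drift + 1e-4 * rng.normal(size=N), 0.0, None)
          trace[t] = a

      # Despite the noise, the units reach their activity peaks in the order of the chain:
      print("peak order:", np.argsort(np.argmax(trace, axis=0)))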

  15. Student midwives' perceptions on the organisation of maternity care and alternative maternity care models in the Netherlands - a qualitative study.

    Science.gov (United States)

    Warmelink, J Catja; de Cock, T Paul; Combee, Yvonne; Rongen, Marloes; Wiegers, Therese A; Hutton, Eileen K

    2017-01-11

    A major change in the organisation of maternity care in the Netherlands is under consideration, going from an echelon system where midwives provide primary care in the community and refer to obstetricians for secondary and tertiary care, to a more integrated maternity care system involving midwives and obstetricians at all care levels. Student midwives are the future maternity care providers and they may be entering into a changing maternity care system, so inclusion of their views in the discussion is relevant. This study aimed to explore student midwives' perceptions on the current organisation of maternity care and alternative maternity care models, including integrated care. This qualitative study was based on the interpretivist/constructivist paradigm, using a grounded theory design. Interviews and focus groups with 18 female final year student midwives of the Midwifery Academy Amsterdam Groningen (AVAG) were held on the basis of a topic list, then later transcribed, coded and analysed. Students felt that inevitably there will be a change in the organisation of maternity care, and they were open to change. Participants indicated that good collaboration between professions, including a shared system of maternity notes and guidelines, and mutual trust and respect were important aspects of any alternative model. The students indicated that client-centered care and the safeguarding of the physiological, normalcy approach to pregnancy and birth should be maintained in any alternative model. Students expressed worries that the role of midwives in intrapartum care could become redundant, and thus they are motivated to take on new roles and competencies, so they can ensure their own role in intrapartum care. Final year student midwives recognise that change in the organisation of maternity care is inevitable and have an open attitude towards changes if they include good collaboration, client-centred care and safeguards for normal physiological birth. The graduating

  16. Introducing a model incorporating early integration of specialist palliative care: A qualitative research study of staff's perspectives.

    Science.gov (United States)

    Michael, Natasha; O'Callaghan, Clare; Brooker, Joanne E; Walker, Helen; Hiscock, Richard; Phillips, David

    2016-03-01

    Palliative care has evolved to encompass early integration, with evaluation of patient and organisational outcomes. However, little is known of staff's experiences and adaptations when change occurs within palliative care services. To explore staff experiences of a transition from a service predominantly focused on end-of-life care to a specialist service encompassing early integration. Qualitative research incorporating interviews, focus groups and anonymous semi-structured questionnaires. Data were analysed using a comparative approach. Service activity data were also aggregated. A total of 32 medical, nursing, allied health and administrative staff serving a 22-bed palliative care unit and community palliative service, within a large health service. Patients cared for within the new model were significantly more likely to be discharged home (7.9% increase, p = 0.003) and less likely to die in the inpatient unit (10.4% decrease, p management was considered valuable, nurses particularly found additional skill expectations challenging, and perceived patients' acute care needs as detracting from emotional and end-of-life care demands. Staff views varied on whether they regarded the new model's faster-paced work-life as consistent with fundamental palliative care principles. Less certainty about care goals, needing to prioritise care tasks, reduced shared support rituals and other losses could intensify stress, leading staff to develop personalised coping strategies. Services introducing and researching innovative models of palliative care need to ensure adequate preparation, maintenance of holistic care principles in faster work-paced contexts and assist staff dealing with demands associated with caring for patients at different stages of illness trajectories. © The Author(s) 2015.

  17. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, 17 individual reflections, and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections using eight subjects for noise and speech. The results have shown that the first-order floor and ceiling reflections are likely to individually contribute to the timbre of reproduced speech. For a noise signal, additional reflections from the left sidewall will contribute individually. The level of the reverberant field has been found

  18. Properties of galaxies reproduced by a hydrodynamic simulation

    Science.gov (United States)

    Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.

  19. Exploring Parental and Staff Perceptions of the Family-Integrated Care Model: A Qualitative Focus Group Study.

    Science.gov (United States)

    Broom, Margaret; Parsons, Georgia; Carlisle, Hazel; Kecskes, Zsuzsoka; Thibeau, Shelley

    2017-12-01

    Family-integrated care (FICare) is an innovative model of care developed at Mount Sinai Hospital, Canada, to better integrate parents into the team caring for their infant in the neonatal intensive care unit (NICU). The effects of FICare on neonatal outcomes and parental anxiety were assessed in an international multicenter randomized trial. As an Australian regional level 3 NICU that was randomized to the intervention group, we aimed to explore parent and staff perceptions of the FICare program in our dual occupancy NICU. This qualitative study took place in a level 3 NICU with 5 parent participants and 8 staff participants, using a post implementation review design. Parents and staff perceptions of FICare were explored through focus group methodology. Thematic content analysis was done on focus group transcripts. Parents and staff perceived the FICare program to have had a positive impact on parental confidence and role attainment and thought that FICare improved parent-to-parent and parent-to-staff communication. Staff reported that nurses working with families in the program performed less hands-on care and spent more time educating and supporting parents. FICare may change current NICU practice through integrating and accepting parents as active members of the infant's care team. In addition, nurse's roles may transition from bedside carer to care coordinator, educating and supporting parents during their journey through the NICU. Further research is needed to assess the long-term impact of FICare on neonates, parents, and staff.

  20. Presenting a practical model for governmental political mapping on road traffic injuries in Iran in 2008: a qualitative study.

    Science.gov (United States)

    Ainy, E; Soori, Hamid; Mahfozphoor, S; Movahedinejad, Aa

    2011-10-01

    This study was conducted to assess political mapping in relation to road traffic injury (RTI) management and prevention and to present a practical model for RTIs. A phenomenological qualitative study was developed to identify stakeholders on RTI in Iran in 2008. The designed questions were refined through systematic discussion with the relevant specialists. After receiving written consent from the main responsible stakeholders, the questionnaire was filled in by trained experts. Themes were determined and content was analysed in each part. The participants were the main responsible stakeholders. By comparing the political mappings of other countries, found through library and Internet searches, a political mapping of RTI for Iran was suggested. Subjects were 26 experts from governmental and non-governmental organizations. The main proposed leading agencies were traffic police and the presidency (13% each). Findings showed that only 31% of our political mapping was formed according to the World Health Organization (WHO). In 94% of cases, the involved organizations had unspecified roles; the reason was poor monitoring for RTI in 39% of organizations. Lack of adequate authority and suitable legislation, and lack of appropriate laws and task definitions, were reported in 94% and 18% of cases, respectively. The most essential policy to overcome problems was defined as appropriate legislation (21%), and the most frequent type of support needed was mentioned as adequate budgeting (25%). Traffic police can play the leading agency role with government support, strong leadership, appropriate legislation, defined tasks and an adequate budget.

  1. Application of qualitative response models in a relevance study of older adults' health depreciation and medical care demand.

    Science.gov (United States)

    Weng, Shuo-Chun; Chen, Yu-Chi; Chen, Ching-Yu; Cheng, Yuan-Yang; Tang, Yih-Jing; Yang, Shu-Hui; Lin, Jwu-Rong

    2017-04-01

    The effect of health depreciation in older people on medical care demand is not well understood. We tried to assess medical care demand in terms of length of hospitalization and its impact on profits as a result of health depreciation. All participants, who underwent comprehensive geriatric assessment, were from a prospective cohort study at a tertiary hospital. A total of 1191 cases between September 2008 and October 2012 were investigated. Three sets of qualitative response models were constructed to estimate the impact of older adults' health depreciation on multidisciplinary geriatric care services. Furthermore, we analyzed the factors affecting the composite end-point of rehospitalization within 14 days, re-admission to the emergency department within 3 days and patient death. Greater health depreciation in elderly patients was positively correlated with greater medical care demand. Three major components were defined as health depreciation: elderly adaptation function, geriatric syndromes and multiple chronic diseases. On admission, the better the basic living functions, the shorter the length of hospitalization (coefficient = -0.35, P age and length of hospitalization. However, factors that correlated with relatively good outcome were functional improvement after medical care services and level of disease education. An optimal allocation system for selection of cases into multidisciplinary geriatric care is required because of limited resources. Outcomes will improve with health promotion and preventive care services. Geriatr Gerontol Int 2017; 17: 645-652. © 2016 Japan Geriatrics Society.
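
    Qualitative response models of the binary kind referred to above are typically fitted as logit (or probit) regressions. The sketch below fits a logistic regression of a composite adverse outcome on a few hypothetical health-depreciation indicators using simulated data; the variable names, effect sizes and sample are placeholders, not the study's cohort.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 1191
      adl_score = rng.normal(70, 15, n)           # basic living function on admission (hypothetical)
      geriatric_syndromes = rng.poisson(2, n)     # number of geriatric syndromes
      chronic_diseases = rng.poisson(3, n)        # number of chronic diseases
      lin_pred = (-2.0 - 0.03 * (adl_score - 70)
                  + 0.3 * geriatric_syndromes + 0.2 * chronic_diseases)
      outcome = rng.binomial(1, 1.0 / (1.0 + np.exp(-lin_pred)))   # composite end-point

      X = sm.add_constant(np.column_stack([adl_score, geriatric_syndromes, chronic_diseases]))
      fit = sm.Logit(outcome, X).fit(disp=0)
      print(fit.params)   # greater health depreciation -> higher odds of the composite end-point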

  2. Accuracy, reproducibility, and time efficiency of dental measurements using different technologies.

    Science.gov (United States)

    Grünheid, Thorsten; Patel, Nishant; De Felippe, Nanci L; Wey, Andrew; Gaillard, Philippe R; Larson, Brent E

    2014-02-01

    Historically, orthodontists have taken dental measurements on plaster models. Technological advances now allow orthodontists to take these measurements on digital models. In this study, we aimed to assess the accuracy, reproducibility, and time efficiency of dental measurements taken on 3 types of digital models. emodels (GeoDigm, Falcon Heights, Minn), SureSmile models (OraMetrix, Richardson, Tex), and AnatoModels (Anatomage, San Jose, Calif) were made for 30 patients. Mesiodistal tooth-width measurements taken on these digital models were timed and compared with those on the corresponding plaster models, which were used as the gold standard. Accuracy and reproducibility were assessed using the Bland-Altman method. Differences in time efficiency were tested for statistical significance with 1-way analysis of variance. Measurements on SureSmile models were the most accurate, followed by those on emodels and AnatoModels. Measurements taken on SureSmile models were also the most reproducible. Measurements taken on SureSmile models and emodels were significantly faster than those taken on AnatoModels and plaster models. Tooth-width measurements on digital models can be as accurate as, and might be more reproducible and significantly faster than, those taken on plaster models. Of the models studied, the SureSmile models provided the best combination of accuracy, reproducibility, and time efficiency of measurement. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
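
    The accuracy and reproducibility analysis above relies on the Bland-Altman method. The following minimal Python sketch, with made-up measurement arrays standing in for the digital and plaster model data, shows how the bias and 95% limits of agreement are typically computed; it is not the study's code.

        import numpy as np

        # Hypothetical paired mesiodistal tooth-width measurements (mm):
        # one value per tooth from a digital model and from the plaster "gold standard".
        digital = np.array([8.1, 7.9, 6.8, 9.4, 7.2, 10.1])
        plaster = np.array([8.0, 8.0, 6.9, 9.3, 7.4, 10.0])

        diff = digital - plaster                    # per-tooth differences between methods
        bias = diff.mean()                          # systematic offset
        sd = diff.std(ddof=1)                       # spread of the differences
        loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

        print(f"bias = {bias:.3f} mm, limits of agreement = [{loa[0]:.3f}, {loa[1]:.3f}] mm")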

  3. Modeling Stop-and-Go Waves in Pedestrian Dynamics

    OpenAIRE

    Portz, Andrea; Seyfried, Armin

    2010-01-01

    Several spatially continuous pedestrian dynamics models have been validated against empirical data. We attempt to reproduce the experimental fundamental diagram (velocity versus density) with simulations. In addition to this quantitative criterion, we use the reproduction of stop-and-go waves, a phenomenon characteristic of single-file movement, as a qualitative criterion. Only one of the three investigated models satisfies both criteria.
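
    The three models evaluated in the paper are not specified in this abstract. As a generic illustration of how stop-and-go waves can emerge in single-file motion, the sketch below integrates an optimal-velocity-type following model on a ring with hypothetical parameters; for suitable parameter choices, small perturbations of the uniform state grow into alternating stopped and moving clusters.

        import numpy as np

        # Optimal-velocity-type single-file model on a ring (hypothetical parameters).
        N, L = 30, 30.0            # pedestrians, ring length (m)
        a, v0, d0 = 1.0, 1.2, 0.4  # sensitivity (1/s), free speed (m/s), headway scale (m)
        dt, steps = 0.05, 4000

        x = np.linspace(0.0, L, N, endpoint=False) + 0.01 * np.random.randn(N)
        v = np.zeros(N)

        def v_opt(headway):
            """Optimal velocity as a smooth, saturating function of the gap ahead."""
            return v0 * np.maximum(np.tanh((headway - d0) / d0), 0.0)

        for _ in range(steps):
            gap = (np.roll(x, -1) - x) % L      # distance to the person in front
            v += dt * a * (v_opt(gap) - v)      # relax towards the optimal velocity
            v = np.maximum(v, 0.0)              # no backward motion
            x = (x + dt * v) % L

        # A wide spread of instantaneous speeds indicates coexisting stopped/moving clusters.
        print("speed range after relaxation:", v.min(), v.max())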

  4. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  5. Application of qualitative reasoning with functional knowledge represented by Multilevel Flow Modeling to diagnosis of accidental situation in nuclear power plant

    International Nuclear Information System (INIS)

    Yoshida, Kazuo; Tanabe, Fumiya; Kawase, Katumi.

    1996-01-01

    It has been proposed to use the Multilevel Flow Modeling (MFM) framework by M. Lind for functional knowledge representation in qualitative reasoning about a complex process system such as a nuclear power plant. Building a knowledge base within the MFM framework makes it possible to represent functional characteristics at different levels of abstraction and aggregation. A pilot inference system based on qualitative reasoning with MFM has been developed to diagnose the cause of abnormal events in a typical PWR power plant. Some single-failure events have been diagnosed with this system to verify the proposed method. In the verification study, the effects of this knowledge representation on the efficiency of reasoning and on the ambiguity of qualitative reasoning were also investigated. (author)
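
    The MFM formalism itself is not reproduced here. As a toy illustration only of the kind of qualitative reasoning involved, the sketch below propagates a deviation observed at one flow function upstream along a hypothetical mass-flow path to collect candidate causes; the component names and the single rule are illustrative assumptions, not part of MFM or of the pilot system.

        # Hypothetical mass-flow path in a cooling loop, listed downstream -> upstream.
        upstream_of = {
            "steam_generator_outlet": "steam_generator",
            "steam_generator": "primary_pump",
            "primary_pump": "reactor_vessel",
        }

        def candidate_causes(observed_at, deviation="low_flow"):
            """Walk upstream from the component where a deviation is observed,
            returning each upstream function as a qualitative candidate cause."""
            causes, node = [], observed_at
            while node in upstream_of:
                node = upstream_of[node]
                causes.append((node, f"could explain {deviation} at {observed_at}"))
            return causes

        for comp, reason in candidate_causes("steam_generator_outlet"):
            print(comp, "-", reason)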

  6. Ratio-scaling of listener preference of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian

    2005-01-01

    -trivial assumption in the case of complex spatial sounds. In the present study the Bradley-Terry-Luce (BTL) model was employed to investigate the unidimensionality of preference judgments made by 40 listeners on multichannel reproduced sound. Short musical excerpts played back in eight reproduction modes (mono...... music). As a main result, the BTL model was found to predict the choice frequencies well. This implies that listeners were able to integrate the complex nature of the sounds into a unidimensional preference judgment. It further implies the existence of a preference scale on which the reproduction modes...
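
    Because the abstract above is truncated, only the core method is illustrated here: the Bradley-Terry-Luce model assigns each reproduction mode a worth parameter so that the probability of preferring mode i over mode j is pi_i / (pi_i + pi_j). The sketch below runs the classical minorization-maximization (iterative scaling) update on a small, made-up win-count matrix; it is not the study's analysis.

        import numpy as np

        # w[i, j] = number of times reproduction mode i was preferred over mode j (made-up data).
        w = np.array([[0, 6, 7],
                      [4, 0, 5],
                      [3, 5, 0]], dtype=float)
        n = w + w.T                      # total comparisons per pair
        pi = np.ones(w.shape[0])         # initial worth parameters

        for _ in range(200):             # MM / iterative-scaling updates
            wins = w.sum(axis=1)
            denom = n / (pi[:, None] + pi[None, :])
            np.fill_diagonal(denom, 0.0)
            pi = wins / denom.sum(axis=1)
            pi /= pi.sum()               # fix the scale (worths sum to one)

        print("estimated BTL worths:", np.round(pi, 3))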

  7. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    Full Text Available The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.

  8. Qualitative Student Models,

    Science.gov (United States)

    1986-05-01

  9. Elaboration of the Reciprocal-Engagement Model of Genetic Counseling Practice: a Qualitative Investigation of Goals and Strategies.

    Science.gov (United States)

    Redlinger-Grosse, Krista; Veach, Patricia McCarthy; LeRoy, Bonnie S; Zierhut, Heather

    2017-12-01

    As the genetic counseling field evolves, a comprehensive model of practice is critical. The Reciprocal-Engagement Model (REM) consists of 5 tenets and 17 goals. Lacking in the REM, however, are well-articulated counselor strategies and behaviors. The purpose of the present study was to further elaborate and provide supporting evidence for the REM by identifying and mapping genetic counseling strategies to the REM goals. A secondary, qualitative analysis was conducted on data from two prior studies: 1) focus group results of genetic counseling outcomes (Redlinger-Grosse et al., Journal of Genetic Counseling, 2015); and 2) genetic counselors' examples of successful and unsuccessful genetic counseling sessions (Geiser et al. 2009). Using directed content analysis, 337 unique strategies were extracted from focus group data. A Q-sort of the 337 strategies yielded 15 broader strategy domains that were then mapped to the successful and unsuccessful session examples. The differing prevalence of strategy domains identified in successful sessions versus the prevalence of domains identified as lacking in unsuccessful sessions provides further support for the REM goals. The most prevalent domains for successful sessions were Information Giving and Use Psychosocial Skills and Strategies; and for unsuccessful sessions, Information Giving and Establish Working Alliance. Identified strategies support the REM's reciprocal nature, especially with regard to addressing patients' informational and psychosocial needs. Patients' contributions to the success (or lack thereof) of sessions were also noted, supporting a REM tenet that individual characteristics and the counselor-patient relationship are central to processes and outcomes. The elaborated REM could be used as a framework for certain graduate curricular objectives, and REM components could also inform process and outcomes research studies to document and further characterize genetic counselor strategies.

  10. Neonatal intensive care nursing curriculum challenges based on context, input, process, and product evaluation model: A qualitative study

    Directory of Open Access Journals (Sweden)

    Mansoureh Ashghali-Farahani

    2018-01-01

    Full Text Available Background: Weakness of curriculum development in nursing education results in a lack of professional skills in graduates. This study was done on master's students in nursing to evaluate challenges of the neonatal intensive care nursing curriculum based on the context, input, process, and product (CIPP) evaluation model. Materials and Methods: This study was conducted with a qualitative approach, which was completed according to the CIPP evaluation model. The study was conducted from May 2014 to April 2015. The research community included neonatal intensive care nursing master's students, the graduates, faculty members, neonatologists, nurses working in the neonatal intensive care unit (NICU), and mothers of infants who were hospitalized in such wards. Purposeful sampling was applied. Results: The data analysis showed that there were two main categories: "inappropriate infrastructure" and "unknown duties," which influenced the context formation of the NICU master's curriculum. The input was formed by five categories, including "biomedical approach," "incomprehensive curriculum," "lack of professional NICU nursing mentors," "inappropriate admission process of NICU students," and "lack of NICU skill labs." Three categories were extracted in the process, including "more emphasis on theoretical education," "the overlap of credits with each other and the inconsistency among the mentors," and "ineffective assessment." Finally, five categories were extracted in the product, including "preferring routine work instead of professional job," "tendency to leave the job," "clinical incompetency of graduates," "the conflict between graduates and nursing staff expectations," and "dissatisfaction of graduates." Conclusions: Some changes are needed in the NICU master's curriculum by considering the nursing experts' comments and evaluating the consequences of such a program by them.

  11. Quantitative assessment of key parameters in qualitative vulnerability methods applied in karst systems based on an integrated numerical modelling approach

    Science.gov (United States)

    Doummar, Joanna; Kassem, Assaad

    2017-04-01

    In the framework of a three-year PEER (USAID/NSF) funded project, flow in a karst system in Lebanon (Assal), dominated by snow and semi-arid conditions, was simulated and successfully calibrated using an integrated numerical model (MIKE-She 2016) based on high resolution input data and detailed catchment characterization. Point source infiltration and fast flow pathways were simulated by a bypass function and a highly conductive lens, respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and assessing their influence on recharge signals in the different hydrological karst compartments (atmosphere, unsaturated zone and saturated zone) based on the integrated numerical model. These parameters are usually attributed different weights according to their estimated impact on groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and to outline parameters that are not accounted for in standard methods, but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from the atmosphere to the unsaturated zone to the saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., varying slopes, land cover, precipitation intensity, and soil properties, as well as point source infiltration). Furthermore, a sensitivity analysis of individual or coupled major parameters allows quantifying their impact on recharge and indirectly on vulnerability. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each one of them, based on a quantitative approach.
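
    The sensitivity analysis described above can be illustrated with a one-at-a-time perturbation loop. The recharge function below is a stand-in placeholder (the actual study used a calibrated MIKE SHE model), and the parameter names, baseline values, and perturbation range are hypothetical.

        # One-at-a-time sensitivity sketch around a hypothetical baseline parameter set.
        baseline = {"soil_conductivity": 1e-5, "bypass_fraction": 0.3, "crop_coefficient": 0.8}

        def annual_recharge(p):
            """Placeholder for the integrated model: returns recharge (mm/yr)."""
            return 400 * p["bypass_fraction"] + 2e7 * p["soil_conductivity"] - 150 * p["crop_coefficient"]

        base = annual_recharge(baseline)
        for name in baseline:
            for factor in (0.8, 1.2):                      # +/-20% perturbation
                perturbed = dict(baseline, **{name: baseline[name] * factor})
                delta = annual_recharge(perturbed) - base
                print(f"{name} x{factor}: recharge changes by {delta:+.1f} mm/yr")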

  12. Qualitative modeling identifies IL-11 as a novel regulator in maintaining self-renewal in human pluripotent stem cells

    Directory of Open Access Journals (Sweden)

    Hedi ePeterson

    2013-10-01

    Full Text Available Pluripotency in human embryonic stem cells (hESCs) and induced pluripotent stem cells (iPSCs) is regulated by three transcription factors: OCT3/4, SOX2 and NANOG. To fully exploit the therapeutic potential of these cells it is essential to have a good mechanistic understanding of the maintenance of self-renewal and pluripotency. In this study, we demonstrate a powerful systems biology approach in which we first expand a literature-based network encompassing the core regulators of pluripotency by assessing the behaviour of genes targeted by perturbation experiments. We focused our attention on highly regulated genes encoding cell surface and secreted proteins, as these can be more easily manipulated by the use of inhibitors or recombinant proteins. Qualitative modeling based on combining Boolean networks and in silico perturbation experiments was employed to identify novel pluripotency-regulating genes. We validated Interleukin-11 (IL-11) and demonstrate that this cytokine is a novel pluripotency-associated factor capable of supporting self-renewal in the absence of exogenously added bFGF in culture. To date, the various protocols for hESC maintenance require supplementation with bFGF to activate the Activin/Nodal branch of the TGFβ signaling pathway. Additional evidence supporting our findings is that IL-11 belongs to the same protein family as LIF, which is known to be necessary for maintaining pluripotency in mouse but not in human ESCs. These cytokines operate through the same gp130 receptor, which interacts with Janus kinases. Our finding might explain why mESCs are in a more naïve cell state compared to hESCs and how to convert primed hESCs back to the naïve state. Taken together, our integrative modeling approach has identified novel genes as putative candidates to be incorporated into the expansion of the current gene regulatory network responsible for inducing and maintaining pluripotency.
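
    As a schematic illustration of the Boolean modeling and in silico perturbation strategy described above, the sketch below updates a tiny, hypothetical three-node network synchronously and compares the unperturbed attractor with an in silico knockout. The nodes and update rules are illustrative assumptions, not the published pluripotency network.

        # Toy Boolean network: three hypothetical nodes with made-up update rules.
        def step(state, knockout=None):
            nanog, oct4, diff = state["NANOG"], state["OCT4"], state["DIFF"]
            new = {
                "NANOG": oct4 and not diff,   # sustained by OCT4, repressed by differentiation
                "OCT4":  nanog or oct4,       # self-maintaining core loop
                "DIFF":  not nanog,           # differentiation when NANOG is off
            }
            if knockout:
                new[knockout] = False         # in silico loss-of-function perturbation
            return new

        def run(state, knockout=None, steps=10):
            for _ in range(steps):
                state = step(state, knockout)
            return state

        start = {"NANOG": True, "OCT4": True, "DIFF": False}
        print("wild type attractor:  ", run(dict(start)))
        print("NANOG knockout result:", run(dict(start), knockout="NANOG"))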

  13. Assessing the effect of quantitative and qualitative predictors on gastric cancer individuals survival using hierarchical artificial neural network models.

    Science.gov (United States)

    Amiri, Zohreh; Mohammad, Kazem; Mahmoudi, Mahmood; Parsaeian, Mahbubeh; Zeraati, Hojjat

    2013-01-01

    There are numerous unanswered questions in the application of artificial neural network models for analysis of survival data. In most studies, independent variables have been studied as qualitative dichotomous variables, and results of using discrete and continuous quantitative, ordinal, or multinomial categorical predictive variables in these models are not well understood in comparison to conventional models. This study was designed and conducted to examine the application of these models in order to determine the survival of gastric cancer patients, in comparison to the Cox proportional hazards model. We studied the postoperative survival of 330 gastric cancer patients who underwent surgery at a surgical unit of the Iran Cancer Institute over a five-year period. Covariates of age, gender, history of substance abuse, cancer site, type of pathology, presence of metastasis, stage, and number of complementary treatments were entered in the models, and survival probabilities were calculated at 6, 12, 18, 24, 36, 48, and 60 months using the Cox proportional hazards and neural network models. We estimated coefficients of the Cox model and the weights in the neural network (with 3, 5, and 7 nodes in the hidden layer) in the training group, and used them to derive predictions in the study group. Predictions with these two methods were compared with those of the Kaplan-Meier product limit estimator as the gold standard. Comparisons were performed with the Friedman and Kruskal-Wallis tests. Survival probabilities at different times were determined using the Cox proportional hazards and a neural network with three nodes in the hidden layer; the ratios of standard errors with these two methods to the Kaplan-Meier method were 1.1593 and 1.0071, respectively, which revealed a significant difference between Cox and Kaplan-Meier (P neural network, and the neural network and the standard (Kaplan-Meier), as well as better accuracy for the neural network (with 3 nodes in the hidden layer
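
    Since the Kaplan-Meier product-limit estimator serves as the gold standard here, a minimal numpy implementation is sketched below with made-up follow-up data; it is not the study's code.

        import numpy as np

        # Hypothetical follow-up times (months) and event indicators (1 = death, 0 = censored).
        time = np.array([6, 12, 12, 18, 24, 30, 36, 48, 60, 60])
        event = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 0])

        def kaplan_meier(time, event):
            """Product-limit estimate S(t) at each distinct event time."""
            surv, s = [], 1.0
            for t in np.unique(time[event == 1]):
                at_risk = np.sum(time >= t)                # patients still under observation
                deaths = np.sum((time == t) & (event == 1))
                s *= 1.0 - deaths / at_risk
                surv.append((t, s))
            return surv

        for t, s in kaplan_meier(time, event):
            print(f"S({t:>2} months) = {s:.3f}")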

  14. Ecological Applications of Qualitative Reasoning

    NARCIS (Netherlands)

    Bredeweg, B.; Salles, P.; Neumann, M.; Recknagel, F.

    2006-01-01

    Representing qualitative ecological knowledge is of great interest for ecological modelling. QR provides means to build conceptual models and to make qualitative knowledge explicit, organized and manageable by means of symbolic computing. This chapter discusses the main characteristics of QR using

  15. Convergence of macrostates under reproducible processes

    International Nuclear Information System (INIS)

    Rau, Jochen

    2010-01-01

    I show that whenever a system undergoes a reproducible macroscopic process the mutual distinguishability of macrostates, as measured by their relative entropy, diminishes. This extends the second law which regards only ordinary entropies, and hence only the distinguishability between macrostates and one specific reference state (equidistribution). The new result holds regardless of whether the process is linear or nonlinear. Its proof hinges on the monotonicity of quantum relative entropy under arbitrary coarse grainings, even those that cannot be represented by trace-preserving completely positive maps.
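
    For reference, the familiar special case of the monotonicity invoked above, stated for density operators rho, sigma and any trace-preserving completely positive map Phi, reads (in LaTeX notation):

        S(\rho \,\|\, \sigma) = \mathrm{Tr}\,\rho\,(\log\rho - \log\sigma),
        \qquad
        S\!\big(\Phi(\rho)\,\|\,\Phi(\sigma)\big) \;\le\; S(\rho \,\|\, \sigma).

    Coarse graining can therefore only decrease the relative entropy, and hence the distinguishability, of two states; the result above extends this to more general reproducible coarse grainings.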

  16. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  17. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often boring process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third party to understand, particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.
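
    The provenance-capture idea can be illustrated with a small, generic sketch (not the tool described above): alongside each derived dataset, write a sidecar record of input files, their content hashes, the rule-set version, and a timestamp, so that re-running the analysis regenerates an accurate record. File names and fields are hypothetical.

        import hashlib, json, datetime, pathlib

        def sha256(path):
            """Content hash so a provenance record pins exact input versions."""
            return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

        def write_provenance(output_path, inputs, ruleset_version):
            record = {
                "output": str(output_path),
                "generated_at": datetime.datetime.utcnow().isoformat() + "Z",
                "ruleset_version": ruleset_version,
                "inputs": [{"path": str(p), "sha256": sha256(p)} for p in inputs],
            }
            sidecar = pathlib.Path(str(output_path) + ".prov.json")
            sidecar.write_text(json.dumps(record, indent=2))
            return sidecar

        # Example call (hypothetical file names):
        # write_provenance("landcover_v2.tif", ["slope.tif", "rainfall.tif"], ruleset_version="1.4.0")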

  18. How well can DFT reproduce key interactions in Ziegler-Natta systems?

    KAUST Repository

    Correa, Andrea; Bahri-Laleh, Naeimeh; Cavallo, Luigi

    2013-01-01

    The performance of density functional theory in reproducing some of the main interactions occurring in MgCl2-supported Ziegler-Natta catalytic systems is assessed. Eight model systems, representatives of key interactions occurring in Ziegler

  19. Qualitative Economics

    DEFF Research Database (Denmark)

    Fast, Michael; Clark, Woodrow

    2012-01-01

    the everyday economic life is the central issue and is discussed from the perspective of interactionism. It is a perspective developed from the Lifeworld philosophical traditions, such as symbolic interactionism and phenomenology, seeking to develop the thinking of economics. The argument is that economics...... and the process of thinking, e.g. the ontology and the epistemology. Keywords: qualitative, interaction, process, organizing, thinking, perspective, epistemology....

  20. Thompson revisited. Ein empirisch fundiertes Modell zur Qualität von „Quality-TV“ aus Nutzersicht

    Directory of Open Access Journals (Sweden)

    Michael Harnischmacher

    2015-07-01

    Full Text Available What does the attribute "Quality TV" actually mean for the audience? By which criteria do viewers judge whether a series is quality television or not? In recipient-oriented quality research on television series, almost all influential models to date have been derived qualitatively, the best known certainly being the 12 criteria proposed by Robert J. Thompson back in 1996. The present study addresses the question of whether these quality criteria are in fact the "right" ones. Are they relevant to series viewers when assessing whether a programme is "Quality TV" or not? So far, an empirical foundation for the individual criteria has been lacking, and it is equally unclear whether there is a ranking among them. Which are more important, and which less important, for the perception of a series as a quality product? The study operationalized Thompson's proposal (drawing on further studies on the topic, e.g. Cardwell 2007; Feuer 2007; Dreher 2010; Blanchett 2011; Kumpf 2011) and tested it in a standardized survey of the users of 13 online forums on quality series (n=1382). On the basis of this survey, it can be shown statistically which criteria viewers regard as particularly important and how these can be aggregated into quality factors that actually describe the phenomenon of "Quality TV" from the audience's perspective.

  1. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    Science.gov (United States)

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.

  2. Assessment of the relationships between morphometric characteristics of relief with quantitative and qualitative characteristics of forests using ASTER and SRTM digital terrain models

    OpenAIRE

    D. M. Chernikhovsky

    2017-01-01

    The article presents the results of an assessment of the relationships between quantitative and qualitative characteristics of forests and morphometric characteristics of relief for a model plot in the Nanayskoe forest district of Khabarovsk Territory. The relevance of the investigation is connected with the need to improve the system of forest evaluation operations in the Russian Federation, including through use of the landscape approach. The tasks of the investigation were assessment of rela...

  3. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    Science.gov (United States)

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, combine datasets and hence work collectively, but
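
    The QSAR-ML format itself is XML-based and is not reproduced here. The sketch below only illustrates the underlying idea of a reproducible dataset definition: record the structures, descriptor identifiers, and implementation versions together, so the same descriptor matrix can be regenerated later. The identifiers and fields are hypothetical, not the actual ontology terms or schema.

        import json

        # Hypothetical, minimal dataset manifest in the spirit of QSAR-ML (not the real schema).
        dataset = {
            "structures": ["CCO", "c1ccccc1", "CC(=O)O"],        # SMILES of the compounds
            "descriptors": [
                {"id": "descriptor:molecular_weight", "implementation": "cdk", "version": "2.8"},
                {"id": "descriptor:xlogp",            "implementation": "cdk", "version": "2.8"},
            ],
            "responses": {"CCO": 0.12, "c1ccccc1": 1.56, "CC(=O)O": -0.31},
        }

        with open("qsar_dataset_manifest.json", "w") as fh:
            json.dump(dataset, fh, indent=2)   # versioned alongside the raw data for reproducibility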

  4. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    Directory of Open Access Journals (Sweden)

    Spjuth Ola

    2010-06-01

    Full Text Available Abstract Background QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constrain collaborations and re-use of data. Results We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join

  5. Empirical questions for collective-behaviour modelling

    Indian Academy of Sciences (India)

    The collective behaviour of groups of social animals has been an active topic of study ... Models have been successful at reproducing qualitative features of ... quantitative and detailed empirical results for a range of animal systems. ... standard method [23], the redundant information recorded by the cameras can be used to.

  6. Reproducibility of morphometric X-ray absorptiometry

    International Nuclear Information System (INIS)

    Culton, N.; Pocock, N.

    1999-01-01

    Full text: Morphometric X-ray absorptiometry (MXA) using DXA is potentially a useful clinical tool which may provide additional vertebral fracture information with low radiation exposure. While morphometric analysis is semi-automated, operator intervention is crucial for the accurate positioning of the six data points quantifying the vertebral heights at the anterior, middle and posterior positions. Our study evaluated the intra-operator reproducibility of MXA in an elderly patient population and assessed the effect of training and experience on vertebral height precision. Ten patients, with a mean lumbar T score of -2.07, were studied. Images were processed by a trained operator who initially had only limited morphometric experience. The analysis of the data files was repeated at 2 and 6 weeks, during which time the operator had obtained further experience and training. The intra-operator precision of vertebral height measurements was calculated using the three separate combinations of paired analyses, and expressed as the coefficient of variation. This study confirms the importance of adequate training and attention to detail in MXA analysis. The data indicate that the precision of MXA is adequate for its use in the diagnosis of vertebral fractures, based on a 20% deformity criterion. Use of MXA for monitoring would require approximately an 8% change in vertebral heights to achieve statistical significance
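
    Intra-operator precision of repeated analyses of this kind is commonly summarized as a root-mean-square coefficient of variation across patients; a minimal sketch with made-up triplicate measurements follows (not the study's data or code).

        import numpy as np

        # Rows: patients; columns: the three repeated analyses of one vertebral height (mm).
        heights = np.array([
            [22.1, 21.8, 22.3],
            [19.5, 19.9, 19.7],
            [24.0, 23.6, 23.9],
        ])

        cv = heights.std(axis=1, ddof=1) / heights.mean(axis=1)   # per-patient CV
        rms_cv = np.sqrt(np.mean(cv ** 2)) * 100                  # pooled precision, in percent
        print(f"RMS coefficient of variation: {rms_cv:.2f}%")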

  7. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
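
    The Dice coefficient used above to compare subcortical classifications across operating systems is straightforward to compute from two label masks; a minimal sketch with toy arrays follows.

        import numpy as np

        def dice(mask_a, mask_b):
            """Dice overlap: 2|A ∩ B| / (|A| + |B|) for boolean label masks."""
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Toy example: the same structure segmented on two platforms, differing by one voxel.
        seg_platform_a = np.array([[0, 1, 1], [0, 1, 0]])
        seg_platform_b = np.array([[0, 1, 1], [1, 1, 0]])
        print(f"Dice = {dice(seg_platform_a, seg_platform_b):.3f}")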

  8. Environment and industrial economy: Challenge of reproducibility

    International Nuclear Information System (INIS)

    Rullani, E.

    1992-01-01

    Historically and methodologically counterposed until now, the environmentalist and the economic approaches to environmental problems need to be integrated into a new approach that considers, on one side, the relevance of ecological equilibria for economic systems and, on the other side, the economic dimension (in terms of investments and transformations in the production system) of any attempt to achieve a better environment. To achieve this integration, both approaches are compelled to give up some cultural habits that have characterized them and have contributed to over-emphasizing the opposition between them. The article shows that both approaches can converge into a new one, in which the environment is no longer only a holistic, non-negotiable natural external limit to human activity (as in the environmentalist approach), nor simply a scarce and exhaustible resource (as economics tends to consider it); the environment should instead become part of the reproducibility sphere, or, in other words, it must be regarded as part of the output that the economic system provides. Thanks to scientific and technological advances, this new approach is becoming possible for an increasing class of environmental problems. To this end, an evolution is required that can convert environmental goals into investment and technological innovation goals and communicate to firms the value society assigns to environmental resources. This value, the author suggests, should correspond to the reproduction cost. Various examples of this new approach are analyzed and discussed

  9. Reproducibility of temporomandibular joint tomography. Influence of shifted X-ray beam and tomographic focal plane on reproducibility

    International Nuclear Information System (INIS)

    Saito, Masashi

    1999-01-01

    Proper tomographic focal plane and x-ray beam direction are the most important factors in obtaining accurate images of the temporomandibular joint (TMJ). In this study, to clarify the magnitude of the effect of these two factors on image quality, we evaluated the reproducibility of tomograms by measuring the distortion when the x-ray beam was shifted from the correct center of the object. The effects of deviation of the tomographic focal plane on image quality were evaluated by the MTF (Modulation Transfer Function). Two types of tomograms, one a plane type and the other a rotational type, were used in this study. A TMJ model was made from Teflon for the purpose of evaluation with a shifted x-ray beam. The x-ray images were obtained by tilting the model from 0 to 10 degrees in 2-degree increments. These x-ray images were processed for computer image analysis, and then the distance between the condyle and the joint space was measured. To evaluate the influence of a shifted tomographic focal plane on image sharpness, the x-ray images from each setting were analyzed by MTF. To obtain the MTF, a ''knife-edge'' made from Pb was used. The images were scanned with a microdensitometer at the central focal plane and at 0, 0.5, and 1 mm away, respectively. The density curves were analyzed by Fourier analysis and the MTF was calculated. The reproducibility of the images worsened as the x-ray beam was shifted. This tendency was similar for both tomograms. Object characteristics, such as the anterior and posterior portions of the joint space, affected the deterioration of the reproducibility of the tomography. Deviation of the tomographic focal plane also decreased the reproducibility of the x-ray images. The rotational type showed a better MTF, but it became seriously unfavorable with slight changes of the tomographic focal plane. In contrast, the plane type showed a lower MTF, but the image was stable under shifts of the tomographic focal plane. (author)
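
    The MTF evaluation described above follows the usual edge-spread route: differentiate the measured edge profile to obtain the line-spread function, Fourier-transform it, and normalize. The sketch below demonstrates this chain on a synthetic blurred edge (made-up data, not the study's densitometer scans).

        import numpy as np

        # Synthetic edge-spread function (ESF): an ideal Pb knife-edge blurred by the imaging chain.
        x = np.linspace(-5, 5, 512)                 # position across the edge (mm)
        esf = 0.5 * (1 + np.tanh(x / 0.4))          # made-up blurred edge profile

        lsf = np.gradient(esf, x)                   # line-spread function
        lsf /= lsf.sum()                            # normalize area to 1

        mtf = np.abs(np.fft.rfft(lsf))              # modulation transfer function
        mtf /= mtf[0]                               # MTF(0) = 1 by convention
        freqs = np.fft.rfftfreq(len(lsf), d=x[1] - x[0])   # spatial frequency (cycles/mm)

        # Report the frequency at which the MTF first drops below 0.5.
        f50 = freqs[np.argmax(mtf < 0.5)]
        print(f"MTF falls below 0.5 at about {f50:.2f} cycles/mm")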

  10. Efficient and reproducible identification of mismatch repair deficient colon cancer

    DEFF Research Database (Denmark)

    Joost, Patrick; Bendahl, Pär-Ola; Halvarsson, Britta

    2013-01-01

    BACKGROUND: The identification of mismatch-repair (MMR) defective colon cancer is clinically relevant for diagnostic, prognostic and potentially also for treatment predictive purposes. Preselection of tumors for MMR analysis can be obtained with predictive models, which need to demonstrate ease...... of application and favorable reproducibility. METHODS: We validated the MMR index for the identification of prognostically favorable MMR deficient colon cancers and compared performance to 5 other prediction models. In total, 474 colon cancers diagnosed ≥ age 50 were evaluated with correlation between...... clinicopathologic variables and immunohistochemical MMR protein expression. RESULTS: Female sex, age ≥60 years, proximal tumor location, expanding growth pattern, lack of dirty necrosis, mucinous differentiation and presence of tumor-infiltrating lymphocytes significantly correlated with MMR deficiency. Presence...

  11. Assessment of the relationships between morphometric characteristics of relief with quantitative and qualitative characteristics of forests using ASTER and SRTM digital terrain models

    Directory of Open Access Journals (Sweden)

    D. M. Chernikhovsky

    2017-06-01

    Full Text Available The article presents the results of an assessment of relationships between quantitative and qualitative characteristics of forests and morphometric characteristics of relief for a model plot in the Nanayskoe forest district of Khabarovsk Territory. The relevance of the investigation is connected with the need to improve the system of forest evaluation operations in the Russian Federation, including through use of the landscape approach. The tasks of the investigation were to assess the relationships between characteristics of relief and characteristics of forest vegetation cover at different levels of forest management, to identify the morphometric characteristics of relief that are important for the structure and productivity of forests, and to compare the results obtained with the ASTER and SRTM digital terrain models. Geoinformatic projects were formed for the model plot on the basis of the digital terrain models together with data from forest mensuration and the State (National) Forest Inventory. On the basis of the developed method, using geoinformatic technologies, morphometric characteristics of relief (average height, standard deviation of height, entropy, exposition and gradient of slopes, indexes of ruggedness and roughness) and quantitative and qualitative characteristics of forests were estimated. A multifactor regression analysis was performed with forest characteristics as dependent variables and morphometric characteristics of relief as independent variables. As a result of the research, the set of morphometric characteristics of relief able to influence the variability of quantitative and qualitative characteristics of forests was identified. A set of linear regression equations able to explain 30–50 % of the variability of the dependent variables was obtained. The regression equations obtained on the basis of the ASTER and SRTM digital terrain models are comparable to each other in strength of relations (coefficients of determination), but include the

  12. On the solutions of electrohydrodynamic flow with fractional differential equations by reproducing kernel method

    Directory of Open Access Journals (Sweden)

    Akgül Ali

    2016-01-01

    Full Text Available In this manuscript we investigate electrohydrodynamic flow. For several values of the relevant parameters we show that the approximate solution depends on a reproducing kernel model. The results obtained demonstrate that the reproducing kernel method (RKM) is very effective. We obtain good results without any transformation or discretization. Numerical experiments on test examples show that our proposed schemes are of high accuracy and strongly support the theoretical results.
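
    For context, the defining property of a reproducing kernel Hilbert space H with kernel K, on which the method rests, is that point evaluation is represented by the kernel (in LaTeX notation):

        f(x) \;=\; \langle f,\, K(\cdot, x) \rangle_{H} \qquad \text{for all } f \in H,\ x \in \Omega,

    and approximate solutions are built as finite linear combinations \sum_i \alpha_i K(\cdot, x_i) whose coefficients are fixed by the differential equation and the boundary data.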

  13. An Evaluation Model of Quantitative and Qualitative Fuzzy Multi-Criteria Decision-Making Approach for Location Selection of Transshipment Ports

    Directory of Open Access Journals (Sweden)

    Ji-Feng Ding

    2013-01-01

    Full Text Available The role of container logistics centre as home bases for merchandise transportation has become increasingly important. The container carriers need to select a suitable centre location of transshipment port to meet the requirements of container shipping logistics. In the light of this, the main purpose of this paper is to develop a fuzzy multi-criteria decision-making (MCDM model to evaluate the best selection of transshipment ports for container carriers. At first, some concepts and methods used to develop the proposed model are briefly introduced. The performance values of quantitative and qualitative subcriteria are discussed to evaluate the fuzzy ratings. Then, the ideal and anti-ideal concepts and the modified distance measure method are used in the proposed model. Finally, a step-by-step example is illustrated to study the computational process of the quantitative and qualitative fuzzy MCDM model. The proposed approach has successfully accomplished our goal. In addition, the proposed fuzzy MCDM model can be empirically employed to select the best location of transshipment port for container carriers in the future study.
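
    The paper's fuzzy ratings and modified distance measure are not reproduced here. As a crisp, TOPSIS-style illustration of the ideal/anti-ideal idea only, the sketch below scores three hypothetical candidate ports on a small weighted decision matrix with made-up values.

        import numpy as np

        # Rows: candidate transshipment ports; columns: criteria (all treated as benefit criteria here).
        scores = np.array([[7.0, 0.82, 6.5],
                           [8.5, 0.74, 7.8],
                           [6.0, 0.90, 7.0]])
        weights = np.array([0.5, 0.3, 0.2])              # hypothetical criterion weights

        norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize each criterion
        weighted = norm * weights

        ideal = weighted.max(axis=0)                     # best value per criterion
        anti_ideal = weighted.min(axis=0)                # worst value per criterion

        d_plus = np.linalg.norm(weighted - ideal, axis=1)
        d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
        closeness = d_minus / (d_plus + d_minus)         # 1 = coincides with the ideal port

        print("closeness coefficients:", np.round(closeness, 3))
        print("best candidate: port", int(np.argmax(closeness)) + 1)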

  14. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.

    Science.gov (United States)

    Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta

    2018-01-01

    The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  15. Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

    Directory of Open Access Journals (Sweden)

    Eiji Watanabe

    2018-03-01

    Full Text Available The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

  16. Systematic Methodology for Reproducible Optimizing Batch Operation

    DEFF Research Database (Denmark)

    Bonné, Dennis; Jørgensen, Sten Bay

    2006-01-01

    This contribution presents a systematic methodology for rapid acquisition of discrete-time state space model representations of batch processes based on their historical operation data. These state space models are parsimoniously parameterized as a set of local, interdependent models. The present...

  17. Liquid scintigraphic gastric emptying - is it reproducible?

    International Nuclear Information System (INIS)

    Cooper, R.G.; Shuter, B.; Leach, M.; Roach, P.J.

    1999-01-01

    Full text: Radioisotope gastric emptying (GE) studies have been used as a non-invasive technique for motility assessment for many years. In a recent study investigating the correlation of mesenteric vascular changes with GE, six subjects had a repeat study 2-4 months later. Repeat studies were required due to minor technical problems (5 subjects) and a very slow GE (1 subject) on the original study. Subjects drank 275 ml of 'Ensure Plus' mixed with 8 MBq 67Ga-DTPA and were imaged for 2 h while lying supine. GE time-activity curves for each subject were generated and the time to half emptying (T1/2) was calculated. Five of the six subjects had more rapid GE on the second study. Three of the subjects had T1/2 values on their second study which were within ± 15 min of their original T1/2. The other three subjects had T1/2 values on their second study which were 36 min, 55 min and 280 min (subject K.H.) less than their original T1/2. Statistical analysis (t-test) was performed on paired T1/2 values. The average T1/2 value was greater in the first study than in the second (149 ± 121 and 86 ± 18 min respectively), although the difference was not statistically significant (P ∼ 0.1). Subjects' anxiety levels were not quantitated during the GE study; however, several major equipment faults occurred during the original study of subject K.H., who became visibly stressed. These results suggest that the reproducibility of GE studies may be influenced by psychological factors
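
    The half-emptying time T1/2 is read off the time-activity curve as the time at which gastric counts fall to half their initial value; a minimal sketch with made-up decay-corrected counts follows (not the study's data).

        import numpy as np

        # Made-up time-activity curve: minutes after the meal vs. decay-corrected gastric counts.
        t = np.array([0, 15, 30, 45, 60, 75, 90, 105, 120], dtype=float)
        counts = np.array([100, 95, 84, 70, 58, 47, 41, 36, 33], dtype=float)

        half = counts[0] / 2.0
        i = np.argmax(counts <= half)                      # first sample at or below 50%
        # Linear interpolation between the bracketing samples gives T1/2.
        t_half = t[i - 1] + (half - counts[i - 1]) * (t[i] - t[i - 1]) / (counts[i] - counts[i - 1])
        print(f"T1/2 = {t_half:.1f} min")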

  18. Is my network module preserved and reproducible?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    2011-01-01

    Full Text Available In many applications, one is interested in determining which of the properties of a network module change across conditions. For example, to validate the existence of a module, it is desirable to show that it is reproducible (or preserved) in an independent test network. Here we study several types of network preservation statistics that do not require a module assignment in the test network. We distinguish network preservation statistics by the type of the underlying network. Some preservation statistics are defined for a general network (defined by an adjacency matrix) while others are only defined for a correlation network (constructed on the basis of pairwise correlations between numeric variables). Our applications show that the correlation structure facilitates the definition of particularly powerful module preservation statistics. We illustrate that evaluating module preservation is in general different from evaluating cluster preservation. We find that it is advantageous to aggregate multiple preservation statistics into summary preservation statistics. We illustrate the use of these methods in six gene co-expression network applications including 1) preservation of cholesterol biosynthesis pathway in mouse tissues, 2) comparison of human and chimpanzee brain networks, 3) preservation of selected KEGG pathways between human and chimpanzee brain networks, 4) sex differences in human cortical networks, 5) sex differences in mouse liver networks. While we find no evidence for sex-specific modules in human cortical networks, we find that several human cortical modules are less preserved in chimpanzees. In particular, apoptosis genes are differentially co-expressed between humans and chimpanzees. Our simulation studies and applications show that module preservation statistics are useful for studying differences between the modular structure of networks. Data, R software and accompanying tutorials can be downloaded from the following webpage: http
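
    A minimal sketch of one connectivity-based preservation statistic in the spirit of this work (not the full Zsummary machinery): correlate each module gene's intramodular connectivity in the reference correlation network with its connectivity in the test network, without re-assigning modules in the test data. The data here are random and purely illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        ref = rng.normal(size=(50, 12))                        # reference dataset: 50 samples x 12 module genes
        test = ref + rng.normal(scale=0.5, size=ref.shape)     # noisier "test" dataset, same genes

        def intramodular_connectivity(expr):
            """Sum of absolute pairwise correlations of each gene with the rest of the module."""
            corr = np.abs(np.corrcoef(expr, rowvar=False))
            np.fill_diagonal(corr, 0.0)
            return corr.sum(axis=0)

        k_ref = intramodular_connectivity(ref)
        k_test = intramodular_connectivity(test)
        cor_k = np.corrcoef(k_ref, k_test)[0, 1]   # high value -> module connectivity is preserved
        print(f"cor.kIM between reference and test network: {cor_k:.2f}")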

  19. Qualitative Analysis of the Goodwin Model of the Growth Cycle || Análisis cualitativo del modelo de Goodwin de ciclos de crecimiento

    Directory of Open Access Journals (Sweden)

    Serebriakov, Vladimir

    2017-06-01

    Full Text Available Goodwin's model is a set of ordinary differential equations and is a well-known model of the growth cycle. However, its four constants require an extensive numerical study of its two differential equations to identify all possible unsteady-state behaviours, i.e. phase portraits, which correspond to infinitely many combinations of numerical values of the constants. Qualitative interpretation of Goodwin's model solves these problems by replacing all numerical constants and all derivatives by trends (increasing, constant and decreasing). The model has two variables: the employment rate V and the labour share U. A solution of the qualitative Goodwin's model is a scenario. An example of a Goodwin scenario is: V is increasing more and more rapidly, while U is decreasing and the decrease is slowing down. The complete set of all 41 possible Goodwin scenarios and the 168 time transitions among them are given. This result qualitatively represents all possible unsteady-state Goodwin behaviours. It is therefore possible to predict all possible future behaviours if a current behaviour is known/chosen. A prediction example is presented in detail. No prior knowledge of qualitative model theory is required.
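
    For readers who prefer a numerical companion to the qualitative scenarios, the sketch below integrates a common reduced (Lotka-Volterra type) form of Goodwin's model, with V the employment rate and U the labour share; the four constants are hypothetical and chosen only to produce a closed growth-cycle orbit, not taken from the paper.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Reduced Lotka-Volterra form of Goodwin's growth cycle (hypothetical constants a, b, c, d):
        # employment rises when the labour share is low; the labour share rises when employment is high.
        a, b, c, d = 0.05, 0.06, 0.9, 0.8

        def goodwin(t, y):
            V, U = y                      # employment rate, labour share
            return [V * (a - b * U), U * (c * V - d)]

        sol = solve_ivp(goodwin, (0.0, 200.0), [0.9, 0.8], t_eval=np.linspace(0, 200, 2000))
        V, U = sol.y
        print(f"V stays within [{V.min():.2f}, {V.max():.2f}], U within [{U.min():.2f}, {U.max():.2f}]")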

  20. Qualitative analysis of homogeneous universes

    International Nuclear Information System (INIS)

    Novello, M.; Araujo, R.A.

    1980-01-01

    The qualitative behaviour of cosmological models is investigated in two cases: homogeneous and isotropic universes containing viscous fluids in a Stokesian non-linear regime; and rotating, expanding universes in a state in which matter is out of thermal equilibrium. (Author) [pt]

  1. Integrating Qualitative and Quantitative Methods in Participatory Modeling to Elicit Behavioral Drivers in Environmental Dilemmas: the Case of Air Pollution in Talca, Chile.

    Science.gov (United States)

    Meinherz, Franziska; Videira, Nuno

    2018-04-10

    The aim of this paper is to contribute to the exploration of environmental modeling methods based on the elicitation of stakeholders' mental models. This aim is motivated by the need to understand the dilemmas and behavioral rationales of individuals in order to support the management of environmental problems. The methodology developed for this paper integrates qualitative and quantitative methods by deploying focus groups for the elicitation of the behavioral rationales of the target population, and grounded theory to code the information gained in the focus groups and to guide the development of a dynamic simulation model. The approach is applied to a case of urban air pollution caused by residential heating with wood in central Chile. The results show how the households' behavior interrelates with the governmental management strategies and provide valuable and novel insights into potential challenges to the implementation of policies to manage the local air pollution problem. The experience further shows that the developed participatory modeling approach makes it possible to overcome some of the issues currently encountered in the elicitation of individuals' behavioral rationales and in the quantification of qualitative information.

  2. Understanding stakeholder important outcomes and perceptions of equity, acceptability and feasibility of a care model for haemophilia management in the US: a qualitative study.

    Science.gov (United States)

    Lane, S J; Sholapur, N S; Yeung, C H T; Iorio, A; Heddle, N M; Sholzberg, M; Pai, M

    2016-07-01

    Care for persons with haemophilia (PWH) is most commonly delivered through the integrated care model used by Hemophilia Treatment Centers (HTCs). Although this model is widely accepted as the gold standard for the management of haemophilia, there is little evidence comparing different care models. We performed a qualitative study to gain insight into issues related to outcomes, acceptability, equity and feasibility of different care models operating in the US. We used a qualitative descriptive approach with semi-structured interviews. Purposive sampling was used to recruit individuals with experience providing or receiving care for haemophilia in the US through either an integrated care centre, a specialty pharmacy or homecare company, or by a specialist in a non-specialized centre. Persons with haemophilia, parents of PWH aged ≤18, healthcare providers, insurance company representatives and policy developers were invited to participate. Twenty-nine interviews were conducted with participants representing 18 US states. Participants in the study sample had experience receiving or providing care predominantly within an HTC setting. Integrated care at HTCs was highly acceptable to participants, who appreciated the value of specialized, expert care in a multidisciplinary team setting. Equity and feasibility issues were primarily related to health insurance and funding limitations. Additional research is required to document the impact of care on health and psychosocial outcomes and identify effective ways to facilitate equitable access to haemophilia treatment and care. © 2016 John Wiley & Sons Ltd.

  3. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M A; Fink, D; Hua, Q; Jacobsen, G E; Lawson, E M; Smith, A M; Tuniz, C [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
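
As a back-of-the-envelope illustration of the counting-statistics limit mentioned above: if the ratio uncertainty is dominated by {sup 14}C counting, the relative precision scales as 1/sqrt(N), so 0.5% precision needs on the order of 40,000 counted atoms. The numbers below are illustrative, not ANTARES operating parameters.

```python
# Sketch: counting-statistics contribution to AMS ratio precision (illustrative numbers).
import math

def relative_precision(counts):
    """Poisson counting statistics: sigma/N = 1/sqrt(N)."""
    return 1.0 / math.sqrt(counts)

def counts_needed(target_precision):
    """Counts required for a given relative precision, counting statistics only."""
    return math.ceil(1.0 / target_precision ** 2)

print(f"10,000 counts -> {relative_precision(10_000):.2%}")   # 1.0%
print(f"counts for 0.5%: {counts_needed(0.005):,}")           # 40,000
```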

  4. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in

  5. Qualitative Economics

    DEFF Research Database (Denmark)

    Fast, Michael; Clark II, Woodrow W

    This book is about science -- specifically, the science of economics. Or the lack thereof, to be more accurate. The building of any science, let alone economics, is grounded in the understanding of what is beneath the "surface" of economics. Science, and hence economics, should be concerned with formulating ideas that express theories which produce descriptions of how to understand phenomena and real-world experiences. Economics must become a science, because the essence of economics in terms of human actions, group interactions and communities is in need of scientific inquiry. Academics and scholars need a scientific perspective that can hypothesize, theorize, document, understand and analyze human dynamics from the individual to more societal interactions. And that is what qualitative economics does; it can make economics into a science. The economic...

  6. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    2017-01-01

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  7. The Qualitative Expectations Hypothesis

    DEFF Research Database (Denmark)

    Frydman, Roman; Johansen, Søren; Rahbek, Anders

    We introduce the Qualitative Expectations Hypothesis (QEH) as a new approach to modeling macroeconomic and financial outcomes. Building on John Muth's seminal insight underpinning the Rational Expectations Hypothesis (REH), QEH represents the market's forecasts to be consistent with the predictions of an economist's model. However, by assuming that outcomes lie within stochastic intervals, QEH, unlike REH, recognizes the ambiguity faced by an economist and market participants alike. Moreover, QEH leaves the model open to ambiguity by not specifying a mechanism determining specific values that outcomes take...

  8. Quark/gluon jet discrimination: a reproducible analysis using R

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The power to discriminate between light-quark jets and gluon jets would have a huge impact on many searches for new physics at CERN and beyond. This talk will present a walk-through of the development of a prototype machine learning classifier for differentiating between quark and gluon jets at experiments like those at the Large Hadron Collider at CERN. A new fast feature selection method that combines information theory and graph analytics will be outlined. This method has found new variables that promise significant improvements in discrimination power. The prototype jet tagger is simple, interpretable, parsimonious, and computationally extremely cheap, and therefore might be suitable for use in trigger systems for real-time data processing. Nested stratified k-fold cross validation was used to generate robust estimates of model performance. The data analysis was performed entirely in the R statistical programming language, and is fully reproducible. The entire analysis workflow is data-driven, automated a...
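
The analysis described above was carried out in R; the sketch below illustrates the same nested stratified k-fold cross-validation idea in Python with scikit-learn on synthetic data. The classifier and hyperparameter grid are arbitrary stand-ins for the jet tagger, not the talk's actual model.

```python
# Sketch: nested stratified k-fold cross-validation (generic illustration, synthetic data).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)  # stand-in for jet features

inner = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)   # tunes hyperparameters
outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)   # estimates generalisation

search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      param_grid={"max_depth": [2, 3], "n_estimators": [50, 100]},
                      scoring="roc_auc", cv=inner)

# The outer loop scores a model whose hyperparameters were tuned only on the inner folds.
scores = cross_val_score(search, X, y, scoring="roc_auc", cv=outer)
print(f"nested-CV AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```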

  9. Timbral aspects of reproduced sound in small rooms. II

    DEFF Research Database (Denmark)

    Bech, Søren

    1996-01-01

    A single loudspeaker with frequency-dependent directivity characteristics, positioned in a room of normal size with frequency-dependent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, seventeen individual reflections and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections. The results have confirmed that the first-order floor reflection is likely to contribute individually to the timbre of reproduced noise. However, for a speech signal none of the investigated reflections will contribute individually to the timbre. It is suggested that the threshold of detection is determined by the spectral changes in the dominant frequency range of 500 Hz to 2 kHz. For increases in the level of individual reflections, the most likely...

  10. GeoTrust Hub: A Platform For Sharing And Reproducing Geoscience Applications

    Science.gov (United States)

    Malik, T.; Tarboton, D. G.; Goodall, J. L.; Choi, E.; Bhatt, A.; Peckham, S. D.; Foster, I.; Ton That, D. H.; Essawy, B.; Yuan, Z.; Dash, P. K.; Fils, G.; Gan, T.; Fadugba, O. I.; Saxena, A.; Valentic, T. A.

    2017-12-01

    Recent requirements of scholarly communication emphasize the reproducibility of scientific claims. Text-based research papers are considered a poor medium for establishing reproducibility. Papers must be accompanied by "research objects", aggregations of digital artifacts that together with the paper provide an authoritative record of a piece of research. We will present GeoTrust Hub (http://geotrusthub.org), a platform for creating, sharing, and reproducing reusable research objects. GeoTrust Hub provides tools for scientists to create `geounits'--reusable research objects. Geounits are self-contained, annotated, and versioned containers that describe and package computational experiments in an efficient and light-weight manner. Geounits can be shared on public repositories such as HydroShare and FigShare, and also, using their respective APIs, reproduced on provisioned clouds. The latter feature enables science applications to have a lifetime beyond sharing, wherein they can be independently verified and trust can be established as they are repeatedly reused. Through research use cases from several geoscience laboratories across the United States, we will demonstrate how the tools provided by GeoTrust Hub, along with HydroShare as its public repository for geounits, are advancing the state of reproducible research in the geosciences. For each use case, we will address different computational reproducibility requirements. Our first use case will be an example of setup reproducibility, which enables a scientist to set up and reproduce an output from a model with complex configuration and development environments. Our second use case will be an example of algorithm/data reproducibility, wherein a shared data science model/dataset can be substituted with an alternate one to verify model output results, and finally an example of interactive reproducibility, in which an experiment is dependent on specific versions of data to produce the result. Toward this we will use software and data

  11. Reproducing an extreme flood with uncertain post-event information

    Directory of Open Access Journals (Sweden)

    D. Fuentes-Andino

    2017-07-01

    Full Text Available Studies for the prevention and mitigation of floods require information on discharge and extent of inundation, commonly unavailable or uncertain, especially during extreme events. This study was initiated by the devastating flood in Tegucigalpa, the capital of Honduras, when Hurricane Mitch struck the city. In this study we hypothesized that it is possible to estimate, in a trustworthy way considering large data uncertainties, this extreme 1998 flood discharge and the extent of the inundations that followed from a combination of models and post-event measured data. Post-event data collected in 2000 and 2001 were used to estimate discharge peaks, times of peak, and high-water marks. These data were used in combination with rain data from two gauges to drive and constrain a combination of well-known modelling tools: TOPMODEL, Muskingum–Cunge–Todini routing, and the LISFLOOD-FP hydraulic model. Simulations were performed within the generalized likelihood uncertainty estimation (GLUE) uncertainty-analysis framework. The model combination predicted peak discharge, times of peaks, and more than 90 % of the observed high-water marks within the uncertainty bounds of the evaluation data. This allowed an inundation likelihood map to be produced. Observed high-water marks could not be reproduced at a few locations on the floodplain. Identifications of these locations are useful to improve model set-up, model structure, or post-event data-estimation methods. Rainfall data were of central importance in simulating the times of peak and results would be improved by a better spatial assessment of rainfall, e.g. from radar data or a denser rain-gauge network. Our study demonstrated that it was possible, considering the uncertainty in the post-event data, to reasonably reproduce the extreme Mitch flood in Tegucigalpa in spite of no hydrometric gauging during the event. The method proposed here can be part of a Bayesian framework in which more events
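
A highly simplified sketch of the GLUE procedure referred to above, using a toy rainfall-runoff model and an arbitrary behavioural threshold rather than the TOPMODEL/LISFLOOD-FP chain: sample parameter sets, score each simulation with an informal likelihood, keep the behavioural sets, and report prediction bounds from the retained ensemble.

```python
# Sketch: Generalized Likelihood Uncertainty Estimation (GLUE) on a toy rainfall-runoff model.
import numpy as np

rng = np.random.default_rng(42)
rain = rng.gamma(2.0, 2.0, size=100)                     # synthetic rainfall forcing
q_obs = 0.6 * rain + rng.normal(0, 0.8, size=rain.size)  # synthetic "observed" discharge

def toy_model(rain, runoff_coeff):
    return runoff_coeff * rain

n_sets = 5000
params = rng.uniform(0.1, 1.0, size=n_sets)              # Monte Carlo parameter sampling
likelihood = np.array([                                   # informal likelihood: Nash-Sutcliffe efficiency
    1 - np.sum((q_obs - toy_model(rain, p)) ** 2) / np.sum((q_obs - q_obs.mean()) ** 2)
    for p in params
])

behavioural = params[likelihood > 0.5]                    # arbitrary behavioural threshold
sims = np.array([toy_model(rain, p) for p in behavioural])
lower, upper = np.percentile(sims, [5, 95], axis=0)       # GLUE prediction bounds per time step
print(f"{behavioural.size} behavioural sets; bounds at t=0: [{lower[0]:.2f}, {upper[0]:.2f}]")
```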

  12. Role Models and Teachers: medical students perception of teaching-learning methods in clinical settings, a qualitative study from Sri Lanka.

    Science.gov (United States)

    Jayasuriya-Illesinghe, Vathsala; Nazeer, Ishra; Athauda, Lathika; Perera, Jennifer

    2016-02-09

    Medical education research in general, and research focusing on clinical settings in particular, has been a low priority in South Asia. This explorative study from 3 medical schools in Sri Lanka, a South Asian country, describes undergraduate medical students' experiences during their final year clinical training with the aim of understanding the teaching-learning experiences. Using qualitative methods we conducted an exploratory study. Twenty-eight graduates from 3 medical schools participated in individual interviews. Interview recordings were transcribed verbatim and analyzed using the qualitative content analysis method. Emergent themes revealed 2 types of teaching-learning experiences: role modeling and purposive teaching. In role modelling, students were expected to observe teachers while they conducted their clinical work; however, this method failed to create positive learning experiences. The clinical teachers who predominantly used this method appeared to be 'figurative' role models and were not perceived as modelling professional behaviors. In contrast, purposeful teaching allowed dedicated time for teacher-student interactions, and teachers who created these learning experiences were more likely to be seen as 'true' role models. Students' responses and reciprocations to these interactions were influenced by their perception of teachers' behaviors, attitudes, and the type of teaching-learning situations created for them. Making a distinction between role modeling and purposeful teaching is important for students in clinical training settings. Clinical teachers' awareness of their own manifest professional characteristics, attitudes, and behaviors could help create better teaching-learning experiences. Moreover, broader systemic reforms are needed to address the prevailing culture of teaching by humiliation and subordination.

  13. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature-three each from the domains of perception/action, memory, and language, respectively-and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as testing situation and prior recent experience with the experiment to yield highly robust effects.

  14. Psychometric Evaluation of the Brachial Assessment Tool Part 1: Reproducibility.

    Science.gov (United States)

    Hill, Bridget; Williams, Gavin; Olver, John; Ferris, Scott; Bialocerkowski, Andrea

    2018-04-01

    To evaluate reproducibility (reliability and agreement) of the Brachial Assessment Tool (BrAT), a new patient-reported outcome measure for adults with traumatic brachial plexus injury (BPI). Prospective repeated-measure design. Outpatient clinics. Adults with confirmed traumatic BPI (N=43; age range, 19-82y). People with BPI completed the 31-item 4-response BrAT twice, 2 weeks apart. Results for the 3 subscales and summed score were compared at time 1 and time 2 to determine reliability, including systematic differences using paired t tests, test retest using intraclass correlation coefficient model 1,1 (ICC 1,1 ), and internal consistency using Cronbach α. Agreement parameters included standard error of measurement, minimal detectable change, and limits of agreement. BrAT. Test-retest reliability was excellent (ICC 1,1 =.90-.97). Internal consistency was high (Cronbach α=.90-.98). Measurement error was relatively low (standard error of measurement range, 3.1-8.8). A change of >4 for subscale 1, >6 for subscale 2, >4 for subscale 3, and >10 for the summed score is indicative of change over and above measurement error. Limits of agreement ranged from ±4.4 (subscale 3) to 11.61 (summed score). These findings support the use of the BrAT as a reproducible patient-reported outcome measure for adults with traumatic BPI with evidence of appropriate reliability and agreement for both individual and group comparisons. Further psychometric testing is required to establish the construct validity and responsiveness of the BrAT. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
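
For reference, the agreement statistics reported above can be computed as in the sketch below, using synthetic test-retest scores; the formulas are the standard ones for ICC(1,1) from a one-way ANOVA decomposition, the standard error of measurement, the minimal detectable change at the 95% level, and Bland-Altman limits of agreement.

```python
# Sketch: test-retest reliability and agreement statistics (synthetic scores).
import numpy as np

rng = np.random.default_rng(1)
true = rng.normal(60, 15, size=43)                 # 43 subjects, as in the study design
t1 = true + rng.normal(0, 4, size=43)              # time 1 scores
t2 = true + rng.normal(0, 4, size=43)              # time 2 scores, 2 weeks later
scores = np.column_stack([t1, t2])

n, k = scores.shape
grand = scores.mean()
ms_between = k * np.sum((scores.mean(axis=1) - grand) ** 2) / (n - 1)            # between subjects
ms_within = np.sum((scores - scores.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
icc_11 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)           # ICC(1,1)

sem = scores.std(ddof=1) * np.sqrt(1 - icc_11)     # standard error of measurement
mdc95 = 1.96 * np.sqrt(2) * sem                    # minimal detectable change (95%)
diff = t2 - t1
loa = (diff.mean() - 1.96 * diff.std(ddof=1), diff.mean() + 1.96 * diff.std(ddof=1))

print(f"ICC(1,1)={icc_11:.2f}  SEM={sem:.1f}  MDC95={mdc95:.1f}  LoA=({loa[0]:.1f}, {loa[1]:.1f})")
```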

  15. Double Regge model for non diffractive A1 production

    International Nuclear Information System (INIS)

    Anjos, J.C.; Endler, A.; Santoro, A.; Simao, F.R.A.

    1977-07-01

    A Reggeized double-nucleon-exchange model is shown to be able to reproduce qualitatively the non-diffractive A{sub 1} production recently observed in the reaction K{sup -}p → Σ{sup -}π{sup +}π{sup -}π{sup +} at 4.15 GeV/c

  16. Reproducible Infection Model for Clostridium perfringens in Broiler Chickens

    DEFF Research Database (Denmark)

    Pedersen, Karl; Friis-Holm, Lotte Bjerrum; Heuer, Ole Eske

    2008-01-01

    ..., 18, 20, and 24 (Experiment 2). There was no mortality in any of the groups; however, chickens in the groups receiving both coccidial vaccine and C. perfringens developed the subclinical form of necrotic enteritis, demonstrated by focal necroses in the small intestine, whereas chickens in control groups or groups receiving only coccidial vaccine or only C. perfringens cultures developed no necroses. The results underline the importance of predisposing factors in the development of necrotic enteritis....

  17. A systematic review and qualitative analysis to inform the development of a new emergency department-based geriatric case management model.

    Science.gov (United States)

    Sinha, Samir K; Bessman, Edward S; Flomenbaum, Neal; Leff, Bruce

    2011-06-01

    We inform the future development of a new geriatric emergency management practice model. We perform a systematic review of the existing evidence for emergency department (ED)-based case management models designed to improve the health, social, and health service utilization outcomes for noninstitutionalized older patients within the context of an index ED visit. This was a systematic review of English-language articles indexed in MEDLINE and CINAHL (1966 to 2010), describing ED-based case management models for older adults. Bibliographies of the retrieved articles were reviewed to identify additional references. A systematic qualitative case study analytic approach was used to identify the core operational components and outcome measures of the described clinical interventions. The authors of the included studies were also invited to verify our interpretations of their work. The determined patterns of component adherence were then used to postulate the relative importance and effect of the presence or absence of a particular component in influencing the overall effectiveness of their respective interventions. Eighteen of 352 studies (reported in 20 articles) met study criteria. Qualitative analyses identified 28 outcome measures and 8 distinct model characteristic components that included having an evidence-based practice model, nursing clinical involvement or leadership, high-risk screening processes, focused geriatric assessments, the initiation of care and disposition planning in the ED, interprofessional and capacity-building work practices, post-ED discharge follow-up with patients, and evaluation and monitoring processes. Of the 15 positive study results, 6 had all 8 characteristic components and 9 were found to be lacking at least 1 component. Two studies with positive results lacked 2 characteristic components and none lacked more than 2 components. Of the 3 studies with negative results demonstrating no positive effects based on any outcome tested, one

  18. Prediction of lung tumour position based on spirometry and on abdominal displacement: Accuracy and reproducibility

    International Nuclear Information System (INIS)

    Hoisak, Jeremy D.P.; Sixel, Katharina E.; Tirona, Romeo; Cheung, Patrick C.F.; Pignol, Jean-Philippe

    2006-01-01

    Background and purpose: A simulation investigating the accuracy and reproducibility of a tumour motion prediction model over clinical time frames is presented. The model is formed from surrogate and tumour motion measurements, and used to predict the future position of the tumour from surrogate measurements alone. Patients and methods: Data were acquired from five non-small cell lung cancer patients, on 3 days. Measurements of respiratory volume by spirometry and abdominal displacement by a real-time position tracking system were acquired simultaneously with X-ray fluoroscopy measurements of superior-inferior tumour displacement. A model of tumour motion was established and used to predict future tumour position, based on surrogate input data. The calculated position was compared against true tumour motion as seen on fluoroscopy. Three different imaging strategies, pre-treatment, pre-fraction and intrafractional imaging, were employed in establishing the fitting parameters of the prediction model. The impact of each imaging strategy upon accuracy and reproducibility was quantified. Results: When establishing the predictive model using pre-treatment imaging, four of five patients exhibited poor interfractional reproducibility for either surrogate in subsequent sessions. Simulating the formulation of the predictive model prior to each fraction resulted in improved interfractional reproducibility. The accuracy of the prediction model was only improved in one of five patients when intrafractional imaging was used. Conclusions: Employing a prediction model established from measurements acquired at planning resulted in localization errors. Pre-fractional imaging improved the accuracy and reproducibility of the prediction model. Intrafractional imaging was of less value, suggesting that the accuracy limit of a surrogate-based prediction model is reached with once-daily imaging
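
A minimal sketch of the surrogate-based prediction idea: fit a simple relationship between an external surrogate trace and superior-inferior tumour position over a short imaging window, then predict outside that window. The synthetic traces and the linear fit below are illustrative stand-ins for the study's model.

```python
# Sketch: predicting tumour SI position from an external surrogate signal (synthetic data).
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0, 30, 600)                                # 30 s of breathing sampled at 20 Hz
abdomen = np.sin(2 * np.pi * t / 4.0) + 0.05 * rng.normal(size=t.size)             # surrogate (a.u.)
tumour = 8.0 * np.sin(2 * np.pi * t / 4.0 - 0.3) + 0.5 * rng.normal(size=t.size)   # SI motion (mm)

# Build the prediction model from a short "imaging" window, then predict the rest of the session.
fit_window = t < 10
slope, intercept = np.polyfit(abdomen[fit_window], tumour[fit_window], deg=1)
predicted = slope * abdomen[~fit_window] + intercept

rmse = np.sqrt(np.mean((predicted - tumour[~fit_window]) ** 2))
print(f"prediction RMSE outside the fitting window: {rmse:.1f} mm")
```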

  19. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the di...

  20. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  1. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature—three each from the domains of perception/action, memory, and language, respectively—and found that they are highly reproducible. Not only can

  2. Reproducing Kernels and Coherent States on Julia Sets

    Energy Technology Data Exchange (ETDEWEB)

    Thirulogasanthar, K., E-mail: santhar@cs.concordia.ca; Krzyzak, A. [Concordia University, Department of Computer Science and Software Engineering (Canada)], E-mail: krzyzak@cs.concordia.ca; Honnouvo, G. [Concordia University, Department of Mathematics and Statistics (Canada)], E-mail: g_honnouvo@yahoo.fr

    2007-11-15

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems.
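
For context, the generic construction behind such statements is the following: given an orthonormal family {Φ_n} of square-integrable vectors on a domain D, a reproducing kernel and the associated coherent states can be written as below. The notation is generic and conventions on complex conjugation vary; this is not necessarily the authors' exact formulation.

```latex
% Generic reproducing-kernel / coherent-state construction (conventions vary).
K(z,w) = \sum_{n} \Phi_n(z)\,\overline{\Phi_n(w)}, \qquad
|z\rangle = \mathcal{N}(z)^{-1/2} \sum_{n} \overline{\Phi_n(z)}\, |e_n\rangle, \qquad
\mathcal{N}(z) = K(z,z),
\quad\text{so that}\quad
\langle z | w \rangle = \mathcal{N}(z)^{-1/2}\,\mathcal{N}(w)^{-1/2}\, K(z,w).
```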

  3. Reproducing Kernels and Coherent States on Julia Sets

    International Nuclear Information System (INIS)

    Thirulogasanthar, K.; Krzyzak, A.; Honnouvo, G.

    2007-01-01

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems

  4. Completely reproducible description of digital sound data with cellular automata

    International Nuclear Information System (INIS)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-01-01

    A novel method of compressive and completely reproducible description of digital sound data by means of rule dynamics of CA (cellular automata) is proposed. The digital data of spoken words and music recorded with the standard format of a compact disk are reproduced completely by this method with use of only two rules in a one-dimensional CA without loss of information
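
As a generic illustration of one-dimensional CA rule dynamics (not the specific two-rule encoding scheme of the record above), the sketch below applies one Wolfram-numbered elementary rule to a binary state with periodic boundaries.

```python
# Sketch: one step of an elementary (radius-1, binary) cellular automaton, e.g. rule 110.
import numpy as np

def ca_step(state, rule_number):
    """Apply a Wolfram-numbered elementary CA rule to a 1D binary state (periodic boundary)."""
    rule = np.array([(rule_number >> i) & 1 for i in range(8)], dtype=np.uint8)
    left, right = np.roll(state, 1), np.roll(state, -1)
    neighbourhood_code = 4 * left + 2 * state + right      # 3-bit neighbourhood index
    return rule[neighbourhood_code]

state = np.zeros(64, dtype=np.uint8)
state[32] = 1                                              # single seed cell
for _ in range(5):
    state = ca_step(state, rule_number=110)
print("".join("#" if s else "." for s in state))
```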

  5. Novel burn device for rapid, reproducible burn wound generation.

    Science.gov (United States)

    Kim, J Y; Dunham, D M; Supp, D M; Sen, C K; Powell, H M

    2016-03-01

    Scarring following full thickness burns leads to significant reductions in range of motion and quality of life for burn patients. To effectively study scar development and the efficacy of anti-scarring treatments in a large animal model (female red Duroc pigs), reproducible, uniform, full-thickness, burn wounds are needed to reduce variability in observed results that occur with burn depth. Prior studies have proposed that initial temperature of the burner, contact time with skin, thermal capacity of burner material, and the amount of pressure applied to the skin need to be strictly controlled to ensure reproducibility. The purpose of this study was to develop a new burner that enables temperature and pressure to be digitally controlled and monitored in real-time throughout burn wound creation and compare it to a standard burn device. A custom burn device was manufactured with an electrically heated burn stylus and a temperature control feedback loop via an electronic microstat. Pressure monitoring was controlled by incorporation of a digital scale into the device, which measured downward force. The standard device was comprised of a heat resistant handle with a long rod connected to the burn stylus, which was heated using a hot plate. To quantify skin surface temperature and internal stylus temperature as a function of contact time, the burners were heated to the target temperature (200±5°C) and pressed into the skin for 40s to create the thermal injuries. Time to reach target temperature and elapsed time between burns were recorded. In addition, each unit was evaluated for reproducibility within and across three independent users by generating burn wounds at contact times spanning from 5 to 40s at a constant pressure and at pressures of 1 or 3lbs with a constant contact time of 40s. Biopsies were collected for histological analysis and burn depth quantification using digital image analysis (ImageJ). The custom burn device maintained both its internal

  6. Qualitative and quantitative analysis of the students’ perceptions to the use of 3D electronic models in problem-based learning

    Directory of Open Access Journals (Sweden)

    Hai Ming Wong

    2017-06-01

    Full Text Available Faculty of Dentistry of the University of Hong Kong has introduced innovative blended problem-based learning (PBL with the aid of 3D electronic models (e-models to Bachelor of Dental Surgery (BDS curriculum. Statistical results of pre- and post-semester questionnaire surveys illustrated compatibility of e-models in PBL settings. The students’ importance ratings of two objectives “Complete assigned tasks on time” and “Active listener”, and twenty-two facilitator evaluation items including critical thinking and group problem-solving skills had increased significantly. The students’ PBL preparation behavior, attentions to problem understanding, problem analysis, and learning resource quality were also found to be related to online support of e-models and its software. Qualitative analysis of open-ended questions with visual text analytic software “Leximancer” improved validity of statistical results. Using e-model functions in treatment planning, problem analysis and giving instructions provided a method of informative communication. Therefore, it is critical for the faculty to continuously provide facilitator training and quality online e-model resources to the students.

  7. A neural coding scheme reproducing foraging trajectories

    Science.gov (United States)

    Gutiérrez, Esther D.; Cabrera, Juan Luis

    2015-12-01

    The movement of many animals may follow Lévy patterns. The underlying neuronal dynamics generating such behavior are unknown. In this paper we show that a novel discovery of multifractality in winnerless competition (WLC) systems reveals a potential encoding mechanism that is translatable into two-dimensional superdiffusive Lévy movements. The validity of our approach is tested on a conductance-based neuronal model showing WLC and through the extraction of Lévy flights inducing fractals from recordings of rat hippocampus during open-field foraging. Further insights are gained analyzing mouse motor cortex neurons and non-motor cell signals. The proposed mechanism provides a plausible explanation for the neuro-dynamical fundamentals of spatial searching patterns observed in animals (including humans) and illustrates a previously unknown way to encode information in neuronal temporal series.
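
A minimal sketch of the kind of two-dimensional superdiffusive Lévy movement referred to above: power-law distributed step lengths with uniformly random headings. The exponent and truncation are illustrative choices, not values from the paper.

```python
# Sketch: 2D Levy-flight-like trajectory with power-law step lengths (illustrative exponent).
import numpy as np

rng = np.random.default_rng(3)
n_steps, mu, l_min = 1000, 2.0, 1.0           # P(l) ~ l**(-mu), truncated below at l_min

u = rng.random(n_steps)
lengths = l_min * (1 - u) ** (-1 / (mu - 1))  # inverse-CDF sampling of a Pareto law
angles = rng.uniform(0, 2 * np.pi, n_steps)   # isotropic headings

steps = np.column_stack([lengths * np.cos(angles), lengths * np.sin(angles)])
trajectory = np.cumsum(steps, axis=0)
print("net displacement after", n_steps, "steps:", np.linalg.norm(trajectory[-1]))
```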

  8. Repeatability and reproducibility of Population Viability Analysis (PVA and the implications for threatened species management

    Directory of Open Access Journals (Sweden)

    Clare Morrison

    2016-08-01

    Full Text Available Conservation triage focuses on prioritizing species, populations or habitats based on urgency, biodiversity benefits, recovery potential as well as cost. Population Viability Analysis (PVA) is frequently used in population focused conservation prioritizations. The critical nature of many of these management decisions requires that PVA models are repeatable and reproducible to reliably rank species and/or populations quantitatively. This paper assessed the repeatability and reproducibility of a subset of previously published PVA models. We attempted to rerun baseline models from 90 publicly available PVA studies published between 2000-2012 using the two most common PVA modelling software programs, VORTEX and RAMAS-GIS. Forty percent (n = 36) failed, 50% (45) were both repeatable and reproducible, and 10% (9) had missing baseline models. Repeatability was not linked to taxa, IUCN category, PVA program version used, year published or the quality of publication outlet, suggesting that the problem is systemic within the discipline. Complete and systematic presentation of PVA parameters and results are needed to ensure that the scientific input into conservation planning is both robust and reliable, thereby increasing the chances of making decisions that are both beneficial and defensible. The implications for conservation triage may be far reaching if population viability models cannot be reproduced with confidence, thus undermining their intended value.

  9. Climate model biases in seasonality of continental water storage revealed by satellite gravimetry

    Science.gov (United States)

    Swenson, Sean; Milly, P.C.D.

    2006-01-01

    Satellite gravimetric observations of monthly changes in continental water storage are compared with outputs from five climate models. All models qualitatively reproduce the global pattern of annual storage amplitude, and the seasonal cycle of global average storage is reproduced well, consistent with earlier studies. However, global average agreements mask systematic model biases in low latitudes. Seasonal extrema of low‐latitude, hemispheric storage generally occur too early in the models, and model‐specific errors in amplitude of the low‐latitude annual variations are substantial. These errors are potentially explicable in terms of neglected or suboptimally parameterized water stores in the land models and precipitation biases in the climate models.
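
One simple way to quantify the amplitude and timing biases described above is to fit an annual harmonic to each monthly storage series and compare amplitude and phase between model and observation. The sketch below does this on synthetic series; the harmonic-fit diagnostic is a generic choice, not necessarily the authors' exact method.

```python
# Sketch: annual-harmonic amplitude and phase of a monthly water-storage series (synthetic data).
import numpy as np

def annual_harmonic(series):
    """Least-squares fit of a*cos(2*pi*m/12) + b*sin(2*pi*m/12) + c; returns amplitude and peak month."""
    m = np.arange(series.size)
    X = np.column_stack([np.cos(2 * np.pi * m / 12), np.sin(2 * np.pi * m / 12), np.ones_like(m)])
    a, b, _ = np.linalg.lstsq(X, series, rcond=None)[0]
    amplitude = np.hypot(a, b)
    phase_months = (np.arctan2(b, a) * 12 / (2 * np.pi)) % 12   # month of maximum
    return amplitude, phase_months

months = np.arange(120)
observed = 80 * np.cos(2 * np.pi * (months - 3) / 12) + np.random.default_rng(0).normal(0, 10, 120)
modelled = 60 * np.cos(2 * np.pi * (months - 2) / 12)           # weaker amplitude, peak one month early

for name, series in [("observed", observed), ("model", modelled)]:
    amp, ph = annual_harmonic(series)
    print(f"{name}: amplitude {amp:.0f} mm, peak near month {ph:.1f}")
```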

  10. Eternally existing self-reproducing inflationary universe

    International Nuclear Information System (INIS)

    Linde, A.D.

    1986-05-01

    It is shown that the large-scale quantum fluctuations of the scalar field φ generated in the chaotic inflation scenario lead to an infinite process of self-reproduction of inflationary mini-universes. A model of eternally existing chaotic inflationary universe is suggested. It is pointed out that whereas the universe locally is very homogeneous as a result of inflation, which occurs at the classical level, the global structure of the universe is determined by quantum effects and is highly non-trivial. The universe consists of exponentially large number of different mini-universes, inside which all possible (metastable) vacuum states and all possible types of compactification are realized. The picture differs crucially from the standard picture of a one-domain universe in a ''true'' vacuum state. Our results may serve as a justification of the anthropic principle in the inflationary cosmology. These results may have important implications for the elementary particle theory as well. Namely, since all possible types of mini-universes, in which inflation may occur, should exist in our universe, there is no need to insist (as it is usually done) that in realistic theories the vacuum state of our type should be the only possible one or the best one. (author)

  11. Experiences of Community-Living Older Adults Receiving Integrated Care Based on the Chronic Care Model : A Qualitative Study

    NARCIS (Netherlands)

    Spoorenberg, Sophie L. W.; Wynia, Klaske; Fokkens, Andrea S.; Slotman, Karin; Kremer, Hubertus P. H.; Reijneveld, Sijmen A.

    2015-01-01

    Background Integrated care models aim to solve the problem of fragmented and poorly coordinated care in current healthcare systems. These models aim to be patient-centered by providing continuous and coordinated care and by considering the needs and preferences of patients. The objective of this

  12. Barriers to Translation of Physical Activity into the Lung Cancer Model of Care. A Qualitative Study of Clinicians' Perspectives.

    Science.gov (United States)

    Granger, Catherine L; Denehy, Linda; Remedios, Louisa; Retica, Sarah; Phongpagdi, Pimsiri; Hart, Nicholas; Parry, Selina M

    2016-12-01

    Evidence-based clinical practice guidelines recommend physical activity for people with lung cancer; however, evidence has not translated into clinical practice and the majority of patients do not meet recommended activity levels. To identify factors (barriers and enablers) that influence clinicians' translation of the physical activity guidelines into practice. Qualitative study involving 17 participants (three respiratory physicians, two thoracic surgeons, two oncologists, two nurses, and eight physical therapists) who were recruited using purposive sampling from five hospitals in Melbourne, Victoria, Australia. Nine semistructured interviews and a focus group were conducted, transcribed verbatim, and independently cross-checked by a second researcher. Thematic analysis was used to analyze data. Five consistent themes emerged: (1) the clinicians' perception of patient-related physical and psychological influences (including symptoms and comorbidities) that impact on patients' ability to perform regular physical activity; (2) the influence of the patient's past physical activity behavior and their perceived relevance and knowledge about physical activity; (3) the clinicians' own knowledge and beliefs about physical activity; (4) workplace culture supporting or hindering physical activity; and (5) environmental and structural influences in the healthcare system (including clinicians' time, staffing, protocols and services). Clinicians described potential strategies, including: (1) the opportunity for nurse practitioners to act as champions of regular physical activity and triage referrals for physical activity services; (2) opportunistically using the time when patients are in hospital after surgery to discuss physical activity; and (3) for all members of the multidisciplinary team to provide consistent messages to patients about the importance of physical activity. Key barriers to implementation of the physical activity guidelines in lung cancer are diverse and include

  13. Highly reproducible and sensitive silver nanorod array for the rapid detection of Allura Red in candy

    Science.gov (United States)

    Yao, Yue; Wang, Wen; Tian, Kangzhen; Ingram, Whitney Marvella; Cheng, Jie; Qu, Lulu; Li, Haitao; Han, Caiqin

    2018-04-01

    Allura Red (AR) is a highly stable synthetic red azo dye, which is widely used in the food industry to dye food and increase its attraction to consumers. However, the excessive consumption of AR can result in adverse health effects in humans. Therefore, a highly reproducible silver nanorod (AgNR) array was developed for surface-enhanced Raman scattering (SERS) detection of AR in candy. The relative standard deviations (RSDs) of the AgNR substrate obtained from the same batch and from different batches were 5.7% and 11.0%, respectively, demonstrating the high reproducibility. Using these highly reproducible AgNR arrays as the SERS substrates, AR was detected successfully, and its characteristic peaks were assigned by density functional theory (DFT) calculation. The limit of detection (LOD) of AR was determined to be 0.05 mg/L with a wide linear range of 0.8-100 mg/L. Furthermore, the AgNR SERS arrays can detect AR directly in different candy samples within 3 min without any complicated pretreatment. These results suggest the AgNR array can be used for rapid and qualitative SERS detection of AR, holding great promise for expanding SERS application in the food safety control field.
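
For reference, the two figures of merit quoted above can be computed as in the sketch below, using hypothetical replicate intensities and a hypothetical calibration curve; the 3*sigma/slope rule is one common LOD convention and may differ from the method used in the study.

```python
# Sketch: relative standard deviation and a 3-sigma/slope limit-of-detection estimate (hypothetical data).
import numpy as np

replicate_intensities = np.array([1020, 985, 1050, 1001, 968, 1033])   # same-batch SERS peak heights
rsd = 100 * replicate_intensities.std(ddof=1) / replicate_intensities.mean()

conc = np.array([0.8, 5, 10, 25, 50, 100])            # mg/L calibration standards
signal = np.array([55, 310, 640, 1580, 3150, 6300])   # hypothetical peak intensities
slope, intercept = np.polyfit(conc, signal, 1)
sigma_blank = 10.0                                    # hypothetical blank noise (intensity units)
lod = 3 * sigma_blank / slope                         # one common LOD convention

print(f"RSD = {rsd:.1f}%   LOD ~ {lod:.2f} mg/L")
```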

  14. A Conceptual Model of Dyadic Coordination in HIV Care Engagement Among Couples of Black Men Who Have Sex with Men: A Qualitative Dyadic Analysis.

    Science.gov (United States)

    Tan, Judy Y; Campbell, Chadwick K; Tabrisky, Alyssa P; Siedle-Khan, Robert; Conroy, Amy A

    2018-02-20

    Among Black men who have sex with men (MSM), HIV incidence is disproportionately high and HIV care engagement is disproportionately low. There may be important opportunities to leverage the primary relationship to improve engagement in HIV care and treatment among Black MSM couples. Using dyadic qualitative analysis of semi-structured, one-on-one interviews, we explored dyadic aspects of HIV care engagement among 14 Black MSM couples in which at least one partner was HIV-positive and identified as a Black cisgender man. Findings showed that men varied in how involved they were in their HIV-positive partner's care and treatment, and in how they reciprocated their partner's involvement. Patterns of dyadic HIV care engagement supported a conceptual model of dyadic coordination that describes Black MSM relationships in terms of two conceptual dimensions of dyadic HIV care engagement, and guides future intervention designs with Black MSM couples.

  15. Reproducible and controllable induction voltage adder for scaled beam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko [Department of Energy Sciences, Tokyo Institute of Technology, 4259 Nagatsuta, Midori-ku, Yokohama 226-8502 (Japan)

    2016-08-15

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  16. Qualitative feature extractions of chaotic systems

    International Nuclear Information System (INIS)

    Vicha, T.; Dohnal, M.

    2008-01-01

    The theory of chaos offers useful tools for systems analysis. However, models of complex systems are based on a network of inconsistent, sparse and uncertain knowledge items. Traditional quantitative methods of chaos analysis are therefore not applicable. The paper by the same authors [Vicha T, Dohnal M. Qualitative identification of chaotic systems behaviours. Chaos, Solitons and Fractals, in press, [Log. No. 601019]] presents a qualitative interpretation of some chaos concepts. There are only three qualitative values: positive/increasing, negative/decreasing and zero/constant. It means that any set of qualitative multidimensional descriptions of unsteady-state behaviours is discrete and finite. A finite upper limit exists for the total number of qualitatively distinguishable scenarios. A set of 21 published chaotic models is solved qualitatively and 21 sets of all existing qualitative scenarios are presented. The intersection of all 21 scenario sets is empty. There is no behaviour that is common to all 21 models. The set of 21 qualitative models (e.g. Lorenz, Roessler) can be used to compare the chaotic behaviours of an unknown qualitative model against them, to evaluate whether its chaotic behaviour is close to, e.g., the Lorenz chaotic model, and how much
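
A minimal sketch of the three-valued qualitative description mentioned above: mapping a sampled trajectory to (sign of value, sign of first derivative, sign of second derivative) triplets, with a small tolerance standing in for "zero/constant". The tolerance and the test signal are illustrative.

```python
# Sketch: three-valued qualitative description of a sampled trajectory (illustrative tolerance).
import numpy as np

def qualitative_triplet(x, t, tol=1e-3):
    """Return (+/0/-) signs of the value, first derivative and second derivative at each sample."""
    sign = lambda v: "+" if v > tol else "-" if v < -tol else "0"
    d1 = np.gradient(x, t)
    d2 = np.gradient(d1, t)
    return [(sign(a), sign(b), sign(c)) for a, b, c in zip(x, d1, d2)]

t = np.linspace(0, 10, 200)
x = np.sin(t)                                  # stand-in for one state variable of a chaotic model
triplets = qualitative_triplet(x, t)
scenarios = sorted(set(triplets))              # distinct qualitative behaviours seen along the run
print(len(scenarios), "distinct qualitative states, e.g.", scenarios[:4])
```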

  17. Estimation of the Influence of Thin Air Layers on Structures by the Use of Qualitative One-Dimensional Models

    Science.gov (United States)

    Chimeno Manguan, M.; Roibas Millan, E.; Simon Hidalgo, F.

    2014-06-01

    Air layers are regions of air between structural elements that can be found in numerous spacecraft structures. The spaces between folded solar panels and between antennas and a satellite's body are examples of air layers. In some cases, depending on the flexibility of the contiguous structures, the contribution of air layers can noticeably modify the dynamic response of a spacecraft structure. The analysis of these problems in detailed numerical models such as Finite and Boundary Element models is characterised by a very small element size because of the requirements imposed by the thickness of the air layers and the fluid-structure interface. A preliminary assessment of the influence of the air layer therefore allows the development workflow for these elements to be optimized. This work presents a methodology to preliminarily assess the influence of air layers on the structural response. The methodology is based on the definition of simplified one-dimensional models for the structure and the air gaps. The study of these simple models can be a useful tool to determine the degree of influence of the air layers in the system. Along with the introduction of the methodology, a study on several of the model parameters, such as the number of degrees of freedom for the air layer or the structure, is presented. The performance of the methodology is illustrated with results for several cases including actual spacecraft structures.
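
A minimal sketch of the kind of one-dimensional assessment described above: two lumped panel masses coupled by a spring representing the thin air layer, with the layer stiffness approximated by the common adiabatic air-spring formula k ≈ ρc²A/d. All numbers are illustrative, not spacecraft data.

```python
# Sketch: two structural panels coupled by a thin air layer modelled as an air spring (illustrative values).
import numpy as np

rho, c = 1.21, 343.0          # air density (kg/m^3) and speed of sound (m/s)
area, gap = 1.0, 0.02         # panel area (m^2) and air-layer thickness (m)
k_air = rho * c ** 2 * area / gap          # adiabatic air-spring stiffness of the layer

m1, m2 = 5.0, 5.0             # lumped panel masses (kg)
k1, k2 = 2.0e5, 2.0e5         # panel mounting stiffnesses (N/m)

# Equations of motion M x'' + K x = 0, with the air spring coupling the two panels.
M = np.diag([m1, m2])
K = np.array([[k1 + k_air, -k_air],
              [-k_air, k2 + k_air]])

eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
freqs_hz = np.sqrt(np.sort(eigvals.real)) / (2 * np.pi)
print("coupled natural frequencies (Hz):", np.round(freqs_hz, 1))
```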

  18. In-vitro accuracy and reproducibility evaluation of probing depth measurements of selected periodontal probes

    Directory of Open Access Journals (Sweden)

    K.N. Al Shayeb

    2014-01-01

    Conclusion: Depth measurements with the Chapple UB-CF-15 probe were more accurate and reproducible compared to measurements with the Vivacare TPS and Williams 14 W probes. This in vitro model may be useful for intra-examiner calibration or clinician training prior to the clinical evaluation of patients or in longitudinal studies involving periodontal evaluation.

  19. Qualitative analysis in reliability and safety studies

    International Nuclear Information System (INIS)

    Worrell, R.B.; Burdick, G.R.

    1976-01-01

    The qualitative evaluation of system logic models is described as it pertains to assessing the reliability and safety characteristics of nuclear systems. Qualitative analysis of system logic models, i.e., models couched in an event (Boolean) algebra, is defined, and the advantages inherent in qualitative analysis are explained. Certain qualitative procedures that were developed as a part of fault-tree analysis are presented for illustration. Five fault-tree analysis computer-programs that contain a qualitative procedure for determining minimal cut sets are surveyed. For each program the minimal cut-set algorithm and limitations on its use are described. The recently developed common-cause analysis for studying the effect of common-causes of failure on system behavior is explained. This qualitative procedure does not require altering the fault tree, but does use minimal cut sets from the fault tree as part of its input. The method is applied using two different computer programs. 25 refs
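
A minimal sketch of the qualitative step named above, determining minimal cut sets of a small fault tree: OR gates union the cut sets of their children, AND gates take cross-products, and non-minimal supersets are removed. The example tree is hypothetical, not taken from the surveyed programs.

```python
# Sketch: minimal cut sets of a small AND/OR fault tree by top-down expansion (hypothetical tree).
from itertools import product

def cut_sets(node):
    """node is a basic-event name (str) or a tuple ('AND'|'OR', child, child, ...)."""
    if isinstance(node, str):
        return [frozenset([node])]
    op, *children = node
    child_sets = [cut_sets(c) for c in children]
    if op == "OR":
        return [cs for sets in child_sets for cs in sets]
    # AND: every combination of one cut set per child, merged together
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Drop any cut set that strictly contains another cut set."""
    return [s for s in sets if not any(other < s for other in sets)]

top = ("OR",
       ("AND", "pump_fails", "valve_stuck"),
       ("AND", "pump_fails", ("OR", "valve_stuck", "power_loss")),
       "operator_error")

for cs in sorted(set(minimal(cut_sets(top))), key=sorted):
    print(sorted(cs))
```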

  20. Enriched reproducing kernel particle method for fractional advection-diffusion equation

    Science.gov (United States)

    Ying, Yuping; Lian, Yanping; Tang, Shaoqiang; Liu, Wing Kam

    2018-06-01

    The reproducing kernel particle method (RKPM) has been efficiently applied to problems with large deformations, high gradients and high modal density. In this paper, it is extended to solve a nonlocal problem modeled by a fractional advection-diffusion equation (FADE), which exhibits a boundary layer with low regularity. We formulate this method on a moving least-square approach. Via the enrichment of fractional-order power functions to the traditional integer-order basis for RKPM, leading terms of the solution to the FADE can be exactly reproduced, which guarantees a good approximation to the boundary layer. Numerical tests are performed to verify the proposed approach.
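
For context, a typical one-dimensional space-fractional advection-diffusion equation of the kind referred to above can be written as follows (a generic form; the paper's exact formulation, fractional-derivative definition and enrichment terms may differ). For α = 2 it reduces to classical advection-diffusion, while 1 < α < 2 produces the low-regularity boundary layer that motivates enriching the RKPM basis with fractional-order power functions.

```latex
% Generic 1D space-fractional advection-diffusion equation (illustrative form).
\frac{\partial u}{\partial t} + v\,\frac{\partial u}{\partial x}
  = \kappa\,\frac{\partial^{\alpha} u}{\partial x^{\alpha}},
\qquad 1 < \alpha \le 2 .
```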

  1. Soft and hard classification by reproducing kernel Hilbert space methods.

    Science.gov (United States)

    Wahba, Grace

    2002-12-24

    Reproducing kernel Hilbert space (RKHS) methods provide a unified context for solving a wide variety of statistical modelling and function estimation problems. We consider two such problems: We are given a training set [yi, ti, i = 1, …, n], where yi is the response for the ith subject, and ti is a vector of attributes for this subject. The value of yi is a label that indicates which category it came from. For the first problem, we wish to build a model from the training set that assigns to each t in an attribute domain of interest an estimate of the probability pj(t) that a (future) subject with attribute vector t is in category j. The second problem is in some sense less ambitious; it is to build a model that assigns to each t a label, which classifies a future subject with that t into one of the categories or possibly "none of the above." The approach to the first of these two problems discussed here is a special case of what is known as penalized likelihood estimation. The approach to the second problem is known as the support vector machine. We also note some alternate but closely related approaches to the second problem. These approaches are all obtained as solutions to optimization problems in RKHS. Many other problems, in particular the solution of ill-posed inverse problems, can be obtained as solutions to optimization problems in RKHS and are mentioned in passing. We caution the reader that although a large literature exists in all of these topics, in this inaugural article we are selectively highlighting work of the author, former students, and other collaborators.
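
A minimal scikit-learn sketch of the two problems contrasted above, on synthetic data: a "soft" classifier returning class-probability estimates via a penalized-likelihood (logistic) model on an approximate RBF kernel feature map, and a "hard" support vector machine returning labels only. This illustrates the distinction, not the article's exact estimators.

```python
# Sketch: soft (probability) vs hard (label) kernel classifiers on synthetic data.
from sklearn.datasets import make_moons
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.25, random_state=0)

# Soft classification: penalized-likelihood (logistic) model in an approximate RKHS feature space.
soft = make_pipeline(Nystroem(kernel="rbf", gamma=1.0, n_components=100, random_state=0),
                     LogisticRegression(C=1.0, max_iter=1000))
soft.fit(X, y)
print("p(class 1 | t):", soft.predict_proba(X[:3])[:, 1])

# Hard classification: support vector machine returns labels only.
hard = SVC(kernel="rbf", gamma=1.0, C=1.0).fit(X, y)
print("labels:", hard.predict(X[:3]))
```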

  2. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    ...side the limits of a consulting room. ... Reproducibility of ... examination, intraocular pressure and corneal thickness ... All OCT measurements were taken between 2 and 5 pm ... (CAS-OCT, Slit-lamp OCT, RTVue-100) have shown ICC.

  3. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  4. The reproducibility of random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    RAPD) profiles of Streptococcus thermophilus strains by using the polymerase chain reaction (PCR). Several factors can cause the amplification of false and non reproducible bands in the RAPD profiles. We tested three primers, OPI-02 MOD, ...

  5. Digital Video as a Personalized Learning Assignment: A Qualitative Study of Student Authored Video Using the ICSDR Model

    Science.gov (United States)

    Campbell, Laurie O.; Cox, Thomas D.

    2018-01-01

    Students within this study followed the ICSDR (Identify, Conceptualize/Connect, Storyboard, Develop, Review/Reflect/Revise) development model to create digital video, as a personalized and active learning assignment. The participants, graduate students in education, indicated that following the ICSDR framework for student-authored video guided…

  6. The development of a model of dignity in illness based on qualitative interviews with seriously ill patients

    NARCIS (Netherlands)

    van Gennip, Isis E.; Pasman, H. Roeline W.; Oosterveld-Vlug, Mariska G.; Willems, Dick L.; Onwuteaka-Philipsen, Bregje D.

    2013-01-01

    While knowledge on factors affecting personal dignity of patients nearing death is quite substantial, far less is known about how patients living with a serious disease understand dignity. To develop a conceptual model of dignity that illuminates the process by which serious illness can undermine

  7. The Impact of Three-Dimensional Computational Modeling on Student Understanding of Astronomy Concepts: A Qualitative Analysis. Research Report

    Science.gov (United States)

    Hansen, John A.; Barnett, Michael; MaKinster, James G.; Keating, Thomas

    2004-01-01

    In this study, we explore an alternate mode for teaching and learning the dynamic, three-dimensional (3D) relationships that are central to understanding astronomical concepts. To this end, we implemented an innovative undergraduate course in which we used inexpensive computer modeling tools. As the second of a two-paper series, this report…

  8. The development of a model of dignity in illness based on qualitative interviews with seriously ill patients

    NARCIS (Netherlands)

    van Gennip, I.E.; Pasman, H.R.W.; Oosterveld-Vlug, M.G.; Willems, D.L.; Onwuteaka-Philipsen, B.D.

    2013-01-01

    Background: While knowledge on factors affecting personal dignity of patients nearing death is quite substantial, far less is known about how patients living with a serious disease understand dignity. Objective: To develop a conceptual model of dignity that illuminates the process by which serious

  9. Entangled states that cannot reproduce original classical games in their quantum version

    International Nuclear Information System (INIS)

    Shimamura, Junichi; Oezdemir, S.K.; Morikoshi, Fumiaki; Imoto, Nobuyuki

    2004-01-01

    A model of a quantum version of classical games should reproduce the original classical games in order to be able to make a comparative analysis of quantum and classical effects. We analyze a class of symmetric multipartite entangled states and their effect on the reproducibility of the classical games. We present the necessary and sufficient condition for the reproducibility of the original classical games. Satisfying this condition means that complete orthogonal bases can be constructed from a given multipartite entangled state provided that each party is restricted to two local unitary operators. We prove that most of the states belonging to the class of symmetric states with respect to permutations, including the N-qubit W state, do not satisfy this condition
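
    For reference, the N-qubit W state mentioned in the abstract is the equal-weight superposition of all computational-basis states containing exactly one excitation (this is the standard definition, not taken verbatim from the article):

      \[
        |W_N\rangle = \frac{1}{\sqrt{N}}
        \bigl( |10\cdots 0\rangle + |01\cdots 0\rangle + \cdots + |00\cdots 1\rangle \bigr).
      \]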

  10. Experiences of Community-Living Older Adults Receiving Integrated Care Based on the Chronic Care Model: A Qualitative Study.

    Science.gov (United States)

    Spoorenberg, Sophie L W; Wynia, Klaske; Fokkens, Andrea S; Slotman, Karin; Kremer, Hubertus P H; Reijneveld, Sijmen A

    2015-01-01

    Integrated care models aim to solve the problem of fragmented and poorly coordinated care in current healthcare systems. These models aim to be patient-centered by providing continuous and coordinated care and by considering the needs and preferences of patients. The objective of this study was to evaluate the opinions and experiences of community-living older adults with regard to integrated care and support, along with the extent to which it meets their health and social needs. Semi-structured interviews were conducted with 23 older adults receiving integrated care and support through "Embrace," an integrated care model for community-living older adults that is based on the Chronic Care Model and a population health management model. Embrace is currently fully operational in the northern region of the Netherlands. Data analysis was based on the grounded theory approach. Responses of participants concerned two focus areas: 1) Experiences with aging, with the themes "Struggling with health," "Increasing dependency," "Decreasing social interaction," "Loss of control," and "Fears;" and 2) Experiences with Embrace, with the themes "Relationship with the case manager," "Interactions," and "Feeling in control, safe, and secure". The prospect of becoming dependent and losing control was a key concept in the lives of the older adults interviewed. Embrace reinforced the participants' ability to stay in control, even if they were dependent on others. Furthermore, participants felt safe and secure, in contrast to the fears of increasing dependency within the standard care system. The results indicate that integrated care and support provided through Embrace met the health and social needs of older adults, who were coping with the consequences of aging.

  11. What is eHealth (6)? Development of a Conceptual Model for eHealth: Qualitative Study with Key Informants.

    Science.gov (United States)

    Shaw, Tim; McGregor, Deborah; Brunner, Melissa; Keep, Melanie; Janssen, Anna; Barnet, Stewart

    2017-10-24

    Despite rapid growth in eHealth research, there remains a lack of consistency in defining and using terms related to eHealth. More widely cited definitions provide broad understanding of eHealth but lack sufficient conceptual clarity to operationalize eHealth and enable its implementation in health care practice, research, education, and policy. Definitions that are more detailed are often context or discipline specific, limiting ease of translation of these definitions across the breadth of eHealth perspectives and situations. A conceptual model of eHealth that adequately captures its complexity and potential overlaps is required. This model must also be sufficiently detailed to enable eHealth operationalization and hypothesis testing. This study aimed to develop a conceptual practice-based model of eHealth to support health professionals in applying eHealth to their particular professional or discipline contexts. We conducted semistructured interviews with key informants (N=25) from organizations involved in health care delivery, research, education, practice, governance, and policy to explore their perspectives on and experiences with eHealth. We used purposeful sampling for maximum diversity. Interviews were coded and thematically analyzed for emergent domains. Thematic analyses revealed 3 prominent but overlapping domains of eHealth: (1) health in our hands (using eHealth technologies to monitor, track, and inform health), (2) interacting for health (using digital technologies to enable health communication among practitioners and between health professionals and clients or patients), and (3) data enabling health (collecting, managing, and using health data). These domains formed a model of eHealth that addresses the need for clear definitions and a taxonomy of eHealth while acknowledging the fluidity of this area and the strengths of initiatives that span multiple eHealth domains. This model extends current understanding of eHealth by providing clearly

  12. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.
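
    To make the contrast with rigorous standardization concrete, the sketch below allocates a hypothetical cohort across mini-experiments that systematically vary two environmental factors; the factor names, levels, and cohort size are illustrative assumptions, not taken from the article.

      import itertools
      import random

      # Two environmental factors varied systematically (hypothetical levels).
      factors = {"housing": ["standard", "enriched"], "test_age_weeks": [8, 12]}
      conditions = list(itertools.product(*factors.values()))  # 4 mini-experiments

      # Hypothetical cohort of 16 animals, randomly allocated across conditions.
      animals = [f"animal_{i:02d}" for i in range(16)]
      random.seed(1)
      random.shuffle(animals)
      batches = {cond: animals[i::len(conditions)] for i, cond in enumerate(conditions)}

      for cond, group in batches.items():
          print(dict(zip(factors, cond)), group)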

  13. Evaluation of how a curriculum change in nurse education was managed through the application of a business change management model: A qualitative case study.

    Science.gov (United States)

    Chowthi-Williams, Annette; Curzio, Joan; Lerman, Stephen

    2016-01-01

    Curriculum changes are a regular feature of nurse education, yet little is known about how such changes are managed. Research in this arena is yet to emerge. Evaluation of how a curriculum change in nurse education was managed through the application of a business change management model. A qualitative case study: the single case was the new curriculum, the Primary Care Pathway. One executive, three senior managers, two academics and nineteen students participated in this study in one faculty of health and social care in a higher education institution. The findings suggest that leadership was pivotal to the inception of the programme and guiding teams managed the change and did not take on a leadership role. The vision for the change and efforts to communicate it did not reach the frontline. Whilst empowerment was high amongst stakeholders and students, academics felt dis-empowered. Short-term wins were not significant in keeping up the momentum of change. The credibility of the change was under challenge and the concept of the new programme was not yet embedded in academia. Differences between the strategic and operational part of the organisation surfaced with many challenges occurring at the implementation stage. The business change model used was valuable, but was found to not be applicable during curriculum changes in nurse education. A new change model emerged, and a tool was developed alongside to aid future curriculum changes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. What else are psychotherapy trainees learning? A qualitative model of students' persona