WorldWideScience

Sample records for modeling system reproduced

  1. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.
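The call above for standards that explicitly document every data source and assumption can be illustrated with a minimal sketch of a machine-readable parameter record; the field names, the example parameter, and the validation rule below are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical provenance record for a single model parameter. The
# abstract argues every value should carry its source and assumptions;
# all field names here are illustrative, not from the paper.
parameter_record = {
    "parameter": "k_cat_rnap",  # hypothetical parameter name
    "value": 45.0,
    "units": "1/s",
    "source": {"doi": "10.0000/example", "table": "S2"},
    "assumption": "measured in vitro at 30 C; assumed valid in vivo",
}

REQUIRED_FIELDS = {"parameter", "value", "units", "source", "assumption"}

def validate(record):
    """Reject records that omit a data source or an explicit assumption."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"undocumented fields: {sorted(missing)}")
    return True

assert validate(parameter_record)
```

A build tool could run such a check over every parameter before simulation, making undocumented assumptions a hard error rather than a silent omission.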

  2. Reproducibility in a multiprocessor system

    Science.gov (United States)

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.
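Two of the listed aspects, a deterministic start state and reproducible execution, have a simple software analogy; the patent concerns hardware, so the Python sketch below is only illustrative:

```python
import random

# Illustrative analogy only: seeding every source of randomness
# identically gives a deterministic start state, so two independent
# runs are bit-identical and a failure seen once can be replayed.
def run_system(seed, steps=5):
    rng = random.Random(seed)  # deterministic system start state
    state = 0
    trace = []
    for _ in range(steps):
        state = (state + rng.randrange(100)) % 97
        trace.append(state)
    return trace

# Reproducible execution: the same seed always yields the same trace.
assert run_system(42) == run_system(42)
```

In hardware, the analogous guarantees come from the single system clock, phase-aligned clocks, and deterministic chip interfaces listed in the abstract.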

  3. Coupled RipCAS-DFLOW (CoRD) Software and Data Management System for Reproducible Floodplain Vegetation Succession Modeling

    Science.gov (United States)

    Turner, M. A.; Miller, S.; Gregory, A.; Cadol, D. D.; Stone, M. C.; Sheneman, L.

    2016-12-01

    We present the Coupled RipCAS-DFLOW (CoRD) modeling system created to encapsulate the workflow to analyze the effects of stream flooding on vegetation succession. CoRD provides an intuitive command-line and web interface to run DFLOW and RipCAS in succession over many years automatically, which is a challenge because, for our application, DFLOW must be run on a supercomputing cluster via the PBS job scheduler. RipCAS is a vegetation succession model, and DFLOW is a 2D open channel flow model. Data adaptors have been developed to seamlessly convert DFLOW outputs into RipCAS inputs, and vice versa. CoRD provides automated statistical analysis and visualization, plus automatic syncing of input and output files and model run metadata to the hydrological data management system HydroShare using its excellent Python REST client. This combination of technologies and data management techniques allows the results to be shared with collaborators and eventually published. Perhaps most importantly, it allows results to be easily reproduced via either the command-line or web user interface. This system is a result of collaboration between software developers and hydrologists participating in the Western Consortium for Watershed Analysis, Visualization, and Exploration (WC-WAVE). Because of the computing-intensive nature of this particular workflow, including automating job submission/monitoring and data adaptors, software engineering expertise is required. However, the hydrologists provide the software developers with a purpose and ensure a useful, intuitive tool is developed. Our hydrologists contribute software, too: RipCAS was developed from scratch by hydrologists on the team as a specialized, open-source version of the Computer Aided Simulation Model for Instream Flow and Riparia (CASiMiR) vegetation model; our hydrologists running DFLOW provided numerous examples and help with the supercomputing system. This project is written in Python, a popular language in the

  4. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

    The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking, and the lack of open, transparent and fair benchmark sets, presents another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  5. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators, (2) shared computational resources, (3) declarative model descriptors, ontologies and standardized annotations, and (4) model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in the development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  6. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in the microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using the commercial software STAR-CCM+, both with constant and randomly varying injection rates. Stochastic simulations are able to capture the variability in the experiments associated with the varying pump injection rate.
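The boundary-condition variability described above can be mimicked with a simple stochastic sketch; the Gaussian noise model and the 5% relative fluctuation are assumptions for illustration, not values taken from the study:

```python
import random

# Sketch of a pump injection rate with random fluctuations around a
# nominal value. The Gaussian noise model and 5% amplitude are
# illustrative assumptions, not parameters from the paper.
def injection_rate_series(nominal=1.0, rel_noise=0.05, n=100, seed=0):
    rng = random.Random(seed)
    return [nominal * (1.0 + rng.gauss(0.0, rel_noise)) for _ in range(n)]

rates = injection_rate_series()
mean_rate = sum(rates) / len(rates)
# Replicates with different seeds play the role of the six nominally
# identical microcells: same nominal condition, different realizations.
replicates = [injection_rate_series(seed=s) for s in range(6)]
```

Feeding such perturbed series into the flow solver, rather than a constant rate, is what lets the stochastic simulations reproduce the experiment-to-experiment spread.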

  7. Skills of General Circulation and Earth System Models in reproducing streamflow to the ocean: the case of Congo river

    Science.gov (United States)

    Santini, M.; Caporaso, L.

    2017-12-01

    Despite the importance of water resources in the context of climate change, it is still difficult to correctly simulate the freshwater cycle over land with General Circulation and Earth System Models (GCMs and ESMs). Existing efforts from the Coupled Model Intercomparison Project Phase 5 (CMIP5) were mainly devoted to the validation of atmospheric variables like temperature and precipitation, with little attention to discharge. Here we investigate the present-day performance of GCMs and ESMs participating in CMIP5 in simulating the discharge of the river Congo to the sea, thanks to: i) the long-term availability of discharge data for the Kinshasa hydrological station, representative of more than 95% of the water flowing in the whole catchment; and ii) the river's still-low degree of human intervention, which enables comparison with the (mostly) natural streamflow simulated within CMIP5. Our findings suggest that most models overestimate the streamflow in terms of seasonal cycle, especially in late winter and spring, while overestimation and variability across models are lower in late summer. Weighted ensemble means are also calculated, based on simulation performance as measured by several metrics, showing some improvement in the results. Although simulated inter-monthly and inter-annual percent anomalies do not appear significantly different from those in observed data, when translated into well-consolidated indicators of drought attributes (frequency, magnitude, timing, duration), usually adopted for more immediate communication to stakeholders and decision makers, such anomalies can be misleading. These inconsistencies produce incorrect assessments for water management planning and infrastructure (e.g. dams or irrigated areas), especially if models are used instead of measurements, as in the case of ungauged basins or basins with insufficient data, or when relying on models for future estimates without a preliminary quantification of model biases.

  8. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
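The core numerical effect, tiny single-precision rounding differences compounding over long pipelines, can be demonstrated with a short sketch; the 0.1 increment and the iteration count are arbitrary illustrations, not taken from the study:

```python
import struct

def f32(x):
    # Round to IEEE single precision via pack/unpack; this mimics the
    # single-precision math the study identifies as a source of
    # cross-platform differences.
    return struct.unpack("f", struct.pack("f", x))[0]

def accumulate(n, single_precision):
    """Sum 0.1 n times, optionally rounding to float32 at every step."""
    total = 0.0
    for _ in range(n):
        total = f32(total + 0.1) if single_precision else total + 0.1
    return total

double = accumulate(100_000, single_precision=False)
single = accumulate(100_000, single_precision=True)
drift = abs(double - single)
# Each individual rounding is below 1e-3, yet the accumulated drift is
# clearly visible; the same mechanism lets evolving libm
# implementations push long pipelines toward divergent results.
```

This is why the authors suggest moving critical pipeline sections to more precise floating-point representations as a first corrective step.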

  9. 3D-modeling of the spine using EOS imaging system: Inter-reader reproducibility and reliability.

    Directory of Open Access Journals (Sweden)

    Johannes Rehm

    To retrospectively assess the interreader reproducibility and reliability of EOS 3D full spine reconstructions in patients with adolescent idiopathic scoliosis (AIS). 73 patients with a mean age of 17 years and moderate AIS (median Cobb angle 18.2°) obtained low-dose standing biplanar radiographs with EOS. Two independent readers performed 3D reconstructions of the spine with the "full-spine" method, adjusting the bone contour of every thoracic and lumbar vertebra (Th1-L5). Interreader reproducibility was assessed regarding rotation of every single vertebra in the coronal (i.e. frontal), sagittal (i.e. lateral) and axial planes; T1/T12 kyphosis; T4/T12 kyphosis; L1/L5 lordosis; L1/S1 lordosis; and pelvic parameters. Radiation exposure, scan time and 3D reconstruction time were recorded. Intraclass correlation (ICC) ranged between 0.83 and 0.98 for frontal vertebral rotation, between 0.94 and 0.99 for lateral vertebral rotation and between 0.51 and 0.88 for axial vertebral rotation. ICC was 0.92 for T1/T12 kyphosis, 0.95 for T4/T12 kyphosis, 0.90 for L1/L5 lordosis, 0.85 for L1/S1 lordosis, 0.97 for pelvic incidence, 0.96 for sacral slope, 0.98 for sagittal pelvic tilt and 0.94 for lateral pelvic tilt. The mean time for reconstruction was 14.9 minutes (reader 1: 14.6 minutes; reader 2: 15.2 minutes; p<0.0001). The mean total absorbed dose was 593.4 μGy ± 212.3 per patient. EOS full-spine 3D angle measurement of vertebral rotation proved to be reliable and was performed in an acceptable reconstruction time. Interreader reproducibility of axial rotation was somewhat limited in the upper and middle thoracic spine, due to the obtuse angulation of the pedicles and the spinous processes in the frontal view, which complicates their delineation.

  10. Measurement of cerebral blood flow by intravenous xenon-133 technique and a mobile system. Reproducibility using the Obrist model compared to total curve analysis

    DEFF Research Database (Denmark)

    Schroeder, T; Holstein, P; Lassen, N A

    1986-01-01

    and side-to-side asymmetry. Data were analysed according to the Obrist model and the results compared with those obtained using a model correcting for the air passage artifact. Reproducibility was of the same order of magnitude as reported using stationary equipment. The side-to-side CBF asymmetry...... was considerably more reproducible than CBF level. Using a single detector instead of five regional values averaged as the hemispheric flow increased standard deviation of CBF level by 10-20%, while the variation in asymmetry was doubled. In optimal measuring conditions the two models revealed no significant...... differences, but in low flow situations the artifact model yielded significantly more stable results. The present apparatus, equipped with 3-5 detectors covering each hemisphere, offers the opportunity of performing serial CBF measurements in situations not otherwise feasible....

  11. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

    Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing, and thinking about, complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  12. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

    Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project, the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style in which we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version-controlled) digital notebooks that illustrate and record analysis of output. These serve the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.

  13. How well can DFT reproduce key interactions in Ziegler-Natta systems?

    KAUST Repository

    Correa, Andrea; Bahri-Laleh, Naeimeh; Cavallo, Luigi

    2013-01-01

    The performance of density functional theory in reproducing some of the main interactions occurring in MgCl2-supported Ziegler-Natta catalytic systems is assessed. Eight model systems, representatives of key interactions occurring in Ziegler

  14. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of the computer-aided detection (CAD) system for digital mammograms. We applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full-field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean: 25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7, age range: 20-66 years, mean age: 47.9 years). On the mammograms, lesions were visible in 19 patients and these were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR) and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15) and 67% (10 of 15) for masses, and 100% (12 of 12) for calcification clusters. The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses, and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR for the masses was 0.21-0.22/image, the FPR for the clusters was 0.03-0.04/image and the total FPR was 0.24-0.26/image. Among 132 mammography images, the proportion of identical images regardless of the existence of CAD marks was 59% (78 of 132), and that of identical images with CAD marks was 22% (15 of 69). The reproducibility of the CAD marks was 67% (12 of 18) for the true positive masses and 71% (17 of 24) for the true positive clusters. The reproducibility of the CAD marks for the false positive masses was 8% (4 of 53), and that for the false positive clusters was 14% (1 of 7). The reproducibility of the total mass marks was 23% (16 of 71), and the reproducibility of the total cluster marks was 58% (18 of 31). The CAD system showed higher sensitivity and reproducibility of CAD marks for the calcification clusters, which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied with this limitation in mind.

  15. Modeling and evaluating repeatability and reproducibility of ordinal classifications

    NARCIS (Netherlands)

    de Mast, J.; van Wieringen, W.N.

    2010-01-01

    This paper argues that currently available methods for the assessment of the repeatability and reproducibility of ordinal classifications are not satisfactory. The paper aims to study whether we can modify a class of models from Item Response Theory, well established for the study of the reliability

  16. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
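The five classes of digital objects itemized above can be expressed as a minimal resource manifest with a completeness check; the field names and example file names below are hypothetical illustrations, not HydroShare's actual metadata schema:

```python
# Hypothetical manifest covering the five digital objects the abstract
# says must be tracked for reproducible hydrologic modeling. Field
# names and file names are illustrative only.
resource = {
    "raw_data": ["precip_raw.csv"],
    "processing_scripts": ["clean_inputs.py"],
    "model_inputs": ["forcing.nc"],
    "model_results": ["streamflow.csv"],
    "model_code": {"name": "SUMMA", "dependencies": ["netCDF4"]},
}

REQUIRED_OBJECTS = {"raw_data", "processing_scripts", "model_inputs",
                    "model_results", "model_code"}

def is_complete(res):
    """A resource supports replication only if all five classes are present."""
    return REQUIRED_OBJECTS <= res.keys()

assert is_complete(resource)
```

A repository could enforce such a check at publication time, so that a shared resource is rejected until every object class needed to rerun the model is attached.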

  17. Reproducible analyses of microbial food for advanced life support systems

    Science.gov (United States)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating the overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses was required to obtain accurate results in two strains of Candida. A profile of the edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  18. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    Science.gov (United States)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

    This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed to reproduce various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes for peculiar rotational patterns in the outermost as well as in the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers; the latter were reproduced in the models by silicone. The sand forming the models had previously been mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity varying between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along the curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating-field demagnetization was used to isolate the principal components. The characteristic components of magnetization thus isolated were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward-propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms

  19. From alginate impressions to digital virtual models: accuracy and reproducibility.

    Science.gov (United States)

    Dalstra, Michel; Melsen, Birte

    2009-03-01

    To compare the accuracy and reproducibility of measurements performed on digital virtual models with those taken on plaster casts from models poured immediately after the impression was taken, the 'gold standard', and from plaster models poured following a 3-5 day shipping procedure of the alginate impression. Direct comparison of two measuring techniques. The study was conducted at the Department of Orthodontics, School of Dentistry, University of Aarhus, Denmark in 2006/2007. Twelve randomly selected orthodontic graduate students with informed consent. Three sets of alginate impressions were taken from the participants within 1 hour. Plaster models were poured immediately from two of the sets, while the third set was kept in transit in the mail for 3-5 days. Upon return a plaster model was poured as well. Finally digital models were made from the plaster models. A number of measurements were performed on the plaster casts with a digital calliper and on the corresponding digital models using the virtual measuring tool of the accompanying software. Afterwards these measurements were compared statistically. No statistical differences were found between the three sets of plaster models. The intra- and inter-observer variability are smaller for the measurements performed on the digital models. Sending alginate impressions by mail does not affect the quality and accuracy of plaster casts poured from them afterwards. Virtual measurements performed on digital models display less variability than the corresponding measurements performed with a calliper on the actual models.

  20. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    Science.gov (United States)

    Turner, M. A.

    2015-12-01

    Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. Under global warming trends where water resources are under increasing stress, reproducible hydrological modeling will be increasingly important to improve transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of

  1. The Web system of visualization and analysis equipped with reproducibility

    International Nuclear Information System (INIS)

    Ueshima, Yutaka; Saito, Kanji; Takeda, Yasuhiro; Nakai, Youichi; Hayashi, Sachiko

    2005-01-01

    In the advanced photon experimental research, real-time visualization and steering system is thought as desirable method of data analysis. This approach is valid only in the fixed analysis at one time or in the easily reproducible experiment. But, in the research for an unknown problem like the advanced photon experimental research, it is necessary that the observation data can be analyzed many times because profitable analysis is difficult at the first time. Consequently, output data should be filed to refer and analyze at any time. To support the research, we need the followed automatic functions, transporting data files from data generator to data storage, analyzing data, tracking history of data handling, and so on. The supporting system will be integrated database system with several functional servers distributed on the network. (author)

  2. A reproducible brain tumour model established from human glioblastoma biopsies

    International Nuclear Information System (INIS)

    Wang, Jian; Chekenya, Martha; Bjerkvig, Rolf; Enger, Per Ø; Miletic, Hrvoje; Sakariassen, Per Ø; Huszthy, Peter C; Jacobsen, Hege; Brekkå, Narve; Li, Xingang; Zhao, Peng; Mørk, Sverre

    2009-01-01

    Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. In this work, we collected data on growth kinetics from 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours, while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  3. Can a coupled meteorology–chemistry model reproduce the ...

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  4. Measurement System Analyses - Gauge Repeatability and Reproducibility Methods

    Science.gov (United States)

    Cepova, Lenka; Kovacikova, Andrea; Cep, Robert; Klaput, Pavel; Mizera, Ondrej

    2018-02-01

    The submitted article focuses on a detailed explanation of the average and range method (Automotive Industry Action Group, Measurement System Analysis approach) and of the honest Gauge Repeatability and Reproducibility method (Evaluating the Measurement Process approach). The measured data (thickness of plastic parts) were evaluated by both methods and their results were compared by numerical evaluation. The two methods were also compared directly, and their advantages and disadvantages were discussed. One difference between the methods is the calculation of variation components. The AIAG method calculates the variation components from standard deviations (so the sum of the variation components does not equal 100 %), whereas the honest GRR study calculates the variation components from variances, where the sum of all variation components (part-to-part variation, EV & AV) gives the total variation of 100 %. Acceptance of both methods in the professional community, their future use, and their acceptance by the manufacturing industry were also discussed. Nowadays, the AIAG method is the leading one in industry.
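The variance-versus-standard-deviation point above can be illustrated with a minimal sketch (the standard deviations below are hypothetical, not data from the article):

```python
import math

# Hypothetical standard deviations for each source of variation
# (illustrative values only, not taken from the study)
ev = 0.5   # equipment variation (repeatability)
av = 0.3   # appraiser variation (reproducibility)
pv = 1.0   # part-to-part variation

total_var = ev**2 + av**2 + pv**2      # variances add
total_sd = math.sqrt(total_var)

# Honest GRR (EMP) style: contributions based on variance sum to 100 %
pct_var = {s: 100 * v**2 / total_var for s, v in
           {"EV": ev, "AV": av, "PV": pv}.items()}

# AIAG "% study variation" style: based on standard deviation,
# so the individual percentages do not sum to 100 %
pct_sd = {s: 100 * v / total_sd for s, v in
          {"EV": ev, "AV": av, "PV": pv}.items()}

print(round(sum(pct_var.values()), 1))  # 100.0
print(round(sum(pct_sd.values()), 1))   # about 155.5, i.e. over 100 %
```

This is exactly the discrepancy the abstract describes: percentages of standard deviation are not additive, while percentages of variance are.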

  5. Modelling soil erosion at European scale: towards harmonization and reproducibility

    Science.gov (United States)

    Bosco, C.; de Rigo, D.; Dewitte, O.; Poesen, J.; Panagos, P.

    2015-02-01

    Soil erosion by water is one of the most widespread forms of soil degradation. The loss of soil as a result of erosion can lead to decline in organic matter and nutrient contents, breakdown of soil structure and reduction of the water-holding capacity. Measuring soil loss across the whole landscape is impractical and thus research is needed to improve methods of estimating soil erosion with computational modelling, upon which integrated assessment and mitigation strategies may be based. Despite these efforts, the predictive value of existing models is still limited, especially at regional and continental scale, because systematic knowledge of local climatological and soil parameters is often unavailable. A new approach for modelling soil erosion at regional scale is proposed here. It is based on the joint use of low-data-demanding models and innovative techniques for better estimating model inputs. The proposed modelling architecture has at its basis the semantic array programming paradigm and a strong effort towards computational reproducibility. An extended version of the Revised Universal Soil Loss Equation (RUSLE) has been implemented, merging different empirical rainfall-erosivity equations within a climatic ensemble model and adding a new factor for better consideration of soil stoniness within the model. Pan-European soil erosion rates by water have been estimated through the use of publicly available data sets and locally reliable empirical relationships. The accuracy of the results is corroborated by a visual plausibility check (63% of a random sample of grid cells are accurate, 83% at least moderately accurate, bootstrap p ≤ 0.05). A comparison with country-level statistics of pre-existing European soil erosion maps is also provided.
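The RUSLE structure referred to above is a simple product of factors; a minimal sketch follows (the extra stoniness factor `St` and all parameter values are assumptions for illustration, not values from the paper):

```python
def rusle_soil_loss(R, K, LS, C, P, St=1.0):
    """Annual soil loss A (t ha^-1 yr^-1) as the RUSLE product of factors.

    R  - rainfall erosivity (MJ mm ha^-1 h^-1 yr^-1)
    K  - soil erodibility (t ha h ha^-1 MJ^-1 mm^-1)
    LS - combined slope length and steepness factor (dimensionless)
    C  - cover management factor (dimensionless)
    P  - support practice factor (dimensionless)
    St - stoniness correction (hypothetical extension factor, dimensionless)
    """
    return R * K * LS * C * P * St

# Illustrative grid-cell values (assumed, not from the study)
print(rusle_soil_loss(R=700, K=0.03, LS=1.2, C=0.2, P=1.0))  # ~5.04 t/ha/yr
```

Because the model is multiplicative, a stoniness factor below 1 uniformly reduces the estimated loss on stony soils, which is the kind of correction the abstract describes.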

  6. A reproducible brain tumour model established from human glioblastoma biopsies

    Directory of Open Access Journals (Sweden)

    Li Xingang

    2009-12-01

    Background: Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised with consistently high take rates. Methods: In this work, we collected data on growth kinetics from 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. Results: The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower grade tumours engrafted. Average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours, while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. Conclusions: In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  7. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

    There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes, and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10 minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5 second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  8. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

    The osteosarcoma model was shown to be feasible: the take rate was high, surgical mortality was negligible, and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement, and may be recommended for monitoring tumor growth in this model.

  9. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    Science.gov (United States)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first order equation, which can be solved analytically. Moreover, we endeavor to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from the medicinal plants.
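A first-order simplification of the kind described above can be sketched as follows (a generic form assumed for illustration; the paper's actual equation is not reproduced here). If the peroxidation product $P$ accumulates towards a plateau $P_{\infty}$,

```latex
\frac{dP}{dt} = k\left(P_{\infty} - P\right),
\qquad
P(t) = P_{\infty}\left(1 - e^{-kt}\right),
```

then the kinetics are solvable analytically, and an antioxidant at concentration $[I]$ can be represented, under a common competitive-inhibition assumption, as lowering the effective rate constant, e.g. $k_{\mathrm{eff}} = k/(1 + [I]/K_I)$ with a hypothetical inhibition constant $K_I$.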

  10. Reproducible and expedient rice regeneration system using in vitro ...

    African Journals Online (AJOL)

    An inevitable prerequisite for expedient regeneration in rice is the selection of a totipotent explant and the development of an apposite combination of growth hormones. Here, we report a reproducible regeneration protocol in which basal segments of the stem of in vitro grown rice plants were used as the explant. Using the protocol ...

  11. Systemic thioridazine in combination with dicloxacillin against early aortic graft infections caused by Staphylococcus aureus in a porcine model: In vivo results do not reproduce the in vitro synergistic activity.

    Directory of Open Access Journals (Sweden)

    Michael Stenger

    Conservative treatment solutions against aortic prosthetic vascular graft infection (APVGI) for inoperable patients are limited. The combination of antibiotics with antibacterial helper compounds, such as the neuroleptic drug thioridazine (TDZ), should be explored. To investigate the efficacy of conservative systemic treatment with dicloxacillin (DCX) in combination with TDZ (DCX+TDZ), compared to DCX alone, against early APVGI caused by methicillin-sensitive Staphylococcus aureus (MSSA) in a porcine model. The synergism of DCX+TDZ against MSSA was initially assessed in vitro by viability assay. Thereafter, thirty-two pigs had polyester grafts implanted in the infrarenal aorta, followed by inoculation with 10⁶ CFU of MSSA, and were randomly administered oral systemic treatment with either (1) DCX or (2) DCX+TDZ. Treatment was initiated one week postoperatively and continued for a further 21 days. Weight, temperature, and blood samples were collected at predefined intervals. At termination, bacterial quantities from the graft surface, graft material, and perigraft tissue were obtained. Despite in vitro synergism, the porcine experiment revealed no statistical differences for bacteriological endpoints between the two treatment groups, and neither treatment eradicated the APVGI. Accordingly, the mixed model analyses of weight, temperature, and blood samples revealed no statistical differences. Conservative systemic treatment with DCX+TDZ did not reproduce the in vitro results against APVGI caused by MSSA in this porcine model. However, unexpected severe adverse effects related to the planned dose of TDZ required a considerable reduction to the administered dose of TDZ, which may have compromised the results.

  12. The construction of a two-dimensional reproducing kernel function and its application in a biomedical model.

    Science.gov (United States)

    Guo, Qi; Shen, Shu-Ting

    2016-04-29

    There are two major classes of cardiac tissue models: the ionic model and the FitzHugh-Nagumo model. During computer simulation, each model entails solving a system of complex ordinary differential equations and a partial differential equation with non-flux boundary conditions. The reproducing kernel method possesses significant applications in solving partial differential equations. The derivative of the reproducing kernel function is a wavelet function, which has local properties and sensitivity to singularities. Therefore, study of the application of reproducing kernels would be advantageous. The objective is to apply new mathematical theory to the numerical solution of the ventricular muscle model so as to improve its precision in comparison with other current methods. A two-dimensional reproducing kernel function is constructed and applied in computing the solution of the two-dimensional cardiac tissue model by means of the difference method through time and the reproducing kernel method through space. Compared with other methods, this method holds several advantages, such as high accuracy in computing solutions, insensitivity to different time steps and a slow propagation speed of error. It is suitable for disorderly scattered node systems without meshing, and can arbitrarily change the location and density of the solution on different time layers. The reproducing kernel method has higher solution accuracy and stability in the solutions of the two-dimensional cardiac tissue model.
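For background, the defining property of a reproducing kernel and the standard product construction of a two-dimensional kernel from one-dimensional ones can be sketched as follows (textbook definitions, not details specific to this paper). For a reproducing kernel Hilbert space $H$ with kernel $K$,

```latex
f(y) = \left\langle f(\cdot),\, K(\cdot, y) \right\rangle_{H}
\quad \text{for all } f \in H,
\qquad
K\bigl((x_1, x_2),\,(y_1, y_2)\bigr) = K_1(x_1, y_1)\, K_2(x_2, y_2).
```

The product form on the right is what makes a two-dimensional kernel computable from two one-dimensional kernels, which is the usual route to constructions like the one described in the abstract.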

  13. Evaluation of 12 blood glucose monitoring systems for self-testing: system accuracy and measurement reproducibility.

    Science.gov (United States)

    Freckmann, Guido; Baumstark, Annette; Schmid, Christina; Pleus, Stefan; Link, Manuela; Haug, Cornelia

    2014-02-01

    Systems for self-monitoring of blood glucose (SMBG) have to provide accurate and reproducible blood glucose (BG) values in order to ensure adequate therapeutic decisions by people with diabetes. Twelve SMBG systems were compared in a standardized manner under controlled laboratory conditions: nine systems were available on the German market and were purchased from a local pharmacy, and three systems were obtained from the manufacturer (two systems were available on the U.S. market, and one system was not yet introduced to the German market). System accuracy was evaluated following DIN EN ISO (International Organization for Standardization) 15197:2003. In addition, measurement reproducibility was assessed following a modified TNO (Netherlands Organization for Applied Scientific Research) procedure. Comparison measurements were performed with either the glucose oxidase method (YSI 2300 STAT Plus™ glucose analyzer; YSI Life Sciences, Yellow Springs, OH) or the hexokinase method (cobas® c111; Roche Diagnostics GmbH, Mannheim, Germany) according to the manufacturer's measurement procedure. The 12 evaluated systems showed between 71.5% and 100% of the measurement results within the required system accuracy limits. With the evaluated test strip lot, ten systems fulfilled the minimum accuracy requirements specified by DIN EN ISO 15197:2003. In addition, the accuracy limits of the recently published revision, ISO 15197:2013, were applied, and between 54.5% and 100% of the systems' measurement results fell within the required accuracy limits. Regarding measurement reproducibility, each of the 12 tested systems met the applied performance criteria. In summary, 83% of the systems fulfilled the minimum system accuracy requirements of DIN EN ISO 15197:2003 with the evaluated test strip lot. Each of the tested systems showed acceptable measurement reproducibility. In order to ensure sufficient measurement quality of each distributed test strip lot, regular evaluations are required.
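The per-measurement accuracy criteria of the two editions can be sketched as a simple check (limits as published in the standard; the requirement that 95 % of results fall within these limits, and other criteria of the standard, are not modelled here):

```python
def within_iso15197(measured, reference, edition=2013):
    """Check one blood glucose result (mg/dL) against ISO 15197 limits.

    2003 edition: within +/-15 mg/dL when reference < 75 mg/dL,
                  otherwise within +/-20 %.
    2013 edition: within +/-15 mg/dL when reference < 100 mg/dL,
                  otherwise within +/-15 %.
    """
    if edition == 2003:
        cutoff, abs_lim, rel_lim = 75, 15, 0.20
    else:
        cutoff, abs_lim, rel_lim = 100, 15, 0.15
    if reference < cutoff:
        return abs(measured - reference) <= abs_lim
    return abs(measured - reference) <= rel_lim * reference

# A deviation of 20 mg/dL at a reference of 120 mg/dL passes the 2003
# criterion (+/-20 %) but fails the tighter 2013 criterion (+/-15 %)
print(within_iso15197(140, 120, edition=2003))  # True
print(within_iso15197(140, 120, edition=2013))  # False
```

The tighter 2013 limits explain why the fraction of results within limits dropped to as low as 54.5 % for some systems when the revised criteria were applied.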

  14. Using a 1-D model to reproduce diurnal SST signals

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob L.

    2014-01-01

    The diurnal variability of SST has been extensively studied as it poses challenges for validating and calibrating satellite sensors, merging SST time series, and oceanic and atmospheric modelling. As heat is significantly trapped close to the surface, the diurnal signal’s maximum amplitude is best captured by radiometers. The availability of infra-red retrievals from a geostationary orbit allows the hourly monitoring of the diurnal SST evolution. When infra-red SSTs are validated with in situ measurements, a general mismatch is found, associated with the different reference depth of each type of measurement. A generally preferred approach to bridge the gap between in situ and remotely obtained measurements is through modelling of the upper ocean temperature. This ESA supported study focuses on the implementation of the 1-dimensional General Ocean Turbulence Model (GOTM), in order to resolve

  15. Interobserver reproducibility of the Paris system for reporting urinary cytology

    Directory of Open Access Journals (Sweden)

    Theresa Long

    2017-01-01

    Background: The Paris System for Reporting Urinary Cytology represents a significant improvement in the classification of urinary specimens. The system acknowledges the difficulty in cytologically diagnosing low-grade urothelial carcinomas and has developed categories to deal with this issue. The system uses six categories: unsatisfactory, negative for high-grade urothelial carcinoma (NHGUC), atypical urothelial cells, suspicious for high-grade urothelial carcinoma, high-grade urothelial carcinoma, and other malignancies, plus a seventh subcategory (low-grade urothelial neoplasm). Methods: Three hundred and fifty-seven urine specimens were independently reviewed by four cytopathologists unaware of the previous diagnoses. Each cytopathologist rendered a diagnosis according to the Paris System categories. Agreement was assessed using absolute agreement and weighted chance-corrected agreement (kappa). Disagreements were classified as low impact and high impact based on the potential impact of a misclassification on clinical management. Results: The average absolute agreement was 65% with an average expected agreement of 44%. The average chance-corrected agreement (kappa) was 0.32. Nine hundred and ninety-nine of 1902 comparisons between rater pairs were in agreement, but 12% of comparisons differed by two or more categories for the category NHGUC. Approximately 15% of the disagreements were classified as high clinical impact. Conclusions: Our findings indicated that the scheme recommended by the Paris System shows adequate precision for the category NHGUC, but the other categories demonstrated unacceptable interobserver variability. This low level of diagnostic precision may negatively impact the applicability of the Paris System for widespread clinical application.
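Chance-corrected agreement follows the standard kappa formula; a minimal sketch using the reported averages (note this unweighted calculation gives a slightly different value than the study's weighted kappa of 0.32, which also accounts for how far apart disagreeing categories are):

```python
def cohen_kappa(p_observed, p_expected):
    """Chance-corrected agreement: kappa = (po - pe) / (1 - pe)."""
    return (p_observed - p_expected) / (1 - p_expected)

# Average absolute agreement 0.65, average expected agreement 0.44
print(round(cohen_kappa(0.65, 0.44), 3))  # 0.375
```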

  16. Making Cloud-based Systems Elasticity Testing Reproducible

    OpenAIRE

    Albonico , Michel; Mottu , Jean-Marie; Sunyé , Gerson; Alvares , Frederico

    2017-01-01

    Elastic cloud infrastructures vary computational resources at runtime, i.e., elasticity, which is error-prone. That makes testing throughout elasticity crucial for those systems. Those errors are detected thanks to tests that should run deterministically many times all along the development. However, elasticity testing reproduction requires several features not supported natively by the main cloud providers, such as Amazon EC2. We identify three requirements that we c...

  17. How well can DFT reproduce key interactions in Ziegler-Natta systems?

    KAUST Repository

    Correa, Andrea

    2013-08-08

    The performance of density functional theory in reproducing some of the main interactions occurring in MgCl2-supported Ziegler-Natta catalytic systems is assessed. Eight model systems, representatives of key interactions occurring in Ziegler-Natta catalysts, are selected. Fifteen density functionals are tested in combination with two different basis sets, namely, TZVP and cc-pVTZ. As a general result, we found that the best performances are achieved by the PBEh1PBE hybrid generalized gradient approximation (GGA) functional, but also the cheaper PBEh GGA functional gives rather good results. The failure of the popular B3LYP and BP86 functionals is noticeable. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. COMBINE archive and OMEX format : One file to share all information to reproduce a modeling project

    NARCIS (Netherlands)

    Bergmann, Frank T.; Olivier, Brett G.; Soiland-Reyes, Stian

    2014-01-01

    Background: With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models,

  19. Reproducibility analysis of measurements with a mechanical semiautomatic eye model for evaluation of intraocular lenses

    Science.gov (United States)

    Rank, Elisabet; Traxler, Lukas; Bayer, Natascha; Reutterer, Bernd; Lux, Kirsten; Drauschke, Andreas

    2014-03-01

    Mechanical eye models are used to validate ex vivo the optical quality of intraocular lenses (IOLs). The quality measurement and test instructions for IOLs are defined in ISO 11979-2. However, it has been mentioned in the literature that these test instructions could lead to inaccurate measurements for some modern IOL designs. The reproducibility of the alignment and measurement processes is presented, performed with a semiautomatic mechanical ex vivo eye model based on optical properties published by Liou and Brennan at 1:1 scale. The cornea, the iris aperture and the IOL itself are separately changeable within the eye model. The adjustment of the IOL can be manipulated by automatic decentration and tilt of the IOL in reference to the optical axis of the whole system, which is defined by the line connecting the central point of the artificial cornea and the iris aperture. With the presented measurement setup, two quality criteria are measurable: the modulation transfer function (MTF) and the Strehl ratio. First, the reproducibility of the alignment process for defining the initial conditions of the lateral position and tilt in reference to the optical axis of the system is investigated. Furthermore, different IOL holders are tested with regard to stable holding of the IOL. The measurement is performed by a before-after comparison of the lens position using a typical decentration and tilt tolerance analysis path. The modulation transfer function MTF and Strehl ratio S before and after this tolerance analysis are compared, and requirements for lens holder construction are deduced from the presented results.

  20. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    Science.gov (United States)

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.

  1. Prospective validation of pathologic complete response models in rectal cancer: Transferability and reproducibility.

    Science.gov (United States)

    van Soest, Johan; Meldolesi, Elisa; van Stiphout, Ruud; Gatta, Roberto; Damiani, Andrea; Valentini, Vincenzo; Lambin, Philippe; Dekker, Andre

    2017-09-01

    Multiple models have been developed to predict pathologic complete response (pCR) in locally advanced rectal cancer patients. Unfortunately, validation of these models normally omits the implications of cohort differences on prediction model performance. In this work, we perform a prospective validation of three pCR models, including information on whether this validation targets transferability or reproducibility (cohort differences) of the given models. We applied a novel methodology, the cohort differences model, to predict whether a patient belongs to the training or to the validation cohort. If the cohort differences model performs well, it would suggest a large difference in cohort characteristics, meaning we would validate the transferability of the model rather than reproducibility. We tested our method in a prospective validation of three existing models for pCR prediction in 154 patients. Our results showed a large difference between training and validation cohorts for one of the three tested models [Area under the Receiver Operating Curve (AUC) of the cohort differences model: 0.85], signaling that the validation leans towards transferability. Two of the three models had a lower AUC at validation (0.66 and 0.58); one model showed a higher AUC in the validation cohort (0.70). We have successfully applied a new methodology in the validation of three prediction models, which allows us to indicate whether a validation targeted transferability (large differences between training/validation cohorts) or reproducibility (small cohort differences). © 2017 American Association of Physicists in Medicine.
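The idea behind a cohort differences model can be illustrated with a minimal sketch: score how well a covariate separates training from validation patients, and read the AUC as a cohort-difference signal. This toy version uses a single covariate with hypothetical values and a Mann-Whitney AUC; the study's actual model is multivariable and fitted to real patient data.

```python
def cohort_auc(train_values, valid_values):
    """AUC of one covariate for separating training vs. validation cohorts
    (Mann-Whitney formulation). AUC near 0.5 means the cohorts look alike,
    so the validation targets reproducibility; AUC near 1.0 means they
    differ, so the validation targets transferability."""
    wins = pairs = 0
    for t in train_values:
        for v in valid_values:
            pairs += 1
            if v > t:
                wins += 1
            elif v == t:
                wins += 0.5
    return wins / pairs

# Hypothetical covariate values (e.g. age) in the two cohorts
print(cohort_auc([55, 60, 62, 58], [70, 72, 69, 75]))  # 1.0 -> cohorts differ
```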

  2. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    Science.gov (United States)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting agent hypothesis instead of the widely-used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time-scales. We also find that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturb the financial system.

  3. Reliability and reproducibility analysis of the AOSpine thoracolumbar spine injury classification system by Chinese spinal surgeons.

    Science.gov (United States)

    Cheng, Jie; Liu, Peng; Sun, Dong; Qin, Tingzheng; Ma, Zikun; Liu, Jingpei

    2017-05-01

    The objective of this study was to analyze the interobserver reliability and intraobserver reproducibility of the new AOSpine thoracolumbar spine injury classification system among young Chinese orthopedic surgeons with different levels of experience in spinal trauma. Previous reports suggest that the new AOSpine thoracolumbar spine injury classification system demonstrates acceptable interobserver reliability and intraobserver reproducibility. However, there are few studies in Asia, especially in China. The AOSpine thoracolumbar spine injury classification system was applied to 109 patients with acute, traumatic thoracolumbar spinal injuries by two groups of spinal surgeons with different levels of clinical experience. The Kappa coefficient was used to determine interobserver reliability and intraobserver reproducibility. The overall Kappa coefficient for all cases was 0.362, which represents fair reliability. The Kappa statistic was 0.385 for A-type injuries and 0.292 for B-type injuries, representing fair reliability, and 0.552 for C-type injuries, representing moderate reliability. The Kappa coefficient for intraobserver reproducibility was 0.442 for A-type injuries, 0.485 for B-type injuries, and 0.412 for C-type injuries. These values represent moderate reproducibility for all injury types. The raters in Group A provided significantly better interobserver reliability than those in Group B (P < 0.05). There were no between-group differences in intraobserver reproducibility. This study suggests that the new AOSpine injury classification system may be applied in day-to-day clinical practice in China following extensive training of healthcare providers. Further prospective studies with different healthcare providers and clinical settings are essential for validation of this classification system and to assess its utility.
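The Kappa coefficient reported above is Cohen's kappa, which corrects raw agreement between two raters for the agreement expected by chance. A self-contained sketch with hypothetical A/B/C-type ratings from two surgeons:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance)."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[label] * c2[label] for label in set(c1) | set(c2)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical A/B/C-type classifications of ten injuries by two raters
r1 = ["A", "A", "B", "C", "A", "B", "B", "C", "A", "A"]
r2 = ["A", "B", "B", "C", "A", "B", "C", "C", "A", "B"]
print(round(cohens_kappa(r1, r2), 3))  # 0.552 -- "moderate" agreement
```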

  4. A stable and reproducible human blood-brain barrier model derived from hematopoietic stem cells.

    Directory of Open Access Journals (Sweden)

    Romeo Cecchelli

    The human blood-brain barrier (BBB) is a selective barrier formed by human brain endothelial cells (hBECs), which is important to ensure adequate neuronal function and to protect the central nervous system (CNS) from disease. The development of human in vitro BBB models is thus of utmost importance for drug discovery programs related to CNS diseases. Here, we describe a method to generate a human BBB model using cord blood-derived hematopoietic stem cells. The cells were initially differentiated into ECs, followed by the induction of BBB properties by co-culture with pericytes. The brain-like endothelial cells (BLECs) express tight junctions and transporters typically observed in brain endothelium and maintain expression of most in vivo BBB properties for at least 20 days. The model is highly reproducible, since it can be generated from stem cells isolated from different donors and in different laboratories, and could be used to predict CNS distribution of compounds in humans. Finally, we provide evidence that the Wnt/β-catenin signaling pathway mediates in part the BBB-inductive properties of pericytes.

  5. MOLNs: A cloud platform for interactive, reproducible, and scalable spatial stochastic computational experiments in systems biology using PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  6. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    Science.gov (United States)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, the question is raised by industry and AM users of how repeatable and reproducible the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper aims to investigate and evaluate the repeatability and reproducibility of the FDM process through a systematic approach to answer this frequently asked question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in printed parts of the FDM process. After running the simulation and analyzing the data, the FDM process capability is evaluated, which should help industry better understand the performance of FDM technology.
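A gage R&R study partitions measurement variance into repeatability (equipment variation, within repeated measurements of the same part by the same operator) and reproducibility (appraiser variation, between operators), usually via the ANOVA or average-and-range method. The following is a deliberately crude variance-components sketch on hypothetical FDM length measurements, not the study's actual procedure:

```python
import statistics

def gage_rr(data):
    """Crude variance-components split for a crossed gage study:
    data[part][operator] = list of repeated measurements."""
    cells = [meas for part in data for meas in part]
    # Repeatability (equipment variation): pooled within-cell variance
    repeat_var = statistics.mean([statistics.pvariance(c) for c in cells])
    # Reproducibility (appraiser variation): variance of operator means
    n_ops = len(data[0])
    op_means = [statistics.mean([statistics.mean(part[o]) for part in data])
                for o in range(n_ops)]
    reprod_var = statistics.pvariance(op_means)
    return repeat_var, reprod_var

# Hypothetical FDM length measurements (mm): 3 parts x 2 operators x 3 repeats
data = [
    [[25.02, 25.04, 25.03], [25.06, 25.07, 25.05]],
    [[24.98, 24.97, 24.99], [25.01, 25.02, 25.00]],
    [[25.10, 25.11, 25.09], [25.13, 25.14, 25.12]],
]
repeat_var, reprod_var = gage_rr(data)
print(repeat_var < reprod_var)  # here the operator offset dominates
```

A real study would use the ANOVA method (or average-and-range with d2 constants) and report %GRR against the tolerance.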

  7. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly those derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit the reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance the reproducibility, reliability, and clinical translation of findings.

  8. Anatomical Reproducibility of a Head Model Molded by a Three-dimensional Printer.

    Science.gov (United States)

    Kondo, Kosuke; Nemoto, Masaaki; Masuda, Hiroyuki; Okonogi, Shinichi; Nomoto, Jun; Harada, Naoyuki; Sugo, Nobuo; Miyazaki, Chikao

    2015-01-01

    We prepared rapid prototyping models of heads with unruptured cerebral aneurysms based on image data from computed tomography angiography (CTA) using a three-dimensional (3D) printer. The objective of this study was to evaluate the anatomical reproducibility and accuracy of these models by comparison with the CTA images on a monitor. The subjects were 22 patients with unruptured cerebral aneurysms who underwent preoperative CTA. The reproducibility of the microsurgical anatomy of the skull bone and arteries, the length and thickness of the main arteries, and the size of the cerebral aneurysm were compared between the CTA images and the rapid prototyping models. The microsurgical anatomy and arteries were favorably reproduced, apart from a few minute regions, in the rapid prototyping models. No significant difference was noted in the measured lengths of the main arteries between the CTA image and the rapid prototyping model, but errors were noted in their thickness (p<0.05) in the models molded by the 3D printer. It was concluded that these models are useful tools for neurosurgical simulation. The thickness of the main arteries and the size of the cerebral aneurysm should be judged comprehensively, taking other neuroimaging into consideration in view of these errors.

  9. The Accuracy and Reproducibility of Linear Measurements Made on CBCT-derived Digital Models.

    Science.gov (United States)

    Maroua, Ahmad L; Ajaj, Mowaffak; Hajeer, Mohammad Y

    2016-04-01

    To evaluate the accuracy and reproducibility of linear measurements made on cone-beam computed tomography (CBCT)-derived digital models. A total of 25 patients (44% female, 18.7 ± 4 years) who had CBCT images for diagnostic purposes were included. Plaster models were obtained, and digital models were extracted from the CBCT scans. Seven linear measurements from predetermined landmarks were measured and analyzed on the plaster models and the corresponding digital models. The measurements included arch length and width at different sites. The paired t test and Bland-Altman analysis were used to evaluate the accuracy of measurements on the digital models compared with the plaster models. Intraclass correlation coefficients (ICCs) were used to evaluate the reproducibility of the measurements in order to assess intraobserver reliability. The statistical analysis showed significant differences in 5 of the 14 variables, with mean differences ranging from -0.48 to 0.51 mm. The Bland-Altman analysis revealed mean differences of (0.14 ± 0.56) mm and (0.05 ± 0.96) mm, with limits of agreement between the two methods ranging from -1.2 to 0.96 mm and from -1.8 to 1.9 mm in the maxilla and the mandible, respectively. The intraobserver reliability values were determined for all 14 variables of the two types of models separately. The mean ICC value for the plaster models was 0.984 (0.924-0.999), while it was 0.946 for the CBCT models (range 0.850-0.985). Linear measurements obtained from the CBCT-derived models appeared to have a high level of accuracy and reproducibility.
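The Bland-Altman analysis above summarizes method agreement as the mean difference (bias) plus 95% limits of agreement, computed as mean ± 1.96 SD of the paired differences. A minimal sketch with hypothetical arch-width data (the measurements below are invented for illustration):

```python
import statistics

def bland_altman(a, b):
    """Mean difference and 95% limits of agreement between paired methods."""
    diffs = [x - y for x, y in zip(a, b)]
    m = statistics.mean(diffs)
    s = statistics.stdev(diffs)
    return m, (m - 1.96 * s, m + 1.96 * s)

# Hypothetical arch-width measurements (mm): plaster vs CBCT-derived models
plaster = [35.1, 36.4, 33.9, 35.8, 34.6, 36.0]
digital = [35.3, 36.1, 34.2, 35.6, 34.9, 35.7]
bias, (lo, hi) = bland_altman(plaster, digital)
print(round(bias, 2), round(lo, 2), round(hi, 2))  # bias near zero here
```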

  10. An improved cost-effective, reproducible method for evaluation of bone loss in a rodent model.

    Science.gov (United States)

    Fine, Daniel H; Schreiner, Helen; Nasri-Heir, Cibele; Greenberg, Barbara; Jiang, Shuying; Markowitz, Kenneth; Furgang, David

    2009-02-01

    This study was designed to investigate the utility of two "new" definitions for the assessment of bone loss in a rodent model of periodontitis. Eighteen rats were divided into three groups. Group 1 was infected with Aggregatibacter actinomycetemcomitans (Aa), group 2 was infected with an Aa leukotoxin knock-out, and group 3 received no Aa (controls). Microbial sampling and antibody titres were determined. Initially, two examiners measured the distance from the cemento-enamel junction to the alveolar bone crest using the three following methods: (1) total area of bone loss by radiograph, (2) linear bone loss by radiograph, and (3) a direct visual measurement (DVM) of horizontal bone loss. Two "new" definitions were adopted: (1) any site in infected animals showing bone loss >2 standard deviations above the mean seen at that site in control animals was recorded as bone loss, and (2) any animal with two or more sites in any quadrant affected by bone loss was considered diseased. Using the "new" definitions, both evaluators independently found that infected animals had significantly more disease than controls (DVM system; p<0.05). The DVM method provides a simple, cost-effective, and reproducible method for studying periodontal disease in rodents.
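The two "new" definitions translate directly into code: a site is affected when its loss exceeds the control mean at that site by more than 2 standard deviations, and an animal is diseased when any quadrant has two or more affected sites. A sketch with hypothetical measurements (site names and values invented):

```python
import statistics

def site_threshold(control_values):
    """Site-level cut-off: mean + 2 SD of control animals at that site."""
    return statistics.mean(control_values) + 2 * statistics.stdev(control_values)

def animal_diseased(site_loss, thresholds, sites_per_quadrant):
    """Definition 2: an animal is diseased if any quadrant has >= 2
    affected sites (sites exceeding their control-based threshold)."""
    for quadrant in sites_per_quadrant:
        affected = sum(site_loss[s] > thresholds[s] for s in quadrant)
        if affected >= 2:
            return True
    return False

# Hypothetical CEJ-to-bone-crest distances (mm) at three sites in one quadrant
controls = {"s1": [0.4, 0.5, 0.45], "s2": [0.5, 0.55, 0.5], "s3": [0.6, 0.65, 0.6]}
thresholds = {s: site_threshold(v) for s, v in controls.items()}
infected = {"s1": 0.9, "s2": 0.8, "s3": 0.55}
print(animal_diseased(infected, thresholds, [["s1", "s2", "s3"]]))  # True
```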

  11. Reproducibility of the sella turcica landmark in three dimensions using a sella turcica-specific reference system

    International Nuclear Information System (INIS)

    Pittayapat, Pisha; Jacobs, Reinhilde; Odri, Guillaume A.; De Faria Vasconcelos, Karla; Willems, Guy; Olszewski, Raphael

    2015-01-01

    This study was performed to assess the reproducibility of identifying the sella turcica landmark in a three-dimensional (3D) model by using a new sella-specific landmark reference system. Thirty-two cone-beam computed tomographic scans (3D Accuitomo 170, J. Morita, Kyoto, Japan) were retrospectively collected. The 3D data were exported into the Digital Imaging and Communications in Medicine standard and then imported into the Maxilim software (Medicim NV, Sint-Niklaas, Belgium) to create 3D surface models. Five observers identified four osseous landmarks in order to create the reference frame and then identified two sella landmarks. The x, y, and z coordinates of each landmark were exported. The observations were repeated after four weeks. Statistical analysis was performed using the multiple paired t-test with Bonferroni correction (intraobserver precision: p<0.005, interobserver precision: p<0.0011). The intraobserver mean precision of all landmarks was <1 mm. Significant differences were found when comparing the intraobserver precision of each observer (p<0.005). For the sella landmarks, the intraobserver mean precision ranged from 0.43±0.34 mm to 0.51±0.46 mm. The intraobserver reproducibility was generally good. The overall interobserver mean precision was <1 mm. Significant differences between each pair of observers for all anatomical landmarks were found (p<0.0011). The interobserver reproducibility of sella landmarks was good, with >50% precision in locating the landmark within 1 mm. A newly developed reference system offers high precision and reproducibility for sella turcica identification in a 3D model without being based on two-dimensional images derived from 3D data.

  13. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    International Nuclear Information System (INIS)

    Tian, Jun-fang; Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao

    2012-01-01

    Velocity effect and critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown has been explained. Various congested patterns induced by the on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► Velocity effect is added to the average space gap cellular automaton model. ► The physics of traffic breakdown has been explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.
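The average space gap model itself is not reproduced in this record. As a hedged stand-in, the classic Nagel-Schreckenberg cellular automaton below shows the shared ingredients (discrete cells, velocity update limited by the gap ahead, stochastic deceleration) from which jams and congested patterns emerge; parameters are illustrative:

```python
import random

def nasch_step(pos, vel, road_len, v_max, p_slow, rng):
    """One parallel Nagel-Schreckenberg update on a circular road:
    accelerate, brake to the gap ahead, randomly slow down, then move."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_pos, new_vel = list(pos), list(vel)
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % road_len
        v = min(vel[i] + 1, v_max, gap)          # accelerate, then brake
        if v > 0 and rng.random() < p_slow:      # stochastic delay
            v -= 1
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % road_len
    return new_pos, new_vel

rng = random.Random(0)
road_len, n_cars = 100, 30
pos = sorted(rng.sample(range(road_len), n_cars))
vel = [0] * n_cars
for _ in range(200):
    pos, vel = nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=rng)
print(sum(vel) / n_cars)  # mean speed; jams keep it well below v_max
```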

  14. NRFixer: Sentiment Based Model for Predicting the Fixability of Non-Reproducible Bugs

    Directory of Open Access Journals (Sweden)

    Anjali Goyal

    2017-08-01

    Software maintenance is an essential step in the software development life cycle. Nowadays, software companies spend approximately 45% of total cost on maintenance activities. Large software projects maintain bug repositories to collect, organize, and resolve bug reports. Sometimes it is difficult to reproduce a reported bug from the information present in the bug report, and the bug is then marked with the resolution non-reproducible (NR). When NR bugs are reconsidered, a few of them might get fixed (NR-to-fix), leaving the others with the same resolution (NR). To analyse the behaviour of developers towards NR-to-fix and NR bugs, sentiment analysis of NR bug report textual contents was conducted. The sentiment analysis of bug reports shows that NR bugs' sentiments incline towards more negativity than those of reproducible bugs. Also, there is a noticeable opinion drift in the sentiments of NR-to-fix bug reports. Observations drawn from this analysis inspired the development of a model that can judge the fixability of NR bugs. Thus a framework, NRFixer, which predicts the probability of NR bug fixation, is proposed. NRFixer was evaluated along two dimensions. The first dimension considers meta-fields of bug reports (model-1); the other additionally incorporates the sentiments (model-2) of developers for prediction. Both models were compared using various machine learning classifiers (zero-R, naive Bayes, J48, random tree, and random forest). The bug reports of the Firefox and Eclipse projects were used to test NRFixer. In the Firefox and Eclipse projects, J48 and naive Bayes classifiers achieved the best prediction accuracy, respectively. The inclusion of sentiments in the prediction model showed a rise in prediction accuracy ranging from 2 to 5% for the various classifiers.

  15. Reproducibility of ad libitum energy intake with the use of a computerized vending machine system

    Science.gov (United States)

    Votruba, Susanne B; Franks, Paul W; Krakoff, Jonathan; Salbe, Arline D

    2010-01-01

    Background: Accurate assessment of energy intake is difficult but critical for the evaluation of eating behavior and intervention effects. Consequently, methods to assess ad libitum energy intake under controlled conditions have been developed. Objective: Our objective was to evaluate the reproducibility of ad libitum energy intake with the use of a computerized vending machine system. Design: Twelve individuals (mean ± SD: 36 ± 8 y old; 41 ± 8% body fat) consumed a weight-maintaining diet for 3 d; subsequently, they self-selected all food with the use of a computerized vending machine system for an additional 3 d. Mean daily energy intake was calculated from the actual weight of foods consumed and expressed as a percentage of weight-maintenance energy needs (%WMEN). Subjects repeated the study multiple times during 2 y. The within-person reproducibility of energy intake was determined through the calculation of the intraclass correlation coefficients (ICCs) between visits. Results: Daily energy intake for all subjects was 5020 ± 1753 kcal during visit 1 and 4855 ± 1615 kcal during visit 2. There were no significant associations between energy intake and body weight, body mass index, or percentage body fat while subjects used the vending machines, which indicates that intake was not driven by body size or need. Despite overconsumption (%WMEN = 181 ± 57%), the reproducibility of intake between visits, whether expressed as daily energy intake (ICC = 0.90), %WMEN (ICC = 0.86), weight of food consumed (ICC = 0.87), or fat intake (g/d; ICC = 0.87), was highly significant (P < 0.0001). Conclusion: Although ad libitum energy intake exceeded %WMEN, the within-person reliability of this intake across multiple visits was high, which makes this a reproducible method for the measurement of ad libitum intake in subjects who reside in a research unit. This trial was registered at clinicaltrials.gov as NCT00342732. PMID:19923376
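The reproducibility statistic used here, the intraclass correlation coefficient, can be computed from a one-way ANOVA decomposition of subjects by repeated visits. A sketch of ICC(1,1) on hypothetical intake data (kcal values invented, not the study's):

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a subjects x repeated-visits table."""
    n = len(ratings)              # subjects
    k = len(ratings[0])           # visits per subject
    grand = sum(sum(r) for r in ratings) / (n * k)
    means = [sum(r) / k for r in ratings]
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for r, m in zip(ratings, means) for x in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical daily energy intake (kcal) on two vending-machine visits
intake = [[5100, 5000], [3200, 3400], [6800, 6500], [4300, 4400], [5600, 5900]]
print(round(icc_oneway(intake), 2))  # high: between-subject variation dominates
```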

  16. COMBINE archive and OMEX format: one file to share all information to reproduce a modeling project.

    Science.gov (United States)

    Bergmann, Frank T; Adams, Richard; Moodie, Stuart; Cooper, Jonathan; Glont, Mihai; Golebiewski, Martin; Hucka, Michael; Laibe, Camille; Miller, Andrew K; Nickerson, David P; Olivier, Brett G; Rodriguez, Nicolas; Sauro, Herbert M; Scharm, Martin; Soiland-Reyes, Stian; Waltemath, Dagmar; Yvon, Florent; Le Novère, Nicolas

    2014-12-14

    With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute various separate components required to reproduce a given published scientific result. We describe the Open Modeling EXchange format (OMEX). Together with the use of other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive, a single file that supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, listing the content of the archive, an optional metadata file adding information about the archive and its content, and the files describing the model. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. Several tools that support the COMBINE Archive are available, either as independent libraries or embedded in modeling software. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. We anticipate that the COMBINE Archive will become a significant help for modellers, as the domain moves to larger, more complex experiments such as multi-scale models of organs, digital organisms, and bioengineering.
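The archive layout described above is straightforward to assemble with a ZIP library: a manifest.xml listing each file's location and format identifier, plus the model and metadata files. A sketch using Python's zipfile; the format URIs follow the COMBINE naming scheme but should be checked against the OMEX specification, and the file contents here are placeholders:

```python
import zipfile

# Format identifiers follow the COMBINE specification naming scheme;
# treat the exact URIs as assumptions to verify against the spec.
MANIFEST = """<?xml version="1.0" encoding="utf-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="." format="http://identifiers.org/combine.specifications/omex"/>
  <content location="./model.xml" format="http://identifiers.org/combine.specifications/sbml"/>
  <content location="./metadata.rdf" format="http://identifiers.org/combine.specifications/omex-metadata"/>
</omexManifest>
"""

def write_archive(path, model_xml, metadata_rdf):
    """Bundle a model plus manifest and metadata into one OMEX ZIP container."""
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("manifest.xml", MANIFEST)
        z.writestr("model.xml", model_xml)
        z.writestr("metadata.rdf", metadata_rdf)

write_archive("project.omex", "<sbml/>", "<rdf/>")
with zipfile.ZipFile("project.omex") as z:
    print(sorted(z.namelist()))  # ['manifest.xml', 'metadata.rdf', 'model.xml']
```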

  17. Spatial aspects of sound quality - subjective assessment of sound reproduced by stereo and by multichannel systems

    DEFF Research Database (Denmark)

    Choisel, Sylvain

    the fidelity with which sound reproduction systems can re-create the desired stereo image, a laser pointing technique was developed to accurately collect subjects' responses in a localization task. This method is subsequently applied in an investigation of the effects of loudspeaker directivity on the perceived direction of panned sources. The second part of the thesis addresses the identification of auditory attributes which play a role in the perception of sound reproduced by multichannel systems. Short musical excerpts were presented in mono, stereo and several multichannel formats to evoke various......

  18. The general theory of the Quasi-reproducible experiments: How to describe the measured data of complex systems?

    Science.gov (United States)

    Nigmatullin, Raoul R.; Maione, Guido; Lino, Paolo; Saponaro, Fabrizio; Zhang, Wei

    2017-01-01

    In this paper, we suggest a general theory that makes it possible to describe experiments associated with reproducible or quasi-reproducible data reflecting the dynamical and self-similar properties of a wide class of complex systems. By a complex system we understand a system for which a model based on microscopic principles and suppositions about the nature of the matter is absent. Such a microscopic model is usually determined as "the best fit" model. The behavior of the complex system relative to a control variable (time, frequency, wavelength, etc.) can be described in terms of the so-called intermediate model (IM). One can prove that the fitting parameters of the IM are associated with the amplitude-frequency response of a segment of the Prony series. The segment of the Prony series, including the set of decomposition coefficients and the set of exponential functions (with k = 1,2,…,K), is limited by the final mode K. The exponential functions of this decomposition depend on time and are found by the original algorithm described in the paper. This approach serves as a logical continuation of the results obtained earlier in [Nigmatullin RR, Zhang W, Striccoli D. General theory of experiment containing reproducible data: The reduction to an ideal experiment. Commun Nonlinear Sci Numer Simul 27 (2015), pp. 175-192] for reproducible experiments and includes the previous results as a partial case. In this paper, we consider a more complex case, when the available data form short samplings or exhibit some instability during the process of measurement. We give justified evidence and conditions proving the validity of this theory for the description of a wide class of complex systems in terms of the reduced set of fitting parameters belonging to the segment of the Prony series. The elimination of uncontrollable factors expressed in the form of the apparatus function is discussed.
To illustrate how to apply the theory and take advantage of its
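Once the exponential functions (the exponents lambda_k) of the Prony segment are known, the decomposition coefficients follow from linear least squares. A sketch that recovers the amplitudes for synthetic data with known exponents; the algorithm for finding the exponents themselves is the paper's contribution and is not reproduced here:

```python
import math

def fit_prony_amplitudes(t, y, lambdas):
    """Least-squares amplitudes a_k for y(t) ~ sum_k a_k*exp(lambda_k*t),
    assuming the exponents lambda_k are already known."""
    cols = [[math.exp(lam * ti) for ti in t] for lam in lambdas]
    k = len(lambdas)
    # Normal equations G a = b, solved by Gaussian elimination
    G = [[sum(ci * cj for ci, cj in zip(cols[i], cols[j])) for j in range(k)]
         for i in range(k)]
    b = [sum(ci * yi for ci, yi in zip(cols[i], y)) for i in range(k)]
    for p in range(k):
        for j in range(p + 1, k):
            f = G[j][p] / G[p][p]
            for c in range(p, k):
                G[j][c] -= f * G[p][c]
            b[j] -= f * b[p]
    a = [0.0] * k
    for i in reversed(range(k)):
        a[i] = (b[i] - sum(G[i][j] * a[j] for j in range(i + 1, k))) / G[i][i]
    return a

t = [0.1 * i for i in range(50)]
y = [2.0 * math.exp(-0.5 * ti) + 1.0 * math.exp(-2.0 * ti) for ti in t]
print([round(ai, 3) for ai in fit_prony_amplitudes(t, y, [-0.5, -2.0])])  # [2.0, 1.0]
```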

  19. A novel, comprehensive, and reproducible porcine model for determining the timing of bruises in forensic pathology

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2016-01-01

    Purpose Calculating the timing of bruises is crucial in forensic pathology but is a challenging discipline in both human and veterinary medicine. A mechanical device for inflicting bruises in pigs was developed and validated, and the pathological reactions in the bruises were studied over time......-dependent response. Combining these parameters, bruises could be grouped as being either less than 4 h old or between 4 and 10 h of age. Gross lesions and changes in the epidermis and dermis were inconclusive with respect to time determination. Conclusions The model was reproducible and resembled forensic cases...

  20. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    Directory of Open Access Journals (Sweden)

    Marit Kramski

    The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bioterroristic potential. Moreover, zoonotic infections with Cowpox virus (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also via the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after the onset of symptoms, even when a very low infectious viral dose of 5x10^2 pfu was applied intranasally. Infectious virus was demonstrated in blood, saliva, and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation mimicking the natural route of smallpox infection led to reproducible infection. In vivo titration resulted in an MID50 (minimal monkey infectious dose 50%) of 8.3x10^2 pfu of calpox virus, which is approximately 10,000-fold lower than the MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs.
Furthermore, this model can help study mechanisms of OPV pathogenesis.

  1. Improving the Pattern Reproducibility of Multiple-Point-Based Prior Models Using Frequency Matching

    DEFF Research Database (Denmark)

    Cordua, Knud Skou; Hansen, Thomas Mejer; Mosegaard, Klaus

    2014-01-01

    Some multiple-point-based sampling algorithms, such as the snesim algorithm, rely on sequential simulation. The conditional probability distributions that are used for the simulation are based on statistics of multiple-point data events obtained from a training image. During the simulation, data events with zero probability in the training image statistics may occur. This is handled by pruning the set of conditioning data until an event with non-zero probability is found. The resulting probability distribution sampled by such algorithms is a pruned mixture model. The pruning strategy leads to a probability distribution that lacks some of the information provided by the multiple-point statistics from the training image, which reduces the reproducibility of the training image patterns in the outcome realizations. When pruned mixture models are used as prior models for inverse problems, local re...

  2. Tackling the Reproducibility Problem in Systems Research with Declarative Experiment Specifications

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Ivo [Univ. of California, Santa Cruz, CA (United States); Maltzahn, Carlos [Univ. of California, Santa Cruz, CA (United States); Lofstead, Jay [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Moody, Adam [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Arpaci-Dusseau, Remzi [Univ. of Wisconsin, Madison, WI (United States); Arpaci-Dusseau, Andrea [Univ. of Wisconsin, Madison, WI (United States)

    2015-05-04

    Validating experimental results in the field of computer systems is a challenging task, mainly due to the many changes in software and hardware that computational environments go through. Determining if an experiment is reproducible entails two separate tasks: re-executing the experiment and validating the results. Existing reproducibility efforts have focused on the former, envisioning techniques and infrastructures that make it easier to re-execute an experiment. In this work we focus on the latter by analyzing the validation workflow that an experiment re-executioner goes through. We notice that validating results is done on the basis of experiment design and high-level goals, rather than exact quantitative metrics. Based on this insight, we introduce a declarative format for specifying the high-level components of an experiment as well as describing generic, testable conditions that serve as the basis for validation. We present a use case in the area of storage systems to illustrate the usefulness of this approach. We also discuss limitations and potential benefits of using this approach in other areas of experimental systems research.
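The idea of generic, testable validation conditions can be sketched minimally as follows. This is not the declarative format the paper introduces; the metric names and the condition syntax are hypothetical stand-ins for the kind of high-level expectations a re-executioner would check instead of exact numbers.

```python
# Minimal sketch (hypothetical format): validate an experiment re-execution
# against declarative, testable conditions rather than exact metrics.

def validate(conditions, metrics):
    """Evaluate each condition string against the measured metrics;
    return the list of conditions that fail."""
    return [c for c in conditions
            if not eval(c, {"__builtins__": {}}, metrics)]

# High-level expectations, e.g. "local overhead stays under 10%":
conditions = [
    "local_throughput >= 0.9 * raw_throughput",
    "remote_latency > local_latency",
]
metrics = {"raw_throughput": 100.0, "local_throughput": 93.0,
           "remote_latency": 2.1, "local_latency": 0.4}

print(validate(conditions, metrics))   # -> [] (all conditions hold)
```

The point mirrored from the abstract is that the re-executed numbers need not match the original ones; validation succeeds as long as the stated relationships, derived from the experiment's design and goals, still hold.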

  3. Reproducing the nonlinear dynamic behavior of a structured beam with a generalized continuum model

    Science.gov (United States)

    Vila, J.; Fernández-Sáez, J.; Zaera, R.

    2018-04-01

    In this paper we study the coupled axial-transverse nonlinear vibrations of a kind of one-dimensional structured solid by application of the so-called Inertia Gradient Nonlinear continuum model. To show the accuracy of this axiomatic model, previously proposed by the authors, its predictions are compared with numerical results from a previously defined finite discrete chain of lumped masses and springs, for several numbers of particles. A continualization of the discrete model equations based on Taylor series allowed us to set equivalent values of the mechanical properties in both the discrete and axiomatic continuum models. Contrary to the classical continuum model, the inertia gradient nonlinear continuum model used herein is able to capture scale effects, which arise for modes in which the wavelength is comparable to the characteristic distance of the structured solid. The main conclusion of the work is that the proposed generalized continuum model captures the scale effects in both linear and nonlinear regimes, reproducing the behavior of the 1D nonlinear discrete model adequately.

  4. Reliability and reproducibility of subaxial cervical injury description system: a standardized nomenclature schema.

    Science.gov (United States)

    Bono, Christopher M; Schoenfeld, Andrew; Gupta, Giri; Harrop, James S; Anderson, Paul; Patel, Alpesh A; Dimar, John; Aarabi, Bizhan; Dailey, Andrew; Vaccaro, Alexander R; Gahr, Ralf; Shaffrey, Christopher; Anderson, David G; Rampersaud, Raj

    2011-08-01

    Radiographic measurement study. To develop a standardized cervical injury nomenclature system to facilitate description, communication, and classification among health care providers. The reliability and reproducibility of this system were then examined. Description of subaxial cervical injuries is critical for treatment decision making and for comparing scientific reports of outcomes. Despite a number of available classification systems, surgeons and researchers continue to use descriptive nomenclature, such as "burst" and "teardrop" fractures, to describe injuries. However, there is considerable inconsistency in the use of such terms in the literature. Eleven distinct injury types and associated definitions were established for the subaxial cervical spine and subsequently refined by members of the Spine Trauma Study Group. A series of 18 cases of patients with a broad spectrum of subaxial cervical spine injuries was prepared and distributed to surgeon raters. Each rater was provided with the full nomenclature document and asked to select primary and secondary injury types for each case. After receipt of the raters' first round of classifications, the cases were resorted and returned to the raters for a second round of review. Interrater and intrarater reliabilities were calculated as percent agreement and Cohen kappa (κ) values. Intrarater reliability was assessed by comparing a given rater's diagnoses from the first and second rounds. Nineteen surgeons completed the first and second rounds of the study. Overall, the system demonstrated 56.4% interrater agreement and 72.8% intrarater agreement. Overall, interrater κ values demonstrated moderate agreement, while intrarater κ values showed substantial agreement. Analyzed by injury type, only four (burst fractures, lateral mass fractures, flexion teardrop fractures, and anterior distraction injuries) demonstrated greater than 50% interrater agreement. This study demonstrated that, even in ideal circumstances, there is…

  5. Mouse Models of Diet-Induced Nonalcoholic Steatohepatitis Reproduce the Heterogeneity of the Human Disease

    Science.gov (United States)

    Machado, Mariana Verdelho; Michelotti, Gregory Alexander; Xie, Guanhua; de Almeida, Thiago Pereira; Boursier, Jerome; Bohnic, Brittany; Guy, Cynthia D.; Diehl, Anna Mae

    2015-01-01

    Background and aims Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Methods Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. Results The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Conclusion Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH. PMID:26017539

  6. Mouse models of diet-induced nonalcoholic steatohepatitis reproduce the heterogeneity of the human disease.

    Directory of Open Access Journals (Sweden)

    Mariana Verdelho Machado

    Full Text Available Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH.

  7. [Reproducibility of Fuhrman nuclear grade: advantages of a two-grade system].

    Science.gov (United States)

    Letourneux, Hervé; Lindner, Véronique; Lang, Hervé; Massfelder, Thierry; Meyer, Nicolas; Saussine, Christian; Jacqmin, Didier

    2006-06-01

    The Fuhrman nuclear grade is the reference histoprognostic grading system routinely used worldwide for renal cell carcinoma. Studies measuring the inter-observer and intra-observer concordance of the Fuhrman grade show poor results in terms of reproducibility and repeatability. These variations are due to a certain degree of subjectivity on the part of the pathologist in applying the definition of tumour grade, particularly nuclear grade. Elements able to account for this subjectivity in renal cell carcinoma are identified from a review of the literature. To improve the reliability of the nuclear grade, the territory occupied by the highest grade must be specified and the grades should probably be combined. At the present time, regrouping grade 1 and 2 tumours as low grade and grade 3 and 4 tumours as high grade would achieve better reproducibility, while preserving the prognostic value for overall survival. The development of new treatment modalities and their use in adjuvant situations will require reliable histoprognostic factors to specify indications.

  8. Why are models unable to reproduce multi-decadal trends in lower tropospheric baseline ozone levels?

    Science.gov (United States)

    Hu, L.; Liu, J.; Mickley, L. J.; Strahan, S. E.; Steenrod, S.

    2017-12-01

    Assessments of tropospheric ozone radiative forcing rely on accurate model simulations. Parrish et al. (2014) found that three chemistry-climate models (CCMs) overestimate present-day O3 mixing ratios and capture only 50% of the observed O3 increase over the last five decades at 12 baseline sites in the northern mid-latitudes, indicating large uncertainties in our understanding of the ozone trends and their implications for radiative forcing. Here we present comparisons of outputs from two chemical transport models (CTMs) - GEOS-Chem and the Global Modeling Initiative model - with O3 observations from the same sites and from the global ozonesonde network. Both CTMs are driven by reanalysis meteorological data (MERRA or MERRA2) and are thus expected to differ in atmospheric transport processes from the free-running CCMs. We test whether recent model developments leading to more active ozone chemistry affect the computed ozone sensitivity to perturbations in emissions. Preliminary results suggest these CTMs can reproduce present-day ozone levels but fail to capture the multi-decadal trend since 1980. Both models yield widespread overpredictions of free tropospheric ozone in the 1980s. Sensitivity studies in GEOS-Chem suggest that the model estimate of natural background ozone is too high. We discuss factors that contribute to the variability and trends of tropospheric ozone over the last 30 years, with a focus on intermodel differences in spatial resolution and in the representation of stratospheric chemistry, stratosphere-troposphere exchange, halogen chemistry, and biogenic VOC emissions and chemistry. We also discuss uncertainty in the historical emission inventories used in models, and how these affect the simulated ozone trends.

  9. Reproducing tailing in breakthrough curves: Are statistical models equally representative and predictive?

    Science.gov (United States)

    Pedretti, Daniele; Bianchi, Marco

    2018-03-01

    Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models both for the empirical fitting of these curves and for the prediction of transport using upscaling models based on best-fitted estimated parameters (e.g. the power law slope or exponent). The predictive capacity of power-law-based upscaling models can however be questioned due to the difficulty of linking model parameters with the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implication of statistical subsampling, which often renders power laws indistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulty of reconciling fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (1.5 < αPL < …) as the tailing becomes heavier. Strong fluctuations occur when the number of samples is limited, due to the effects of subsampling. On the other hand, when the power law model embeds a cutoff (PLCO), the best-fitted exponent (αCO) is insensitive to the degree of tailing and to the effects of subsampling and tends to a constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated to the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple…
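A common way to obtain a best-fitted power-law exponent for a late-time BTC tail is least-squares regression in log-log space. The sketch below illustrates that empirical fitting choice on a noiseless synthetic tail; it is not the authors' exact estimation procedure, and the helper name is ours.

```python
# Sketch: estimate the late-time power-law slope of a breakthrough curve
# C(t) ~ t^-alpha by least-squares in log-log coordinates.
import math

def loglog_slope(times, concentrations):
    """Ordinary least-squares slope of log(C) versus log(t)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(c) for c in concentrations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# A synthetic noiseless tail C(t) = t^-2 recovers the exponent exactly:
ts = [10.0 * 1.5 ** k for k in range(20)]
cs = [t ** -2.0 for t in ts]
print(round(-loglog_slope(ts, cs), 3))   # -> 2.0
```

On real, subsampled data the fitted exponent fluctuates strongly, which is exactly the limitation the abstract highlights for the PL model without a cutoff.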

  10. Reproducibility and reliability of hypoglycaemic episodes recorded with Continuous Glucose Monitoring System (CGMS) in daily life

    DEFF Research Database (Denmark)

    Høi-Hansen, T; Pedersen-Bjergaard, U; Thorsteinsson, B

    2005-01-01

    AIM: Continuous glucose monitoring may reveal episodes of unrecognized hypoglycaemia. We evaluated the reproducibility and reliability of hypoglycaemic episodes recorded in daily life by the Medtronic MiniMed Continuous Glucose Monitoring System (CGMS). METHODS: Twenty-nine adult patients with Type 1 … data were recalibrated, generating four different CGMS data sets [left-A (left side of abdomen, calibration set A), left-B, right-A and right-B]. Agreement between CGMS data sets was evaluated during hypoglycaemic events, comparing CGMS readings ≤ 2.2 mmol/l with nadir values from corresponding CGMS data sets. CGMS readings were also compared with independent self-monitored blood glucose (SMBG) values. RESULTS: With hypoglycaemia (CGMS readings ≤ 2.2 mmol/l) in calibration set left-A, values below 3.5 mmol/l were present in 99% (95% CI: 95-100%) of samples in left-B, 91% (95% CI: 84…

  11. Demography-based adaptive network model reproduces the spatial organization of human linguistic groups

    Science.gov (United States)

    Capitán, José A.; Manrubia, Susanna

    2015-12-01

    The distribution of human linguistic groups presents a number of interesting and nontrivial patterns. The distributions of the number of speakers per language and the area each group covers follow log-normal distributions, while population and area fulfill an allometric relationship. The topology of networks of spatial contacts between different linguistic groups has been recently characterized, showing atypical properties of the degree distribution and clustering, among others. Human demography, spatial conflicts, and the construction of networks of contacts between linguistic groups are mutually dependent processes. Here we introduce an adaptive network model that takes all of them into account and successfully reproduces, using only four model parameters, not only those features of linguistic groups already described in the literature, but also correlations between demographic and topological properties uncovered in this work. Besides their relevance when modeling and understanding processes related to human biogeography, our adaptive network model admits a number of generalizations that broaden its scope and make it suitable to represent interactions between agents based on population dynamics and competition for space.

  12. A small animal holding fixture system with positional reproducibility for longitudinal multimodal imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kokuryo, Daisuke; Kimura, Yuichi; Obata, Takayuki; Yamaya, Taiga; Kawamura, Kazunori; Zhang, Ming-Rong; Kanno, Iwao; Aoki, Ichio, E-mail: ukimura@ieee.or [Molecular Imaging Center, National Institute of Radiological Sciences, 4-9-1 Anagawa, Inage, Chiba 263-8555 (Japan)

    2010-07-21

    This study presents a combined small animal holding fixture system, termed a 'bridge capsule', which provides for small animal re-fixation with positional reproducibility. This system comprises separate holding fixtures for the head and lower body and a connecting part to a gas anesthesia system. A mouse is fixed in place by the combination of a head fixture with a movable part made from polyacetal resin, a lower body fixture made from vinyl-silicone and a holder for the legs and tail. For re-fixation, a similar posture could be maintained by the same holding fixtures and a constant distance between the head and lower body fixtures is maintained. Artifacts caused by the bridge capsule system were not observed on magnetic resonance (MRI) and positron emission tomography (PET) images. The average position differences of the spinal column and the iliac body before and after re-fixation for the same modality were approximately 1.1 mm. The difference between the MRI and PET images was approximately 1.8 mm for the lower body fixture after image registration using fiducial markers. This system would be useful for longitudinal, repeated and multimodal imaging experiments requiring similar animal postures.

  13. Contrasting response to nutrient manipulation in Arctic mesocosms are reproduced by a minimum microbial food web model.

    Science.gov (United States)

    Larsen, Aud; Egge, Jorun K; Nejstgaard, Jens C; Di Capua, Iole; Thyrhaug, Runar; Bratbak, Gunnar; Thingstad, T Frede

    2015-03-01

    A minimum mathematical model of the marine pelagic microbial food web has previously been shown to reproduce central aspects of the observed system response to different bottom-up manipulations in a mesocosm experiment, Microbial Ecosystem Dynamics (MEDEA), in Danish waters. In this study, we apply this model to two mesocosm experiments (Polar Aquatic Microbial Ecology (PAME)-I and PAME-II) conducted at the Arctic location Kongsfjorden, Svalbard. The different responses of the microbial community to similar nutrient manipulation in the three mesocosm experiments may be described as diatom-dominated (MEDEA), bacteria-dominated (PAME-I), and flagellate-dominated (PAME-II). When ciliates are allowed to feed on small diatoms, the model describing the diatom-dominated MEDEA experiment gives a bacteria-dominated response as observed in PAME-I, in which the diatom community comprised almost exclusively small-sized cells. Introducing a high initial mesozooplankton stock as observed in PAME-II, the model gives a flagellate-dominated response in accordance with the observed response of this experiment. The ability of the model, originally developed for temperate waters, to reproduce population dynamics in a 10°C colder Arctic fjord does not support the existence of important shifts in population balances over this temperature range. Rather, it suggests a quite resilient microbial food web when adapted to in situ temperature. The sensitivity of the model response to its mesozooplankton component suggests, however, that the seasonal vertical migration of Arctic copepods may be a strong forcing factor on Arctic microbial food webs.

  14. How well do CMIP5 Climate Models Reproduce the Hydrologic Cycle of the Colorado River Basin?

    Science.gov (United States)

    Gautam, J.; Mascaro, G.

    2017-12-01

    The Colorado River, which is the primary source of water for nearly 40 million people in the arid Southwestern United States, has been experiencing an extended drought since 2000, which has led to a significant reduction in water supply. As water demands increase, one of the major challenges for water management in the region has been the quantification of uncertainties associated with streamflow predictions in the Colorado River Basin (CRB) under potential changes of future climate. Hence, testing the reliability of model predictions in the CRB is critical to addressing this challenge. In this study, we evaluated the performance of 17 General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and 4 Regional Climate Models (RCMs) in reproducing the statistical properties of the hydrologic cycle in the CRB. We evaluated the water balance components at four nested sub-basins along with the inter-annual and intra-annual changes of precipitation (P), evaporation (E), runoff (R) and temperature (T) from 1979 to 2005. Most of the models captured the net water balance fairly well in the most-upstream basin but simulated a weak hydrological cycle in the evaporation channel at the downstream locations. The simulated monthly variability of P had different patterns, with correlation coefficients ranging from -0.6 to 0.8 depending on the sub-basin, and with models from the same parent institution clustering together. Apart from the most-upstream sub-basin, where the models were mainly characterized by a negative seasonal bias in SON (of up to -50%), most of them had a positive bias in all seasons (of up to +260%) in the other three sub-basins. The models, however, captured the monthly variability of T well at all sites, with small inter-model variability and a relatively similar range of bias (-7 °C to +5 °C) across all seasons. The Mann-Kendall test was applied to the annual P and T time-series, where the majority of the models…
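The Mann-Kendall test mentioned at the end of this abstract has a simple standard form. The sketch below implements the textbook statistic without tie correction, applied to a toy annual series; the series values are illustrative, not data from the study.

```python
# Sketch: standard Mann-Kendall trend statistic (no tie correction),
# as commonly applied to annual precipitation or temperature series.
import math

def mann_kendall(series):
    """Return the S statistic and the normal-approximation Z score."""
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Toy warming series: positive S (and Z above 1.96) indicates a
# significant increasing trend at the 5% level.
s, z = mann_kendall([10.1, 10.4, 10.2, 10.9, 11.3, 11.2, 11.8])
print(s, round(z, 2))
```

A longer record with serial correlation would normally call for pre-whitening or a modified variance, but the basic statistic above is the starting point.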

  15. Reproducibility of summertime diurnal precipitation over northern Eurasia simulated by CMIP5 climate models

    Science.gov (United States)

    Hirota, N.; Takayabu, Y. N.

    2015-12-01

    The reproducibility of diurnal precipitation over northern Eurasia simulated by CMIP5 climate models in their historical runs was evaluated in comparison with station data (NCDC-9813) and satellite data (GSMaP-V5). We first calculated diurnal cycles by averaging precipitation at each local solar time (LST) in June-July-August during 1981-2000 over the continent of northern Eurasia (0-180E, 45-90N). We then examined the occurrence time of maximum precipitation and the contribution of diurnally varying precipitation to the total precipitation. The contribution of diurnal precipitation was about 21% in both NCDC-9813 and GSMaP-V5. The maximum precipitation occurred at 18LST in NCDC-9813 but 16LST in GSMaP-V5, indicating some uncertainty even in the observational datasets. The diurnal contribution of the CMIP5 models varied largely, from 11% to 62%, and their timing of the precipitation maximum ranged from 11LST to 20LST. Interestingly, the contribution and the timing had a strong negative correlation of -0.65: the models with larger diurnal precipitation showed a precipitation maximum earlier, around noon. Next, we compared the sensitivity of precipitation to surface temperature and tropospheric humidity between the 5 models with large diurnal precipitation (LDMs) and the 5 models with small diurnal precipitation (SDMs). Precipitation in LDMs showed high sensitivity to surface temperature, indicating its close relationship with local instability. On the other hand, synoptic disturbances were more active in SDMs, with a dominant role of large-scale condensation, and precipitation in SDMs was more related to tropospheric moisture. The relative importance of local instability and synoptic disturbances is therefore suggested to be an important factor in determining the contribution and timing of diurnal precipitation. Acknowledgment: This study is supported by the Green Network of Excellence (GRENE) Program of the Ministry of Education, Culture, Sports, Science and Technology.
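Compositing a diurnal cycle by local solar hour and locating its maximum, as described in this abstract, is straightforward to sketch. The "diurnal contribution" metric below (mean absolute deviation from the daily mean, relative to the mean) is one plausible definition, not necessarily the one used in the study, and the toy records are illustrative.

```python
# Sketch: composite a diurnal precipitation cycle and find its peak hour.
# The contribution metric here is a hypothetical stand-in for the paper's.

def diurnal_cycle(hourly_records):
    """hourly_records: iterable of (local_solar_hour, precipitation)."""
    totals, counts = [0.0] * 24, [0] * 24
    for hour, p in hourly_records:
        totals[hour] += p
        counts[hour] += 1
    cycle = [t / c for t, c in zip(totals, counts)]   # mean per LST hour
    mean = sum(cycle) / 24.0
    peak_hour = max(range(24), key=lambda h: cycle[h])
    contribution = sum(abs(v - mean) for v in cycle) / (24.0 * mean)
    return peak_hour, contribution

# Toy data: a flat background with an afternoon spike at 16 LST, two days.
records = [(h, 1.0 + (2.0 if h == 16 else 0.0)) for h in range(24)] * 2
peak, contrib = diurnal_cycle(records)
print(peak)   # -> 16
```

Running the same composite on each model's output and on the observations is what allows the peak-timing and contribution comparisons reported above.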

  16. Quality assurance: Fundamental reproducibility tests for 3D treatment‐planning systems

    Science.gov (United States)

    Able, Charles M.; Thomas, Michael D.

    2005-01-01

    The use of image‐based 3D treatment planning has significantly increased the complexity of commercially available treatment‐planning systems (TPSs). Medical physicists have traditionally focused their efforts on understanding the calculation algorithm; this is no longer possible. A quality assurance (QA) program for our 3D treatment‐planning system (ADAC Pinnacle3) is presented. The program is consistent with the American Association of Physicists in Medicine Task Group 53 guidelines and balances the cost‐versus‐benefit equation confronted by the clinical physicist in a community cancer center environment. Fundamental reproducibility tests are presented as required for a community cancer center environment using conventional and 3D treatment planning. A series of nondosimetric tests, including digitizer accuracy, image acquisition and display, and hardcopy output, is presented. Dosimetric tests include verification of monitor units (MUs), standard isodoses, and clinical cases. The tests are outlined for the Pinnacle3 TPS but can be generalized to any TPS currently in use. The program tested accuracy and constancy through several hardware and software upgrades to our TPS. This paper gives valuable guidance and insight to other physicists attempting to approach TPS QA at fundamental and practical levels. PACS numbers: 87.53.Tf, 87.53.Xd PMID:16143788

  17. Reproducibility and consistency of proteomic experiments on natural populations of a non-model aquatic insect.

    Science.gov (United States)

    Hidalgo-Galiana, Amparo; Monge, Marta; Biron, David G; Canals, Francesc; Ribera, Ignacio; Cieslak, Alexandra

    2014-01-01

    Population proteomics has great potential to address evolutionary and ecological questions, but its use in wild populations of non-model organisms is hampered by uncontrolled sources of variation. Here we compare the response to temperature extremes of two geographically distant populations of a diving beetle species (Agabus ramblae) using 2-D DIGE. After one week of acclimation in the laboratory under standard conditions, a third of the specimens from each population were placed at either 4 or 27°C for 12 h, with another third left as a control. We then compared the protein expression levels of three replicated samples of 2-3 specimens for each treatment. Within each population, variation between replicated samples of the same treatment was always lower than variation between treatments, except for some control samples that retained a wider range of expression levels. The two populations had a similar response, without significant differences in the number of protein spots over- or under-expressed in the pairwise comparisons between treatments. We identified exemplary proteins among those differentially expressed between treatments, which proved to be proteins known to be related to thermal response or stress. Overall, our results indicate that specimens collected in the wild are suitable for proteomic analyses, as the additional sources of variation were not enough to mask the consistency and reproducibility of the response to the temperature treatments.

  18. The interobserver reproducibility of thyroid cytopathology using Bethesda Reporting System: Analysis of 200 cases

    International Nuclear Information System (INIS)

    Ahmed, S.; Khan, M.A.; Kazi, F.

    2013-01-01

    Objective: To determine the interobserver reproducibility of thyroid cytopathology in cases of thyroid fine needle aspirates. Methods: The retrospective, descriptive study was conducted at the Foundation University Medical College, Islamabad, using cases from the period between 2009 and 2011. A total of 200 cases of fine-needle aspirations were retrieved from the archives. Three histopathologists independently categorised them into 6 groups according to Bethesda reporting system guidelines without looking at the previous reports. Kappa statistics were used for analysis of the results on SPSS 17. Results: Of the 200 patients, 194 (97%) were females and 6 (3%) were males. The overall mean age of patients was 46 ± 20 years. The kappa value calculated for observer-1 and observer-2 was 0.735; for observer-1 and observer-3, 0.841; and for observer-2 and observer-3, 0.838, showing substantial interobserver agreement. Histopathological correlation was available for 39 (19.5%) cases. Of these cases, 5 (13%) were non-diagnostic, 20 (51%) benign, 2 (5%) atypia of undetermined significance/follicular lesion of undetermined significance, 6 (15%) follicular neoplasm, 1 (3%) suspicious for malignancy, and 5 (13%) malignant. Conclusions: Good overall interobserver agreement was found, but discordance was seen when certain categories were analysed separately. (author)
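The pairwise kappa values reported above use the standard unweighted Cohen's kappa, which can be sketched directly. The toy ratings below are illustrative, not data from the study.

```python
# Sketch: unweighted Cohen's kappa for two raters over categorical
# diagnoses -- the agreement statistic reported in the abstract.

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two equal-length rating lists."""
    cats = sorted(set(r1) | set(r2))
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n)        # chance agreement
             for c in cats)
    return (po - pe) / (1 - pe)

rater1 = ["benign", "benign", "malignant", "benign", "fn", "malignant"]
rater2 = ["benign", "malignant", "malignant", "benign", "fn", "malignant"]
print(round(cohen_kappa(rater1, rater2), 3))   # -> 0.739
```

Values around 0.61-0.80 are conventionally read as "substantial" agreement, which is why the 0.735-0.841 range in the abstract is summarized that way.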

  19. Can Computational Sediment Transport Models Reproduce the Observed Variability of Channel Networks in Modern Deltas?

    Science.gov (United States)

    Nesvold, E.; Mukerji, T.

    2017-12-01

    River deltas display complex channel networks that can be characterized through the framework of graph theory, as shown by Tejedor et al. (2015). Deltaic patterns may also be useful in a Bayesian approach to uncertainty quantification of the subsurface, but this requires a prior distribution of the networks of ancient deltas. By considering subaerial deltas, one can at least obtain a snapshot in time of the channel network spectrum across deltas. In this study, the directed graph structure is semi-automatically extracted from satellite imagery using techniques from statistical processing and machine learning. Once the network is labeled with vertices and edges, spatial trends and width and sinuosity distributions can also be found easily. Since imagery is inherently 2D, computational sediment transport models can serve as a link between 2D network structure and 3D depositional elements; the numerous empirical rules and parameters built into such models makes it necessary to validate the output with field data. For this purpose we have used a set of 110 modern deltas, with average water discharge ranging from 10 - 200,000 m3/s, as a benchmark for natural variability. Both graph theoretic and more general distributions are established. A key question is whether it is possible to reproduce this deltaic network spectrum with computational models. Delft3D was used to solve the shallow water equations coupled with sediment transport. The experimental setup was relatively simple; incoming channelized flow onto a tilted plane, with varying wave and tidal energy, sediment types and grain size distributions, river discharge and a few other input parameters. Each realization was run until a delta had fully developed: between 50 and 500 years (with a morphology acceleration factor). It is shown that input parameters should not be sampled independently from the natural ranges, since this may result in deltaic output that falls well outside the natural spectrum. Since we are

  20. Superficial Ultrasound Shear Wave Speed Measurements in Soft and Hard Elasticity Phantoms: Repeatability and Reproducibility Using Two Different Ultrasound Systems

    Science.gov (United States)

    Dillman, Jonathan R.; Chen, Shigao; Davenport, Matthew S.; Zhao, Heng; Urban, Matthew W.; Song, Pengfei; Watcharotone, Kuanwong; Carson, Paul L.

    2014-01-01

    Background There is a paucity of data available regarding the repeatability and reproducibility of superficial shear wave speed (SWS) measurements at imaging depths relevant to the pediatric population. Purpose To assess the repeatability and reproducibility of superficial shear wave speed (SWS) measurements acquired from elasticity phantoms at varying imaging depths using three different imaging methods, two different ultrasound systems, and multiple operators. Methods and Materials Soft and hard elasticity phantoms manufactured by Computerized Imaging Reference Systems, Inc. (Norfolk, VA) were utilized for our investigation. Institution #1 used an Acuson S3000 ultrasound system (Siemens Medical Solutions USA, Inc.) and three different shear wave imaging method/transducer combinations, while institution #2 used an Aixplorer ultrasound system (Supersonic Imagine) and two different transducers. Ten stiffness measurements were acquired from each phantom at three depths (1.0, 2.5, and 4.0 cm) by four operators at each institution. Student’s t-test was used to compare SWS measurements between imaging techniques, while SWS measurement agreement was assessed with two-way random effects single measure intra-class correlation coefficients and coefficients of variation. Mixed model regression analysis determined the effect of predictor variables on SWS measurements. Results For the soft phantom, the average of mean SWS measurements across the various imaging methods and depths was 0.84 ± 0.04 m/s (mean ± standard deviation) for the Acuson S3000 system and 0.90 ± 0.02 m/s for the Aixplorer system (p=0.003). For the hard phantom, the average of mean SWS measurements across the various imaging methods and depths was 2.14 ± 0.08 m/s for the Acuson S3000 system and 2.07 ± 0.03 m/s for the Aixplorer system (p>0.05). The coefficients of variation were low (0.5–6.8%), and inter-operator agreement was near-perfect (ICCs ≥0.99). Shear wave imaging method and imaging depth

  1. Superficial ultrasound shear wave speed measurements in soft and hard elasticity phantoms: repeatability and reproducibility using two ultrasound systems.

    Science.gov (United States)

    Dillman, Jonathan R; Chen, Shigao; Davenport, Matthew S; Zhao, Heng; Urban, Matthew W; Song, Pengfei; Watcharotone, Kuanwong; Carson, Paul L

    2015-03-01

    There is a paucity of data available regarding the repeatability and reproducibility of superficial shear wave speed (SWS) measurements at imaging depths relevant to the pediatric population. To assess the repeatability and reproducibility of superficial shear wave speed measurements acquired from elasticity phantoms at varying imaging depths using three imaging methods, two US systems and multiple operators. Soft and hard elasticity phantoms manufactured by Computerized Imaging Reference Systems Inc. (Norfolk, VA) were utilized for our investigation. Institution No. 1 used an Acuson S3000 US system (Siemens Medical Solutions USA, Malvern, PA) and three shear wave imaging method/transducer combinations, while institution No. 2 used an Aixplorer US system (SuperSonic Imagine, Bothell, WA) and two different transducers. Ten stiffness measurements were acquired from each phantom at three depths (1.0 cm, 2.5 cm and 4.0 cm) by four operators at each institution. Student's t-test was used to compare SWS measurements between imaging techniques, while SWS measurement agreement was assessed with two-way random effects single-measure intra-class correlation coefficients (ICCs) and coefficients of variation. Mixed model regression analysis determined the effect of predictor variables on SWS measurements. For the soft phantom, the average of mean SWS measurements across the various imaging methods and depths was 0.84 ± 0.04 m/s (mean ± standard deviation) for the Acuson S3000 system and 0.90 ± 0.02 m/s for the Aixplorer system (P = 0.003). For the hard phantom, the average of mean SWS measurements across the various imaging methods and depths was 2.14 ± 0.08 m/s for the Acuson S3000 system and 2.07 ± 0.03 m/s for the Aixplorer system (P > 0.05). The coefficients of variation were low (0.5-6.8%), and interoperator agreement was near-perfect (ICCs ≥ 0.99). Shear wave imaging method and imaging depth significantly affected measured SWS (P
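
    The agreement statistics named in this abstract can be sketched directly: a two-way random effects, absolute-agreement, single-measure intra-class correlation coefficient (ICC(2,1) in the Shrout-Fleiss nomenclature) and a coefficient of variation. The SWS readings below are invented for illustration, not the study's data.

```python
import numpy as np

def icc2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    x is an (n targets) x (k raters) matrix of measurements."""
    n, k = x.shape
    grand = x.mean()
    row_m = x.mean(axis=1)                                  # per-target means
    col_m = x.mean(axis=0)                                  # per-rater means
    msr = k * ((row_m - grand) ** 2).sum() / (n - 1)        # between targets
    msc = n * ((col_m - grand) ** 2).sum() / (k - 1)        # between raters
    sse = ((x - row_m[:, None] - col_m[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                         # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def cv_percent(values):
    """Coefficient of variation in percent (sample standard deviation)."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Hypothetical SWS readings (m/s): 4 phantom/depth targets x 3 operators.
x = np.array([[0.84, 0.85, 0.83],
              [0.90, 0.91, 0.90],
              [2.10, 2.12, 2.09],
              [2.20, 2.18, 2.21]])
icc = icc2_1(x)          # near 1 when operators agree closely
cv = cv_percent(x[0])    # CV (%) of one target across operators
```

With readings that differ far more between targets than between operators, the ICC approaches 1 and the per-target CVs stay in the low single digits, mirroring the pattern reported above.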

  2. Validation of the 3D Skin Comet assay using full thickness skin models: Transferability and reproducibility.

    Science.gov (United States)

    Reisinger, Kerstin; Blatz, Veronika; Brinkmann, Joep; Downs, Thomas R; Fischer, Anja; Henkler, Frank; Hoffmann, Sebastian; Krul, Cyrille; Liebsch, Manfred; Luch, Andreas; Pirow, Ralph; Reus, Astrid A; Schulz, Markus; Pfuhler, Stefan

    2018-03-01

    Recently revised OECD Testing Guidelines highlight the importance of considering the first site-of-contact when investigating the genotoxic hazard. Thus far, only in vivo approaches are available to address the dermal route of exposure. The 3D Skin Comet and Reconstructed Skin Micronucleus (RSMN) assays intend to close this gap in the in vitro genotoxicity toolbox by investigating DNA damage after topical application. This represents the most relevant route of exposure for a variety of compounds found in household products, cosmetics, and industrial chemicals. The comet assay methodology is able to detect both chromosomal damage and DNA lesions that may give rise to gene mutations, thereby complementing the RSMN which detects only chromosomal damage. Here, the comet assay was adapted to two reconstructed full thickness human skin models: the EpiDerm™ and Phenion® Full-Thickness Skin Models. First, tissue-specific protocols for the isolation of single cells and the general comet assay were transferred to European and US-American laboratories. After establishment of the assay, the protocol was then further optimized with appropriate cytotoxicity measurements and the use of aphidicolin, a DNA repair inhibitor, to improve the assay's sensitivity. In the first phase of an ongoing validation study eight chemicals were tested in three laboratories each using the Phenion® Full-Thickness Skin Model, informing several validation modules. Ultimately, the 3D Skin Comet assay demonstrated a high predictive capacity and good intra- and inter-laboratory reproducibility, with four laboratories reaching a 100% predictivity and the fifth yielding 70%. The data are intended to demonstrate the use of the 3D Skin Comet assay as a new in vitro tool for following up on positive findings from the standard in vitro genotoxicity test battery for dermally applied chemicals, ultimately helping to drive the regulatory acceptance of the assay. To expand the database, the validation will

  3. Reproducibility of small animal cine and scar cardiac magnetic resonance imaging using a clinical 3.0 tesla system

    International Nuclear Information System (INIS)

    Manka, Robert; Jahnke, Cosima; Hucko, Thomas; Dietrich, Thore; Gebker, Rolf; Schnackenburg, Bernhard; Graf, Kristof; Paetsch, Ingo

    2013-01-01

    To evaluate the inter-study, inter-reader and intra-reader reproducibility of cardiac cine and scar imaging in rats using a clinical 3.0 Tesla magnetic resonance (MR) system. Thirty-three adult rats (Sprague–Dawley) were imaged 24 hours after surgical occlusion of the left anterior descending coronary artery using a 3.0 Tesla clinical MR scanner (Philips Healthcare, Best, The Netherlands) equipped with a dedicated 70 mm solenoid receive-only coil. Left-ventricular (LV) volumes, mass, ejection fraction and amount of myocardial scar tissue were measured. Intra- and inter-observer reproducibility was assessed in all animals. In addition, repeat MR exams were performed in 6 randomly chosen rats within 24 hours to assess inter-study reproducibility. The MR imaging protocol was successfully completed in 32 (97%) animals. Bland-Altman analysis demonstrated high intra-reader reproducibility (mean bias%: LV end-diastolic volume (LVEDV), -1.7%; LV end-systolic volume (LVESV), -2.2%; LV ejection fraction (LVEF), 1.0%; LV mass, -2.7%; and scar mass, -1.2%) and high inter-reader reproducibility (mean bias%: LVEDV, 3.3%; LVESV, 6.2%; LVEF, -4.8%; LV mass, -1.9%; and scar mass, -1.8%). In addition, a high inter-study reproducibility was found (mean bias%: LVEDV, 0.1%; LVESV, -1.8%; LVEF, 1.0%; LV mass, -4.6%; and scar mass, -6.2%). Cardiac MR imaging of rats yielded highly reproducible measurements of cardiac volumes/function and myocardial infarct size on a clinical 3.0 Tesla MR scanner system. Consequently, more widely available high field clinical MR scanners can be employed for small animal imaging of the heart, e.g. when aiming at serial assessments during therapeutic intervention studies.
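
    The Bland-Altman "mean bias%" figures quoted above come from a standard agreement analysis of paired readings. A minimal sketch, with invented repeat LVEDV readings rather than the study's data, computing the percentage bias and 95% limits of agreement:

```python
import numpy as np

def bland_altman(a, b):
    """Mean bias (%) and 95% limits of agreement between paired readings,
    with each difference expressed relative to the pairwise mean."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff_pct = 100.0 * (a - b) / ((a + b) / 2.0)
    bias = diff_pct.mean()
    sd = diff_pct.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical repeat LVEDV readings (microlitres) from two sessions.
first = [420, 455, 398, 510, 470, 440]
second = [428, 450, 405, 505, 478, 436]
bias, loa = bland_altman(first, second)
```

A mean bias near zero with narrow limits of agreement is what "high reproducibility" means operationally in the abstract's reported numbers.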

  4. Reproducing the Wechsler Intelligence Scale for Children-Fifth Edition: Factor Model Results

    Science.gov (United States)

    Beaujean, A. Alexander

    2016-01-01

    One of the ways to increase the reproducibility of research is for authors to provide a sufficient description of the data analytic procedures so that others can replicate the results. The publishers of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) do not follow these guidelines when reporting their confirmatory factor…

  5. Can a coupled meteorology–chemistry model reproduce the historical trend in aerosol direct radiative effects over the Northern Hemisphere?

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere h...

  6. Assessment of the potential forecasting skill of a global hydrological model in reproducing the occurrence of monthly flow extremes

    Directory of Open Access Journals (Sweden)

    N. Candogan Yossef

    2012-11-01

    As an initial step in assessing the prospect of using global hydrological models (GHMs) for hydrological forecasting, this study investigates the skill of the GHM PCR-GLOBWB in reproducing the occurrence of past extremes in monthly discharge on a global scale. Global terrestrial hydrology from 1958 until 2001 is simulated by forcing PCR-GLOBWB with daily meteorological data obtained by downscaling the CRU dataset to daily fields using the ERA-40 reanalysis. Simulated discharge values are compared with observed monthly streamflow records for a selection of 20 large river basins that represent all continents and a wide range of climatic zones.

    We assess model skill in three ways, all of which contribute different information on the potential forecasting skill of a GHM. First, the general skill of the model in reproducing hydrographs is evaluated. Second, model skill in reproducing significantly higher and lower flows than the monthly normals is assessed in terms of skill scores used for forecasts of categorical events. Third, model skill in reproducing flood and drought events is assessed by constructing binary contingency tables for floods and droughts for each basin. The skill is then compared to that of a simple estimation of discharge from the water balance (P − E).

    The results show that the model has skill in all three types of assessments. After bias correction the model skill in simulating hydrographs is improved considerably. For most basins it is higher than that of the climatology. The skill is highest in reproducing monthly anomalies. The model also has skill in reproducing floods and droughts, with a markedly higher skill in floods. The model skill far exceeds that of the water balance estimate. We conclude that the prospect for using PCR-GLOBWB for monthly and seasonal forecasting of the occurrence of hydrological extremes is positive. We argue that this conclusion applies equally to other similar GHMs and
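
    The binary contingency-table assessment described above can be sketched in a few lines: tally hits, misses and false alarms for simulated versus observed flood months, then derive categorical skill scores. The event series here are invented for illustration.

```python
# 2x2 contingency-table skill for simulated vs. observed monthly flood
# occurrence (1 = flood month). Series are hypothetical, not study data.
observed = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
simulated = [1, 0, 1, 1, 0, 0, 0, 0, 1, 0]

hits = sum(o and s for o, s in zip(observed, simulated))
misses = sum(o and not s for o, s in zip(observed, simulated))
false_alarms = sum((not o) and s for o, s in zip(observed, simulated))

hit_rate = hits / (hits + misses)                 # probability of detection
far = false_alarms / (hits + false_alarms)        # false-alarm ratio
csi = hits / (hits + misses + false_alarms)       # critical success index
```

The same tallies, computed per basin for floods and for droughts, support the comparison of flood skill versus drought skill reported above.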

  7. Reproducing Electric Field Observations during Magnetic Storms by means of Rigorous 3-D Modelling and Distortion Matrix Co-estimation

    Science.gov (United States)

    Püthe, Christoph; Manoj, Chandrasekharan; Kuvshinov, Alexey

    2015-04-01

    Electric fields induced in the conducting Earth during magnetic storms drive currents in power transmission grids, telecommunication lines or buried pipelines. These geomagnetically induced currents (GIC) can cause severe service disruptions. The prediction of GIC is thus of great importance for the public and for industry. A key step in the prediction of the hazard to technological systems during magnetic storms is the calculation of the geoelectric field. To address this issue for mid-latitude regions, we developed a method that involves 3-D modelling of induction processes in a heterogeneous Earth and the construction of a model of the magnetospheric source. The latter is described by low-degree spherical harmonics; its temporal evolution is derived from observatory magnetic data. Time series of the electric field can be computed for every location on Earth's surface. The actual electric field however is known to be perturbed by galvanic effects, arising from very local near-surface heterogeneities or topography, which cannot be included in the conductivity model. Galvanic effects are commonly accounted for with a real-valued time-independent distortion matrix, which linearly relates measured and computed electric fields. Using data of various magnetic storms that occurred between 2000 and 2003, we estimated distortion matrices for observatory sites onshore and on the ocean bottom. Strong correlations between modelled and measured fields validate our method. The distortion matrix estimates prove to be reliable, as they are accurately reproduced for different magnetic storms. We further show that 3-D modelling is crucial for a correct separation of galvanic and inductive effects and a precise prediction of electric field time series during magnetic storms. Since the required computational resources are negligible, our approach is suitable for a real-time prediction of GIC. For this purpose, a reliable forecast of the source field, e.g. based on data from satellites
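
    The distortion-matrix step described above amounts to a linear least-squares problem: find the real, time-independent 2x2 matrix D such that E_meas(t) ≈ D · E_comp(t) over the storm time series. A minimal sketch with synthetic fields (the true matrix, noise level and sample count are illustrative assumptions):

```python
import numpy as np

# Estimate a real, time-independent 2x2 distortion matrix D relating
# computed and measured horizontal electric fields, E_meas = D @ E_comp,
# by least squares over the time series. Fields below are synthetic.
rng = np.random.default_rng(0)
D_true = np.array([[1.3, -0.2],
                   [0.1, 0.8]])
E_comp = rng.normal(size=(2, 500))                 # computed (Ex, Ey) vs time
E_meas = D_true @ E_comp + 0.01 * rng.normal(size=(2, 500))  # + noise

# Solve E_meas.T ≈ E_comp.T @ D.T in the least-squares sense.
D_est, *_ = np.linalg.lstsq(E_comp.T, E_meas.T, rcond=None)
D_est = D_est.T
```

Because D is estimated from the whole storm record, re-estimating it on a different storm and recovering nearly the same matrix is exactly the reproducibility check the abstract reports.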

  8. Systems-based biological concordance and predictive reproducibility of gene set discovery methods in cardiovascular disease.

    Science.gov (United States)

    Azuaje, Francisco; Zheng, Huiru; Camargo, Anyela; Wang, Haiying

    2011-08-01

    The discovery of novel disease biomarkers is a crucial challenge for translational bioinformatics. Demonstration of both their classification power and reproducibility across independent datasets are essential requirements to assess their potential clinical relevance. Small datasets and multiplicity of putative biomarker sets may explain lack of predictive reproducibility. Studies based on pathway-driven discovery approaches have suggested that, despite such discrepancies, the resulting putative biomarkers tend to be implicated in common biological processes. Investigations of this problem have been mainly focused on datasets derived from cancer research. We investigated the predictive and functional concordance of five methods for discovering putative biomarkers in four independently-generated datasets from the cardiovascular disease domain. A diversity of biosignatures was identified by the different methods. However, we found strong biological process concordance between them, especially in the case of methods based on gene set analysis. With a few exceptions, we observed lack of classification reproducibility using independent datasets. Partial overlaps between our putative sets of biomarkers and the primary studies exist. Despite the observed limitations, pathway-driven or gene set analysis can predict potentially novel biomarkers and can jointly point to biomedically-relevant underlying molecular mechanisms.

  9. [The Autocad system for planimetric study of the optic disc in glaucoma: technique and reproducibility study].

    Science.gov (United States)

    Sánchez Pérez, A; Honrubia López, F M; Larrosa Poves, J M; Polo Llorens, V; Melcon Sánchez-Frieras, B

    2001-09-01

    To develop a planimetry technique for the optic disc using AutoCAD, and to determine the magnitude of variability of optic disc morphological measurements. We employed AutoCAD R.14.0 (Autodesk): image acquisition, contour delimitation by multiple-line fitting or ellipse adjustment, image sectorialization and quantification of measurements (optic disc and excavation vertical diameters, optic disc area, excavation area, neuroretinal sector area and beta atrophy area). Intraimage (operator) and interimage (total) reproducibility were studied by the coefficient of variability (CV) (n=10) in normal and myopic optic discs. This technique allows optic disc measurements to be obtained in 5 to 10 minutes. Total or interimage variability of measurements introduced by one observer presented a CV range of 1.18-4.42. Operator or intraimage measurement presented a CV range of 0.30-4.21. Optic disc contour delimitation by ellipse adjustment achieved better reproducibility than multiple-line adjustment in all measurements. Computer-assisted AutoCAD planimetry is an interactive method for analysing the optic disc that is feasible to incorporate into clinical practice. Reproducibility results are comparable to those of other analyzers in quantifying optic disc morphology. Ellipse adjustment improves results in optic disc contour delimitation.

  10. Reproducibility and geometric accuracy of the Fixster system during hypofractionated stereotactic radiotherapy

    International Nuclear Information System (INIS)

    Lindvall, Peter; Bergström, Per; Löfroth, Per-Olov; Henriksson, Roger; Bergenheim, A Tommy

    2008-01-01

    Hypofractionated radiotherapy has been used for the treatment of AVMs and brain metastases. Hypofractionation necessitates the use of a relocatable stereotactic frame that has to be applied on several occasions. The stereotactic frame needs to have a high degree of reproducibility, and patient positioning is crucial to achieve a high accuracy of the treatment. In this study we have, by radiological means, evaluated the reproducibility of the isocenter in consecutive treatment sessions using the Fixster frame. Deviations in the X, Y and Z-axis were measured in 10 patients treated with hypofractionated radiotherapy. The mean deviation in the X-axis was 0.4 mm (range -2.1 to 2.1 mm, median 0.7 mm) and in the Y-axis -0.3 mm (range -1.4 to 0.7 mm, median -0.2 mm). The mean deviation in the Z-axis was -0.6 mm (range -1.4 to 1.4 mm, median 0.0 mm). There is a high degree of reproducibility of the isocenter during successive treatment sessions with HCSRT using the Fixster frame for stereotactic targeting. The high reproducibility enables safe treatment using hypofractionated stereotactic radiotherapy.

  11. Two-Finger Tightness: What Is It? Measuring Torque and Reproducibility in a Simulated Model.

    Science.gov (United States)

    Acker, William B; Tai, Bruce L; Belmont, Barry; Shih, Albert J; Irwin, Todd A; Holmes, James R

    2016-05-01

    Residents in training are often directed to insert screws using "two-finger tightness" to impart adequate torque but minimize the chance of a screw stripping in bone. This study seeks to quantify and describe two-finger tightness and to assess the variability of its application by residents in training. Cortical bone was simulated using a polyurethane foam block (30-pcf density) that was prepared with predrilled holes for tightening 3.5 × 14-mm long cortical screws and mounted to a custom-built apparatus on a load cell to capture torque data. Thirty-three residents in training, ranging from the first through fifth years of residency, along with 8 staff members, were directed to tighten 6 screws to two-finger tightness in the test block, and peak torque values were recorded. The participants were blinded to their torque values. Stripping torque (2.73 ± 0.56 N·m) was determined from 36 trials and served as a threshold for failed screw placement. The average torques varied substantially with regard to absolute torque values, thus poorly defining two-finger tightness. Junior residents reproduced torque less consistently than the other groups (0.29 and 0.32, respectively). These data quantify absolute values of two-finger tightness but demonstrate considerable variability in absolute torque values, percentage of stripping torque, and ability to consistently reproduce given torque levels. Increased years in training are weakly correlated with reproducibility, but experience does not seem to affect absolute torque levels. These results question the usefulness of two-finger tightness as a teaching tool and highlight the need for improvement in resident motor skill training and development within a teaching curriculum. Torque-measuring devices may be a useful simulation tool for this purpose.
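
    The reproducibility metrics behind this kind of study reduce to per-operator summary statistics on the recorded peak torques: a within-operator coefficient of variation, torque as a percentage of the stripping threshold, and a check against the stripping limit. The trial values below are invented for illustration; only the 2.73 N·m stripping torque comes from the abstract.

```python
import statistics

# Mean stripping torque reported in the abstract (N·m); trial torques
# below are hypothetical examples, not study data.
STRIPPING_TORQUE = 2.73

operator_trials = {
    "junior": [0.9, 1.4, 0.7, 1.6, 1.1, 0.8],
    "attending": [1.2, 1.3, 1.25, 1.15, 1.3, 1.2],
}

# Within-operator coefficient of variation (lower = more reproducible).
cvs = {name: statistics.stdev(t) / statistics.mean(t)
       for name, t in operator_trials.items()}

# Each peak torque as a percentage of the stripping threshold.
pct_of_strip = {name: [100.0 * v / STRIPPING_TORQUE for v in t]
                for name, t in operator_trials.items()}

# Did any trial exceed the stripping threshold (failed placement)?
any_stripped = any(v >= STRIPPING_TORQUE
                   for t in operator_trials.values() for v in t)
```

A wider spread of torques gives the junior operator a larger CV, which is the sense in which the abstract reports junior residents reproducing torque less consistently.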

  12. Intestinal microdialysis--applicability, reproducibility and local tissue response in a pig model

    DEFF Research Database (Denmark)

    Emmertsen, K J; Wara, P; Sørensen, Flemming Brandt

    2005-01-01

    BACKGROUND AND AIMS: Microdialysis has been applied to the intestinal wall for the purpose of monitoring local ischemia. The aim of this study was to investigate the applicability, reproducibility and local response to microdialysis in the intestinal wall. MATERIALS AND METHODS: In 12 pigs two...... the probes were processed for histological examination. RESULTS: Large intra- and inter-group differences in the relative recovery were found between all locations. Absolute values of metabolites showed no significant changes during the study period. The lactate in blood was 25-30% of the intra-tissue values...

  13. Reproducibility of a novel model of murine asthma-like pulmonary inflammation.

    Science.gov (United States)

    McKinley, L; Kim, J; Bolgos, G L; Siddiqui, J; Remick, D G

    2004-05-01

    Sensitization to cockroach allergens (CRA) has been implicated as a major cause of asthma, especially among inner-city populations. Endotoxin from Gram-negative bacteria has also been investigated for its role in attenuating or exacerbating the asthmatic response. We have created a novel model utilizing house dust extract (HDE) containing high levels of both CRA and endotoxin to induce pulmonary inflammation (PI) and airway hyperresponsiveness (AHR). A potential drawback of this model is that the HDE is in limited supply and preparation of new HDE will not contain the exact components of the HDE used to define our model system. The present study involved testing HDEs collected from various homes for their ability to cause PI and AHR. Dust collected from five homes was extracted in phosphate buffered saline overnight. The levels of CRA and endotoxin in the supernatants varied from 7.1 to 49.5 mg/ml of CRA and 1.7 to 6 µg/ml of endotoxin across the HDEs. Following immunization and two pulmonary exposures to HDE, all five HDEs induced AHR, PI and plasma IgE levels substantially higher than in normal mice. This study shows that HDE containing high levels of cockroach allergens and endotoxin collected from different sources can induce an asthma-like response in our murine model.

  14. Pharmacokinetic Modelling to Predict FVIII:C Response to Desmopressin and Its Reproducibility in Nonsevere Haemophilia A Patients.

    Science.gov (United States)

    Schütte, Lisette M; van Hest, Reinier M; Stoof, Sara C M; Leebeek, Frank W G; Cnossen, Marjon H; Kruip, Marieke J H A; Mathôt, Ron A A

    2018-04-01

    Nonsevere haemophilia A (HA) patients can be treated with desmopressin. Response of factor VIII activity (FVIII:C) differs between patients and is difficult to predict. Our aims were to describe FVIII:C response after desmopressin and its reproducibility by population pharmacokinetic (PK) modelling. Retrospective data of 128 nonsevere HA patients (age 7-75 years) receiving an intravenous or intranasal dose of desmopressin were used. PK modelling of FVIII:C was performed by nonlinear mixed effect modelling. Reproducibility of FVIII:C response was defined as less than 25% difference in peak FVIII:C between administrations. A total of 623 FVIII:C measurements from 142 desmopressin administrations were available; 14 patients had received two administrations on different occasions. The FVIII:C time profile was best described by a two-compartment model with first-order absorption and elimination. Interindividual variability of the estimated baseline FVIII:C, central volume of distribution and clearance were 37, 43 and 50%, respectively. The most recently measured FVIII:C (FVIII-recent) was significantly associated with FVIII:C response to desmopressin, with a FVIII:C increase of 0.47 IU/mL (median, interquartile range: 0.32-0.65 IU/mL, n = 142). FVIII:C response was reproducible in 6 out of 14 patients receiving two desmopressin administrations. FVIII:C response to desmopressin in nonsevere HA patients was adequately described by a population PK model. Large variability in FVIII:C response was observed, which could only partially be explained by FVIII-recent. FVIII:C response was not reproducible in a small subset of patients. Therefore, monitoring FVIII:C around surgeries or bleeding might be considered. Research is needed to study this further.
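
    The structural model identified above, a two-compartment model with first-order absorption and elimination, can be sketched as a small simulation. The rate constants, volumes and dose below are illustrative assumptions, not the fitted population estimates; a simple explicit Euler scheme stands in for a proper ODE solver.

```python
# Two-compartment PK model with first-order absorption into, and
# first-order elimination from, the central compartment. Parameter
# values and dose are hypothetical placeholders.
ka, cl, v1, q, v2 = 1.5, 0.08, 3.0, 0.05, 2.0  # 1/h, L/h, L, L/h, L
dose = 0.3                                      # amount units, hypothetical

dt, t_end = 0.005, 24.0
depot, a1, a2 = dose, 0.0, 0.0                  # compartment amounts
t, times, conc = 0.0, [], []
while t < t_end:
    d_depot = -ka * depot
    d_a1 = ka * depot - (cl / v1) * a1 - (q / v1) * a1 + (q / v2) * a2
    d_a2 = (q / v1) * a1 - (q / v2) * a2
    depot += dt * d_depot
    a1 += dt * d_a1
    a2 += dt * d_a2
    t += dt
    times.append(t)
    conc.append(a1 / v1)                        # central concentration

t_peak = times[conc.index(max(conc))]           # time of peak response
```

In the population-PK setting, the interindividual variability quoted in the abstract corresponds to random effects on parameters such as clearance and central volume; the simulated peak here is the analogue of the peak FVIII:C used in the reproducibility definition.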

  15. Assessing the reproducibility of high definition urethral pressure profilometry and its correlation with an air-charged system.

    Science.gov (United States)

    Klünder, Mario; Amend, Bastian; Sawodny, Oliver; Stenzl, Arnulf; Ederer, Michael; Kelp, Alexandra; Sievert, Karl-Dietrich; Feuer, Ronny

    2017-06-01

    Recently, a new urodynamic method for the assessment of stress urinary incontinence called high definition urethral pressure profilometry (HD-UPP) has been introduced. This method combines a novel microtip catheter with advanced signal processing to enable spatial data location and the reconstruction of a pressure image inside the urethra. In order to assess the reproducibility of HD-UPP data, we statistically evaluate HD-UPP datasets and compare them to data from a double balloon air-charged system. Both catheters are used on sedated female minipigs. Data from the microtip catheter are processed through a signal reconstruction algorithm, urodynamic features are extracted, and compared to the air-charged system. Reproducibility of HD-UPP data is assessed by statistically evaluating consecutive, intra-individual datasets. HD-UPP delivers results in agreement with previous comparisons of microtip and air-charged systems. The average deviation of two consecutive, intra-individual pressure images is very low at 7 cmH2O. HD-UPP provides physicians with detailed information on the pressure distribution inside the urethra. Through comparison with an air-charged catheter, it is shown that HD-UPP delivers results in agreement with previous studies on the comparison of microtip and air-charged catheters. It provides excellent reproducibility, as the difference between sequentially measured profiles from the same minipig is significantly lower than the one between profiles from different minipigs.

  16. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between the measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Also, thanks to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, finding a balance between fitting the historical data and the unseen target curve. The machine learning based method provides a robust, reproducible and user-independent solution for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
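
    The iterative-fitting baseline that the abstract contrasts with can be sketched as least-squares fitting of a kinetic model to a noisy time-activity curve. For illustration, a simplified monoexponential uptake model is fitted by an exhaustive grid search over its two rate parameters (echoing the grid-search idea mentioned above); the curve, noise level and parameter grids are synthetic assumptions, not PET data or the paper's method.

```python
import math
import random

def model(t, k1, k2):
    # Response of a one-tissue-compartment-style model to a constant
    # input: activity = K1/k2 * (1 - exp(-k2 * t)). Illustrative only.
    return (k1 / k2) * (1.0 - math.exp(-k2 * t))

# Synthetic noisy time-activity curve (TAC).
random.seed(1)
times = [float(t) for t in range(1, 41)]        # frame mid-times, hypothetical
k1_true, k2_true = 0.6, 0.15
tac = [model(t, k1_true, k2_true) + random.gauss(0.0, 0.05) for t in times]

# Least-squares fit by grid search over (K1, k2).
best = None
for i in range(13):                             # K1 grid: 0.30 .. 0.90
    for j in range(26):                         # k2 grid: 0.05 .. 0.30
        k1, k2 = 0.3 + 0.05 * i, 0.05 + 0.01 * j
        sse = sum((model(t, k1, k2) - y) ** 2 for t, y in zip(times, tac))
        if best is None or sse < best[0]:
            best = (sse, k1, k2)
sse_best, k1_hat, k2_hat = best
```

The overfitting and reproducibility problems described above arise when such fits are repeated on noisy pixel-wise curves; the paper's ML approach instead anchors the model against a historical reference database.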

  18. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON and KML, and for using OGC web services such as WFS. The actors also allow calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use data assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  19. In vivo reproducibility of robotic probe placement for an integrated US-CT image-guided radiation therapy system

    Science.gov (United States)

    Lediju Bell, Muyinatu A.; Sen, H. Tutkun; Iordachita, Iulian; Kazanzides, Peter; Wong, John

    2014-03-01

    Radiation therapy is used to treat cancer by delivering high-dose radiation to a pre-defined target volume. Ultrasound (US) has the potential to provide real-time, image-guidance of radiation therapy to identify when a target moves outside of the treatment volume (e.g. due to breathing), but the associated probe-induced tissue deformation causes local anatomical deviations from the treatment plan. If the US probe is placed to achieve similar tissue deformations in the CT images required for treatment planning, its presence causes streak artifacts that will interfere with treatment planning calculations. To overcome these challenges, we propose robot-assisted placement of a real ultrasound probe, followed by probe removal and replacement with a geometrically-identical, CT-compatible model probe. This work is the first to investigate in vivo deformation reproducibility with the proposed approach. A dog's prostate, liver, and pancreas were each implanted with three 2.38-mm spherical metallic markers, and the US probe was placed to visualize the implanted markers in each organ. The real and model probes were automatically removed and returned to the same position (i.e. position control), and CT images were acquired with each probe placement. The model probe was also removed and returned with the same normal force measured with the real US probe (i.e. force control). Marker positions in CT images were analyzed to determine reproducibility, and a corollary reproducibility study was performed on ex vivo tissue. In vivo results indicate that tissue deformations with the real probe were repeatable under position control for the prostate, liver, and pancreas, with median 3D reproducibility of 0.3 mm, 0.3 mm, and 1.6 mm, respectively, compared to 0.6 mm for the ex vivo tissue. For the prostate, the mean 3D tissue displacement errors between the real and model probes were 0.2 mm under position control and 0.6 mm under force control, which are both within acceptable

  20. Examination of reproducibility in microbiological degradation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy...... source. Toluene was degraded under aerobic conditions at a constant temperature of 28 °C. The experiments were modelled by a Monod model - extended to meet the air/liquid system, and the parameter values were estimated using a statistical nonlinear estimation procedure. Model reduction analysis...... resulted in a simpler model without the biomass decay term. In order to test for model reduction and reproducibility of parameter estimates, a likelihood ratio test was employed. The limited reproducibility for these experiments implied that all 9 batch experiments could not be described by the same set...
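
The batch dynamics described in this record follow standard Monod kinetics. As a rough illustration (not the paper's fitted model, which is extended with air/liquid mass transfer and was estimated statistically), a plain batch Monod system can be integrated with a few lines of explicit Euler stepping; every parameter value below is an assumption for illustration only:

```python
# Batch Monod degradation sketch: biomass X grows on substrate S (toluene).
# All parameter values are illustrative assumptions, not the paper's estimates.
mu_max = 0.5   # 1/h, maximum specific growth rate (assumed)
Ks = 2.0       # mg/L, half-saturation constant (assumed)
Y = 0.6        # g biomass per g substrate, yield coefficient (assumed)

def simulate(X0=1.0, S0=50.0, t_end=48.0, dt=0.01):
    """Explicit Euler integration of dX/dt = mu*X, dS/dt = -mu*X/Y."""
    X, S, t = X0, S0, 0.0
    while t < t_end:
        mu = mu_max * S / (Ks + S)   # Monod rate law
        dX = mu * X * dt
        X += dX
        S = max(S - dX / Y, 0.0)     # substrate cannot go negative
        t += dt
    return X, S

X_end, S_end = simulate()
# Mass balance: X + Y*S is conserved, so X_end approaches X0 + Y*S0 = 31
# as the substrate is exhausted.
```

Fitting such a model to replicate batches, as in the study, then amounts to estimating mu_max, Ks and Y per batch and testing whether one parameter set describes all runs.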

  1. Three-dimensional accuracy and interfractional reproducibility of patient fixation and positioning using a stereotactic head mask system

    International Nuclear Information System (INIS)

    Karger, Christian P.; Jaekel, Oliver; Debus, Juergen; Kuhn, Sabine; Hartmann, Guenther H.

    2001-01-01

    Purpose: Conformal radiotherapy in the head and neck region requires precise and reproducible patient setup. The definition of safety margins around the clinical target volume has to take into account uncertainties of fixation and positioning. Data are presented to quantify the involved uncertainties for the system used. Methods and Materials: Interfractional reproducibility of fixation and positioning of a target point in the brain was evaluated by biplanar films. 118 film pairs obtained at 52 fractions in 4 patients were analyzed. The setup was verified at the actual treatment table position by diagnostic X-ray units aligned to the isocenter and by a stereotactic X-ray localization technique. The stereotactic coordinates of the treated isocenter, of fiducials on the mask, and of implanted internal markers within the patient were measured to determine systematic and random errors. The data are corrected for uncertainty of the localization method. Results: Displacements in target point positioning were 0.35±0.41 mm, 1.22±0.25 mm, and -0.74±0.32 mm in the x, y, and z direction, respectively. The reproducibility of the fixation of the patient's head within the mask was 0.48 mm (x), 0.67 mm (y), and 0.72 mm (z). Rotational uncertainties around axes parallel to the x, y, and z axes were 0.72 deg., 0.43 deg., and 0.70 deg., respectively. A simulation, based on the acquired data, yields a typical radial overall uncertainty for positioning and fixation of 1.80±0.60 mm. Conclusions: The applied setup technique proved to be highly reproducible. The data suggest that, for the applied technique, a safety margin between clinical and planning target volume of 1-2 mm along one axis is sufficient for a target at the base of skull
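
The radial overall uncertainty quoted in this record can be reproduced in outline with a Monte Carlo draw that combines the reported per-axis systematic displacements with the positioning and fixation spreads. This is a simplified stand-in for the authors' simulation, assuming independent Gaussian errors per axis:

```python
import math
import random

random.seed(2)

# Per-axis numbers taken from the abstract: systematic target-point
# displacements plus independent random spreads for positioning and
# head fixation (all in mm).
sys_off = {"x": 0.35, "y": 1.22, "z": -0.74}
pos_sd = {"x": 0.41, "y": 0.25, "z": 0.32}
fix_sd = {"x": 0.48, "y": 0.67, "z": 0.72}

def radial_error():
    """One simulated 3D setup error combining positioning and fixation."""
    total = 0.0
    for ax in ("x", "y", "z"):
        # independent Gaussian errors add in quadrature
        sd = math.hypot(pos_sd[ax], fix_sd[ax])
        e = random.gauss(sys_off[ax], sd)
        total += e * e
    return math.sqrt(total)

samples = [radial_error() for _ in range(20000)]
mean_r = sum(samples) / len(samples)   # lands near the reported 1.80 mm
```

Even this crude combination reproduces the order of the reported 1.80±0.60 mm radial uncertainty, which is what motivates the 1-2 mm margin conclusion.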

  2. Monitoring microbiological changes in drinking water systems using a fast and reproducible flow cytometric method

    KAUST Repository

    Prest, Emmanuelle I E C; Hammes, Frederik A.; Kötzsch, Stefan; van Loosdrecht, Mark C.M.; Vrouwenvelder, Johannes S.

    2013-01-01

    Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% on all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (or so-called fluorescence fingerprint) was accomplished firstly through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large and brightly fluorescent high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result), and allows accurate detection of even small changes in aquatic environments (detection above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types, unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool for application in the drinking water field, e.g. for rapid screening of the microbial water quality and stability during water treatment and distribution in networks and premise plumbing. © 2013 Elsevier Ltd.

  3. Monitoring microbiological changes in drinking water systems using a fast and reproducible flow cytometric method

    KAUST Repository

    Prest, Emmanuelle I E C

    2013-12-01

    Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% on all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (or so-called fluorescence fingerprint) was accomplished firstly through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large and brightly fluorescent high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result), and allows accurate detection of even small changes in aquatic environments (detection above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types, unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool for application in the drinking water field, e.g. for rapid screening of the microbial water quality and stability during water treatment and distribution in networks and premise plumbing. © 2013 Elsevier Ltd.
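
The fingerprint described in the two records above reduces, at its simplest, to a cell concentration plus the percentage of events above a fixed HNA fluorescence gate. A minimal sketch with synthetic intensities follows; the gate value, the distributions and the analysed volume are assumptions for illustration, not the published protocol:

```python
import random

random.seed(0)

# Synthetic per-cell green-fluorescence intensities (arbitrary units) for one
# sample: a low-nucleic-acid and a high-nucleic-acid subpopulation. The gate
# value and both distributions are assumed, not taken from the paper.
sample = ([random.gauss(100, 15) for _ in range(800)]     # LNA cluster
          + [random.gauss(300, 40) for _ in range(200)])  # HNA cluster
HNA_GATE = 200.0   # fixed fluorescence gate (assumed)

def fingerprint(intensities, volume_ul=10.0):
    """Return (cells per microliter, %HNA) for one FCM sample."""
    cells = len(intensities)
    concentration = cells / volume_ul
    hna = sum(1 for v in intensities if v > HNA_GATE)
    return concentration, 100.0 * hna / cells

conc, pct_hna = fingerprint(sample)   # ~100 cells/uL, roughly 20% HNA
```

Comparing (concentration, %HNA) pairs between samples is then enough to flag a >3% community change of the kind the method is sensitive to; the fixed gate is what makes the comparison instrument-independent.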

  4. Closed-channel culture system for efficient and reproducible differentiation of human pluripotent stem cells into islet cells

    International Nuclear Information System (INIS)

    Hirano, Kunio; Konagaya, Shuhei; Turner, Alexander; Noda, Yuichiro; Kitamura, Shigeru; Kotera, Hidetoshi; Iwata, Hiroo

    2017-01-01

    Human pluripotent stem cells (hPSCs) are thought to be a promising cell-source solution for regenerative medicine due to their indefinite proliferative potential and ability to differentiate to functional somatic cells. However, issues remain with regard to achieving reproducible differentiation of cells with the required functionality for realizing human transplantation therapies and with regard to reducing the potential for bacterial or fungal contamination. To meet these needs, we have developed a closed-channel culture device and corresponding control system. Uniformly-sized spheroidal hPSCs aggregates were formed inside wells within a closed-channel and maintained continuously throughout the culture process. Functional islet-like endocrine cell aggregates were reproducibly induced following a 30-day differentiation protocol. Our system shows an easily scalable, novel method for inducing PSC differentiation with both purity and functionality. - Highlights: • A simple, closed-channel-based, semi-automatic culture system is proposed. • Uniform cell aggregate formation and culture is realized in microwell structure. • Functional islet cells are successfully induced following 30-plus-day protocol. • System requires no daily medium replacement and reduces contamination risk.

  5. Reproducibility of “The Bethesda System for Reporting Thyroid Cytopathology:” A retrospective analysis of 107 patients

    Directory of Open Access Journals (Sweden)

    Pragati Awasthi

    2018-01-01

    Full Text Available Objectives: Fine-needle aspiration cytology (FNAC) has emerged as an indispensable tool to discriminate thyroid lesions into benign or malignant for appropriate management. The need for simplicity of communication and standardization of terminology for thyroid FNAC reporting led to the introduction of “The Bethesda System for Reporting Thyroid Cytopathology” (TBSRTC) at a conference held at the National Cancer Institute in 2007. This study aims at establishing the reproducibility of TBSRTC for diagnosing thyroid lesions. Materials and Methods: The present study comprised thyroid FNAC from 107 patients collected retrospectively over a period of 1.5 years (June 2013 to December 2014), which were reviewed by two trained cytopathologists and re-categorized according to TBSRTC. The interobserver variation and reproducibility of the reporting system were statistically assessed using Cohen's kappa. Results: The cytopathologists were in agreement in 98 out of 107 cases (91.5%). Maximum concordance was noted in the benign category (91 of 96 cases; 92.85%), followed by 2 cases each in the nondiagnostic/unsatisfactory (ND/US) and follicular neoplasm/suspicious for follicular neoplasm (FN/SFN) categories (2.04% each) and 1 case each in the atypia of undetermined significance/follicular lesion of undetermined significance (AUS/FLUS), suspicious for malignancy (SUS), and malignant categories (1.02% each). The highest diagnostic disagreement was noted between the ND/US and benign categories and between the benign and FN/SFN categories. Conclusion: The utilization of TBSRTC for reporting thyroid cytology should be promoted in our country because it provides a homogeneous, standardized, and unanimous terminology for cytological diagnosis of thyroid lesions. The present study could substantiate the diagnostic reproducibility of this system.

  6. The quantum CP-violating kaon system reproduced in the electronic laboratory

    Science.gov (United States)

    Caruso, M.; Fanchiotti, H.; García Canal, C. A.; Mayosky, M.; Veiga, A.

    2016-11-01

    The equivalence between the Schrödinger dynamics of a quantum system with a finite number of basis states and a classical dynamics is realized in terms of electric networks. The isomorphism that connects the two dynamical systems in a one-to-one way was applied to the case of neutral mesons, kaons in particular, and the class of electric networks uniquely related to the quantum system was analysed. Moreover, under CPT invariance, the relevant ɛ parameter that measures CP violation in the kaon system is reinterpreted in terms of network parameters. All these results were explicitly shown by means of both a numerical simulation of the implied networks and the construction of the corresponding circuits.
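
The quantum side of the mapping in this record is a finite-dimensional Schrödinger equation; for neutral kaons it is the two-state evolution i dψ/dt = H ψ with a non-Hermitian effective Hamiltonian H = M - iΓ/2, which is exactly the dynamics the electric networks emulate. A toy numerical integration shows the characteristic decay and state mixing; all numbers below are illustrative, not kaon-system values:

```python
# Two-state Schroedinger evolution i dpsi/dt = H psi with a non-Hermitian
# effective Hamiltonian: common mass m, common width gamma, off-diagonal
# coupling g. All parameter values are illustrative assumptions.
m, gamma, g = 1.0, 0.1, 0.05
H = [[m - 0.5j * gamma, g],
     [g, m - 0.5j * gamma]]

def step(psi, dt=0.001):
    """One explicit-Euler step of the Schroedinger equation."""
    dpsi = [-1j * (H[r][0] * psi[0] + H[r][1] * psi[1]) for r in (0, 1)]
    return [psi[k] + dt * dpsi[k] for k in (0, 1)]

psi = [1.0 + 0j, 0.0 + 0j]   # start purely in state 1
for _ in range(5000):        # evolve to t = 5
    psi = step(psi)

norm = abs(psi[0]) ** 2 + abs(psi[1]) ** 2   # decays like exp(-gamma * t)
```

In the network realization, the two complex amplitudes map onto coupled oscillator circuits, with the decay width Γ played by resistive losses and the coupling g by the inter-circuit elements.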

  7. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    Science.gov (United States)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment", which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a PDF, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier, allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future land-use or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document, maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future, representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  8. Validation of EURO-CORDEX regional climate models in reproducing the variability of precipitation extremes in Romania

    Science.gov (United States)

    Dumitrescu, Alexandru; Busuioc, Aristita

    2016-04-01

    EURO-CORDEX is the European branch of the international CORDEX initiative that aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of the individual models in reproducing the variability of precipitation extremes in Romania. Here an ensemble of three EURO-CORDEX regional climate models (RCMs) (scenario RCP4.5) is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. Compared to previous studies, in which RCM validation for the Romanian climate has mainly been carried out on the mean state and at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to obtain a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and observed grid point values has been made by calculating three extreme precipitation indices, recommended by the Expert Team on Climate Change Detection Indices (ETCCDI), for the 1976-2005 period: R10MM, annual count of days when precipitation ≥ 10 mm; RX5DAY, annual maximum 5-day precipitation; and R95P, the fraction of annual total precipitation due to daily precipitation > the 95th percentile. The RCMs' capability to reproduce the mean state for these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions).
This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian

  9. Failure of Standard Optical Models to Reproduce Neutron Total Cross Section Differences in the W Isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, J D; Bauer, R W; Dietrich, F S; Grimes, S M; Finlay, R W; Abfalterer, W P; Bateman, F B; Haight, R C; Morgan, G L; Bauge, E; Delaroche, J P; Romain, P

    2001-11-01

    Recently, cross section differences among the isotopes ¹⁸²W, ¹⁸⁴W and ¹⁸⁶W have been measured as part of a study of total cross sections in the 5-560 MeV energy range. These measurements show oscillations of up to 150 mb between 5 and 100 MeV. Spherical and deformed phenomenological optical potentials with typical radial and isospin dependences show very small oscillations, in disagreement with the data. In a simple Ramsauer model, this discrepancy can be traced to a cancellation between radial and isospin effects. Understanding this problem requires a more detailed model that incorporates a realistic description of the neutron and proton density distributions. This has been done with results of Hartree-Fock-Bogolyubov calculations using the Gogny force, together with a microscopic folding model employing a modification of the JLM potential as an effective interaction. This treatment yields a satisfactory interpretation of the observed total cross section differences.

  10. A simple branching model that reproduces language family and language population distributions

    Science.gov (United States)

    Schwämmle, Veit; de Oliveira, Paulo Murilo Castro

    2009-07-01

    Human history leaves fingerprints in human languages. Little is known about language evolution, and its study is of great importance. Here we construct a simple stochastic model and compare its results to statistical data on real languages. The model is based on the recent finding that language changes occur independently of the population size. We find agreement with the data by additionally assuming that languages may be distinguished by having at least one among a finite, small number of different features. This finite set is also used to define the distance between two languages, in line with the linguistics tradition since Swadesh.
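
A minimal stochastic branching model in the spirit of this abstract can be sketched in a few lines: each language is a small vector of binary features, features mutate at a rate independent of population size (the paper's key assumption), languages occasionally branch, and Hamming distance over the feature set plays the role of the Swadesh-style linguistic distance. All rates and sizes below are invented for illustration, not the paper's calibrated values:

```python
import random

random.seed(1)

# Toy branching model: each language is a vector of F binary features; per
# step a language mutates one feature with probability p_change (independent
# of population size) and spawns a daughter language with probability p_split.
# All rates and sizes here are assumed for illustration.
F, p_split, p_change, steps = 8, 0.05, 0.2, 60

languages = [[0] * F]
for _ in range(steps):
    for lang in languages:
        if random.random() < p_change:
            lang[random.randrange(F)] ^= 1           # flip one feature
    languages += [list(l) for l in languages
                  if random.random() < p_split]      # daughter languages

def distance(a, b):
    """Hamming distance over features, a Swadesh-list-style metric."""
    return sum(x != y for x, y in zip(a, b))
```

Running many such histories and histogramming family sizes is what allows the comparison against the observed language-family and language-population distributions.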

  11. Assessment of the urban water system with an open, reproducible process applied to Chicago

    Science.gov (United States)

    Urban water systems convey complex environmental and man-made flows. The relationships among water flows and networked storages remain difficult to comprehensively evaluate. Such evaluation is important, however, as interventions are designed (e.g., conservation measures, green...

  12. The ability of a GCM-forced hydrological model to reproduce global discharge variability

    NARCIS (Netherlands)

    Sperna Weiland, F.C.; Beek, L.P.H. van; Kwadijk, J.C.J.; Bierkens, M.F.P.

    2010-01-01

    Data from General Circulation Models (GCMs) are often used to investigate hydrological impacts of climate change. However GCM data are known to have large biases, especially for precipitation. In this study the usefulness of GCM data for hydrological studies, with focus on discharge variability

  13. Establishing a Reproducible Hypertrophic Scar following Thermal Injury: A Porcine Model

    Directory of Open Access Journals (Sweden)

    Scott J. Rapp, MD

    2015-02-01

    Conclusions: Deep partial-thickness thermal injury to the back of domestic swine produces an immature hypertrophic scar by 10 weeks following burn, with thickness appearing to coincide with the location along the dorsal axis. With minimal pig-to-pig variation, we describe our technique to provide a testable immature scar model.

  14. Evaluating the systemic right ventricle by CMR: the importance of consistent and reproducible delineation of the cavity

    Directory of Open Access Journals (Sweden)

    van Dijk Arie PJ

    2008-08-01

    Full Text Available Abstract Background: The method used to delineate the boundary of the right ventricle (RV), relative to the trabeculations and papillary muscles, in cardiovascular magnetic resonance (CMR) ventricular volume analysis may matter more when these structures are hypertrophied than in individuals with normal cardiovascular anatomy. This study aimed to compare two methods of cavity delineation in patients with systemic RV. Methods: Twenty-nine patients (mean age 34.7 ± 12.4 years) with a systemic RV (12 with congenitally corrected transposition of the great arteries (ccTGA) and 17 with atrially switched transposition (TGA)) underwent CMR. We compared measurements of systemic RV volumes and function using two analysis protocols. The RV trabeculations and papillary muscles were either included in the calculated blood volume, with the boundary drawn immediately within the apparently compacted myocardial layer, or they were manually outlined and excluded. RV stroke volume (SV) calculated using each method was compared with the corresponding left ventricular (LV) SV. Additionally, we compared the differences in analysis time, and in intra- and inter-observer variability, between the two methods. A paired samples t-test was used to test for differences in volumes, function and analysis time between the two methods. Differences in intra- and inter-observer reproducibility were tested using an extension of the Bland-Altman method. Results: The inclusion of trabeculations and papillary muscles in the ventricular volume resulted in higher values for systemic RV end diastolic volume (mean difference 28.7 ± 10.6 ml). Conclusion: The choice of method for systemic RV cavity delineation significantly affected volume measurements, given the CMR acquisition and analysis systems used. We recommend delineation outside the trabeculations for routine clinical measurements of systemic RV volumes, as this approach took less time and gave more reproducible measurements.

  15. A Versatile and Reproducible Multi-Frequency Electrical Impedance Tomography System

    Directory of Open Access Journals (Sweden)

    James Avery

    2017-01-01

    Full Text Available A highly versatile Electrical Impedance Tomography (EIT) system, nicknamed the ScouseTom, has been developed. The system allows control over current amplitude, frequency, number of electrodes, injection protocol and data processing. Current is injected using a Keithley 6221 current source, and voltages are recorded with a 24-bit EEG system with a minimum bandwidth of 3.2 kHz. Custom PCBs interface with a PC to control the measurement process, electrode addressing and triggering of external stimuli. The performance of the system was characterised using resistor phantoms to represent human scalp recordings, with an SNR of 77.5 dB, stable across a four-hour recording and from 20 Hz to 20 kHz. In studies of both haemorrhage using scalp electrodes, and evoked activity using epicortical electrode mats in rats, it was possible to reconstruct images matching established literature at known areas of onset. Data collected using scalp electrodes in humans matched known tissue impedance spectra and were stable over frequency. The experimental procedure is software controlled and is readily adaptable to new paradigms. Where possible, commercial or open-source components were used, to minimise the complexity of reproduction. The hardware designs and software for the system have been released under an open source licence, encouraging contributions and allowing for rapid replication.

  16. Energy and nutrient deposition and excretion in the reproducing sow: model development and evaluation

    DEFF Research Database (Denmark)

    Hansen, A V; Strathe, A B; Theil, Peter Kappel

    2014-01-01

    requirements for maintenance, and fetal and maternal growth were described. In the lactating module, a factorial approach was used to estimate requirements for maintenance, milk production, and maternal growth. The priority for nutrient partitioning was assumed to be in the order of maintenance, milk...... production, and maternal growth with body tissue losses constrained within biological limits. Global sensitivity analysis showed that nonlinearity in the parameters was small. The model outputs considered were the total protein and fat deposition, average urinary and fecal N excretion, average methane...... emission, manure carbon excretion, and manure production. The model was evaluated using independent data sets from the literature using root mean square prediction error (RMSPE) and concordance correlation coefficients. The gestation module predicted body fat gain better than body protein gain, which...

  17. Evaluation of Nitinol staples for the Lapidus arthrodesis in a reproducible biomechanical model

    Directory of Open Access Journals (Sweden)

    Nicholas Alexander Russell

    2015-12-01

    Full Text Available While the Lapidus procedure is a widely accepted technique for treatment of hallux valgus, the optimal fixation method to maintain joint stability remains controversial. The purpose of this study was to evaluate the biomechanical properties of new Shape Memory Alloy staples arranged in different configurations in a repeatable 1st Tarsometatarsal arthrodesis model. Ten sawbones models of the whole foot (n=5 per group) were reconstructed using a single dorsal staple or two staples in a delta configuration. Each construct was mechanically tested in dorsal four-point bending, medial four-point bending, dorsal three-point bending and plantar cantilever bending with the staples activated at 37°C. The peak load, stiffness and plantar gapping were determined for each test. Pressure sensors were used to measure the contact force and area of the joint footprint in each group. There was a significant (p < 0.05) increase in peak load in the two staple constructs compared to the single staple constructs for all testing modalities. Stiffness also increased significantly in all tests except dorsal four-point bending. Pressure sensor readings showed a significantly higher contact force at time zero and contact area following loading in the two staple constructs (p < 0.05). Both groups completely recovered any plantar gapping following unloading and restored their initial contact footprint. The biomechanical integrity and repeatability of the models was demonstrated with no construct failures due to hardware or model breakdown. Shape memory alloy staples provide fixation with the ability to dynamically apply and maintain compression across a simulated arthrodesis following a range of loading conditions.

  18. Evaluation of Nitinol Staples for the Lapidus Arthrodesis in a Reproducible Biomechanical Model.

    Science.gov (United States)

    Russell, Nicholas A; Regazzola, Gianmarco; Aiyer, Amiethab; Nomura, Tomohiro; Pelletier, Matthew H; Myerson, Mark; Walsh, William R

    2015-01-01

    While the Lapidus procedure is a widely accepted technique for treatment of hallux valgus, the optimal fixation method to maintain joint stability remains controversial. The purpose of this study is to evaluate the biomechanical properties of new shape memory alloy (SMA) staples arranged in different configurations in a repeatable first tarsometatarsal arthrodesis model. Ten sawbones models of the whole foot (n = 5 per group) were reconstructed using a single dorsal staple or two staples in a delta configuration. Each construct was mechanically tested non-destructively in dorsal four-point bending, medial four-point bending, dorsal three-point bending, and plantar cantilever bending with the staples activated at 37°C. The peak load (newton), stiffness (newton per millimeter), and plantar gapping (millimeter) were determined for each test. Pressure sensors were used to measure the contact force and area of the joint footprint in each group. There was a statistically significant increase in peak load in the two staple constructs compared to the single staple constructs for all testing modalities, with P values ranging from 0.016 to 0.000. Stiffness also increased significantly in all tests except dorsal four-point bending. Pressure sensor readings showed a significantly higher contact force at time zero (P = 0.037) and contact area following loading in the two staple constructs (P = 0.045). Both groups completely recovered any plantar gapping following unloading and restored their initial contact footprint. The biomechanical integrity and repeatability of the models was demonstrated with no construct failures due to hardware or model breakdown. SMA staples provide fixation with the ability to dynamically apply and maintain compression across a simulated arthrodesis following a range of loading conditions.

  19. Can Lagrangian models reproduce the migration time of European eel obtained from otolith analysis?

    Science.gov (United States)

    Rodríguez-Díaz, L.; Gómez-Gesteira, M.

    2017-12-01

    European eel can be found in the Bay of Biscay after a long migration across the Atlantic. The duration of migration, which takes place at the larval stage, is of primary importance for understanding eel ecology and, hence, its survival. This duration is still a controversial matter, since estimates can range from 7 months to > 4 years depending on the method used. The minimum migration duration estimated from our Lagrangian model is similar to the duration obtained from the microstructure of eel otoliths, which is typically on the order of 7-9 months. The Lagrangian model proved sensitive to different conditions such as spatial and temporal resolution, release depth, release area and initial distribution. In general, migration was faster when decreasing the depth and increasing the resolution of the model. On average, the fastest migration was obtained when only advective horizontal movement was considered, although even faster migration was obtained in some cases when locally oriented random migration was taken into account.
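
A back-of-envelope version of the purely advective case discussed in this record: stepping a particle along a mean current already lands the crossing time in the 7-9 month range. This is a deliberately simple sketch in which the distance, mean speed and time step are assumed values, not the study's model configuration:

```python
# Purely advective eastward drift at a mean current speed; the distance,
# speed and time step below are assumptions for illustration only.
def migration_days(distance_km=5500.0, u_mean_ms=0.25, dt_hours=6.0):
    """Step a particle along the current until it has covered the distance."""
    x, t, dt = 0.0, 0.0, dt_hours * 3600.0
    while x < distance_km * 1000.0:
        x += u_mean_ms * dt   # advection only, no random larval movement
        t += dt
    return t / 86400.0        # travel time in days

days = migration_days()       # about 255 days, i.e. roughly 8.5 months
```

The full Lagrangian model replaces the constant speed with gridded ocean-current fields and optional locally oriented random movement, which is why its minimum durations, rather than a single estimate, are compared with the otolith-derived 7-9 months.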

  20. Reproducibility of the heat/capsaicin skin sensitization model in healthy volunteers

    Directory of Open Access Journals (Sweden)

    Cavallone LF

    2013-11-01

    Full Text Available Laura F Cavallone,1 Karen Frey,1 Michael C Montana,1 Jeremy Joyal,1 Karen J Regina,1 Karin L Petersen,2 Robert W Gereau IV1 1Department of Anesthesiology, Washington University in St Louis, School of Medicine, St Louis, MO, USA; 2California Pacific Medical Center Research Institute, San Francisco, CA, USA Introduction: Heat/capsaicin skin sensitization is a well-characterized human experimental model used to induce hyperalgesia and allodynia. Using this model, gabapentin, among other drugs, was shown to significantly reduce cutaneous hyperalgesia compared to placebo. Since the larger thermal probes used in the original studies to produce heat sensitization are now commercially unavailable, we decided to assess whether previous findings could be replicated with a currently available smaller probe (heated area 9 cm² versus 12.5–15.7 cm²). Study design and methods: After Institutional Review Board approval, 15 adult healthy volunteers participated in two study sessions, scheduled 1 week apart (Part A). In both sessions, subjects were exposed to the heat/capsaicin cutaneous sensitization model. Areas of hypersensitivity to brush stroke and von Frey (VF) filament stimulation were measured at baseline and after rekindling of skin sensitization. Another group of 15 volunteers was exposed to an identical schedule and set of sensitization procedures, but, in each session, received either gabapentin or placebo (Part B). Results: Unlike previous reports, a similar reduction of areas of hyperalgesia was observed in all groups/sessions. Fading of areas of hyperalgesia over time was observed in Part A. In Part B, there was no difference in area reduction after gabapentin compared to placebo. Conclusion: When using smaller thermal probes than originally proposed, modifications of other parameters of sensitization and/or the rekindling process may be needed to allow the heat/capsaicin sensitization protocol to be used as initially intended. Standardization and validation of

  1. A computational model incorporating neural stem cell dynamics reproduces glioma incidence across the lifespan in the human population.

    Directory of Open Access Journals (Sweden)

    Roman Bauer

    Full Text Available Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way, and do not directly relate to empirical findings. To decrease this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically determined estimates of neural stem cell number, cell division rate, mutation rate and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert differential susceptibility throughout the population. Overall, our model supports the hypothesis that glioma is caused by randomly occurring oncogenic mutations within the neural stem cell population. Based on this model, we assess the influence of the (experimentally indicated) decrease in the number of neural stem cells and increase in cell division rate during aging. Our model provides multiple testable predictions, and suggests that different temporal sequences of oncogenic mutations can lead to tumorigenesis. Finally, we conclude that four or five oncogenic mutations are sufficient for the formation of glioma.
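
    The hypothesis above — glioma arises once some neural stem cell accumulates four or five random oncogenic hits — lends itself to a back-of-the-envelope calculation. The sketch below is not the authors' model; all parameter values (stem-cell pool size, division rate, per-division hit probability) are illustrative placeholders, and hits per cell lineage are treated as a simple Poisson process.

```python
import math

def poisson_tail(k, lam, terms=60):
    """P(X >= k) for X ~ Poisson(lam), summed term by term."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(k, k + terms))

def p_glioma_by_age(age_years, n_cells=1e7, divisions_per_year=10,
                    p_hit_per_division=1e-7, hits_needed=5):
    """Probability that at least one stem cell in a pool of n_cells has
    accumulated `hits_needed` oncogenic mutations by a given age."""
    lam = divisions_per_year * age_years * p_hit_per_division
    p_cell = poisson_tail(hits_needed, lam)
    # complement of "no cell in the pool is transformed", computed
    # with expm1/log1p to stay accurate for tiny probabilities
    return -math.expm1(n_cells * math.log1p(-p_cell))
```

    With five required hits the risk rises steeply with age, qualitatively matching a late-life incidence peak; real parameter values would come from the empirical estimates the paper cites.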

  2. Realizing the Living Paper using the ProvONE Model for Reproducible Research

    Science.gov (United States)

    Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.

    2015-12-01

    Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on the prior work of colleagues. The open source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software, giving proper credit, with less repetition, and with confidence in the relationship to the original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first-class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise. The

  3. A discrete particle model reproducing collective dynamics of a bee swarm.

    Science.gov (United States)

    Bernardi, Sara; Colombi, Annachiara; Scianna, Marco

    2018-02-01

    In this article, we present a microscopic discrete mathematical model describing collective dynamics of a bee swarm. More specifically, each bee is set to move according to individual strategies and social interactions, the former involving the desire to reach a target destination, the latter accounting for repulsive/attractive stimuli and for alignment processes. The insects tend in fact to remain sufficiently close to the rest of the population, while avoiding collisions, and they are able to track and synchronize their movement to the flight of a given set of neighbors within their visual field. The resulting collective behavior of the bee cloud therefore emerges from non-local short/long-range interactions. Differently from similar approaches present in the literature, we here test different alignment mechanisms (i.e., based either on an Euclidean or on a topological neighborhood metric), which have an impact also on the other social components characterizing insect behavior. A series of numerical realizations then shows the phenomenology of the swarm (in terms of pattern configuration, collective productive movement, and flight synchronization) in different regions of the space of free model parameters (i.e., strength of attractive/repulsive forces, extension of the interaction regions). In this respect, constraints in the possible variations of such coefficients are here given both by reasonable empirical observations and by analytical results on some stability characteristics of the defined pairwise interaction kernels, which have to assure a realistic crystalline configuration of the swarm. An analysis of the effect of unconscious random fluctuations of bee dynamics is also provided. Copyright © 2018 Elsevier Ltd. All rights reserved.
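
    The interaction structure described above (target-seeking, distance-dependent repulsion/attraction, and alignment over a topological neighbourhood) can be sketched as a single synchronous update step. This is a minimal 2-D illustration, not the paper's calibrated model: the weights, radii, and neighbour count are invented for demonstration.

```python
import math

def _sub(a, b): return (a[0] - b[0], a[1] - b[1])
def _add(a, b): return (a[0] + b[0], a[1] + b[1])
def _scale(a, s): return (a[0] * s, a[1] * s)
def _norm(a): return math.hypot(a[0], a[1])

def step(positions, velocities, target, dt=0.1, r_rep=1.0, r_att=5.0,
         w_target=0.5, w_rep=1.0, w_att=0.2, w_ali=0.3, k_topo=2):
    """One synchronous update of all bees in 2-D."""
    new_pos, new_vel = [], []
    for i, p in enumerate(positions):
        force = (0.0, 0.0)
        # individual strategy: steer toward the target destination
        to_target = _sub(target, p)
        d = _norm(to_target)
        if d > 0:
            force = _add(force, _scale(to_target, w_target / d))
        # social interactions, neighbours sorted nearest-first
        others = sorted((j for j in range(len(positions)) if j != i),
                        key=lambda j: _norm(_sub(positions[j], p)))
        for j in others:
            diff = _sub(positions[j], p)
            dist = _norm(diff)
            if dist == 0.0:
                continue
            if dist < r_rep:            # repulsion: collision avoidance
                force = _add(force, _scale(diff, -w_rep / dist))
            elif dist < r_att:          # attraction: stay with the swarm
                force = _add(force, _scale(diff, w_att / dist))
        # alignment with the k topologically nearest neighbours
        for j in others[:k_topo]:
            force = _add(force, _scale(_sub(velocities[j], velocities[i]), w_ali))
        v = _add(velocities[i], _scale(force, dt))
        new_vel.append(v)
        new_pos.append(_add(p, _scale(v, dt)))
    return new_pos, new_vel
```

    Using `others[:k_topo]` for alignment implements the topological metric the authors contrast with a Euclidean cutoff: each bee tracks a fixed number of nearest flight companions regardless of their absolute distance.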

  4. Reproducing multi-model ensemble average with Ensemble-averaged Reconstructed Forcings (ERF) in regional climate modeling

    Science.gov (United States)

    Erfanian, A.; Fomenko, L.; Wang, G.

    2016-12-01

    The multi-model ensemble (MME) average is considered the most reliable approach for simulating both present-day and future climates, and it has been a primary reference for drawing conclusions in major coordinated studies, e.g., the IPCC Assessment Reports and CORDEX. The biases of individual models cancel each other out in the MME average, enabling the ensemble mean to outperform individual members in simulating the mean climate. This enhancement, however, comes with tremendous computational cost, which is especially inhibiting for regional climate modeling, as model uncertainties can originate from both the RCMs and the driving GCMs. Here we propose the Ensemble-based Reconstructed Forcings (ERF) approach to regional climate modeling, which achieves a similar level of bias reduction at a fraction of the cost of the conventional MME approach. The new method constructs a single set of initial and boundary conditions (IBCs) by averaging the IBCs of multiple GCMs, and drives the RCM with this ensemble average of IBCs in a single run. Using a regional climate model (RegCM4.3.4-CLM4.5), we tested the method over West Africa for multiple combinations of (up to six) GCMs. Our results indicate that the performance of the ERF method is comparable to that of the MME average in simulating the mean climate. The bias reduction seen in ERF simulations is achieved by using more realistic IBCs in solving the system of equations underlying the RCM physics and dynamics. This endows the new method with a theoretical advantage in addition to reducing computational cost: the ERF output is an unaltered solution of the RCM, as opposed to a climate state that might not be physically plausible due to the averaging of multiple solutions under the conventional MME approach. The ERF approach should be considered for use in major international efforts such as CORDEX. Key words: multi-model ensemble, ensemble analysis, ERF, regional climate modeling
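
    The core of the ERF method is an averaging step over the driving fields rather than over RCM outputs. A minimal sketch of that step, with a toy array standing in for one IBC variable (real IBCs comprise many 4-D fields — winds, temperature, humidity, SSTs — on a shared grid and time axis; the function name and shapes are ours):

```python
import numpy as np

def reconstruct_forcings(gcm_fields):
    """Average matching initial/boundary-condition fields from several
    GCMs into a single ERF forcing set; the RCM is then run once with
    the result instead of once per driving GCM."""
    stacked = np.stack(gcm_fields)   # shape: (n_gcm, time, lat, lon)
    return stacked.mean(axis=0)      # ensemble-averaged IBCs

# e.g. surface-temperature boundary fields from three toy "GCMs"
t_surf = [np.full((2, 3, 4), 280.0 + i) for i in range(3)]
erf_t_surf = reconstruct_forcings(t_surf)
```

    The cost saving follows directly: n GCM-driven RCM runs collapse into one run driven by the averaged IBCs.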

  5. Acute multi-sgRNA knockdown of KEOPS complex genes reproduces the microcephaly phenotype of the stable knockout zebrafish model.

    Directory of Open Access Journals (Sweden)

    Tilman Jobst-Schwan

    Full Text Available Until recently, morpholino oligonucleotides were widely employed in zebrafish as an acute and efficient loss-of-function assay. However, off-target effects and reproducibility issues when compared to stable knockout lines have compromised their further use. Here we employed an acute CRISPR/Cas approach using multiple single guide RNAs simultaneously targeting different positions in two exemplar genes (osgep or tprkb) to increase the likelihood of generating mutations on both alleles in the injected F0 generation and to achieve an effect similar to morpholinos but with the reproducibility of stable lines. This multi-sgRNA approach resulted in median likelihoods for at least one mutation on each allele of >99% and sgRNA-specific insertion/deletion profiles as revealed by deep sequencing. Immunoblotting showed a significant reduction in Osgep and Tprkb protein levels. For both genes, the acute multi-sgRNA knockout recapitulated the microcephaly phenotype and reduction in survival that we observed previously in stable knockout lines, though the phenotypes were milder in the acute multi-sgRNA knockout. Finally, we quantify the degree of mutagenesis by deep sequencing, and provide a mathematical model to quantitate the chance of a biallelic loss-of-function mutation. Our findings can be generalized to acute and stable CRISPR/Cas targeting for any zebrafish gene of interest.
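
    The abstract does not reproduce the authors' mathematical model, but the intuition behind multi-sgRNA targeting can be sketched under the simplest assumption: each of m guides independently mutates each allele with the same probability p. The function and the 80% efficiency figure below are illustrative, not values from the paper.

```python
def p_biallelic(p_per_guide, n_guides):
    """P(both alleles carry >= 1 mutation), assuming each guide acts
    independently on each allele with the same per-guide efficiency."""
    p_allele = 1.0 - (1.0 - p_per_guide) ** n_guides
    return p_allele ** 2

# four guides at a hypothetical 80% per-guide efficiency per allele
p = p_biallelic(0.8, 4)
```

    Even modest per-guide efficiencies compound quickly, which is why stacking guides can push the biallelic-hit likelihood above 99% in injected F0 embryos.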

  6. The perceptual influence of the cabin acoustics on the reproduced sound of a car audio system

    DEFF Research Database (Denmark)

    Kaplanis, Neofytos; Bech, Søren; Sakari, Tervo

    2015-01-01

    -end car audio system was performed for different physical settings of the car's cabin. A novel spatial auralization methodology was then used, and participants were asked to describe verbally the perceived acoustical characteristics of the stimuli. The elicited attributes were then analyzed following...... a previous review [Kaplanis et al., in 55th Int. Conf. Aud. Eng. Soc. (2014)] and possible links to the acoustical properties of the car cabin are discussed. [This study is a part of Marie Curie Network on Dereverberation and Reverberation of Audio, Music, and Speech. EU-FP7 under agreement ITN-GA-2012-316969.]...

  7. The diverse broad-band light-curves of Swift GRBs reproduced with the cannonball model

    CERN Document Server

    Dado, Shlomo; De Rújula, A

    2009-01-01

    Two radiation mechanisms, inverse Compton scattering (ICS) and synchrotron radiation (SR), suffice within the cannonball (CB) model of long gamma ray bursts (LGRBs) and X-ray flashes (XRFs) to provide a very simple and accurate description of their observed prompt emission and afterglows. Simple as they are, the two mechanisms and the burst environment generate the rich structure of the light curves at all frequencies and times. This is demonstrated for 33 selected Swift LGRBs and XRFs, which are well sampled from early time until late time and well represent the entire diversity of the broad band light curves of Swift LGRBs and XRFs. Their prompt gamma-ray and X-ray emission is dominated by ICS of glory light. During their fast decline phase, ICS is taken over by SR which dominates their broad band afterglow. The pulse shape and spectral evolution of the gamma-ray peaks and the early-time X-ray flares, and even the delayed optical `humps' in XRFs, are correctly predicted. The canonical and non-canonical X-ra...

  8. Reproducibility of patient positioning during routine radiotherapy, as assessed by an integrated megavoltage imaging system

    International Nuclear Information System (INIS)

    Gildersleve, J.; Dearnaley, D.P.; Evans, P.M.; Swindell, W.

    1995-01-01

    A portal imaging system has been used, in conjunction with a movie measurement technique, to measure set-up errors for 15 patients treated with radiotherapy of the pelvis and for 12 patients treated with radiotherapy of the brain. The pelvic patients were treated without fixation devices and the brain patients were treated with individually moulded plastic shells. As would be expected, the brain treatments were found to be more accurate than the pelvic treatments. Results are presented in terms of five error types: random error from treatment to treatment, error between mean treatment position and simulation position, random simulation error, systematic simulator-to-treatment error, and total treatment error. For the brain patients the simulation-to-treatment error predominates and random treatment errors were small (95% ≤ 3 mm, 77% ≤ 1.5 mm). Vector components of the systematic simulation-to-treatment errors were 1-2 mm, with a maximal random simulation error of ± 5 mm (2 S.D.). There is much interest in the number of verification films necessary to evaluate treatment accuracy. These results indicate that one check film performed at the first treatment is likely to be sufficient for set-up evaluation. For the pelvis the random treatment error is larger (95% ≤ 4.5 mm, 87% ≤ 3 mm). The systematic simulation-to-treatment error is up to 3 mm and the maximal random simulation error is ± 6 mm (2 S.D.). Thus corrections made solely on the basis of a first-day check film may not be sufficient for adequate set-up evaluation.
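
    The error taxonomy above separates systematic from random components. A common way to compute them from a series of per-fraction portal-image displacements is sketched below (a generic decomposition, not necessarily the authors' exact analysis; the data are invented):

```python
import statistics

def setup_error_components(displacements_mm):
    """Systematic error = mean displacement over all fractions;
    random error = SD about that mean (2 SD ~ the 95% range quoted
    in set-up studies)."""
    systematic = statistics.mean(displacements_mm)
    random_sd = statistics.stdev(displacements_mm)
    return systematic, random_sd

# daily lateral set-up displacements (mm) for one patient (toy data)
sys_err, rand_err = setup_error_components([1.0, 2.0, 1.5, 0.5, 2.5, 1.5])
```

    Separating the two matters clinically: a systematic offset can be corrected once from an early check film, whereas random day-to-day error cannot.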

  9. Efficient and reproducible myogenic differentiation from human iPS cells: prospects for modeling Miyoshi Myopathy in vitro.

    Directory of Open Access Journals (Sweden)

    Akihito Tanaka

    Full Text Available The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation 1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70-90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs.

  10. Efficient and Reproducible Myogenic Differentiation from Human iPS Cells: Prospects for Modeling Miyoshi Myopathy In Vitro

    Science.gov (United States)

    Tanaka, Akihito; Woltjen, Knut; Miyake, Katsuya; Hotta, Akitsu; Ikeya, Makoto; Yamamoto, Takuya; Nishino, Tokiko; Shoji, Emi; Sehara-Fujisawa, Atsuko; Manabe, Yasuko; Fujii, Nobuharu; Hanaoka, Kazunori; Era, Takumi; Yamashita, Satoshi; Isobe, Ken-ichi; Kimura, En; Sakurai, Hidetoshi

    2013-01-01

    The establishment of human induced pluripotent stem cells (hiPSCs) has enabled the production of in vitro, patient-specific cell models of human disease. In vitro recreation of disease pathology from patient-derived hiPSCs depends on efficient differentiation protocols producing relevant adult cell types. However, myogenic differentiation of hiPSCs has faced obstacles, namely, low efficiency and/or poor reproducibility. Here, we report the rapid, efficient, and reproducible differentiation of hiPSCs into mature myocytes. We demonstrated that inducible expression of myogenic differentiation1 (MYOD1) in immature hiPSCs for at least 5 days drives cells along the myogenic lineage, with efficiencies reaching 70–90%. Myogenic differentiation driven by MYOD1 occurred even in immature, almost completely undifferentiated hiPSCs, without mesodermal transition. Myocytes induced in this manner reach maturity within 2 weeks of differentiation as assessed by marker gene expression and functional properties, including in vitro and in vivo cell fusion and twitching in response to electrical stimulation. Miyoshi Myopathy (MM) is a congenital distal myopathy caused by defective muscle membrane repair due to mutations in DYSFERLIN. Using our induced differentiation technique, we successfully recreated the pathological condition of MM in vitro, demonstrating defective membrane repair in hiPSC-derived myotubes from an MM patient and phenotypic rescue by expression of full-length DYSFERLIN (DYSF). These findings not only facilitate the pathological investigation of MM, but could potentially be applied in modeling of other human muscular diseases by using patient-derived hiPSCs. PMID:23626698

  11. A Reliable and Reproducible Model for Assessing the Effect of Different Concentrations of α-Solanine on Rat Bone Marrow Mesenchymal Stem Cells

    Directory of Open Access Journals (Sweden)

    Adriana Ordóñez-Vásquez

    2017-01-01

    Full Text Available Alpha-solanine (α-solanine) is a glycoalkaloid present in potato (Solanum tuberosum). It has been of particular interest because of its toxicity and potential teratogenic effects, which include abnormalities of the central nervous system, such as exencephaly, encephalocele, and anophthalmia. Various types of cell culture have been used as experimental models to determine the effect of α-solanine on cell physiology. The morphological changes in mesenchymal stem cells upon exposure to α-solanine have not been established. This study aimed to describe a reliable and reproducible model for assessing the structural changes induced by exposure of mouse bone marrow mesenchymal stem cells (MSCs) to different concentrations of α-solanine for 24 h. The results demonstrate that nonlethal concentrations of α-solanine (2–6 μM) changed the morphology of the cells, including an increase in the number of nucleoli, suggesting elevated protein synthesis, and the formation of spicules. In addition, treatment with α-solanine reduced the number of adherent cells and the formation of colonies in culture. Immunophenotypic characterization and staining of MSCs are proposed as a reproducible method that allows description of cells exposed to the glycoalkaloid α-solanine.

  12. TU-AB-BRC-05: Creation of a Monte Carlo TrueBeam Model by Reproducing Varian Phase Space Data

    International Nuclear Information System (INIS)

    O’Grady, K; Davis, S; Seuntjens, J

    2016-01-01

    Purpose: To create a Varian TrueBeam 6 MV FFF Monte Carlo model using BEAMnrc/EGSnrc that accurately reproduces the Varian representative dataset, followed by tuning the model’s source parameters to accurately reproduce in-house measurements. Methods: A BEAMnrc TrueBeam model for 6 MV FFF has been created by modifying a validated 6 MV Varian CL21EX model. Geometric dimensions and materials were adjusted in a trial-and-error approach to match the fluence and spectra of TrueBeam phase spaces output by the Varian VirtuaLinac. Once the model’s phase space matched Varian’s counterpart using the default source parameters, it was validated to match 10 × 10 cm² Varian representative data obtained with the IBA CC13. The source parameters were then tuned to match in-house 5 × 5 cm² PTW microDiamond measurements. All dose-to-water simulations included detector models to include the effects of volume averaging and the non-water equivalence of the chamber materials, allowing for more accurate source parameter selection. Results: The Varian phase space spectra and fluence were matched with excellent agreement. The in-house model’s PDD agreement with CC13 TrueBeam representative data was within 0.9% local percent difference beyond the first 3 mm. Profile agreement at 10 cm depth was within 0.9% local percent difference and 1.3 mm distance-to-agreement in the central axis and penumbra regions, respectively. Once the source parameters were tuned, PDD agreement with microDiamond measurements was within 0.9% local percent difference beyond 2 mm. The microDiamond profile agreement at 10 cm depth was within 0.6% local percent difference and 0.4 mm distance-to-agreement in the central axis and penumbra regions, respectively. Conclusion: An accurate in-house Monte Carlo model of the Varian TrueBeam was achieved independently of the Varian phase space solution and was tuned to in-house measurements. KO acknowledges partial support by the CREATE Medical Physics Research

  13. Respiratory-Gated Helical Computed Tomography of Lung: Reproducibility of Small Volumes in an Ex Vivo Model

    International Nuclear Information System (INIS)

    Biederer, Juergen; Dinkel, Julien; Bolte, Hendrik; Welzel, Thomas; Hoffmann, Beata M.Sc.; Thierfelder, Carsten; Mende, Ulrich; Debus, Juergen; Heller, Martin; Kauczor, Hans-Ulrich

    2007-01-01

    Purpose: Motion-adapted radiotherapy with gated irradiation or tracking of tumor positions requires dedicated imaging techniques such as four-dimensional (4D) helical computed tomography (CT) for patient selection and treatment planning. The objective was to evaluate the reproducibility of spatial information for small objects on respiratory-gated 4D helical CT using computer-assisted volumetry of lung nodules in a ventilated ex vivo system. Methods and Materials: Five porcine lungs were inflated inside a chest phantom and prepared with 55 artificial nodules (mean diameter, 8.4 mm ± 1.8). The lungs were respirated by a flexible diaphragm and scanned with 40-row detector CT (collimation, 24 × 1.2 mm; pitch, 0.1; rotation time, 1 s; slice thickness, 1.5 mm; increment, 0.8 mm). The 4D-CT scans acquired during respiration (eight per minute) and reconstructed at 0-100% inspiration, and equivalent static scans, were scored for motion-related artifacts (0 = absent to 3 = relevant). The reproducibility of nodule volumetry (three readers) was assessed using the variation coefficient (VC). Results: The mean volumes from the static and dynamic inspiratory scans were equal (364.9 and 360.8 mm³, respectively, p = 0.24). The static and dynamic end-expiratory volumes were slightly greater (371.9 and 369.7 mm³, respectively, p = 0.019). The VC for volumetry (static) was 3.1%, with no significant difference between 20 apical and 20 caudal nodules (2.6% and 3.5%, p = 0.25). In dynamic scans, the VC was greater (3.9%, p = 0.004; apical and caudal, 2.6% and 4.9%; p = 0.004), with a significant difference between static and dynamic in the 20 caudal nodules (3.5% and 4.9%, p = 0.015). This was consistent with greater motion-related artifacts and image noise at the diaphragm (p < 0.05). The VC for interobserver variability was 0.6%. Conclusion: Residual motion-related artifacts had only minimal influence on volumetry of small solid lesions. This indicates a high reproducibility of
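
    The variation coefficient used above to quantify volumetry reproducibility is simply the relative standard deviation of repeated measurements. A sketch with made-up repeat volumes for one nodule:

```python
import statistics

def variation_coefficient(volumes_mm3):
    """VC in percent: SD of repeated volume measurements over their mean."""
    return 100.0 * statistics.stdev(volumes_mm3) / statistics.mean(volumes_mm3)

# three readings of the same nodule, mm^3 (illustrative values)
vc = variation_coefficient([360.0, 365.0, 370.0])
```

    A VC of a few percent, as reported for both static and gated scans, means the measurement noise is small relative to typical growth-related volume changes.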

  14. Implementation of an experimental pilot reproducing the fouling of the exhaust gas recirculation system in diesel engines

    Directory of Open Access Journals (Sweden)

    Crepeau Gérald

    2012-04-01

    Full Text Available The European emission standards EURO 5 and EURO 6 define more stringent acceptable limits for exhaust emissions of new vehicles. The Exhaust Gas Recirculation (EGR) system is a partial but essential solution for lowering the emission of nitrogen oxides and soot particulates. Yet, due to a more intensive use than in the past, fouling of the EGR system has increased, and ensuring the reliability of the EGR system has become a main challenge. In partnership with PSA Peugeot Citroën, we designed an experimental setup that mimics an operating EGR system. Its distinctive features are (1) its ability to reproduce precisely the operating conditions and (2) its ability to measure the temperature field on the heat exchanger surface with an infrared camera, detecting in real time the evolution of the fouling deposit based on its thermal resistance. Numerical codes are used in conjunction with this experimental setup to determine the evolution of the fouling thickness from its thermal resistance.

  15. Interpretative intra- and interobserver reproducibility of stress/rest 99mTc-sestamibi myocardial perfusion SPECT using a semi-quantitative 20-segment model

    International Nuclear Information System (INIS)

    Fazeli, M.; Firoozi, F.

    2002-01-01

    It is well established that myocardial perfusion SPECT with 201Tl or 99mTc-sestamibi plays an important role in diagnosis and risk assessment in patients with known or suspected coronary artery disease. Both quantitative and qualitative methods are available for interpretation of images. The use of a semi-quantitative scoring system, in which each of 20 segments is scored according to a five-point scheme, provides an approach to interpretation that is more systematic and reproducible than simple qualitative evaluation. Only a limited number of studies have dealt with the interpretive observer reproducibility of 99mTc-sestamibi myocardial perfusion imaging. The aim of this study was to assess the intra- and interobserver variability of semi-quantitative SPECT performed with this technique. Among 789 patients who underwent myocardial perfusion SPECT during the last year, 80 patients ultimately proceeded to coronary angiography as the gold standard. In this group of patients a semi-quantitative visual interpretation was carried out using short-axis and vertical long-axis myocardial tomograms and a 20-segment model. These segments were assigned to six evenly spaced regions in the apical, mid-ventricular, and basal short-axis views and two apical segments on the mid-ventricular long-axis slice. Uptake in each segment was graded on a 5-point scale (0 = normal, 1 = equivocal, 2 = moderate, 3 = severe, 4 = absence of uptake). The sestamibi images were interpreted separately, twice, by two observers without knowledge of each other's findings or the results of angiography. A SPECT study was judged abnormal if there were two or more segments with a stress score of 2 or more. We concluded that semi-quantitative visual analysis is a simple and reproducible method of interpretation.
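
    The abnormality rule in this scoring scheme (at least two of the 20 segments scoring 2 or higher at stress) is easy to state compactly; the segment grades and threshold follow the abstract, while the helper name and example vectors are ours:

```python
def is_abnormal(stress_scores):
    """Abnormal study: two or more of the 20 segments score >= 2
    (moderate or worse uptake reduction) on the stress images."""
    if len(stress_scores) != 20:
        raise ValueError("expected scores for 20 segments")
    return sum(1 for s in stress_scores if s >= 2) >= 2

normal = [0] * 20           # all segments normal
single = [3] + [0] * 19     # one severe defect only: still "normal" by rule
double = [2, 2] + [0] * 18  # two moderate defects: abnormal
```

    Requiring two qualifying segments, rather than one, is what makes the read robust against a single artifactual defect.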

  16. Can CFMIP2 models reproduce the leading modes of cloud vertical structure in the CALIPSO-GOCCP observations?

    Science.gov (United States)

    Wang, Fang; Yang, Song

    2018-02-01

    Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP), i.e. the tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds in middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclonic regions and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models of phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by CFMIP2 models. The effects of CVS modes on relative cloud radiative forcing (RSCRF/RLCRF) (RSCRF being calculated at the surface while RLCRF at the top of the atmosphere) are studied using a principal component regression method. Results show that CFMIP2 models tend to overestimate (underestimate or simulate with the opposite sign) the RSCRF/RLCRF radiative effects (REs) of ECCM (THCM and SACM) per unit global mean OPC compared to observations. These RE biases may be attributed to two factors: one is the underestimation (overestimation) of low/middle clouds (high clouds) — equivalently, stronger (weaker) REs per unit of low/middle (high) cloud — in simulated global mean cloud profiles; the other is eigenvector biases in the CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to the improvement of CVS, especially cloud parameterization associated with particular physical processes (e.g., downwelling regimes with the Hadley circulation, extratropical storm tracks and others), which

  17. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle's Capacity to Generate Force.

    Science.gov (United States)

    Call, Jarrod A; Lowe, Dawn A

    2016-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allows researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury.
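
    Since injury in this model is defined as a loss of force-generating capacity, the primary outcome reduces to a percent torque deficit between pre- and post-protocol maximal contractions. A trivial sketch (function name and toy torque values are ours):

```python
def force_deficit_percent(pre_torque, post_torque):
    """Injury magnitude: percent loss of maximal isometric torque
    measured before vs. after the eccentric contraction protocol."""
    return 100.0 * (pre_torque - post_torque) / pre_torque

deficit = force_deficit_percent(20.0, 9.0)  # e.g. mN*m before/after
```

    Expressing injury this way is what makes the model quantitative and comparable across animals, time points, and in vitro vs. in vivo preparations.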

  18. Is outdoor use of the six-minute walk test with a global positioning system in stroke patients' own neighbourhoods reproducible and valid?

    OpenAIRE

    Wevers, L.E.; Kwakkel, G.; Port, van de, I.G.

    2011-01-01

    Objective: To examine the reproducibility, responsiveness and concurrent validity of the six-minute walk test (6MWT) when tested outdoors in patients' own neighbourhoods using a global positioning system (GPS) or a measuring wheel. Methods: A total of 27 chronic stroke patients, discharged to their own homes, were tested twice, within 5 consecutive days. The 6MWT was conducted using a GPS and a measuring wheel simultaneously to determine walking distance. Reproducibility was determined as te...

  19. MRI assessment of knee osteoarthritis: Knee Osteoarthritis Scoring System (KOSS) - inter-observer and intra-observer reproducibility of a compartment-based scoring system

    International Nuclear Information System (INIS)

    Kornaat, Peter R.; Ceulemans, Ruth Y.T.; Kroon, Herman M.; Bloem, Johan L.; Riyazi, Naghmeh; Kloppenburg, Margreet; Carter, Wayne O.; Woodworth, Thasia G.

    2005-01-01

    To develop a scoring system for quantifying osteoarthritic changes of the knee as identified by magnetic resonance (MR) imaging, and to determine its inter- and intra-observer reproducibility, in order to monitor medical therapy in research studies. Two independent observers evaluated 25 consecutive MR examinations of the knee in patients with previously defined clinical symptoms and radiological signs of osteoarthritis. We acquired on a 1.5 T system: coronal and sagittal proton density- and T2-weighted dual spin echo (SE) images, sagittal three-dimensional T1-weighted gradient echo (GE) images with fat suppression, and axial dual turbo SE images with fat suppression. Images were scored for the presence of cartilaginous lesions, osteophytes, subchondral cysts, bone marrow edema, and for meniscal abnormalities. Presence and size of effusion, synovitis and Baker's cyst were recorded. All parameters were ranked on a previously defined, semiquantitative scale, reflecting increasing severity of findings. Kappa, weighted kappa and intraclass correlation coefficient (ICC) were used to determine inter- and intra-observer variability. Inter-observer reproducibility was good (ICC value 0.77). Inter- and intra-observer reproducibility for individual parameters was good to very good (inter-observer ICC value 0.63-0.91; intra-observer ICC value 0.76-0.96). The presented comprehensive MR scoring system for osteoarthritic changes of the knee has a good to very good inter-observer and intra-observer reproducibility. Thus the score form with its definitions can be used for standardized assessment of osteoarthritic changes to monitor medical therapy in research studies. (orig.)
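
    The agreement statistic used above can be illustrated with a minimal sketch. The ICC(2,1) formulation (two-way random effects, single measures) and the example ratings below are assumptions for illustration, not the study's actual data or its exact ICC variant:

```python
import numpy as np

def icc_2_1(scores):
    """Two-way random, single-measures ICC(2,1) for an
    (n subjects x k raters) score matrix."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)
    col_means = scores.mean(axis=0)
    # Mean squares from the two-way ANOVA decomposition.
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # raters
    sse = np.sum((scores - row_means[:, None]
                  - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical example: two observers scoring 5 knees.
ratings = [[2, 2], [4, 5], [1, 1], [3, 3], [5, 5]]
print(round(icc_2_1(ratings), 2))  # -> 0.96
```

An ICC of 0.77, as reported for overall inter-observer reproducibility, would conventionally be read as "good" agreement on this scale.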

  20. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    Directory of Open Access Journals (Sweden)

    Sonal eGoswami

    2012-06-01

    Full Text Available Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze. Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats, but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma, PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  1. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome.

    Science.gov (United States)

    Goswami, Sonal; Samuel, Sherin; Sierra, Olga R; Cascardi, Michele; Paré, Denis

    2012-01-01

    Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual clues (i.e., that require the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze (EPM). Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  2. Reproducing the organic matter model of anthropogenic dark earth of Amazonia and testing the ecotoxicity of functionalized charcoal compounds

    Directory of Open Access Journals (Sweden)

    Carolina Rodrigues Linhares

    2012-05-01

    Full Text Available The objective of this work was to obtain organic compounds similar to the ones found in the organic matter of anthropogenic dark earth of Amazonia (ADE) using a chemical functionalization procedure on activated charcoal, as well as to determine their ecotoxicity. Based on the study of the organic matter from ADE, an organic model was proposed and an attempt to reproduce it is described. Activated charcoal was oxidized with sodium hypochlorite at different concentrations. Nuclear magnetic resonance was performed to verify whether the spectra of the obtained products were similar to those of humic acids from ADE. The similarity between spectra indicated that the obtained products were polycondensed aromatic structures with carboxyl groups: a soil amendment that can contribute to soil fertility and to its sustainable use. An ecotoxicological test with Daphnia similis was performed on the more soluble fraction (fulvic acids) of the produced soil amendment. Aryl chloride was formed during the synthesis of the organic compounds from activated charcoal functionalization and was partially removed through a purification process. However, it is probable that some aryl chloride remained in the final product, since the ecotoxicological test indicated that the chemically functionalized soil amendment is moderately toxic.

  3. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  4. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models

  5. A computational model for histone mark propagation reproduces the distribution of heterochromatin in different human cell types.

    Science.gov (United States)

    Schwämmle, Veit; Jensen, Ole Nørregaard

    2013-01-01

    Chromatin is a highly compact and dynamic nuclear structure that consists of DNA and associated proteins. The main organizational unit is the nucleosome, which consists of a histone octamer with DNA wrapped around it. Histone proteins are implicated in the regulation of eukaryote genes and they carry numerous reversible post-translational modifications that control DNA-protein interactions and the recruitment of chromatin binding proteins. Heterochromatin, the transcriptionally inactive part of the genome, is densely packed and contains histone H3 that is methylated at Lys 9 (H3K9me). The propagation of H3K9me in nucleosomes along the DNA in chromatin is antagonized by methylation of H3 lysine 4 (H3K4me) and acetylation of several lysines, which are related to euchromatin and active genes. We show that the related histone modifications form antagonistic domains on a coarse scale. These histone marks are assumed to be initiated within distinct nucleation sites in the DNA and to propagate bi-directionally. We propose a simple computer model that simulates the distribution of heterochromatin in human chromosomes. The simulations are in agreement with previously reported experimental observations from two different human cell lines. We reproduced different types of barriers between heterochromatin and euchromatin, providing a unified model for their function. The effects of changes in the nucleation site distribution and in propagation rates were studied. The former occur mainly with the aim of (de-)activating single genes or gene groups, while the latter have the power to control the transcriptional programs of entire chromosomes. Generally, the regulatory program of gene transcription is controlled by the distribution of nucleation sites along the DNA string.
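
    A toy version of such a nucleation-and-propagation model can be sketched as follows. This is not the authors' published model: the mark encoding, spread probability, fiber length, and nucleation positions are invented for illustration.

```python
import random

def simulate(n_sites, nucleation, steps, spread=0.2, seed=1):
    """Toy 1-D chromatin fiber: -1 = H3K9me (heterochromatic mark),
    +1 = H3K4me/acetylation (euchromatic mark), 0 = unmodified.
    Marks are seeded at fixed nucleation sites and propagate
    bi-directionally; opposing marks overwrite each other, so
    antagonistic domains form around the seeds."""
    rng = random.Random(seed)
    fiber = [0] * n_sites
    for pos, mark in nucleation.items():
        fiber[pos] = mark
    for _ in range(steps):
        i = rng.randrange(n_sites)
        if fiber[i] == 0:
            continue
        for j in (i - 1, i + 1):           # bidirectional propagation
            if 0 <= j < n_sites and j not in nucleation:
                if rng.random() < spread:  # propagation rate
                    fiber[j] = fiber[i]    # antagonism: overwrite
    return fiber

# Two invented nucleation sites on a 50-nucleosome fiber:
# a heterochromatic seed at 10 and a euchromatic seed at 40.
fiber = simulate(50, {10: -1, 40: +1}, steps=5000)
```

In this sketch, moving a nucleation site relocates a domain (the single-gene effect described above), while changing `spread` rescales domain sizes everywhere at once (the chromosome-wide effect).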

  6. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, to evolve (or reject) hypotheses and models of how environmental systems function, and to move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrology community in order to advance, and make more robust, the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  7. Reproducibility of image quality for moving objects using respiratory-gated computed tomography. A study using a phantom model

    International Nuclear Information System (INIS)

    Fukumitsu, Nobuyoshi; Ishida, Masaya; Terunuma, Toshiyuki

    2012-01-01

    Investigating the reproducibility of computed tomography (CT) image quality in respiratory-gated radiation treatment planning is essential in the radiotherapy of movable tumors. Seven series of regular and six series of irregular respiratory motions were performed using a thorax dynamic phantom. For the regular respiratory motions, the respiratory cycle was varied from 2.5 to 4 s and the amplitude from 4 to 10 mm. For the irregular respiratory motions, a cycle of 2.5 to 4 s or an amplitude of 4 to 10 mm was added to the base data (i.e., 3.5-s cycle, 6-mm amplitude) every three cycles. Images of the object were acquired six times using respiratory-gated data acquisition. The volume of the object was calculated, and the reproducibility of the volume was assessed from its variability. The registered images of the object were summed, and the reproducibility of the shape was assessed from the degree of overlap of the objects. The variability of the volumes and shapes differed significantly as the respiratory cycle changed in the regular respiratory motions. In irregular respiratory motion, shape reproducibility was inferior still, and the percentage of overlap among the six images was 35.26% in the group mixing 2.5- and 3.5-s cycles. Amplitude changes did not produce significant differences in the variability of the volumes and shapes. Respiratory cycle changes reduced the reproducibility of image quality in respiratory-gated CT. (author)
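
    The shape-reproducibility metric above (degree of overlap among registered objects) can be sketched as an intersection-over-union of binary object masks. The 1-D masks below are hypothetical stand-ins for the registered 3-D phantom images:

```python
import numpy as np

def overlap_fraction(masks):
    """Voxels present in every acquisition's object mask divided by
    voxels present in at least one (intersection over union); a
    stand-in for the 'percentage of overlap among images' metric."""
    stack = np.stack([np.asarray(m, dtype=bool) for m in masks])
    intersection = stack.all(axis=0).sum()
    union = stack.any(axis=0).sum()
    return intersection / union

# Invented 1-D object masks shifted by respiratory drift:
# a covers voxels 5-14, b covers 7-16 -> 8 shared of 12 total.
a = np.zeros(20, dtype=bool); a[5:15] = True
b = np.zeros(20, dtype=bool); b[7:17] = True
print(overlap_fraction([a, b]))
```

With six acquisitions instead of two, the `all`/`any` reduction generalizes directly, which is how a figure like the 35.26% overlap could be computed.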

  8. Attempting to train a digital human model to reproduce human subject reach capabilities in an ejection seat aircraft

    NARCIS (Netherlands)

    Zehner, G.F.; Hudson, J.A.; Oudenhuijzen, A.

    2006-01-01

    From 1997 through 2002, the Air Force Research Lab and TNO Defence, Security and Safety (Business Unit Human Factors) were involved in a series of tests to quantify the accuracy of five Human Modeling Systems (HMSs) in determining accommodation limits of ejection seat aircraft. The results of these

  9. Modelling impacts of performance on the probability of reproducing, and thereby on productive lifespan, allow prediction of lifetime efficiency in dairy cows.

    Science.gov (United States)

    Phuong, H N; Blavy, P; Martin, O; Schmidely, P; Friggens, N C

    2016-01-01

    Reproductive success is a key component of lifetime efficiency, the ratio of energy in milk (MJ) to energy intake (MJ) over the lifespan of a cow. At the animal level, breeding and feeding management can substantially impact the milk yield, body condition and energy balance of cows, which are known as major contributors to reproductive failure in dairy cattle. This study extended an existing lifetime performance model to incorporate the impacts that performance changes due to changing breeding and feeding strategies have on the probability of reproducing, and thereby on the productive lifespan, thus allowing prediction of a cow's lifetime efficiency. The model is dynamic and stochastic, with an individual cow being the unit modelled and one day being the unit of time. To evaluate the model, data from a French study including Holstein and Normande cows fed high-concentrate diets, and data from a Scottish study including Holstein cows selected for high and average genetic merit for fat plus protein that were fed high- v. low-concentrate diets, were used. Generally, the model consistently simulated the productive and reproductive performance of various genotypes of cows across feeding systems. In the French data, the model adequately simulated the reproductive performance of Holsteins but significantly under-predicted that of Normande cows. In the Scottish data, conception to first service was comparably simulated, whereas interval traits were slightly under-predicted. Selection for greater milk production impaired reproductive performance and lifespan but not lifetime efficiency. The definition of lifetime efficiency used in this model did not include associated costs or herd-level effects. Further work should include such economic indicators to allow more accurate simulation of lifetime profitability in different production scenarios.

  10. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Science.gov (United States)

    Sanchez-Gomez, Emilia; Somot, S.; Déqué, M.

    2009-10-01

    One of the main concerns in regional climate modeling is the extent to which limited-area regional climate models (RCMs) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the period 1961-2000. Two sets of experiments have been completed, with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the behavior of the weather regimes reasonably well in terms of composite pattern, mean frequency of occurrence and persistence. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models, which is stronger in summer than in winter. This spread has two causes: (1) we are dealing with different models, and (2) each RCM produces its own internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At the daily time scale, the model spread also has a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large-scale circulation of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not significantly affect the model performance for large-scale circulation.

  11. Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000

    Energy Technology Data Exchange (ETDEWEB)

    Somot, S.; Deque, M. [Meteo-France CNRM/GMGEC CNRS/GAME, Toulouse (France); Sanchez-Gomez, Emilia

    2009-10-15

    One of the main concerns in regional climate modeling is the extent to which limited-area regional climate models (RCMs) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the period 1961-2000. Two sets of experiments have been completed, with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the behavior of the weather regimes reasonably well in terms of composite pattern, mean frequency of occurrence and persistence. The models also simulate well the long-term trends and the inter-annual variability of the frequency of occurrence. However, there is a non-negligible spread among the models, which is stronger in summer than in winter. This spread has two causes: (1) we are dealing with different models, and (2) each RCM produces its own internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At the daily time scale, the model spread also has a seasonal dependence, being stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large-scale circulation of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not significantly affect the model performance for large-scale circulation. (orig.)

  12. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination, or reproducibility of results is very poor. On the other hand, a defect can be detected on each of the subsequent examinations, giving higher reliability, and the results can still have poor reproducibility.

  13. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility; specifically, it asks whether bitwise reproducible computation is possible, whether computational research in DOE can improve its publication process, and whether reproducible results can be achieved apart from the peer review process.

  14. Is outdoor use of the six-minute walk test with a global positioning system in stroke patients' own neighbourhoods reproducible and valid?

    NARCIS (Netherlands)

    Wevers, L.E.; Kwakkel, G.; van de Port, I.G.

    2011-01-01

    Objective: To examine the reproducibility, responsiveness and concurrent validity of the six-minute walk test (6MWT) when tested outdoors in patients' own neighbourhoods using a global positioning system (GPS) or a measuring wheel. Methods: A total of 27 chronic stroke patients, discharged to their

  15. Development of a Three-Dimensional Hand Model Using Three-Dimensional Stereophotogrammetry: Assessment of Image Reproducibility.

    Directory of Open Access Journals (Sweden)

    Inge A Hoevenaren

    Full Text Available Using three-dimensional (3D) stereophotogrammetry, precise images and reconstructions of the human body can be produced. Over the last few years, this technique has mainly been developed in the field of maxillofacial reconstructive surgery, creating fusion images with computed tomography (CT) data for precise planning and prediction of treatment outcome. However, in hand surgery 3D stereophotogrammetry is not yet used in clinical settings. A total of 34 three-dimensional hand photographs were analyzed to investigate reproducibility. For every individual, 3D photographs were captured at two different time points (baseline, T0, and one week later, T1). Using two different registration methods, the reproducibility of the methods was analyzed. Furthermore, the differences between 3D photos of men and women were compared in a distance map as a first clinical pilot testing our registration method. The absolute mean registration error for the complete hand was 1.46 mm. This reduced to an error of 0.56 mm when isolating the region to the palm of the hand. When comparing hands of both sexes, it was seen that the male hand was larger (broader base and longer fingers) than the female hand. This study shows that 3D stereophotogrammetry can produce reproducible images of the hand without harmful side effects for the patient, proving to be a reliable method for soft tissue analysis. Its potential use in the everyday practice of hand surgery needs to be further explored.
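
    The absolute mean registration error reported above can be approximated as a mean closest-point distance between registered surface point clouds. A brute-force sketch under that assumption (the point sets and the metric choice are invented for illustration):

```python
import numpy as np

def mean_registration_error(p, q):
    """Mean closest-point distance from point cloud p to point cloud
    q, a simplified stand-in for the absolute mean registration error
    between two registered 3D hand surfaces (units: mm)."""
    # Pairwise distances between all points of p and q (brute force).
    d = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=2)
    return d.min(axis=1).mean()

# Invented 'surfaces': q is p translated by 1 mm along x, so every
# closest-point distance, and hence the mean error, is 1 mm.
p = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
q = p + np.array([1.0, 0.0, 0.0])
print(mean_registration_error(p, q))  # -> 1.0
```

The brute-force distance matrix is O(len(p) * len(q)); real surface scans would use a spatial index (e.g. a k-d tree) instead.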

  16. Reproducibility of Quantitative Brain Imaging Using a PET-Only and a Combined PET/MR System

    Directory of Open Access Journals (Sweden)

    Martin L. Lassen

    2017-07-01

    Full Text Available The purpose of this study was to test the feasibility of migrating a quantitative brain imaging protocol from a positron emission tomography (PET)-only system to an integrated PET/MR system. Potential differences in both absolute radiotracer concentration and the derived kinetic parameters as a function of PET system choice have been investigated. Five healthy volunteers underwent dynamic (R)-[11C]verapamil imaging on the same day using a GE-Advance (PET-only) and a Siemens Biograph mMR system (PET/MR). PET-emission data were reconstructed using a transmission-based attenuation correction (AC) map (PET-only), whereas a standard MR-DIXON as well as a low-dose CT AC map was applied to the PET/MR emission data. Kinetic modeling based on arterial blood sampling was performed using a 1-tissue-2-rate-constant compartment model, yielding the kinetic parameters (K1 and k2) and the distribution volume (VT). Differences in parametric values obtained on the PET-only and PET/MR systems were analyzed using a 2-way Analysis of Variance (ANOVA). Comparison of DIXON-based AC (PET/MR) with emission data derived from the PET-only system revealed average inter-system differences of −33 ± 14% (p < 0.05) for the K1 parameter and −19 ± 9% (p < 0.05) for k2. Using a CT-based AC for PET/MR resulted in slightly lower systematic differences of −16 ± 18% for K1 and −9 ± 10% for k2. The average differences in VT were −18 ± 10% (p < 0.05) for DIXON- and −8 ± 13% for CT-based AC. Significant systematic differences were observed for kinetic parameters derived from emission data obtained from PET/MR and PET-only imaging, due to the different standard AC methods employed. Therefore, a transfer of imaging protocols from PET-only to PET/MR systems is not straightforward without the application of proper correction methods. Clinical Trial Registration: www.clinicaltrialsregister.eu, identifier 2013-001724-19
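
    The 1-tissue-2-rate-constant model named above can be sketched numerically. The input function, rate constants, and the simple Euler solver below are invented simplifications, not the study's actual fitting software:

```python
import numpy as np

def one_tissue_model(t, cp, K1, k2):
    """1-tissue, 2-rate-constant compartment model:
    dCt/dt = K1*Cp(t) - k2*Ct(t), solved by forward Euler."""
    ct = np.zeros_like(t)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        ct[i] = ct[i - 1] + dt * (K1 * cp[i - 1] - k2 * ct[i - 1])
    return ct

# Invented arterial input function and rate constants.
t = np.linspace(0.0, 60.0, 6001)   # minutes
cp = np.exp(-0.1 * t)              # decaying plasma concentration
K1, k2 = 0.05, 0.10                # uptake and washout rate constants
ct = one_tissue_model(t, cp, K1, k2)

# Distribution volume VT = K1/k2; an AC-induced bias in K1 with k2
# unchanged (e.g. the -33% reported above) propagates directly to VT.
vt = K1 / k2
vt_biased = (0.67 * K1) / k2
```

Because VT is the ratio K1/k2, the partly correlated K1 and k2 biases reported above can cancel to a smaller VT bias, consistent with the −33%/−19% parameter errors yielding a −18% VT error.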

  17. Developing a Collection of Composable Data Translation Software Units to Improve Efficiency and Reproducibility in Ecohydrologic Modeling Workflows

    Science.gov (United States)

    Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.

    2017-12-01

    Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools for assessing the impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane but critical step in the application workflows. Translation steps can introduce errors, misrepresent data, slow execution, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the Parflow integrated hydrologic model to demonstrate the impact of translation-tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community-approved collection of data transformation components. The components should be self-contained, composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement.
Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of
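
    The composable, self-contained translation units advocated above can be sketched in a few lines. The record format and the WRF-style variable name "T2" are hypothetical; the point is the pattern of small testable units chained into one streaming pass:

```python
def kelvin_to_celsius(record):
    """Translation unit: convert the (hypothetical) 'T2' field K -> degC."""
    out = dict(record)
    out["T2"] = record["T2"] - 273.15
    return out

def subset_bbox(lat_min, lat_max, lon_min, lon_max):
    """Translation unit factory: drop records outside a bounding box."""
    def unit(record):
        inside = (lat_min <= record["lat"] <= lat_max
                  and lon_min <= record["lon"] <= lon_max)
        return record if inside else None
    return unit

def compose(*units):
    """Chain self-contained units into one streaming pass; a None
    returned by any unit drops the record (no intermediate files)."""
    def pipeline(record):
        for unit in units:
            if record is None:
                return None
            record = unit(record)
        return record
    return pipeline

# Subset to a (hypothetical) bounding box, then convert units.
translate = compose(subset_bbox(42.0, 45.0, -117.0, -114.0),
                    kelvin_to_celsius)
rec = {"lat": 43.6, "lon": -116.2, "T2": 293.15}
print(translate(rec))  # T2 now ~20.0 degC
```

Each unit can be unit- and regression-tested in isolation, and `compose` applies them in a single pass per record, which is the data-movement-minimizing design the abstract argues for.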

  18. Reproducibility study of [18F]FPP(RGD)2 uptake in murine models of human tumor xenografts

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Edwin; Liu, Shuangdong; Chin, Frederick; Cheng, Zhen [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Gowrishankar, Gayatri; Yaghoubi, Shahriar [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Wedgeworth, James Patrick [Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Berndorff, Dietmar; Gekeler, Volker [Bayer Schering Pharma AG, Global Drug Discovery, Berlin (Germany); Gambhir, Sanjiv S. [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Canary Center at Stanford for Cancer Early Detection, Nuclear Medicine, Departments of Radiology and Bioengineering, Molecular Imaging Program at Stanford, Stanford, CA (United States)

    2011-04-15

    An 18F-labeled PEGylated arginine-glycine-aspartic acid (RGD) dimer, [18F]FPP(RGD)2, has been used to image tumor αvβ3 integrin levels in preclinical and clinical studies. Serial positron emission tomography (PET) studies may be useful for monitoring antiangiogenic therapy response or for drug screening; however, the reproducibility of serial scans has not been determined for this PET probe. The purpose of this study was to determine the reproducibility of the integrin αvβ3-targeted PET probe [18F]FPP(RGD)2 using small animal PET. Human HCT116 colon cancer xenografts were implanted into nude mice (n = 12) in the breast and scapular region and grown to mean diameters of 5-15 mm for approximately 2.5 weeks. A 3-min acquisition was performed on a small animal PET scanner approximately 1 h after administration of [18F]FPP(RGD)2 (1.9-3.8 MBq, 50-100 µCi) via the tail vein. A second small animal PET scan was performed approximately 6 h later after reinjection of the probe to assess reproducibility. Images were analyzed by drawing an ellipsoidal region of interest (ROI) around the tumor xenograft activity. Percentage injected dose per gram (%ID/g) values were calculated from the mean or maximum activity in the ROIs. Coefficients of variation and differences in %ID/g values between studies from the same day were calculated to determine the reproducibility. The coefficient of variation (mean ± SD) for %IDmean/g and %IDmax/g values between [18F]FPP(RGD)2 small animal PET scans performed 6 h apart on the same day were 11.1 ± 7.6% and 10.4 ± 9.3%, respectively. The corresponding differences in %IDmean/g and %IDmax/g values between scans were -0.025 ± 0.067 and -0.039 ± 0.426. Immunofluorescence studies revealed a direct relationship between the extent of αvβ3 integrin expression in tumors and tumor vasculature

  19. Controlling the reproducibility of Coulomb blockade phenomena for gold nanoparticles on an organic monolayer/silicon system.

    Science.gov (United States)

    Caillard, L; Sattayaporn, S; Lamic-Humblot, A-F; Casale, S; Campbell, P; Chabal, Y J; Pluchery, O

    2015-02-13

    Two types of highly ordered organic layers were prepared on silicon modified with an amine termination for binding gold nanoparticles (AuNPs). These two grafted organic monolayers (GOMs), consisting of alkyl chains with seven or 11 carbon atoms, were grafted on oxide-free Si(111) surfaces as tunnel barriers between the silicon electrode and the AuNPs. Three kinds of colloidal AuNPs were prepared by reducing HAuCl4 with three different reactants: citrate (Turkevich synthesis, diameter ∼16 nm), ascorbic acid (diameter ∼9 nm), or NaBH4 (Natan synthesis, diameter ∼7 nm). Scanning tunneling spectroscopy (STS) was performed in a UHV STM at 40 K, and Coulomb blockade behavior was observed. The reproducibility of the Coulomb behavior was analyzed as a function of several chemical and physical parameters: size and crystallinity of the AuNPs, influence of surrounding surfactant molecules, and quality of the GOM/Si interface (degree of oxidation after the full processing). Samples were characterized by scanning tunneling microscopy (STM), STS, atomic force microscopy, Fourier transform infrared spectroscopy, X-ray photoelectron spectroscopy (XPS), and high-resolution transmission electron microscopy. We show that the reproducibility in observing Coulomb behavior can be as high as ∼80% with the Natan synthesis of AuNPs and GOMs with short alkyl chains.

  20. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, behavioral analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate controls needed to improve reproducibility in investigating and reporting of behavioral phenotypes.

  1. Synchronized mammalian cell culture: part II--population ensemble modeling and analysis for development of reproducible processes.

    Science.gov (United States)

    Jandt, Uwe; Barradas, Oscar Platas; Pörtner, Ralf; Zeng, An-Ping

    2015-01-01

    The consideration of inherent population inhomogeneities of mammalian cell cultures is becoming increasingly important for systems biology studies and for developing more stable and efficient processes. However, variations of cellular properties belonging to different sub-populations and their potential effects on cellular physiology and kinetics of culture productivity under bioproduction conditions have not yet been a major focus of research. Culture heterogeneity is strongly determined by the advance of the cell cycle. The assignment of cell-cycle specific cellular variations to large-scale process conditions can be optimally determined based on the combination of (partially) synchronized cultivation under otherwise physiological conditions and subsequent population-resolved model adaptation. The first step has been achieved using the physical selection method of countercurrent flow centrifugal elutriation, recently established in our group for different mammalian cell lines, as presented in Part I of this paper series. In this second part, we demonstrate the successful adaptation and application of a cell-cycle dependent population balance ensemble model to describe and understand synchronized bioreactor cultivations performed with two model mammalian cell lines, AGE1.HNAAT and CHO-K1. Numerical adaptation of the model to experimental data allows for detection of phase-specific parameters and for determination of significant variations between different phases and different cell lines. It shows that special care must be taken with regard to the sampling frequency in such oscillation cultures to minimize phase shift (jitter) artifacts. Based on predictions of the long-term oscillation behavior of a culture depending on its start conditions, optimal elutriation setup trade-offs between high cell yields and high synchronization efficiency are proposed. © 2014 American Institute of Chemical Engineers.

  2. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.
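    The idea the package automates, bundling provenance metadata with a computed result, can be illustrated with a plain-Python sketch. The field names and helper below are assumptions for illustration, not magni's actual schema or API.

```python
import hashlib
import json
import platform
import sys
from datetime import datetime, timezone

def annotate_result(result, parameters):
    """Bundle a computed result with provenance metadata so the
    experiment can later be audited or re-run. Field names are
    illustrative, not the magni.reproducibility schema."""
    return {
        "result": result,
        "parameters": parameters,
        "metadata": {
            "created": datetime.now(timezone.utc).isoformat(),
            "python": sys.version.split()[0],
            "platform": platform.platform(),
            # Hash of the parameters lets a reader verify which
            # configuration produced this result.
            "param_sha256": hashlib.sha256(
                json.dumps(parameters, sort_keys=True).encode()
            ).hexdigest(),
        },
    }

# Hypothetical result from a Mandelbrot-style experiment
record = annotate_result(
    result={"escape_iterations": 37},
    parameters={"max_iter": 100, "c": [-0.4, 0.6]},
)
```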

  3. Reproducibility of Carbon and Water Cycle by an Ecosystem Process Based Model Using a Weather Generator and Effect of Temporal Concentration of Precipitation on Model Outputs

    Science.gov (United States)

    Miyauchi, T.; Machimura, T.

    2014-12-01

    GCM output is generally used to produce input weather data for the simulation of carbon and water cycles by ecosystem process based models under climate change; however, its temporal resolution is sometimes incompatible with model requirements. A weather generator (WG) is used for temporal downscaling of input weather data for models, where the effect of WG algorithms on the reproducibility of ecosystem model outputs must be assessed. In this study, carbon and water cycles simulated by the Biome-BGC model using measured weather data and weather data generated by the CLIMGEN weather generator were compared. The measured weather data (daily precipitation, maximum and minimum air temperature) at a few sites for 30 years were collected from NNDC Online weather data. The generated weather data were produced by CLIMGEN parameterized using the measured weather data. NPP, heterotrophic respiration (HR), NEE and water outflow were simulated by Biome-BGC using measured and generated weather data. In the case of a deciduous broadleaf forest in Lushi, Henan Province, China, the 30-year average monthly NPP by WG was 10% larger than that by measured weather in the growing season. HR by WG was larger than that by measured weather in all months, by 15% on average. NEE by WG was more negative in winter and was close to that by measured weather in summer. These differences in the carbon cycle arose because the soil water content by WG was larger than that by measured weather. The difference between monthly water outflow by WG and by measured weather was large and variable, and annual outflow by WG was 50% of that by measured weather. The inconsistency in carbon and water cycles between WG and measured weather was suggested to be affected by the difference in the temporal concentration of precipitation, which was assessed.
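    The comparison underlying these numbers, multi-year monthly means of model output driven by generated versus measured weather, can be sketched as follows; the series and the 10% offset are hypothetical, not the study's data.

```python
def monthly_means(values_by_year):
    """Collapse a list of 12-element yearly series (e.g. monthly NPP for
    each of 30 years) into 12 multi-year monthly means."""
    n_years = len(values_by_year)
    return [sum(year[m] for year in values_by_year) / n_years
            for m in range(12)]

def percent_bias(generated, measured):
    """Month-by-month percent difference of output driven by generated
    weather relative to the measured-weather baseline."""
    return [100.0 * (g - m) / m for g, m in zip(generated, measured)]

# Hypothetical two-year series in which the generated-weather run is
# uniformly 10% high relative to the measured-weather run
measured = monthly_means([[100.0] * 12, [120.0] * 12])
generated = monthly_means([[110.0] * 12, [132.0] * 12])
bias = percent_bias(generated, measured)  # ~10% in every month
```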

  4. The use of real-time cell analyzer technology in drug discovery: defining optimal cell culture conditions and assay reproducibility with different adherent cellular models.

    Science.gov (United States)

    Atienzar, Franck A; Tilmant, Karen; Gerets, Helga H; Toussaint, Gaelle; Speeckaert, Sebastien; Hanon, Etienne; Depelchin, Olympe; Dhalluin, Stephane

    2011-07-01

    The use of impedance-based label-free technology applied to drug discovery is nowadays receiving more and more attention. Indeed, such a simple and noninvasive assay that interferes minimally with cell morphology and function allows one to perform kinetic measurements and to obtain information on proliferation, migration, cytotoxicity, and receptor-mediated signaling. The objective of the study was to further assess the usefulness of a real-time cell analyzer (RTCA) platform based on impedance in the context of quality control and data reproducibility. The data indicate that this technology is useful to determine the best coating and cellular density conditions for different adherent cellular models including hepatocytes, cardiomyocytes, fibroblasts, and hybrid neuroblastoma/neuronal cells. Based on 31 independent experiments, the reproducibility of cell index data generated from HepG2 cells exposed to DMSO and to Triton X-100 was satisfactory, with a coefficient of variation close to 10%. Cell index data were also well reproduced when cardiomyocytes and fibroblasts were exposed to 21 compounds three times (correlation >0.91). Overall, RTCA technology appears to be a powerful and reliable tool in drug discovery because of its reasonable throughput, rapid and efficient performance, technical optimization, and cell quality control.

  5. Three-dimensional surgical modelling with an open-source software protocol: study of precision and reproducibility in mandibular reconstruction with the fibula free flap.

    Science.gov (United States)

    Ganry, L; Quilichini, J; Bandini, C M; Leyder, P; Hersant, B; Meningaud, J P

    2017-08-01

    Very few surgical teams currently use totally independent and free solutions to perform three-dimensional (3D) surgical modelling for osseous free flaps in reconstructive surgery. This study assessed the precision and technical reproducibility of a 3D surgical modelling protocol using free open-source software in mandibular reconstruction with fibula free flaps and surgical guides. Precision was assessed through comparisons of the 3D surgical guide to the sterilized 3D-printed guide, determining accuracy to the millimetre level. Reproducibility was assessed in three surgical cases by volumetric comparison to the millimetre level. For the 3D surgical modelling, a difference of less than 0.1 mm was observed, with almost no deformations. The precision of the free flap modelling was between 0.1 mm and 0.4 mm, and the average precision of the complete reconstructed mandible was less than 1 mm. The open-source software protocol demonstrated high accuracy without complications. However, the precision of the surgical case depends on the surgeon's 3D surgical modelling. Therefore, surgeons need training on the use of this protocol before applying it to surgical cases; this constitutes a limitation. Further studies should address the transfer of expertise. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Full Text Available Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  7. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Science.gov (United States)

    Nada, Rania M; Maal, Thomas J J; Breuning, K Hero; Bergé, Stefaan J; Mostafa, Yehya A; Kuijpers-Jagtman, Anne Marie

    2011-02-09

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.
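    The evaluation metric used above, the mean absolute distance between two superimposed models, can be sketched as a brute-force nearest-neighbour computation over point clouds; this is a simplification of what dedicated 3D analysis software does, and the toy coordinates below are purely illustrative.

```python
import math

def nearest_distance(point, cloud):
    """Distance from a point to its nearest neighbour in another model."""
    return min(math.dist(point, q) for q in cloud)

def mean_absolute_distance(model_a, model_b):
    """Mean, over the points of model_a, of the distance to the closest
    point of model_b: a simple surface-discrepancy measure."""
    return sum(nearest_distance(p, model_b) for p in model_a) / len(model_a)

# Two toy 'surfaces' offset by 0.3 mm along z
model_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
model_b = [(0.0, 0.0, 0.3), (1.0, 0.0, 0.3), (0.0, 1.0, 0.3)]
```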

  8. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    OpenAIRE

    Goswami, Sonal; Samuel, Sherin; Sierra, Olga R.; Cascardi, Michele; Paré, Denis

    2012-01-01

    Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situ...

  9. Reproducibility of Quantitative Brain Imaging Using a PET-Only and a Combined PET/MR System

    DEFF Research Database (Denmark)

    Lassen, Martin L; Muzik, Otto; Beyer, Thomas

    2017-01-01

    The purpose of this study was to test the feasibility of migrating a quantitative brain imaging protocol from a positron emission tomography (PET)-only system to an integrated PET/MR system. Potential differences in both absolute radiotracer concentration as well as in the derived kinetic paramet...

  10. Assessment of a numerical model to reproduce event‐scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Measures, R.; Hicks, D. M.; Brasington, J.

    2016-01-01

    Abstract Numerical morphological modeling of braided rivers, using a physics‐based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth‐averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high‐flow event. Evaluation of model performance primarily focused upon using high‐resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach‐scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers. PMID:27708477

  11. Assessment of a numerical model to reproduce event-scale erosion and deposition distributions in a braided river.

    Science.gov (United States)

    Williams, R D; Measures, R; Hicks, D M; Brasington, J

    2016-08-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers.
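    The sediment budgets compared above come from a DEM of Difference: subtracting the pre-event from the post-event elevation grid and summing negative and positive changes separately. A minimal sketch, with a hypothetical 2 x 2 grid rather than the Rees River data:

```python
def dod_budget(dem_before, dem_after, cell_area):
    """Reach-scale sediment budget from a DEM of Difference: per-cell
    elevation change times cell area, accumulated separately into
    erosion (surface lowering) and deposition (surface raising)."""
    erosion = deposition = 0.0
    for row_before, row_after in zip(dem_before, dem_after):
        for z_before, z_after in zip(row_before, row_after):
            dz = z_after - z_before
            if dz < 0.0:
                erosion += -dz * cell_area
            else:
                deposition += dz * cell_area
    return erosion, deposition

# Hypothetical 2 x 2 grid of elevations (m) on 1 m^2 cells
before = [[1.0, 1.0], [1.0, 1.0]]
after = [[0.5, 1.2], [1.0, 0.8]]
eroded, deposited = dod_budget(before, after, cell_area=1.0)
```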

  12. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Vol. 539, No. 7628 (2016), p. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords: reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  13. Highly reproducible alkali metal doping system for organic crystals through enhanced diffusion of alkali metal by secondary thermal activation.

    Science.gov (United States)

    Lee, Jinho; Park, Chibeom; Song, Intek; Koo, Jin Young; Yoon, Taekyung; Kim, Jun Sung; Choi, Hee Cheul

    2018-05-16

    In this paper, we report an efficient alkali metal doping system for organic single crystals. Our system employs an enhanced diffusion method for the introduction of alkali metal into organic single crystals by controlling the sample temperature to induce secondary thermal activation. Using this system, we achieved intercalation of potassium into picene single crystals with close-packed crystal structures. Using optical microscopy and Raman spectroscopy, we confirmed that the resulting samples were uniformly doped and became K₂picene single crystals, whereas only parts of the crystal were doped and transformed into K₂picene without secondary thermal activation. Moreover, using a customized electrical measurement system, the insulator-to-semiconductor transition of picene single crystals upon doping was confirmed by in situ electrical conductivity and ex situ temperature-dependent resistivity measurements. X-ray diffraction studies showed that potassium atoms were intercalated between molecular layers of picene, and doped samples did not show any KH- or KOH-related peaks, indicating that picene molecules are retained without structural decomposition. During recent decades, tremendous efforts have been exerted to develop high-performance organic semiconductors and superconductors, whereas little attention has been devoted to doped organic crystals. Our method will enable efficient alkali metal doping of organic crystals and will be a resource for future systematic studies on the electrical property changes of these organic crystals upon doping.

  14. Reproducibility of the acute rejection diagnosis in human cardiac allografts. The Stanford Classification and the International Grading System

    DEFF Research Database (Denmark)

    Nielsen, H; Sørensen, Flemming Brandt; Nielsen, B

    1993-01-01

    Transplantation has become an accepted treatment of many cardiac end-stage diseases. Acute cellular rejection accounts for 15% to 20% of all graft failures. The first grading system of acute cellular rejection, the Stanford Classification, was introduced in 1979, and since then many other grading...

  15. Second edition of 'The Bethesda System for reporting cervical cytology' – atlas, website, and Bethesda interobserver reproducibility project

    Directory of Open Access Journals (Sweden)

    Nayar Ritu

    2004-10-01

    Full Text Available A joint task force of the American Society of Cytopathology (ASC) and the National Cancer Institute (NCI) recently completed a 2-year effort to revise the Bethesda System "blue book" atlas and develop a complementary web-based collection of cervical cytology images. The web-based collection of images is housed on the ASC website, which went live on November 5th, 2003; it can be directly accessed at http://www.cytopathology.org/NIH/.

  16. A CRPS-IgG-transfer-trauma model reproducing inflammatory and positive sensory signs associated with complex regional pain syndrome.

    Science.gov (United States)

    Tékus, Valéria; Hajna, Zsófia; Borbély, Éva; Markovics, Adrienn; Bagoly, Teréz; Szolcsányi, János; Thompson, Victoria; Kemény, Ágnes; Helyes, Zsuzsanna; Goebel, Andreas

    2014-02-01

    The aetiology of complex regional pain syndrome (CRPS), a highly painful, usually post-traumatic condition affecting the limbs, is unknown, but recent results have suggested an autoimmune contribution. To confirm a role for pathogenic autoantibodies, we established a passive-transfer trauma model. Prior to undergoing incision of hind limb plantar skin and muscle, mice were injected either with serum IgG obtained from chronic CRPS patients or matched healthy volunteers, or with saline. Unilateral hind limb plantar skin and muscle incision was performed to induce typical, mild tissue injury. Mechanical hyperalgesia, paw swelling, heat and cold sensitivity, weight-bearing ability, locomotor activity, motor coordination, paw temperature, and body weight were investigated for 8 days. After sacrifice, proinflammatory sensory neuropeptides and cytokines were measured in paw tissues. CRPS patient IgG treatment significantly increased hind limb mechanical hyperalgesia and oedema in the incised paw compared with IgG from healthy subjects or saline. Plantar incision induced a remarkable elevation of substance P immunoreactivity on day 8, which was significantly increased by CRPS-IgG. In this IgG-transfer-trauma model for CRPS, serum IgG from chronic CRPS patients induced clinical and laboratory features resembling the human disease. These results support the hypothesis that autoantibodies may contribute to the pathophysiology of CRPS, and that autoantibody-removing therapies may be effective treatments for long-standing CRPS. Copyright © 2013 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  17. Reproducibility of The Abdominal and Chest Wall Position by Voluntary Breath-Hold Technique Using a Laser-Based Monitoring and Visual Feedback System

    International Nuclear Information System (INIS)

    Nakamura, Katsumasa; Shioyama, Yoshiyuki; Nomoto, Satoru; Ohga, Saiji; Toba, Takashi; Yoshitake, Tadamasa; Anai, Shigeo; Terashima, Hiromi; Honda, Hiroshi

    2007-01-01

    Purpose: The voluntary breath-hold (BH) technique is a simple method to control the respiration-related motion of a tumor during irradiation. However, the abdominal and chest wall position may not be accurately reproduced using the BH technique. The purpose of this study was to examine whether visual feedback can reduce the fluctuation in wall motion during BH using a new respiratory monitoring device. Methods and Materials: We developed a laser-based BH monitoring and visual feedback system. For this study, five healthy volunteers were enrolled. The volunteers, practicing abdominal breathing, performed shallow end-expiration BH (SEBH), shallow end-inspiration BH (SIBH), and deep end-inspiration BH (DIBH) with or without visual feedback. The abdominal and chest wall positions were measured at 80-ms intervals during BHs. Results: The fluctuation in the chest wall position was smaller than that of the abdominal wall position. The reproducibility of the wall position was improved by visual feedback. With a monitoring device, visual feedback reduced the mean deviation of the abdominal wall from 2.1 ± 1.3 mm to 1.5 ± 0.5 mm, 2.5 ± 1.9 mm to 1.1 ± 0.4 mm, and 6.6 ± 2.4 mm to 2.6 ± 1.4 mm in SEBH, SIBH, and DIBH, respectively. Conclusions: Volunteers can perform the BH maneuver in a highly reproducible fashion when informed about the position of the wall, although in the case of DIBH, the deviation in the wall position remained substantial

  18. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude quotations for both artificial and natural reflectors was studied for several combinations of instrument/search unit, all being of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (95% confidence interval). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study [fr]
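    A 95% margin of the kind quoted above can be estimated from repeated dB quotations as roughly 1.96 sample standard deviations (assuming approximately normal scatter); the readings below are hypothetical, not from the report.

```python
import statistics

def db_margin_95(quotations_db):
    """Half-width of an approximate 95% interval (1.96 * sample SD) for
    repeated amplitude quotations expressed in decibels."""
    return 1.96 * statistics.stdev(quotations_db)

# Hypothetical quotations of the same reflector by different
# operator/instrument/search-unit combinations (dB vs. a reference)
readings = [0.0, 2.0, -2.0, 1.0, -1.0, 3.0, -3.0]
margin = db_margin_95(readings)
```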

  19. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Full Text Available Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.

  20. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    Science.gov (United States)

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies in consistently quantifying the performance of three commercial intracranial stents, and contribute to reinforce the confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and

  1. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes, as well as overall preference, was based on consistency tests of binary paired-comparison judgments and on modeling the choice frequencies using probabilistic choice models. As a result, the preferences of non-expert listeners could be measured reliably at a ratio scale level. Principal components derived...
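The probabilistic choice modeling of paired-comparison judgments mentioned above can be sketched with a Bradley-Terry model, a standard choice model for binary preference data. The win counts and format labels below are invented for illustration; they are not the study's data, and the study may have used a different choice model.

```python
# Toy Bradley-Terry fit via the classic iterative (MM) algorithm.
# wins[i][j] = number of times format i was preferred over format j.
# All numbers here are hypothetical.
formats = ["mono", "stereo", "5.0 surround"]
wins = [
    [0, 3, 2],
    [17, 0, 8],
    [18, 12, 0],
]

# Worth parameters p_i with P(i preferred over j) = p_i / (p_i + p_j).
n_items = len(formats)
p = [1.0] * n_items
for _ in range(200):
    new_p = []
    for i in range(n_items):
        w_i = sum(wins[i])  # total wins of item i
        # n_ij = total comparisons of pair (i, j)
        denom = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                    for j in range(n_items) if j != i)
        new_p.append(w_i / denom)
    total = sum(new_p)
    p = [x / total for x in new_p]  # normalize worths to sum to 1

for name, worth in zip(formats, p):
    print(f"{name}: worth = {worth:.3f}")
```

With balanced comparison counts, the fitted worths order the formats by their win totals, giving a ratio-scale preference estimate of the kind the abstract describes.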

  2. Modeller af komplicerede systemer

    DEFF Research Database (Denmark)

    Mortensen, J.

    This thesis, "Modeller af komplicerede systemer", represents part of the requirements for the Danish Ph.D. degree. Assisting professor John Nørgaard-Nielsen, M.Sc.E.E., Ph.D., has been principal supervisor and professor Morten Lind, M.Sc.E.E., Ph.D., has been assisting supervisor. The thesis is concerned with conceptual modeling in relation to process control. Its purpose is to present, classify and exemplify the use of a set of qualitative model types. Such model types are useful in the early phase of modeling, where no structured methods are at hand. Although the models are general in character, this thesis emphasizes their use in relation to technical systems. All the presented models, with the exception of the types presented in chapter 2, are non-theoretical, non-formal conceptual network models. Two new model types are presented: 1) The System-Environment model, which describes the environment's interaction...

  3. The systems integration modeling system

    International Nuclear Information System (INIS)

    Danker, W.J.; Williams, J.R.

    1990-01-01

    This paper discusses the systems integration modeling system (SIMS), an analysis tool for the detailed evaluation of the structure and related performance of the Federal Waste Management System (FWMS) and its interface with waste generators. Its use for evaluations in support of system-level decisions as to FWMS configurations; the allocation, sizing, balancing and integration of functions among elements; and the establishment of system-preferred waste selection and sequencing methods and other operating strategies is presented. SIMS includes major analysis submodels which quantify the detailed characteristics of individual waste items, loaded casks and waste packages, simulate the detailed logistics of handling and processing discrete waste items and packages, and perform detailed cost evaluations

  4. Magnet stability and reproducibility

    CERN Document Server

    Marks, N

    2010-01-01

    Magnet stability and reproducibility have become increasingly important as greater precision and beams with smaller dimensions are required for research, medical and other purposes. The observed causes of mechanical and electrical instability are introduced and the engineering arrangements needed to minimize these problems are discussed; the resulting performance of a state-of-the-art synchrotron source (Diamond) is then presented. The need for orbit feedback to obtain the best possible beam stability is briefly introduced, omitting details of the necessary technical equipment, which are outside the scope of the presentation.

  5. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. 
We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  6. Retrospective Correction of Physiological Noise: Impact on Sensitivity, Specificity, and Reproducibility of Resting-State Functional Connectivity in a Reading Network Model.

    Science.gov (United States)

    Krishnamurthy, Venkatagiri; Krishnamurthy, Lisa C; Schwam, Dina M; Ealey, Ashley; Shin, Jaemin; Greenberg, Daphne; Morris, Robin D

    2018-03-01

    It is well accepted that physiological noise (PN) obscures the detection of neural fluctuations in resting-state functional connectivity (rsFC) magnetic resonance imaging. However, a clear consensus on an optimal PN correction (PNC) methodology, and on how it impacts the rsFC signal characteristics, is still lacking. In this study, we probe the impact of three PNC methods: RETROICOR (Glover et al., 2000), ANATICOR (Jo et al., 2010), and RVTMBPM (Bianciardi et al., 2009). Using a reading network model, we systematically explore the effects of PNC optimization on sensitivity, specificity, and reproducibility of rsFC signals. In terms of specificity, ANATICOR was found to be effective in removing local white matter (WM) fluctuations but also resulted in aggressive removal of expected cortical-to-subcortical functional connections. The ability of RETROICOR to remove PN was equivalent to removal of simulated random PN, such that it artificially inflated the connection strength, thereby decreasing sensitivity. RVTMBPM maintained specificity and sensitivity by balanced removal of vasodilatory PN and local WM nuisance edges. Another aspect of this work was exploring the effects of PNC on identifying reading group differences. Most PNC methods accounted for between-subject PN variability, resulting in reduced intersession reproducibility. This effect facilitated the detection of the most consistent group differences. RVTMBPM was most effective in detecting significant group differences due to its inherent sensitivity to removing spatially structured and temporally repeating PN arising from dense vasculature. Finally, results suggest that combining all three PNC methods resulted in "overcorrection" by removing signal along with noise.
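The PNC methods compared above are, at their core, nuisance-regression techniques: physiological regressors are fit to each voxel time series by least squares and subtracted. The sketch below is a generic, simplified illustration of that idea (one cardiac regressor, simulated data), not the authors' pipeline; the sampling rate, frequencies, and amplitudes are all invented.

```python
# Minimal regression-based physiological noise removal on simulated data.
import math
import random

random.seed(0)
n = 200
t = [i * 0.5 for i in range(n)]  # sampling interval 0.5 s (assumed)

# Simulated components: slow "neural" signal plus a ~1.1 Hz cardiac nuisance.
cardiac = [math.sin(2 * math.pi * 1.1 * ti) for ti in t]
neural = [0.3 * math.sin(2 * math.pi * 0.05 * ti) for ti in t]
voxel = [neural[i] + 0.8 * cardiac[i] + random.gauss(0, 0.05)
         for i in range(n)]

# Ordinary least squares for the model: voxel ~ b0 + b1 * cardiac
mx = sum(cardiac) / n
my = sum(voxel) / n
b1 = (sum((cardiac[i] - mx) * (voxel[i] - my) for i in range(n))
      / sum((c - mx) ** 2 for c in cardiac))
b0 = my - b1 * mx

# "Cleaned" residual time series with the fitted nuisance removed.
cleaned = [voxel[i] - (b0 + b1 * cardiac[i]) for i in range(n)]
print(f"estimated cardiac weight b1 = {b1:.2f} (true value 0.8)")
```

Real PNC pipelines use many regressors (cardiac/respiratory phase expansions, local WM averages, respiration-volume traces) in a multiple-regression design, but the subtraction step is the same in spirit.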

  7. Microbial community development in a dynamic gut model is reproducible, colon region specific, and selective for Bacteroidetes and Clostridium cluster IX.

    Science.gov (United States)

    Van den Abbeele, Pieter; Grootaert, Charlotte; Marzorati, Massimo; Possemiers, Sam; Verstraete, Willy; Gérard, Philippe; Rabot, Sylvie; Bruneau, Aurélia; El Aidy, Sahar; Derrien, Muriel; Zoetendal, Erwin; Kleerebezem, Michiel; Smidt, Hauke; Van de Wiele, Tom

    2010-08-01

    Dynamic, multicompartment in vitro gastrointestinal simulators are often used to monitor gut microbial dynamics and activity. These reactors need to harbor a microbial community that is stable upon inoculation, colon region specific, and relevant to in vivo conditions. Together with the reproducibility of the colonization process, these criteria are often overlooked when the modulatory properties of different treatments are compared. We therefore investigated the microbial colonization process in two identical simulators of the human intestinal microbial ecosystem (SHIME), simultaneously inoculated with the same human fecal microbiota, using a high-resolution phylogenetic microarray: the human intestinal tract chip (HITChip). Following inoculation of the in vitro colon compartments, microbial community composition reached steady state after 2 weeks, whereas 3 weeks were required to reach functional stability. This dynamic colonization process was reproducible in both SHIME units and resulted in highly diverse microbial communities which were colon region specific, with the proximal regions harboring saccharolytic microbes (e.g., Bacteroides spp. and Eubacterium spp.) and the distal regions harboring mucin-degrading microbes (e.g., Akkermansia spp.). Importantly, the shift from an in vivo to an in vitro environment resulted in an increased Bacteroidetes/Firmicutes ratio, whereas Clostridium cluster IX (propionate producers) was enriched compared to clusters IV and XIVa (butyrate producers). This was supported by proportionally higher in vitro propionate concentrations. In conclusion, high-resolution analysis of in vitro cultured gut microbiota offers new insight into the microbial colonization process and indicates the importance of digestive parameters that may be crucial in the development of new in vitro models.

  8. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low energy mass spectrometry. Although uncertainties for accelerator measured isotope ratios as low as 1% have been obtained under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics and generally the reproducibility is unknown since the samples are only measured once
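Since the quoted errors are said to be dominated by statistics, a back-of-the-envelope check is useful: for Poisson counting of the rare isotope, the relative uncertainty of N detected atoms is roughly 1/sqrt(N). This aside is ours, not the paper's, but it shows why sub-percent precision demands very large counts.

```python
# Poisson counting statistics: relative uncertainty of an isotope-ratio
# measurement limited only by counts of the rare isotope.
import math

def relative_uncertainty(counts):
    """Relative standard deviation sqrt(N)/N = 1/sqrt(N) for Poisson counts."""
    return 1.0 / math.sqrt(counts)

# Counts of the rare isotope needed to reach a given relative precision:
for target in (0.01, 0.05, 0.10):
    n_needed = math.ceil(1.0 / target ** 2)
    print(f"{target:.0%} precision needs ~{n_needed} counts")
```

A 1% measurement thus needs on the order of 10^4 counts of the rare isotope, while 10% needs only about 100, which is consistent with the 5 to 20% range quoted for natural samples with low count rates.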

  9. The Earth System Model

    Science.gov (United States)

    Schoeberl, Mark; Rood, Richard B.; Hildebrand, Peter; Raymond, Carol

    2003-01-01

    The Earth System Model is the natural evolution of current climate models and will be the ultimate embodiment of our geophysical understanding of the planet. These models are constructed from component models (atmosphere, ocean, ice, land, chemistry, solid earth, etc.) merged together through a coupling program which is responsible for the exchange of data among the components. Climate models and future earth system models will have standardized modules, and these standards are now being developed by the ESMF project funded by NASA. The Earth System Model will have a variety of uses beyond climate prediction. The model can be used to build climate data records, making it the core of an assimilation system, and it can be used in observing system simulation experiments (OSSEs). The computing and storage requirements for the ESM appear to be daunting. However, the Japanese ES theoretical computing capability is already within 20% of the minimum requirements needed for some 2010 climate model applications. Thus it seems very possible that a focused effort to build an Earth System Model will achieve success.

  10. RSMASS system model development

    International Nuclear Information System (INIS)

    Marshall, A.C.; Gallup, D.R.

    1998-01-01

    RSMASS system mass models have been used for more than a decade to make rapid estimates of space reactor power system masses. This paper reviews the evolution of the RSMASS models and summarizes present capabilities. RSMASS has evolved from a simple model used to make rough estimates of space reactor and shield masses to a versatile space reactor power system model. RSMASS uses unique reactor and shield models that permit rapid mass optimization calculations for a variety of space reactor power and propulsion systems. The RSMASS-D upgrade of the original model includes algorithms for the balance of the power system, a number of reactor and shield modeling improvements, and an automatic mass optimization scheme. The RSMASS-D suite of codes covers a very broad range of reactor and power conversion system options as well as propulsion and bimodal reactor systems. Reactor choices include in-core and ex-core thermionic reactors, liquid-metal-cooled reactors, particle bed reactors, and prismatic configuration reactors. Power conversion options include thermoelectric, thermionic, Stirling, Brayton, and Rankine approaches. Program output includes all major component masses and dimensions, efficiencies, and a description of the design parameters for a mass-optimized system. In the past, RSMASS has been used as an aid to identify and select promising concepts for space power applications. The RSMASS modeling approach has been demonstrated to be a valuable tool for guiding optimization of the power system design; consequently, the model is useful during system design and development as well as during the selection process. An improved in-core thermionic reactor system model, RSMASS-T, is now under development. The current development of the RSMASS-T code represents the next evolutionary stage of the RSMASS models. RSMASS-T includes many modeling improvements and is planned to be more user-friendly. RSMASS-T will be released as a fully documented, certified code at the end of

  11. A highly reproducible solenoid micropump system for the analysis of total inorganic carbon and ammonium using gas-diffusion with conductimetric detection.

    Science.gov (United States)

    Henríquez, Camelia; Horstkotte, Burkhard; Cerdà, Víctor

    2014-01-01

    In this work, a simple, economic, and miniaturized flow-based analyzer based on solenoid micropumps is presented. It was applied to determine two parameters of high environmental interest: ammonium and total inorganic carbon (TIC) in natural waters. The method is based on gas diffusion (GD) of CO₂ and NH₃ through a hydrophobic gas-permeable membrane from an acidic or alkaline donor stream, respectively. The analytes are trapped in an acceptor solution, slightly alkaline for CO₂ and slightly acidic for NH₃. The analytes are quantified using a homemade stainless steel conductimetric cell. The proposed system required five solenoid micropumps, one for each reagent and sample. Two specially made air bubble traps were placed downstream of the solenoid pumps which provided the acceptor solutions, thereby increasing the method's reproducibility. Values of RSD lower than 1% were obtained. Achieved limits of detection were 0.27 µmol L⁻¹ for NH₄⁺ and 50 µmol L⁻¹ for TIC. Add-recovery tests were used to prove the trueness of the method, and recoveries of 99.5 ± 7.5% were obtained for both analytes. The proposed system proved to be adequate for monitoring TIC and NH₄⁺ due to its high sample throughput and repeatability. © 2013 Published by Elsevier B.V.
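Limits of detection like those quoted above are commonly estimated from a calibration line as 3·σ(blank)/slope. The sketch below illustrates that convention with invented calibration data; it is not the paper's data or necessarily its exact LOD procedure.

```python
# Hypothetical LOD estimate for a conductimetric flow analyzer:
# LOD = 3 * sigma_blank / m, with m the calibration slope.
# Calibration points: conductance change (uS) vs concentration (umol/L).
conc = [0.0, 2.0, 4.0, 8.0, 16.0]
signal = [0.02, 0.41, 0.83, 1.62, 3.21]

n = len(conc)
mx = sum(conc) / n
my = sum(signal) / n
slope = (sum((conc[i] - mx) * (signal[i] - my) for i in range(n))
         / sum((x - mx) ** 2 for x in conc))

sigma_blank = 0.015  # std. dev. of repeated blank readings (assumed)
lod = 3 * sigma_blank / slope
print(f"slope = {slope:.4f} uS per umol/L, LOD = {lod:.3f} umol/L")
```

The same arithmetic applies to either channel (NH₄⁺ or TIC); only the calibration data and blank noise differ.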

  12. Systemic resilience model

    International Nuclear Information System (INIS)

    Lundberg, Jonas; Johansson, Björn JE

    2015-01-01

    It has been realized that resilience as a concept involves several contradictory definitions, for instance resilience as agile adjustment versus resilience as robust resistance to situations. Our analysis of resilience concepts and models suggests that, beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, and for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies

  13. Selected System Models

    Science.gov (United States)

    Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.

    Apart from the general issue of modeling the channel, the PHY and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them. These are IEEE 802.11 (as an example for wireless local area networks), IEEE 802.16 (as an example for wireless metropolitan networks) and IEEE 802.15 (as an example for body area networks). Each section on these three systems also closes with a discussion of a set of model implementations that are available today.

  14. Reproducing early Martian atmospheric carbon dioxide partial pressure by modeling the formation of Mg-Fe-Ca carbonate identified in the Comanche rock outcrops on Mars

    Science.gov (United States)

    Berk, Wolfgang; Fu, Yunjiao; Ilger, Jan-Michael

    2012-10-01

    The well-defined composition of the Comanche rock's carbonate (Magnesite 0.62, Siderite 0.25, Calcite 0.11, Rhodochrosite 0.02) and its host rock's composition, dominated by Mg-rich olivine, enable us to reproduce the atmospheric CO₂ partial pressure that may have triggered the formation of these carbonates. Hydrogeochemical one-dimensional transport modeling reveals that similar aqueous rock alteration conditions (including CO₂ partial pressure) may have led to the formation of Mg-Fe-Ca carbonate identified in the Comanche rock outcrops (Gusev Crater) and also in the ultramafic rocks exposed in the Nili Fossae region. Hydrogeochemical conditions enabling the formation of Mg-rich solid solution carbonate result from equilibrium species distributions involving (1) ultramafic rocks (ca. 32 wt% olivine; Fo 0.72, Fa 0.28), (2) pure water, and (3) CO₂ partial pressures of ca. 0.5 to 2.0 bar at water-to-rock ratios of ca. 500 mol H₂O per mol rock and ca. 5°C (278 K). Our modeled carbonate composition (Magnesite 0.64, Siderite 0.28, Calcite 0.08) matches the measured composition of carbonates preserved in the Comanche rocks. Considerably different carbonate compositions are achieved at (1) higher temperature (85°C), (2) water-to-rock ratios considerably higher and lower than 500 mol mol⁻¹, and (3) CO₂ partial pressures differing from 1.0 bar in the model setup. The Comanche rocks, hosting the carbonate, may have been subjected to long-lasting (>10⁴ to 10⁵ years) aqueous alteration processes triggered by atmospheric CO₂ partial pressures of ca. 1.0 bar at low temperature. Their outcrop may represent a fragment of the upper layers of an altered olivine-rich rock column, which is characterized by newly formed Mg-Fe-Ca solid solution carbonate, and phyllosilicate-rich alteration assemblages within deeper (unexposed) units.

  15. Modeling cellular systems

    CERN Document Server

    Matthäus, Franziska; Pahle, Jürgen

    2017-01-01

    This contributed volume comprises research articles and reviews on topics connected to the mathematical modeling of cellular systems. These contributions cover signaling pathways, stochastic effects, cell motility and mechanics, pattern formation processes, as well as multi-scale approaches. All authors attended the workshop on "Modeling Cellular Systems" which took place in Heidelberg in October 2014. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  16. Modeling Sustainable Food Systems.

    Science.gov (United States)

    Allen, Thomas; Prosperi, Paolo

    2016-05-01

    The processes underlying environmental, economic, and social unsustainability derive in part from the food system. Building sustainable food systems has become a predominating endeavor aiming to redirect our food systems and policies towards better-adjusted goals and improved societal welfare. Food systems are complex social-ecological systems involving multiple interactions between human and natural components. Policy needs to encourage public perception of humanity and nature as interdependent and interacting. The systemic nature of these interdependencies and interactions calls for systems approaches and integrated assessment tools. Identifying and modeling the intrinsic properties of the food system that will ensure its essential outcomes are maintained or enhanced over time and across generations, will help organizations and governmental institutions to track progress towards sustainability, and set policies that encourage positive transformations. This paper proposes a conceptual model that articulates crucial vulnerability and resilience factors to global environmental and socio-economic changes, postulating specific food and nutrition security issues as priority outcomes of food systems. By acknowledging the systemic nature of sustainability, this approach allows consideration of causal factor dynamics. In a stepwise approach, a logical application is schematized for three Mediterranean countries, namely Spain, France, and Italy.

  17. Modeling Complex Systems

    CERN Document Server

    Boccara, Nino

    2010-01-01

    Modeling Complex Systems, 2nd Edition, explores the process of modeling complex systems, providing examples from such diverse fields as ecology, epidemiology, sociology, seismology, and economics. It illustrates how models of complex systems are built and provides indispensable mathematical tools for studying their dynamics. This vital introductory text is useful for advanced undergraduate students in various scientific disciplines, and serves as an important reference book for graduate students and young researchers. This enhanced second edition includes: . -recent research results and bibliographic references -extra footnotes which provide biographical information on cited scientists who have made significant contributions to the field -new and improved worked-out examples to aid a student’s comprehension of the content -exercises to challenge the reader and complement the material Nino Boccara is also the author of Essentials of Mathematica: With Applications to Mathematics and Physics (Springer, 2007).

  18. Modeling Complex Systems

    International Nuclear Information System (INIS)

    Schreckenberg, M

    2004-01-01

    This book by Nino Boccara presents a compilation of model systems commonly termed as 'complex'. It starts with a definition of the systems under consideration and how to build up a model to describe the complex dynamics. The subsequent chapters are devoted to various categories of mean-field type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself but this is much better than I could ever have written it!). Only section 6 on cellular automata is a little too limited to the author's point of view and one would have expected more about the famous Domany-Kinzel model (and more accurate citations!). In my opinion this is one of the best textbooks published during the last decade and even experts can learn a lot from it. Hopefully there will be an update after, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)

  19. Modeling the earth system

    Energy Technology Data Exchange (ETDEWEB)

    Ojima, D. [ed.]

    1992-12-31

    The 1990 Global Change Institute (GCI) on Earth System Modeling is the third of a series organized by the Office for Interdisciplinary Earth Studies to look in depth at particular issues critical to developing a better understanding of the earth system. The 1990 GCI on Earth System Modeling was organized around three themes: defining critical gaps in the knowledge of the earth system, developing simplified working models, and validating comprehensive system models. This book is divided into three sections that reflect these themes. Each section begins with a set of background papers offering a brief tutorial on the subject, followed by working group reports developed during the institute. These reports summarize the joint ideas and recommendations of the participants and bring to bear the interdisciplinary perspective that imbued the institute. Since the conclusion of the 1990 Global Change Institute, research programs, nationally and internationally, have moved forward to implement a number of the recommendations made at the institute, and many of the participants have maintained collegial interactions to develop research projects addressing the needs identified during the two weeks in Snowmass.

  20. A Model to Reproduce the Response of the Gaseous Fission Product Monitor (GFPM) in a CANDU® 6 Reactor (An Estimate of Tramp Uranium Mass in a CANDU Core)

    Energy Technology Data Exchange (ETDEWEB)

    Mostofian, Sara; Boss, Charles [AECL Atomic Energy of Canada Limited, 2251 Speakman Drive, Mississauga Ontario L5K 1B2 (Canada)

    2008-07-01

    In a CANada Deuterium Uranium (CANDU) reactor, the fuel bundles produce gaseous and volatile fission products that are contained within the fuel matrix and the welded Zircaloy sheath. Sometimes a fuel sheath can develop a defect and release the fission products into the circulating coolant. To detect fuel defects, a Gaseous Fission Product Monitoring (GFPM) system is provided in CANDU reactors. The GFPM is a gamma-ray spectrometer that measures fission products in the coolant and alerts the operator to the presence of defected fuel through an increase in measured fission product concentration. A background fission product concentration in the coolant also arises from tramp uranium. The sources of the tramp uranium are small quantities of uranium contamination on the surfaces of fuel bundles and traces of uranium on the pressure tubes, arising from the rare defected fuel element that released uranium into the core. This paper presents a dynamic model that reproduces the behaviour of a GFPM in a CANDU 6 plant. The model predicts the fission product concentrations in the coolant from the chronic concentration of tramp uranium on the inner surface of the pressure tubes (PT) and the surface of the fuel bundles (FB), taking into account the on-power refuelling system. (authors)

  1. System equivalent model mixing

    Science.gov (United States)

    Klaassen, Steven W. B.; van der Seijs, Maarten V.; de Klerk, Dennis

    2018-05-01

    This paper introduces SEMM: a method based on Frequency Based Substructuring (FBS) techniques that enables the construction of hybrid dynamic models. With System Equivalent Model Mixing (SEMM), frequency-based models, either of numerical or experimental nature, can be mixed to form a hybrid model. This model follows the dynamic behaviour of a predefined weighted master model. A large variety of applications can be thought of, such as the DoF-space expansion of relatively small experimental models using numerical models, or the blending of different models in the frequency spectrum. SEMM is outlined, both mathematically and conceptually, based on a notation commonly used in FBS. A critical physical interpretation of the theory is provided next, along with a comparison to similar techniques, namely DoF expansion techniques. SEMM's concept is further illustrated by means of a numerical example. It will become apparent that the basic method of SEMM has some shortcomings which warrant a few extensions to the method. One of the main applications is tested in a practical case, performed on a validated benchmark structure; it will emphasize the practicality of the method.

  2. Mechanical Systems, Classical Models

    CERN Document Server

    Teodorescu, Petre P

    2009-01-01

    This third volume completes the work Mechanical Systems, Classical Models. The first two volumes dealt with particle dynamics and with discrete and continuous mechanical systems. The present volume studies analytical mechanics. Topics like Lagrangian and Hamiltonian mechanics, the Hamilton-Jacobi method, and a study of systems with separable variables are thoroughly discussed. Also included are variational principles and canonical transformations, integral invariants and exterior differential calculus, and particular attention is given to non-holonomic mechanical systems. The author explains in detail all important aspects of the science of mechanics, regarded as a natural science, and shows how they are useful in understanding important natural phenomena and solving problems of interest in applied and engineering sciences. Professor Teodorescu has spent more than fifty years as a Professor of Mechanics at the University of Bucharest and this book relies on the extensive literature on the subject as well as th...

  3. Prognostic Performance and Reproducibility of the 1973 and 2004/2016 World Health Organization Grading Classification Systems in Non-muscle-invasive Bladder Cancer: A European Association of Urology Non-muscle Invasive Bladder Cancer Guidelines Panel Systematic Review.

    Science.gov (United States)

    Soukup, Viktor; Čapoun, Otakar; Cohen, Daniel; Hernández, Virginia; Babjuk, Marek; Burger, Max; Compérat, Eva; Gontero, Paolo; Lam, Thomas; MacLennan, Steven; Mostafid, A Hugh; Palou, Joan; van Rhijn, Bas W G; Rouprêt, Morgan; Shariat, Shahrokh F; Sylvester, Richard; Yuan, Yuhong; Zigeuner, Richard

    2017-11-01

    Tumour grade is an important prognostic indicator in non-muscle-invasive bladder cancer (NMIBC). Histopathological classifications are limited by interobserver variability (reproducibility), which may have prognostic implications. European Association of Urology NMIBC guidelines suggest concurrent use of both 1973 and 2004/2016 World Health Organization (WHO) classifications. To compare the prognostic performance and reproducibility of the 1973 and 2004/2016 WHO grading systems for NMIBC. A systematic literature search was undertaken incorporating Medline, Embase, and the Cochrane Library. Studies were critically appraised for risk of bias (QUIPS). For prognosis, the primary outcome was progression to muscle-invasive or metastatic disease. Secondary outcomes were disease recurrence, and overall and cancer-specific survival. For reproducibility, the primary outcome was interobserver variability between pathologists. Secondary outcome was intraobserver variability (repeatability) by the same pathologist. Of 3593 articles identified, 20 were included in the prognostic review; three were eligible for the reproducibility review. Increasing tumour grade in both classifications was associated with higher disease progression and recurrence rates. Progression rates in grade 1 patients were similar to those in low-grade patients; progression rates in grade 3 patients were higher than those in high-grade patients. Survival data were limited. Reproducibility of the 2004/2016 system was marginally better than that of the 1973 system. Two studies on repeatability showed conflicting results. Most studies had a moderate to high risk of bias. Current grading classifications in NMIBC are suboptimal. The 1973 system identifies more aggressive tumours. Intra- and interobserver variability was slightly less in the 2004/2016 classification. We could not confirm that the 2004/2016 classification outperforms the 1973 classification in prediction of recurrence and progression. 

  4. Isolated heart models: cardiovascular system studies and technological advances.

    Science.gov (United States)

    Olejnickova, Veronika; Novakova, Marie; Provaznik, Ivo

    2015-07-01

    The isolated heart model is a relevant tool for cardiovascular system studies. It represents a highly reproducible model for studying a broad spectrum of biochemical, physiological, morphological, and pharmaceutical parameters, including analysis of intrinsic heart mechanics, metabolism, and coronary vascular response. Results obtained in this model are free of the influence of other organ systems, of plasma concentrations of hormones and ions, and of the autonomic nervous system. The review describes various isolated heart models, the modes of heart perfusion, and the advantages and limitations of various experimental setups. It also reports the authors' improvements to the Langendorff perfusion setup.

  5. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \\LaTeX. The main part of this paper is an example showing how to use these tools together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  6. Modeling dental radiographic systems

    International Nuclear Information System (INIS)

    Webber, R.L.

    1980-01-01

    The Bureau of Radiological Health has been actively collaborating with the Clinical Investigations Branch, NIDR, in applied research involving diagnostic use of ionizing radiation in dentistry. This work has centered on the search for alternatives to conventional radiographic systems in an attempt to improve diagnostic performance while reducing the required exposure. The basic approach involves analysis of factors limiting performance of properly defined diagnostic tasks and the modeling of alternative systems with an eye toward increasing objective measures of performance. Previous collaborative work involved using a nonlinear model to compare various x-ray spectra. The data were expressed as brightness-contrast versus exposure for simulated tasks of clinical interest. This report supplements these findings by extending the number of parameters under investigation and modifying the mode of data display so that an actual radiographic image can be simulated on a television screen.

  7. Production process reproducibility and product quality consistency of transient gene expression in HEK293 cells with anti-PD1 antibody as the model protein.

    Science.gov (United States)

    Ding, Kai; Han, Lei; Zong, Huifang; Chen, Junsheng; Zhang, Baohong; Zhu, Jianwei

    2017-03-01

    Demonstration of reproducibility and consistency of process and product quality is one of the most crucial issues in using transient gene expression (TGE) technology for biopharmaceutical development. In this study, we challenged the production consistency of TGE by expressing nine batches of recombinant IgG antibody in human embryonic kidney 293 cells to evaluate reproducibility including viable cell density, viability, apoptotic status, and antibody yield in cell culture supernatant. Product quality including isoelectric point, binding affinity, secondary structure, and thermal stability was assessed as well. In addition, major glycan forms of antibody from different batches of production were compared to demonstrate glycosylation consistency. Glycan compositions of the antibody harvested at different time periods were also measured to illustrate N-glycan distribution over the culture time. From the results, it has been demonstrated that different TGE batches are reproducible from lot to lot in overall cell growth, product yield, and product qualities including isoelectric point, binding affinity, secondary structure, and thermal stability. Furthermore, major N-glycan compositions are consistent among different TGE batches and conserved during cell culture time.

  8. Modeling Novo Nordisk Production Systems

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth

    1997-01-01

    This report describes attributes of models and systems, and how models can be used for the description of production systems. Special attention is given to the 'Theory of Domains'.

  9. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.
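The adjusted analysis described above can be sketched numerically. The snippet below fits a logistic regression of replication success on a contextual-sensitivity score while statistically adjusting for power and effect size; all data are simulated for illustration and do not come from the Reproducibility Project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical recoding of 100 studies: contextual sensitivity (1-5),
# statistical power, and effect size. All values are invented.
n = 100
sensitivity = rng.integers(1, 6, n).astype(float)
power = rng.uniform(0.2, 0.95, n)
effect = rng.uniform(0.1, 0.8, n)

# Simulate replication success with a negative sensitivity effect,
# mirroring the direction of the reported association.
logit = -0.8 * sensitivity + 2.0 * power + 1.5 * effect + 1.0
success = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit logistic regression by gradient ascent on the log-likelihood,
# adjusting for power and effect size as covariates.
X = np.column_stack([np.ones(n), sensitivity, power, effect])
beta = np.zeros(4)
for _ in range(20000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.01 * X.T @ (success - p) / n

print("coefficient on contextual sensitivity: %.2f" % beta[1])
```

With this simulated data the fitted coefficient on sensitivity comes out negative, i.e. higher contextual sensitivity predicts lower replication odds after adjustment.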

  10. Thermodynamic modeling of the Mg-Al-Ca system

    Energy Technology Data Exchange (ETDEWEB)

    Janz, A.; Groebner, J. [Clausthal University of Technology, Institute of Metallurgy, Robert-Koch-Str. 42, D-38678 Clausthal-Zellerfeld (Germany); Cao, H.; Zhu, J.; Chang, Y.A. [Department of Materials Science and Engineering, University of Wisconsin, 1509 University Ave., Madison, WI 53706 (United States); Schmid-Fetzer, R. [Clausthal University of Technology, Institute of Metallurgy, Robert-Koch-Str. 42, D-38678 Clausthal-Zellerfeld (Germany)], E-mail: schmid-fetzer@tu-clausthal.de

    2009-02-15

    A thermodynamic model has been developed that provides a quantitative description for a wide area of the Mg-Al-Ca system. All available experimental data plus new key experiments using differential scanning calorimetry/differential thermal analysis have been considered to create a dataset which reproduces the primary crystallizing phases, the extensive ternary solubilities of binary phases and the ternary C36 Laves phase. This enables validated thermodynamic calculations in various areas of this ternary system.

  11. Model for paramagnetic Fermi systems

    International Nuclear Information System (INIS)

    Ainsworth, T.L.; Bedell, K.S.; Brown, G.E.; Quader, K.F.

    1983-01-01

    We develop a model for paramagnetic Fermi liquids. This model has both direct and induced interactions, the latter including both density-density and current-current response. The direct interactions are chosen to reproduce the Fermi liquid parameters F^s_0, F^a_0, and F^s_1 and to satisfy the forward scattering sum rule. The F^a_1 and F^{s,a}_l for l>1 are determined self-consistently by the induced interactions; they are checked against experimental determinations. The model is applied in detail to liquid ³He, using data from spin-echo experiments, sound attenuation, and the velocities of first and zero sound. Consistency with experiments gives definite preferences for values of m. The model is also applied to paramagnetic metals. Arguments are given that this model should provide a basis for calculating effects of magnetic fields.
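For reference, the forward scattering sum rule invoked here takes its standard form in Landau Fermi-liquid theory (quoted from standard theory, not from this abstract; F^s_l and F^a_l are the symmetric and antisymmetric Landau parameters):

```latex
\sum_{l=0}^{\infty}\left[\frac{F_l^{s}}{1+F_l^{s}/(2l+1)}
                        +\frac{F_l^{a}}{1+F_l^{a}/(2l+1)}\right]=0
```

It expresses the Pauli-principle constraint that quasiparticle forward scattering in identical spin states must vanish, which is why it can be used to constrain the direct interaction.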

  12. On the origin of reproducible sequential activity in neural circuits

    Science.gov (United States)

    Afraimovich, V. S.; Zhigulin, V. P.; Rabinovich, M. I.

    2004-12-01

    Robustness and reproducibility of sequential spatio-temporal responses is an essential feature of many neural circuits in sensory and motor systems of animals. The most common mathematical images of dynamical regimes in neural systems are fixed points, limit cycles, chaotic attractors, and continuous attractors (attractive manifolds of neutrally stable fixed points). These are not suitable for the description of reproducible transient sequential neural dynamics. In this paper we present the concept of a stable heteroclinic sequence (SHS), which is not an attractor. SHS opens the way for understanding and modeling of transient sequential activity in neural circuits. We show that this new mathematical object can be used to describe robust and reproducible sequential neural dynamics. Using the framework of a generalized high-dimensional Lotka-Volterra model, that describes the dynamics of firing rates in an inhibitory network, we present analytical results on the existence of the SHS in the phase space of the network. With the help of numerical simulations we confirm its robustness in presence of noise in spite of the transient nature of the corresponding trajectories. Finally, by referring to several recent neurobiological experiments, we discuss possible applications of this new concept to several problems in neuroscience.
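A minimal sketch of the generalized Lotka-Volterra picture described above: three firing-rate units with asymmetric inhibition form a stable heteroclinic sequence, so the identity of the most active unit cycles in a fixed, reproducible order. The coupling values are illustrative, not taken from the paper.

```python
import numpy as np

# Generalized Lotka-Volterra rate model for N inhibitory units:
#   da_i/dt = a_i * (sigma_i - sum_j rho_ij * a_j)
# Asymmetric inhibition produces winnerless competition: activity
# visits the units in the fixed order 0 -> 1 -> 2 -> 0.
N = 3
sigma = np.ones(N)
rho = np.ones((N, N))
for i in range(N):
    rho[i, (i - 1) % N] = 0.5   # weak inhibition from predecessor
    rho[i, (i + 1) % N] = 2.0   # strong inhibition from successor

a = np.array([0.9, 0.05, 0.05])  # unit 0 starts dominant
dt, steps = 0.01, 60000
winners = []
for _ in range(steps):
    a += dt * a * (sigma - rho @ a)
    a = np.clip(a, 1e-9, None)   # floor acts like a small noise level
    winners.append(int(np.argmax(a)))

# The most active unit should cycle through all three units.
print(sorted(set(winners)))
```

The small floor on the rates plays the role of the noise in the paper: it bounds the dwell time near each saddle so the trajectory keeps moving along the heteroclinic sequence instead of slowing down indefinitely.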

  13. Contextual sensitivity in scientific reproducibility

    Science.gov (United States)

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  14. Reproducibility of somatosensory spatial perceptual maps.

    Science.gov (United States)

    Steenbergen, Peter; Buitenweg, Jan R; Trojan, Jörg; Veltink, Peter H

    2013-02-01

    Various studies have shown subjects to mislocalize cutaneous stimuli in an idiosyncratic manner. Spatial properties of individual localization behavior can be represented in the form of perceptual maps. Individual differences in these maps may reflect properties of internal body representations, and perceptual maps may therefore be a useful method for studying these representations. For this to be the case, individual perceptual maps need to be reproducible, which has not yet been demonstrated. We assessed the reproducibility of localizations measured twice on subsequent days. Ten subjects participated in the experiments. Non-painful electrocutaneous stimuli were applied at seven sites on the lower arm. Subjects localized the stimuli on a photograph of their own arm, which was presented on a tablet screen overlaying the real arm. Reproducibility was assessed by calculating intraclass correlation coefficients (ICC) for the mean localizations of each electrode site and the slope and offset of regression models of the localizations, which represent scaling and displacement of perceptual maps relative to the stimulated sites. The ICCs of the mean localizations ranged from 0.68 to 0.93; the ICCs of the regression parameters were 0.88 for the intercept and 0.92 for the slope. These results indicate a high degree of reproducibility. We conclude that localization patterns of non-painful electrocutaneous stimuli on the arm are reproducible on subsequent days. Reproducibility is a necessary property of perceptual maps for these to reflect properties of a subject's internal body representations. Perceptual maps are therefore a promising method for studying body representations.
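The ICC computation used for such test-retest designs can be sketched as follows. This implements the common two-way random effects, absolute agreement, single-measurement form ICC(2,1) on invented localization data; the abstract does not state which ICC variant was used, so treat that choice as an assumption.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    data: (n_subjects, k_sessions) array of measurements.
    """
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)   # subject means
    col_means = data.mean(axis=0)   # session means
    # Mean squares from the two-way ANOVA decomposition.
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)
    sse = ((data - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical localization offsets (mm) for 10 subjects on two days:
# a stable idiosyncratic subject effect plus day-to-day noise.
rng = np.random.default_rng(1)
subject_effect = rng.normal(0, 5, 10)
day1 = subject_effect + rng.normal(0, 1, 10)
day2 = subject_effect + rng.normal(0, 1, 10)
icc = icc_2_1(np.column_stack([day1, day2]))
print(round(icc, 2))
```

Because the simulated between-subject variability dominates the day-to-day noise, the ICC comes out high, which is the pattern the study reports for real perceptual maps.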

  15. Energy-dissipation-model for metallurgical multi-phase-systems

    Energy Technology Data Exchange (ETDEWEB)

    Mavrommatis, K.T. [Rheinisch-Westfaelische Technische Hochschule Aachen, Aachen (Germany)

    1996-12-31

    Entropy production in real processes is directly associated with the dissipation of energy. Both are potential measures for the progress of irreversible processes taking place in metallurgical systems. Many of these processes in multi-phase systems can then be modelled on the basis of the associated energy dissipation. As this quantity can often be estimated from first principles using very simple assumptions, the evolution of an overall measure of system behaviour can be studied by constructing an energy-dissipation-based model of the system. In this work a formulation of this concept, the Energy-Dissipation Model (EDM), for metallurgical multi-phase systems is given. Specific examples are studied to illustrate the concept, and the benefits as well as the range of validity are shown. This concept might be understood as a complement to usual CFD modelling of complex systems, working on a more abstract level but reproducing essential attributes of complex metallurgical systems. (author)

  16. Energy-dissipation-model for metallurgical multi-phase-systems

    Energy Technology Data Exchange (ETDEWEB)

    Mavrommatis, K T [Rheinisch-Westfaelische Technische Hochschule Aachen, Aachen (Germany)

    1997-12-31

    Entropy production in real processes is directly associated with the dissipation of energy. Both are potential measures for the progress of irreversible processes taking place in metallurgical systems. Many of these processes in multi-phase systems can then be modelled on the basis of the associated energy dissipation. As this quantity can often be estimated from first principles using very simple assumptions, the evolution of an overall measure of system behaviour can be studied by constructing an energy-dissipation-based model of the system. In this work a formulation of this concept, the Energy-Dissipation Model (EDM), for metallurgical multi-phase systems is given. Specific examples are studied to illustrate the concept, and the benefits as well as the range of validity are shown. This concept might be understood as a complement to usual CFD modelling of complex systems, working on a more abstract level but reproducing essential attributes of complex metallurgical systems. (author)

  17. Gravitational instantons as models for charged particle systems

    Science.gov (United States)

    Franchetti, Guido; Manton, Nicholas S.

    2013-03-01

    In this paper we propose ALF gravitational instantons of types A_k and D_k as models for charged particle systems. We calculate the charges of the two families. These are -(k+1) for A_k, which is proposed as a model for k+1 electrons, and 2-k for D_k, which is proposed as a model for either a particle of charge +2 and k electrons or a proton and k-1 electrons. Making use of preferred topological and metrical structures of the manifolds, namely metrically preferred representatives of middle-dimension homology classes, we construct two different energy functionals which reproduce the Coulomb interaction energy for a system of charged particles.

  18. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  19. Reproducing Kernels and Coherent States on Julia Sets

    Energy Technology Data Exchange (ETDEWEB)

    Thirulogasanthar, K., E-mail: santhar@cs.concordia.ca; Krzyzak, A. [Concordia University, Department of Computer Science and Software Engineering (Canada)], E-mail: krzyzak@cs.concordia.ca; Honnouvo, G. [Concordia University, Department of Mathematics and Statistics (Canada)], E-mail: g_honnouvo@yahoo.fr

    2007-11-15

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems.

  20. Reproducing Kernels and Coherent States on Julia Sets

    International Nuclear Information System (INIS)

    Thirulogasanthar, K.; Krzyzak, A.; Honnouvo, G.

    2007-01-01

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems

  1. On Modelling an Immune System

    OpenAIRE

    Monroy, Raúl; Saab, Rosa; Godínez, Fernando

    2004-01-01

    Immune systems of live forms have been an abundant source of inspiration to contemporary computer scientists. Problem solving strategies, stemming from known immune system phenomena, have been successfully applied to challenging problems of modern computing. However, research in artificial immune systems has overlooked establishing a coherent model of known immune system behaviour. This paper reports on a preliminary computer model of an immune system, where each immune system component...

  2. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  3. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm² and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced.
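Extracting the histogram parameters named above (peak location, peak height, mean ADC) from an ADC map can be sketched as follows, on synthetic values; the bin count, value range, and crude mask are illustrative choices, not the study's protocol.

```python
import numpy as np

# Synthetic whole-brain ADC values, in units of 1e-3 mm^2/s.
rng = np.random.default_rng(2)
adc = rng.normal(0.85, 0.12, 50_000)
adc = adc[(adc > 0.2) & (adc < 2.0)]      # crude brain/CSF mask

# Normalised histogram, so peak height is comparable across subjects.
counts, edges = np.histogram(adc, bins=128, range=(0.2, 2.0))
counts = counts / counts.sum()
centers = (edges[:-1] + edges[1:]) / 2

peak_location = centers[np.argmax(counts)]  # mode of the distribution
peak_height = counts.max()                  # normalised peak height
mean_adc = adc.mean()
print(peak_location, peak_height, mean_adc)
```

Shifting the synthetic distribution toward lower values and narrowing it reproduces the qualitative effect the study reports for higher b-values and FLAIR DWI: a lower peak location with a higher, narrower peak.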

  4. Impact of revising the National Nosocomial Infection Surveillance System definition for catheter-related bloodstream infection in ICU: reproducibility of the National Healthcare Safety Network case definition in an Australian cohort of infection control professionals.

    Science.gov (United States)

    Worth, Leon J; Brett, Judy; Bull, Ann L; McBryde, Emma S; Russo, Philip L; Richards, Michael J

    2009-10-01

    Effective and comparable surveillance for central venous catheter-related bloodstream infections (CLABSIs) in the intensive care unit requires a reproducible case definition that can be readily applied by infection control professionals. Using a questionnaire containing clinical cases, reproducibility of the National Nosocomial Infection Surveillance System (NNIS) surveillance definition for CLABSI was assessed in an Australian cohort of infection control professionals participating in the Victorian Hospital Acquired Infection Surveillance System (VICNISS). The same questionnaire was then used to evaluate the reproducibility of the National Healthcare Safety Network (NHSN) surveillance definition for CLABSI. Target hospitals were defined as large metropolitan (1A) or other large hospitals (non-1A), according to the Victorian Department of Human Services. Questionnaire responses of Centers for Disease Control and Prevention NHSN surveillance experts were used as gold standard comparator. Eighteen of 21 eligible VICNISS centers participated in the survey. Overall concordance with the gold standard was 57.1%, and agreement was highest for 1A hospitals (60.6%). The proportion of congruently classified cases varied according to NNIS criteria: criterion 1 (recognized pathogen), 52.8%; criterion 2a (skin contaminant in 2 or more blood cultures), 83.3%; criterion 2b (skin contaminant in 1 blood culture and appropriate antimicrobial therapy instituted), 58.3%; non-CLABSI cases, 51.4%. When survey questions regarding identification of cases of CLABSI criterion 2b were removed (consistent with the current NHSN definition), overall percentage concordance increased to 62.5% (72.2% for 1A centers). Further educational interventions are required to improve the discrimination of primary and secondary causes of bloodstream infection in Victorian intensive care units. 
Although reproducibility of the CLABSI case definition is relatively poor, adoption of the revised NHSN definition
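The concordance scoring described above reduces to comparing each respondent's case classifications against the gold-standard answers and reporting percentage agreement. The sketch below uses invented cases and answers; the category labels follow the CLABSI criteria named in the text (1, 2a, 2b, non-CLABSI).

```python
# Gold-standard classification for a set of questionnaire cases
# (all case data here are invented for illustration).
gold = {"case1": "CLABSI-1", "case2": "CLABSI-2a",
        "case3": "not-CLABSI", "case4": "CLABSI-2b"}

# Answers from two hypothetical infection control professionals.
respondents = [
    {"case1": "CLABSI-1", "case2": "CLABSI-2a",
     "case3": "not-CLABSI", "case4": "not-CLABSI"},
    {"case1": "not-CLABSI", "case2": "CLABSI-2a",
     "case3": "CLABSI-1", "case4": "not-CLABSI"},
]

def concordance(answers, gold):
    """Percentage of cases classified identically to the gold standard."""
    hits = sum(answers[case] == gold[case] for case in gold)
    return 100.0 * hits / len(gold)

rates = [concordance(r, gold) for r in respondents]
overall = sum(rates) / len(rates)
print(rates, overall)
```

Per-criterion concordance, as reported in the study, is obtained the same way after restricting the case set to those whose gold-standard label matches the criterion of interest.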

  5. Development of a Restaurant Management System Model

    OpenAIRE

    Fredy Jingga; Natalia Limantara

    2014-01-01

    The Restaurant Management System (RMS) model is designed to support restaurant business processes by letting waiters and chefs interact with each other without the limitations of paper. The model was developed using an Agile methodology and is based on the PHP programming language, with MySQL as the database management system. This web-based application model enables the waiter and the chef to interact in real time, from the time they a...

  6. Modelling of wastewater systems

    DEFF Research Database (Denmark)

    Bechmann, Henrik

    In this thesis, models of pollution fluxes in the inlet to 2 Danish wastewater treatment plants (WWTPs), as well as of suspended solids (SS) concentrations in the aeration tanks of an alternating WWTP and in the effluent from the aeration tanks, are developed. The latter model is furthermore used to analyze and quantify the effect of the Aeration Tank Settling (ATS) operating mode, which is used during rain events, and to propose a control algorithm for the phase lengths during ATS operation. The models are mainly formulated as state space models in continuous time and are evaluated on how well they model the fluxes in terms of the multiple correlation coefficient R2. The model of the SS concentrations in the aeration tanks of an alternating WWTP as well as in the effluent from the aeration tanks is a mass balance model based on measurements of SS in one aeration tank and in the common outlet

  7. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time-course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.
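The stochastic resonance-like mechanism can be illustrated with a one-dimensional caricature: an Ornstein-Uhlenbeck activity variable relaxes to a stable equilibrium below a locomotion threshold, and ongoing fluctuations alone push it across, producing irregularly timed bouts. All parameters here are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
dt, steps = 0.01, 200_000
tau, x_eq, noise, threshold = 1.0, 0.0, 0.7, 1.0

x, bouts, moving = 0.0, 0, False
for _ in range(steps):
    # Ornstein-Uhlenbeck dynamics: relaxation toward a stable
    # equilibrium plus ongoing activity fluctuations (Euler-Maruyama).
    x += dt * (x_eq - x) / tau + noise * np.sqrt(dt) * rng.normal()
    if x > threshold and not moving:
        bouts += 1          # an upward threshold crossing starts a bout
        moving = True
    elif x < threshold:
        moving = False      # bout ends once activity falls back

print(bouts)
```

Without the noise term the activity simply sits at its equilibrium and no bouts occur; shifting `x_eq` further below the threshold (the caricature of an odor-induced equilibrium shift) lowers the bout rate, matching the depression in locomotor frequency described above.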

  8. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.
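The closeness-centrality figure quoted at the end can be sketched on a toy review graph. The graph below is invented; normalized closeness of a node is taken as the number of other reachable nodes divided by the sum of shortest-path distances to them, one common normalization.

```python
from collections import deque

# Toy undirected "who reviewed whose code" graph (invented).
graph = {
    "a": ["b", "c"],
    "b": ["a", "c", "d"],
    "c": ["a", "b"],
    "d": ["b", "e"],
    "e": ["d"],
}

def closeness(graph, node):
    """Normalized closeness centrality of `node` via BFS distances."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

scores = sorted(closeness(graph, n) for n in graph)
median = scores[len(scores) // 2]
print(median)
```

A higher median indicates that most contributors sit close to everyone else in the review network, i.e. knowledge flows through few intermediaries, which is how the 0.46 figure is interpreted above.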

  9. Modeling and estimating system availability

    International Nuclear Information System (INIS)

    Gaver, D.P.; Chu, B.B.

    1976-11-01

    Mathematical models to infer the availability of various types of more or less complicated systems are described. The analyses presented are probabilistic in nature and consist of three parts: a presentation of various analytic models for availability; a means of deriving approximate probability limits on system availability; and a means of statistical inference of system availability from sparse data, using a jackknife procedure. Various low-order redundant systems are used as examples, but extension to more complex systems is not difficult.
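As a hedged illustration of the kind of analytic availability model the abstract refers to, the standard steady-state formulas for a single unit and for simple series and redundant configurations can be sketched as follows (generic textbook formulas, not the report's specific models):

```python
# Steady-state availability of a component and of simple redundant systems.
# Standard reliability-engineering formulas; illustrative only.

def availability(mtbf, mttr):
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf / (mtbf + mttr)

def series(avails):
    """A series system is up only if every component is up."""
    a = 1.0
    for x in avails:
        a *= x
    return a

def parallel(avails):
    """A parallel (redundant) system is down only if all components are down."""
    down = 1.0
    for x in avails:
        down *= (1.0 - x)
    return 1.0 - down

a = availability(mtbf=1000.0, mttr=10.0)  # single unit: ~0.990
print(series([a, a]))    # two in series: lower than a single unit
print(parallel([a, a]))  # 1-of-2 redundancy: higher than a single unit
```

Redundancy raises availability because both units must fail simultaneously for the system to be down, which is the intuition behind the low-order redundant examples in the report.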

  10. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  11. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built–up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....

  12. Evaluation of Land Surface Models in Reproducing Satellite-Derived LAI over the High-Latitude Northern Hemisphere. Part I: Uncoupled DGVMs

    Directory of Open Access Journals (Sweden)

    Ning Zeng

    2013-10-01

    Full Text Available Leaf Area Index (LAI) represents the total surface area of leaves above a unit area of ground and is a key variable in any vegetation model, as well as in climate models. New high resolution LAI satellite data is now available covering a period of several decades. This provides a unique opportunity to validate LAI estimates from multiple vegetation models. The objective of this paper is to compare new, satellite-derived LAI measurements with modeled output for the Northern Hemisphere. We compare monthly LAI output from eight land surface models from the TRENDY compendium with satellite data from an Artificial Neural Network (ANN) from the latest version (third generation) of GIMMS AVHRR NDVI data over the period 1986–2005. Our results show that all the models overestimate the mean LAI, particularly over the boreal forest. We also find that seven out of the eight models overestimate the length of the active vegetation-growing season, mostly due to a late dormancy as a result of a late summer phenology. Finally, we find that the models report a much larger positive trend in LAI over this period than the satellite observations suggest, which translates into a higher trend in the growing season length. These results highlight the need to incorporate a larger number of more accurate plant functional types in all models and, in particular, to improve the phenology of deciduous trees.

  13. Modeling soft interface dominated systems

    NARCIS (Netherlands)

    Lamorgese, A.; Mauri, R.; Sagis, L.M.C.

    2017-01-01

    The two main continuum frameworks used for modeling the dynamics of soft multiphase systems are the Gibbs dividing surface model, and the diffuse interface model. In the former the interface is modeled as a two dimensional surface, and excess properties such as a surface density, or surface energy

  14. Traffic Modelling for Moving-Block Train Control System

    International Nuclear Information System (INIS)

    Tang Tao; Li Keping

    2007-01-01

    This paper presents a new cellular automaton (CA) model for train control system simulation. In the proposed CA model, the driver reactions to train movements are captured by a set of update rules. The space-time diagram of traffic flow and the trajectory of train movement are used to obtain insight into the characteristic behavior of railway traffic flow. A number of simulation results demonstrate that the proposed CA model can be successfully used for simulations of railway traffic. Not only can the characteristic behavior of railway traffic flow be reproduced, but the simulated values of the minimum time headway are also close to the theoretical values.
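The update-rule mechanism the abstract describes can be sketched with a generic Nagel-Schreckenberg-style cellular automaton step (a standard CA traffic model used here purely for illustration; the paper's actual moving-block rules are not reproduced):

```python
import random

def step(positions, v, vmax, track_len, p_slow=0.1):
    """One NaSch-style CA update on a periodic track.

    positions: sorted train cell positions; v: current speeds.
    Illustrative only -- not the paper's moving-block rules.
    """
    n = len(positions)
    new_v = []
    for i in range(n):
        # Free cells to the train ahead (periodic boundary).
        gap = (positions[(i + 1) % n] - positions[i] - 1) % track_len
        vi = min(v[i] + 1, vmax)          # accelerate toward vmax
        vi = min(vi, gap)                 # brake to avoid the train ahead
        if vi > 0 and random.random() < p_slow:
            vi -= 1                       # random slowdown (driver reaction)
        new_v.append(vi)
    new_pos = [(positions[i] + new_v[i]) % track_len for i in range(n)]
    return new_pos, new_v

# Deterministic example (p_slow=0): two trains accelerate from rest.
pos, vel = step([0, 10], [0, 0], vmax=2, track_len=100, p_slow=0.0)
print(pos, vel)  # → [1, 11] [1, 1]
```

Iterating `step` and plotting position against time yields exactly the kind of space-time diagram the abstract uses to study headways.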

  15. Safeguards system effectiveness modeling

    International Nuclear Information System (INIS)

    Bennett, H.A.; Boozer, D.D.; Chapman, L.D.; Daniel, S.L.; Engi, D.; Hulme, B.L.; Varnado, G.B.

    1976-01-01

    A general methodology for the comparative evaluation of physical protection system effectiveness at nuclear facilities is presently under development. The approach is applicable to problems of sabotage or theft at fuel cycle facilities. The overall methodology and the primary analytic techniques used to assess system effectiveness are briefly outlined

  16. Safeguards system effectiveness modeling

    International Nuclear Information System (INIS)

    Boozer, D.D.; Hulme, B.L.; Daniel, S.L.; Varnado, G.B.; Bennett, H.A.; Chapman, L.D.; Engi, D.

    1976-09-01

    A general methodology for the comparative evaluation of physical protection system effectiveness at nuclear facilities is presently under development. The approach is applicable to problems of sabotage or theft at fuel cycle facilities. In this paper, the overall methodology and the primary analytic techniques used to assess system effectiveness are briefly outlined

  17. Safeguards system effectiveness modeling

    International Nuclear Information System (INIS)

    Bennett, H.A.; Boozer, D.D.; Chapman, L.D.; Daniel, S.L.; Engi, D.; Hulme, B.L.; Varnado, G.B.

    1976-01-01

    A general methodology for the comparative evaluation of physical protection system effectiveness at nuclear facilities is presently under development. The approach is applicable to problems of sabotage or theft at fuel cycle facilities. In this paper, the overall methodology and the primary analytic techniques used to assess system effectiveness are briefly outlined

  18. Constructing Agent Model for Virtual Training Systems

    Science.gov (United States)

    Murakami, Yohei; Sugimoto, Yuki; Ishida, Toru

    Constructing highly realistic agents is essential if agents are to be employed in virtual training systems. In training for collaboration based on face-to-face interaction, the generation of emotional expressions is one key element. In training for guidance based on one-to-many interaction, such as direction giving for evacuations, emotional expressions must be supplemented by diverse agent behaviors to make the training realistic. To reproduce diverse behavior, we characterize agents by using various combinations of operation rules instantiated by the user operating the agent. To accomplish this goal, we introduce a user modeling method based on participatory simulations. These simulations enable us to acquire the information observed by each user in the simulation and the operating history. Using these data and domain knowledge including known operation rules, we can generate an explanation for each behavior. Moreover, applying hypothetical reasoning, which offers consistent selection of hypotheses, to the generation of explanations allows us to use otherwise incompatible operation rules as domain knowledge. In order to validate the proposed modeling method, we apply it to the acquisition of an evacuee's model in a fire-drill experiment. We successfully acquire a subject's model corresponding to the results of an interview with the subject.

  19. ECONOMIC MODELING STOCKS CONTROL SYSTEM: SIMULATION MODEL

    OpenAIRE

    Климак, М.С.; Войтко, С.В.

    2016-01-01

    Theoretical and applied aspects of the development of simulation models to predict the optimal development of production systems that create tangible products and services are considered. It is proved that the process of inventory control needs economic and mathematical modeling in view of the complexity of theoretical studies. A simulation model of stock control is presented that allows making management decisions in production logistics.

  20. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys fabricated by selective laser melting: Al-12Si, Cu-10Sn, and 316L (face-centered cubic structure), and CoCrMo and commercially pure Ti (CP-Ti) (hexagonal close-packed structure). The room temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  1. Adjustments in the Almod 3W2 code models for reproducing the net load trip test in Angra I nuclear power plant

    International Nuclear Information System (INIS)

    Camargo, C.T.M.; Madeira, A.A.; Pontedeiro, A.C.; Dominguez, L.

    1986-09-01

    The recorded traces obtained from the net load trip test in Angra I NPP yielded the opportunity to make fine adjustments in the ALMOD 3W2 code models. The changes are described and the results are compared against real plant data. (Author) [pt

  2. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
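The histogram-equalization step underlying such enhancements can be sketched generically in NumPy (plain global equalization via the cumulative distribution; IDL's implementation and its adaptive variant differ in detail):

```python
import numpy as np

def hist_equalize(img, levels=256):
    """Global histogram equalization of an 8-bit grayscale image.

    A generic sketch of the technique (map gray levels through the
    normalized cumulative histogram); not IDL's implementation.
    Assumes the image is not constant-valued.
    """
    hist, _ = np.histogram(img.flatten(), bins=levels, range=(0, levels))
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Stretch the CDF so occupied gray levels span the full [0, levels-1] range.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1))
    return lut[img].astype(np.uint8)

# Tiny example: a low-contrast image gets spread over the full range.
img = np.array([[50, 50], [51, 200]], dtype=np.uint8)
print(hist_equalize(img))  # darkest level maps to 0, brightest to 255
```

This global mapping is fully deterministic, which is exactly the reproducibility property the abstract contrasts with manual Photoshop adjustment.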

  3. ASN reputation system model

    Science.gov (United States)

    Hutchinson, Steve; Erbacher, Robert F.

    2015-05-01

    Network security monitoring is currently challenged by its reliance on human analysts and the inability of tools to generate indications and warnings for previously unknown attacks. We propose a reputation system based on IP address set membership within the Autonomous System Number (ASN) system. Essentially, a metric generated from the historic behavior, or misbehavior, of nodes within a given ASN can be used to predict future behavior and provide a mechanism to locate network activity requiring inspection. This will reinforce notifications and warnings and lead to inspection of ASNs known to be problematic even if initial inspection leads to interpretation of the event as innocuous. We developed proof-of-concept capabilities to generate the IP address to ASN set membership and analyze the impact of the results. These results clearly show that while some ASNs are one-offs with individual or small numbers of misbehaving IP addresses, there are definitive ASNs with a history of long-term and widespread misbehaving IP addresses. These ASNs with long histories are what we are especially interested in; they will provide an additional correlation metric for the human analyst and lead to new tools to aid remediation of these IP address blocks.
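A minimal sketch of an ASN reputation metric of the kind described, assuming only a feed of flagged IPs and an IP-to-ASN lookup table (both inputs are hypothetical here; in practice the mapping would come from BGP or registry data):

```python
from collections import defaultdict

def asn_reputation(events, ip_to_asn):
    """Score each ASN by its count of distinct misbehaving IP addresses.

    events: iterable of IPs flagged as misbehaving (may repeat).
    ip_to_asn: dict mapping IP -> ASN (hypothetical lookup table).
    Higher score = worse history; one-off ASNs score near 1,
    while ASNs with widespread misbehavior accumulate high scores.
    """
    bad_ips = defaultdict(set)
    for ip in events:
        asn = ip_to_asn.get(ip)
        if asn is not None:
            bad_ips[asn].add(ip)
    return {asn: len(ips) for asn, ips in bad_ips.items()}

# Documentation-range IPs and private-use ASNs, purely for illustration.
ip_to_asn = {"198.51.100.7": 64500, "198.51.100.9": 64500, "203.0.113.5": 64501}
events = ["198.51.100.7", "198.51.100.9", "198.51.100.7", "203.0.113.5"]
print(asn_reputation(events, ip_to_asn))  # → {64500: 2, 64501: 1}
```

A real system would weight scores by recency and ASN size; this sketch only shows the set-membership aggregation the abstract proposes.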

  4. Stochastic Modelling of Energy Systems

    DEFF Research Database (Denmark)

    Andersen, Klaus Kaae

    2001-01-01

    is that the model structure has to be adequate for practical applications, such as system simulation, fault detection and diagnosis, and design of control strategies. This also reflects on the methods used for identification of the component models. The main result from this research is the identification......In this thesis dynamic models of typical components in Danish heating systems are considered. Emphasis is made on describing and evaluating mathematical methods for identification of such models, and on presentation of component models for practical applications. The thesis consists of seven...... research papers (case studies) together with a summary report. Each case study takes its starting point in typical heating system components, and both the applied mathematical modelling methods and the application aspects are considered. The summary report gives an introduction to the scope

  5. Stress Erythropoiesis Model Systems.

    Science.gov (United States)

    Bennett, Laura F; Liao, Chang; Paulson, Robert F

    2018-01-01

    Bone marrow steady-state erythropoiesis maintains erythroid homeostasis throughout life. This process constantly generates new erythrocytes to replace the senescent erythrocytes that are removed by macrophages in the spleen. In contrast, anemic or hypoxic stress induces a physiological response designed to increase oxygen delivery to the tissues. Stress erythropoiesis is a key component of this response. It is best understood in mice, where it is extramedullary, occurring in the adult spleen and liver and in the fetal liver during development. Stress erythropoiesis utilizes progenitor cells and signals that are distinct from those of bone marrow steady-state erythropoiesis. Because of that observation, many genes may play a role in stress erythropoiesis despite having no effect on steady-state erythropoiesis. In this chapter, we will discuss in vivo and in vitro techniques to study stress erythropoiesis in mice and how the in vitro culture system can be extended to study human stress erythropoiesis.

  6. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems-often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num...

  7. Magnetic Active Agent Release System (MAARS): evaluation of a new way for a reproducible, externally controlled drug release into the small intestine.

    Science.gov (United States)

    Dietzel, Christian T; Richert, Hendryk; Abert, Sandra; Merkel, Ute; Hippius, Marion; Stallmach, Andreas

    2012-08-10

    Human absorption studies are used to test new drug candidates for their bioavailability in different regions of the gastrointestinal tract. In order to replace invasive techniques (e.g. oral or rectal intubation), a variety of externally controlled capsule-based drug release systems has been developed. Most of these use ionizing radiation, internal batteries, heating elements or even chemicals for the localization and disintegration process of the capsule. This poses potential harm to volunteers and patients. We report on a novel technique called "Magnetic Active Agent Release System" (MAARS), which uses purely magnetic effects for this purpose. In our trial, thirteen healthy volunteers underwent a complete monitoring and release procedure of 250 mg acetylsalicylic acid (ASA) targeting the flexura duodenojejunalis and the mid-part of the jejunum. During all experiments MAARS initiated a sufficient drug release and was well tolerated. We could also show that the absorption of ASA is about two times faster in the more proximal region of the flexura duodenojejunalis, with a tmax of 47±13 min, compared to the more distal jejunum, with tmax values of 100±10 min (p=0.031). Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Convergence of macrostates under reproducible processes

    International Nuclear Information System (INIS)

    Rau, Jochen

    2010-01-01

    I show that whenever a system undergoes a reproducible macroscopic process the mutual distinguishability of macrostates, as measured by their relative entropy, diminishes. This extends the second law which regards only ordinary entropies, and hence only the distinguishability between macrostates and one specific reference state (equidistribution). The new result holds regardless of whether the process is linear or nonlinear. Its proof hinges on the monotonicity of quantum relative entropy under arbitrary coarse grainings, even those that cannot be represented by trace-preserving completely positive maps.
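The classical analogue of this result, the data-processing inequality D(Tp||Tq) <= D(p||q) for a stochastic coarse-graining T, can be checked numerically (a classical sketch of the quantum statement; the distributions and map below are arbitrary examples):

```python
import numpy as np

def rel_entropy(p, q):
    """Classical relative entropy D(p||q) = sum_i p_i log(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# A column-stochastic matrix T acts as a coarse graining on distributions.
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])
p = np.array([0.7, 0.3])
q = np.array([0.4, 0.6])

# Monotonicity: coarse graining can only reduce distinguishability.
print(rel_entropy(T @ p, T @ q) <= rel_entropy(p, q))  # → True
```

This is exactly the sense in which the mutual distinguishability of macrostates diminishes under any reproducible (coarse-graining) process.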

  9. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  10. Data management system performance modeling

    Science.gov (United States)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
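The Rate Monotonic Analysis mentioned above has a well-known sufficient schedulability test, the Liu-Layland utilization bound, which can be sketched generically (a textbook check, not anything DMS-specific):

```python
def rma_schedulable(tasks):
    """Liu-Layland sufficient test for rate-monotonic scheduling.

    tasks: list of (execution_time, period) pairs for periodic tasks.
    The set is schedulable if total utilization U = sum(C_i / T_i)
    does not exceed the bound n * (2^(1/n) - 1).
    (Sufficient, not necessary: sets above the bound may still be
    schedulable, which is one limit of static analysis.)
    """
    n = len(tasks)
    u = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return u <= bound, u, bound

# Three periodic tasks: utilization 0.65 vs. a bound of ~0.78 for n=3.
ok, u, bound = rma_schedulable([(1, 4), (1, 5), (2, 10)])
print(ok, round(u, 3), round(bound, 3))  # → True 0.65 0.78
```

The conservatism of this bound is one reason the paper argues that dynamic modeling is needed alongside static RMA.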

  11. A Mouse Model That Reproduces the Developmental Pathways and Site Specificity of the Cancers Associated With the Human BRCA1 Mutation Carrier State

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2015-10-01

    Full Text Available Predisposition to breast and extrauterine Müllerian carcinomas in BRCA1 mutation carriers is due to a combination of cell-autonomous consequences of BRCA1 inactivation on cell cycle homeostasis superimposed on cell-nonautonomous hormonal factors magnified by the effects of BRCA1 mutations on hormonal changes associated with the menstrual cycle. We used the Müllerian inhibiting substance type 2 receptor (Mis2r) promoter and a truncated form of the Follicle stimulating hormone receptor (Fshr) promoter to introduce conditional knockouts of Brca1 and p53 not only in mouse mammary and Müllerian epithelia, but also in organs that control the estrous cycle. Sixty percent of the double mutant mice developed invasive Müllerian and mammary carcinomas. Mice carrying heterozygous mutations in Brca1 and p53 also developed invasive tumors, albeit at a lesser (30%) rate, in which the wild type alleles were no longer present due to loss of heterozygosity. While mice carrying heterozygous mutations in both genes developed mammary tumors, none of the mice carrying only a heterozygous p53 mutation developed such tumors (P < 0.0001), attesting to a role for Brca1 mutations in tumor development. This mouse model is attractive to investigate cell-nonautonomous mechanisms associated with cancer predisposition in BRCA1 mutation carriers and to investigate the merit of chemo-preventive drugs targeting such mechanisms.

  12. Effect of Initial Conditions on Reproducibility of Scientific Research

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many reasons have been offered as explanations for failure to reproduce scientific research findings, from fraud to issues related to the design, conduct, analysis, or publishing of scientific research. We also postulate a sensitive dependence on initial conditions, by which small changes can result in large differences in research findings when reproduction is attempted at later times. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the input from the logistic equation into a logistic map function to model the stability of the results in repeated experiments over time. We illustrate the approach by modeling the effects of different factors on the choice of correct treatment. Results: We found that reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the changes in the baseline conditions vary by about 3.5 to about 4 between experiments, no research findings can be reproduced. However, when the rate of change between the experiments is ≤2.5, the results become highly predictable between the experiments. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between the experiments. Better control of the baseline conditions between experiments may help improve the reproducibility of scientific findings. PMID:25132705
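The logistic-map component of the authors' model can be sketched directly; the contrast below between a low rate parameter (stable, reproducible) and a rate near 3.9 (chaotic, irreproducible) mirrors the thresholds reported in the abstract, though the full model also involves the logistic regression stage:

```python
def logistic_trajectory(r, x0=0.3, n=200):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k) n times."""
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

# Stable regime (r <= 2.5): trajectories converge to the same fixed point
# regardless of small differences in the starting condition.
a = logistic_trajectory(2.5, x0=0.30)
b = logistic_trajectory(2.5, x0=0.31)
print(abs(a - b) < 1e-9)  # → True: results are reproducible

# Chaotic regime (r near 3.5-4): a tiny change in initial conditions
# produces a completely different outcome.
c = logistic_trajectory(3.9, x0=0.30)
d = logistic_trajectory(3.9, x0=0.30000001)
print(c != d)  # → True: trajectories have decorrelated
```

For r = 2.5 the map contracts onto the fixed point x* = 1 - 1/r = 0.6, which is why the repeated "experiments" agree; above roughly r = 3.57 the map is chaotic and no outcome is reproducible.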

  13. Evaluation of single- and dual-porosity models for reproducing the release of external and internal tracers from heterogeneous waste-rock piles.

    Science.gov (United States)

    Blackmore, S; Pedretti, D; Mayer, K U; Smith, L; Beckie, R D

    2018-05-30

    Accurate predictions of solute release from waste-rock piles (WRPs) are paramount for decision making in mining-related environmental processes. Tracers provide information that can be used to estimate effective transport parameters and understand mechanisms controlling the hydraulic and geochemical behavior of WRPs. It is shown that internal tracers (i.e. initially present) together with external (i.e. applied) tracers provide complementary and quantitative information to identify transport mechanisms. The analysis focuses on two experimental WRPs, Pile 4 and Pile 5 at the Antamina Mine site (Peru), where both an internal chloride tracer and an externally applied bromide tracer were monitored in discharge over three years. The results suggest that external tracers provide insight into transport associated with relatively fast flow regions that are activated during higher-rate recharge events. In contrast, internal tracers provide insight into mechanisms controlling solute release from lower-permeability zones within the piles. Rate-limited diffusive processes, which can be mimicked by nonlocal mass-transfer models, affect both internal and external tracers. The sensitivity of the mass-transfer parameters to heterogeneity is higher for external tracers than for internal tracers, as indicated by the different mean residence times characterizing the flow paths associated with each tracer. The joint use of internal and external tracers provides a more comprehensive understanding of the transport mechanisms in WRPs. In particular, the tracer tests support the notion that a multi-porosity conceptualization of WRPs is more adequate for capturing key mechanisms than a dual-porosity conceptualization. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Prediction of lung tumour position based on spirometry and on abdominal displacement: Accuracy and reproducibility

    International Nuclear Information System (INIS)

    Hoisak, Jeremy D.P.; Sixel, Katharina E.; Tirona, Romeo; Cheung, Patrick C.F.; Pignol, Jean-Philippe

    2006-01-01

    Background and purpose: A simulation investigating the accuracy and reproducibility of a tumour motion prediction model over clinical time frames is presented. The model is formed from surrogate and tumour motion measurements, and used to predict the future position of the tumour from surrogate measurements alone. Patients and methods: Data were acquired from five non-small cell lung cancer patients, on 3 days. Measurements of respiratory volume by spirometry and abdominal displacement by a real-time position tracking system were acquired simultaneously with X-ray fluoroscopy measurements of superior-inferior tumour displacement. A model of tumour motion was established and used to predict future tumour position, based on surrogate input data. The calculated position was compared against true tumour motion as seen on fluoroscopy. Three different imaging strategies, pre-treatment, pre-fraction and intrafractional imaging, were employed in establishing the fitting parameters of the prediction model. The impact of each imaging strategy upon accuracy and reproducibility was quantified. Results: When establishing the predictive model using pre-treatment imaging, four of five patients exhibited poor interfractional reproducibility for either surrogate in subsequent sessions. Simulating the formulation of the predictive model prior to each fraction resulted in improved interfractional reproducibility. The accuracy of the prediction model was only improved in one of five patients when intrafractional imaging was used. Conclusions: Employing a prediction model established from measurements acquired at planning resulted in localization errors. Pre-fractional imaging improved the accuracy and reproducibility of the prediction model. Intrafractional imaging was of less value, suggesting that the accuracy limit of a surrogate-based prediction model is reached with once-daily imaging
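A surrogate-based prediction model of the general kind described can be sketched as a least-squares fit of tumour displacement against the abdominal surrogate signal (the data below are invented for illustration; the paper's actual model form is not specified here):

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b.

    A stand-in for the paper's (unspecified) surrogate-to-tumour model:
    fit once from simultaneous fluoroscopy/surrogate measurements, then
    predict tumour position from the surrogate alone.
    """
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Invented training data: abdominal displacement (mm) vs. tumour SI
# displacement (mm) measured simultaneously, as in the imaging session.
abdomen = [0.0, 2.0, 4.0, 6.0]
tumour = [0.0, 5.1, 9.9, 15.0]
a, b = fit_linear(abdomen, tumour)
# Predict tumour position from a new surrogate reading alone.
print(round(a * 3.0 + b, 2))  # → 7.5
```

Re-fitting `a` and `b` before each fraction corresponds to the pre-fractional imaging strategy that the study found improved accuracy and reproducibility.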

  15. Model systems in photosynthesis research

    International Nuclear Information System (INIS)

    Katz, J.J.; Hindman, J.C.

    1981-01-01

    After a general discussion of model studies in photosynthesis research, three recently developed model systems are described. The current status of covalently linked chlorophyll pairs as models for P700 and P865 is first briefly reviewed. Mg-tris(pyrochlorophyllide)1,1,1-tris(hydroxymethyl) ethane triester in its folded configuration is then discussed as a rudimentary antenna-photoreaction center model. Finally, self-assembled chlorophyll systems that contain a mixture of monomeric, oligomeric and special pair chlorophyll are shown to have fluorescence emission characteristics that resemble those of intact Tribonema aequale at room temperature, in that both show fluorescence emission at 675 and 695 nm. In the self-assembled systems the wavelength of the emitted fluorescence depends on the wavelength of excitation, arguing that energy transfer between different chlorophyll species in these systems may be more complex than previously suspected.

  16. Mobility Models for Systems Evaluation

    Science.gov (United States)

    Musolesi, Mirco; Mascolo, Cecilia

    Mobility models are used to simulate and evaluate the performance of mobile wireless systems and of the algorithms and protocols underlying them. The definition of realistic mobility models is one of the most critical and, at the same time, difficult aspects of the simulation of applications and systems designed for mobile environments. There are essentially two possible types of mobility patterns that can be used to evaluate mobile network protocols and algorithms by means of simulations: traces and synthetic models [130]. Traces are obtained by means of measurements of deployed systems and usually consist of logs of connectivity or location information, whereas synthetic models are mathematical models, such as sets of equations, which try to capture the movement of the devices.
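The classic random waypoint model, a standard example of the synthetic class described above, can be sketched as follows (parameter values are illustrative):

```python
import random

def random_waypoint(steps, area=(1000.0, 1000.0), vmin=1.0, vmax=10.0, seed=42):
    """Generate a 2-D random-waypoint trace.

    Each node repeatedly picks a uniform random destination in the area
    and a uniform random speed, moves toward the destination, then picks
    again. A classic synthetic mobility model; pause times omitted.
    """
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    trace = [(x, y)]
    dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
    speed = rng.uniform(vmin, vmax)
    for _ in range(steps):
        dx, dy = dest[0] - x, dest[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= speed:  # waypoint reached: choose a new one
            x, y = dest
            dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
            speed = rng.uniform(vmin, vmax)
        else:              # move one time step toward the waypoint
            x, y = x + speed * dx / dist, y + speed * dy / dist
        trace.append((x, y))
    return trace

trace = random_waypoint(100)
print(len(trace))  # → 101
```

Feeding such traces into a network simulator gives the connectivity patterns against which protocols are evaluated; real traces, by contrast, come from measurement logs.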

  17. Repeatability and reproducibility of Population Viability Analysis (PVA) and the implications for threatened species management

    Directory of Open Access Journals (Sweden)

    Clare Morrison

    2016-08-01

    Full Text Available Conservation triage focuses on prioritizing species, populations or habitats based on urgency, biodiversity benefits, recovery potential as well as cost. Population Viability Analysis (PVA) is frequently used in population-focused conservation prioritizations. The critical nature of many of these management decisions requires that PVA models be repeatable and reproducible in order to reliably rank species and/or populations quantitatively. This paper assessed the repeatability and reproducibility of a subset of previously published PVA models. We attempted to rerun baseline models from 90 publicly available PVA studies published between 2000 and 2012 using the two most common PVA modelling software programs, VORTEX and RAMAS-GIS. Forty percent (n = 36) failed, 50% (n = 45) were both repeatable and reproducible, and 10% (n = 9) had missing baseline models. Repeatability was not linked to taxa, IUCN category, PVA program version used, year published or the quality of the publication outlet, suggesting that the problem is systemic within the discipline. Complete and systematic presentation of PVA parameters and results is needed to ensure that the scientific input into conservation planning is both robust and reliable, thereby increasing the chances of making decisions that are both beneficial and defensible. The implications for conservation triage may be far reaching if population viability models cannot be reproduced with confidence, thus undermining their intended value.

  18. Stochastic Models of Polymer Systems

    Science.gov (United States)

    2016-01-01

    Distribution Unlimited. Final Report: Stochastic Models of Polymer Systems. Princeton University, PO Box 0036, 87 Prospect Avenue - 2nd floor, Princeton, NJ 08544-2020; 14-Mar-2014. The views, opinions and/or findings contained in this report are those of the...

  19. National Energy Outlook Modelling System

    Energy Technology Data Exchange (ETDEWEB)

    Volkers, C.M. [ECN Policy Studies, Petten (Netherlands)

    2013-12-15

    For over 20 years, the Energy research Centre of the Netherlands (ECN) has been developing the National Energy Outlook Modelling System (NEOMS) for Energy projections and policy evaluations. NEOMS enables 12 energy models of ECN to exchange data and produce consistent and detailed results.

  20. Intervendor consistency and reproducibility of left ventricular 2D global and regional strain with two different high-end ultrasound systems.

    Science.gov (United States)

    Shiino, Kenji; Yamada, Akira; Ischenko, Matthew; Khandheria, Bijoy K; Hudaverdi, Mahala; Speranza, Vicki; Harten, Mary; Benjamin, Anthony; Hamilton-Craig, Christian R; Platts, David G; Burstow, Darryl J; Scalia, Gregory M; Chan, Jonathan

    2017-06-01

    We aimed to assess intervendor agreement of global longitudinal strain (GLS) and regional longitudinal strain measured by vendor-specific software after the EACVI/ASE Industry Task Force Standardization Initiatives for Deformation Imaging. Fifty-five patients underwent prospective dataset acquisitions on the same day by the same operator using two commercially available cardiac ultrasound systems (GE Vivid E9 and Philips iE33). GLS and regional peak longitudinal strain were analyzed offline using the corresponding vendor-specific software (EchoPAC BT13 and QLAB version 10.3). Absolute mean GLS measurements were similar between the two vendors (GE -17.5 ± 5.2% vs. Philips -18.9 ± 5.1%, P = 0.15). There was excellent intervendor correlation of GLS by the same observer (r = 0.94, P < 0.0001; limits of agreement (LOA) -4.8 to 2.2%). Intervendor comparisons for regional longitudinal strain by coronary artery territory distribution were: LAD: r = 0.85, P < 0.0001; bias 0.5%, LOA -5.3 to 6.4%; RCA: r = 0.88, P < 0.0001; bias -2.4%, LOA -8.6 to 3.7%; LCX: r = 0.76, P < 0.0001; bias -5.3%, LOA -10.6 to 2.0%. Intervendor comparisons for regional longitudinal strain by LV level were: basal: r = 0.86, P < 0.0001; bias -3.6%, LOA -9.9 to 2.0%; mid: r = 0.90, P < 0.0001; bias -2.6%, LOA -7.8 to 2.6%; apical: r = 0.74; P < 0.0001; bias -1.3%, LOA -9.4 to 6.8%. Intervendor agreement in GLS and regional strain measurements has significantly improved after the EACVI/ASE Task Force Strain Standardization Initiatives. However, wide LOA still exist, especially for regional strain measurements, which remains relevant when considering vendor-specific software for serial measurements. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2016. For permissions please email: journals.permissions@oup.com.
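    The bias and 95% limits of agreement quoted above come from a Bland-Altman style comparison of paired measurements, which is straightforward to reproduce; the GLS values below are hypothetical, not the study's data.

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement between
    paired measurements from two methods."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)   # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired GLS readings (%) from two vendors
ge_gls      = [-17.0, -18.5, -16.2, -19.1, -17.8]
philips_gls = [-18.2, -19.0, -17.5, -19.6, -18.1]
bias, (loa_low, loa_high) = bland_altman(ge_gls, philips_gls)
```

    A wide interval between `loa_low` and `loa_high` signals exactly the serial-measurement caveat the record raises, even when the bias itself is small.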

  1. Aerodynamic and Mechanical System Modelling

    DEFF Research Database (Denmark)

    Jørgensen, Martin Felix

    This thesis deals with mechanical multibody-systems applied to the drivetrain of a 500 kW wind turbine. Particular focus has been on gearbox modelling of wind turbines. The main part of the present project involved programming multibody systems to investigate the connection between forces, moments...

  2. Experimental Modeling of Dynamic Systems

    DEFF Research Database (Denmark)

    Knudsen, Morten Haack

    2006-01-01

    An engineering course, Simulation and Experimental Modeling, has been developed that is based on a method for direct estimation of physical parameters in dynamic systems. Compared with classical system identification, the method appears to be easier to understand, apply, and combine with physical...

  3. A Neuron Model Based Ultralow Current Sensor System for Bioapplications

    Directory of Open Access Journals (Sweden)

    A. K. M. Arifuzzman

    2016-01-01

    Full Text Available An ultralow current sensor system based on the Izhikevich neuron model is presented in this paper. The Izhikevich neuron model has been used for its superior computational efficiency and greater biological plausibility over other well-known neuron spiking models. Of the many biological neuron spiking features, regular spiking, chattering, and neostriatal spiny projection spiking have been reproduced by adjusting the parameters associated with the model at hand. This paper also presents a modified interpretation of the regular spiking feature in which the firing pattern is similar to that of regular spiking but offers an improved dynamic range. The sensor current ranges between 2 pA and 8 nA and exhibits linearity in the range of 0.9665 to 0.9989 for the different spiking features. The efficacy of the sensor system in detecting low amounts of current, along with its high linearity, makes it very suitable for biomedical applications.
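    The Izhikevich model itself is compact enough to sketch directly; the defaults below are the standard regular-spiking parameter set from Izhikevich's published model, while the input currents and step sizes are illustrative choices for this sketch.

```python
def izhikevich(I, a=0.02, b=0.2, c=-65.0, d=8.0, steps=1000, dt=0.5):
    """Euler simulation of the Izhikevich neuron model with constant input
    current I (defaults are the standard regular-spiking parameters).
    Returns the number of spikes fired."""
    v, u = c, b * c          # membrane potential (mV) and recovery variable
    spikes = 0
    for _ in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:        # spike detected: reset membrane state
            v, u, spikes = c, u + d, spikes + 1
    return spikes

# A stronger input current should produce more spikes
strong, weak = izhikevich(10.0), izhikevich(2.0)
```

    Chattering or other spiking features are obtained the same way by changing the (a, b, c, d) parameters, which is the tuning knob the record describes.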

  4. Modeling Multi-Level Systems

    CERN Document Server

    Iordache, Octavian

    2011-01-01

    This book is devoted to modeling of multi-level complex systems, a challenging domain for engineers, researchers and entrepreneurs confronted with the transition from learning and adaptability to evolvability and autonomy for technologies, devices and problem solving methods. Chapter 1 introduces multi-scale and multi-level systems and highlights their presence in different domains of science and technology. Methodologies such as random systems, non-Archimedean analysis and category theory, and specific techniques such as model categorification and integrative closure, are presented in chapter 2. Chapters 3 and 4 describe polystochastic models, PSM, and their developments. The categorical formulation of integrative closure offers the general PSM framework, which serves as a flexible guideline for a large variety of multi-level modeling problems. Focusing on chemical engineering, pharmaceutical and environmental case studies, chapters 5 to 8 analyze mixing, turbulent dispersion and entropy production for multi-scale sy...

  5. Development of a Restaurant Management System Model (Pembangunan Model Restaurant Management System)

    Directory of Open Access Journals (Sweden)

    Fredy Jingga

    2014-12-01

    Full Text Available The Restaurant Management System (RMS) model is designed to support restaurant business processes by letting the waitstaff and the chef interact without the limitations of paper orders. The model was developed using an Agile methodology and implemented in the PHP programming language, with MySQL as the database management system. This web-based application model enables the waitstaff and the chef to interact in real time: from the moment the customer's order is accepted, the chef knows what to cook and the waitstaff can check off whether each order has been fulfilled, through to the cashier calculating the bill and the payment accepted from the customer.

  6. Reproducible research: a minority opinion

    Science.gov (United States)

    Drummond, Chris

    2018-01-01

    Reproducible research, a growing movement within many scientific fields, including machine learning, would require that the code used to generate the experimental results be published along with any paper. Probably the most compelling argument for this is that it simply follows good scientific practice, established over the years by the greats of science. The implication is that failure to follow such a practice is unscientific, not a label any machine learning researcher would like to carry. It is further claimed that misconduct is causing a growing crisis of confidence in science, and that, without this practice being enforced, science would inevitably fall into disrepute. This viewpoint is becoming ubiquitous, but here I offer a differing opinion. I argue that, far from being central to science, what is being promulgated is a narrow interpretation of how science works. I contend that the consequences are somewhat overstated. I would also contend that the effort necessary to meet the movement's aims, and the general attitude it engenders, would not serve well any of the research disciplines, including our own.

  7. Nutrient cycle benchmarks for earth system land model

    Science.gov (United States)

    Zhu, Q.; Riley, W. J.; Tang, J.; Zhao, L.

    2017-12-01

    Projecting future biosphere-climate feedbacks using Earth system models (ESMs) relies heavily on robust modeling of land surface carbon dynamics. More importantly, soil nutrient (particularly, nitrogen (N) and phosphorus (P)) dynamics strongly modulate carbon dynamics, such as plant sequestration of atmospheric CO2. Prevailing ESM land models all consider nitrogen as a potentially limiting nutrient, and several consider phosphorus. However, including nutrient cycle processes in ESM land models potentially introduces large uncertainties that could be identified and addressed by improved observational constraints. We describe the development of two nutrient cycle benchmarks for ESM land models: (1) nutrient partitioning between plants and soil microbes inferred from 15N and 33P tracers studies and (2) nutrient limitation effects on carbon cycle informed by long-term fertilization experiments. We used these benchmarks to evaluate critical hypotheses regarding nutrient cycling and their representation in ESMs. We found that a mechanistic representation of plant-microbe nutrient competition based on relevant functional traits best reproduced observed plant-microbe nutrient partitioning. We also found that for multiple-nutrient models (i.e., N and P), application of Liebig's law of the minimum is often inaccurate. Rather, the Multiple Nutrient Limitation (MNL) concept better reproduces observed carbon-nutrient interactions.
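    The contrast drawn above between Liebig's law of the minimum and multiple nutrient limitation can be sketched with toy limitation scalars; the multiplicative form used for MNL below is one common illustrative choice, not necessarily the formulation used in these benchmarks.

```python
def liebig(f_n, f_p):
    """Liebig's law of the minimum: growth is set by the scarcest nutrient."""
    return min(f_n, f_p)

def mnl(f_n, f_p):
    """Multiple Nutrient Limitation: both nutrients co-limit growth
    (illustrated here with a multiplicative form)."""
    return f_n * f_p

# Limitation scalars in [0, 1], where 1 means the nutrient is not limiting;
# the values are hypothetical N and P limitation factors.
f_n, f_p = 0.8, 0.6
```

    When both nutrients are partially limiting, the multiplicative MNL form predicts stronger limitation than Liebig's minimum, which is one way such benchmarks can discriminate between the two hypotheses.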

  8. Models of complex attitude systems

    DEFF Research Database (Denmark)

    Sørensen, Bjarne Taulo

    Existing research on public attitudes towards agricultural production systems is largely descriptive, abstracting from the processes through which members of the general public generate their evaluations of such systems. The present paper adopts a systems perspective on such evaluations... that evaluative affect propagates through the system in such a way that the system becomes evaluatively consistent and operates as a schema for the generation of evaluative judgments. In the empirical part of the paper, the causal structure of an attitude system from which people derive their evaluations of pork... search algorithms and structural equation models. The results suggest that evaluative judgments of the importance of production system attributes are generated in a schematic manner, driven by personal value orientations. The effect of personal value orientations was strong and largely unmediated...

  9. Stirling Engine Dynamic System Modeling

    Science.gov (United States)

    Nakis, Christopher G.

    2004-01-01

    The Thermo-Mechanical Systems branch at the Glenn Research Center focuses a large amount of time on Stirling engines. These engines will be used on missions where solar power is inefficient, especially in deep space. I work with Tim Regan and Ed Lewandowski, who are currently developing and validating a mathematical model for the Stirling engines. This model incorporates all aspects of the system including mechanical, electrical and thermodynamic components. Modeling is done through Simplorer, a program capable of running simulations of the model. Once created and then proven to be accurate, a model is used for developing new ideas for engine design. My largest specific project involves varying key parameters in the model and quantifying the results. This can all be done relatively trouble-free with the help of Simplorer. Once the model is complete, Simplorer will do all the necessary calculations. The more complicated part of this project is determining which parameters to vary. Finding key parameters depends on the potential for a value to be independently altered in the design. For example, a change in one dimension may lead to a proportional change in the rest of the model, and no real progress is made. Also, the ability of a changed value to have a substantial impact on the outputs of the system is important. Results will be condensed into graphs and tables for better communication and understanding of the data. By changing these parameters, a more optimal design can be created without having to purchase or build any models. Also, hours and hours of results can be simulated in minutes. In the long run, using mathematical models can save time and money. Along with this project, I have many other smaller assignments throughout the summer. My main goal is to assist in the processes of model development, validation and testing.

  10. System Convergence in Transport Modelling

    DEFF Research Database (Denmark)

    Rich, Jeppe; Nielsen, Otto Anker; Cantarella, Guilio E.

    2010-01-01

    A fundamental premise of most applied transport models is the existence and uniqueness of an equilibrium solution that balances demand x(t) and supply t(x). The demand consists of the people that travel in the transport system and on the defined network, whereas the supply consists of the resulting...... level-of-service attributes (e.g., travel time and cost) offered to travellers. An important source of complexity is the congestion, which causes increasing demand to affect travel time in a non-linear way. Transport models most often involve separate models for traffic assignment and demand modelling...... iterating between a route-choice (demand) model and a time-flow (supply) model. It is generally recognised that a simple iteration scheme where the level-of-service level is fed directly to the route-choice and vice versa may exhibit an unstable pattern and lead to cyclic unstable solutions. It can be shown...
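    The demand-supply iteration described above can be sketched as a fixed-point computation; the method of successive averages (MSA) shown here is one standard way to damp the cyclic behaviour the record mentions, and the linear demand curve and BPR-style supply (volume-delay) function are illustrative assumptions, not the record's models.

```python
def solve_equilibrium(demand, supply, t0, iters=200):
    """Method of Successive Averages: iterate between demand x(t) and
    supply t(x), averaging flows so the fixed point x* = demand(supply(x*))
    is approached instead of oscillated around."""
    t = t0
    x = demand(t)
    for n in range(1, iters + 1):
        y = demand(t)               # auxiliary flow at current travel time
        x = x + (y - x) / (n + 1)   # MSA step: shrinking step size damps cycling
        t = supply(x)               # congestion: travel time responds to flow
    return x, t

# Illustrative functions: linear demand, BPR-style congestion curve
demand = lambda t: max(0.0, 1000.0 - 20.0 * t)          # trips as time rises
supply = lambda x: 10.0 * (1.0 + 0.15 * (x / 500.0) ** 4)  # time as flow rises
x_eq, t_eq = solve_equilibrium(demand, supply, t0=10.0)
```

    A naive scheme that feeds each model's output straight into the other (step size 1 instead of 1/(n+1)) is exactly the simple iteration the record warns can cycle.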

  11. Reproducibility between conventional and digital periapical radiography for bone height measurement

    Directory of Open Access Journals (Sweden)

    Miguel Simancas Pallares

    2015-10-01

    Conclusions. Reproducibility between the two methods, including in the subgroup analysis, was poor. These methods should therefore be used in periodontics only with full knowledge of the technical features and advantages of each system.

  12. Model Reduction of Hybrid Systems

    DEFF Research Database (Denmark)

    Shaker, Hamid Reza

    High-technological solutions of today are characterized by complex dynamical models. A lot of these models have an inherent hybrid/switching structure. Hybrid/switched systems are powerful models for distributed embedded systems design where discrete controls are applied to continuous processes... gramians. Generalized gramians are the solutions to the observability and controllability Lyapunov inequalities. In the first framework the projection matrices are found based on the common generalized gramians. This framework preserves the stability of the original switched system for all switching... is guaranteed to be preserved for arbitrary switching signals. To compute the common generalized gramians, linear matrix inequalities (LMIs) need to be solved. These LMIs are not always feasible. In order to solve the problem of conservatism, the second framework is presented. In this method the projection...

  13. Numerical Modeling of Microelectrochemical Systems

    DEFF Research Database (Denmark)

    Adesokan, Bolaji James

    The PhD dissertation is concerned with mathematical modeling and simulation of electrochemical systems. The first three chapters of the thesis consist of the introductory part, the model development chapter and the chapter on the summary of the main results. The remaining three chapters report... incorporates the finite size of ionic species in the transport equation. The model presents more appropriate boundary conditions which describe the modified Butler-Volmer reaction kinetics and account for the surface capacitance of the thin electric double layer. We have also found an analytical solution... at the electrode in a microelectrochemical system. In our analysis, we account for the finite-size properties of ions in the mass and charge transport of ionic species in an electrochemical system. This term characterizes the saturation of the ionic species close to the electrode surface. We then analyse...

  14. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Executive Information Systems are designed to improve the quality of strategic-level management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence tools. In order to build analytic reports for Executive Information Systems (EIS) in an organization, we need to design a multidimensional model based on the organization's business model. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.

  15. The ternary sorption system U(VI)-phosphate-silica explained by spectroscopy and thermodynamic modelling

    Energy Technology Data Exchange (ETDEWEB)

    Foerstendorf, Harald; Stockmann, Madlen; Heim, Karsten; Mueller, Katharina; Brendler, Vinzenz [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Surface Processes; Comarmond, M.J.; Payne, T.E. [Australian Nuclear Science and Technology Organisation, Lucas Heights (Australia); Steudtner, Robin [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Inst. of Resource Ecology

    2017-06-01

    Spectroscopic data on sorption processes can provide direct input to Surface Complexation Modelling (SCM) approaches. Based on spectroscopic data for the ternary sorption system U(VI)/phosphate/silica, which strongly suggest the formation of a precipitate as the predominant surface process, SCM calculations accurately reproduced results from classical batch experiments.

  16. The ternary sorption system U(VI)-phosphate-silica explained by spectroscopy and thermodynamic modelling

    International Nuclear Information System (INIS)

    Foerstendorf, Harald; Stockmann, Madlen; Heim, Karsten; Mueller, Katharina; Brendler, Vinzenz; Steudtner, Robin

    2017-01-01

    Spectroscopic data on sorption processes can provide direct input to Surface Complexation Modelling (SCM) approaches. Based on spectroscopic data for the ternary sorption system U(VI)/phosphate/silica, which strongly suggest the formation of a precipitate as the predominant surface process, SCM calculations accurately reproduced results from classical batch experiments.

  17. Turboelectric Distributed Propulsion System Modelling

    OpenAIRE

    Liu, Chengyuan

    2013-01-01

    The Blended-Wing-Body is a conceptual aircraft design with rear-mounted, over-wing engines. A turboelectric distributed propulsion system with boundary layer ingestion has been considered for this aircraft. It uses electricity to transmit power from the core turbine to the fans, thereby dramatically increasing the bypass ratio to reduce fuel consumption and noise. This dissertation presents methods for designing the TeDP system, evaluating the effects of boundary layer ingestion, and modelling engine perfo...

  18. VALIDITY AND REPRODUCIBILITY OF MEASURING THE KINEMATIC COUPLING BEHAVIOR OF CALCANEAL PRONATION/SUPINATION AND SHANK ROTATION DURING WEIGHT BEARING USING AN OPTICAL THREE-DIMENSIONAL MOTION ANALYSIS SYSTEM

    Directory of Open Access Journals (Sweden)

    Masahiro Edo

    2017-12-01

    Full Text Available Background: It is important to understand the kinematic coupling of the calcaneus and shank in order to optimize pathological movement of the lower extremity. However, a quantitative indicator of this kinematic coupling has not been established. We measured the angles of calcaneal pronation-to-supination and shank rotation during pronation and supination of both feet in standing position and devised a technique to quantify the kinematic coupling behavior of calcaneal pronation/supination and shank rotation as the linear regression coefficient (kinematic chain ratio: KCR) of those measurements. We then verified the validity and reproducibility of this technique. Methods: This study is a non-comparative cross-sectional study. The KCR, the outcome measure, was measured using an optical three-dimensional motion analysis system in 10 healthy subjects. The coefficient of determination (R²) was calculated for the linear regression equation of the angle of calcaneal pronation-to-supination and the angle of shank rotation, and the intraclass correlation coefficient (ICC[1,1]) was calculated for the KCR during foot pronation and foot supination and for the KCR measured on different days. Skin movement artifacts were also investigated by measuring the displacement of bone and body-surface markers in one healthy subject. Results: The linear regression equation of calcaneal pronation/supination and the angle of shank rotation yielded R² ≥ 0.9 for all subjects. The KCR on foot pronation and supination had an ICC[1,1] of 0.95. The KCR measured on different days had an ICC[1,1] of 0.72. Skin movement artifacts were within the allowable range. Conclusion: The validity and reproducibility of this technique were largely good, and the technique can be used to quantify kinematic coupling behavior.
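    The KCR is simply the slope of an ordinary least-squares fit of shank rotation on calcaneal pronation/supination, with R² measuring how tightly the coupling holds; the paired angles below are hypothetical, not the study's data.

```python
def kinematic_chain_ratio(calcaneal_deg, shank_deg):
    """KCR: slope of the least-squares line of shank rotation against
    calcaneal pronation/supination, plus the coefficient of determination R^2."""
    n = len(calcaneal_deg)
    mx = sum(calcaneal_deg) / n
    my = sum(shank_deg) / n
    sxx = sum((x - mx) ** 2 for x in calcaneal_deg)
    sxy = sum((x - mx) * (y - my) for x, y in zip(calcaneal_deg, shank_deg))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(calcaneal_deg, shank_deg))
    ss_tot = sum((y - my) ** 2 for y in shank_deg)
    return slope, 1.0 - ss_res / ss_tot

# Hypothetical paired angles (degrees) across a pronation/supination sweep
calc  = [-10.0, -5.0, 0.0, 5.0, 10.0]
shank = [ -8.2, -4.1, 0.3, 4.0,  8.1]
kcr, r2 = kinematic_chain_ratio(calc, shank)
```

    An R² near 1, as reported for all subjects, is what justifies summarizing the coupling by the single slope value.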

  19. Video distribution system cost model

    Science.gov (United States)

    Gershkoff, I.; Haspert, J. K.; Morgenstern, B.

    1980-01-01

    A cost model that can be used to systematically identify the costs of procuring and operating satellite-linked communications systems is described. The user defines a network configuration by specifying the location of each participating site, the interconnection requirements, and the transmission paths available for the uplink (studio to satellite), downlink (satellite to audience), and voice talkback (between audience and studio) segments of the network. The model uses this information to calculate the least expensive signal distribution path for each participating site. Cost estimates are broken down by capital, installation, lease, and operations and maintenance. The design of the model permits flexibility in specifying network and cost structure.
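    The per-site selection step reduces to minimizing a summed cost over candidate paths, using the same cost breakdown the model reports; the site names, path names, and dollar figures below are invented for illustration.

```python
def total_cost(p):
    """Sum the model's cost categories for one candidate path."""
    return p["capital"] + p["install"] + p["lease"] + p["om"]

def cheapest_paths(network):
    """For each site, choose the candidate signal-distribution path
    with the lowest total cost (capital + installation + lease + O&M)."""
    return {site: min(paths, key=lambda name: total_cost(paths[name]))
            for site, paths in network.items()}

# Hypothetical network: two sites, each with two candidate paths (costs in $k)
network = {
    "site_A": {"direct_downlink":   {"capital": 120, "install": 15, "lease": 40, "om": 25},
               "terrestrial_relay": {"capital": 60,  "install": 30, "lease": 90, "om": 35}},
    "site_B": {"direct_downlink":   {"capital": 120, "install": 45, "lease": 40, "om": 25},
               "terrestrial_relay": {"capital": 60,  "install": 10, "lease": 90, "om": 35}},
}
choice = cheapest_paths(network)
```

    Keeping the four categories separate, as the model does, also allows reporting the breakdown alongside the chosen path.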

  20. Information Systems Outsourcing Relationship Model

    Directory of Open Access Journals (Sweden)

    Richard Flemming

    2007-09-01

    Full Text Available Increasing attention is being paid to what determines the success of an information systems outsourcing arrangement. The current research aims to provide an improved understanding of the factors influencing the outcome of an information systems outsourcing relationship and to provide a preliminary validation of an extended outsourcing relationship model by interviews with information systems outsourcing professionals in both the client and vendor of a major Australian outsourcing relationship. It also investigates whether the client and the vendor perceive the relationship differently and if so, how they perceive it differently and whether the two perspectives are interrelated.

  1. Aggregate modeling of manufacturing systems

    NARCIS (Netherlands)

    Lefeber, A.A.J.; Armbruster, H.D.; Kempf, K.G.; Keskinocak, P.; Uzsoy, R.

    2011-01-01

    In this chapter we will present three approaches to model manufacturing systems in an aggregate way leading to fast and effective (i.e., scalable) simulations that allow the development of simulation tools for rapid exploration of different production scenarios in a factory as well as in a whole

  2. Ventilation system in fire modelization

    International Nuclear Information System (INIS)

    Cordero Garcia, S.

    2012-01-01

    A fire in an enclosure formed by two rooms is modelled. The fire is started in one of the rooms to check how the ventilation system responds in different configurations. In addition, the behavior of selected targets, a configuration of cables similar to those found in nuclear power stations, is analyzed.

  3. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    , this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security...

  4. Aggregate modeling of manufacturing systems

    NARCIS (Netherlands)

    Lefeber, A.A.J.; Armbruster, H.D.

    2007-01-01

    In this report we will present three approaches to model manufacturing systems in an aggregate way leading to fast and effective (i.e., scalable) simulations that allow the development of simulation tools for rapid exploration of different production scenarios in a factory as well as in a whole

  5. Stochastic Modelling of Hydrologic Systems

    DEFF Research Database (Denmark)

    Jonsdottir, Harpa

    2007-01-01

    In this PhD project several stochastic modelling methods are studied and applied on various subjects in hydrology. The research was prepared at Informatics and Mathematical Modelling at the Technical University of Denmark. The thesis is divided into two parts. The first part contains...... an introduction and an overview of the papers published. Then an introduction to basic concepts in hydrology along with a description of hydrological data is given. Finally an introduction to stochastic modelling is given. The second part contains the research papers. In the research papers the stochastic methods...... are described, as at the time of publication these methods represent new contribution to hydrology. The second part also contains additional description of software used and a brief introduction to stiff systems. The system in one of the papers is stiff....

  6. Thermodynamic modeling of complex systems

    DEFF Research Database (Denmark)

    Liang, Xiaodong

    after an oil spill. Engineering thermodynamics could be applied in the state-of-the-art sonar products through advanced artificial technology, if the speed of sound, solubility and density of oil-seawater systems could be satisfactorily modelled. The addition of methanol or glycols into unprocessed well...... is successfully applied to model the phase behaviour of water, chemical and hydrocarbon (oil) containing systems with newly developed pure component parameters for water and chemicals and characterization procedures for petroleum fluids. The performance of the PCSAFT EOS on liquid-liquid equilibria of water...... with hydrocarbons has been under debate for some vii years. An interactive step-wise procedure is proposed to fit the model parameters for small associating fluids by taking the liquid-liquid equilibrium data into account. It is still far away from a simple task to apply PC-SAFT in routine PVT simulations and phase...

  7. Quark/gluon jet discrimination: a reproducible analysis using R

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The power to discriminate between light-quark jets and gluon jets would have a huge impact on many searches for new physics at CERN and beyond. This talk will present a walk-through of the development of a prototype machine learning classifier for differentiating between quark and gluon jets at experiments like those at the Large Hadron Collider at CERN. A new fast feature selection method that combines information theory and graph analytics will be outlined. This method has found new variables that promise significant improvements in discrimination power. The prototype jet tagger is simple, interpretable, parsimonious, and computationally extremely cheap, and therefore might be suitable for use in trigger systems for real-time data processing. Nested stratified k-fold cross validation was used to generate robust estimates of model performance. The data analysis was performed entirely in the R statistical programming language, and is fully reproducible. The entire analysis workflow is data-driven, automated a...

  8. Surface wind mixing in the Regional Ocean Modeling System (ROMS)

    Science.gov (United States)

    Robertson, Robin; Hartlipp, Paul

    2017-12-01

    Mixing at the ocean surface is key for atmosphere-ocean interactions and the distribution of heat, energy, and gases in the upper ocean. Winds are the primary force for surface mixing. To properly simulate upper ocean dynamics and the flux of these quantities within the upper ocean, models must reproduce mixing in the upper ocean. To evaluate the performance of the Regional Ocean Modeling System (ROMS) in replicating surface mixing, the results of four different vertical mixing parameterizations were compared against observations, using the surface mixed layer depth, the temperature fields, and observed diffusivities for comparisons. The vertical mixing parameterizations investigated were the Mellor-Yamada 2.5-level turbulence closure (MY), Large-McWilliams-Doney KPP (LMD), Nakanishi-Niino (NN), and generic length scale (GLS) schemes. This was done for one temperate site in deep water in the Eastern Pacific and three shallow-water sites in the Baltic Sea. The model reproduced the surface mixed layer depth reasonably well for all sites; however, the temperature fields were reproduced well for the deep site, but not for the shallow Baltic Sea sites. In the Baltic Sea, the models overmixed the water column after a few days. Vertical temperature diffusivities were higher than those observed and did not show the temporal fluctuations present in the observations. The best performance was by NN and MY; however, MY became unstable in two of the shallow simulations with high winds. The performance of GLS was nearly as good as that of NN and MY. LMD had the poorest performance, as it generated temperature diffusivities that were too high and induced too much mixing. Further observational comparisons are needed to evaluate the effects of different stratification and wind conditions and the limitations of the vertical mixing parameterizations.
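    One of the comparison metrics, the surface mixed layer depth, can be estimated from a temperature profile with a simple threshold criterion; the 0.2 °C threshold is a commonly used choice in the literature (not necessarily the one in this study), and the profile below is synthetic.

```python
def mixed_layer_depth(depths, temps, dT=0.2):
    """Surface mixed-layer depth via a temperature-threshold criterion:
    the shallowest depth where temperature falls more than dT (deg C)
    below the near-surface value."""
    t_surf = temps[0]
    for z, t in zip(depths, temps):
        if t_surf - t > dT:
            return z
    return depths[-1]   # mixing reaches the bottom of the profile

depths = [0, 5, 10, 15, 20, 25, 30]                     # m
temps  = [18.0, 17.95, 17.9, 17.85, 17.2, 16.5, 16.0]   # deg C
mld = mixed_layer_depth(depths, temps)
```

    Applying the same criterion to both model output and observed profiles gives a like-for-like comparison across the four mixing schemes.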

  9. Cotangent Models for Integrable Systems

    Science.gov (United States)

    Kiesenhofer, Anna; Miranda, Eva

    2017-03-01

We associate cotangent models to a neighbourhood of a Liouville torus in symplectic and Poisson manifolds focusing on b-Poisson/b-symplectic manifolds. The semilocal equivalence with such models uses the corresponding action-angle theorems in these settings: the theorem of Liouville-Mineur-Arnold for symplectic manifolds and an action-angle theorem for regular Liouville tori in Poisson manifolds (Laurent-Gengoux et al., Int. Math. Res. Notices IMRN 8:1839-1869, 2011). Our models comprise regular Liouville tori of Poisson manifolds but also consider the Liouville tori on the singular locus of a b-Poisson manifold. For this latter class of Poisson structures we define a twisted cotangent model. The equivalence with this twisted cotangent model is given by an action-angle theorem recently proved by the authors and Scott (J. Math. Pures Appl. (9) 105(1):66-85, 2016). This viewpoint of cotangent models provides a new machinery to construct examples of integrable systems, which are especially valuable in the b-symplectic case where not many sources of examples are known. At the end of the paper we introduce non-degenerate singularities as lifted cotangent models on b-symplectic manifolds and discuss some generalizations of these models to general Poisson manifolds.

  10. On the ternary Ag – Cu – Ga system: Electromotive force measurement and thermodynamic modeling

    International Nuclear Information System (INIS)

    Gierlotka, Wojciech; Jendrzejczyk-Handzlik, Dominika; Fitzner, Krzysztof; Handzlik, Piotr

    2015-01-01

The ternary silver–copper–gallium system has found application as a solder material in jewel crafting and electronics, thus a phase diagram of this system is an important tool necessary for the proper application of different alloys. The activity of gallium in the liquid phase was determined by the electromotive force measurement technique, and then the equilibrium diagram of Ag–Cu–Ga was modeled based on available experimental data using the Calphad approach. A set of Gibbs energies was found and used for calculation of the phase diagram and thermodynamic properties of the liquid phase. The experimental data were reproduced well by the calculation. - Highlights: • For the first time the activity of Ga in liquid Ag–Cu–Ga alloys was measured. • For the first time the ternary Ag–Cu–Ga system was thermodynamically modeled. • The modeled Ag–Cu–Ga system reproduces the experimental data well
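For readers unfamiliar with the Calphad approach mentioned above, the molar Gibbs energy of a ternary liquid is conventionally written as a Redlich-Kister-Muggianu polynomial; this is a generic sketch of the form, not the assessed parameter set from the paper:

```latex
% Molar Gibbs energy of the liquid, Redlich-Kister-Muggianu form
% (i, j run over Ag, Cu, Ga; the assessed L parameters are not reproduced here):
G_{m}^{\mathrm{liq}} = \sum_{i} x_{i}\, {}^{\circ}G_{i}^{\mathrm{liq}}
  + RT \sum_{i} x_{i} \ln x_{i}
  + \sum_{i<j} x_{i} x_{j} \sum_{\nu} {}^{\nu}L_{ij}\,(x_{i}-x_{j})^{\nu}
  + x_{\mathrm{Ag}}\, x_{\mathrm{Cu}}\, x_{\mathrm{Ga}}\, L_{\mathrm{Ag,Cu,Ga}}
```

The binary interaction parameters L are the quantities fitted against EMF-derived activity data in assessments of this kind.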

  11. Graph modeling systems and methods

    Science.gov (United States)

    Neergaard, Mike

    2015-10-13

An apparatus and a method for vulnerability and reliability modeling are provided. The method generally includes constructing a graph model of a physical network using a computer, the graph model including a plurality of terminating vertices to represent nodes in the physical network, a plurality of edges to represent transmission paths in the physical network, and a non-terminating vertex to represent a non-nodal vulnerability along a transmission path in the physical network. The method additionally includes evaluating the vulnerability and reliability of the physical network using the constructed graph model, wherein the vulnerability and reliability evaluation includes a determination of whether each terminating and non-terminating vertex represents a critical point of failure. The method can be utilized to evaluate a wide variety of networks, including power grid infrastructures, communication network topologies, and fluid distribution systems.
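A minimal sketch of the "critical point of failure" determination described above: a vertex is critical if removing it disconnects the graph, i.e. it is an articulation point. This is an illustrative stdlib-only implementation, not the patented method itself:

```python
# Hypothetical sketch: find articulation points (critical points of
# failure) in an undirected graph given as {vertex: set(neighbours)},
# using Tarjan's low-link depth-first search.

def articulation_points(adj):
    disc, low, parent, result = {}, {}, {}, set()
    timer = [0]

    def dfs(u):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v not in disc:
                parent[v] = u
                children += 1
                dfs(v)
                low[u] = min(low[u], low[v])
                # u is critical if a DFS subtree cannot reach above u
                if parent.get(u) is not None and low[v] >= disc[u]:
                    result.add(u)
            elif v != parent.get(u):
                low[u] = min(low[u], disc[v])
        # a DFS root is critical only if it has 2+ children
        if parent.get(u) is None and children > 1:
            result.add(u)

    for u in adj:
        if u not in disc:
            parent[u] = None
            dfs(u)
    return result

# Toy network: B and C bridge the chain A-B-C-D, so both are critical.
net = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}
print(articulation_points(net))
```

On a power grid or pipe network encoded this way, the returned set flags the single-point-of-failure vertices.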

  12. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...
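The Tikhonov regularization of Chapter 3 has a compact computational form in a reproducing kernel Hilbert space: the regularized solution is a kernel expansion whose coefficients solve a linear system. The following is an illustrative stdlib-only sketch (kernel width, data, and regularization strength are all invented for the example):

```python
import math

# Hypothetical sketch of Tikhonov regularization in an RKHS (kernel
# ridge regression): f(x) = sum_i a_i k(x, x_i), where the coefficients
# solve (K + lam*I) a = y for the Gram matrix K.

def gauss_kernel(x, y, s=1.0):
    return math.exp(-((x - y) ** 2) / (2 * s * s))

def solve(A, b):
    """Solve A a = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    a = [0.0] * n
    for r in range(n - 1, -1, -1):
        a[r] = (M[r][n] - sum(M[r][k] * a[k] for k in range(r + 1, n))) / M[r][r]
    return a

def fit(xs, ys, lam=0.1):
    K = [[gauss_kernel(xi, xj) + (lam if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    a = solve(K, ys)
    return lambda x: sum(ai * gauss_kernel(x, xi) for ai, xi in zip(a, xs))

xs, ys = [0.0, 1.0, 2.0, 3.0], [0.0, 0.8, 0.9, 0.1]  # noisy samples
f = fit(xs, ys)
print(round(f(1.5), 3))
```

As lam shrinks toward zero the fit approaches exact interpolation of the samples; larger lam trades fidelity for smoothness, which is the point of the regularization.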

  13. REPRODUCING THE OBSERVED ABUNDANCES IN RCB AND HdC STARS WITH POST-DOUBLE-DEGENERATE MERGER MODELS-CONSTRAINTS ON MERGER AND POST-MERGER SIMULATIONS AND PHYSICS PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    Menon, Athira; Herwig, Falk; Denissenkov, Pavel A. [Department of Physics and Astronomy, University of Victoria, Victoria, BC V8P5C2 (Canada); Clayton, Geoffrey C.; Staff, Jan [Department of Physics and Astronomy, Louisiana State University, 202 Nicholson Hall, Tower Dr., Baton Rouge, LA 70803-4001 (United States); Pignatari, Marco [Department of Physics, University of Basel, Klingelbergstrasse 82, CH-4056 Basel (Switzerland); Paxton, Bill [Kavli Institute for Theoretical Physics and Department of Physics, Kohn Hall, University of California, Santa Barbara, CA 93106 (United States)

    2013-07-20

The R Coronae Borealis (RCB) stars are hydrogen-deficient, variable stars that are most likely the result of He-CO WD mergers. They display extremely low oxygen isotopic ratios, ¹⁶O/¹⁸O ≈ 1-10, ¹²C/¹³C ≥ 100, and enhancements up to 2.6 dex in F and in s-process elements from Zn to La, compared to solar. These abundances provide stringent constraints on the physical processes during and after the double-degenerate merger. As shown previously, O-isotopic ratios observed in RCB stars cannot result from the dynamic double-degenerate merger phase, and we now investigate the role of the long-term one-dimensional spherical post-merger evolution and nucleosynthesis based on realistic hydrodynamic merger progenitor models. We adopt a model for extra envelope mixing to represent processes driven by rotation originating in the dynamical merger. Comprehensive nucleosynthesis post-processing simulations for these stellar evolution models reproduce, for the first time, the full range of the observed abundances for almost all the elements measured in RCB stars: ¹⁶O/¹⁸O ratios between 9 and 15, C-isotopic ratios above 100, and ~1.4-2.35 dex F enhancements, along with enrichments in s-process elements. The nucleosynthesis processes in our models constrain the length and temperature in the dynamic merger shell-of-fire feature as well as the envelope mixing in the post-merger phase. s-process elements originate either in the shell-of-fire merger feature or during the post-merger evolution, but the contribution from the asymptotic giant branch progenitors is negligible. The post-merger envelope mixing must eventually cease ~10⁶ yr after the dynamic merger phase before the star enters the RCB phase.

  14. Discrete modelling of drapery systems

    Science.gov (United States)

    Thoeni, Klaus; Giacomini, Anna

    2016-04-01

Drapery systems are an efficient and cost-effective measure in preventing and controlling rockfall hazards on rock slopes. The simplest form consists of a row of ground anchors along the top of the slope connected to a horizontal support cable from which a wire mesh is suspended down the face of the slope. Such systems are generally referred to as simple or unsecured draperies (Badger and Duffy 2012). Variations such as secured draperies, where a pattern of ground anchors is incorporated within the field of the mesh, and hybrid systems, where the upper part of an unsecured drapery is elevated to intercept rockfalls originating upslope of the installation, are becoming more and more popular. This work presents a discrete element framework for simulation of unsecured drapery systems and its variations. The numerical model is based on the classical discrete element method (DEM) and implemented into the open-source framework YADE (Šmilauer et al., 2010). The model takes all relevant interactions between block, drapery and slope into account (Thoeni et al., 2014) and was calibrated and validated based on full-scale experiments (Giacomini et al., 2012). The block is modelled as a rigid clump made of spherical particles, which allows any shape to be approximated. The drapery is represented by a set of spherical particles with remote interactions. The behaviour of the remote interactions is governed by the constitutive behaviour of the wire and generally corresponds to a piecewise linear stress-strain relation (Thoeni et al., 2013). The same concept is used to model wire ropes. The rock slope is represented by rigid triangular elements where material properties (e.g., normal coefficient of restitution, friction angle) are assigned to each triangle. The capabilities of the developed model to simulate drapery systems and estimate the residual hazard involved with such systems is shown. References Badger, T.C., Duffy, J.D. (2012) Drapery systems. In: Turner, A.K., Schuster R
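The piecewise linear stress-strain relation governing the remote wire interactions can be sketched as a simple force law: tension follows linear segments between breakpoints, with no resistance in compression and none after rupture. The breakpoint values below are illustrative, not the calibrated values from the cited experiments:

```python
# Hypothetical sketch of a piecewise linear wire constitutive law:
# tensile stress interpolated between (strain, stress) breakpoints;
# zero force in compression and after rupture.

def wire_force(strain, curve, area=1.0):
    """Tensile force for a wire with cross-section `area` and a
    piecewise linear law given by breakpoints `curve`."""
    if strain <= 0.0:
        return 0.0  # wires do not resist compression
    for (e1, s1), (e2, s2) in zip(curve, curve[1:]):
        if e1 <= strain <= e2:
            stress = s1 + (s2 - s1) * (strain - e1) / (e2 - e1)
            return stress * area
    return 0.0  # strain beyond the last breakpoint: wire has failed

# Illustrative law: stiff elastic branch to 0.5% strain, then a
# hardening branch up to rupture at 5% strain (stress in MPa).
curve = [(0.0, 0.0), (0.005, 500.0), (0.05, 650.0)]
print(wire_force(0.01, curve), wire_force(-0.01, curve), wire_force(0.1, curve))
```

In a DEM code such a law would be evaluated per remote interaction each timestep, with rupture implemented by permanently removing the interaction once the last breakpoint is exceeded.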

  15. Quantum models of classical systems

    International Nuclear Information System (INIS)

    Hájíček, P

    2015-01-01

    Quantum statistical methods that are commonly used for the derivation of classical thermodynamic properties are extended to classical mechanical properties. The usual assumption that every real motion of a classical mechanical system is represented by a sharp trajectory is not testable and is replaced by a class of fuzzy models, the so-called maximum entropy (ME) packets. The fuzzier are the compared classical and quantum ME packets, the better seems to be the match between their dynamical trajectories. Classical and quantum models of a stiff rod will be constructed to illustrate the resulting unified quantum theory of thermodynamic and mechanical properties. (paper)

  16. Models of the venous system

    DEFF Research Database (Denmark)

    Mehlsen, J

    2000-01-01

Cardiac output is largely controlled by venous return, the driving force of which is the energy remaining at the postcapillary venous site. This force is influenced by forces acting close to the right atrium, and internally or externally upon the veins along their course. Analogue models of the venous system require at least three elements: a resistor, a capacitor and an inductor, with the latter being of more importance in the venous than in the arterial system. Non-linearities must be considered in pressure/flow relations in the small venules, during venous collapse, or low flow conditions...
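The three-element analogue (resistor, capacitor, inductor) described above can be integrated numerically in a few lines. This sketch adds an outflow resistance to close the circuit and uses purely illustrative parameter values, not physiological data:

```python
# Hypothetical sketch of an R-L-C analogue of a venous segment:
# inflow resistance R and inertance L in series feed a compliance C,
# which drains through an outflow resistance R_out. Forward Euler.

def simulate(p_in=10.0, R=1.0, L=0.05, C=2.0, R_out=1.0,
             dt=0.001, steps=20000):
    q, p_c = 0.0, 0.0  # inflow and compliance (venous) pressure
    for _ in range(steps):
        dq = (p_in - p_c - R * q) / L   # inertance: L dq/dt = pressure drop
        dp = (q - p_c / R_out) / C      # compliance: C dp/dt = net inflow
        q += dq * dt
        p_c += dp * dt
    return q, p_c

q, p_c = simulate()
# Steady state (from the ODEs): p_c = p_in * R_out/(R + R_out), q = p_c/R_out.
print(round(q, 2), round(p_c, 2))
```

The non-linearities mentioned in the abstract (venous collapse, low-flow behaviour) would enter by making R and C functions of p_c rather than constants.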

  17. Studies of Catalytic Model Systems

    DEFF Research Database (Denmark)

    Holse, Christian

The overall topic of this thesis is within the field of catalysis, where model systems of different complexity have been studied utilizing a multipurpose Ultra High Vacuum (UHV) chamber. The thesis falls in two different parts. First, a simple model system in the form of a ruthenium single crystal... of the Cu/ZnO nanoparticles is highly relevant to industrial methanol synthesis, for which the direct interaction of Cu and ZnO nanocrystals synergistically boosts the catalytic activity. The dynamical behavior of the nanoparticles under reducing and oxidizing environments was studied by means of ex situ X-ray Photoelectron Spectroscopy (XPS) and in situ Transmission Electron Microscopy (TEM). The surface composition of the nanoparticles changes reversibly as the nanoparticles are exposed to cycles of high-pressure oxidation and reduction (200 mbar). Furthermore, the presence of metallic Zn is observed by XPS...

  18. Aerial Measuring System Sensor Modeling

    International Nuclear Information System (INIS)

    Detwiler, R.S.

    2002-01-01

This project deals with modeling the Aerial Measuring System (AMS) fixed-wing and rotary-wing sensor systems, which are critical U.S. Department of Energy's National Nuclear Security Administration (NNSA) Consequence Management assets. The fixed-wing system is critical in detecting lost or stolen radiography or medical sources, or mixed fission products as from a commercial power plant release, at high flying altitudes. The helicopter is typically used at lower altitudes to determine ground contamination, such as in measuring americium from a plutonium ground dispersal during a cleanup. Since the sensitivity of these instruments as a function of altitude is crucial in estimating detection limits of various ground contaminations and necessary count times, a characterization of their sensitivity as a function of altitude and energy is needed. Experimental data at altitude as well as laboratory benchmarks is important to insure that the strong effects of air attenuation are modeled correctly. The modeling presented here is the first attempt at such a characterization of the equipment for flying altitudes. The sodium iodide (NaI) sensors utilized with these systems were characterized using the Monte Carlo N-Particle code (MCNP) developed at Los Alamos National Laboratory. For the fixed-wing system, calculations modeled the spectral response for the 3-element NaI detector pod and High-Purity Germanium (HPGe) detector, in the relevant energy range of 50 keV to 3 MeV. NaI detector responses were simulated for both point and distributed surface sources as a function of gamma energy and flying altitude. For point sources, photopeak efficiencies were calculated for a zero radial distance and an offset equal to the altitude. For distributed sources approximating an infinite plane, gross count efficiencies were calculated and normalized to a uniform surface deposition of 1 μCi/m². The helicopter calculations modeled the transport of americium-241 (²⁴¹Am) as this is...
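The altitude dependence being characterized has a simple first-order structure for a point source directly below the aircraft: an inverse-square geometric factor times exponential attenuation in air. This back-of-envelope sketch is illustrative only; the MCNP calculations above capture scattering and spectral response that this ignores:

```python
import math

# Hypothetical sketch: unscattered-photon count rate from a point
# source at the ground, aircraft directly overhead. mu is the linear
# attenuation coefficient of air (illustrative value near 662 keV);
# eps0 lumps detector area and intrinsic photopeak efficiency.

def count_rate(activity_gamma_s, altitude_m, mu_per_m=0.0098, eps0=0.05):
    geom = eps0 / (4.0 * math.pi * altitude_m ** 2)  # solid-angle factor
    atten = math.exp(-mu_per_m * altitude_m)         # air attenuation
    return activity_gamma_s * geom * atten

r100 = count_rate(3.7e10, 100.0)  # ~1 Ci source seen from 100 m
r200 = count_rate(3.7e10, 200.0)
print(r100, r200, r100 / r200)
```

Doubling the altitude costs a factor of four geometrically plus an extra attenuation length, which is why sensitivity-versus-altitude curves fall off faster than inverse-square.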

  19. Modeling fuel cell stack systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J H [Los Alamos National Lab., Los Alamos, NM (United States); Lalk, T R [Dept. of Mech. Eng., Texas A and M Univ., College Station, TX (United States)

    1998-06-15

A technique for modeling fuel cell stacks is presented along with the results from an investigation designed to test the validity of the technique. The technique was specifically designed so that models developed using it can be used to determine the fundamental thermal-physical behavior of a fuel cell stack for any operating and design configuration. Such models would be useful tools for investigating fuel cell power system parameters. The modeling technique can be applied to any type of fuel cell stack for which performance data is available for a laboratory scale single cell. Use of the technique is demonstrated by generating sample results for a model of a Proton Exchange Membrane Fuel Cell (PEMFC) stack consisting of 125 cells each with an active area of 150 cm². A PEMFC stack was also used in the verification investigation. This stack consisted of four cells, each with an active area of 50 cm². Results from the verification investigation indicate that models developed using the technique are capable of accurately predicting fuel cell stack performance. (orig.)
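The core scaling step in a technique of this kind (take a laboratory single-cell polarization curve, scale to N series cells of area A) can be sketched as follows. The polarization coefficients here are invented for illustration, not the paper's measured data:

```python
import math

# Hypothetical sketch: scale a single-cell polarization curve V_cell(j)
# to a stack of n_cells in series with active area area_cm2.

def v_cell(j):
    """Illustrative polarization curve: open-circuit voltage minus
    activation, ohmic, and concentration losses; j in A/cm^2."""
    j = max(j, 1e-6)
    return (1.2
            - 0.05 * math.log(j / 1e-4)          # activation (Tafel)
            - 0.2 * j                             # ohmic
            - 0.05 * math.exp(3.0 * (j - 1.2)))   # concentration

def stack_performance(current_a, n_cells=125, area_cm2=150.0):
    j = current_a / area_cm2
    v = n_cells * v_cell(j)       # series cells: voltages add
    return v, v * current_a       # stack voltage [V], stack power [W]

v, p = stack_performance(90.0)    # 90 A -> j = 0.6 A/cm^2
print(round(v, 1), round(p, 1))
```

In the actual technique the per-cell curve comes from measured single-cell data rather than an analytic fit, and thermal coupling between cells modifies v_cell; this sketch shows only the electrical series scaling.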

  20. Model reduction of parametrized systems

    CERN Document Server

    Ohlberger, Mario; Patera, Anthony; Rozza, Gianluigi; Urban, Karsten

    2017-01-01

The special volume offers a global guide to new concepts and approaches concerning the following topics: reduced basis methods, proper orthogonal decomposition, proper generalized decomposition, approximation theory related to model reduction, learning theory and compressed sensing, stochastic and high-dimensional problems, system-theoretic methods, nonlinear model reduction, reduction of coupled problems/multiphysics, optimization and optimal control, state estimation and control, reduced order models and domain decomposition methods, Krylov-subspace and interpolatory methods, and applications to real industrial and complex problems. The book represents the state of the art in the development of reduced order methods. It contains contributions from internationally respected experts, guaranteeing a wide range of expertise and topics. Further, it reflects an important effort, carried out over the last 12 years, to build a growing research community in this field. Though not a textbook, some of the chapters ca...
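One of the topics listed, proper orthogonal decomposition (POD), reduces to an eigenproblem on the snapshot correlation matrix. A minimal stdlib-only sketch of extracting the dominant POD mode by power iteration (the snapshot data is illustrative):

```python
# Hypothetical sketch of POD: the leading POD mode of a set of state
# snapshots X_k is the dominant eigenvector of C = sum_k X_k X_k^T,
# found here by plain power iteration without forming C explicitly.

def dominant_pod_mode(snapshots, iters=200):
    n = len(snapshots[0])
    v = [1.0] * n
    for _ in range(iters):
        w = [0.0] * n
        for x in snapshots:
            c = sum(xi * vi for xi, vi in zip(x, v))  # X_k . v
            for i in range(n):
                w[i] += c * x[i]                       # += (X_k . v) X_k
        norm = sum(wi * wi for wi in w) ** 0.5
        v = [wi / norm for wi in w]
    return v

# Snapshots dominated by the pattern [1, 2, 1], plus a weaker one.
snaps = [[1.0, 2.0, 1.0], [1.1, 2.2, 1.1], [0.1, 0.0, -0.1]]
mode = dominant_pod_mode(snaps)
print([round(m, 2) for m in mode])
```

A reduced-order model then projects the full dynamics onto the span of the first few such modes; in practice the modes are computed with an SVD rather than power iteration.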

  1. Component Reification in Systems Modelling

    DEFF Research Database (Denmark)

    Bendisposto, Jens; Hallerstede, Stefan

When modelling concurrent or distributed systems in Event-B, we often obtain models where the structure of the connected components is specified by constants. Their behaviour is specified by the non-deterministic choice of event parameters for events that operate on shared variables. From a certain... These components may still refer to shared variables. Events of these components should not refer to the constants specifying the structure. The non-deterministic choice between these components should not be via parameters. We say the components are reified. We need to address how the reified components get reflected into the original model. This reflection should indicate the constraints on how to connect the components.

  2. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  3. Annotating with Propp's Morphology of the Folktale: Reproducibility and Trainability

    NARCIS (Netherlands)

    Fisseni, B.; Kurji, A.; Löwe, B.

    2014-01-01

    We continue the study of the reproducibility of Propp’s annotations from Bod et al. (2012). We present four experiments in which test subjects were taught Propp’s annotation system; we conclude that Propp’s system needs a significant amount of training, but that with sufficient time investment, it

  4. Genome Modeling System: A Knowledge Management Platform for Genomics.

    Directory of Open Access Journals (Sweden)

    Malachi Griffith

    2015-07-01

Full Text Available In this work, we present the Genome Modeling System (GMS), an analysis information management system capable of executing automated genome analysis pipelines at a massive scale. The GMS framework provides detailed tracking of samples and data coupled with reliable and repeatable analysis pipelines. The GMS also serves as a platform for bioinformatics development, allowing a large team to collaborate on data analysis, or an individual researcher to leverage the work of others effectively within its data management system. Rather than separating ad-hoc analysis from rigorous, reproducible pipelines, the GMS promotes systematic integration between the two. As a demonstration of the GMS, we performed an integrated analysis of whole genome, exome and transcriptome sequencing data from a breast cancer cell line (HCC1395) and matched lymphoblastoid line (HCC1395BL). These data are available for users to test the software, complete tutorials and develop novel GMS pipeline configurations. The GMS is available at https://github.com/genome/gms.
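The kind of sample-and-data tracking the GMS couples to its pipelines can be illustrated with a toy provenance recorder: each step logs a hash of its input, its parameters, and a hash of its output. The class and function names below are illustrative, not the actual GMS API:

```python
import hashlib
import json

# Hypothetical sketch of pipeline provenance tracking: every step
# records input digest, parameters, and output digest, so any result
# can be traced back to exactly the data and settings that produced it.

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()[:12]

class Provenance:
    def __init__(self):
        self.steps = []

    def run(self, name, func, data: bytes, **params):
        out = func(data, **params)
        self.steps.append({
            "step": name,
            "in": digest(data),
            "params": params,
            "out": digest(out),
        })
        return out

prov = Provenance()
reads = b"ACGTACGT\nACGTTTGT\n"
trimmed = prov.run("trim", lambda d, n: d[: len(d) - n], reads, n=2)
print(json.dumps(prov.steps, indent=2))
```

Because the record contains content digests rather than file paths, re-running the pipeline on identical inputs with identical parameters provably reproduces the same chain, which is the property that makes such tracking useful for reproducible genomics.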

  5. Model Reduction of Nonlinear Aeroelastic Systems Experiencing Hopf Bifurcation

    KAUST Repository

    Abdelkefi, Abdessattar

    2013-06-18

In this paper, we employ the normal form to derive a reduced-order model that reproduces the nonlinear dynamical behavior of aeroelastic systems that undergo Hopf bifurcation. As an example, we consider a rigid two-dimensional airfoil that is supported by nonlinear springs in the pitch and plunge directions and subjected to nonlinear aerodynamic loads. We apply the center manifold theorem on the governing equations to derive its normal form, which constitutes a simplified representation of the aeroelastic system near flutter onset (manifestation of Hopf bifurcation). Then, we use the normal form to identify a self-excited oscillator governed by a time-delay ordinary differential equation that approximates the dynamical behavior while reducing the dimension of the original system. Results obtained from this oscillator show a great capability to properly predict limit cycle oscillations that take place beyond and above flutter, as compared with the original aeroelastic system.
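For orientation, the polar-coordinate normal form near a Hopf bifurcation (a standard result, sketched here generically rather than as the authors' specific aeroelastic reduction) reads:

```latex
% Polar normal form of a Hopf bifurcation (illustrative symbols):
\dot{r} = \mu r + a r^{3}, \qquad \dot{\theta} = \omega + b r^{2},
% with bifurcation parameter \mu (e.g. airspeed minus flutter speed).
% Supercritical case a < 0: for \mu > 0 a stable limit cycle exists
% with amplitude
r^{*} = \sqrt{-\mu / a}.
```

The square-root growth of the limit cycle amplitude just past onset is the signature such reduced-order models are built to reproduce.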

  6. Models of hot stellar systems

    International Nuclear Information System (INIS)

    Van Albada, T.S.

    1986-01-01

Elliptical galaxies consist almost entirely of stars. Sites of recent star formation are rare, and most stars are believed to be several billion years old, perhaps as old as the Universe itself (~10¹⁰ yr). Stellar motions in ellipticals show a modest amount of circulation about the center of the system, but most support against the force of gravity is provided by random motions; for this reason ellipticals are called 'hot' stellar systems. Spiral galaxies usually also contain an appreciable amount of gas (~10%, mainly atomic hydrogen) and new stars are continually being formed out of this gas, especially in the spiral arms. In contrast to ellipticals, support against gravity in spiral galaxies comes almost entirely from rotation; random motions of the stars with respect to rotation are small. Consequently, spiral galaxies are called 'cold' stellar systems. Unlike in hot systems, in cold systems the collective response of stars to variations in the force field is an essential part of the dynamics. The present overview is limited to mathematical models of hot systems. Computational methods are also discussed

  7. Aerial measuring system sensor modeling

    International Nuclear Information System (INIS)

    Detwiler, Rebecca

    2002-01-01

The AMS fixed-wing and rotary-wing systems are critical National Nuclear Security Administration (NNSA) Emergency Response assets. This project is principally focused on the characterization of the sensors utilized with these systems via radiation transport calculations. The Monte Carlo N-Particle code (MCNP), which has been developed at Los Alamos National Laboratory, was used to model the detector response of the AMS fixed-wing and helicopter systems. To validate the calculations, benchmark measurements were made for simple source-detector configurations. The fixed-wing system is an important tool in response to incidents involving the release of mixed fission products (a commercial power reactor release), the threat or actual explosion of a Radiological Dispersal Device, and the loss or theft of a large industrial source (a radiography source). Calculations modeled the spectral response for the sensors contained, a 3-element NaI detector pod and HPGe detector, in the relevant energy range of 50 keV to 3 MeV. NaI detector responses were simulated for both point and distributed surface sources as a function of gamma energy and flying altitude. For point sources, photo-peak efficiencies were calculated for a zero radial distance and an offset equal to the altitude. For distributed sources approximating an infinite plane, gross count efficiencies were calculated and normalized to a uniform surface deposition of 1 μCi/m²

  8. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  9. Environment and industrial economy: Challenge of reproducibility

    International Nuclear Information System (INIS)

    Rullani, E.

    1992-01-01

Historically and methodologically counterposed until now, the environmentalist and the economic approach to environmental problems need to be integrated in a new approach that considers, from one side, the relevance of the ecological equilibria for the economic systems and, from the other side, the economic dimension (in terms of investments and transformations in the production system) of any attempt to achieve a better environment. In order to achieve this integration, both approaches are compelled to give up some cultural habits that have characterized them and have contributed to over-emphasizing the opposition between them. The article shows that both approaches can converge into a new one, in which environment is no longer only a holistic, not bargainable, natural external limit to human activity (as in the environmentalist approach), nor simply a scarce and exhaustible resource (as economics tends to consider it); environment should instead become part of the reproducibility sphere, or, in other words, it must be regarded as part of the output that the economic system provides. This new approach, due to scientific and technological advances, is made possible for an increasing class of environmental problems. In order to do this, an evolution is required that could convert environmental goals into investment and technological innovation goals, and communicate to the firms the value society assigns to environmental resources. This value, the author suggests, should correspond to the reproduction cost. Various examples of this new approach are analyzed and discussed

  10. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  11. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  12. Mathematical Modeling of Constrained Hamiltonian Systems

    NARCIS (Netherlands)

    Schaft, A.J. van der; Maschke, B.M.

    1995-01-01

    Network modelling of unconstrained energy conserving physical systems leads to an intrinsic generalized Hamiltonian formulation of the dynamics. Constrained energy conserving physical systems are directly modelled as implicit Hamiltonian systems with regard to a generalized Dirac structure on the
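A minimal sketch of the implicit formulation referred to above, in standard port-Hamiltonian notation (the paper's own setting is more general than this display):

```latex
% Constrained Hamiltonian system with Lagrange multipliers \lambda:
\dot{x} = J(x)\,\frac{\partial H}{\partial x}(x) + g(x)\,\lambda ,
\qquad
0 = g^{\top}(x)\,\frac{\partial H}{\partial x}(x) ,
% with J(x) = -J(x)^{\top} skew-symmetric; the pair of equations
% defines the dynamics implicitly on a generalized Dirac structure.
```

The algebraic constraint is what makes the system implicit: the multipliers are determined by requiring the trajectory to stay on the constraint manifold.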

  13. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    OpenAIRE

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-01-01

The Soil and Water Assessment Tool (SWAT) is a well established, distributed, eco-hydrologic model. However, using the case study of an intensively irrigated agricultural watershed, it was shown that none of the model versions is able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version for correctly simulating the main hydrological processes. Crop yield, total streamfl...

  14. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

    If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition; (2) What is a model. Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context

  15. Modelling of data acquisition systems

    International Nuclear Information System (INIS)

    Buono, S.; Gaponenko, I.; Jones, R.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Spiwoks, R.; Skiadelli, M.; Ambrosini, G.

    1994-01-01

The RD13 project was approved in April 1991 for the development of a scalable data taking system suitable to host various LHC studies. One of its goals is to use simulations as a tool for understanding, evaluating, and constructing different configurations of such data acquisition (DAQ) systems. The RD13 project has developed a modelling framework for this purpose. It is based on MODSIM II, an object-oriented, discrete-event simulation language. A library of DAQ components makes it possible to describe a variety of DAQ architectures and different hardware options in a modular and scalable way. A graphical user interface (GUI) is used for easy configuration, initialization and on-line monitoring of the simulation program. A tracing facility is used for flexible off-line analysis of a trace file written at run-time
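The discrete-event style used by such a framework (there in MODSIM II) can be sketched in a few lines: events are (time, kind) pairs on a priority queue, and components react to them. This toy model of a single readout module fed by a Poisson trigger stream uses illustrative rates, not RD13 parameters:

```python
import heapq
import random

# Hypothetical sketch of discrete-event DAQ simulation: triggers arrive
# as a Poisson stream; a readout module with fixed service time drains
# a queue of event fragments. We count fragments served by t_end.

def simulate(trigger_rate=1.0, service_time=0.7, t_end=1000.0, seed=1):
    rng = random.Random(seed)
    events = [(rng.expovariate(trigger_rate), "arrival")]
    queue, busy_until, served = 0, 0.0, 0
    while events:
        t, kind = heapq.heappop(events)
        if t > t_end:
            break
        if kind == "arrival":
            queue += 1
            # schedule the next trigger
            heapq.heappush(events, (t + rng.expovariate(trigger_rate), "arrival"))
        if queue and t >= busy_until:
            # readout module is free: start processing the next fragment
            queue -= 1
            busy_until = t + service_time
            served += 1
            heapq.heappush(events, (busy_until, "done"))
    return served

print(simulate())
```

Scaling this idea up, each DAQ component (buffer, link, event builder) becomes an object that schedules and consumes events on the same queue, which is exactly what a component library in a discrete-event language provides.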

  16. Investigation of the Intra- and Interlaboratory Reproducibility of a Small Scale Standardized Supersaturation and Precipitation Method

    DEFF Research Database (Denmark)

    Plum, Jakob; Madsen, Cecilie M; Teleki, Alexandra

    2017-01-01

...compound available for absorption. However, due to the stochastic nature of nucleation, supersaturating drug delivery systems may lead to inter- and intrapersonal variability. The ability to define a feasible range with respect to the supersaturation level is a crucial factor for a successful formulation... A reproducibility study of felodipine was conducted, after which seven partners contributed with data for three model compounds, aprepitant, felodipine, and fenofibrate, to determine the interlaboratory reproducibility of the SSPM. The first part of the SSPM determines the apparent degrees of supersaturation (a... The same rank order was obtained for the three model compounds using the SSPM (aprepitant > felodipine ≈ fenofibrate). The α-value is dependent on the experimental setup and can be used as a parameter to evaluate the uniformity of the data set. This study indicated that the SSPM was able to obtain the same rank order of the β...

  17. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  18. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  19. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  20. Modeling integrated water user decisions in intermittent supply systems

    Science.gov (United States)

    Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.

    2007-07-01

    We apply systems analysis to estimate household water use in an intermittent supply system, considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing; they also suggest potential market penetration for conservation actions, associated water savings, and subsidies to entice further adoption. We discuss new insights to size, target, and finance conservation.
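The "invest now, cope later" structure of a stochastic program with recourse can be sketched in miniature: pick an infrastructure investment first, then, in each piped-availability scenario, take the cheapest coping action. All numbers, the storage size and tanker action, and the costs below are hypothetical illustrations, not the Amman model's actual data.

```python
# Toy two-stage stochastic program with recourse (hypothetical data).

def expected_cost(invest_cost, storage, scenarios, tariff=1.0, tanker=4.0):
    """First stage: pay invest_cost for `storage` m3 of roof storage.
    Second stage (recourse): piped supply covers what it can, storage
    bridges the gap, and any remaining shortfall is bought from tankers."""
    total = invest_cost
    for prob, piped, demand in scenarios:
        shortfall = max(0.0, demand - piped - storage)
        total += prob * (tariff * min(piped, demand) + tanker * shortfall)
    return total

# Scenarios: (probability, piped availability m3/wk, demand m3/wk)
scenarios = [(0.5, 6.0, 10.0), (0.3, 3.0, 10.0), (0.2, 0.0, 10.0)]

# Enumerate candidate tank sizes and pick the cheapest in expectation.
options = {size: expected_cost(0.8 * size, size, scenarios)
           for size in (0.0, 2.0, 4.0, 8.0)}
best = min(options, key=options.get)
```

With expensive tanker water, the expected-cost minimum shifts toward larger storage, which is the qualitative trade-off the abstract's parametric analyses explore.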

  1. Modeling learning technology systems as business systems

    NARCIS (Netherlands)

    Avgeriou, Paris; Retalis, Symeon; Papaspyrou, Nikolaos

    2003-01-01

    The design of Learning Technology Systems, and the Software Systems that support them, is largely conducted on an intuitive, ad hoc basis, thus resulting in inefficient systems that defectively support the learning process. There is now justifiable, increasing effort in formalizing the engineering

  2. Reproducibility of Quantitative Structural and Physiological MRI Measurements

    Science.gov (United States)

    2017-08-09

    project.org/) and SPSS (IBM Corp., Armonk, NY) for data analysis. Mean and confidence intervals for each measure are found in Tables 1–7. To assess...visits, and was calculated using a two-way mixed model in SPSS. MCV and MRD values closer to 0 are considered to be the most reproducible, and ICC

  3. Bond graph modeling of centrifugal compression systems

    OpenAIRE

    Uddin, Nur; Gravdahl, Jan Tommy

    2015-01-01

    A novel approach to model unsteady fluid dynamics in a compressor network by using a bond graph is presented. The model is intended in particular for compressor control system development. First, we develop a bond graph model of a single compression system. Bond graph modeling offers a different perspective to previous work by modeling the compression system based on energy flow instead of fluid dynamics. Analyzing the bond graph model explains the energy flow during compressor surge. Two pri...

  4. A model management system for combat simulation

    OpenAIRE

    Dolk, Daniel R.

    1986-01-01

    The design and implementation of a model management system to support combat modeling is discussed. Structured modeling is introduced as a formalism for representing mathematical models. A relational information resource dictionary system is developed which can accommodate structured models. An implementation is described. Structured modeling is then compared to Jackson System Development (JSD) as a methodology for facilitating discrete event simulation. JSD is currently better at representin...

  5. Fuzzy modeling and control of rotary inverted pendulum system using LQR technique

    International Nuclear Information System (INIS)

    Fairus, M A; Mohamed, Z; Ahmad, M N

    2013-01-01

    The rotary inverted pendulum (RIP) system is a nonlinear, non-minimum phase, unstable and underactuated system. Controlling such a system is challenging and is considered a benchmark problem in control theory. Prior to designing a controller, equations that represent the behaviour of the RIP system must be developed as accurately as possible without compromising the complexity of the equations. Through the Takagi-Sugeno (T-S) fuzzy modeling technique, the nonlinear system model is transformed into several local linear time-invariant models, which are then blended together to reproduce, or approximate, the nonlinear system model within a local region. A parallel distributed compensation (PDC) based fuzzy controller using the linear quadratic regulator (LQR) technique is designed to control the RIP system. The results show that the designed controller is able to balance the RIP system.
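The LQR step for one such local linear submodel can be sketched as follows; the A and B matrices and the Q, R weightings here are placeholder numbers for a generic unstable pendulum-like linearisation, not the paper's actual RIP dynamics.

```python
# Minimal LQR gain computation for a linearised submodel (placeholder data).
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [10.0, 0.0]])   # toy unstable linearisation
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                  # state weighting
R = np.array([[1.0]])                     # input weighting

P = solve_continuous_are(A, B, Q, R)      # continuous algebraic Riccati eq.
K = np.linalg.inv(R) @ B.T @ P            # optimal state feedback u = -K x

closed_loop = A - B @ K
eigs = np.linalg.eigvals(closed_loop)     # all should lie in the left half-plane
```

In a PDC scheme, one such gain is computed per local model and the controllers are blended with the same fuzzy membership functions as the model.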

  6. Model Driven Development of Data Sensitive Systems

    DEFF Research Database (Denmark)

    Olsen, Petur

    2014-01-01

    storage systems, where the actual values of the data are not relevant for the behavior of the system. For many systems the values are important. For instance the control flow of the system can be dependent on the input values. We call this type of system data sensitive, as the execution is sensitive...... to the values of variables. This thesis strives to improve model-driven development of such data-sensitive systems. This is done by addressing three research questions. In the first we combine state-based modeling and abstract interpretation, in order to ease modeling of data-sensitive systems, while allowing...... efficient model-checking and model-based testing. In the second we develop automatic abstraction learning used together with model learning, in order to allow fully automatic learning of data-sensitive systems to allow learning of larger systems. In the third we develop an approach for modeling and model-based...

  7. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  8. Relevant principal factors affecting the reproducibility of insect primary culture.

    Science.gov (United States)

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.

  9. Physical and mathematical models of communication systems

    International Nuclear Information System (INIS)

    Verkhovskaya, E.P.; Yavorskij, V.V.

    2006-01-01

    The theoretical relations connecting the resources of a communication system with the characteristics of its channels are obtained. A model of such systems from the standpoint of quasi-classical thermodynamics is considered. (author)

  10. Particle Tracking Model (PTM) with Coastal Modeling System (CMS)

    Science.gov (United States)

    2015-11-04

    Coastal Inlets Research Program Particle Tracking Model (PTM) with Coastal Modeling System (CMS) The Particle Tracking Model (PTM) is a Lagrangian...currents and waves. The Coastal Inlets Research Program (CIRP) supports the PTM with the Coastal Modeling System (CMS), which provides coupled wave...and current forcing for PTM simulations. CMS-PTM is implemented in the Surface-water Modeling System, a GUI environment for input development

  11. Reproducing Epidemiologic Research and Ensuring Transparency.

    Science.gov (United States)

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Using the Model Coupling Toolkit to couple earth system models

    Science.gov (United States)

    Warner, J.C.; Perlin, N.; Skyllingstad, E.D.

    2008-01-01

    Continued advances in computational resources are providing the opportunity to operate more sophisticated numerical models. Additionally, there is an increasing demand for multidisciplinary studies that include interactions between different physical processes. Therefore there is a strong desire to develop coupled modeling systems that utilize existing models and allow efficient data exchange and model control. The basic system would entail model "1" running on "M" processors and model "2" running on "N" processors, with efficient exchange of model fields at predetermined synchronization intervals. Here we demonstrate two coupled systems: the coupling of the ocean circulation model Regional Ocean Modeling System (ROMS) to the surface wave model Simulating WAves Nearshore (SWAN), and the coupling of ROMS to the atmospheric model Coupled Ocean Atmosphere Prediction System (COAMPS). Both coupled systems use the Model Coupling Toolkit (MCT) as a mechanism for operation control and inter-model distributed memory transfer of model variables. In this paper we describe requirements and other options for model coupling, explain the MCT library, ROMS, SWAN and COAMPS models, methods for grid decomposition and sparse matrix interpolation, and provide an example from each coupled system. Methods presented in this paper are clearly applicable for coupling of other types of models. © 2008 Elsevier Ltd. All rights reserved.
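The coupling pattern described, two components stepping independently and exchanging fields only at predetermined synchronization intervals, can be sketched in miniature. This imitates the structure only; the component names, fields, and "physics" are invented stand-ins, and none of the actual MCT, ROMS, or SWAN APIs are used.

```python
# Structural sketch of interval-synchronised model coupling (hypothetical).
class Component:
    def __init__(self, name, field):
        self.name, self.field, self.incoming = name, field, None

    def step(self, dt):
        # Stand-in physics: relax toward the last field received at a sync.
        if self.incoming is not None:
            self.field += 0.1 * dt * (self.incoming - self.field)

def run_coupled(model1, model2, t_end, dt, sync_every):
    t, n = 0.0, 0
    while t < t_end:
        model1.step(dt)
        model2.step(dt)
        n += 1
        if n % sync_every == 0:          # synchronization point: swap fields
            model1.incoming, model2.incoming = model2.field, model1.field
        t += dt

ocean = Component("ocean-like", field=10.0)
waves = Component("wave-like", field=2.0)
run_coupled(ocean, waves, t_end=100.0, dt=1.0, sync_every=10)
```

Between syncs each component sees a frozen copy of the other's field, which is exactly why the choice of synchronization interval matters for coupled-system accuracy.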

  13. Modeling vapor pressures of solvent systems with and without a salt effect: An extension of the LSER approach

    International Nuclear Information System (INIS)

    Senol, Aynur

    2015-01-01

    Highlights: • A new polynomial vapor pressure approach for pure solvents is presented. • Solvation models reproduce the vapor pressure data within a 4% mean error. • A concentration-basis vapor pressure model is also implemented on relevant systems. • The reliability of existing models was analyzed using a log-ratio objective function. - Abstract: A new polynomial vapor pressure approach for pure solvents is presented. The model is incorporated into the LSER (linear solvation energy relation) based solvation model framework and checked for consistency in reproducing experimental vapor pressures of salt-containing solvent systems. The two structural forms of the generalized solvation model (Senol, 2013) provide a relatively accurate description of the salting effect on the vapor pressure of (solvent + salt) systems. Equilibrium data spanning the vapor pressures of eighteen (solvent + salt) and three (solvent (1) + solvent (2) + salt) systems were used to establish the basis for the model reliability analysis using a log-ratio objective function. The examined vapor pressure relations reproduce the observed data relatively accurately, yielding overall design factors of 1.084, 1.091 and 1.052 for the integrated property-basis solvation model (USMIP), the reduced property-basis solvation model and the concentration-dependent model, respectively. Both the integrated property-basis and reduced property-basis solvation models were able to simulate satisfactorily the vapor pressure data of a binary solvent mixture involving a salt, yielding an overall mean error of 5.2%.
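The general shape of a polynomial vapor-pressure correlation for a pure solvent can be illustrated by regressing ln P against temperature. The (T, P) points below are synthetic, invented for illustration, not data from the systems studied in the paper, and the quadratic form is only one plausible choice of polynomial.

```python
# Illustrative polynomial ln(P)-vs-T correlation on synthetic data.
import numpy as np

# Synthetic (T [K], P [kPa]) pairs roughly resembling a volatile solvent.
T = np.array([300.0, 310.0, 320.0, 330.0, 340.0, 350.0])
P = np.array([10.0, 16.0, 25.0, 38.0, 56.0, 80.0])

# Fit ln(P) as a quadratic in T, then evaluate the correlation back.
coeffs = np.polyfit(T, np.log(P), deg=2)
P_fit = np.exp(np.polyval(coeffs, T))

mean_err = np.mean(np.abs(P_fit - P) / P)   # relative mean error of the fit
```

A relative mean error of this kind is the natural analogue of the "within a 4% mean error" figure quoted in the highlights.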

  14. Modeling Power Systems as Complex Adaptive Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Malard, Joel M.; Posse, Christian; Gangopadhyaya, Asim; Lu, Ning; Katipamula, Srinivas; Mallow, J V.

    2004-12-30

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that the price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This report explores the state-of-the-art physical analogs for understanding the behavior of some econophysical systems and deriving stable and robust control strategies for using them. We review and discuss applications of some analytic methods based on a thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood. We apply these methods to the question of how power markets can be expected to behave under a variety of conditions.

  15. A discrete model to study reaction-diffusion-mechanics systems.

    Science.gov (United States)

    Weise, Louis D; Nash, Martyn P; Panfilov, Alexander V

    2011-01-01

    This article introduces a discrete reaction-diffusion-mechanics (dRDM) model to study the effects of deformation on reaction-diffusion (RD) processes. The dRDM framework employs a FitzHugh-Nagumo type RD model coupled to a mass-lattice model that undergoes finite deformations. The dRDM model describes a material whose elastic properties are given by a generalized Hooke's law for finite deformations (Seth material). Numerically, the dRDM approach combines a finite difference scheme for the RD equations with a Verlet integration scheme for the equations of the mass-lattice system. Using this framework, results on self-organized pacemaking activity that had previously been found with a continuous RD mechanics model were reproduced. Mechanisms that determine the period of pacemakers and its dependency on the medium size are identified. Finally, it is shown how the drift direction of pacemakers in RDM systems is related to the spatial distribution of deformation and curvature effects.
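The reaction-diffusion half of such a scheme, explicit finite differences for a FitzHugh-Nagumo type model, can be sketched in 1D as below. The mechanics/Verlet half is omitted, and all parameters and the no-flux boundary treatment are generic illustrations, not the dRDM paper's values.

```python
# 1D FitzHugh-Nagumo type model via explicit finite differences (illustrative).
import numpy as np

def fhn_step(u, v, dt, dx, D=1.0, a=0.1, eps=0.01, b=0.5):
    # Diffusion via a central second difference; crude no-flux boundaries.
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    lap[0] = (u[1] - u[0]) / dx**2
    lap[-1] = (u[-2] - u[-1]) / dx**2
    du = D * lap + u * (1 - u) * (u - a) - v   # excitation variable
    dv = eps * (b * u - v)                     # slow recovery variable
    return u + dt * du, v + dt * dv

n = 100
u = np.zeros(n)
v = np.zeros(n)
u[:10] = 1.0                      # initial stimulus at the left edge
for _ in range(2000):             # dt*D/dx^2 = 0.05 keeps diffusion stable
    u, v = fhn_step(u, v, dt=0.05, dx=1.0)
```

Coupling this to a mass lattice would then mean deforming the grid spacing `dx` with the mechanics solution at each step.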

  16. A discrete model to study reaction-diffusion-mechanics systems.

    Directory of Open Access Journals (Sweden)

    Louis D Weise

    Full Text Available This article introduces a discrete reaction-diffusion-mechanics (dRDM) model to study the effects of deformation on reaction-diffusion (RD) processes. The dRDM framework employs a FitzHugh-Nagumo type RD model coupled to a mass-lattice model that undergoes finite deformations. The dRDM model describes a material whose elastic properties are given by a generalized Hooke's law for finite deformations (Seth material). Numerically, the dRDM approach combines a finite difference scheme for the RD equations with a Verlet integration scheme for the equations of the mass-lattice system. Using this framework, results on self-organized pacemaking activity that had previously been found with a continuous RD mechanics model were reproduced. Mechanisms that determine the period of pacemakers and its dependency on the medium size are identified. Finally, it is shown how the drift direction of pacemakers in RDM systems is related to the spatial distribution of deformation and curvature effects.

  17. The radionuclide migration model in river system

    International Nuclear Information System (INIS)

    Zhukova, O.M.; Shiryaeva, N.M.; Myshkina, M.K.; Shagalova, Eh.D.; Denisova, V.V.; Skurat, V.V.

    2001-01-01

    A model of radionuclide migration in river systems is proposed, based on the principle of the compartmental model under hydraulically stationary and chemically equilibrium conditions of radionuclide interaction in the water-dredge and water-sediment systems. Different conditions of radioactive contamination entering the river system were considered. The model was verified against radiation monitoring data for the Iput' river.
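The compartmental principle such models rest on can be illustrated with a minimal two-compartment (water-sediment) sketch with first-order exchange and radioactive decay. The rate constants, time step, and initial activity are hypothetical, not the Iput' river calibration.

```python
# Minimal two-compartment water-sediment model with decay (hypothetical rates).

def simulate(c_water, c_sed, k_ws=0.05, k_sw=0.01, lam=0.001,
             dt=1.0, steps=1000):
    """Explicit Euler for two coupled compartments: water loses activity
    to sediments at rate k_ws, regains it at k_sw, and both compartments
    decay with radioactive decay constant lam."""
    for _ in range(steps):
        flux = k_ws * c_water - k_sw * c_sed   # net water -> sediment flux
        c_water += dt * (-flux - lam * c_water)
        c_sed += dt * (flux - lam * c_sed)
    return c_water, c_sed

w, s = simulate(100.0, 0.0)
```

Two checks follow directly from the structure: total activity decays exactly as (1 - lam*dt) per step, and the long-run sediment/water ratio approaches k_ws/k_sw.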

  18. Model Information Exchange System (MIXS).

    Science.gov (United States)

    2013-08-01

    Many travel demand forecast models operate at state, regional, and local levels. While they share the same physical network in overlapping geographic areas, they use different and uncoordinated modeling networks. This creates difficulties for models ...

  19. Induction of a chloracne phenotype in an epidermal equivalent model by 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) is dependent on aryl hydrocarbon receptor activation and is not reproduced by aryl hydrocarbon receptor knock down.

    Science.gov (United States)

    Forrester, Alison R; Elias, Martina S; Woodward, Emma L; Graham, Mark; Williams, Faith M; Reynolds, Nick J

    2014-01-01

    2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) is a potent activator of the aryl hydrocarbon receptor (AhR) and causes chloracne in humans. The pathogenesis and role of AhR in chloracne remains incompletely understood. To elucidate the mechanisms contributing to the development of the chloracne-like phenotype in a human epidermal equivalent model and identify potential biomarkers. Using primary normal human epidermal keratinocytes (NHEK), we studied AhR activation by XRE-luciferase, AhR degradation and CYP1A1 induction. We treated epidermal equivalents with high affinity TCDD or two non-chloracnegens: β-naphthoflavone (β-NF) and 2-(1'H-indole-3'-carbonyl)-thiazole-4-carboxylic acid methyl ester (ITE). Using Western blotting and immunochemistry for filaggrin (FLG), involucrin (INV) and transglutaminase-1 (TGM-1), we compared the effects of the ligands on keratinocyte differentiation and development of the chloracne-like phenotype by H&E. In NHEKs, activation of an XRE-luciferase and CYP1A1 protein induction correlated with ligand binding affinity: TCDD>β-NF>ITE. AhR degradation was induced by all ligands. In epidermal equivalents, TCDD induced a chloracne-like phenotype, whereas β-NF or ITE did not. All three ligands induced involucrin and TGM-1 protein expression in epidermal equivalents whereas FLG protein expression decreased following treatment with TCDD and β-NF. Inhibition of AhR by α-NF blocked TCDD-induced AhR activation in NHEKs and blocked phenotypic changes in epidermal equivalents; however, AhR knock down did not reproduce the phenotype. Ligand-induced CYP1A1 and AhR degradation did not correlate with their chloracnegenic potential, indicating that neither CYP1A1 nor AhR are suitable biomarkers. Mechanistic studies showed that the TCDD-induced chloracne-like phenotype depends on AhR activation whereas AhR knock down did not appear sufficient to induce the phenotype. Copyright © 2013 Japanese Society for Investigative Dermatology. Published by Elsevier

  20. LHC Orbit Correction Reproducibility and Related Machine Protection

    CERN Document Server

    Baer, T; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure an adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the machine in a vulnerable state, a novel software-based interlock system for orbit corrector currents was developed. In this paper, the principle of the new interlock system is described and the reproducibility of the LHC orbit correction is discussed against the background of this system.

  1. Towards Modelling of Hybrid Systems

    DEFF Research Database (Denmark)

    Wisniewski, Rafal

    2006-01-01

    system consists of a number of dynamical systems that are glued together according to information encoded in the discrete part of the system. We develop a definition of a hybrid system as a functor from the category generated by a transition system to the category of directed topological spaces. Its...

  2. Reproducibility of central lumbar vertebral BMD

    International Nuclear Information System (INIS)

    Chan, F.; Pocock, N.; Griffiths, M.; Majerovic, Y.; Freund, J.

    1997-01-01

    Full text: Lumbar vertebral bone mineral density (BMD) using dual X-ray absorptiometry (DXA) has generally been calculated from a region of interest which includes the entire vertebral body. Although this region excludes part of the transverse processes, it does include the outer cortical shell of the vertebra. Recent software has been devised to calculate BMD in a central vertebral region of interest which excludes the outer cortical envelope. Theoretically this area may be more sensitive to detecting osteoporosis which affects trabecular bone to a greater extent than cortical bone. Apart from the sensitivity of BMD estimation, the reproducibility of any measurement is important owing to the slow rate of change of bone mass. We have evaluated the reproducibility of this new vertebral region of interest in 23 women who had duplicate lumbar spine DXA scans performed on the same day. The patients were repositioned between each measurement. Central vertebral analysis was performed for L2-L4 and the reproducibility of area, bone mineral content (BMC) and BMD calculated as the coefficient of variation; these values were compared with those from conventional analysis. Thus we have shown that the reproducibility of the central BMD is comparable to the conventional analysis which is essential if this technique is to provide any additional clinical data. The reasons for the decrease in reproducibility of the area and hence BMC requires further investigation
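The precision statistic behind duplicate-scan studies of this kind, the root-mean-square coefficient of variation across paired measurements, can be sketched as follows. The paired BMD values are invented for illustration, not the study's data.

```python
# RMS coefficient of variation for duplicate measurements (invented data).
import math

def rms_cv(pairs):
    """RMS CV (%) for duplicate measurements: per pair, sd/mean with the
    duplicate-pair estimate sd = |a - b| / sqrt(2), pooled as a root mean
    square, a common precision metric for densitometry."""
    terms = []
    for a, b in pairs:
        mean = (a + b) / 2.0
        sd = abs(a - b) / math.sqrt(2.0)
        terms.append((sd / mean) ** 2)
    return 100.0 * math.sqrt(sum(terms) / len(terms))

# Hypothetical duplicate BMD scans (g/cm^2) with repositioning in between.
pairs = [(1.02, 1.04), (0.95, 0.94), (1.10, 1.13)]
cv = rms_cv(pairs)
```

A smaller CV means a smaller real change can be distinguished from measurement noise, which is why reproducibility matters for slowly changing bone mass.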

  3. Modelling, analysis and optimisation of energy systems on offshore platforms

    DEFF Research Database (Denmark)

    Nguyen, Tuong-Van

    of oil and gas facilities, (ii) the means to reduce their performance losses, and (iii) the systematic design of future plants. This work builds upon a combination of modelling tools, performance evaluation methods and multi-objective optimisation routines to reproduce the behaviour of five offshore......Nowadays, the offshore production of oil and gas requires on-site processing, which includes operations such as separation, compression and purification. The offshore system undergoes variations of the petroleum production rates over the field life – it is therefore operated far from its nominal...... with the combustion, pressure-change and cooling operations, but these processes are ranked differently depending on the plant layout and on the field production stage. The most promising improvements consist of introducing a multi-level production manifold, avoiding anti-surge gas recirculation, installing a waste...

  4. An information theory model for dissipation in open quantum systems

    Science.gov (United States)

    Rogers, David M.

    2017-08-01

    This work presents a general model for open quantum systems using an information game along the lines of Jaynes’ original work. It is shown how an energy based reweighting of propagators provides a novel moment generating function at each time point in the process. Derivatives of the generating function give moments of the time derivatives of observables. Aside from the mathematically helpful properties, the ansatz reproduces key physics of stochastic quantum processes. At high temperature, the average density matrix follows the Caldeira-Leggett equation. Its associated Langevin equation clearly demonstrates the emergence of dissipation and decoherence time scales, as well as an additional diffusion due to quantum confinement. A consistent interpretation of these results is that decoherence and wavefunction collapse during measurement are directly related to the degree of environmental noise, and thus occur because of subjective uncertainty of an observer.

  5. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M A; Fink, D; Hua, Q; Jacobsen, G E; Lawson, E M; Smith, A M; Tuniz, C [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. 
Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
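The ratio measurement described above, radioisotope counts normalised to the integrated stable-isotope beam current and then to a standard, reduces to simple arithmetic. The counts, currents, and times below are illustrative numbers only, not ANTARES data.

```python
# Sketch of an AMS isotopic-ratio calculation (illustrative numbers).

def isotope_ratio(counts_rare, current_stable_uA, live_time_s):
    """Rare-isotope counts per unit of stable-isotope charge (arb. units)."""
    charge = current_stable_uA * live_time_s   # integrated beam current
    return counts_rare / charge

def normalised_ratio(sample, standard):
    """Sample ratio expressed relative to the standard's ratio."""
    return isotope_ratio(*sample) / isotope_ratio(*standard)

# (rare-isotope counts, stable-isotope current in uA, counting time in s)
sample   = (25_000, 10.0, 600.0)
standard = (50_000, 10.0, 600.0)
r = normalised_ratio(sample, standard)
```

Counting statistics set the floor on precision: with 25,000 counts the relative statistical uncertainty is about 1/sqrt(25000), roughly 0.6%, consistent with the ~0.5% precision figure quoted above.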

  6. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. 
Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
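The dependence of measurement precision on counting statistics can be made concrete with a small sketch (the function names and numbers are illustrative, not ANTARES software): the relative uncertainty of N Poisson counts is 1/sqrt(N), so 0.5% precision requires on the order of 40,000 counts of the radioisotope.

```python
import math

def counts_for_precision(rel_precision):
    """Counts N such that the Poisson relative uncertainty 1/sqrt(N)
    equals rel_precision (rounded to the nearest integer)."""
    return round(rel_precision ** -2)

def isotope_ratio(radio_counts, live_time_s, stable_current_a,
                  charge_e=1.602176634e-19):
    """Ratio of radioisotope count rate to stable-isotope ion rate.

    The stable-isotope rate is derived from its measured beam current,
    assuming singly charged ions (a simplification)."""
    radio_rate = radio_counts / live_time_s        # e.g. 14C atoms per second
    stable_rate = stable_current_a / charge_e      # e.g. 13C ions per second
    return radio_rate / stable_rate

# 0.5% counting-statistics precision requires about 40,000 counts
assert counts_for_precision(0.005) == 40000
```

With illustrative values (40,000 counts in an hour against a 1 microampere stable beam), the resulting ratio lands near the 1e-12 scale typical of 14C/13C measurements.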

  7. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model...... Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model...

  8. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    Full Text Available The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built in to the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, along with a Git repository, are the primary reproducibility outputs of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
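The "single serialized Python object under Git control" pattern described above can be sketched with the standard library alone. The Experiment class and its fields below are invented placeholders, and the Git calls mirror ReproPhylo's approach only loosely:

```python
import os
import pickle
import shutil
import subprocess
import tempfile

class Experiment:
    """Toy stand-in for a workflow object carrying provenance (hypothetical fields)."""
    def __init__(self, alignment, tree_method):
        self.alignment = alignment
        self.tree_method = tree_method
        self.history = []                 # provenance: one entry per analysis step

    def run_step(self, name):
        self.history.append(name)

def save_and_version(exp, path):
    """Serialize the whole experiment to one file; commit it if Git is available."""
    with open(path, "wb") as fh:
        pickle.dump(exp, fh)
    if shutil.which("git"):               # version the snapshot alongside the analysis
        subprocess.run(["git", "add", path], check=False, capture_output=True)
        subprocess.run(["git", "commit", "-m", f"snapshot {path}"],
                       check=False, capture_output=True)

exp = Experiment("aln.fasta", "raxml")
exp.run_step("align")
exp.run_step("infer_tree")

path = os.path.join(tempfile.mkdtemp(), "experiment.pkl")
save_and_version(exp, path)

with open(path, "rb") as fh:
    restored = pickle.load(fh)
assert restored.history == ["align", "infer_tree"]
```

In the real tool the serialized object also records explicit environment information, and the Git bookkeeping happens automatically rather than through manual commands as sketched here.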

  9. CFD Modeling of Flow and Ion Exchange Kinetics in a Rotating Bed Reactor System

    DEFF Research Database (Denmark)

    Larsson, Hilde Kristina; Schjøtt Andersen, Patrick Alexander; Byström, Emil

    2017-01-01

    A rotating bed reactor (RBR) has been modeled using computational fluid dynamics (CFD). The flow pattern in the RBR was investigated and the flow through the porous material in it was quantified. A simplified geometry representing the more complex RBR geometry was introduced and the simplified...... model was able to reproduce the main characteristics of the flow. Alternating reactor shapes were investigated, and it was concluded that the use of baffles has a very large impact on the flows through the porous material. The simulations suggested, therefore, that even faster reaction rates could...... be achieved by making the baffles deeper. Two-phase simulations were performed, which managed to reproduce the deflection of the gas–liquid interface in an unbaffled system. A chemical reaction was implemented in the model, describing the ion-exchange phenomena in the porous material using four different...

  10. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.

    Science.gov (United States)

    Añel, Juan A.

    2017-03-01

    Nowadays, the majority of the scientific community is not aware of the risks and problems associated with inadequate use of computer systems for research, mostly regarding the reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and by insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application, or ignorance, of copyright laws can have undesirable effects on access to aspects of great importance for the design of experiments, and therefore for the interpretation of results. Plain Language Summary: This article highlights several important issues for ensuring the scientific reproducibility of results within the current scientific framework, going beyond simple documentation. Several specific examples are discussed in the field of hydrological modeling.

  11. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, a Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control; extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models map readily onto general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event modelling is a pragmatic tool for industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to examine its timing: from transport and transmission times measured on the spot, graphics are obtained showing the average time for the transport activity, using the parameter sets of the finished products.
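The token-game semantics underlying such Petri-net models can be sketched in a few lines; the two-transition robotic example below is invented for illustration:

```python
# Minimal Petri-net token game: places hold tokens, and a transition fires
# when every input place has a token, moving tokens from inputs to outputs.
def enabled(marking, transition):
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

# Hypothetical local-control level of a robotic cell: "load" consumes an
# idle robot and a waiting part; "unload" frees the robot again.
load = (["robot_idle", "part_waiting"], ["robot_busy"])
unload = (["robot_busy"], ["robot_idle", "part_done"])

m0 = {"robot_idle": 1, "part_waiting": 2}
m1 = fire(m0, load)            # the robot picks up the first part
assert m1 == {"robot_idle": 0, "part_waiting": 1, "robot_busy": 1}
assert not enabled(m1, load)   # no idle robot left, so "load" is disabled
m2 = fire(m1, unload)
assert m2["part_done"] == 1 and enabled(m2, load)
```

A timed extension would attach a delay to each transition; the firing rule itself stays the same.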

  12. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  13. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that demands for reproducible experiments in the early years of LENR research were premature; in fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. It is clear, however, that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  14. Reproducibility, controllability, and optimization of LENR experiments

    International Nuclear Information System (INIS)

    Nagel, David J.

    2006-01-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that demands for reproducible experiments in the early years of LENR research were premature; in fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments yield significant returns, even for successful research programs. It is clear, however, that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR

  15. Undefined cellulase formulations hinder scientific reproducibility.

    Science.gov (United States)

    Himmel, Michael E; Abbas, Charles A; Baker, John O; Bayer, Edward A; Bomble, Yannick J; Brunecky, Roman; Chen, Xiaowen; Felby, Claus; Jeoh, Tina; Kumar, Rajeev; McCleary, Barry V; Pletschke, Brett I; Tucker, Melvin P; Wyman, Charles E; Decker, Stephen R

    2017-01-01

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today with work done in the future, where, for example, such preparations may not be available. Experimental reproducibility, a critical tenet of science publishing, is therefore endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  16. Evaluation of AirGIS: a GIS-based air pollution and human exposure modelling system

    DEFF Research Database (Denmark)

    Ketzel, Matthias; Berkowicz, Ruwim; Hvidberg, Martin

    2011-01-01

    This study describes in brief the latest extensions of the Danish Geographic Information System (GIS)-based air pollution and human exposure modelling system (AirGIS), which has been developed in Denmark since 2001 and gives results of an evaluation with measured air pollution data. The system...... shows, in general, a good performance for both long-term averages (annual and monthly averages), short-term averages (hourly and daily) as well as when reproducing spatial variation in air pollution concentrations. Some shortcomings and future perspectives of the system are discussed too....

  17. Audiovisual biofeedback improves diaphragm motion reproducibility in MRI

    Science.gov (United States)

    Kim, Taeho; Pollock, Sean; Lee, Danny; O’Brien, Ricky; Keall, Paul

    2012-01-01

    Purpose: In lung radiotherapy, variations in cycle-to-cycle breathing result in four-dimensional computed tomography imaging artifacts, leading to inaccurate beam coverage and tumor targeting. In previous studies, the effect of audiovisual (AV) biofeedback on the external respiratory signal reproducibility has been investigated, but the internal anatomy motion has not been fully studied. The aim of this study is to test the hypothesis that AV biofeedback improves diaphragm motion reproducibility of internal anatomy using magnetic resonance imaging (MRI). Methods: To test the hypothesis, 15 healthy human subjects were enrolled in an ethics-approved AV biofeedback study consisting of two imaging sessions spaced ∼1 week apart. Within each session MR images were acquired under free breathing and AV biofeedback conditions. The respiratory signal to the AV biofeedback system utilized optical monitoring of an external marker placed on the abdomen. Synchronously, serial thoracic 2D MR images were obtained to measure the diaphragm motion using a fast gradient-recalled-echo MR pulse sequence in both coronal and sagittal planes. The improvement in the diaphragm motion reproducibility using the AV biofeedback system was quantified by comparing cycle-to-cycle variability in displacement, respiratory period, and baseline drift. Additionally, the variation in improvement between the two sessions was also quantified. Results: The average root mean square error (RMSE) of diaphragm cycle-to-cycle displacement was reduced from 2.6 mm with free breathing to 1.6 mm (38% reduction) with the implementation of AV biofeedback (p-value = 0.012). The diaphragm motion reproducibility improvements with AV biofeedback were consistent with the abdominal motion reproducibility that was observed from the external marker motion variation. Conclusions: This study was the first to investigate the potential of AV biofeedback to improve the motion
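The cycle-to-cycle variability metric used above can be sketched as follows; the definition (RMS deviation of per-cycle measurements about their mean) is an assumed reading of the abstract, and the displacement values are made up:

```python
import math

def cycle_to_cycle_rmse(values):
    """Root-mean-square deviation of per-cycle measurements about their mean,
    one way to score breathing reproducibility (assumed here, not necessarily
    the paper's exact definition)."""
    mean = sum(values) / len(values)
    return math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))

# Illustrative (made-up) per-cycle diaphragm displacements in mm
free_breathing = [14.0, 18.5, 11.0, 16.5, 20.0]
with_biofeedback = [15.0, 16.5, 14.5, 16.0, 15.5]

# Guided breathing should tighten the spread of per-cycle displacements
assert cycle_to_cycle_rmse(with_biofeedback) < cycle_to_cycle_rmse(free_breathing)
```

The same function applies unchanged to respiratory period or baseline position, the other two variability measures the study compared.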

  18. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility of natural head position with time on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies confirmed this reproducibility with time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  19. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    We could synthesize Ag nanocubes highly reproducibly by conducting the polyol synthesis with an HCl etchant in the dark, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed and, consequently, the selective self-nucleation of Ag single crystals and their selective growth reaction could be promoted. By contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was conducted under light, owing to the photoreduction of AgCl to Ag.

  20. Agent oriented modeling of business information systems

    OpenAIRE

    Vymetal, Dominik

    2009-01-01

    Enterprise modeling is an abstract definition of the processes running in an enterprise, using process, value, data and resource models. There are two perspectives on business modeling: the process perspective and the value chain perspective. Both have advantages and disadvantages. This paper proposes a combination of both perspectives into one generic model. The model also takes the social part of the enterprise system into consideration and pays attention to disturbances influencing the enterprise system....

  1. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

    There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users carry out the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input, (2) the iModel Tool, a platform for users to upload their own models to compose, and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a good starting point for beginners, and, for more advanced purposes, users will be able to access and employ models from the BioModels Database as well.
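Independently of PathCase-SB's own implementation, the core merge step in model composition can be illustrated with a deliberately simplified, standard-library-only sketch that unions species by identifier. Real SBML composition must also reconcile namespaces, compartments, reactions and units; this shows only the merge-by-identifier idea:

```python
import xml.etree.ElementTree as ET

def merge_species(model_a, model_b):
    """Naive composition step: union the <species> of two SBML-like models by id.

    Deliberately ignores SBML namespaces and every element other than species;
    an illustration of the idea, not a conformant SBML merger."""
    root = ET.fromstring(model_a)
    seen = {s.get("id") for s in root.iter("species")}
    target = root.find(".//listOfSpecies")
    for s in ET.fromstring(model_b).iter("species"):
        if s.get("id") not in seen:      # keep shared species only once
            target.append(s)
            seen.add(s.get("id"))
    return root

A = "<model><listOfSpecies><species id='glucose'/><species id='atp'/></listOfSpecies></model>"
B = "<model><listOfSpecies><species id='atp'/><species id='lactate'/></listOfSpecies></model>"

merged = merge_species(A, B)
ids = [s.get("id") for s in merged.iter("species")]
assert ids == ["glucose", "atp", "lactate"]   # 'atp' is kept once
```

Resolving clashes where two models define the same id with different meanings is the hard part of composition, and is exactly what an interactive tool like the one described helps users manage.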

  2. Service systems concepts, modeling, and programming

    CERN Document Server

    Cardoso, Jorge; Poels, Geert

    2014-01-01

    This SpringerBrief explores the internal workings of service systems. The authors propose a lightweight semantic model for an effective representation to capture the essence of service systems. Key topics include modeling frameworks, service descriptions and linked data, creating service instances, tool support, and applications in enterprises.Previous books on service system modeling and various streams of scientific developments used an external perspective to describe how systems can be integrated. This brief introduces the concept of white-box service system modeling as an approach to mo

  3. Modelling a data acquisition system

    International Nuclear Information System (INIS)

    Green, P.W.

    1986-01-01

    A data acquisition system to be run on a Data General ECLIPSE computer has been completely designed and developed using a VAX 11/780. This required that many of the features of the RDOS operating system be simulated on the VAX. Advantages and disadvantages of this approach are discussed, with particular regard to transportability of the system among different machines/operating systems, and the effect of the approach on various design decisions

  4. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  5. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  6. Modeling, Control and Coordination of Helicopter Systems

    CERN Document Server

    Ren, Beibei; Chen, Chang; Fua, Cheng-Heng; Lee, Tong Heng

    2012-01-01

    Modeling, Control and Coordination of Helicopter Systems provides a comprehensive treatment of helicopter systems, ranging from related nonlinear flight dynamic modeling and stability analysis to advanced control design for single helicopter systems, and also covers issues related to the coordination and formation control of multiple helicopter systems to achieve high performance tasks. Ensuring stability in helicopter flight is a challenging problem for nonlinear control design and development. This book is a valuable reference on modeling, control and coordination of helicopter systems,providing readers with practical solutions for the problems that still plague helicopter system design and implementation. Readers will gain a complete picture of helicopters at the systems level, as well as a better understanding of the technical intricacies involved. This book also: Presents a complete picture of modeling, control and coordination for helicopter systems Provides a modeling platform for a general class of ro...

  7. Grey Box Modelling of Hydrological Systems

    DEFF Research Database (Denmark)

    Thordarson, Fannar Ørn

    of two papers where the stochastic differential equation based model is used for sewer runoff from a drainage system. A simple model is used to describe a complex rainfall-runoff process in a catchment, but the stochastic part of the system is formulated to include the increasing uncertainty when...... rainwater flows through the system, as well as describe the lower limit of the uncertainty when the flow approaches zero. The first paper demonstrates in detail the grey box model and all related transformations required to obtain a feasible model for the sewer runoff. In the last paper this model is used......The main topic of the thesis is grey box modelling of hydrologic systems, as well as formulation and assessment of their embedded uncertainties. Grey box model is a combination of a white box model, a physically-based model that is traditionally formulated using deterministic ordinary differential...
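The state-dependent diffusion idea described above can be sketched with an Euler-Maruyama simulation; the model form, parameters, and rain series below are invented for illustration, not taken from the thesis.

```python
import math
import random

def euler_maruyama(x0, rain, dt, k=0.2, sigma=0.1, seed=1):
    """Simulate a toy grey-box runoff model  dX = (rain - k*X) dt + sigma*X dW.

    The state-proportional diffusion term is one simple way to encode the idea
    that uncertainty grows with flow and vanishes as the flow approaches zero
    (an assumed illustration, not the thesis's actual formulation)."""
    rng = random.Random(seed)
    x, path = x0, [x0]
    for r in rain:
        dw = rng.gauss(0.0, math.sqrt(dt))            # Brownian increment
        x = max(0.0, x + (r - k * x) * dt + sigma * x * dw)
        path.append(x)
    return path

# Twenty steps of rain followed by thirty dry steps
path = euler_maruyama(x0=0.0, rain=[1.0] * 20 + [0.0] * 30, dt=0.5)
assert path[20] > path[0]     # runoff builds up during rainfall
assert path[-1] < path[20]    # and recedes after the rain stops
```

Because the diffusion is proportional to the state, simulated trajectories are nearly deterministic at low flow and increasingly noisy at high flow, matching the uncertainty structure the abstract describes.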

  8. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (4 l) were used, monitoring O2 consumption and CO2 emissions and characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. Intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot-water-soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fraction by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for lignin degradation, which was less pronounced than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.
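The coefficient-of-variation criterion used to judge reproducibility across the six replicates can be computed as follows; the replicate values are invented for illustration.

```python
import statistics

def coefficient_of_variation(replicates):
    """CV (%) = 100 * sample standard deviation / mean, the usual
    reproducibility score across parallel reactor runs."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Illustrative (made-up) total-organic-matter losses (%) in six 4-l reactors
tom_loss = [46.0, 44.5, 47.2, 45.1, 46.8, 46.4]

cv = coefficient_of_variation(tom_loss)
assert cv < 19.0   # within the reproducibility bound reported in the abstract
```

Applying the same function to each monitored parameter (O2 consumption, CO2 emission, biochemical fractions) gives the per-parameter CVs the abstract summarizes.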

  9. Compositional Modelling of Stochastic Hybrid Systems

    NARCIS (Netherlands)

    Strubbe, S.N.

    2005-01-01

    In this thesis we present a modelling framework for compositional modelling of stochastic hybrid systems. Hybrid systems consist of a combination of continuous and discrete dynamics. The state space of a hybrid system is hybrid in the sense that it consists of a continuous component and a discrete

  10. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour...
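The derivation of a state transition system from a circuit diagram can be illustrated with a toy boolean relay model; the two-relay "circuit" and its input names below are invented for illustration, not taken from the paper.

```python
# Each relay's next state is a boolean function of the inputs and the current
# relay states, read off the circuit diagram; iterating the function yields
# the state transition system that can then be verified.

def step(state, inputs):
    return {
        # relay A picks up when the button is pressed and the track is free
        "A": inputs["button"] and not inputs["track_occupied"],
        # relay B latches: once picked up via A, it holds itself until released
        "B": state["A"] or (state["B"] and not inputs["release"]),
    }

def trace_of(initial, input_vectors):
    """Enumerate the states visited under a given input sequence."""
    states, s = [initial], initial
    for inp in input_vectors:
        s = step(s, inp)
        states.append(s)
    return states

s0 = {"A": False, "B": False}
trace = trace_of(s0, [
    {"button": True,  "track_occupied": False, "release": False},  # A picks up
    {"button": False, "track_occupied": False, "release": False},  # B latches via A
    {"button": False, "track_occupied": False, "release": True},   # B drops out
])
assert trace[1]["A"] and not trace[1]["B"]
assert trace[2]["B"] and not trace[2]["A"]
assert not trace[3]["B"]
```

Exhaustively exploring all reachable states of such a function, rather than one input sequence, is what a model checker does with the derived transition system.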

  11. Modeling and simulation of systems using Matlab and Simulink

    CERN Document Server

    Chaturvedi, Devendra K

    2009-01-01

    Introduction to Systems: System; Classification of Systems; Linear Systems; Time-Varying vs. Time-Invariant Systems; Lumped vs. Distributed Parameter Systems; Continuous- and Discrete-Time Systems; Deterministic vs. Stochastic Systems; Hard and Soft Systems; Analysis of Systems; Synthesis of Systems; Introduction to System Philosophy; System Thinking; Large and Complex Applied System Engineering: A Generic Modeling. Systems Modeling: Introduction; Need of System Modeling; Modeling Methods for Complex Systems; Classification of Models; Characteristics of Models; Modeling. Mathematical Modeling of Physical Systems: Formulation of State Space Model of Systems; Physical Systems Theory; System Components and Interconnections; Computation of Parameters of a Component; Single Port and Multiport Systems; Techniques of System Analysis; Basics of Linear Graph Theoretic Approach; Formulation of System Model for Conceptual System; Formulation of System Model for Physical Systems; Topological Restrictions; Development of State Model of Degenerative System; Solution of Stat...

  12. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described
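The defining property the abstract refers to can be stated explicitly; the following is the standard textbook formulation, not specific to this paper.

```latex
% A reproducing kernel Hilbert space H of functions on a set T has a kernel
% K : T x T -> R with K(\cdot, t) \in H for every t, such that point
% evaluation is an inner product (the reproducing property):
f(t) = \langle f,\; K(\cdot, t) \rangle_{H}, \qquad f \in H,\; t \in T.
% For a zero-mean Gaussian process (W_t) with covariance
K(s, t) = \mathbb{E}\left[ W_s W_t \right],
% the RKHS attached to the Gaussian prior is the completion of finite sums
% \sum_i a_i K(\cdot, t_i) under the inner product
\Bigl\langle \sum_i a_i K(\cdot, s_i),\; \sum_j b_j K(\cdot, t_j) \Bigr\rangle_{H}
  = \sum_{i,j} a_i b_j\, K(s_i, t_j).
```

The contraction rates for posteriors mentioned in the abstract are then expressed in terms of how well the true parameter can be approximated by elements of this space.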

  13. Reproducibility of the results in ultrasonic testing

    International Nuclear Information System (INIS)

    Chalaye, M.; Launay, J.P.; Thomas, A.

    1980-12-01

    This memorandum reports on the conclusions of the tests carried out in order to evaluate the reproducibility of ultrasonic tests made on welded joints. FRAMATOME have started a study to assess the dispersion of results afforded by the test line and to characterize its behaviour. The tests covered sensors and ultrasonic generators said to be identical to each other (same commercial batch) [fr

  14. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  15. Reproducibility, Controllability, and Optimization of Lenr Experiments

    Science.gov (United States)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  16. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; 
Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. 
Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  17. Second Order Kinetic Modeling of Headspace Solid Phase Microextraction of Flavors Released from Selected Food Model Systems

    Directory of Open Access Journals (Sweden)

    Jiyuan Zhang

    2014-09-01

    Full Text Available The application of headspace solid-phase microextraction (HS-SPME) has been widely used in various fields as a simple and versatile method, yet quantification with it remains challenging. In order to improve reproducibility in quantification, a mathematical model with its roots in psychological modeling and chemical reactor modeling was developed, describing the kinetic behavior of aroma-active compounds extracted by SPME from two different food model systems, i.e., a semi-solid food and a liquid food. The model accounted for both adsorption and release of the analytes from the SPME fiber, which occurred simultaneously but were counter-directed. The model had four parameters, and their estimated values were found to be more reproducible than the direct measurement of the compounds themselves by instrumental analysis. With the relative standard deviation (RSD) of each parameter less than 5% and the root mean square error (RMSE) less than 0.15, the model proved robust in estimating the release of a wide range of low-molecular-weight acetates at three environmental temperatures, i.e., 30, 40 and 60 °C. More insight into SPME behavior regarding small-molecule analytes was also obtained through the kinetic parameters and the model itself.
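    The abstract does not give the model's equations, so the following is only a hedged illustration: simultaneous, counter-directed adsorption and release can be sketched as a first-order balance dC/dt = k_ads·(C_max − C) − k_rel·C, where k_ads, k_rel and C_max are invented values, not the paper's four fitted parameters.

```python
def simulate_fiber_loading(k_ads, k_rel, c_max, t_end=60.0, dt=0.001):
    """Euler integration of dC/dt = k_ads*(c_max - c) - k_rel*c:
    adsorption onto the SPME fiber competing with release from it."""
    c = 0.0
    trace = [c]
    for _ in range(int(t_end / dt)):
        c += dt * (k_ads * (c_max - c) - k_rel * c)
        trace.append(c)
    return trace

# Loading rises monotonically toward the analytic steady state
# k_ads * c_max / (k_ads + k_rel) = 0.75 for these illustrative rates.
loading = simulate_fiber_loading(k_ads=0.3, k_rel=0.1, c_max=1.0)
```

    The fitted rate constants, rather than a single raw peak area, are what such a model would report, which is consistent with the abstract's claim that the parameters are more reproducible than direct instrumental readings.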

  18. Reproducibility of Computer-Aided Detection Marks in Digital Mammography

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Kim, Sun Mi; Im, Jung Gi; Cha, Joo Hee

    2007-01-01

    To evaluate the performance and reproducibility of a computer-aided detection (CAD) system in mediolateral oblique (MLO) digital mammograms taken serially, without release of breast compression. A CAD system was applied preoperatively to full-field digital mammograms of two MLO views taken without release of breast compression in 82 patients (age range: 33-83 years; mean age: 49 years) with previously diagnosed breast cancers. The total number of visible lesion components in the 82 patients was 101: 66 masses and 35 microcalcifications. We analyzed the sensitivity and reproducibility of the CAD marks. The sensitivity of the CAD system for the first MLO views was 71% (47/66) for masses and 80% (28/35) for microcalcifications. The sensitivity for the second MLO views was 68% (45/66) for masses and 17% (6/35) for microcalcifications. In 84 ipsilateral serial MLO image sets (two patients had bilateral cancers), identical images, regardless of the existence of CAD marks, were obtained in 35% (29/84), and identical images with CAD marks in 29% (23/78). For contralateral MLO images, identical images regardless of the existence of CAD marks were obtained in 65% (52/80), and identical images with CAD marks in 28% (11/39). The reproducibility of CAD marks for the true positive masses in serial MLO views was 84% (42/50), and that for the true positive microcalcifications was 0% (0/34). The CAD system in digital mammograms showed a high sensitivity for detecting masses and microcalcifications. However, the reproducibility of microcalcification marks was very low in MLO views taken serially without release of breast compression. Minute positional changes and patient movement can alter the images and have a significant effect on the algorithm utilized by the CAD system for detecting microcalcifications.

  19. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    Science.gov (United States)

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  20. System Dynamics Modeling of Multipurpose Reservoir Operation

    Directory of Open Access Journals (Sweden)

    Ebrahim Momeni

    2006-03-01

    Full Text Available System dynamics, a feedback-based, object-oriented simulation approach, not only represents complex dynamic systems in a realistic way but also allows the involvement of end users in model development, increasing their confidence in the modeling process. The increased speed of model development, the possibility of group model development, the effective communication of model results, and the trust developed in the model due to user participation are the main strengths of this approach. The ease of model modification in response to changes in the system and the ability to perform sensitivity analysis make this approach more attractive than classical systems analysis techniques for modeling water management systems. In this study, a system dynamics model was developed for the Zayandehrud basin in central Iran. The model comprises the river basin, dam reservoir, plains, irrigation systems, and groundwater. The current operation rule is conjunctive use of ground and surface water. The allocation factor for each irrigation system is computed based on feedback from groundwater storage in its zone; deficit water is extracted from groundwater. The results show that applying better rules can not only satisfy all demands, such as the Gawkhuni swamp environmental demand, but also prevent groundwater level drawdown in the future.
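    The feedback rule described above (surface-water allocation scaled down as groundwater storage falls, with the deficit pumped from groundwater) can be sketched in toy stock-flow form; all quantities and the linear feedback factor below are assumptions for illustration, not the study's calibrated Zayandehrud model.

```python
def simulate_basin(inflow, demand, gw0, res0, years=20):
    """Toy annual stock-flow model: the surface allocation is scaled by a
    feedback factor that shrinks as groundwater storage is depleted."""
    gw, res = gw0, res0
    history = []
    for _ in range(years):
        res += inflow                        # annual inflow to the reservoir
        factor = min(1.0, gw / gw0)          # allocation feedback from groundwater
        surface = min(res, demand * factor)  # surface-water delivery
        res -= surface
        deficit = demand - surface
        gw = max(0.0, gw - deficit)          # deficit extracted from groundwater
        history.append((res, gw))
    return history

# Chronic shortage (inflow < demand) draws groundwater down year after year.
shortage = simulate_basin(inflow=80.0, demand=100.0, gw0=500.0, res0=0.0)
```

    Running the same sketch with inflow above demand leaves groundwater untouched, which is the qualitative behavior the study's operating rules aim to achieve.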

  1. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    Full Text Available This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá schoolchildren. Comprehension was assessed in 324 students, whereas reproducibility was studied in a different random sample of 162 students who completed the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean DEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating physical activity (PA) and showed a high correlation with Peak VO2.
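    Of the two agreement measures used above, the Bland-Altman analysis is the simpler to sketch: it reports the mean test-retest difference (bias) and its 95% limits of agreement. The paired values below are invented for illustration; they are not QAPACE data.

```python
from statistics import mean, stdev

def bland_altman(first, second):
    """Mean bias and 95% limits of agreement between paired measurements."""
    diffs = [a - b for a, b in zip(first, second)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)   # sample SD of the differences
    return bias, (bias - spread, bias + spread)

# Invented test-retest DEE estimates (Mcal/day) for five children.
bias, limits = bland_altman([10, 12, 11, 13, 9], [9, 12, 12, 12, 10])
```

    A reproducible instrument shows a bias near zero and narrow limits; points on the Bland-Altman plot falling outside the limits flag discordant test-retest pairs.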

  2. Modelization of cooling system components

    Energy Technology Data Exchange (ETDEWEB)

    Copete, Monica; Ortega, Silvia; Vaquero, Jose Carlos; Cervantes, Eva [Westinghouse Electric (Spain)

    2010-07-01

    In the site evaluation study for licensing a new nuclear power facility, the criteria involved can be grouped into health and safety, environment, socio-economics, engineering, and cost. These encompass different aspects such as geology, seismology, cooling system requirements, weather conditions, flooding, population, and so on. The selection of the cooling system is a function of several parameters, such as the gross electrical output, energy consumption, available area for cooling system components, environmental conditions, water consumption, and others. Moreover, in recent years, extreme environmental conditions have been experienced and stringent water availability limits have affected water use permits. Therefore, modifications or alternatives to current cooling system designs and operation are required, as well as analyses of the different cooling system options to optimize energy production while taking into account water consumption among other important variables. There are two basic cooling system configurations: - Once-through or open-cycle; - Recirculating or closed-cycle. In a once-through (open-cycle) system, water from an external water source passes through the steam cycle condenser and is then returned to the source at a higher temperature and with some level of contaminants. To minimize the thermal impact on the water source, a cooling tower may be added in a once-through system to allow air cooling of the water (with associated on-site losses due to evaporation) prior to returning the water to its source. This system has a high thermal efficiency, and its operating and capital costs are very low. So, from an economic point of view, the open-cycle is preferred to the closed-cycle system, especially if there are no water limitations or environmental restrictions. In a recirculating (closed-cycle) system, cooling water exits the condenser, goes through a fixed heat sink, and is then returned to the condenser. This configuration

  3. Identifying optimal models to represent biochemical systems.

    Directory of Open Access Journals (Sweden)

    Mochamad Apri

    Full Text Available Biochemical systems involving a high number of components with intricate interactions often lead to complex models containing a large number of parameters. Although a large model could describe in detail the mechanisms that underlie the system, its very large size may hinder us in understanding the key elements of the system. Also in terms of parameter identification, large models are often problematic. Therefore, a reduced model may be preferred to represent the system. Yet, in order to efficaciously replace the large model, the reduced model should have the same ability as the large model to produce reliable predictions for a broad set of testable experimental conditions. We present a novel method to extract an "optimal" reduced model from a large model to represent biochemical systems by combining a reduction method and a model discrimination method. The former assures that the reduced model contains only those components that are important to produce the dynamics observed in given experiments, whereas the latter ensures that the reduced model gives a good prediction for any feasible experimental conditions that are relevant to answer questions at hand. These two techniques are applied iteratively. The method reveals the biological core of a model mathematically, indicating the processes that are likely to be responsible for certain behavior. We demonstrate the algorithm on two realistic model examples. We show that in both cases the core is substantially smaller than the full model.

  4. Introducing Model-Based System Engineering Transforming System Engineering through Model-Based Systems Engineering

    Science.gov (United States)

    2014-03-31

    Web Presentation Software ... Figure 6. Published Web Page from Data Collection ... the term Model Based Engineering (MBE), Model Driven Engineering (MDE), or Model-Based Systems ...

  5. Modeling Control Situations in Power System Operations

    DEFF Research Database (Denmark)

    Saleem, Arshad; Lind, Morten; Singh, Sri Niwas

    2010-01-01

    Increased interconnection and loading of the power system along with deregulation has brought new challenges for electric power system operation, control and automation. Traditional power system models used in intelligent operation and control are highly dependent on the task purpose. Thus, a model for intelligent operation and control must represent system features, so that information from measurements can be related to possible system states and to control actions. These general modeling requirements are well understood, but it is, in general, difficult to translate them into a model because of the lack of explicit principles for model construction. This paper presents work on using explicit means-ends model-based reasoning about complex control situations, which results in maintaining consistent perspectives and selecting appropriate control actions for goal-driven agents. An example of power system ...

  6. MODEL OF CHANNEL AIRBORNE ELECTRICAL POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    A. G. Demchenko

    2014-01-01

    Full Text Available This article is devoted to the mathematical modeling of a channel of an alternating-current airborne electrical power-supply system. The modeling of a synchronous generator running on a three-phase static load is considered.

  7. System and circuit models for microwave antennas

    OpenAIRE

    Sobhy, Mohammed; Sanz-Izquierdo, Benito; Batchelor, John C.

    2007-01-01

    This paper describes how circuit and system models are derived for antennas from measurement of the input reflection coefficient. Circuit models are used to optimize the antenna performance and to calculate the radiated power and the transfer function of the antenna. System models are then derived for transmitting and receiving antennas. The most important contribution of this study is to show how microwave structures can be integrated into the simulation of digital communication systems. Thi...

  8. Modeling the Dynamic Digestive System Microbiome†

    OpenAIRE

    Estes, Anne M.

    2015-01-01

    “Modeling the Dynamic Digestive System Microbiome” is a hands-on activity designed to demonstrate the dynamics of microbiome ecology using dried pasta and beans to model disturbance events in the human digestive system microbiome. This exercise demonstrates how microbiome diversity is influenced by: 1) niche availability and habitat space and 2) a major disturbance event, such as antibiotic use. Students use a pictorial key to examine prepared models of digestive system microbiomes to determi...

  9. Two sustainable energy system analysis models

    DEFF Research Database (Denmark)

    Lund, Henrik; Goran Krajacic, Neven Duic; da Graca Carvalho, Maria

    2005-01-01

    This paper presents a comparative study of two energy system analysis models both designed with the purpose of analysing electricity systems with a substantial share of fluctuating renewable energy....

  10. Life-Cycle Models for Survivable Systems

    National Research Council Canada - National Science Library

    Linger, Richard

    2002-01-01

    .... Current software development life-cycle models are not focused on creating survivable systems, and exhibit shortcomings when the goal is to develop systems with a high degree of assurance of survivability...

  11. Does systematic variation improve the reproducibility of animal experiments?

    NARCIS (Netherlands)

    Jonker, R.M.; Guenther, A.; Engqvist, L.; Schmoll, T.

    2013-01-01

    Reproducibility of results is a fundamental tenet of science. In this journal, Richter et al. [1] tested whether systematic variation in experimental conditions (heterogenization) affects the reproducibility of results. Comparing this approach with the current standard of ensuring reproducibility

  12. Semantic models for adaptive interactive systems

    CERN Document Server

    Hussein, Tim; Lukosch, Stephan; Ziegler, Jürgen; Calvary, Gaëlle

    2013-01-01

    Providing insights into methodologies for designing adaptive systems based on semantic data, and introducing semantic models that can be used for building interactive systems, this book showcases many of the applications made possible by the use of semantic models. Ontologies may enhance the functional coverage of an interactive system as well as its visualization and interaction capabilities in various ways. Semantic models can also contribute to bridging gaps; for example, between user models, context-aware interfaces, and model-driven UI generation. There is considerable potential for using

  13. An expert system for dispersion model interpretation

    International Nuclear Information System (INIS)

    Skyllingstad, E.D.; Ramsdell, J.V.

    1988-10-01

    A prototype expert system designed to diagnose dispersion model uncertainty is described in this paper with application to a puff transport model. The system obtains qualitative information from the model user and through an expert-derived knowledge base, performs a rating of the current simulation. These results can then be used in combination with dispersion model output for deciding appropriate evacuation measures. Ultimately, the goal of this work is to develop an expert system that may be operated accurately by an individual uneducated in meteorology or dispersion modeling. 5 refs., 3 figs

  14. Fallout model for system studies

    International Nuclear Information System (INIS)

    Harvey, T.F.; Serduke, F.J.D.

    1979-01-01

    A versatile fallout model was developed to assess complex civil defense and military effect issues. Large technical and scenario uncertainties require a fast, adaptable, time-dependent model to obtain technically defensible fallout results in complex demographic scenarios. The KDFOC2 capability, coupled with other data bases, provides the essential tools to consider tradeoffs between various plans and features in different nuclear scenarios and estimate the technical uncertainties in the predictions. All available data were used to validate the model. In many ways, the capability is unmatched in its ability to predict fallout hazards to a society

  15. Comparison of the CATHENA model of Gentilly-2 end shield cooling system predictions to station data

    Energy Technology Data Exchange (ETDEWEB)

    Zagre, G.; Sabourin, G. [Candu Energy Inc., Montreal, Quebec (Canada); Chapados, S. [Hydro-Quebec, Montreal, Quebec (Canada)

    2012-07-01

    As part of the Gentilly-2 Refurbishment Project, Hydro-Quebec has elected to perform the End Shield Cooling Safety Analysis. A CATHENA model of the Gentilly-2 End Shield Cooling System was developed for this purpose. This model includes new elements compared to other CANDU6 End Shield Cooling models, such as a detailed heat exchanger and control logic model. In order to test the model's robustness and accuracy, its predictions were compared with plant measurements. This paper summarizes the comparison between the model predictions and the station measurements. It is shown that the CATHENA model is flexible and accurate enough to predict station measurements for critical parameters, and the detailed heat exchanger model allows station transients to be reproduced. (author)

  16. A new model with an anatomically accurate human renal collecting system for training in fluoroscopy-guided percutaneous nephrolithotomy access.

    Science.gov (United States)

    Turney, Benjamin W

    2014-03-01

    Obtaining renal access is one of the most important and complex steps in learning percutaneous nephrolithotomy (PCNL). Ideally, this skill should be practiced outside the operating room. There is a need for anatomically accurate and cheap models for simulated training. The objective was to develop a cost-effective, anatomically accurate, nonbiologic training model for simulated PCNL access under fluoroscopic guidance. Collecting systems from routine computed tomography urograms were extracted and reformatted using specialized software. These images were printed in a water-soluble plastic on a three-dimensional (3D) printer to create biomodels. These models were embedded in silicone and then the models were dissolved in water to leave a hollow collecting system within a silicone model. These PCNL models were filled with contrast medium and sealed. A layer of dense foam acted as a spacer to replicate the tissues between skin and kidney. 3D printed models of human collecting systems are a useful adjunct in planning PCNL access. The PCNL access training model is relatively low cost and reproduces the anatomy of the renal collecting system faithfully. A range of models reflecting the variety and complexity of human collecting systems can be reproduced. The fluoroscopic triangulation process needed to target the calix of choice can be practiced successfully in this model. This silicone PCNL training model accurately replicates the anatomic architecture and orientation of the human renal collecting system. It provides a safe, clean, and effective model for training in accurate fluoroscopy-guided PCNL access.

  17. Multiple system modelling of waste management

    International Nuclear Information System (INIS)

    Eriksson, Ola; Bisaillon, Mattias

    2011-01-01

    Highlights: → Linking of models will provide a more complete, correct and credible picture of the systems. → The linking procedure is easy to perform and also leads to activation of project partners. → The simulation procedure is a bit more complicated and calls for the ability to run both models. - Abstract: Due to increased environmental awareness, planning and performance of waste management has become more and more complex. Therefore waste management has early been subject to different types of modelling. Another field with long experience of modelling and systems perspective is energy systems. The two modelling traditions have developed side by side, but so far there are very few attempts to combine them. Waste management systems can be linked together with energy systems through incineration plants. The models for waste management can be modelled on a quite detailed level whereas surrounding systems are modelled in a more simplistic way. This is a problem, as previous studies have shown that assumptions on the surrounding system often tend to be important for the conclusions. In this paper it is shown how two models, one for the district heating system (MARTES) and another one for the waste management system (ORWARE), can be linked together. The strengths and weaknesses with model linking are discussed when compared to simplistic assumptions on effects in the energy and waste management systems. It is concluded that the linking of models will provide a more complete, correct and credible picture of the consequences of different simultaneous changes in the systems. The linking procedure is easy to perform and also leads to activation of project partners. However, the simulation procedure is a bit more complicated and calls for the ability to run both models.

  18. Network model of security system

    Directory of Open Access Journals (Sweden)

    Adamczyk Piotr

    2016-01-01

    Full Text Available The article presents the concept of building a network security model and its application in the process of risk analysis. It indicates the possibility of a new definition of the role of network models in safety analysis. Special attention was paid to the development of an algorithm describing the process of identifying assets, vulnerabilities and threats in a given context. The aim of the article is to present how this algorithm reduces the complexity of the problem by eliminating from the base model those components that have no links with other components; as a result, it was possible to build a real network model corresponding to reality.

  19. Modeling of the DZero data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Angstadt, R.; Johnson, M.; Manning, I.L. [Fermi National Accelerator Lab., Batavia, IL (United States); Wightman, J.A. [Texas A and M Univ., College Station, TX (United States). Dept. of Physics]|[Texas Accelerator Center, The Woodlands, TX (United States)

    1991-12-01

    A queuing theory model was used in the initial design of the D0 data acquisition system. It was mainly used for the front end electronic systems. Since then the model has been extended to include the entire data path for the tracking system. The tracking system generates the most data so we expect this system to determine the overall transfer rate. The model was developed using both analytical and simulation methods for solving a series of single server queues. We describe the model and the methods used to develop it. We also present results from the original models, updated calculations representing the system as built and comparisons with measurements made with the hardware in place for the cosmic ray test run. 3 refs.
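    The "series of single server queues" mentioned above has a standard closed form if one assumes each stage behaves as an independent M/M/1 queue; that independence assumption is mine for illustration, since the abstract does not state the D0 model's actual solution method. The mean residence time of a stage with arrival rate λ and service rate μ is 1/(μ − λ).

```python
def tandem_mm1(arrival_rate, service_rates):
    """Mean time through a series of M/M/1 queues.

    Burke's theorem: the departure process of a stable M/M/1 queue is
    Poisson, so each stage in the chain can be analyzed independently."""
    total = 0.0
    for mu in service_rates:
        if arrival_rate >= mu:
            raise ValueError("unstable stage: arrival rate >= service rate")
        total += 1.0 / (mu - arrival_rate)  # mean wait + service per stage
    return total
```

    For example, events arriving at 1 kHz into two stages that each serve at 2 kHz spend a mean of 1 ms per stage, 2 ms in total; the stage whose service rate is closest to the arrival rate dominates, matching the abstract's point that the highest-volume subsystem sets the overall transfer rate.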

  20. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A version management system (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection, and merging. Traditional VMS systems are file-based and consider software systems as a set of text files. File-based VMS systems are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when using models as the central artifact. The goal of this work is to present a generic framework for model-based VMS which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)
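    As an illustration of the version-control activities listed above (difference identification, conflict detection, merging), a minimal attribute-level three-way merge over model elements represented as plain dicts might look like the sketch below. This is an invented example, not the framework proposed in the paper.

```python
def three_way_merge(base, ours, theirs):
    """Attribute-level three-way merge of one model element.

    A missing key reads as None, which doubles as a deletion marker in
    this sketch.  Returns the merged attributes plus conflicting keys."""
    merged, conflicts = {}, []
    for key in sorted(set(base) | set(ours) | set(theirs)):
        b, o, t = base.get(key), ours.get(key), theirs.get(key)
        if o == t:                 # both sides agree
            merged[key] = o
        elif o == b:               # only "theirs" changed it
            merged[key] = t
        elif t == b:               # only "ours" changed it
            merged[key] = o
        else:                      # both changed it differently
            conflicts.append(key)
    return merged, conflicts

# One branch renames the element, the other changes its visibility:
# the two edits touch different attributes and merge cleanly.
merged, conflicts = three_way_merge(
    base={"name": "Account", "visibility": "public"},
    ours={"name": "Account", "visibility": "private"},
    theirs={"name": "BankAccount", "visibility": "public"},
)
```

    Working at the level of model attributes rather than text lines is what lets a model-based VMS merge concurrent edits that a file-based diff would flag as conflicting.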

  1. Integrating systems biology models and biomedical ontologies.

    Science.gov (United States)

    Hoehndorf, Robert; Dumontier, Michel; Gennari, John H; Wimalaratne, Sarala; de Bono, Bernard; Cook, Daniel L; Gkoutos, Georgios V

    2011-08-11

    Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms.

  2. An Empirical Model for Energy Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rosewater, David Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Scott, Paul [TransPower, Poway, CA (United States)

    2016-03-17

    Improved models of energy storage systems are needed to enable the electric grid’s adaptation to increasing penetration of renewables. This paper develops a generic empirical model of energy storage system performance agnostic of type, chemistry, design or scale. Parameters for this model are calculated using test procedures adapted from the US DOE Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage. We then assess the accuracy of this model for predicting the performance of the TransPower GridSaver – a 1 MW rated lithium-ion battery system that underwent laboratory experimentation and analysis. The developed model predicts a range of energy storage system performance based on the uncertainty of estimated model parameters. Finally, this model can be used to better understand the integration and coordination of energy storage on the electric grid.
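    The abstract does not spell out the model's functional form; one of the simplest quantities such a test protocol yields is round-trip efficiency with a parameter-uncertainty band, sketched below with invented numbers (not GridSaver measurements).

```python
def predicted_output_range(energy_in_kwh, eta_rt, eta_err):
    """Deliverable-energy range implied by a measured round-trip
    efficiency eta_rt with estimation uncertainty +/- eta_err."""
    return (energy_in_kwh * (eta_rt - eta_err),
            energy_in_kwh * (eta_rt + eta_err))

# 1 MWh charged into a system measured at 85% +/- 2% round-trip efficiency:
low, high = predicted_output_range(1000.0, 0.85, 0.02)
```

    Reporting a range rather than a point estimate mirrors the abstract's statement that the model predicts a band of performance based on the uncertainty of the estimated parameters.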

  3. Brief history of agricultural systems modeling.

    Science.gov (United States)

    Jones, James W; Antle, John M; Basso, Bruno; Boote, Kenneth J; Conant, Richard T; Foster, Ian; Godfray, H Charles J; Herrero, Mario; Howitt, Richard E; Janssen, Sander; Keating, Brian A; Munoz-Carpena, Rafael; Porter, Cheryl H; Rosenzweig, Cynthia; Wheeler, Tim R

    2017-07-01

    Agricultural systems science generates knowledge that allows researchers to consider complex problems or take informed agricultural decisions. The rich history of this science exemplifies the diversity of systems and scales over which they operate and have been studied. Modeling, an essential tool in agricultural systems science, has been accomplished by scientists from a wide range of disciplines, who have contributed concepts and tools over more than six decades. As agricultural scientists now consider the "next generation" models, data, and knowledge products needed to meet the increasingly complex systems problems faced by society, it is important to take stock of this history and its lessons to ensure that we avoid re-invention and strive to consider all dimensions of associated challenges. To this end, we summarize here the history of agricultural systems modeling and identify lessons learned that can help guide the design and development of next generation of agricultural system tools and methods. A number of past events combined with overall technological progress in other fields have strongly contributed to the evolution of agricultural system modeling, including development of process-based bio-physical models of crops and livestock, statistical models based on historical observations, and economic optimization and simulation models at household and regional to global scales. Characteristics of agricultural systems models have varied widely depending on the systems involved, their scales, and the wide range of purposes that motivated their development and use by researchers in different disciplines. Recent trends in broader collaboration across institutions, across disciplines, and between the public and private sectors suggest that the stage is set for the major advances in agricultural systems science that are needed for the next generation of models, databases, knowledge products and decision support systems. The lessons from history should be

  4. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  5. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.
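    Intra- and interobserver agreement of the kind reported in these two records is conventionally quantified with Cohen's kappa, which corrects raw agreement for chance. A minimal sketch, with hypothetical severity scores on a 0-3 scale (the data and scale are illustrative, not taken from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two sets of categorical scores."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    # Agreement expected if both raters scored independently at their own rates.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two hypothetical readings of the same 10 scans by one observer.
first  = [0, 1, 1, 2, 3, 0, 2, 1, 3, 2]
second = [0, 1, 2, 2, 3, 0, 2, 1, 3, 1]
print(round(cohens_kappa(first, second), 2))
```

    Values in the 0.51-0.74 range quoted above are conventionally read as moderate-to-good agreement; κ = 1 means perfect agreement and κ = 0 means agreement no better than chance.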

  6. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer....... The test procedure combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers; a skin-irritant substance or product is included in each test as a positive control...... high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated twice, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility....

  7. Reproducibility in cyclostratigraphy: initiating an intercomparison project

    Science.gov (United States)

    Sinnesael, Matthias; De Vleeschouwer, David; Zeeden, Christian; Claeys, Philippe

    2017-04-01

    The study of astronomical climate forcing and the application of cyclostratigraphy have experienced spectacular growth over the last decades. In the field of cyclostratigraphy a broad range of methodological approaches exists, but comparative study between the different approaches is lacking. Different cases demand different approaches, and with the growing importance of the field, questions arise about reproducibility, uncertainties and standardization of results. The radioisotopic dating community, in particular, has made far-reaching efforts to improve the reproducibility and intercomparison of radioisotopic dates and their errors. To satisfy this need in cyclostratigraphy, we initiate a comparable framework for the community. The aims are to investigate and quantify the reproducibility of, and uncertainties related to, cyclostratigraphic studies and to provide a platform to discuss the merits and pitfalls of different methodologies and their applicability. With this poster, we ask for feedback from the community on how to design this comparative framework in a useful, meaningful and productive manner. In parallel, we would like to discuss how reproducibility should be tested and what uncertainties should stand for in cyclostratigraphy. We also intend to trigger interest in a cyclostratigraphic intercomparison project. This intercomparison project would imply the analysis of artificial and genuine geological records by individual researchers. All participants would be free to determine their method of choice, but a handful of criteria would be required for an outcome to be comparable. The different results would be compared (e.g. during a workshop or a special session), and the lessons learned from the comparison could potentially be reported in a review paper. The aim of an intercomparison project is not to rank the different methods according to their merits, but to get insight into which specific methods are most suitable for which

  8. A how to guide to reproducible research

    OpenAIRE

    Whitaker, Kirstie

    2018-01-01

    This talk will discuss the perceived and actual barriers experienced by researchers attempting to do reproducible research, and give practical guidance on how they can be overcome. It will include suggestions on how to make your code and data available and usable for others (including a strong suggestion to document both clearly so you don't have to reply to lots of email questions from future users). Specifically it will include a brief guide to version control, collaboration and disseminati...

  9. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.

  10. Reproducibility and Practical Adoption of GEOBIA with Open-Source Software in Docker Containers

    Directory of Open Access Journals (Sweden)

    Christian Knoth

    2017-03-01

    Full Text Available Geographic Object-Based Image Analysis (GEOBIA) mostly uses proprietary software, but the interest in Free and Open-Source Software (FOSS) for GEOBIA is growing. This interest stems not only from cost savings, but also from benefits concerning reproducibility and collaboration. Technical challenges hamper practical reproducibility, especially when multiple software packages are required to conduct an analysis. In this study, we use containerization to package a GEOBIA workflow in a well-defined FOSS environment. We explore the approach using two software stacks to perform an exemplary analysis detecting destruction of buildings in bi-temporal images of a conflict area. The analysis combines feature extraction techniques with segmentation and object-based analysis to detect changes using automatically-defined local reference values and to distinguish disappeared buildings from non-target structures. The resulting workflow is published as FOSS comprising both the model and data in a ready-to-use Docker image and a user interface for interaction with the containerized workflow. The presented solution advances GEOBIA in the following aspects: higher transparency of methodology; easier reuse and adaptation of workflows; better transferability between operating systems; complete description of the software environment; and easy application of workflows by image analysis experts and non-experts. As a result, it promotes not only the reproducibility of GEOBIA, but also its practical adoption.

  11. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  12. Transforming Graphical System Models to Graphical Attack Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, Rene Rydhof

    2016-01-01

    Manually identifying possible attacks on an organisation is a complex undertaking; many different factors must be considered, and the resulting attack scenarios can be complex and hard to maintain as the organisation changes. System models provide a systematic representation of organisations...... approach to transforming graphical system models to graphical attack models in the form of attack trees. Based on an asset in the model, our transformations result in an attack tree that represents attacks by all possible actors in the model, after which the actor in question has obtained the asset....

  13. Coupling population dynamics with earth system models: the POPEM model.

    Science.gov (United States)

    Navarro, Andrés; Moreno, Raúl; Jiménez-Alcázar, Alfonso; Tapiador, Francisco J

    2017-09-16

    Precise modeling of CO2 emissions is important for environmental research. This paper presents a new model of human population dynamics that can be embedded into ESMs (Earth System Models) to improve climate modeling. Through a system dynamics approach, we develop a cohort-component model that successfully simulates historical population dynamics with fine spatial resolution (about 1°×1°). The population projections are used to improve the estimates of CO2 emissions, thus transcending the bulk approach of existing models and allowing more realistic non-linear effects to feature in the simulations. The module, dubbed POPEM (from Population Parameterization for Earth Models), is compared with current emission inventories and validated against UN aggregated data. Finally, it is shown that the module can be used to advance toward fully coupling the social and natural components of the Earth system, an emerging research path for environmental science and pollution research.
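    A cohort-component projection of the kind POPEM builds on can be sketched as a Leslie-matrix update: survival rates age each cohort forward one step while fertility rates feed births into the youngest cohort. The cohorts, rates, and step size below are hypothetical placeholders, not POPEM's actual parameterization:

```python
def project(pop, survival, fertility, steps):
    """Advance an age-structured population through `steps` periods
    (a Leslie-matrix update written out explicitly)."""
    for _ in range(steps):
        births = sum(f * n for f, n in zip(fertility, pop))
        aged = [s * n for s, n in zip(survival, pop[:-1])]
        pop = [births] + aged  # new youngest cohort, everyone else ages
    return pop

# Hypothetical 3-cohort population (young, adult, old), two projection steps.
pop = project([100.0, 80.0, 50.0],
              survival=[0.95, 0.85],      # fraction surviving into next cohort
              fertility=[0.0, 0.6, 0.1],  # offspring per head per step
              steps=2)
print([round(n, 2) for n in pop])
```

    A gridded model like POPEM would run one such projection per cell (about 1°×1°) with location-specific rates, which is what enables spatially resolved emission estimates.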

  14. Modeling on a PWR power conversion system with system program

    International Nuclear Information System (INIS)

    Gao Rui; Yang Yanhua; Lin Meng

    2007-01-01

    Based on the power conversion system of the nuclear and conventional islands of the Daya Bay Power Station, this paper models the thermal-hydraulic systems of the primary and secondary loops of a PWR using the PWR best-estimate program RELAP5. To simulate the full-scope power conversion system, not only the traditional basic system models of the nuclear island but also the major system models of the conventional island are considered and modeled. A comparison between the calculated results and the actual plant data demonstrates a good match for the Daya Bay Nuclear Power Station, and at the same time confirms the feasibility of simulating the full-scope power conversion system of a PWR with RELAP5. (authors)

  15. A statistical model for instable thermodynamical systems

    International Nuclear Information System (INIS)

    Sommer, Jens-Uwe

    2003-01-01

    A generic model is presented for statistical systems which display thermodynamic features in contrast to our everyday experience, such as infinite and negative heat capacities. Such systems are instable in terms of classical equilibrium thermodynamics. Using our statistical model, we are able to investigate states of instable systems which are undefined in the framework of equilibrium thermodynamics. We show that a region of negative heat capacity in the adiabatic environment leads to a first-order-like phase transition when the system is coupled to a heat reservoir. This phase transition takes place without phase coexistence. Nevertheless, all intermediate states are stable due to fluctuations. When two instable systems are brought into thermal contact, the temperature of the composed system is lower than the minimum temperature of the individual systems. Generally, the equilibrium states of instable systems cannot simply be decomposed into equilibrium states of the individual systems. The properties of instable systems depend on the environment; ensemble equivalence is broken.

  16. Dynamic modeling of the INAPRO aquaponic system

    NARCIS (Netherlands)

    Karimanzira, Divas; Keesman, Karel J.; Kloas, Werner; Baganz, Daniela; Rauschenbach, Thomas

    2016-01-01

    The use of modeling techniques to analyze aquaponics systems is demonstrated with an example of dynamic modeling for the production of Nile tilapia (Oreochromis niloticus) and tomatoes (Solanum lycopersicon) using the innovative double recirculating aquaponic system ASTAF-PRO. For the management

  17. System dynamics modelling of situation awareness

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2015-11-01

    Full Text Available . The feedback loops and delays in the Command and Control system also contribute to the complex dynamic behavior. This paper will build on existing situation awareness models to develop a System Dynamics model to support a qualitative investigation through...

  18. Rapid Prototyping of Formally Modelled Distributed Systems

    OpenAIRE

    Buchs, Didier; Buffo, Mathieu; Titsworth, Frances M.

    1999-01-01

    This paper presents various kinds of prototypes, used in the prototyping of formally modelled distributed systems. It presents the notions of prototyping techniques and prototype evolution, and shows how to relate them to the software life-cycle. It is illustrated through the use of the formal modelling language for distributed systems CO-OPN/2.

  19. Modeling complex work systems - method meets reality

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Hoeve, Machteld; Lenting, Bert

    1996-01-01

    Modeling an existing task situation is often a first phase in the (re)design of information systems. For complex systems design, this model should consider both the people and the organization involved, the work, and situational aspects. Groupware Task Analysis (GTA) as part of a method for the

  20. Timbral aspects of reproduced sound in small rooms. I

    DEFF Research Database (Denmark)

    Bech, Søren

    1995-01-01

    This paper reports some of the influences of individual reflections on the timbre of reproduced sound. A single loudspeaker with frequency-independent directivity characteristics, positioned in a listening room of normal size with frequency-independent absorption coefficients of the room surfaces, has been simulated using an electroacoustic setup. The model included the direct sound, 17 individual reflections, and the reverberant field. The threshold of detection and just-noticeable differences for an increase in level were measured for individual reflections using eight subjects for noise and speech. The results have shown that the first-order floor and ceiling reflections are likely to individually contribute to the timbre of reproduced speech. For a noise signal, additional reflections from the left sidewall will contribute individually. The level of the reverberant field has been found...

  1. Properties of galaxies reproduced by a hydrodynamic simulation

    Science.gov (United States)

    Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Bird, S.; Nelson, D.; Hernquist, L.

    2014-05-01

    Previous simulations of the growth of cosmic structures have broadly reproduced the `cosmic web' of galaxies that we see in the Universe, but failed to create a mixed population of elliptical and spiral galaxies, because of numerical inaccuracies and incomplete physical models. Moreover, they were unable to track the small-scale evolution of gas and stars to the present epoch within a representative portion of the Universe. Here we report a simulation that starts 12 million years after the Big Bang, and traces 13 billion years of cosmic evolution with 12 billion resolution elements in a cube of 106.5 megaparsecs a side. It yields a reasonable population of ellipticals and spirals, reproduces the observed distribution of galaxies in clusters and characteristics of hydrogen on large scales, and at the same time matches the `metal' and hydrogen content of galaxies on small scales.

  2. Regression Models for Repairable Systems

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr

    2015-01-01

    Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords : Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf

  3. Mathematical Modeling Of Life-Support Systems

    Science.gov (United States)

    Seshan, Panchalam K.; Ganapathi, Balasubramanian; Jan, Darrell L.; Ferrall, Joseph F.; Rohatgi, Naresh K.

    1994-01-01

    Generic hierarchical model of life-support system developed to facilitate comparisons of options in design of system. Model represents combinations of interdependent subsystems supporting microbes, plants, fish, and land animals (including humans). Generic model enables rapid configuration of variety of specific life support component models for tradeoff studies culminating in single system design. Enables rapid evaluation of effects of substituting alternate technologies and even entire groups of technologies and subsystems. Used to synthesize and analyze life-support systems ranging from relatively simple, nonregenerative units like aquariums to complex closed-loop systems aboard submarines or spacecraft. Model, called Generic Modular Flow Schematic (GMFS), coded in such chemical-process-simulation languages as Aspen Plus and expressed as three-dimensional spreadsheet.

  4. Modeling of Embedded Human Systems

    Science.gov (United States)

    2013-07-01

    The ISAT study [7] for DARPA in 2005 concretized the notion of an embedded human, who is a necessary component of the system. The proposed work integrates... [7] C. J. Tomlin and S. S. Sastry, "Embedded humans," tech. rep., DARPA ISAT

  5. Modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, Vidyadhar G

    2011-01-01

    Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edi

  6. Towards reproducibility of research by reuse of IT best practices

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Reproducibility of any research gives much higher credibility both to research results and to the researchers. This is true for any kind of research including computer science, where a lot of tools and approaches have been developed to ensure reproducibility. In this talk I will focus on basic and seemingly simple principles, which sometimes look too obvious to follow, but help researchers build beautiful and reliable systems that produce consistent, measurable results. My talk will cover, among other things, the problem of embedding machine learning techniques into analysis strategy. I will also speak about the most common pitfalls in this process and how to avoid them. In addition, I will demonstrate the research environment based on the principles that I will have outlined. About the speaker Andrey Ustyuzhanin (36) is Head of CERN partnership program at Yandex. He is involved in the development of event indexing and event filtering services which Yandex has been providing for the LHCb experiment sinc...

  7. Test-driven modeling of embedded systems

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2015-01-01

    To benefit maximally from model-based systems engineering (MBSE) trustworthy high quality models are required. From the software disciplines it is known that test-driven development (TDD) can significantly increase the quality of the products. Using a test-driven approach with MBSE may have...... a similar positive effect on the quality of the system models and the resulting products and may therefore be desirable. To define a test-driven model-based systems engineering (TD-MBSE) approach, we must define this approach for numerous sub disciplines such as modeling of requirements, use cases...... suggest that our method provides a sound foundation for rapid development of high quality system models....

  8. Formal heterogeneous system modeling with SystemC

    DEFF Research Database (Denmark)

    Niaki, Seyed Hosein Attarzadeh; Jakobsen, Mikkel Koefoed; Sulonen, Tero

    2012-01-01

    Electronic System Level (ESL) design of embedded systems proposes raising the abstraction level of the design entry to cope with the increasing complexity of such systems. To exploit the benefits of ESL, design languages should allow specification of models which are a) heterogeneous, to describe...

  9. Modeling aluminum-air battery systems

    Science.gov (United States)

    Savinell, R. F.; Willis, M. S.

    The performance of a complete aluminum-air battery system was studied with a flowsheet model built from unit models of each battery system component. A plug-flow model for heat transfer was used to estimate the amount of heat transferred from the electrolyte to the air stream. The effect of shunt currents on battery performance was found to be insignificant. Using the flowsheet simulator to analyze a 100-cell battery system now under development demonstrated that load current, aluminate concentration, and electrolyte temperature are the dominant variables controlling system performance. System efficiency was found to decrease as both load current and aluminate concentration increase. The flowsheet model illustrates the interdependence of the separate units in determining overall system performance.
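    The plug-flow heat-transfer estimate can be illustrated with the standard exponential temperature profile of a stream exchanging heat with a surrounding held at constant temperature. This is a generic textbook idealization, and every coefficient below is invented for illustration; none are the paper's values:

```python
import math

def outlet_temperature(t_in, t_amb, ua, m_dot, cp):
    """Outlet temperature of a plug-flow stream cooled by a constant-
    temperature surrounding: T_out = T_amb + (T_in - T_amb) * exp(-UA/(m*cp))."""
    return t_amb + (t_in - t_amb) * math.exp(-ua / (m_dot * cp))

# Hypothetical electrolyte entering at 70 C, air stream at 25 C,
# UA in W/K, mass flow in kg/s, heat capacity in J/(kg K).
print(round(outlet_temperature(70.0, 25.0, ua=50.0, m_dot=0.2, cp=4000.0), 2))
```

    The exponential form follows from an energy balance on a differential slice of the stream, which is the kind of relation a plug-flow unit model in a flowsheet simulator evaluates.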

  10. Systems Engineering Model for ART Energy Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Mendez Cruz, Carmen Margarita [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rochau, Gary E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wilson, Mollye C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    The near-term objective of the EC team is to establish an operating, commercially scalable Recompression Closed Brayton Cycle (RCBC) to be constructed for the NE-STEP demonstration system (demo) with the lowest risk possible. A systems engineering approach is recommended to ensure adequate requirements gathering, documentation, and modeling that supports technology development relevant to advanced reactors while supporting crosscut interests in potential applications. A holistic systems engineering model was designed for the ART Energy Conversion program by leveraging Concurrent Engineering, Balance Model, Simplified V Model, and Project Management principles. The resulting model supports the identification and validation of lifecycle Brayton systems requirements, and allows designers to detail system-specific components relevant to the current stage in the lifecycle, while maintaining a holistic view of all system elements.

  11. Agent-Based Modeling in Systems Pharmacology.

    Science.gov (United States)

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogeneous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.
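    The individual-focused character of ABM can be seen in a toy model: each agent carries its own state, and population-level dynamics emerge from per-agent rules rather than from a homogeneous rate equation. This contagion sketch is purely illustrative; the states, rates, and population size are made up:

```python
import random

class Agent:
    """An agent with a single boolean state."""
    def __init__(self, infected=False):
        self.infected = infected

def step(agents, p_transmit, p_recover, rng):
    """One tick: each infected agent contacts one random agent and may
    transmit; afterwards, each previously infected agent may recover."""
    infected_now = [a for a in agents if a.infected]
    for a in infected_now:
        contact = rng.choice(agents)
        if not contact.infected and rng.random() < p_transmit:
            contact.infected = True
    for a in infected_now:
        if rng.random() < p_recover:
            a.infected = False

rng = random.Random(42)
agents = [Agent() for _ in range(200)]
for a in agents[:5]:
    a.infected = True
for _ in range(30):
    step(agents, p_transmit=0.4, p_recover=0.1, rng=rng)
print(sum(a.infected for a in agents), "infected after 30 ticks")
```

    Replacing the uniform contact rule or the fixed probabilities with per-agent attributes (location, receptor density, drug exposure) is what lets ABM capture the heterogeneity that population-averaged models miss.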

  12. Modelling Geomorphic Systems: Landscape Evolution

    OpenAIRE

    Valters, Declan

    2016-01-01

    Landscape evolution models (LEMs) present the geomorphologist with a means of investigating how landscapes evolve in response to external forcings, such as climate and tectonics, as well as internal process laws. LEMs typically incorporate a range of different geomorphic transport laws integrated in a way that simulates the evolution of a 3D terrain surface forward through time. The strengths of LEMs as research tools lie in their ability to rapidly test many different hypotheses of landscape...

  13. Accuracy, reproducibility, and time efficiency of dental measurements using different technologies.

    Science.gov (United States)

    Grünheid, Thorsten; Patel, Nishant; De Felippe, Nanci L; Wey, Andrew; Gaillard, Philippe R; Larson, Brent E

    2014-02-01

    Historically, orthodontists have taken dental measurements on plaster models. Technological advances now allow orthodontists to take these measurements on digital models. In this study, we aimed to assess the accuracy, reproducibility, and time efficiency of dental measurements taken on 3 types of digital models. emodels (GeoDigm, Falcon Heights, Minn), SureSmile models (OraMetrix, Richardson, Tex), and AnatoModels (Anatomage, San Jose, Calif) were made for 30 patients. Mesiodistal tooth-width measurements taken on these digital models were timed and compared with those on the corresponding plaster models, which were used as the gold standard. Accuracy and reproducibility were assessed using the Bland-Altman method. Differences in time efficiency were tested for statistical significance with 1-way analysis of variance. Measurements on SureSmile models were the most accurate, followed by those on emodels and AnatoModels. Measurements taken on SureSmile models were also the most reproducible. Measurements taken on SureSmile models and emodels were significantly faster than those taken on AnatoModels and plaster models. Tooth-width measurements on digital models can be as accurate as, and might be more reproducible and significantly faster than, those taken on plaster models. Of the models studied, the SureSmile models provided the best combination of accuracy, reproducibility, and time efficiency of measurement. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
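    The Bland-Altman method used above summarizes paired measurements by their mean difference (bias) and 95% limits of agreement. A minimal sketch with invented tooth-width data, not the study's measurements:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)  # 1.96 SD captures ~95% of differences
    return bias, (bias - spread, bias + spread)

# Hypothetical mesiodistal widths (mm): plaster vs. digital measurements.
plaster = [8.9, 7.1, 6.8, 9.4, 7.6, 8.2]
digital = [9.0, 7.0, 6.9, 9.3, 7.8, 8.1]
bias, (lo, hi) = bland_altman(plaster, digital)
print(f"bias {bias:.3f} mm, limits of agreement ({lo:.2f}, {hi:.2f}) mm")
```

    A near-zero bias with narrow limits of agreement indicates the two measurement methods can be used interchangeably, which is the criterion behind the accuracy comparison in this study.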

  14. Modeling of Generic Slung Load System

    DEFF Research Database (Denmark)

    Bisgaard, Morten; Bendtsen, Jan Dimon; la Cour-Harbo, Anders

    2009-01-01

    This paper presents the result of the modelling and verification of a generic slung load system using a small-scale helicopter. The model is intended for use in simulation, pilot training, estimation, and control. The model is derived using a redundant coordinate formulation based on Gauss...... slackening and tightening as well as aerodynamic coupling between the helicopter and the load. Furthermore, it is shown how the model can be easily used for multi-lift systems either with multiple helicopters or multiple loads. A numerical stabilisation algorithm is introduced and finally the use...... of the model is illustrated through simulations and flight verifications.  ...

  15. System Models and Aging: A Driving Example.

    Science.gov (United States)

    Melichar, Joseph F.

    Chronological age is a marker in time but it fails to measure accurately the performance or behavioral characteristics of individuals. This paper models the complexity of aging by using a system model and a human function paradigm. These models help facilitate representation of older adults, integrate research agendas, and enhance remediative…

  16. Modelling and control of systems with flow

    NARCIS (Netherlands)

    van Mourik, S.

    2008-01-01

    In practice, feedback control design consists of three steps: modelling, model reduction and controller design for the reduced model. Systems with flow are often complicated, and there is yet no standard algorithm that integrates these steps. In this thesis we make a modest effort by considering two

  17. Model Checking Real-Time Systems

    DEFF Research Database (Denmark)

    Bouyer, Patricia; Fahrenberg, Uli; Larsen, Kim Guldstrand

    2018-01-01

    This chapter surveys timed automata as a formalism for model checking real-time systems. We begin with introducing the model, as an extension of finite-state automata with real-valued variables for measuring time. We then present the main model-checking results in this framework, and give a hint...

  18. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  19. Critically Important Object Security System Element Model

    Directory of Open Access Journals (Sweden)

    I. V. Khomyackov

    2012-03-01

    Full Text Available A stochastic model of a critically important object's security system element has been developed. The model includes a mathematical description of the security system element's properties and of external influences. The state evolution of the security system element is described by a semi-Markov process with a finite number of states, defined by the semi-Markov matrix and the initial probability distribution over the semi-Markov process states. External influences are modelled as a Poisson stream with a given intensity.
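As a sketch of the kind of model the abstract describes, the state evolution can be simulated as a finite-state semi-Markov process with random sojourn times in each state. All states, rates, and names below are hypothetical illustrations, not the authors' model:

```python
import random

def simulate_semi_markov(P, sojourn, initial_dist, t_end, seed=0):
    """Simulate a finite-state semi-Markov process.

    P            -- transition probability matrix (list of lists)
    sojourn      -- sojourn[i](rng) returns a random holding time in state i
    initial_dist -- initial state probability distribution
    t_end        -- simulation horizon
    Returns the list of (time, state) jump epochs.
    """
    rng = random.Random(seed)
    # Draw the initial state from the given distribution.
    state = rng.choices(range(len(P)), weights=initial_dist)[0]
    t, path = 0.0, [(0.0, state)]
    while t < t_end:
        t += sojourn[state](rng)             # hold in the current state
        state = rng.choices(range(len(P)), weights=P[state])[0]
        path.append((t, state))
    return path

# Hypothetical two-state element: 0 = secure, 1 = compromised, with
# exponential sojourn times; external influences arrive as a Poisson
# stream with rate attack_rate while the element is secure.
attack_rate, repair_rate = 0.2, 1.0
P = [[0.0, 1.0], [1.0, 0.0]]
sojourn = [lambda rng: rng.expovariate(attack_rate),
           lambda rng: rng.expovariate(repair_rate)]
path = simulate_semi_markov(P, sojourn, [1.0, 0.0], t_end=100.0)
print(path[:3])
```

Exponential sojourn times make this a continuous-time Markov chain; the same skeleton accepts any holding-time distribution, which is what makes the process semi-Markov.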

  20. A model for international border management systems.

    Energy Technology Data Exchange (ETDEWEB)

    Duggan, Ruth Ann

    2008-09-01

    To effectively manage the security or control of its borders, a country must understand its border management activities as a system. Using its systems engineering and security foundations as a Department of Energy National Security Laboratory, Sandia National Laboratories has developed such an approach to modeling and analyzing border management systems. This paper describes the basic model and its elements developed under Laboratory Directed Research and Development project 08-684.

  1. A strategic review of electricity systems models

    International Nuclear Information System (INIS)

    Foley, A.M.; O Gallachoir, B.P.; McKeogh, E.J.; Hur, J.; Baldick, R.

    2010-01-01

    Electricity systems models are software tools used to manage electricity demand and the electricity systems, to trade electricity and for generation expansion planning purposes. Various portfolios and scenarios are modelled in order to compare the effects of decision making in policy and on business development plans in electricity systems so as to best advise governments and industry on the least cost economic and environmental approach to electricity supply, while maintaining a secure supply of sufficient quality electricity. The modelling techniques developed to study vertically integrated state monopolies are now applied in liberalised markets where the issues and constraints are more complex. This paper reviews the changing role of electricity systems modelling in a strategic manner, focussing on the modelling response to key developments, the move away from monopoly towards liberalised market regimes and the increasing complexity brought about by policy targets for renewable energy and emissions. The paper provides an overview of electricity systems modelling techniques, discusses a number of key proprietary electricity systems models used in the USA and Europe and provides an information resource to the electricity analyst not currently readily available in the literature on the choice of model to investigate different aspects of the electricity system. (author)

  2. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques.

    Directory of Open Access Journals (Sweden)

    Emi Kamimura

    Full Text Available The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility as evaluated by average discrepancies of corresponding 3D data was compared between the two techniques (Wilcoxon signed-rank test). The visual inspection of superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Confirmation was forthcoming from statistical analysis revealing significantly smaller average inter-operator reproducibility using a digital impression technique (0.014 ± 0.02 mm) than when using a conventional impression technique (0.023 ± 0.01 mm). The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator.

  3. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques

    Science.gov (United States)

    Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi

    2017-01-01

    Purpose The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique to a conventional impression technique in vivo. Materials and methods Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of these participants was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression also was made by the same operators using the double mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by two different operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility as evaluated by average discrepancies of corresponding 3D data was compared between the two techniques (Wilcoxon signed-rank test). Results The visual inspection of superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. Confirmation was forthcoming from statistical analysis revealing significantly smaller average inter-operator reproducibility using a digital impression technique (0.014 ± 0.02 mm) than when using a conventional impression technique (0.023 ± 0.01 mm). Conclusion The results of this in vivo study suggest that inter-operator reproducibility with a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642
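The best-fit (least-squares) superimposition step described above can be illustrated with the Kabsch algorithm, which finds the rigid rotation and translation minimising the squared point-to-point distance. The point sets below are synthetic stand-ins for scanned surface data, not the study's scans:

```python
import numpy as np

def best_fit_discrepancy(A, B):
    """Rigidly align point set B to A by least squares (Kabsch algorithm)
    and return the mean point-to-point discrepancy after alignment."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    cA, cB = A.mean(axis=0), B.mean(axis=0)          # centroids
    H = (B - cB).T @ (A - cA)                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T          # optimal rotation
    B_aligned = (B - cB) @ R.T + cA
    return np.linalg.norm(A - B_aligned, axis=1).mean()

# A second "scan" that is a rotated, translated copy aligns exactly.
rng = np.random.default_rng(0)
scan1 = rng.normal(size=(100, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
scan2 = scan1 @ Rz.T + np.array([5.0, -2.0, 1.0])
print(round(best_fit_discrepancy(scan1, scan2), 6))  # → 0.0
```

In a real comparison the two scans differ by more than a rigid motion, so the residual mean discrepancy after alignment is the reproducibility figure the study reports.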

  4. Ratio-scaling of listener preference of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian

    2005-01-01

    … a non-trivial assumption in the case of complex spatial sounds. In the present study the Bradley-Terry-Luce (BTL) model was employed to investigate the unidimensionality of preference judgments made by 40 listeners on multichannel reproduced sound. Short musical excerpts were played back in eight reproduction modes (mono … music). As a main result, the BTL model was found to predict the choice frequencies well. This implies that listeners were able to integrate the complex nature of the sounds into a unidimensional preference judgment. It further implies the existence of a preference scale on which the reproduction modes…
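As a rough illustration of the Bradley-Terry-Luce analysis mentioned above, worth (preference-scale) parameters can be estimated from a matrix of pairwise choice counts with the classic Zermelo/MM iteration. The counts below are invented, not the study's data:

```python
import numpy as np

def fit_btl(wins, iters=1000):
    """Fit Bradley-Terry-Luce worths from a pairwise win-count matrix.

    wins[i, j] = number of times item i was preferred over item j.
    Returns worths normalised to sum to 1 (Zermelo/MM iteration).
    """
    wins = np.asarray(wins, float)
    n = wins.shape[0]
    w = np.ones(n)
    total = wins + wins.T                    # comparisons per pair
    for _ in range(iters):
        for i in range(n):
            num = wins[i].sum()              # total wins of item i
            den = sum(total[i, j] / (w[i] + w[j])
                      for j in range(n) if j != i)
            w[i] = num / den
        w /= w.sum()
    return w

# Three hypothetical reproduction modes; mode 0 is chosen most often.
wins = np.array([[0, 8, 9],
                 [2, 0, 6],
                 [1, 4, 0]])
worths = fit_btl(wins)
print(worths.argmax())  # → 0
```

If the fitted worths reproduce the observed choice frequencies well, the paired-comparison data are consistent with a single underlying preference scale, which is the unidimensionality claim being tested.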

  5. Development of a system emulating the global carbon cycle in Earth system models

    Science.gov (United States)

    Tachiiri, K.; Hargreaves, J. C.; Annan, J. D.; Oka, A.; Abe-Ouchi, A.; Kawamiya, M.

    2010-08-01

    Recent studies have indicated that the uncertainty in the global carbon cycle may have a significant impact on the climate. Since state of the art models are too computationally expensive for it to be possible to explore their parametric uncertainty in anything approaching a comprehensive fashion, we have developed a simplified system for investigating this problem. By combining the strong points of general circulation models (GCMs), which contain detailed and complex processes, and Earth system models of intermediate complexity (EMICs), which are quick and capable of large ensembles, we have developed a loosely coupled model (LCM) which can represent the outputs of a GCM-based Earth system model, using much smaller computational resources. We address the problem of relatively poor representation of precipitation within our EMIC, which prevents us from directly coupling it to a vegetation model, by coupling it to a precomputed transient simulation using a full GCM. The LCM consists of three components: an EMIC (MIROC-lite) which consists of a 2-D energy balance atmosphere coupled to a low resolution 3-D GCM ocean (COCO) including an ocean carbon cycle (an NPZD-type marine ecosystem model); a state of the art vegetation model (Sim-CYCLE); and a database of daily temperature, precipitation, and other necessary climatic fields to drive Sim-CYCLE from a precomputed transient simulation from a state of the art AOGCM. The transient warming of the climate system is calculated from MIROC-lite, with the global temperature anomaly used to select the most appropriate annual climatic field from the pre-computed AOGCM simulation which, in this case, is a 1% pa increasing CO2 concentration scenario. By adjusting the effective climate sensitivity (equivalent to the equilibrium climate sensitivity for an energy balance model) of MIROC-lite, the transient warming of the LCM could be adjusted to closely follow the low sensitivity (with an equilibrium climate sensitivity of 4.0 K

  6. Development of a system emulating the global carbon cycle in Earth system models

    Directory of Open Access Journals (Sweden)

    K. Tachiiri

    2010-08-01

    Full Text Available Recent studies have indicated that the uncertainty in the global carbon cycle may have a significant impact on the climate. Since state of the art models are too computationally expensive for it to be possible to explore their parametric uncertainty in anything approaching a comprehensive fashion, we have developed a simplified system for investigating this problem. By combining the strong points of general circulation models (GCMs), which contain detailed and complex processes, and Earth system models of intermediate complexity (EMICs), which are quick and capable of large ensembles, we have developed a loosely coupled model (LCM) which can represent the outputs of a GCM-based Earth system model, using much smaller computational resources. We address the problem of relatively poor representation of precipitation within our EMIC, which prevents us from directly coupling it to a vegetation model, by coupling it to a precomputed transient simulation using a full GCM. The LCM consists of three components: an EMIC (MIROC-lite) which consists of a 2-D energy balance atmosphere coupled to a low resolution 3-D GCM ocean (COCO) including an ocean carbon cycle (an NPZD-type marine ecosystem model); a state of the art vegetation model (Sim-CYCLE); and a database of daily temperature, precipitation, and other necessary climatic fields to drive Sim-CYCLE from a precomputed transient simulation from a state of the art AOGCM. The transient warming of the climate system is calculated from MIROC-lite, with the global temperature anomaly used to select the most appropriate annual climatic field from the pre-computed AOGCM simulation which, in this case, is a 1% pa increasing CO2 concentration scenario.

    By adjusting the effective climate sensitivity (equivalent to the equilibrium climate sensitivity for an energy balance model) of MIROC-lite, the transient warming of the LCM could be adjusted to closely follow the low sensitivity (with an equilibrium
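The LCM's lookup step, selecting the precomputed AOGCM annual field whose global-mean temperature anomaly best matches the EMIC's transient warming, might be sketched as follows; the archive contents and the 0.03 K/yr warming rate are hypothetical:

```python
def select_climate_field(emic_anomaly, archive):
    """Pick the precomputed AOGCM annual field whose global-mean
    temperature anomaly is closest to the EMIC's current anomaly.

    archive -- list of (anomaly_K, field) pairs from the transient
               1% pa increasing-CO2 AOGCM run.
    """
    return min(archive, key=lambda rec: abs(rec[0] - emic_anomaly))[1]

# Hypothetical archive: anomaly grows ~0.03 K per year of the AOGCM run;
# each "field" stands in for the daily climatic fields of that year.
archive = [(0.03 * yr, f"fields_year_{yr:03d}") for yr in range(140)]
print(select_climate_field(1.0, archive))  # → fields_year_033
```

Because the selection is keyed on the anomaly rather than the simulation year, retuning the EMIC's effective climate sensitivity automatically changes which archived year drives the vegetation model.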

  7. Formal Modeling and Analysis of Timed Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Niebert, Peter

    This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts of two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal methods for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real-time systems, discrete time systems, timed languages, and real-time operating systems.

  8. Spatial Models and Networks of Living Systems

    DEFF Research Database (Denmark)

    Juul, Jeppe Søgaard

    When studying the dynamics of living systems, insight can often be gained by developing a mathematical model that can predict future behaviour of the system or help classify system characteristics. However, in living cells, organisms, and especially groups of interacting individuals, a large number… variables of the system. However, this approach disregards any spatial structure of the system, which may potentially change the behaviour drastically. An alternative approach is to construct a cellular automaton with nearest neighbour interactions, or even to model the system as a complex network with interactions defined by network topology. In this thesis I first describe three different biological models of ageing and cancer, in which spatial structure is important for the system dynamics. I then turn to describe characteristics of ecosystems consisting of three cyclically interacting species…

  9. Reliability models for Space Station power system

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kim, Y.; Wagner, H.

    1987-01-01

    This paper presents a methodology for the reliability evaluation of Space Station power system. The two options considered are the photovoltaic system and the solar dynamic system. Reliability models for both of these options are described along with the methodology for calculating the reliability indices.

  10. Human performance modeling for system of systems analytics.

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, Kevin R.; Lawton, Craig R.; Basilico, Justin Derrick; Longsine, Dennis E. (INTERA, Inc., Austin, TX); Forsythe, James Chris; Gauthier, John Henry; Le, Hai D.

    2008-10-01

    A Laboratory-Directed Research and Development project was initiated in 2005 to investigate Human Performance Modeling in a System of Systems analytic environment. SAND2006-6569 and SAND2006-7911 document interim results from this effort; this report documents the final results. The problem is difficult because of the number of humans involved in a System of Systems environment and the generally poorly defined nature of the tasks that each human must perform. A two-pronged strategy was followed: one prong was to develop human models using a probability-based method similar to that first developed for relatively well-understood probability-based performance modeling; the other prong was to investigate more state-of-the-art human cognition models. The probability-based modeling resulted in a comprehensive addition of human-modeling capability to the existing SoSAT computer program. The cognitive modeling resulted in an increased understanding of what is necessary to incorporate cognition-based models into a System of Systems analytic environment.

  11. System Dynamics Modelling for a Balanced Scorecard

    DEFF Research Database (Denmark)

    Nielsen, Steen; Nielsen, Erland Hejn

    2008-01-01

    Purpose - To construct a dynamic model/framework inspired by a case study based on an international company. As described by the theory, one of the main difficulties of BSC is to foresee the time lag dimension of different types of indicators and their combined dynamic effects. Design/methodology/approach - We use a case study model to develop time or dynamic dimensions by using a System Dynamics modelling (SDM) approach. The model includes five perspectives and a number of financial and non-financial measures. All indicators are defined and related to a coherent number of different cause… have a major influence on other indicators and profit and may be impossible to predict without using a dynamic model. Practical implications - The model may be used as the first step in quantifying the cause-and-effect relationships of an integrated BSC model. Using the System Dynamics model provides…

  12. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models.

    Science.gov (United States)

    Allen, R J; Rieger, T R; Musante, C J

    2016-03-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed "virtual patients." In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations.
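One way to realise the selection idea described above, choosing virtual patients so the selected cohort matches observed data without weighting, is greedy nearest-neighbour matching on a baseline readout. The one-parameter model and every value below are hypothetical sketches, not the authors' method:

```python
import random

def generate_virtual_patients(n, rng):
    """Sample candidate parameterisations ("virtual patients") from a
    plausible range of a hypothetical elimination-rate parameter."""
    return [{"k_el": rng.uniform(0.1, 2.0)} for _ in range(n)]

def baseline_output(patient):
    # Hypothetical model readout; a real QSP model would simulate ODEs here.
    return 10.0 / patient["k_el"]

def select_matching_population(patients, observed):
    """Select (rather than weight) a virtual population: for each observed
    baseline value, greedily take the closest unused virtual patient."""
    pool = list(patients)
    selected = []
    for obs in observed:
        best = min(pool, key=lambda p: abs(baseline_output(p) - obs))
        selected.append(best)
        pool.remove(best)
    return selected

rng = random.Random(42)
patients = generate_virtual_patients(5000, rng)
observed = [12.0, 18.0, 25.0, 33.0]
cohort = select_matching_population(patients, observed)
print([round(baseline_output(p), 1) for p in cohort])
```

With a dense enough candidate ensemble, every selected patient reproduces one observed baseline value closely, so no patient needs to be over-weighted to make the cohort statistics match.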

  13. The UK Earth System Model project

    Science.gov (United States)

    Tang, Yongming

    2016-04-01

    In this talk we will describe the development and current status of the UK Earth System Model (UKESM). This project is a NERC/Met Office collaboration and has two objectives; to develop and apply a world-leading Earth System Model, and to grow a community of UK Earth System Model scientists. We are building numerical models that include all the key components of the global climate system, and contain the important process interactions between global biogeochemistry, atmospheric chemistry and the physical climate system. UKESM will be used to make key CMIP6 simulations as well as long-time (e.g. millennium) simulations, large ensemble experiments and investigating a range of future carbon emission scenarios.

  14. Mechatronic Systems Design Methods, Models, Concepts

    CERN Document Server

    Janschek, Klaus

    2012-01-01

    In this textbook, fundamental methods for model-based design of mechatronic systems are presented in a systematic, comprehensive form. The method framework presented here comprises domain-neutral methods for modeling and performance analysis: multi-domain modeling (energy/port/signal-based), simulation (ODE/DAE/hybrid systems), robust control methods, stochastic dynamic analysis, and quantitative evaluation of designs using system budgets. The model framework is composed of analytical dynamic models for important physical and technical domains of realization of mechatronic functions, such as multibody dynamics, digital information processing and electromechanical transducers. Building on the modeling concept of a technology-independent generic mechatronic transducer, concrete formulations for electrostatic, piezoelectric, electromagnetic, and electrodynamic transducers are presented. More than 50 fully worked out design examples clearly illustrate these methods and concepts and enable independent study of th...

  15. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  16. HVDC System Characteristics and Simulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.I.; Han, B.M.; Jang, G.S. [Electric Enginnering and Science Research Institute, Seoul (Korea)

    2001-07-01

    This report deals with the AC-DC power system simulation method by PSS/E and EUROSTAG for the development of a strategy for the reliable operation of the Cheju-Haenam interconnected system. The simulation using both programs is performed to analyze HVDC simulation models. In addition, the control characteristics of the Cheju-Haenam HVDC system as well as Cheju AC system characteristics are described in this work. (author). 104 figs., 8 tabs.

  17. Use of an operational model evaluation system for model intercomparison

    Energy Technology Data Exchange (ETDEWEB)

    Foster, K. T., LLNL

    1998-03-01

    The Atmospheric Release Advisory Capability (ARAC) is a centralized emergency response system used to assess the impact from atmospheric releases of hazardous materials. As part of an on-going development program, new three-dimensional diagnostic windfield and Lagrangian particle dispersion models will soon replace ARAC's current operational windfield and dispersion codes. A prototype model performance evaluation system has been implemented to facilitate the study of the capabilities and performance of early development versions of these new models relative to ARAC's current operational codes. This system provides tools for both objective statistical analysis using common performance measures and for more subjective visualization of the temporal and spatial relationships of model results relative to field measurements. Supporting this system is a database of processed field experiment data (source terms and meteorological and tracer measurements) from over 100 individual tracer releases.

  18. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    Science.gov (United States)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research, yet they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.
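A minimal sketch of the provenance-record idea described above, assuming JSON output and content-addressed inputs; all file names, rule syntax, and field names are hypothetical, not the tool's actual format:

```python
import hashlib
import json

def provenance_record(inputs, rules, output_name):
    """Build a machine-readable provenance record for one derived dataset:
    every input is identified by a content hash, and the exact rule set
    applied is embedded, so re-running the analysis regenerates a record
    that accurately describes the process actually followed."""
    return json.dumps({
        "output": output_name,
        "inputs": [{"name": name, "sha256": hashlib.sha256(data).hexdigest()}
                   for name, data in inputs],
        "rules": rules,
    }, indent=2, sort_keys=True)

record = provenance_record(
    inputs=[("landcover.tif", b"...raster bytes..."),
            ("slope.tif", b"...raster bytes...")],
    rules=["IF slope > 30 AND landcover == 'bare' THEN class = 'erosion-prone'"],
    output_name="erosion_risk.tif")
print(record)
```

Deriving the record from the same rule model that drives the computation, rather than writing it by hand, is what keeps documentation and process from drifting apart between iterations.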

  19. Modeling Adaptive Behavior for Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1994-01-01

    Field studies in modern work systems and analysis of recent major accidents have pointed to a need for better models of the adaptive behavior of individuals and organizations operating in a dynamic and highly competitive environment. The paper presents a discussion of some key characteristics… the basic difference between the models of system functions used in engineering and design and those evolving from basic research within the various academic disciplines, and finally 3.) the models and methods required for closed-loop, feedback system design.

  20. MODEL DRIVEN DEVELOPMENT OF ONLINE BANKING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Bresfelean Vasile Paul

    2011-07-01

    Full Text Available In the case of online applications, the software development cycle differs from the routine. The online environment, the variety of users, the tractability of the mass of information they create, reusability, and accessibility from different devices are all factors in the complexity of these systems. Using a model-driven approach brings several advantages that ease the development process. Working prototypes, which simplify the client relationship and serve as the basis of model tests, can easily be generated from models describing the system. These systems make it possible for a bank's clients to perform their desired actions from anywhere. The user has the possibility of accessing information or making transactions.

  1. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  2. The reproducibility of single photon absorptiometry in a clinical setting

    International Nuclear Information System (INIS)

    Valkema, R.; Blokland, J.A.K.; Pauwels, E.K.J.; Papapoulos, S.E.; Bijvoet, O.L.M.

    1989-01-01

    The reproducibility of single photon absorptiometry (SPA) results for detection of changes in bone mineral content (BMC) was evaluated in a clinical setting. During a period of 18 months with 4 different sources, the calibration scans of an aluminium standard had a variation of less than 1% unless the activity of the 125I source was low. The calibration procedure was performed weekly, and this was sufficient to correct for drift of the system. The short-term reproducibility in patients was assessed with 119 duplicate measurements made in direct succession. The best reproducibility (CV = 1.35%) was found for fat-corrected BMC results expressed in g/cm, obtained at the site proximal to the 8 mm space between the radius and ulna. Analysis of all SPA scans made during 1 year (487 scans) showed a failure of the automatic procedure to detect the 8 mm space between the forearm bones in 19 scans (3.9%). A space adjacent to the ulnar styloid was taken as the site for the first scan in these examinations. This problem may be recognized and corrected relatively easily. A significant correlation was found between BMC at the lower arm and BMC of the lumbar spine assessed with dual photon absorptiometry. However, the error of estimation of proximal BMC (SEE = 20%) and distal BMC (SEE = 19.4%) made these measurements of little value for predicting BMC at the lumbar spine in individuals. The short-term reproducibility in patients, combined with the long-term stability of the equipment in our clinical setting, showed that SPA is a reliable technique to assess changes in bone mass at the lower arm of 4% between 2 measurements with a confidence level of 95%. (orig.)
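A short-term reproducibility figure like the CV quoted above is conventionally computed from duplicate measurements via the root-mean-square of paired differences (for a duplicate pair the within-pair SD is |x1 - x2|/sqrt(2)). The numbers below are illustrative, not the study's data:

```python
import math

def duplicate_cv(pairs):
    """Short-term reproducibility (CV%) from duplicate measurements:
    pooled within-pair variance over all pairs, expressed as a
    percentage of the grand mean."""
    within_var = sum((a - b) ** 2 / 2 for a, b in pairs) / len(pairs)
    grand_mean = sum(a + b for a, b in pairs) / (2 * len(pairs))
    return 100 * math.sqrt(within_var) / grand_mean

# Four hypothetical duplicate BMC measurements (g/cm).
pairs = [(1.02, 1.00), (0.98, 0.99), (1.10, 1.08), (0.95, 0.97)]
print(round(duplicate_cv(pairs), 2))  # → 1.26
```

A CV of this size is also what links to the detectable change: two measurements differing by more than roughly 2.77 × CV (about 4% for CV = 1.35%) differ significantly at the 95% confidence level.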

  3. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at development time and run time of the system. In the article an ontological model of a BPMS in the area of the software industry is investigated. The model building is preceded by conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.

  4. Personalized, Shareable Geoscience Dataspaces For Simplifying Data Management and Improving Reproducibility

    Science.gov (United States)

    Malik, T.; Foster, I.; Goodall, J. L.; Peckham, S. D.; Baker, J. B. H.; Gurnis, M.

    2015-12-01

    Research activities are iterative, collaborative, and now data- and compute-intensive. Such research activities mean that even the many researchers who work in small laboratories must often create, acquire, manage, and manipulate much diverse data and keep track of complex software. They face difficult data and software management challenges, and data sharing and reproducibility are neglected. There is significant federal investment in powerful cyberinfrastructure, in part to lessen the burden associated with modern data- and compute-intensive research. Similarly, geoscience communities are establishing research repositories to facilitate data preservation. Yet we observe that a large fraction of the geoscience community continues to struggle with data and software management. The reason, studies suggest, is not lack of awareness but rather that tools do not adequately support time-consuming data life cycle activities. Through the NSF/EarthCube-funded GeoDataspace project, we are building personalized, shareable dataspaces that help scientists connect their individual or research group efforts with the community at large. The dataspaces provide a light-weight multiplatform research data management system with tools for recording research activities in what we call geounits, so that a geoscientist can at any time snapshot and preserve, both for their own use and to share with the community, all data and code required to understand and reproduce a study. A software-as-a-service (SaaS) deployment model enhances usability of core components, and integration with widely used software systems. In this talk we will present the open-source GeoDataspace project and demonstrate how it is enabling reproducibility across geoscience domains of hydrology, space science, and modeling toolkits.

  5. MDOT Pavement Management System : Prediction Models and Feedback System

    Science.gov (United States)

    2000-10-01

    As a primary component of a Pavement Management System (PMS), prediction models are crucial for one or more of the following analyses: : maintenance planning, budgeting, life-cycle analysis, multi-year optimization of maintenance works program, and a...

  6. Hypersonic Vehicle Propulsion System Simplified Model Development

    Science.gov (United States)

    Stueber, Thomas J.; Raitano, Paul; Le, Dzu K.; Ouzts, Peter

    2007-01-01

This document addresses the modeling task plan for the hypersonic GN&C GRC team members. The overall propulsion system modeling task plan is a multi-step process, and the plan identified in this document addresses the first steps (short-term modeling goals). The procedures and tools produced from this effort will be useful for creating simplified dynamic models applicable to a hypersonic vehicle propulsion system. The document continues with the GRC short-term modeling goal. Next, a general description of the desired simplified model is presented along with simulations that are available to varying degrees. The simulations may be available in electronic form (FORTRAN, CFD, MATLAB, ...) or in paper form in published documents. Finally, roadmaps outlining possible avenues towards realizing a simplified model are presented.

  7. Mathematical models of information and stochastic systems

    CERN Document Server

    Kornreich, Philipp

    2008-01-01

    From ancient soothsayers and astrologists to today's pollsters and economists, probability theory has long been used to predict the future on the basis of past and present knowledge. Mathematical Models of Information and Stochastic Systems shows that the amount of knowledge about a system plays an important role in the mathematical models used to foretell the future of the system. It explains how this known quantity of information is used to derive a system's probabilistic properties. After an introduction, the book presents several basic principles that are employed in the remainder of the t

  8. Analytical performance modeling for computer systems

    CERN Document Server

    Tay, Y C

    2013-01-01

This book is an introduction to analytical performance modeling for computer systems, i.e., writing equations to describe their performance behavior. It is accessible to readers who have taken college-level courses in calculus and probability, networking and operating systems. This is not a training manual for becoming an expert performance analyst. Rather, the objective is to help the reader construct simple models for analyzing and understanding the systems that they are interested in. Describing a complicated system abstractly with mathematical equations requires a careful choice of assumpti

  9. Description, Modelling and Design of Production Systems

    DEFF Research Database (Denmark)

    Jacobsen, Peter; Rudolph, Carsten

    1997-01-01

Design of production systems is rarely an activity in which decision makers in most production companies have much experience. In the future, this activity will recur more often due to increasingly frequent changes in the production task. Consequently, the decision makers need better...... management tools and methods for the description and modelling of production systems to support their decisions. In this article a structural framework to describe and model production systems is introduced, and it is shown how the production system of a minor Danish manufacturer of electromechanical...

  10. Modelling energy systems for developing countries

    International Nuclear Information System (INIS)

    Urban, F.; Benders, R.M.J.; Moll, H.C.

    2007-01-01

Developing countries' energy use is rapidly increasing, which affects global climate change and global and regional energy settings. Energy models are helpful for exploring the future of developing and industrialised countries. However, energy systems of developing countries differ from those of industrialised countries, which has consequences for energy modelling. New requirements need to be met by present-day energy models to adequately explore the future of developing countries' energy systems. This paper aims to assess if the main characteristics of developing countries are adequately incorporated in present-day energy models. We first discuss these main characteristics, focusing particularly on developing Asia, and then present a model comparison of 12 selected energy models to test their suitability for developing countries. We conclude that many models are biased towards industrialised countries, neglecting main characteristics of developing countries, e.g. the informal economy, supply shortages, poor performance of the power sector, structural economic change, electrification, traditional bio-fuels, urban-rural divide. To more adequately address the energy systems of developing countries, energy models have to be adjusted and new models have to be built. We therefore indicate how to improve energy models for increasing their suitability for developing countries and give advice on modelling techniques and data requirements.

  11. Rapid Discrimination Among Putative Mechanistic Models of Biochemical Systems.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2016-08-31

    An overarching goal in molecular biology is to gain an understanding of the mechanistic basis underlying biochemical systems. Success is critical if we are to predict effectively the outcome of drug treatments and the development of abnormal phenotypes. However, data from most experimental studies is typically noisy and sparse. This allows multiple potential mechanisms to account for experimental observations, and often devising experiments to test each is not feasible. Here, we introduce a novel strategy that discriminates among putative models based on their repertoire of qualitatively distinct phenotypes, without relying on knowledge of specific values for rate constants and binding constants. As an illustration, we apply this strategy to two synthetic gene circuits exhibiting anomalous behaviors. Our results show that the conventional models, based on their well-characterized components, cannot account for the experimental observations. We examine a total of 40 alternative hypotheses and show that only 5 have the potential to reproduce the experimental data, and one can do so with biologically relevant parameter values.
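
The screening idea described here — eliminate candidate mechanisms whose repertoire of qualitatively distinct phenotypes cannot account for the observations — can be sketched in a few lines. This is a hypothetical simplification: the paper's method enumerates phenotypes symbolically without specific rate constants, and the model names and repertoires below are invented for illustration.

```python
def discriminate(candidates, observed):
    """Keep only candidate models whose repertoire of qualitatively distinct
    phenotypes can account for every observed phenotype."""
    return sorted(name for name, repertoire in candidates.items()
                  if observed <= repertoire)

# Hypothetical repertoires, for illustration only.
candidates = {
    "conventional": {"monostable"},
    "alt-hypothesis-1": {"monostable", "bistable"},
    "alt-hypothesis-2": {"bistable", "oscillatory"},
}
viable = discriminate(candidates, observed={"monostable", "bistable"})
```

With these toy repertoires, only "alt-hypothesis-1" can reproduce both observed behaviors, mirroring how the authors narrow 40 hypotheses down to 5.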

  12. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
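
Of the diagnostic methods listed, the time-lagged correlation map is the easiest to illustrate. A pure-Python sketch (function and variable names are ours, not CMDA's): correlate one variable against another at a range of time lags to find lead-lag relationships between model fields and observations.

```python
from statistics import mean, pstdev

def lagged_correlation(x, y, max_lag):
    """Pearson correlation between x[t] and y[t + lag] for lag = 0..max_lag.
    A peak at a nonzero lag indicates that x leads y by that many steps."""
    out = {}
    for lag in range(max_lag + 1):
        a, b = x[:len(x) - lag], y[lag:]
        ma, mb = mean(a), mean(b)
        cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / len(a)
        sa, sb = pstdev(a), pstdev(b)
        out[lag] = cov / (sa * sb) if sa and sb else 0.0
    return out
```

For a series y that is simply x delayed by three steps, the map peaks at lag 3 with correlation 1.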

  13. A distributed snow-evolution modeling system (SnowModel)

    Science.gov (United States)

    Glen E. Liston; Kelly. Elder

    2006-01-01

    SnowModel is a spatially distributed snow-evolution modeling system designed for application in landscapes, climates, and conditions where snow occurs. It is an aggregation of four submodels: MicroMet defines meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowPack simulates snow depth and water-equivalent evolution, and SnowTran-3D...

  14. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
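
IVSEM's actual combination logic is not given in this abstract, but the standard independence approximation — the system misses an event only if every technology misses it — illustrates the kind of cross-technology synergy the model quantifies. The per-technology probabilities below are made-up values, not IVSEM outputs.

```python
def integrated_detection_probability(p_by_technology):
    """System-level probability of detection under the (assumed) independence
    approximation: an event goes undetected only if every subsystem misses it."""
    miss = 1.0
    for p in p_by_technology.values():
        miss *= (1.0 - p)
    return 1.0 - miss

p_system = integrated_detection_probability({
    "seismic": 0.9, "infrasound": 0.5, "radionuclide": 0.3, "hydroacoustic": 0.2,
})
```

Even technologies that are individually weak raise the integrated probability of detection well above the best single subsystem, which is the synergy argument the abstract makes.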

  15. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, ''top-level,'' modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  16. Learning Markov models for stationary system behaviors

    DEFF Research Database (Denmark)

    Chen, Yingke; Mao, Hua; Jaeger, Manfred

    2012-01-01

Establishing an accurate model for formal verification of an existing hardware or software system is often a manual process that is both time consuming and resource demanding. In order to ease the model construction phase, methods have recently been proposed for automatically learning accurate...... to a single long observation sequence, and in these situations existing automatic learning methods cannot be applied. In this paper, we adapt algorithms for learning variable order Markov chains from a single observation sequence of a target system, so that stationary system properties can be verified using the learned model. Experiments demonstrate that system properties (formulated as stationary probabilities of LTL formulas) can be reliably identified using the learned model....
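
A minimal sketch of the underlying idea — estimating a Markov chain and stationary state frequencies from one long observation sequence. This uses a fixed first-order chain for brevity, whereas the paper adapts variable-order algorithms; the code is ours, not the authors'.

```python
from collections import Counter, defaultdict

def learn_markov_chain(seq):
    """Estimate first-order transition probabilities from a single
    observation sequence by counting consecutive state pairs."""
    counts = defaultdict(Counter)
    for a, b in zip(seq, seq[1:]):
        counts[a][b] += 1
    return {s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
            for s, nxt in counts.items()}

def empirical_stationary(seq):
    """Empirical state frequencies; for a long ergodic sequence these
    approximate the stationary distribution of the learned chain."""
    c = Counter(seq)
    return {s: k / len(seq) for s, k in c.items()}
```

Stationary probabilities of LTL state formulas over the learned chain can then be checked against such empirical frequencies.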

  17. Power system coherency and model reduction

    CERN Document Server

    Chow, Joe H

    2014-01-01

    This book provides a comprehensive treatment for understanding interarea modes in large power systems and obtaining reduced-order models using the coherency concept and selective modal analysis method.

  18. Regional Ocean Modeling System (ROMS): Samoa

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 7-day, 3-hourly forecast for the region surrounding the islands of Samoa at approximately 3-km resolution. While considerable...

  19. REVIEW OF AQUACULTURAL PRODUCTION SYSTEM MODELS

    African Journals Online (AJOL)

    user

models of aquacultural production systems with the aim of adopting a suitable one for ... of predicting the environmental condition, so as to determine the point of diminishing returns and optimize yield in an ..... sale of fish are also tracked.

  20. Spinal Cord Injury Model System Information Network

    Science.gov (United States)

The University of Alabama at Birmingham Spinal Cord Injury Model System (UAB-SCIMS) maintains this Information Network as a resource to promote knowledge in the ...

  1. Regional Ocean Modeling System (ROMS): Guam

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 6-day, 3-hourly forecast for the region surrounding Guam at approximately 2-km resolution. While considerable effort has been...

  2. Regional Ocean Modeling System (ROMS): Oahu

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 7-day, 3-hourly forecast for the region surrounding the island of Oahu at approximately 1-km resolution. While considerable...

  3. Regional Ocean Modeling System (ROMS): CNMI

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 7-day, 3-hourly forecast for the region surrounding the Commonwealth of the Northern Mariana Islands (CNMI) at approximately...

  4. A Telecommunications Industry Primer: A Systems Model.

    Science.gov (United States)

    Obermier, Timothy R.; Tuttle, Ronald H.

    2003-01-01

    Describes the Telecommunications Systems Model to help technical educators and students understand the increasingly complex telecommunications infrastructure. Specifically looks at ownership and regulatory status, service providers, transport medium, network protocols, and end-user services. (JOW)

  5. Model reduction of port-Hamiltonian systems as structured systems

    NARCIS (Netherlands)

    Polyuga, R.V.; Schaft, van der A.J.

    2010-01-01

    The goal of this work is to demonstrate that a specific projection-based model reduction method, which provides an H2 error bound, turns out to be applicable to port-Hamiltonian systems, preserving the port-Hamiltonian structure for the reduced order model, and, as a consequence, passivity.

  6. Balmorel open source energy system model

    DEFF Research Database (Denmark)

    Wiese, Frauke; Bramstoft, Rasmus; Koduvere, Hardi

    2018-01-01

    As the world progresses towards a cleaner energy future with more variable renewable energy sources, energy system models are required to deal with new challenges. This article describes design, development and applications of the open source energy system model Balmorel, which is a result...... of a long and fruitful cooperation between public and private institutions within energy system research and analysis. The purpose of the article is to explain the modelling approach, to highlight strengths and challenges of the chosen approach, to create awareness about the possible applications...... of Balmorel as well as to inspire to new model developments and encourage new users to join the community. Some of the key strengths of the model are the flexible handling of the time and space dimensions and the combination of operation and investment optimisation. Its open source character enables diverse...

  7. A stream-based mathematical model for distributed information processing systems - SysLab system model

    OpenAIRE

    Klein, Cornel; Rumpe, Bernhard; Broy, Manfred

    2014-01-01

In the SysLab project we develop a software engineering method based on a mathematical foundation. The SysLab system model serves as an abstract mathematical model for information systems and their components. It is used to formalize the semantics of all used description techniques, such as object diagrams, state automata, sequence charts, or data-flow diagrams. Based on the requirements for such a reference model, we define the system model including its different views and their relationships.

  8. Programming model for distributed intelligent systems

    Science.gov (United States)

    Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.

    1988-01-01

    A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

  9. New Directions in Modeling the Lighting Systems

    Directory of Open Access Journals (Sweden)

    P. Fiala

    2004-12-01

Full Text Available This paper presents information about new directions in the modeling of lighting systems, and an overview of methods for the modeling of lighting systems. The new R-FEM method is described, which is a combination of the Radiosity method and the Finite Element Method. The paper contains modeling results and their verification by experimental measurements and by the Matlab simulation for this R-FEM method.

  10. Modelling of Signal - Level Crossing System

    Directory of Open Access Journals (Sweden)

    Daniel Novak

    2006-01-01

Full Text Available The author presents an object-oriented model of a railway level-crossing system created for the purpose of functional requirements specification. Unified Modelling Language (UML, version 1.4, which enables specification, visualisation, construction and documentation of software system artefacts, was used. The main attention was paid to the analysis and design phases. The former phase resulted in the creation of use case diagrams and sequence diagrams, the latter in the creation of class/object diagrams and statechart diagrams.

  11. Ellipsoidal bag model for heavy quark system

    International Nuclear Information System (INIS)

    Bi Pinzhen; Fudan Univ., Shanghai

    1991-01-01

The ellipsoidal bag model is used to describe heavy quark systems such as Q anti-Q, Q anti-Q g and Q² anti-Q². Instead of a two-step model, these states are described by a uniform picture. The potential derived from the ellipsoidal bag for Q anti-Q is almost equivalent to the Cornell potential. For a Q² anti-Q² system with large quark pair separation, an improvement of 70 MeV is obtained compared with the spherical bag. (orig.)
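
For reference, the Cornell potential that the ellipsoidal-bag result approximates combines a Coulomb-like short-range term with linear confinement. A sketch with the commonly quoted fit parameters (kappa ≈ 0.52 and a ≈ 2.34 GeV⁻¹ are the standard Cornell values, quoted here as an assumption, not taken from this paper):

```python
def cornell_potential(r, kappa=0.52, a=2.34):
    """Cornell potential V(r) = -kappa/r + r/a**2 for a heavy quark-antiquark
    pair (r and a in GeV^-1, V in GeV). Parameter values are the commonly
    quoted Cornell fit -- an assumption, not values from this paper."""
    return -kappa / r + r / a ** 2
```

The short-range term dominates at small r (attraction), while the linear term dominates at large r (confinement), which is the regime where the bag-model comparison above is made.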

  12. Model Reduction of Fuzzy Logic Systems

    Directory of Open Access Journals (Sweden)

    Zhandong Yu

    2014-01-01

    Full Text Available This paper deals with the problem of ℒ2-ℒ∞ model reduction for continuous-time nonlinear uncertain systems. The approach of the construction of a reduced-order model is presented for high-order nonlinear uncertain systems described by the T-S fuzzy systems, which not only approximates the original high-order system well with an ℒ2-ℒ∞ error performance level γ but also translates it into a linear lower-dimensional system. Then, the model approximation is converted into a convex optimization problem by using a linearization procedure. Finally, a numerical example is presented to show the effectiveness of the proposed method.

  13. An ecological process model of systems change.

    Science.gov (United States)

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  14. Hybrid Energy System Modeling in Modelica

    Energy Technology Data Exchange (ETDEWEB)

    William R. Binder; Christiaan J. J. Paredis; Humberto E. Garcia

    2014-03-01

In this paper, a Hybrid Energy System (HES) configuration is modeled in Modelica. Hybrid Energy Systems (HES) have as their defining characteristic the use of one or more energy inputs, combined with the potential for multiple energy outputs. Compared to traditional energy systems, HES provide additional operational flexibility so that high variability in both energy production and consumption levels can be absorbed more effectively. This is particularly important when including renewable energy sources, whose output levels are inherently variable, determined by nature. The specific HES configuration modeled in this paper includes two energy inputs: a nuclear plant and a series of wind turbines. In addition, the system produces two energy outputs: electricity and synthetic fuel. The models are verified through simulations of the individual components and the system as a whole. The simulations are performed for a range of component sizes, operating conditions, and control schemes.

  15. System model development for nuclear thermal propulsion

    International Nuclear Information System (INIS)

    Walton, J.T.; Perkins, K.R.; Buksa, J.J.; Worley, B.A.; Dobranich, D.

    1992-01-01

A critical enabling technology in the evolutionary development of nuclear thermal propulsion (NTP) is the ability to predict the system performance under a variety of operating conditions. Since October 1991, the U.S. Department of Energy (DOE), Department of Defense (DOD), and NASA have initiated critical technology development efforts for NTP systems to be used on Space Exploration Initiative (SEI) missions to the Moon and Mars. This paper presents the strategy and progress of an interagency NASA/DOE/DOD team for NTP system modeling. It is the intent of the interagency team to develop several levels of computer programs to simulate various NTP systems. An interagency team was formed for this task to use the best capabilities available and to assure appropriate peer review. The vision and strategy of the interagency team for developing NTP system models will be discussed in this paper. A review of the progress on the Level 1 interagency model is also presented

  16. Economic model of pipeline transportation systems

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.

    1977-07-29

The objective of the work reported here was to develop a model which could be used to assess the economic effects of energy-conservative technological innovations upon the pipeline industry. The model is a dynamic simulator which accepts inputs of two classes: the physical description (design parameters, fluid properties, and financial structures) of the system to be studied, and the postulated market (throughput and price) projection. The model consists of time-independent submodels: the fluidics model which simulates the physical behavior of the system, and the financial model which operates upon the output of the fluidics model to calculate the economics outputs. Any of a number of existing fluidics models can be used in addition to that developed as a part of this study. The financial model, known as the Systems, Science and Software (S³) Financial Projection Model, contains user options whereby pipeline-peculiar characteristics can be removed and/or modified, so that the model can be applied to virtually any kind of business enterprise. The several dozen outputs are of two classes: the energetics and the economics. The energetics outputs of primary interest are the energy intensity, also called unit energy consumption, and the total energy consumed. The primary economics outputs are the long-run average cost, profit, cash flow, and return on investment.
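
The chaining of the two submodels — a fluidics result feeding a financial calculation that yields energy intensity and long-run average cost — can be sketched with toy quantities. All names, formulas, and numbers below are illustrative stand-ins, not those of the S³ model.

```python
def fluidics_model(throughput, unit_pump_energy):
    """Toy stand-in for the fluidics submodel: pumping energy (kWh)
    required to move the given throughput."""
    return throughput * unit_pump_energy

def financial_model(throughput, pump_energy, capital_cost, opex, energy_price):
    """Toy stand-in for the financial submodel, operating on the fluidics
    output to produce the two headline economics outputs."""
    energy_intensity = pump_energy / throughput              # kWh per unit moved
    total_cost = capital_cost + opex + pump_energy * energy_price
    long_run_average_cost = total_cost / throughput
    return energy_intensity, long_run_average_cost

energy = fluidics_model(throughput=1000.0, unit_pump_energy=0.5)
intensity, lrac = financial_model(1000.0, energy, capital_cost=2000.0,
                                  opex=300.0, energy_price=0.1)
```

An energy-conserving innovation would lower `unit_pump_energy`, and the chained structure shows how that propagates into both the energetics output (intensity) and the economics output (long-run average cost).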

  17. System Identification, Environmental Modelling, and Control System Design

    CERN Document Server

    Garnier, Hugues

    2012-01-01

System Identification, Environmental Modelling, and Control System Design is dedicated to Professor Peter Young on the occasion of his seventieth birthday. Professor Young has been a pioneer in systems and control, and over the past 45 years he has influenced many developments in this field. This volume comprises a collection of contributions by leading experts in system identification, time-series analysis, environmetric modelling and control system design – modern research in topics that reflect important areas of interest in Professor Young’s research career. Recent theoretical developments in and relevant applications of these areas are explored, treating the various subjects broadly and in depth. The authoritative and up-to-date research presented here will be of interest to academic researchers in control and in disciplines related to environmental research, particularly those concerned with water systems. The tutorial style in which many of the contributions are composed also makes the book suitable as ...

  18. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated from the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The object behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability
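
The core of such a statistical model — systematic bias and precision estimated from repeated measurements of a control standard with a certified value — can be sketched as follows (a minimal illustration of the definitions above, not the ICPP software):

```python
from statistics import mean, stdev

def bias_and_precision(known_value, measurements):
    """Systematic bias = mean measurement minus the certified value of the
    control standard; precision = standard deviation of the repeated
    measurements of that standard."""
    return mean(measurements) - known_value, stdev(measurements)
```

For a standard certified at 10.0 measured as 10.1, 10.3, and 10.2, the model reports a bias of +0.2 and a precision of 0.1, which is exactly the bias/precision pair the paragraph describes.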

  19. Modelling the Replication Management in Information Systems

    Directory of Open Access Journals (Sweden)

    Cezar TOADER

    2017-01-01

Full Text Available In the modern economy, the benefits of Web services are significant because they facilitate the automation of activities in Internet-distributed businesses, as well as cooperation between organizations through interconnection processes running in their computer systems. This paper presents the development stages of a model for a reliable information system. It describes the communication between the processes within the distributed system, based on message exchange, and also presents the problem of distributed agreement among processes. A list of objectives for fault-tolerant systems is defined and a framework model for distributed systems is proposed. This framework distinguishes between management operations and execution operations. The proposed model promotes the use of a central process especially designed for the coordination and control of other application processes. The execution phases and the protocols for the management and the execution components are presented. This model of a reliable system could be a foundation for an entire class of distributed system models based on the management of the replication process.
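
Distributed agreement among replicas is the crux of such a design. A toy majority-vote sketch conveys the idea (the paper's coordinator-based protocol is richer; this function and its name are our illustration):

```python
from collections import Counter

def majority_agreement(replica_values):
    """Return the value reported by a strict majority of replicas, or raise
    if the replicas cannot reach agreement by simple majority vote."""
    value, count = Counter(replica_values).most_common(1)[0]
    if count > len(replica_values) // 2:
        return value
    raise ValueError("no majority among replicas")
```

A coordinating process, like the central process the abstract proposes, would apply such a rule to replica responses before committing an operation.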

  20. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

System models have recently been introduced to model organisations and evaluate their vulnerability to threats, especially insider threats. For the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outside attackers. Therefore, many attacks are considerably easier to perform for insiders than for outsiders. However, current models do not support explicit specification of different behaviours. Instead, behaviour is deeply embedded in the analyses supported by the models, meaning that it is a complex, if not impossible, task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...