WorldWideScience

Sample records for modeling system reproduced

  1. How Modeling Standards, Software, and Initiatives Support Reproducibility in Systems Biology and Systems Medicine.

    Science.gov (United States)

    Waltemath, Dagmar; Wolkenhauer, Olaf

    2016-10-01

    Only reproducible results are of significance to science. The lack of suitable standards and appropriate support of standards in software tools has led to numerous publications with irreproducible results. Our objectives are to identify the key challenges of reproducible research and to highlight existing solutions. In this paper, we summarize problems concerning reproducibility in systems biology and systems medicine. We focus on initiatives, standards, and software tools that aim to improve the reproducibility of simulation studies. The long-term success of systems biology and systems medicine depends on trustworthy models and simulations. This requires openness to ensure reusability and transparency to enable reproducibility of results in these fields.

  2. Circuit modeling of the electrical impedance: II. Normal subjects and system reproducibility

    International Nuclear Information System (INIS)

    Shiffman, C A; Rutkove, S B

    2013-01-01

Part I of this series showed that the five-element circuit model accurately mimics impedances measured using multi-frequency electrical impedance myography (MFEIM), focusing on changes brought on by disease. This paper addresses two requirements which must be met if the method is to qualify for clinical use. First, the extracted parameters must be reproducible over long time periods such as those involved in the treatment of muscular disease, and second, differences amongst normal subjects should be attributable to known differences in the properties of healthy muscle. It applies the method to five muscle groups in 62 healthy subjects, closely following the procedure used earlier for the diseased subjects. Test–retest comparisons show that parameters are reproducible at levels from 6 to 16% (depending on the parameter) over time spans of up to 267 days, levels far below the changes occurring in serious disease. Also, variations with age, gender and muscle location are found to be consistent with established expectations for healthy muscle tissue. We conclude that the combination of MFEIM measurements and five-element circuit analysis genuinely reflects properties of muscle and is reliable enough to recommend its use in following neuromuscular disease.

  3. Coupled RipCAS-DFLOW (CoRD) Software and Data Management System for Reproducible Floodplain Vegetation Succession Modeling

    Science.gov (United States)

    Turner, M. A.; Miller, S.; Gregory, A.; Cadol, D. D.; Stone, M. C.; Sheneman, L.

    2016-12-01

We present the Coupled RipCAS-DFLOW (CoRD) modeling system created to encapsulate the workflow for analyzing the effects of stream flooding on vegetation succession. CoRD provides an intuitive command-line and web interface to run DFLOW and RipCAS in succession over many years automatically, which is a challenge because, for our application, DFLOW must be run on a supercomputing cluster via the PBS job scheduler. RipCAS is a vegetation succession model, and DFLOW is a 2D open channel flow model. Data adaptors have been developed to seamlessly convert DFLOW outputs into RipCAS inputs, and vice versa. CoRD provides automated statistical analysis and visualization, plus automatic syncing of input and output files and model run metadata to the hydrological data management system HydroShare using its excellent Python REST client. This combination of technologies and data management techniques allows the results to be shared with collaborators and eventually published. Perhaps most importantly, it allows results to be easily reproduced via either the command-line or web user interface. This system is a result of collaboration between software developers and hydrologists participating in the Western Consortium for Watershed Analysis, Visualization, and Exploration (WC-WAVE). Because of the computing-intensive nature of this particular workflow, including automated job submission/monitoring and data adaptors, software engineering expertise is required. However, the hydrologists provide the software developers with a purpose and ensure a useful, intuitive tool is developed. Our hydrologists contribute software, too: RipCAS was developed from scratch by hydrologists on the team as a specialized, open-source version of the Computer Aided Simulation Model for Instream Flow and Riparia (CASiMiR) vegetation model; our hydrologists running DFLOW provided numerous examples and help with the supercomputing system. This project is written in Python, a popular language in the
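The coupling pattern described here can be sketched in Python, the project's own language. This is a hedged illustration only: the function names, the PBS job script, and the directory layout are invented for the sketch and are not CoRD's actual API.

```python
# Sketch of a yearly RipCAS-DFLOW coupling loop. Placeholder names
# throughout; the real CoRD handles PBS monitoring, metadata capture,
# and HydroShare syncing as described above.
import subprocess
from pathlib import Path

def run_dflow_on_cluster(dflow_in: Path, dflow_out: Path) -> None:
    """Submit a DFLOW run through the PBS scheduler and wait for it."""
    subprocess.run(
        ["qsub", "-W", "block=true",                 # block until job finishes
         "-v", f"IN={dflow_in},OUT={dflow_out}",
         "dflow_job.pbs"],                           # hypothetical job script
        check=True,
    )

def dflow_to_ripcas(dflow_out: Path) -> Path:
    """Adaptor: translate DFLOW hydraulic output into RipCAS input."""
    ...  # e.g. extract shear stress onto the RipCAS grid

def ripcas_to_dflow(ripcas_out: Path) -> Path:
    """Adaptor: translate the RipCAS vegetation map into DFLOW roughness."""
    ...

def run_ripcas(ripcas_in: Path) -> Path:
    """Run the RipCAS succession model for one year."""
    ...

def simulate(n_years: int, workdir: Path) -> None:
    dflow_in = workdir / "year_0_dflow_in"
    for year in range(n_years):
        dflow_out = workdir / f"year_{year}_dflow_out"
        run_dflow_on_cluster(dflow_in, dflow_out)
        ripcas_out = run_ripcas(dflow_to_ripcas(dflow_out))
        dflow_in = ripcas_to_dflow(ripcas_out)       # feeds next year's DFLOW
```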

  4. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

The reproduction and replication of research results have become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, the lack of open, transparent and fair benchmark sets presents another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  5. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

Objective: Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods: Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results: Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators, (2) shared computational resources, (3) declarative model descriptors, ontologies and standardized annotations, and (4) model sharing repositories and sharing standards. Conclusion: A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in the development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance: Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  6. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

Multiphase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and pore-scale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-pore-scale geometry differences in the microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using the commercial software STAR-CCM+, with both constant and randomly varying injection rates. Stochastic simulations are able to capture the variability in the experiments associated with the varying pump injection rate.
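The randomly varying injection-rate boundary condition can be illustrated in a few lines of Python; the mean rate, fluctuation amplitude and correlation time below are assumptions for the sketch, not values from the study (which used STAR-CCM+):

```python
# Synthetic injection-rate series mimicking syringe-pump jitter:
# white noise smoothed so the fluctuations have a finite correlation time.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 60.0, 601)            # time axis, s
q_mean = 0.5                               # mean rate (assumed value/units)
sigma = 0.05 * q_mean                      # 5 % relative jitter (assumed)

white = rng.normal(0.0, sigma, t.size)
kernel = np.ones(20) / 20                  # ~2 s moving average
q = q_mean + np.convolve(white, kernel, mode="same")
# q(t) would be imposed as the inlet boundary condition of each
# stochastic realization in place of the constant rate q_mean.
```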

  7. A reproducible canine model of esophageal varices.

    Science.gov (United States)

    Jensen, D M; Machicado, G A; Tapia, J I; Kauffman, G; Franco, P; Beilin, D

    1983-03-01

One of the most promising nonoperative techniques for control of variceal hemorrhage is sclerosis via the fiberoptic endoscope. Many questions remain, however, about sclerosing agents, guidelines for effective use, and limitations of endoscopic techniques. A reproducible large animal model of esophageal varices would facilitate the critical evaluation of techniques for variceal hemostasis or sclerosis. Our purpose was to develop a large animal model of esophageal varices. Studies in pigs and dogs are described which led to the development of a reproducible canine model of esophageal varices. For the final model, mongrel dogs had laparotomy, side-to-side portacaval shunt, inferior vena cava ligation, placement of an ameroid constrictor around the portal vein, and liver biopsy. The mean (± SE) pre- and postshunt portal pressure increased significantly from 12 ± 0.4 to 23 ± 1 cm saline. Weekly endoscopies were performed to grade the varix size. Two-thirds of animals developed medium or large sized esophageal varices after the first operation. Three to six weeks later, a second laparotomy with complete ligation of the portal vein and liver biopsy were performed in animals with varices (one-third of the animals). All dogs developed esophageal varices and abdominal wall collateral veins of variable size 3-6 wk after the first operation. After the second operation, the varices became larger. Shunting of blood through esophageal varices via splenic and gastric veins was demonstrated by angiography. Sequential liver biopsies were normal. There was no morbidity or mortality. Ascites, encephalopathy, or spontaneous variceal bleeding did not occur. We have documented the lack of size change and the persistence of medium to large esophageal varices and abdominal collateral veins in all animals followed for more than 6 mo. Variceal bleeding could be induced by venipuncture for testing endoscopic hemostatic and sclerosis methods. We suggest other potential uses of this

  8. Skills of General Circulation and Earth System Models in reproducing streamflow to the ocean: the case of Congo river

    Science.gov (United States)

    Santini, M.; Caporaso, L.

    2017-12-01

Despite the importance of water resources in the context of climate change, it remains difficult to correctly simulate the freshwater cycle over land via General Circulation and Earth System Models (GCMs and ESMs). Existing efforts within the Climate Model Intercomparison Project 5 (CMIP5) were mainly devoted to the validation of atmospheric variables like temperature and precipitation, with little attention to discharge. Here we investigate the present-day performance of GCMs and ESMs participating in CMIP5 in simulating the discharge of the river Congo to the sea, thanks to: i) the long-term availability of discharge data for the Kinshasa hydrological station, representative of more than 95% of the water flowing in the whole catchment; and ii) the river's still low degree of human alteration, which enables comparison with the (mostly) natural streamflow simulated within CMIP5. Our findings suggest that most models overestimate the streamflow in terms of seasonal cycle, especially in late winter and spring, while the overestimation and the variability across models are lower in late summer. Weighted ensemble means, based on simulation performance under several metrics, are also calculated and show some improvement of the results. Although simulated inter-monthly and inter-annual percent anomalies do not appear significantly different from those in observed data, such anomalies can be misleading when translated into well-consolidated indicators of drought attributes (frequency, magnitude, timing, duration) usually adopted for more immediate communication to stakeholders and decision makers. These inconsistencies produce incorrect assessments for water management planning and infrastructure (e.g. dams or irrigated areas), especially if models are used instead of measurements, as in the case of ungauged basins or basins with insufficient data, as well as when relying on models for future estimates without a preliminary quantification of model biases.

  9. 3D-modeling of the spine using EOS imaging system: Inter-reader reproducibility and reliability.

    Science.gov (United States)

    Rehm, Johannes; Germann, Thomas; Akbar, Michael; Pepke, Wojciech; Kauczor, Hans-Ulrich; Weber, Marc-André; Spira, Daniel

    2017-01-01

To retrospectively assess the interreader reproducibility and reliability of EOS 3D full spine reconstructions in patients with adolescent idiopathic scoliosis (AIS). 73 patients with a mean age of 17 years and moderate AIS (median Cobb angle 18.2°) underwent low-dose standing biplanar radiography with EOS. Two independent readers performed 3D reconstructions of the spine with the "full spine" method, adjusting the bone contour of every thoracic and lumbar vertebra (Th1-L5). Interreader reproducibility was assessed regarding rotation of every single vertebra in the coronal (i.e. frontal), sagittal (i.e. lateral), and axial planes, T1/T12 kyphosis, T4/T12 kyphosis, L1/L5 lordosis, L1/S1 lordosis and pelvic parameters. Radiation exposure, scan time and 3D reconstruction time were recorded. Intraclass correlation (ICC) ranged between 0.83 and 0.98 for frontal vertebral rotation, between 0.94 and 0.99 for lateral vertebral rotation and between 0.51 and 0.88 for axial vertebral rotation. ICC was 0.92 for T1/T12 kyphosis, 0.95 for T4/T12 kyphosis, 0.90 for L1/L5 lordosis, 0.85 for L1/S1 lordosis, 0.97 for pelvic incidence, 0.96 for sacral slope, 0.98 for sagittal pelvic tilt and 0.94 for lateral pelvic tilt. The mean time for reconstruction was 14.9 minutes (reader 1: 14.6 minutes, reader 2: 15.2 minutes). 3D angle measurement of vertebral rotation proved to be reliable and was performed in an acceptable reconstruction time. Interreader reproducibility of axial rotation was limited to some degree in the upper and middle thoracic spine due to the obtuse angulation of the pedicles and the processi spinosi in the frontal view, somewhat complicating their delineation.

  10. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
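A toy example (ours, not the paper's) of why single-precision accumulation is fragile: near 1e7 the float32 grid spacing is 1.0, so small per-step contributions are rounded away entirely, and any implementation difference that shifts when that happens changes the final answer:

```python
# float32 vs float64 accumulation of many small increments
import numpy as np

x32 = np.float32(1e7)
x64 = np.float64(1e7)
for _ in range(1000):
    x32 += np.float32(0.01)   # 0.01 is below float32 resolution at 1e7
    x64 += 0.01

print(x32)   # 10000000.0  -- every single-precision increment was lost
print(x64)   # ~10000010.0 -- double precision retains them
```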

  11. Towards reproducible descriptions of neuronal network models.

    Directory of Open Access Journals (Sweden)

    Eilen Nordlie

    2009-08-01

Progress in science depends on the effective exchange of ideas among scientists. New ideas can be assessed and criticized in a meaningful manner only if they are formulated precisely. This applies to simulation studies as well as to experiments and theories. But after more than 50 years of neuronal network simulations, we still lack a clear and common understanding of the role of computational models in neuroscience as well as established practices for describing network models in publications. This hinders the critical evaluation of network models as well as their re-use. We analyze here 14 research papers proposing neuronal network models of different complexity and find widely varying approaches to model descriptions, with regard to both the means of description and the ordering and placement of material. We further observe great variation in the graphical representation of networks and the notation used in equations. Based on our observations, we propose a good model description practice, composed of guidelines for the organization of publications, a checklist for model descriptions, templates for tables presenting model structure, and guidelines for diagrams of networks. The main purpose of this good practice is to trigger a debate about the communication of neuronal network models in a manner comprehensible to humans, as opposed to machine-readable model description languages. We believe that the good model description practice proposed here, together with a number of other recent initiatives on data-, model-, and software-sharing, may lead to a deeper and more fruitful exchange of ideas among computational neuroscientists in years to come. We further hope that work on standardized ways of describing--and thinking about--complex neuronal networks will lead the scientific community to a clearer understanding of high-level concepts in network dynamics, and will thus lead to deeper insights into the function of the brain.

  12. Reproducibility and Transparency in Ocean-Climate Modeling

    Science.gov (United States)

    Hannah, N.; Adcroft, A.; Hallberg, R.; Griffies, S. M.

    2015-12-01

Reproducibility is a cornerstone of the scientific method. Within geophysical modeling and simulation, achieving reproducibility can be difficult, especially given the complexity of numerical codes, enormous and disparate data sets, and the variety of supercomputing technology. We have made progress on this problem in the context of a large project - the development of new ocean and sea ice models, MOM6 and SIS2. Here we present useful techniques and experience. We use version control not only for code but for the entire experiment working directory, including configuration (run-time parameters, component versions), input data and checksums on experiment output. This allows us to document when the solutions to experiments change, whether due to code updates or changes in input data. To avoid distributing large input datasets we provide the tools for generating these from the sources, rather than providing raw input data. Bugs can be a source of non-determinism and hence irreproducibility, e.g. reading from or branching on uninitialized memory. To expose these we routinely run system tests, using a memory debugger, multiple compilers and different machines. Additional confidence in the code comes from specialised tests, for example automated dimensional analysis and domain transformations. This has entailed adopting a code style where we deliberately restrict what a compiler can do when re-arranging mathematical expressions. In the spirit of open science, all development is in the public domain. This leads to a positive feedback, where increased transparency and reproducibility make using the model easier for external collaborators, who in turn provide valuable contributions. To facilitate users installing and running the model we provide (version controlled) digital notebooks that illustrate and record analysis of output. This has the dual role of providing a gross, platform-independent testing capability and a means to document model output and analysis.
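The output-checksum idea is simple to sketch; the snippet below is a generic stand-in (output location and NetCDF file pattern assumed), not the MOM6/SIS2 tooling itself:

```python
# Build a version-controllable manifest of output-file hashes: if a code
# or input change alters the solutions, the manifest diff reveals it.
import hashlib
import json
from pathlib import Path

def checksum_manifest(outdir: str, pattern: str = "*.nc") -> dict:
    sums = {}
    for path in sorted(Path(outdir).rglob(pattern)):
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(1 << 20), b""):
                digest.update(block)
        sums[str(path)] = digest.hexdigest()
    return sums

manifest = checksum_manifest("experiment/output")   # path is an example
Path("checksums.json").write_text(json.dumps(manifest, indent=2))
```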

  13. Measurement of cerebral blood flow by intravenous xenon-133 technique and a mobile system. Reproducibility using the Obrist model compared to total curve analysis

    DEFF Research Database (Denmark)

    Schroeder, T; Holstein, P; Lassen, N A

    1986-01-01

The recent development of a mobile 10-detector unit, using the i.v. Xenon-133 technique, has made it possible to perform repeated bedside measurements of cerebral blood flow (CBF). Test-retest studies were carried out in 38 atherosclerotic subjects, in order to evaluate the reproducibility of CBF level … was considerably more reproducible than CBF level. Using a single detector instead of five regional values averaged as the hemispheric flow increased the standard deviation of CBF level by 10-20%, while the variation in asymmetry was doubled. In optimal measuring conditions the two models revealed no significant … differences, but in low-flow situations the artifact model yielded significantly more stable results. The present apparatus, equipped with 3-5 detectors covering each hemisphere, offers the opportunity of performing serial CBF measurements in situations not otherwise feasible.

  14. Theoretical Modeling and Computer Simulations for the Origins and Evolution of Reproducing Molecular Systems and Complex Systems with Many Interactive Parts

    Science.gov (United States)

    Liang, Shoudan

    2000-01-01

Our research effort has produced nine publications in peer-reviewed journals listed at the end of this report. The work reported here is in the following areas: (1) genetic network modeling; (2) an autocatalytic model of pre-biotic evolution; (3) theoretical and computational studies of strongly correlated electron systems; (4) reducing thermal oscillations in the atomic force microscope; (5) the transcription termination mechanism in prokaryotic cells; and (6) the low glutamine usage in thermophiles obtained by studying completely sequenced genomes. We discuss the main accomplishments of these publications.

  15. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
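As an illustration of the five digital-object classes listed above, here is a minimal local staging layout before upload; the directory names and metadata fields are our own convention for the sketch, not HydroShare's resource schema:

```python
# Stage the five digital-object classes of a modeling study into one
# self-describing folder, ready to upload as a single resource.
import json
from pathlib import Path

resource = Path("hydro_model_resource")
layout = {
    "1_raw_data":           ["station_obs_raw.csv"],
    "2_processing_scripts": ["clean_forcing.py"],
    "3_model_inputs":       ["forcing.nc", "settings.txt"],
    "4_model_results":      ["streamflow_sim.nc"],
    "5_model_code":         ["model_version.txt", "environment.yml"],
}
for folder in layout:
    (resource / folder).mkdir(parents=True, exist_ok=True)
    # copy the listed files into place here (omitted in this sketch)

metadata = {
    "title": "Example watershed model run",
    "model": "SUMMA",                      # example value
    "temporal_coverage": "2010-2015",
    "license": "CC-BY-4.0",
}
(resource / "metadata.json").write_text(json.dumps(metadata, indent=2))
```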

  16. Reproducible analyses of microbial food for advanced life support systems

    Science.gov (United States)

    Petersen, Gene R.

    1988-01-01

The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses were required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  17. Paleomagnetic analysis of curved thrust belts reproduced by physical models

    Science.gov (United States)

    Costa, Elisabetta; Speranza, Fabio

    2003-12-01

This paper presents a new methodology for studying the evolution of curved mountain belts by means of paleomagnetic analyses performed on analogue models. Eleven models were designed to reproduce various tectonic settings in thin-skinned tectonics. Our models analyze in particular those features reported in the literature as possible causes of peculiar rotational patterns in the outermost as well as the more internal fronts. In all the models the sedimentary cover was reproduced by frictional low-cohesion materials (sand and glass micro-beads), which detached either on frictional or on viscous layers; the latter were reproduced in the models by silicone. The sand forming the models had previously been mixed with magnetite-dominated powder. Before deformation, the models were magnetized by means of two permanent magnets generating within each model a quasi-linear magnetic field of intensity variable between 20 and 100 mT. After deformation, the models were cut into closely spaced vertical sections and sampled by means of 1×1-cm Plexiglas cylinders at several locations along the curved fronts. Care was taken to collect paleomagnetic samples only within virtually undeformed thrust sheets, avoiding zones affected by pervasive shear. Afterwards, the natural remanent magnetization of these samples was measured, and alternating-field demagnetization was used to isolate the principal components. The isolated characteristic components of magnetization were used to estimate the vertical-axis rotations occurring during model deformation. We find that indenters pushing into deforming belts from behind form non-rotational curved outer fronts. The more internal fronts show oroclinal-type rotations of a smaller magnitude than that expected for a perfect orocline. Lateral symmetrical obstacles in the foreland colliding with forward-propagating belts produce non-rotational outer curved fronts as well, whereas in between and inside the obstacles a perfect orocline forms.

  18. Hydrological Modeling Reproducibility Through Data Management and Adaptors for Model Interoperability

    Science.gov (United States)

    Turner, M. A.

    2015-12-01

Because of a lack of centralized planning and no widely-adopted standards among hydrological modeling research groups, research communities, and the data management teams meant to support research, there is chaos when it comes to data formats, spatio-temporal resolutions, ontologies, and data availability. All this makes true scientific reproducibility and collaborative integrated modeling impossible without some glue to piece it all together. Our Virtual Watershed Integrated Modeling System provides the tools and modeling framework hydrologists need to accelerate and fortify new scientific investigations by tracking provenance and providing adaptors for integrated, collaborative hydrologic modeling and data management. As global warming places water resources under increasing stress, reproducible hydrological modeling will be increasingly important for improving the transparency and understanding of the scientific facts revealed through modeling. The Virtual Watershed Data Engine is capable of ingesting a wide variety of heterogeneous model inputs, outputs, model configurations, and metadata. We will demonstrate one example, starting from real-time raw weather station data packaged with station metadata. Our integrated modeling system will then create gridded input data via geostatistical methods, along with error and uncertainty estimates. These gridded data are then used as input to hydrological models, all of which are available as web services wherever feasible. Models may be integrated in a data-centric way where the outputs too are tracked and used as inputs to "downstream" models. This work is part of an ongoing collaborative Tri-state (New Mexico, Nevada, Idaho) NSF EPSCoR Project, WC-WAVE, comprised of researchers from multiple universities in each of the three states. The tools produced and presented here have been developed collaboratively alongside watershed scientists to address specific modeling problems with an eye on the bigger picture of

  19. Can global chemistry-climate models reproduce air quality extremes?

    Science.gov (United States)

    Schnell, J.; Prather, M. J.; Holmes, C. D.

    2013-12-01

We identify and characterize extreme ozone pollution episodes over the USA and EU through a novel analysis of ten years (2000-2010) of surface ozone measurements. An optimal interpolation scheme is developed to create grid-cell averaged values of surface ozone that can be compared with gridded model simulations; it also allows a comparison of two non-coincident observational networks in the EU. The scheme incorporates techniques borrowed from inverse distance weighting and kriging, and uses all representative observational site data while still recognizing the heterogeneity of surface ozone. Individual grid-cell level events are identified as an exceedance of a historical percentile threshold (the 10 worst days in a year, i.e., the 97.3rd percentile). A clustering algorithm is then used to construct the ozone episodes from the individual events. We then test the skill of the high-resolution (100 km) two-year (2005-2006) hindcast from the UCI global chemistry transport model in reproducing the events/episodes identified in the observations, using the same identification criteria. Although the UCI CTM has substantial biases in surface ozone, we find that it has considerable skill in reproducing both individual grid-cell level extreme events and their connectedness in space and time, with an overall skill of 24% (32%) for the US (EU). The grid-cell level extreme ozone events in both the observations and the UCI CTM are found to occur mostly (~75%) in coherent, multi-day, connected episodes covering areas greater than 1000 km × 1000 km. In addition, the UCI CTM has greater skill in reproducing these larger episodes. We conclude that even at relatively coarse resolution, global chemistry-climate models can be used to project major synoptic pollution episodes driven by large-scale climate and chemistry changes, even with their known biases.
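The event and episode identification step can be sketched as follows (synthetic data; scipy's connected-component labelling stands in for the paper's clustering algorithm, and the interpolation step is omitted):

```python
# Flag grid-cell days above the cell's 97.3rd percentile, then link
# flagged cells that touch in space or time into episodes.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
ozone = rng.gamma(4.0, 10.0, size=(365, 40, 60))  # (day, lat, lon), synthetic

threshold = np.percentile(ozone, 97.3, axis=0)    # per-cell: ~10 worst days/yr
events = ozone > threshold

episodes, n = ndimage.label(events)               # 3-D space-time connectivity
sizes = ndimage.sum(events, episodes, np.arange(1, n + 1))
print(f"{n} episodes; largest spans {int(sizes.max())} cell-days")
```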

  20. From alginate impressions to digital virtual models: accuracy and reproducibility.

    Science.gov (United States)

    Dalstra, Michel; Melsen, Birte

    2009-03-01

Objective: To compare the accuracy and reproducibility of measurements performed on digital virtual models with those taken on plaster casts from models poured immediately after the impression was taken (the 'gold standard') and from plaster models poured following a 3-5 day shipping procedure of the alginate impression. Design: Direct comparison of two measuring techniques. Setting: The study was conducted at the Department of Orthodontics, School of Dentistry, University of Aarhus, Denmark in 2006/2007. Participants: Twelve randomly selected orthodontic graduate students, with informed consent. Methods: Three sets of alginate impressions were taken from the participants within 1 hour. Plaster models were poured immediately from two of the sets, while the third set was kept in transit in the mail for 3-5 days; upon return, a plaster model was poured as well. Finally, digital models were made from the plaster models. A number of measurements were performed on the plaster casts with a digital calliper and on the corresponding digital models using the virtual measuring tool of the accompanying software, and these measurements were then compared statistically. Results: No statistical differences were found between the three sets of plaster models. The intra- and inter-observer variability were smaller for the measurements performed on the digital models. Conclusions: Sending alginate impressions by mail does not affect the quality and accuracy of plaster casts poured from them afterwards. Virtual measurements performed on digital models display less variability than the corresponding measurements performed with a calliper on the actual models.

  1. A reproducible oral microcosm biofilm model for testing dental materials.

    Science.gov (United States)

    Rudney, J D; Chen, R; Lenton, P; Li, J; Li, Y; Jones, R S; Reilly, C; Fok, A S; Aparicio, C

    2012-12-01

Most studies of biofilm effects on dental materials use single-species biofilms, or consortia. Microcosm biofilms grown directly from saliva or plaque are much more diverse, but difficult to characterize. We used the Human Oral Microbial Identification Microarray (HOMIM) to validate a reproducible oral microcosm model. Saliva and dental plaque were collected from adults and children. Hydroxyapatite and dental composite discs were inoculated with either saliva or plaque, and microcosm biofilms were grown in a CDC biofilm reactor. In later experiments, the reactor was pulsed with sucrose. DNA from inoculums and microcosms was analysed by HOMIM for 272 species. Microcosms included about 60% of species from the original inoculum. Biofilms grown on hydroxyapatite and composites were extremely similar. Sucrose pulsing decreased diversity and pH, but increased the abundance of Streptococcus and Veillonella. Biofilms from the same donor, grown at different times, clustered together. This model produced reproducible microcosm biofilms that were representative of the oral microbiota. Sucrose induced changes associated with dental caries. This is the first use of HOMIM to validate an oral microcosm model that can be used to study the effects of complex biofilms on dental materials.

  2. The Web system of visualization and analysis equipped with reproducibility

    International Nuclear Information System (INIS)

    Ueshima, Yutaka; Saito, Kanji; Takeda, Yasuhiro; Nakai, Youichi; Hayashi, Sachiko

    2005-01-01

In advanced photon experimental research, a real-time visualization and steering system is considered a desirable method of data analysis. This approach, however, is valid only for fixed, one-off analyses or for easily reproducible experiments. In research on unknown problems, such as advanced photon experiments, the observation data must be analyzable many times over, because a profitable analysis is rarely achieved at the first attempt. Consequently, output data should be filed so that they can be referred to and re-analyzed at any time. To support such research, automatic functions are needed for transporting data files from the data generator to data storage, analyzing data, tracking the history of data handling, and so on. The supporting system will be an integrated database system with several functional servers distributed on the network.

  3. Measurement System Analyses - Gauge Repeatability and Reproducibility Methods

    Science.gov (United States)

    Cepova, Lenka; Kovacikova, Andrea; Cep, Robert; Klaput, Pavel; Mizera, Ondrej

    2018-02-01

The submitted article focuses on a detailed explanation of the average and range method (the Automotive Industry Action Group's Measurement System Analysis approach) and of the honest Gauge Repeatability and Reproducibility method (the Evaluating the Measurement Process approach). The measured data (thickness of plastic parts) were evaluated by both methods and their results were compared on the basis of numerical evaluation. Both methods were additionally compared and their advantages and disadvantages were discussed. One difference between the methods is the calculation of variation components: the AIAG method calculates the variation components based on standard deviation (so the sum of the variation components does not give 100 %), whereas the honest GRR study calculates the variation components based on variance, where the sum of all variation components (part-to-part variation, EV and AV) gives the total variation of 100 %. Acceptance of both methods in the professional community, their future use, and their acceptance by the manufacturing industry are also discussed. Nowadays, the AIAG method is the leading one in industry.
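The arithmetic behind that difference is easy to demonstrate; the component values below are invented for illustration:

```python
# Variance-based components (honest GRR) sum to 100 %; the same
# components expressed as fractions of the total standard deviation
# (AIAG average-and-range style) do not.
ev, av, pv = 0.8, 0.6, 2.0   # repeatability, reproducibility, part-to-part SDs

total_var = ev**2 + av**2 + pv**2
var_pct = [100 * c**2 / total_var for c in (ev, av, pv)]
print("variance %:", [round(p, 1) for p in var_pct],
      "sum:", round(sum(var_pct), 1))      # sum: 100.0

total_sd = total_var ** 0.5
sd_pct = [100 * c / total_sd for c in (ev, av, pv)]
print("SD %:", [round(p, 1) for p in sd_pct],
      "sum:", round(sum(sd_pct), 1))       # sum: ~152.1 -- not 100
```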

  4. Can a coupled meteorology–chemistry model reproduce the ...

    Science.gov (United States)

    The ability of a coupled meteorology–chemistry model, i.e., Weather Research and Forecast and Community Multiscale Air Quality (WRF-CMAQ), to reproduce the historical trend in aerosol optical depth (AOD) and clear-sky shortwave radiation (SWR) over the Northern Hemisphere has been evaluated through a comparison of 21-year simulated results with observation-derived records from 1990 to 2010. Six satellite-retrieved AOD products including AVHRR, TOMS, SeaWiFS, MISR, MODIS-Terra and MODIS-Aqua as well as long-term historical records from 11 AERONET sites were used for the comparison of AOD trends. Clear-sky SWR products derived by CERES at both the top of atmosphere (TOA) and surface as well as surface SWR data derived from seven SURFRAD sites were used for the comparison of trends in SWR. The model successfully captured increasing AOD trends along with the corresponding increased TOA SWR (upwelling) and decreased surface SWR (downwelling) in both eastern China and the northern Pacific. The model also captured declining AOD trends along with the corresponding decreased TOA SWR (upwelling) and increased surface SWR (downwelling) in the eastern US, Europe and the northern Atlantic for the period of 2000–2010. However, the model underestimated the AOD over regions with substantial natural dust aerosol contributions, such as the Sahara Desert, Arabian Desert, central Atlantic and northern Indian Ocean. Estimates of the aerosol direct radiative effect (DRE) at TOA a

  5. The substorm cycle as reproduced by global MHD models

    Science.gov (United States)

Gordeev, E.; Sergeev, V.; Tsyganenko, N.; Kuznetsova, M.; Rastätter, L.; Raeder, J.; Tóth, G.; Lyon, J.; Merkin, V.; Wiltberger, M.

    2017-01-01

Recently, Gordeev et al. (2015) suggested a method to test global MHD models against statistical empirical data. They showed that four community-available global MHD models supported by the Community Coordinated Modeling Center (CCMC) produce a reasonable agreement with reality for those key parameters (the magnetospheric size, magnetic field, and pressure) that are directly related to the large-scale equilibria in the outer magnetosphere. Based on the same set of simulation runs, here we investigate how the models reproduce the global loading-unloading cycle. We found that, in terms of global magnetic flux transport, the three examined CCMC models display systematically different responses to an idealized 2 h north then 2 h south interplanetary magnetic field (IMF) Bz variation. The LFM model shows a depressed return convection and high loading rate during the growth phase as well as enhanced return convection and high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. Two other models exhibit drastically different behavior. In the BATS-R-US model the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the Open GGCM a weak plasma sheet convection has comparable intensities during both the growth phase and the following slow unloading phase. We also demonstrate a potential technical problem in the publicly available simulations, related to postprocessing interpolation, which could affect the accuracy of magnetic field tracing and of other related procedures.

  6. Voxel-level reproducibility assessment of modality independent elastography in a pre-clinical murine model

    Science.gov (United States)

    Flint, Katelyn M.; Weis, Jared A.; Yankeelov, Thomas E.; Miga, Michael I.

    2015-03-01

    Changes in tissue mechanical properties, measured non-invasively by elastography methods, have been shown to be an important diagnostic tool, particularly for cancer. Tissue elasticity information, tracked over the course of therapy, may be an important prognostic indicator of tumor response to treatment. While many elastography techniques exist, this work reports on the use of a novel form of elastography that uses image texture to reconstruct elastic property distributions in tissue (i.e., a modality independent elastography (MIE) method) within the context of a pre-clinical breast cancer system.1,2 The elasticity results have previously shown good correlation with independent mechanical testing.1 Furthermore, MIE has been successfully utilized to localize and characterize lesions in both phantom experiments and simulation experiments with clinical data.2,3 However, the reproducibility of this method has not been characterized in previous work. The goal of this study is to evaluate voxel-level reproducibility of MIE in a pre-clinical model of breast cancer. Bland-Altman analysis of co-registered repeat MIE scans in this preliminary study showed a reproducibility index of 24.7% (scaled to a percent of maximum stiffness) at the voxel level. As opposed to many reports in the magnetic resonance elastography (MRE) literature that speak to reproducibility measures of the bulk organ, these results establish MIE reproducibility at the voxel level; i.e., the reproducibility of locally-defined mechanical property measurements throughout the tumor volume.
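A voxel-level Bland-Altman index of this kind can be sketched as follows; the scaling to percent of maximum stiffness is our reading of the metric, and the data are synthetic:

```python
# Limits of agreement between two co-registered repeat stiffness maps,
# expressed as a percent of the maximum reconstructed stiffness.
import numpy as np

rng = np.random.default_rng(3)
scan1 = rng.random((32, 32, 16)) * 10.0              # first MIE map (a.u.)
scan2 = scan1 + rng.normal(0.0, 0.5, scan1.shape)    # repeat scan

diff = scan1 - scan2
half_loa = 1.96 * diff.std()        # half-width of 95 % limits of agreement
index = 100 * half_loa / max(scan1.max(), scan2.max())
print(f"voxel-level reproducibility index ~ {index:.1f} % of max stiffness")
```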

  7. A reproducible brain tumour model established from human glioblastoma biopsies

    Directory of Open Access Journals (Sweden)

    Li Xingang

    2009-12-01

Background: Establishing clinically relevant animal models of glioblastoma multiforme (GBM) remains a challenge, and many commonly used cell line-based models do not recapitulate the invasive growth patterns of patient GBMs. Previously, we have reported the formation of highly invasive tumour xenografts in nude rats from human GBMs. However, implementing tumour models based on primary tissue requires that these models can be sufficiently standardised, with consistently high take rates. Methods: In this work, we collected data on growth kinetics from a material of 29 biopsies xenografted in nude rats, and characterised this model with an emphasis on neuropathological and radiological features. Results: The tumour take rate for xenografted GBM biopsies was 96% and remained close to 100% at subsequent passages in vivo, whereas only one of four lower-grade tumours engrafted. The average time from transplantation to the onset of symptoms was 125 days ± 11.5 SEM. Histologically, the primary xenografts recapitulated the invasive features of the parent tumours, while endothelial cell proliferations and necrosis were mostly absent. After 4-5 in vivo passages, the tumours became more vascular with necrotic areas, but also appeared more circumscribed. MRI typically revealed changes related to tumour growth several months prior to the onset of symptoms. Conclusions: In vivo passaging of patient GBM biopsies produced tumours representative of the patient tumours, with high take rates and a reproducible disease course. The model provides combinations of angiogenic and invasive phenotypes and represents a good alternative to in vitro propagated cell lines for dissecting mechanisms of brain tumour progression.

  8. Development of a Consistent and Reproducible Porcine Scald Burn Model

    Science.gov (United States)

    Kempf, Margit; Kimble, Roy; Cuttle, Leila

    2016-01-01

There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes, and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and laser Doppler imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10-minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5-second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves the consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153

  9. Establishment of reproducible osteosarcoma rat model using orthotopic implantation technique.

    Science.gov (United States)

    Yu, Zhe; Sun, Honghui; Fan, Qingyu; Long, Hua; Yang, Tongtao; Ma, Bao'an

    2009-05-01

This orthotopic osteosarcoma model was shown to be feasible: the take rate was high, surgical mortality was negligible and the procedure was simple to perform and easily reproduced. It may be a useful tool in the investigation of antiangiogenic and anticancer therapeutics. Ultrasound was found to be a highly accurate tool for tumor diagnosis, localization and measurement and may be recommended for monitoring tumor growth in this model.

  10. Reproducing Phenomenology of Peroxidation Kinetics via Model Optimization

    Science.gov (United States)

    Ruslanov, Anatole D.; Bashylau, Anton V.

    2010-06-01

    We studied mathematical modeling of lipid peroxidation using a biochemical model system of iron (II)-ascorbate-dependent lipid peroxidation of rat hepatocyte mitochondrial fractions. We found that antioxidants extracted from plants demonstrate a high intensity of peroxidation inhibition. We simplified the system of differential equations that describes the kinetics of the mathematical model to a first order equation, which can be solved analytically. Moreover, we endeavor to algorithmically and heuristically recreate the processes and construct an environment that closely resembles the corresponding natural system. Our results demonstrate that it is possible to theoretically predict both the kinetics of oxidation and the intensity of inhibition without resorting to analytical and biochemical research, which is important for cost-effective discovery and development of medical agents with antioxidant action from the medicinal plants.
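As a generic illustration of such a reduction (ours, not the authors' actual system), a peroxidation substrate C obeying a linear rate law integrates in closed form:

```latex
\[
  \frac{dC}{dt} = -k\,C, \qquad C(0) = C_0
  \quad\Longrightarrow\quad
  C(t) = C_0\, e^{-kt}
\]
% An inhibitor entering through the effective rate constant,
% k \to k_{\mathrm{eff}} = k/(1 + \alpha I), lowers the oxidation rate
% while preserving the closed-form exponential structure of the solution.
```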

  11. Reproducible and expedient rice regeneration system using in vitro ...

    African Journals Online (AJOL)

An inevitable prerequisite for expedient regeneration in rice is the selection of a totipotent explant and the development of an apposite combination of growth hormones. Here, we report a reproducible regeneration protocol in which basal segments of the stem of in vitro grown rice plants were used as the explant. Using the protocol ...

  12. Reproducing Sea-Ice Deformation Distributions With Viscous-Plastic Sea-Ice Models

    Science.gov (United States)

    Bouchat, A.; Tremblay, B.

    2016-02-01

High resolution sea-ice dynamic models offer the potential to discriminate between sea-ice rheologies based on their ability to reproduce the satellite-derived deformation fields. Recent studies have shown that sea-ice viscous-plastic (VP) models do not reproduce the observed statistical properties of the strain rate distributions of the RADARSAT Geophysical Processor System (RGPS) deformation fields [1][2]. We use the elliptical VP rheology and we compute the probability density functions (PDFs) for simulated strain rate invariants (divergence and maximum shear stress) and compare against the deformations obtained with the 3-day gridded products from RGPS. We find that the large shear deformations are well reproduced by the elliptical VP model and the deformations do not follow a Gaussian distribution as reported in Girard et al. [1][2]. On the other hand, we do find an overestimation of the shear in the range of mid-magnitude deformations in all of our VP simulations tested with different spatial resolutions and numerical parameters. Runs with no internal stress (free-drift) or with constant viscosity coefficients (Newtonian fluid) also show this overestimation. We trace back this discrepancy to the elliptical yield curve aspect ratio (e = 2) having too little shear strength, hence not resisting enough the inherent shear in the wind forcing associated with synoptic weather systems. Experiments where we simply increase the shear resistance of the ice by modifying the ellipse ratio confirm the need for a rheology with an increased shear strength. [1] Girard et al. (2009), Evaluation of high-resolution sea ice models [...], Journal of Geophysical Research, 114. [2] Girard et al. (2011), A new modeling framework for sea-ice mechanics [...], Annals of Glaciology, 57, 123-132.
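The quantities being compared can be sketched directly; the gridded velocities, grid spacing, and indexing convention below are placeholders for the RGPS or model fields:

```python
# Strain-rate invariants from a gridded ice-velocity field, then a PDF.
import numpy as np

rng = np.random.default_rng(2)
u = rng.random((100, 100))       # ice velocity components, indexed (y, x);
v = rng.random((100, 100))       # stand-ins for RGPS or model output
dx = 12.5e3                      # grid spacing, m (RGPS-like)

dudy, dudx = np.gradient(u, dx)
dvdy, dvdx = np.gradient(v, dx)

divergence = dudx + dvdy
# maximum shear invariant (conventions differ by a factor of 1/2)
max_shear = np.sqrt((dudx - dvdy) ** 2 + (dudy + dvdx) ** 2)

pdf, edges = np.histogram(max_shear.ravel(), bins=50, density=True)
# comparing log-log PDFs like this against the RGPS ones is the model test
```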

  13. PCM magnetic tape system efficiently records and reproduces data

    Science.gov (United States)

    Cole, P. T.

    1965-01-01

    Split-phase PCM technique consists of data and clock signal recording and reproduction systems. This PCM magnetic tape system achieves a high packing density on the tape and provides a symmetrical reproduction of the recorded signal.

  14. Using a 1-D model to reproduce diurnal SST signals

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob L.

    2014-01-01

… of measurement. A generally preferred approach to bridge the gap between in situ and remotely obtained measurements is through modelling of the upper ocean temperature. This ESA-supported study focuses on the implementation of the 1-dimensional General Ocean Turbulence Model (GOTM), in order to resolve … an additional parametrisation for the total outgoing long-wave radiation and a 9-band parametrisation for the light extinction. New parametrisations for the stability functions, associated with vertical mixing, have been included. GOTM is tested using experimental data from the Woods Hole Oceanographic …

  15. Reproducible Infection Model for Clostridium perfringens in Broiler Chickens

    DEFF Research Database (Denmark)

    Pedersen, Karl; Friis-Holm, Lotte Bjerrum; Heuer, Ole Eske

    2008-01-01

    Experiments were carried out to establish an infection and disease model for Clostridium perfringens in broiler chickens. Previous experiments had failed to induce disease and only a transient colonization with challenge strains had been obtained. In the present study, two series of experiments...

  16. How well can DFT reproduce key interactions in Ziegler-Natta systems?

    KAUST Repository

    Correa, Andrea

    2013-08-08

The performance of density functional theory in reproducing some of the main interactions occurring in MgCl2-supported Ziegler-Natta catalytic systems is assessed. Eight model systems, representative of key interactions occurring in Ziegler-Natta catalysts, are selected. Fifteen density functionals are tested in combination with two different basis sets, namely TZVP and cc-pVTZ. As a general result, we found that the best performance is achieved by the PBEh1PBE hybrid generalized gradient approximation (GGA) functional, although the cheaper PBEh GGA functional also gives rather good results. The failure of the popular B3LYP and BP86 functionals is noticeable.

COMBINE archive and OMEX format: One file to share all information to reproduce a modeling project

    NARCIS (Netherlands)

    Bergmann, Frank T.; Olivier, Brett G.; Soiland-Reyes, Stian

    2014-01-01

    Background: With the ever increasing use of computational models in the biosciences, the need to share models and reproduce the results of published studies efficiently and easily is becoming more important. To this end, various standards have been proposed that can be used to describe models,
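A COMBINE archive is a zip file whose manifest.xml declares every entry and its format. Here is a minimal sketch in Python (the file names are examples; the format URIs follow the COMBINE specification identifiers):

```python
# Build a minimal .omex archive: manifest.xml plus the files it declares.
import zipfile
from pathlib import Path

# Placeholder content so the sketch runs end to end; a real archive
# would package an actual SBML model and SED-ML simulation description.
Path("model.xml").write_text("<sbml/>")
Path("simulation.sedml").write_text("<sedML/>")

manifest = """<?xml version="1.0" encoding="UTF-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="." format="http://identifiers.org/combine.specifications/omex"/>
  <content location="./model.xml"
           format="http://identifiers.org/combine.specifications/sbml"/>
  <content location="./simulation.sedml"
           format="http://identifiers.org/combine.specifications/sed-ml"/>
</omexManifest>
"""

with zipfile.ZipFile("study.omex", "w") as omex:
    omex.writestr("manifest.xml", manifest)
    omex.write("model.xml")
    omex.write("simulation.sedml")
```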

  18. Reproducibility analysis of measurements with a mechanical semiautomatic eye model for evaluation of intraocular lenses

    Science.gov (United States)

    Rank, Elisabet; Traxler, Lukas; Bayer, Natascha; Reutterer, Bernd; Lux, Kirsten; Drauschke, Andreas

    2014-03-01

Mechanical eye models are used to validate ex vivo the optical quality of intraocular lenses (IOLs). The quality measurement and test instructions for IOLs are defined in ISO 11979-2. However, it has been mentioned in the literature that these test instructions could lead to inaccurate measurements for some modern IOL designs. Reproducibility of the alignment and measurement processes is presented, performed with a semiautomatic mechanical ex vivo eye model based on the optical properties published by Liou and Brennan at the scale 1:1. The cornea, the iris aperture and the IOL itself are separately changeable within the eye model. The adjustment of the IOL can be manipulated by automatic decentration and tilt of the IOL in reference to the optical axis of the whole system, which is defined by the line connecting the central point of the artificial cornea and the iris aperture. With the presented measurement setup two quality criteria are measurable: the modulation transfer function (MTF) and the Strehl ratio. First, the reproducibility of the alignment process for defining the initial conditions of lateral position and tilt in reference to the optical axis of the system is investigated. Furthermore, different IOL holders are tested with regard to stable holding of the IOL. The measurement is performed by a before-after comparison of the lens position using a typical decentration and tilt tolerance analysis path. The MTF and Strehl ratio before and after this tolerance analysis are compared, and requirements for lens-holder construction are deduced from the presented results.

  19. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research.

    Science.gov (United States)

    Harris, Jenine K; B Wondmeneh, Sarah; Zhao, Yiqiang; Leider, Jonathon P

    2018-02-23

    Research replication, or repeating a study de novo, is the scientific standard for building evidence and identifying spurious results. While replication is ideal, it is often expensive and time consuming. Reproducibility, or reanalysis of data to verify published findings, is one proposed minimum alternative standard. While a lack of research reproducibility has been identified as a serious and prevalent problem in biomedical research and a few other fields, little work has been done to examine the reproducibility of public health research. We examined reproducibility in 6 studies from the public health services and systems research subfield of public health research. Following the methods described in each of the 6 papers, we computed the descriptive and inferential statistics for each study. We compared our results with the original study results and examined the percentage differences in descriptive statistics and differences in effect size, significance, and precision of inferential statistics. All project work was completed in 2017. We found consistency between original and reproduced results for each paper in at least 1 of the 4 areas examined. However, we also found some inconsistency. We identified incorrect transcription of results and omitting detail about data management and analyses as the primary contributors to the inconsistencies. Increasing reproducibility, or reanalysis of data to verify published results, can improve the quality of science. Researchers, journals, employers, and funders can all play a role in improving the reproducibility of science through several strategies including publishing data and statistical code, using guidelines to write clear and complete methods sections, conducting reproducibility reviews, and incentivizing reproducible science.
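    The reanalysis workflow described above reduces, in its simplest form, to recomputing each published statistic and flagging discrepancies. A minimal Python sketch of that comparison; the statistic names, values and the 5% flagging threshold are hypothetical, not taken from the study.

```python
# Illustrative check of reproduced descriptive statistics against published
# values, in the spirit of the reanalysis described above. Field names,
# values and threshold are hypothetical.
def percent_difference(original, reproduced):
    """Percentage difference relative to the originally published value."""
    return 100.0 * (reproduced - original) / original

published = {"mean_staff": 42.1, "mean_budget": 1.73}   # values from a paper
recomputed = {"mean_staff": 42.3, "mean_budget": 1.58}  # our reanalysis

for stat, orig in published.items():
    diff = percent_difference(orig, recomputed[stat])
    flag = "consistent" if abs(diff) < 5.0 else "inconsistent"
    print(f"{stat}: {diff:+.1f}% -> {flag}")
```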

  20. Reproducibility assessment of commercial optically stimulated luminescence system in diagnostic X-ray beams

    International Nuclear Information System (INIS)

    Yahaya Musa; Bradley, D.A.; Sunway University, Selangor; Karim, M.K.A.; Asmaliza Hashim

    2017-01-01

    This study investigates the reproducibility of a commercial optically stimulated luminescence (OSL) system at different X-ray beams in general radiography. The reader stability was evaluated first and found to be within the accepted tolerance level of ± 10%. The reproducibility of OSL dosimeter readings due to repeated readouts after a single exposure was found to be between 1.0% (80 kV/20 mGy) and 5.9% (80 kV/10 mAs), while reproducibility in repeated irradiation of optically stimulated luminescence dosimeters (OSLDs) from the same batch (80 kV/8 mGy) was between 1.8 and 4.3%. After multiple readouts of an OSLD, the OSL signal decreased by approximately 0.4% per readout, or 5.4% per 10 sequential readouts. The reproducibility response demonstrates the suitability of the nanoDots OSLD for use in radiography. (author)

  1. Stochastic model of financial markets reproducing scaling and memory in volatility return intervals

    Science.gov (United States)

    Gontis, V.; Havlin, S.; Kononovicius, A.; Podobnik, B.; Stanley, H. E.

    2016-11-01

    We investigate the volatility return intervals in the NYSE and FOREX markets. We explain previous empirical findings using a model based on the interacting-agent hypothesis instead of the widely used efficient market hypothesis. We derive macroscopic equations based on the microscopic herding interactions of agents and find that they are able to reproduce various stylized facts of different markets and different assets with the same set of model parameters. We show that the power-law properties and the scaling of return intervals and other financial variables have a similar origin and could be a result of a general class of non-linear stochastic differential equations derived from a master equation of an agent system that is coupled by herding interactions. Specifically, we find that this approach enables us to recover the volatility return interval statistics as well as volatility probability and spectral densities for the NYSE and FOREX markets, for different assets, and for different time scales. We also find that the historical S&P500 monthly series exhibits the same volatility return interval properties recovered by our proposed model. Our statistical results suggest that human herding is so strong that it persists even when other evolving fluctuations perturb the financial system.
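    For readers unfamiliar with the class of equations referred to above, one representative nonlinear SDE studied in this line of work is sketched below. The form is recalled from the related literature on herding-driven 1/f noise; the exact drift and parameters used in this paper may differ.

```latex
% A representative member of the class of nonlinear SDEs mentioned above
% (form recalled from related work; consult the paper for the exact
% herding-derived drift):
\begin{equation}
  \mathrm{d}x = \left(\eta - \frac{\lambda}{2}\right) x^{2\eta - 1}\,\mathrm{d}t
              + x^{\eta}\,\mathrm{d}W_t ,
\end{equation}
% whose stationary density follows a power law, p(x) \propto x^{-\lambda},
% and whose spectral density scales as S(f) \propto 1/f^{\beta} with
% \beta = 1 + (\lambda - 3)/(2\eta - 2).
```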

  2. Assessment of a climate model to reproduce rainfall variability and extremes over Southern Africa

    Science.gov (United States)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2010-01-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The sub-continent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of the ability of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infrared Rainfall Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1° longitude/latitude. This paper concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of present-day rainfall variability over southern Africa and is not intended to discuss possible future changes in climate, as these have been documented elsewhere. Simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. Secondly, the ability of the model to reproduce daily rainfall extremes is assessed, again by a comparison with

  3. Reliability and reproducibility analysis of the AOSpine thoracolumbar spine injury classification system by Chinese spinal surgeons.

    Science.gov (United States)

    Cheng, Jie; Liu, Peng; Sun, Dong; Qin, Tingzheng; Ma, Zikun; Liu, Jingpei

    2017-05-01

    The objective of this study was to analyze the interobserver reliability and intraobserver reproducibility of the new AOSpine thoracolumbar spine injury classification system among young Chinese orthopedic surgeons with different levels of experience in spinal trauma. Previous reports suggest that the new AOSpine thoracolumbar spine injury classification system demonstrates acceptable interobserver reliability and intraobserver reproducibility. However, there are few studies in Asia, especially in China. The AOSpine thoracolumbar spine injury classification system was applied to 109 patients with acute, traumatic thoracolumbar spinal injuries by two groups of spinal surgeons with different levels of clinical experience. The Kappa coefficient was used to determine interobserver reliability and intraobserver reproducibility. The overall Kappa coefficient for all cases was 0.362, which represents fair reliability. The Kappa statistic was 0.385 for A-type injuries and 0.292 for B-type injuries, which represents fair reliability, and 0.552 for C-type injuries, which represents moderate reliability. The Kappa coefficient for intraobserver reproducibility was 0.442 for A-type injuries, 0.485 for B-type injuries, and 0.412 for C-type injuries. These values represent moderate reproducibility for all injury types. The raters in Group A provided significantly better interobserver reliability than those in Group B (P < 0.05). There were no between-group differences in intraobserver reproducibility. This study suggests that the new AOSpine injury classification system may be applied in day-to-day clinical practice in China following extensive training of healthcare providers. Further prospective studies in different healthcare providers and clinical settings are essential for validation of this classification system and to assess its utility.
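    The Kappa coefficients reported above measure chance-corrected agreement between raters. A minimal sketch of the computation in Python, using scikit-learn and made-up AOSpine type assignments (A/B/C) for two raters:

```python
# Minimal sketch of the agreement statistic used above: Cohen's kappa for
# two raters assigning AOSpine injury types. Ratings are invented examples.
from sklearn.metrics import cohen_kappa_score

rater1 = ["A", "A", "B", "C", "A", "B", "C", "A"]
rater2 = ["A", "B", "B", "C", "A", "A", "C", "A"]

kappa = cohen_kappa_score(rater1, rater2)
print(f"interobserver kappa = {kappa:.3f}")  # 0.600 here: moderate agreement
```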

  4. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948

  5. Reproducibility - an important factor determining the quality of computer aided detection (CAD) systems

    Energy Technology Data Exchange (ETDEWEB)

    Malich, Ansgar E-mail: ansgarmalich@gmx.de; Azhari, Tarek; Boehm, Thomas; Fleck, Marlies; Kaiser, Werner A

    2000-12-01

    Purpose: To test the reproducibility of markings on mammography films set by a commercial computer aided detection (CAD) system. Patients and methods: One hundred unilateral mammography examinations (each in CC and MLO) of 100 patients with mammographically detected suspicious foci, which were histopathologically proven to be malignant, were retrospectively scanned three times with the CAD system. Every fifth patient of the institutional tumor case sampler was enrolled in the study. Only cases with one visible lesion were included. Reproducibility and sensitivity (in both the strict and the broader sense) were determined. Strict sensitivity means a correct marker set in both images, whereas broader sensitivity means a correct marker set in at least one of the images. Sixteen of 100 malignancies were indicated by focal suspicious microcalcification clusters, 53 tumors by masses and 31 cases by both signs of breast cancer. The CAD system distinguishes only two different markers: one for microcalcifications and one for masses. Thus, 47 (16+31) tumor-induced microcalcifications and 84 (53+31) malignancy-related masses were checked using the CAD system. Results: Eighteen of 100 unilateral mammography examinations revealed identical patterns in all three scans (18% reproducibility). Eleven of 47 suspicious focal microcalcification clusters and 43/84 masses were correctly marked on both mammographic views in all three CAD scans (strict and broader sensitivity, 23.4 and 51.1%, respectively). Six of 47 microcalcification clusters and 8/84 masses were totally missed in all images by the system (false negative rate, 12.8 and 9.6%, respectively). Conclusion: Reproducibility is essential for CAD systems. Currently, the reproducibility of the tested CAD system appears insufficient for clinical routine. Improvement of the system characteristics would make such systems valuable as a 'second reader' in clinical examination.
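    The strict/broader distinction above is easy to make precise in code. A small sketch with invented per-lesion marking records; "strict" requires a correct CAD mark on both the CC and MLO views, "broader" on at least one:

```python
# Sketch of the two sensitivity definitions used above. The lesion records
# are illustrative, not study data.
lesions = [
    {"cc_marked": True,  "mlo_marked": True},
    {"cc_marked": True,  "mlo_marked": False},
    {"cc_marked": False, "mlo_marked": False},
    {"cc_marked": True,  "mlo_marked": True},
]

strict = sum(l["cc_marked"] and l["mlo_marked"] for l in lesions) / len(lesions)
broader = sum(l["cc_marked"] or l["mlo_marked"] for l in lesions) / len(lesions)
print(f"strict sensitivity:  {strict:.1%}")   # 50.0%
print(f"broader sensitivity: {broader:.1%}")  # 75.0%
```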

  6. A stable and reproducible human blood-brain barrier model derived from hematopoietic stem cells.

    Directory of Open Access Journals (Sweden)

    Romeo Cecchelli

    Full Text Available The human blood brain barrier (BBB) is a selective barrier formed by human brain endothelial cells (hBECs), which is important to ensure adequate neuronal function and protect the central nervous system (CNS) from disease. The development of human in vitro BBB models is thus of utmost importance for drug discovery programs related to CNS diseases. Here, we describe a method to generate a human BBB model using cord blood-derived hematopoietic stem cells. The cells were initially differentiated into ECs followed by the induction of BBB properties by co-culture with pericytes. The brain-like endothelial cells (BLECs) express tight junctions and transporters typically observed in brain endothelium and maintain expression of most in vivo BBB properties for at least 20 days. The model is very reproducible since it can be generated from stem cells isolated from different donors and in different laboratories, and could be used to predict CNS distribution of compounds in humans. Finally, we provide evidence that the Wnt/β-catenin signaling pathway mediates in part the BBB inductive properties of pericytes.

  7. Investigation of dimensional variation in parts manufactured by fused deposition modeling using Gauge Repeatability and Reproducibility

    Science.gov (United States)

    Mohamed, Omar Ahmed; Hasan Masood, Syed; Lal Bhowmik, Jahar

    2018-02-01

    In the additive manufacturing (AM) market, the question is raised by industry and AM users of how reproducible and repeatable the fused deposition modeling (FDM) process is in providing good dimensional accuracy. This paper aims to investigate and evaluate the repeatability and reproducibility of the FDM process through a systematic approach to answer this frequently asked question. A case study based on the statistical gage repeatability and reproducibility (gage R&R) technique is proposed to investigate the dimensional variations in the printed parts of the FDM process. After running the simulation and analyzing the data, the FDM process capability is evaluated, which helps industry to better understand the performance of FDM technology.
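    For readers unfamiliar with gage R&R: a crossed study decomposes measurement variance into repeatability (equipment) and reproducibility (operator) components via ANOVA expected mean squares. Below is a simplified, main-effects-only Python sketch on synthetic data; the paper's actual design (e.g. interaction terms, factor levels) may differ.

```python
# Simplified gage R&R decomposition on a crossed parts x operators design,
# main effects only (no interaction term). All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_parts, n_ops, n_reps = 5, 3, 3
part_fx = rng.normal(0, 0.10, n_parts)[:, None, None]   # part-to-part effect
op_fx = rng.normal(0, 0.02, n_ops)[None, :, None]       # operator effect
# data[i, j, k]: measurement of part i by operator j, replicate k (in mm)
data = 25.0 + part_fx + op_fx + rng.normal(0, 0.03, (n_parts, n_ops, n_reps))

grand = data.mean()
ms_error = data.var(axis=2, ddof=1).mean()               # within-cell: repeatability
op_means = data.mean(axis=(0, 2))
ms_op = n_parts * n_reps * ((op_means - grand) ** 2).sum() / (n_ops - 1)

var_repeat = ms_error
var_reprod = max(0.0, (ms_op - ms_error) / (n_parts * n_reps))
print(f"repeatability var:   {var_repeat:.5f}")
print(f"reproducibility var: {var_reprod:.5f}")
print(f"total gage R&R sd:   {np.sqrt(var_repeat + var_reprod):.4f} mm")
```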

  8. Using the mouse to model human disease: increasing validity and reproducibility

    Directory of Open Access Journals (Sweden)

    Monica J. Justice

    2016-02-01

    Full Text Available Experiments that use the mouse as a model for disease have recently come under scrutiny because of the repeated failure of data, particularly derived from preclinical studies, to be replicated or translated to humans. The usefulness of mouse models has been questioned because of irreproducibility and poor recapitulation of human conditions. Newer studies, however, point to bias in reporting results and improper data analysis as key factors that limit reproducibility and validity of preclinical mouse research. Inaccurate and incomplete descriptions of experimental conditions also contribute. Here, we provide guidance on best practice in mouse experimentation, focusing on appropriate selection and validation of the model, sources of variation and their influence on phenotypic outcomes, minimum requirements for control sets, and the importance of rigorous statistics. Our goal is to raise the standards in mouse disease modeling to enhance reproducibility, reliability and clinical translation of findings.

  9. Rainfall variability and extremes over southern Africa: Assessment of a climate model to reproduce daily extremes

    Science.gov (United States)

    Williams, C. J. R.; Kniveton, D. R.; Layberry, R.

    2009-04-01

    It is increasingly accepted that any possible climate change will not only have an influence on mean climate but may also significantly alter climatic variability. A change in the distribution and magnitude of extreme rainfall events (associated with changing variability), such as droughts or flooding, may have a far greater impact on human and natural systems than a changing mean. This issue is of particular importance for environmentally vulnerable regions such as southern Africa. The subcontinent is considered especially vulnerable to and ill-equipped (in terms of adaptation) for extreme events, due to a number of factors including extensive poverty, famine, disease and political instability. Rainfall variability and the identification of rainfall extremes is a function of scale, so high spatial and temporal resolution data are preferred to identify extreme events and accurately predict future variability. The majority of previous climate model verification studies have compared model output with observational data at monthly timescales. In this research, the assessment of the ability of a state-of-the-art climate model to simulate climate at daily timescales is carried out using satellite-derived rainfall data from the Microwave Infra-Red Algorithm (MIRA). This dataset covers the period from 1993 to 2002 and the whole of southern Africa at a spatial resolution of 0.1 degree longitude/latitude. The ability of a climate model to simulate current climate provides some indication of how much confidence can be placed in its future predictions. In this paper, simulations of current climate from the UK Meteorological Office Hadley Centre's climate model, in both regional and global mode, are firstly compared to the MIRA dataset at daily timescales. This concentrates primarily on the ability of the model to simulate the spatial and temporal patterns of rainfall variability over southern Africa. Secondly, the ability of the model to reproduce daily rainfall extremes will

  10. Reproducibility of the coil positioning in Nb3Sn magnet models through magnetic measurements

    CERN Document Server

    Borgnolutti, F; Ferracin, P; Kashikhin, V V; Sabbi, G; Velev, G; Todesco, E; Zlobin, A V

    2009-01-01

    The random part of the integral field harmonics in a series of superconducting magnets has been used in the past to identify the reproducibility of the coil positioning. Using a magnetic model and a Monte Carlo approach, coil blocks are randomly moved and the amplitude that best fits the magnetic measurements is interpreted as the reproducibility of the coil position. Previous values for r.m.s. coil displacements for Nb-Ti magnets range from 0.05 to 0.01 mm. In this paper, we use this approach to estimate the reproducibility of the coil position for Nb3Sn short models that have been built in the framework of the FNAL core program (HFDA dipoles) and of the LARP program (TQ quadrupoles). Our analysis shows that the Nb3Sn models manufactured in the past years correspond to r.m.s. coil displacements of at least 5 times what is found for the series production of a mature Nb-Ti technology. On the other hand, the variability of the field harmonics along the magnet axis shows that Nb3Sn magnets have already reached va...

  11. Reproducibility of the sella turcica landmark in three dimensions using a sella turcica-specific reference system

    Energy Technology Data Exchange (ETDEWEB)

    Pittayapat, Pisha; Jacobs, Reinhilde [University Hospitals Leuven, University of Leuven, Leuven (Belgium); Odri, Guillaume A. [Service de Chirurgie Orthopedique et Traumatologique, Centre Hospitalier Regional d' Orleans, Orleans Cedex2 (France); De Faria Vasconcelos, Karla [Dept. of Oral Diagnosis, Division of Oral Radiology, Piracicaba Dental School, University of Campinas, Sao Paulo (Brazil); Willems, Guy [Dept. of Oral Health Sciences, Orthodontics, KU Leuven and Dentistry, University Hospitals Leuven, University of Leuven, Leuven (Belgium); Olszewski, Raphael [Dept. of Oral and Maxillofacial Surgery, Cliniques Universitaires Saint Luc, Universite Catholique de Louvain, Brussels (Belgium)

    2015-03-15

    This study was performed to assess the reproducibility of identifying the sella turcica landmark in a three-dimensional (3D) model by using a new sella-specific landmark reference system. Thirty-two cone-beam computed tomographic scans (3D Accuitomo 170, J. Morita, Kyoto, Japan) were retrospectively collected. The 3D data were exported into the Digital Imaging and Communications in Medicine standard and then imported into the Maxilim software (Medicim NV, Sint-Niklaas, Belgium) to create 3D surface models. Five observers identified four osseous landmarks in order to create the reference frame and then identified two sella landmarks. The x, y, and z coordinates of each landmark were exported. The observations were repeated after four weeks. Statistical analysis was performed using the multiple paired t-test with Bonferroni correction (intraobserver precision: p<0.005, interobserver precision: p<0.0011). The intraobserver mean precision of all landmarks was <1 mm. Significant differences were found when comparing the intraobserver precision of each observer (p<0.005). For the sella landmarks, the intraobserver mean precision ranged from 0.43±0.34 mm to 0.51±0.46 mm. The intraobserver reproducibility was generally good. The overall interobserver mean precision was <1 mm. Significant differences between each pair of observers for all anatomical landmarks were found (p<0.0011). The interobserver reproducibility of sella landmarks was good, with >50% precision in locating the landmark within 1 mm. A newly developed reference system offers high precision and reproducibility for sella turcica identification in a 3D model without being based on two-dimensional images derived from 3D data.
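    The precision figures above follow from simple vector geometry: repeated (x, y, z) identifications of a landmark are compared to their centroid. An illustrative Python sketch with invented coordinates:

```python
# Sketch of the precision computation described above: repeated landmark
# identifications as (x, y, z) points, precision as the distance of each
# observation from the centroid of the repeats. Coordinates are invented.
import numpy as np

repeats = np.array([  # five identifications of one sella landmark, in mm
    [12.1, 8.3, 4.0],
    [12.4, 8.1, 4.2],
    [11.9, 8.4, 3.9],
    [12.2, 8.2, 4.1],
    [12.0, 8.3, 4.0],
])
centroid = repeats.mean(axis=0)
distances = np.linalg.norm(repeats - centroid, axis=1)
print(f"mean precision: {distances.mean():.2f} +/- {distances.std(ddof=1):.2f} mm")
```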

  12. Cellular automaton model in the fundamental diagram approach reproducing the synchronized outflow of wide moving jams

    International Nuclear Information System (INIS)

    Tian, Jun-fang; Yuan, Zhen-zhou; Jia, Bin; Fan, Hong-qiang; Wang, Tao

    2012-01-01

    Velocity effect and critical velocity are incorporated into the average space gap cellular automaton model [J.F. Tian, et al., Phys. A 391 (2012) 3129], which was able to reproduce many spatiotemporal dynamics reported by the three-phase theory except the synchronized outflow of wide moving jams. The physics of traffic breakdown has been explained. Various congested patterns induced by the on-ramp are reproduced. It is shown that the occurrence of synchronized outflow and of free outflow of wide moving jams is closely related to drivers' time delay in acceleration at the downstream jam front and to the critical velocity, respectively. -- Highlights: ► Velocity effect is added into the average space gap cellular automaton model. ► The physics of traffic breakdown has been explained. ► The probabilistic nature of traffic breakdown is simulated. ► Various congested patterns induced by the on-ramp are reproduced. ► The occurrence of synchronized outflow of jams depends on drivers' time delay.

  13. NRFixer: Sentiment Based Model for Predicting the Fixability of Non-Reproducible Bugs

    Directory of Open Access Journals (Sweden)

    Anjali Goyal

    2017-08-01

    Full Text Available Software maintenance is an essential step in the software development life cycle. Nowadays, software companies spend approximately 45% of total cost on maintenance activities. Large software projects maintain bug repositories to collect, organize and resolve bug reports. Sometimes it is difficult to reproduce the reported bug with the information present in a bug report, and such a bug is marked with the resolution non-reproducible (NR). When NR bugs are reconsidered, a few of them might get fixed (NR-to-fix), leaving the others with the same resolution (NR). To analyse the behaviour of developers towards NR-to-fix and NR bugs, a sentiment analysis of NR bug report textual contents has been conducted. The sentiment analysis of bug reports shows that NR bugs' sentiments incline towards more negativity than those of reproducible bugs. Also, there is a noticeable opinion drift in the sentiments of NR-to-fix bug reports. Observations drawn from this analysis were an inspiration to develop a model that can judge the fixability of NR bugs. Thus a framework, NRFixer, which predicts the probability of NR bug fixation, is proposed. NRFixer was evaluated along two dimensions. The first dimension considers meta-fields of bug reports (model-1) and the other additionally incorporates the sentiments of developers (model-2) for prediction. Both models were compared using various machine learning classifiers (Zero-R, naive Bayes, J48, random tree and random forest). The bug reports of the Firefox and Eclipse projects were used to test NRFixer. In the Firefox and Eclipse projects, the J48 and naive Bayes classifiers achieve the best prediction accuracy, respectively. It was observed that the inclusion of sentiments in the prediction model yields a rise in prediction accuracy ranging from 2 to 5% for various classifiers.
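    A minimal sketch of the model-2 idea: bug-report meta-fields plus a sentiment score feeding a naive Bayes classifier. Feature names and data are hypothetical and this is not NRFixer's actual pipeline:

```python
# Illustrative fixability classifier in the spirit of model-2: meta-fields
# plus a sentiment score as features. All features and labels are invented.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

# columns: n_comments, n_cc, has_attachment, sentiment in [-1, 1]
X = np.array([
    [12, 3, 1, -0.6],
    [ 2, 0, 0, -0.9],
    [25, 7, 1,  0.1],
    [ 5, 1, 0, -0.3],
    [18, 4, 1,  0.4],
    [ 3, 0, 0, -0.8],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = NR bug later fixed, 0 = stayed NR

clf = GaussianNB()
print(cross_val_score(clf, X, y, cv=3).mean())  # toy accuracy estimate
```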

  14. Reproducibility of a Noninvasive System for Eye Positioning and Monitoring in Stereotactic Radiotherapy of Ocular Melanoma.

    Science.gov (United States)

    Iskanderani, Omar; Béliveau-Nadeau, Dominique; Doucet, Robert; Coulombe, Geneviève; Pascale, Deborah; Roberge, David

    2017-06-01

    Our preferred treatment for juxtapapillary choroidal melanoma is stereotactic radiotherapy. We aim to describe our immobilization system and quantify its reproducibility. Patients were identified in our radiosurgery database. Patients were imaged at the computed tomography simulator with an in-house system that allows visual monitoring of the eye as the patient fixates on a small target. All patients were reimaged at least once prior to and/or during radiotherapy. The patients were treated on the CyberKnife system, 60 Gy in 10 daily fractions, using skull tracking in conjunction with our visual monitoring system. In order to quantify the reproducibility of the eye immobilization system, computed tomography scans were coregistered using rigid 6-dimensional skull registration. Using the coregistered scans, x, y, and z displacements of the lens/optic nerve insertion were measured. From these displacements, 3-dimensional vectors were calculated. Thirty-four patients were treated from October 2010 to September 2015. Thirty-nine coregistrations were performed using 73 scans (2-3 scans per patient). The mean displacements of the lens and optic nerve insertion were 0.1 and 0.0 mm. The median 3-dimensional displacements (absolute value) of the lens and nerve insertion were 0.8 and 0.7 mm (standard deviation: 0.5 and 0.6 mm). Ninety-eight percent of 3-dimensional displacements were below 2 mm (maximum 2.4 mm). The calculated planning target volume (PTV) margins were 0.8, 1.4, and 1.5 mm in the anterior-posterior, craniocaudal, and right-left axes, respectively. Following this analysis, no further changes have been applied to our planning margin of 2 to 2.5 mm, as it is also meant to account for uncertainties in magnetic resonance imaging to computed tomography registration, skull tracking, and contouring variability. We have found our stereotactic eye immobilization system to be highly reproducible (<1 mm) and free of systematic error.
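    The displacement analysis above combines per-axis shifts into 3-dimensional vectors and summarizes them against the 2 mm criterion. A short Python sketch on invented shift data:

```python
# Sketch of the displacement analysis described above: per-axis shifts of
# the lens between coregistered scans, combined into 3-D vectors, plus the
# share of vectors under the 2 mm criterion. Numbers are illustrative only.
import numpy as np

# rows: (dx, dy, dz) of the lens between two coregistered CT scans, in mm
shifts = np.array([
    [ 0.3, -0.2,  0.4],
    [-0.1,  0.5, -0.3],
    [ 0.6,  0.2,  0.1],
    [-0.4, -0.6,  0.5],
])
vec3d = np.linalg.norm(shifts, axis=1)
print(f"median 3-D displacement: {np.median(vec3d):.1f} mm")
print(f"fraction below 2 mm:     {(vec3d < 2.0).mean():.0%}")
```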

  15. Evaluation of fecal mRNA reproducibility via a marginal transformed mixture modeling approach

    Directory of Open Access Journals (Sweden)

    Davidson Laurie A

    2010-01-01

    Full Text Available Abstract Background Developing and evaluating new technology that enables researchers to recover gene-expression levels of colonic cells from fecal samples could be key to a non-invasive screening tool for early detection of colon cancer. The current study, to the best of our knowledge, is the first to investigate and report the reproducibility of fecal microarray data. Using the intraclass correlation coefficient (ICC) as a measure of reproducibility and the preliminary analysis of fecal and mucosal data, we assessed the reliability of mixture density estimation and the reproducibility of fecal microarray data. Using Monte Carlo-based methods, we explored whether ICC values should be modeled as a beta-mixture or transformed first and fitted with a normal-mixture. We used outcomes from bootstrapped goodness-of-fit tests to determine which approach is less sensitive toward potential violation of distributional assumptions. Results The graphical examination of both the distributions of ICC and probit-transformed ICC (PT-ICC) clearly shows that there are two components in the distributions. For ICC measurements, which are between 0 and 1, the practice in the literature has been to assume that the data points are from a beta-mixture distribution. Nevertheless, in our study we show that the use of a normal-mixture modeling approach on PT-ICC could provide superior performance. Conclusions When modeling ICC values of gene expression levels, using a mixture of normals in the probit-transformed (PT) scale is less sensitive toward model mis-specification than using a mixture of betas. We show that a biased conclusion could be made if we follow the traditional approach and model the two sets of ICC values using the mixture of betas directly. The problematic estimation arises from the sensitivity of beta-mixtures toward model mis-specification, particularly when there are observations in the neighborhood of the boundary points, 0 or 1. Since beta-mixture modeling
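    The modeling choice discussed above, probit-transforming ICC values from (0, 1) onto the real line and fitting a normal mixture rather than a beta mixture, can be sketched as follows; the simulated ICCs stand in for real fecal-array estimates:

```python
# Probit-transform ICC values, then fit a two-component normal mixture,
# as discussed above. The simulated ICC values are placeholders.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
icc = np.clip(np.concatenate([rng.beta(2, 8, 300),    # poorly reproduced genes
                              rng.beta(8, 2, 200)]),  # well reproduced genes
              1e-6, 1 - 1e-6)                         # keep the probit finite

pt_icc = norm.ppf(icc)                                # probit transform
gm = GaussianMixture(n_components=2, random_state=0).fit(pt_icc.reshape(-1, 1))
print("component means (probit scale):", gm.means_.ravel())
print("mixing weights:", gm.weights_)
```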

  16. Reproducing the optical properties of fine desert dust aerosols using ensembles of simple model particles

    International Nuclear Information System (INIS)

    Kahnert, Michael

    2004-01-01

    Single scattering optical properties are calculated for a proxy of fine dust aerosols at a wavelength of 0.55 μm. Spherical and spheroidal model particles are employed to fit the aerosol optical properties and to retrieve information about the physical parameters characterising the aerosols. It is found that spherical particles are capable of reproducing the scalar optical properties and the forward peak of the phase function of the dust aerosols. The effective size parameter of the aerosol ensemble is retrieved with high accuracy by using spherical model particles. Significant improvements are achieved by using spheroidal model particles. The aerosol phase function and the other diagonal elements of the Stokes scattering matrix can be fitted with high accuracy, whereas the off-diagonal elements are poorly reproduced. More elongated prolate and more flattened oblate spheroids contribute disproportionately strongly to the optimised shape distribution of the model particles and appear to be particularly useful for achieving a good fit of the scattering matrix. However, the clear discrepancies between the shape distribution of the aerosols and the shape distribution of the spheroidal model particles suggest that the possibilities of extracting shape information from optical observations are rather limited

  17. Reproducibility of "The Bethesda System for Reporting Thyroid Cytopathology:" A Retrospective Analysis of 107 Patients.

    Science.gov (United States)

    Awasthi, Pragati; Goel, Garima; Khurana, Ujjawal; Joshi, Deepti; Majumdar, Kaushik; Kapoor, Neelkamal

    2018-01-01

    Fine-needle aspiration cytology (FNAC) has emerged as an indispensable tool to discriminate thyroid lesions into benign or malignant for appropriate management. The need for simplicity of communication and standardization of terminology for thyroid FNAC reporting led to the introduction of "The Bethesda System for Reporting Thyroid Cytopathology" (TBSRTC) at a conference held at the National Cancer Institute in 2007. This study aims at establishing the reproducibility of TBSRTC for diagnosing thyroid lesions. The present study comprised thyroid FNAC from 107 patients collected retrospectively over a period of 1.5 years (June 2013 to December 2014), which were reviewed by two trained cytopathologists and re-categorized according to TBSRTC. The interobserver variation and reproducibility of the reporting system were statistically assessed using Cohen's kappa. The cytopathologists were in agreement in 98 out of 107 cases (91.5%). Maximum concordance was noted in the benign category (91 of 96 cases; 92.85%), followed by 2 cases each in the nondiagnostic/unsatisfactory (ND/US) and follicular neoplasm/suspicious for follicular neoplasm (FN/SFN) categories (2.04% each) and 1 case each in the atypia of undetermined significance/follicular lesion of undetermined significance (AUS/FLUS), suspicious for malignancy (SUS), and malignant categories (1.02% each). The highest diagnostic disagreement was noted between the ND/US and benign categories and between the benign and FN/SFN categories. The utilization of TBSRTC for reporting thyroid cytology should be promoted in our country because it provides a homogeneous, standardized, and unanimous terminology for cytological diagnosis of thyroid lesions. The present study could substantiate the diagnostic reproducibility of this system.

  18. A novel highly reproducible and lethal nonhuman primate model for orthopox virus infection.

    Directory of Open Access Journals (Sweden)

    Marit Kramski

    Full Text Available The intentional re-introduction of Variola virus (VARV), the agent of smallpox, into the human population is of great concern due to its bio-terroristic potential. Moreover, zoonotic infections with Cowpox virus (CPXV) and Monkeypox virus (MPXV) cause severe diseases in humans. Smallpox vaccines presently available can have severe adverse effects that are no longer acceptable. The efficacy and safety of new vaccines and antiviral drugs for use in humans can only be demonstrated in animal models. The existing nonhuman primate models, using VARV and MPXV, need very high viral doses that have to be applied intravenously or intratracheally to induce a lethal infection in macaques. To overcome these drawbacks, the infectivity and pathogenicity of a particular CPXV was evaluated in the common marmoset (Callithrix jacchus). A CPXV named calpox virus was isolated from a lethal orthopox virus (OPV) outbreak in New World monkeys. We demonstrated that marmosets infected with calpox virus, not only via the intravenous but also the intranasal route, reproducibly develop symptoms resembling smallpox in humans. Infected animals died within 1-3 days after onset of symptoms, even when very low infectious viral doses of 5x10(2) pfu were applied intranasally. Infectious virus was demonstrated in blood, saliva and all organs analyzed. We present the first characterization of a new OPV infection model inducing a disease in common marmosets comparable to smallpox in humans. Intranasal virus inoculation mimicking the natural route of smallpox infection led to reproducible infection. In vivo titration resulted in an MID(50) (minimal monkey infectious dose 50%) of 8.3x10(2) pfu of calpox virus, which is approximately 10,000-fold lower than the MPXV and VARV doses applied in the macaque models. Therefore, the calpox virus/marmoset model is a suitable nonhuman primate model for the validation of vaccines and antiviral drugs. Furthermore, this model can help study mechanisms of OPV pathogenesis.

  19. Tackling the Reproducibility Problem in Systems Research with Declarative Experiment Specifications

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Ivo [Univ. of California, Santa Cruz, CA (United States); Maltzahn, Carlos [Univ. of California, Santa Cruz, CA (United States); Lofstead, Jay [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Moody, Adam [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Arpaci-Dusseau, Remzi [Univ. of Wisconsin, Madison, WI (United States); Arpaci-Dusseau, Andrea [Univ. of Wisconsin, Madison, WI (United States)

    2015-05-04

    Validating experimental results in the field of computer systems is a challenging task, mainly due to the many changes in software and hardware that computational environments go through. Determining if an experiment is reproducible entails two separate tasks: re-executing the experiment and validating the results. Existing reproducibility efforts have focused on the former, envisioning techniques and infrastructures that make it easier to re-execute an experiment. In this work we focus on the latter by analyzing the validation workflow that an experiment re-executioner goes through. We notice that validating results is done on the basis of experiment design and high-level goals, rather than exact quantitative metrics. Based on this insight, we introduce a declarative format for specifying the high-level components of an experiment as well as describing generic, testable conditions that serve as the basis for validation. We present a use case in the area of storage systems to illustrate the usefulness of this approach. We also discuss limitations and potential benefits of using this approach in other areas of experimental systems research.

  20. Reproducing tailing in breakthrough curves: Are statistical models equally representative and predictive?

    Science.gov (United States)

    Pedretti, Daniele; Bianchi, Marco

    2018-03-01

    Breakthrough curves (BTCs) observed during tracer tests in highly heterogeneous aquifers display strong tailing. Power laws are popular models both for the empirical fitting of these curves and for the prediction of transport using upscaling models based on best-fitted estimated parameters (e.g. the power law slope or exponent). The predictive capacity of power law based upscaling models can however be questioned due to the difficulty of linking model parameters with the aquifers' physical properties. This work analyzes two aspects that can limit the use of power laws as effective predictive tools: (a) the implication of statistical subsampling, which often renders power laws indistinguishable from other heavily tailed distributions, such as the logarithmic (LOG); (b) the difficulty of reconciling fitting parameters obtained from models with different formulations, such as the presence of a late-time cutoff in the power law model. Two rigorous and systematic stochastic analyses, one based on benchmark distributions and the other on BTCs obtained from transport simulations, are considered. It is found that a power law model without cutoff (PL) results in best-fitted exponents (αPL) falling in the range of typical experimental values reported in the literature (1.5 < αPL < 4), while a power law model with cutoff (PLCO) results in a nearly constant αCO ≈ 1. In the PLCO model, the cutoff rate (λ) is the parameter that fully reproduces the persistence of the tailing and is shown to be inversely correlated to the LOG scale parameter (i.e. with the skewness of the distribution). The theoretical results are consistent with the fitting analysis of a tracer test performed during the MADE-5 experiment. It is shown that a simple mechanistic upscaling model based on the PLCO formulation is able to predict the ensemble of BTCs from the stochastic transport simulations without the need of any fitted parameters. The model embeds the constant αCO = 1 and relies on a stratified description of the transport mechanisms to estimate λ. The PL fails to
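    The two competing tail models above can be fitted directly by nonlinear least squares: a pure power law C(t) ~ t^(-alpha) and a power law with an exponential late-time cutoff C(t) ~ t^(-alpha) exp(-lambda t). A Python sketch on synthetic tail data; the paper's actual fitting procedure may differ:

```python
# Fit a pure power law (PL) and a power law with exponential cutoff (PLCO)
# to a synthetic late-time breakthrough-curve tail.
import numpy as np
from scipy.optimize import curve_fit

def pl(t, a, alpha):
    return a * t ** (-alpha)

def plco(t, a, alpha, lam):
    return a * t ** (-alpha) * np.exp(-lam * t)

t = np.linspace(10, 500, 60)
rng = np.random.default_rng(2)
c = plco(t, 5.0, 1.0, 0.004) * rng.lognormal(0, 0.05, t.size)  # "observed" tail

(p_pl, _), (p_plco, _) = curve_fit(pl, t, c), curve_fit(plco, t, c, p0=[5, 1, 0.01])
print(f"PL:   alpha = {p_pl[1]:.2f}")
print(f"PLCO: alpha = {p_plco[1]:.2f}, cutoff rate = {p_plco[2]:.4f}")
```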

  1. [Reproducibility and repeatability of the determination of occlusal plane on digital dental models].

    Science.gov (United States)

    Qin, Yi-fei; Xu, Tian-min

    2015-06-18

    To assess the repeatability (intraobserver comparison) and reproducibility (interobserver comparison) of two different methods for establishing the occlusal plane on digital dental models. With Angle's classification as a stratification factor, 48 cases were randomly extracted from 806 cases with complete clinical data that received orthodontic treatment from July 2004 to August 2008 in the Department of Orthodontics, Peking University School and Hospital of Stomatology. Post-treatment plaster casts of the 48 cases were scanned by a Roland LPX-1200 3D laser scanner to generate geometry data as research subjects. In a locally developed software package, one observer repeated the localization of the prescribed landmarks on each digital model 5 times, at intervals of at least one week, to establish a group of functional occlusal planes and a group of anatomic occlusal planes, while 6 observers established two other groups of functional and anatomic occlusal planes independently. Standard deviations of the dihedral angles of each group on each model were calculated and compared between the related groups. The models with the five largest standard deviations in each group were studied to explore possible factors that might influence the identification of the landmarks on the digital models. No significant difference in intraobserver variability was detected between the functional occlusal plane and the anatomic occlusal plane (P>0.1), while a significant difference in interobserver variability was detected (P<0.05): the interobserver variability of the functional occlusal plane was 0.2° smaller than that of the anatomic occlusal plane. The functional occlusal plane's intraobserver and interobserver variability did not differ significantly (P>0.1), while the anatomic occlusal plane's intraobserver variability was significantly smaller than its interobserver variability (P<0.05). Both occlusal planes are suitable as a reference plane with equal repeatability. When several observers measure a large number of digital models, the functional occlusal plane is more reproducible than the anatomic occlusal plane.

  2. Reproducibility, reliability and validity of measurements obtained from Cecile3 digital models

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Watanabe-Kanno

    2009-09-01

    Full Text Available The aim of this study was to determine the reproducibility, reliability and validity of measurements in digital models compared to plaster models. Fifteen pairs of plaster models were obtained from orthodontic patients with permanent dentition before treatment. These were digitized to be evaluated with the program Cécile3 v2.554.2 beta. Two examiners measured, three times each, the mesiodistal width of all the teeth present, the intercanine, interpremolar and intermolar distances, and overjet and overbite. The plaster models were measured using a digital vernier caliper. Student's t-test for paired samples and the intraclass correlation coefficient (ICC) were used for statistical analysis. The ICCs of the digital models were 0.84 ± 0.15 (intra-examiner) and 0.80 ± 0.19 (inter-examiner). The average mean difference of the digital models was 0.23 ± 0.14 mm and 0.24 ± 0.11 mm for each examiner, respectively. When the two types of measurements were compared, the values obtained from the digital models were lower than those obtained from the plaster models (p < 0.05), although the differences were considered clinically insignificant (differences < 0.1 mm). The Cécile digital models are a clinically acceptable alternative for use in Orthodontics.

  3. Reproducing the nonlinear dynamic behavior of a structured beam with a generalized continuum model

    Science.gov (United States)

    Vila, J.; Fernández-Sáez, J.; Zaera, R.

    2018-04-01

    In this paper we study the coupled axial-transverse nonlinear vibrations of a kind of one-dimensional structured solid by application of the so-called Inertia Gradient Nonlinear continuum model. To show the accuracy of this axiomatic model, previously proposed by the authors, its predictions are compared with numerical results from a previously defined finite discrete chain of lumped masses and springs, for several numbers of particles. A continualization of the discrete model equations based on Taylor series allowed us to set equivalent values of the mechanical properties in both the discrete and axiomatic continuum models. Contrary to the classical continuum model, the inertia gradient nonlinear continuum model used herein is able to capture scale effects, which arise for modes in which the wavelength is comparable to the characteristic distance of the structured solid. The main conclusion of the work is that the proposed generalized continuum model captures the scale effects in both linear and nonlinear regimes, reproducing the behavior of the 1D nonlinear discrete model adequately.
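    The discrete reference model above, a chain of lumped masses and linear springs, can be reproduced in a few lines. Below is a minimal linear, single-axis Python sketch; the paper's chain also includes the axial-transverse coupling and nonlinear terms omitted here:

```python
# Minimal 1-D chain of lumped masses and linear springs, integrated in time.
# Linear and single-axis only; coupling and nonlinear terms are omitted.
import numpy as np
from scipy.integrate import solve_ivp

n, m, k = 20, 1.0, 100.0            # particles, mass, spring stiffness

def rhs(t, y):
    u, v = y[:n], y[n:]             # displacements and velocities
    f = np.zeros(n)
    f[1:] += k * (u[:-1] - u[1:])   # spring to the left neighbor
    f[:-1] += k * (u[1:] - u[:-1])  # spring to the right neighbor
    f[0] -= k * u[0]                # left end anchored to a wall
    return np.concatenate([v, f / m])

y0 = np.zeros(2 * n)
y0[2 * n - 1] = 1.0                 # initial velocity kick at the free end
sol = solve_ivp(rhs, (0, 5), y0, max_step=1e-3)
print("end-mass displacement at t=5:", sol.y[n - 1, -1])
```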

  4. Assessment of the reliability of reproducing two-dimensional resistivity models using an image processing technique.

    Science.gov (United States)

    Ishola, Kehinde S; Nawawi, Mohd Nm; Abdullah, Khiruddin; Sabri, Ali Idriss Aboubakar; Adiat, Kola Abdulnafiu

    2014-01-01

    This study attempts to combine the results of geophysical images obtained from three commonly used electrode configurations using an image processing technique in order to assess their capabilities to reproduce two-dimensional (2-D) resistivity models. All the inverse resistivity models were processed using the PCI Geomatica software package, commonly used for remote sensing data sets. Preprocessing of the 2-D inverse models was carried out to facilitate further processing and statistical analyses. Four raster layers were created: three of these layers were used for the input images and the fourth layer was used as the output of the combined images. The data sets were merged using a basic statistical approach. Interpreted results show that all images resolved and reconstructed the essential features of the models. An assessment of the accuracy of the images for the four geologic models was performed using four criteria: the mean absolute error and mean percentage absolute error, the resistivity values of the reconstructed blocks, and their displacements from the true models. Generally, the blocks of the images from the maximum approach give the smallest estimated errors. Also, the displacement of the reconstructed blocks from the true blocks is the smallest, and the reconstructed resistivities of the blocks are closer to the true blocks than with any other combination used. Thus, it is corroborated that when inverse resistivity models are combined, more reliable and detailed information about the geologic models is obtained than by using individual data sets.

  5. The substorm loading-unloading cycle as reproduced by community-available global MHD magnetospheric models

    Science.gov (United States)

    Gordeev, Evgeny; Sergeev, Victor; Tsyganenko, Nikolay; Kuznetsova, Maria; Rastaetter, Lutz; Raeder, Joachim; Toth, Gabor; Lyon, John; Merkin, Vyacheslav; Wiltberger, Michael

    2017-04-01

    In this study we investigate how well three community-available global MHD models, supported by the Community Coordinated Modeling Center (NASA CCMC), reproduce the global magnetospheric dynamics, including the loading-unloading substorm cycle. We found that in terms of global magnetic flux transport the CCMC models display systematically different responses to an idealized 2-hour north then 2-hour south IMF Bz variation. The LFM model shows a depressed return convection in the tail plasma sheet and a high rate of magnetic flux loading into the lobes during the growth phase, as well as enhanced return convection and a high unloading rate during the expansion phase, with the amount of loaded/unloaded magnetotail flux and the growth phase duration being the closest to their observed empirical values during isolated substorms. The BATS-R-US and Open GGCM models exhibit drastically different behavior. In the BATS-R-US model the plasma sheet convection shows a smooth transition to the steady convection regime after the IMF southward turning. In the Open GGCM a weak plasma sheet convection has comparable intensities during both the growth phase and the following slow unloading phase. Our study shows that different CCMC models under the same solar wind conditions (north to south IMF variation) produce essentially different solutions in terms of global magnetospheric convection.

  6. Mouse Models of Diet-Induced Nonalcoholic Steatohepatitis Reproduce the Heterogeneity of the Human Disease

    Science.gov (United States)

    Machado, Mariana Verdelho; Michelotti, Gregory Alexander; Xie, Guanhua; de Almeida, Thiago Pereira; Boursier, Jerome; Bohnic, Brittany; Guy, Cynthia D.; Diehl, Anna Mae

    2015-01-01

    Background and aims Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Methods Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. Results The metabolic profile associated with human NASH was better mimicked by Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Conclusion Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH. PMID:26017539

  7. Mouse models of diet-induced nonalcoholic steatohepatitis reproduce the heterogeneity of the human disease.

    Directory of Open Access Journals (Sweden)

    Mariana Verdelho Machado

    Full Text Available Non-alcoholic steatohepatitis (NASH), the potentially progressive form of nonalcoholic fatty liver disease (NAFLD), is the pandemic liver disease of our time. Although there are several animal models of NASH, consensus regarding the optimal model is lacking. We aimed to compare features of NASH in the two most widely-used mouse models: methionine-choline deficient (MCD) diet and Western diet. Mice were fed standard chow, MCD diet for 8 weeks, or Western diet (45% energy from fat, predominantly saturated fat, with 0.2% cholesterol, plus drinking water supplemented with fructose and glucose) for 16 weeks. Liver pathology and metabolic profile were compared. The metabolic profile associated with human NASH was better mimicked by the Western diet. Although hepatic steatosis (i.e., triglyceride accumulation) was also more severe, liver non-esterified fatty acid content was lower than in the MCD diet group. NASH was also less severe and less reproducible in the Western diet model, as evidenced by less liver cell death/apoptosis, inflammation, ductular reaction, and fibrosis. Various mechanisms implicated in human NASH pathogenesis/progression were also less robust in the Western diet model, including oxidative stress, ER stress, autophagy deregulation, and hedgehog pathway activation. Feeding mice a Western diet models metabolic perturbations that are common in humans with mild NASH, whereas administration of a MCD diet better models the pathobiological mechanisms that cause human NAFLD to progress to advanced NASH.

  8. Spatial aspects of sound quality - subjective assessment of sound reproduced by stereo and by multichannel systems

    DEFF Research Database (Denmark)

    Choisel, Sylvain

    The reproduction of sound by stereo and by multichannel systems is affected by many factors which will give rise to various complex sensations in listeners, and thereby influence the perceived overall quality. In this Ph.D. thesis, the perceptual changes associated with different reproduction... on the perceived direction of panned sources. The second part of the thesis addresses the identification of auditory attributes which play a role in the perception of sound reproduced by multichannel systems. Short musical excerpts were presented in mono, stereo and several multichannel formats to evoke various... spatial sensations. Eight of these attributes (width, brightness, spaciousness, elevation, distance, envelopment, naturalness and clarity) were identified and quantified in a series of experiments. Finally, the relation between these specific attributes and overall preference was formulated...

  9. [Reproducibility of Fuhrman nuclear grade: advantages of a two-grade system].

    Science.gov (United States)

    Letourneux, Hervé; Lindner, Véronique; Lang, Hervé; Massfelder, Thierry; Meyer, Nicolas; Saussine, Christian; Jacqmin, Didier

    2006-06-01

    The Fuhrman nuclear grade is the reference histoprognostic grading system routinely used all over the world for renal cell carcinoma. Studies measuring the inter-observer and intra-observer concordance of the Fuhrman grade show poor results in terms of reproducibility and repeatability. These variations are due to a certain degree of subjectivity on the part of the pathologist in applying the definition of tumour grade, particularly nuclear grade. Elements able to account for this subjectivity in renal cell carcinoma are identified from a review of the literature. To improve the reliability of nuclear grade, the territory occupied by the highest grade must be specified and the grades should probably be combined. At the present time, regrouping grade 1 and 2 tumours as low grade and grade 3 and 4 tumours as high grade would achieve better reproducibility, while preserving the prognostic value for overall survival. The development of new treatment modalities and their use in adjuvant situations will require the use of reliable histoprognostic factors to specify indications.

  10. Why are models unable to reproduce multi-decadal trends in lower tropospheric baseline ozone levels?

    Science.gov (United States)

    Hu, L.; Liu, J.; Mickley, L. J.; Strahan, S. E.; Steenrod, S.

    2017-12-01

    Assessments of tropospheric ozone radiative forcing rely on accurate model simulations. Parrish et al. (2014) found that three chemistry-climate models (CCMs) overestimate present-day O3 mixing ratios and capture only 50% of the observed O3 increase over the last five decades at 12 baseline sites in the northern mid-latitudes, indicating large uncertainties in our understanding of the ozone trends and their implications for radiative forcing. Here we present comparisons of outputs from two chemical transport models (CTMs) - GEOS-Chem and the Global Modeling Initiative model - with O3 observations from the same sites and from the global ozonesonde network. Both CTMs are driven by reanalysis meteorological data (MERRA or MERRA2) and are thus expected to differ in atmospheric transport processes from the freely running CCMs. We test whether recent model developments leading to more active ozone chemistry affect the computed ozone sensitivity to perturbations in emissions. Preliminary results suggest these CTMs can reproduce present-day ozone levels but fail to capture the multi-decadal trend since 1980. Both models yield widespread overpredictions of free tropospheric ozone in the 1980s. Sensitivity studies in GEOS-Chem suggest that the model estimate of natural background ozone is too high. We discuss factors that contribute to the variability and trends of tropospheric ozone over the last 30 years, with a focus on intermodel differences in spatial resolution and in the representation of stratospheric chemistry, stratosphere-troposphere exchange, halogen chemistry, and biogenic VOC emissions and chemistry. We also discuss uncertainty in the historical emission inventories used in models, and how these affect the simulated ozone trends.

  11. Reproducibility and reliability of hypoglycaemic episodes recorded with Continuous Glucose Monitoring System (CGMS) in daily life

    DEFF Research Database (Denmark)

    Høi-Hansen, T; Pedersen-Bjergaard, U; Thorsteinsson, B

    2005-01-01

    AIM: Continuous glucose monitoring may reveal episodes of unrecognized hypoglycaemia. We evaluated the reproducibility and reliability of hypoglycaemic episodes recorded in daily life by the Medtronic MiniMed Continuous Glucose Monitoring System (CGMS). METHODS: Twenty-nine adult patients with Type 1 diabetes underwent 6 days of continuous subcutaneous glucose monitoring, applying one CGMS on each side of the abdomen. Blood glucose was measured by HemoCue B-Glucose Analyzers six times daily and two different 4-point calibration sets were generated (set A and B). Using these calibration sets, CGMS raw data sets... CGMS readings were also compared with independent self-monitored blood glucose (SMBG) values. RESULTS: With hypoglycaemia (CGMS readings ≤ 2.2 mmol/l) in calibration set left-A, values below 3.5 mmol/l were present in 99% (95% CI: 95-100%) of samples in left-B, 91% (95% CI: 84-...

  12. How well do CMIP5 Climate Models Reproduce the Hydrologic Cycle of the Colorado River Basin?

    Science.gov (United States)

    Gautam, J.; Mascaro, G.

    2017-12-01

    The Colorado River, which is the primary source of water for nearly 40 million people in the arid Southwestern states of the United States, has been experiencing an extended drought since 2000, which has led to a significant reduction in water supply. As water demands increase, one of the major challenges for water management in the region has been the quantification of uncertainties associated with streamflow predictions in the Colorado River Basin (CRB) under potential changes of future climate. Hence, testing the reliability of model predictions in the CRB is critical in addressing this challenge. In this study, we evaluated the performance of 17 General Circulation Models (GCMs) from the Coupled Model Intercomparison Project Phase 5 (CMIP5) and 4 Regional Climate Models (RCMs) in reproducing the statistical properties of the hydrologic cycle in the CRB. We evaluated the water balance components at four nested sub-basins, along with the inter-annual and intra-annual changes of precipitation (P), evaporation (E), runoff (R) and temperature (T), from 1979 to 2005. Most of the models captured the net water balance fairly well in the most-upstream basin but simulated a weak hydrological cycle in the evaporation component at the downstream locations. The simulated monthly variability of P had different patterns, with correlation coefficients ranging from -0.6 to 0.8 depending on the sub-basin, and with models from the same parent institution clustering together. Apart from the most-upstream sub-basin, where the models were mainly characterized by a negative seasonal bias in SON (of up to -50%), most of them had a positive bias in all seasons (of up to +260%) in the other three sub-basins. The models, however, captured the monthly variability of T well at all sites, with small inter-model variability and a relatively similar range of bias (-7 °C to +5 °C) across all seasons. The Mann-Kendall test was applied to the annual P and T time-series, where the majority of the models …
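
    The Mann-Kendall test named at the end of this record is easy to state in code; the sketch below is a minimal implementation (normal approximation, no tie correction) applied to a synthetic annual precipitation series, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Mann-Kendall trend test; returns Z statistic and two-sided p-value."""
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance assuming no ties
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return z, 2 * (1 - norm.cdf(abs(z)))

    annual_p = np.random.default_rng(4).gamma(2.0, 200.0, size=27)  # 1979-2005, mm
    z, p = mann_kendall(annual_p)
    print(f"Z = {z:.2f}, p = {p:.3f}")
    ```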

  13. Analysis of measuring system parameters that influence reproducibility of morphometric assessments with a graphic tablet.

    Science.gov (United States)

    Fleege, J C; Baak, J P; Smeulders, A W

    1988-05-01

    The morphometric analysis of nuclear characteristics by means of a graphic tablet is, in principle, objective and highly reproducible. However, a recent study found considerable variation in the morphometric assessments, which was in contrast to the findings of others. The way in which measurements were performed differed in these studies. Therefore, measuring system factors that can potentially influence the quantitative results were analyzed systematically. One observer, experienced in microscopic analysis and working with a commercially available graphic tablet, conducted all the measurements, thus excluding interobserver variation. The tracing speed, localization (on the graphic tablet), magnification, pen and cursor usage, shape, and orientation on the graphic tablet were analyzed. A nomogram was developed for cursor application that indicates the relation between "projected" particle size, tracing speed, and required coefficient of variation (CV). When the influence of these factors is taken into account, a measuring system can be tuned optimally. With such a regimen, the CV can be kept below 1.5%. Our results show that in the assessment of morphometric features with the use of a graphic tablet, errors due to the measuring system can be virtually eliminated.
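
    The reproducibility criterion in this record reduces to a coefficient of variation over repeated tracings; a minimal sketch follows (the area values are invented for illustration).

    ```python
    import numpy as np

    areas = np.array([52.1, 51.4, 52.9, 51.8, 52.3])  # um^2, repeated tracings of one nucleus
    cv = 100 * areas.std(ddof=1) / areas.mean()
    print(f"CV = {cv:.2f}%")  # a well-tuned system in the study kept CV below 1.5%
    ```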

  14. Synaptic augmentation in a cortical circuit model reproduces serial dependence in visual working memory.

    Directory of Open Access Journals (Sweden)

    Daniel P Bliss

    Recent work has established that visual working memory is subject to serial dependence: current information in memory blends with that from the recent past as a function of their similarity. This tuned temporal smoothing likely promotes the stability of memory in the face of noise and occlusion. Serial dependence accumulates over several seconds in memory and deteriorates with increased separation between trials. While this phenomenon has been extensively characterized in behavior, its neural mechanism is unknown. In the present study, we investigate the circuit-level origins of serial dependence in a biophysical model of cortex. We explore two distinct kinds of mechanisms: stable persistent activity during the memory delay period and dynamic "activity-silent" synaptic plasticity. We find that networks endowed with both strong reverberation to support persistent activity and dynamic synapses can closely reproduce behavioral serial dependence. Specifically, elevated activity drives synaptic augmentation, which biases activity on the subsequent trial, giving rise to a spatiotemporally tuned shift in the population response. Our hybrid neural model is a theoretical advance beyond abstract mathematical characterizations, offers testable hypotheses for physiological research, and demonstrates the power of biological insights to provide a quantitative explanation of human behavior.
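
    The mechanism proposed in this record, activity-driven synaptic augmentation that decays slowly between trials and biases the next response, can be caricatured in a few lines. The sketch below is not the authors' published spiking model; the tuning width, decay constant and gain are invented for illustration.

    ```python
    import numpy as np

    n = 64
    theta = np.linspace(0.0, 180.0, n, endpoint=False)   # feature space, degrees
    tau_a, gain = 5.0, 0.3                               # assumed decay (s) and bias strength

    def tuning(center, width=20.0):
        d = np.minimum(np.abs(theta - center), 180.0 - np.abs(theta - center))
        return np.exp(-0.5 * (d / width) ** 2)

    def decode(r):
        phi = 2.0 * np.pi * theta / 180.0                # population-vector readout
        ang = np.angle(np.sum(r * np.exp(1j * phi))) % (2.0 * np.pi)
        return 180.0 * ang / (2.0 * np.pi)

    aug = np.zeros(n)
    for stim in (90.0, 80.0):                            # two consecutive trials
        r = tuning(stim) * (1.0 + gain * aug)            # residual augmentation biases the drive
        print(f"stimulus {stim:.0f} deg -> report {decode(r):.1f} deg")
        aug = (aug + r) * np.exp(-5.0 / tau_a)           # build-up, then decay before next trial

    # The second report is attracted toward the first stimulus, the signature
    # of serial dependence described in the abstract.
    ```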

  15. Fast bootstrapping and permutation testing for assessing reproducibility and interpretability of multivariate fMRI decoding models.

    Directory of Open Access Journals (Sweden)

    Bryan R Conroy

    Multivariate decoding models are increasingly being applied to functional magnetic resonance imaging (fMRI) data to interpret the distributed neural activity in the human brain. These models are typically formulated to optimize an objective function that maximizes decoding accuracy. For decoding models trained on full-brain data, this can result in multiple models that yield the same classification accuracy, though some may be more reproducible than others; i.e., small changes to the training set may result in very different voxels being selected. This issue of reproducibility can be partially controlled by regularizing the decoding model. Regularization, along with the cross-validation used to estimate decoding accuracy, typically requires retraining many (often on the order of thousands of) related decoding models. In this paper we describe an approach that uses a combination of bootstrapping and permutation testing to construct both a measure of cross-validated prediction accuracy and a measure of model reproducibility of the learned brain maps. This requires re-training our classification method on many re-sampled versions of the fMRI data. Given the size of fMRI datasets, this is normally a time-consuming process. Our approach leverages an algorithm called fast simultaneous training of generalized linear models (FaSTGLZ) to create a family of classifiers in the space of accuracy vs. reproducibility. The convex hull of this family of classifiers can be used to identify a subset of Pareto-optimal classifiers, with a single optimal classifier selectable based on the relative cost of accuracy vs. reproducibility. We demonstrate our approach using full-brain analysis of elastic-net classifiers trained to discriminate stimulus type in an auditory and visual oddball event-related fMRI design. Our approach and results argue for a computational approach to fMRI decoding models in which the value of the interpretation of the decoding model ultimately depends upon optimizing a …
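
    The accuracy-versus-reproducibility trade-off this record describes can be sketched independently of the FaSTGLZ machinery. In the sketch below, `fit_and_score` is a hypothetical stand-in for retraining a regularized classifier; reproducibility is taken as the mean pairwise correlation of bootstrap weight maps, and a Pareto front is read off the two scores.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def bootstrap_scores(X, y, fit_and_score, n_boot=100):
        """Mean accuracy and map reproducibility over bootstrap resamples."""
        accs, maps = [], []
        for _ in range(n_boot):
            idx = rng.integers(0, len(y), len(y))        # resample with replacement
            acc, weight_map = fit_and_score(X[idx], y[idx])
            accs.append(acc)
            maps.append(weight_map)
        corr = np.corrcoef(np.array(maps))
        rep = corr[np.triu_indices(n_boot, k=1)].mean()  # mean pairwise map correlation
        return np.mean(accs), rep

    def pareto_front(acc, rep):
        """Indices of models not dominated in both accuracy and reproducibility."""
        pts = np.column_stack([acc, rep])
        return [i for i, p in enumerate(pts) if not np.any(np.all(pts > p, axis=1))]

    acc = np.array([0.80, 0.84, 0.78, 0.79])             # per-regularization accuracy
    rep = np.array([0.60, 0.45, 0.70, 0.58])             # per-regularization reproducibility
    print(pareto_front(acc, rep))                        # -> [0, 1, 2]; model 3 is dominated
    ```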

  16. Quality assurance: Fundamental reproducibility tests for 3D treatment‐planning systems

    Science.gov (United States)

    Able, Charles M.; Thomas, Michael D.

    2005-01-01

    The use of image‐based 3D treatment planning has significantly increased the complexity of commercially available treatment‐planning systems (TPSs). Medical physicists have traditionally focused their efforts on understanding the calculation algorithm; this is no longer possible. A quality assurance (QA) program for our 3D treatment‐planning system (ADAC Pinnacle3) is presented. The program is consistent with the American Association of Physicists in Medicine Task Group 53 guidelines and balances the cost‐versus‐benefit equation confronted by the clinical physicist in a community cancer center environment. Fundamental reproducibility tests are presented as required for a community cancer center environment using conventional and 3D treatment planning. A series of nondosimetric tests, including digitizer accuracy, image acquisition and display, and hardcopy output, is presented. Dosimetric tests include verification of monitor units (MUs), standard isodoses, and clinical cases. The tests are outlined for the Pinnacle3 TPS but can be generalized to any TPS currently in use. The program tested accuracy and constancy through several hardware and software upgrades to our TPS. This paper gives valuable guidance and insight to other physicists attempting to approach TPS QA at fundamental and practical levels. PACS numbers: 87.53.Tf, 87.53.Xd PMID:16143788

  17. Measurement of volatilized mercury by a mini-system: a simple, reliable and reproducible technique

    Directory of Open Access Journals (Sweden)

    Luciana Cursino

    2003-12-01

    A simple and easy new technique for determining volatilized mercury in biological systems was developed. The technique is fast and sensitive and can overcome the problems that arise from the extremely low readings obtained during measurements and from poor reproducibility in biological material (bacteria). It directly measures the metallic mercury volatilized by bacteria by means of a chemical adsorbent in a coupled mini-system, as a modification of the technique for mercury-in-air analysis. It is potentially of interest to the bioremediation and bacterial mercury resistance communities.

  18. Cross-species analysis of gene expression in non-model mammals: reproducibility of hybridization on high density oligonucleotide microarrays

    Directory of Open Access Journals (Sweden)

    Pita-Thomas Wolfgang

    2007-04-01

    Background: Gene expression profiles of non-model mammals may provide valuable data for biomedical and evolutionary studies. However, due to the lack of sequence information for other species, DNA microarrays are currently restricted to humans and a few model species. This limitation may be overcome by using arrays developed for a given species to analyse gene expression in a related one, an approach known as "cross-species analysis". In spite of its potential usefulness, the accuracy and reproducibility of the gene expression measures obtained in this way are still open to doubt. The present study examines whether or not hybridization values from cross-species analyses are as reproducible as those from same-species analyses when using Affymetrix oligonucleotide microarrays. Results: The reproducibility of the probe data obtained by hybridizing deer, Old-World primate, and human RNA samples to the Affymetrix human GeneChip® U133 Plus 2.0 was compared. The results show that cross-species hybridization affected neither the distribution of hybridization reproducibility among different categories, nor the reproducibility values of the individual probes. Our analyses also show that 0.5% of the probes analysed in the U133 Plus 2.0 GeneChip are significantly associated with un-reproducible hybridizations. Such probes (called un-reproducible probe sequences in the text) do not increase in number in cross-species analyses. Conclusion: Our study demonstrates that cross-species analyses do not significantly affect the hybridization reproducibility of GeneChips, at least within the range of mammal species analysed here. The differences in reproducibility between same-species and cross-species analyses observed in previous studies were probably caused by the analytical methods used to calculate the gene expression measures. Together with previous observations on the accuracy of GeneChips for cross-species analysis, our analyses demonstrate that cross …

  19. Model for a reproducible curriculum infrastructure to provide international nurse anesthesia continuing education.

    Science.gov (United States)

    Collins, Shawn Bryant

    2011-12-01

    There are no set standards for nurse anesthesia education in developing countries, yet one of the keys to the standards in global professional practice is competency assurance for individuals. Nurse anesthetists in developing countries have difficulty obtaining educational materials. These difficulties include, but are not limited to, financial constraints, lack of anesthesia textbooks, and distance from educational sites. There is increasing evidence that the application of knowledge in developing countries is failing. One reason is that many anesthetists in developing countries are trained for considerably less than acceptable time periods and are often supervised by poorly trained practitioners, who then pass on less-than-desirable practice skills, thus exacerbating difficulties. Sustainability of development can come only through anesthetists who are both well trained and able to pass on their training to others. The international nurse anesthesia continuing education project was developed in response to the difficulty that nurse anesthetists in developing countries face in accessing continuing education. The purpose of this project was to develop a nonprofit, volunteer-based model for providing nurse anesthesia continuing education that can be reproduced and used in any developing country.

  20. The interobserver reproducibility of thyroid cytopathology using Bethesda Reporting System: Analysis of 200 cases

    International Nuclear Information System (INIS)

    Ahmed, S.; Khan, M.A.; Kazi, F.

    2013-01-01

    Objective: To determine the interobserver reproducibility of thyroid cytopathology in cases of thyroid fine needle aspirates. Methods: The retrospective, descriptive study was conducted at the Foundation University Medical College, Islamabad, using cases related to the period between 2009 and 2011. A total of 200 cases of fine-needle aspirations were retrieved from the archives. Three histopathologists independently categorised them into 6 groups according to the Bethesda reporting system guidelines, without looking at previous reports. Kappa statistics were used for analysis of the results on SPSS 17. Results: Of the 200 patients, 194 (97%) were females and 6 (3%) were males. The overall mean age of patients was 46 ± 20 years. The kappa value calculated for observer-1 and observer-2 was 0.735; for observer-1 and observer-3, 0.841; and for observer-2 and observer-3, 0.838, showing substantial interobserver agreement. Histopathological correlation was available for 39 (19.5%) cases. Of these, 5 (13%) were non-diagnostic, 20 (51%) benign, 2 (5%) atypia of undetermined significance/follicular lesion of undetermined significance, 6 (15%) follicular neoplasm, 1 (3%) suspicious for malignancy, and 5 (13%) malignant. Conclusions: Good overall interobserver agreement was found, but discordance was seen when certain categories were analysed separately. (author)
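
    The pairwise kappa statistics reported in this record can be reproduced directly with scikit-learn; the category labels below are invented examples, not the study's data.

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Bethesda categories (I-VI coded 1-6) assigned by two observers to the same aspirates
    obs1 = [2, 2, 4, 6, 2, 1, 3, 2, 5, 2]
    obs2 = [2, 2, 4, 6, 2, 2, 3, 2, 6, 2]

    kappa = cohen_kappa_score(obs1, obs2)
    print(f"Cohen's kappa: {kappa:.3f}")  # > 0.6 is conventionally "substantial" agreement
    ```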

  1. Composite model to reproduce the mechanical behaviour of methane hydrate bearing soils

    Science.gov (United States)

    De la Fuente, Maria

    2016-04-01

    Methane hydrate bearing sediments (MHBS) are naturally occurring materials containing different components in the pores that may undergo phase changes under relatively small temperature and pressure variations, for conditions typically prevailing a few hundred meters below sea level. Their modelling needs to account for heat and mass balance equations of the different components, and several strategies already exist to combine them (e.g., Rutqvist & Moridis, 2009; Sánchez et al., 2014). These equations have to be completed by restrictions and constitutive laws reproducing the phenomenology of heat and fluid flows, phase change conditions and the mechanical response. While the formulation of the non-mechanical laws generally includes explicitly the mass fraction of methane in each phase, which allows for a natural update of parameters during phase changes, mechanical laws are, in most cases, stated for the whole solid skeleton (Uchida et al., 2012; Soga et al., 2006). In this paper, a mechanical model is proposed to capture the response of MHBS. It is based on a composite approach that allows the thermo-hydro-mechanical response of the mineral skeleton and the solid hydrates to be defined independently. The global stress-strain-temperature response of the solid phase (grains + hydrate) is then obtained by combining both responses according to an energy principle, following the work of Pinyol et al. (2007). In this way, dissociation of MH can be assessed on the basis of the stress state and temperature prevailing locally within the hydrate component. Besides, its structuring effect is naturally accounted for by the model according to the patterns of MH inclusions within soil pores. This paper describes the fundamental hypotheses behind the model and its formulation. Its performance is assessed by comparison with laboratory data from the literature. An analysis of the MHBS response to several stress-temperature paths representing potential field cases is finally presented.

  2. Test Methods for Telemetry Systems and Subsystems. Volume 5: Test Methods for Digital Recorder/Reproducer Systems and Recorder Memory Modules

    Science.gov (United States)

    2016-09-26

    Test Methods for Telemetry Systems and Subsystems, Volume V: Test Methods for Digital Recorder/Reproducer Systems and Recorder Memory Modules (September 2016). Participating organizations include the Aberdeen Test Center, Dugway Proving Ground, Reagan Test Site, White Sands Missile Range, Yuma …, Arnold Engineering Development Complex, and the National Aeronautics and Space Administration. Distribution A: approved for public release; distribution …

  3. Can Computational Sediment Transport Models Reproduce the Observed Variability of Channel Networks in Modern Deltas?

    Science.gov (United States)

    Nesvold, E.; Mukerji, T.

    2017-12-01

    River deltas display complex channel networks that can be characterized through the framework of graph theory, as shown by Tejedor et al. (2015). Deltaic patterns may also be useful in a Bayesian approach to uncertainty quantification of the subsurface, but this requires a prior distribution of the networks of ancient deltas. By considering subaerial deltas, one can at least obtain a snapshot in time of the channel network spectrum across deltas. In this study, the directed graph structure is semi-automatically extracted from satellite imagery using techniques from statistical processing and machine learning; a toy example of the resulting representation is sketched below. Once the network is labeled with vertices and edges, spatial trends and width and sinuosity distributions can also be found easily. Since imagery is inherently 2D, computational sediment transport models can serve as a link between 2D network structure and 3D depositional elements; the numerous empirical rules and parameters built into such models make it necessary to validate the output with field data. For this purpose we have used a set of 110 modern deltas, with average water discharge ranging from 10 to 200,000 m3/s, as a benchmark for natural variability. Both graph-theoretic and more general distributions are established. A key question is whether it is possible to reproduce this deltaic network spectrum with computational models. Delft3D was used to solve the shallow water equations coupled with sediment transport. The experimental setup was relatively simple: incoming channelized flow onto a tilted plane, with varying wave and tidal energy, sediment types and grain size distributions, river discharge and a few other input parameters. Each realization was run until a delta had fully developed: between 50 and 500 years (with a morphology acceleration factor). It is shown that input parameters should not be sampled independently from the natural ranges, since this may result in deltaic output that falls well outside the natural spectrum. Since we are …
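
    The graph-theoretic characterization mentioned in this record (after Tejedor et al., 2015) amounts to simple operations on a directed graph; the toy network below is invented, not extracted from imagery.

    ```python
    import networkx as nx

    G = nx.DiGraph()
    # Edges point downstream: the apex 'a' bifurcates and channels rejoin at 'd' (a loop)
    G.add_edges_from([("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"),
                      ("d", "e"), ("d", "f")])

    bifurcations = [v for v in G if G.out_degree(v) > 1]
    confluences = [v for v in G if G.in_degree(v) > 1]
    n_loops = (G.number_of_edges() - G.number_of_nodes()
               + nx.number_connected_components(G.to_undirected()))  # cyclomatic number

    print(bifurcations, confluences, n_loops)  # ['a', 'd'] ['d'] 1
    ```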

  4. Conceptual model suitability for reproducing preferential flow paths in waste rock piles

    Science.gov (United States)

    Broda, S.; Blessent, D.; Aubertin, M.

    2012-12-01

    Waste rocks are typically deposited on mining sites, forming waste rock piles (WRPs). Acid mine drainage (AMD) or contaminated neutral drainage (CND), with metal leaching from the sulphidic minerals, adversely impacts soil and water composition on and beyond the mining sites. The deposition method and the highly heterogeneous hydrogeological and geochemical properties of waste rock have a major impact on water and oxygen movement and on the pore water pressure distribution in the WRP, controlling AMD/CND production. However, the prediction and interpretation of water distribution in WRPs is a challenging problem, and many attempted numerical investigations of short- and long-term forecasts have been found unreliable. Various forms of unsaturated, localized preferential flow processes have been identified, for instance flow in macropores and fractures, and heterogeneity-driven and gravity-driven unstable flow, with local hydraulic conductivities reaching several dozen meters per day. Such phenomena have been entirely neglected in numerical WRP modelling and are unattainable with the classical equivalent porous medium conceptual approach typically used in this field. An additional complicating circumstance is that the locations of macropores and fractures are unknown a priori. In this study, modelling techniques originally designed for massive fractured rock aquifers are applied. The properties of the waste rock material, found at the Tio mine at Havre-Saint-Pierre, Québec (Canada), used in this modelling study were retrieved from laboratory permeability and water retention tests. These column tests were reproduced with the numerical 3D fully integrated surface/subsurface flow model HydroGeoSphere, where material heterogeneity is represented by means of (i) the dual continuum approach, (ii) discrete fractures, and (iii) a stochastic facies distribution framework using TPROGS. Comparisons with measured pore water pressures, tracer concentrations and exiting water volumes allowed defining limits and …

  5. A novel, comprehensive, and reproducible porcine model for determining the timing of bruises in forensic pathology

    DEFF Research Database (Denmark)

    Barington, Kristiane; Jensen, Henrik Elvang

    2016-01-01

    … in order to identify gross and histological parameters that may be useful in determining the age of a bruise. Methods: The mechanical device was able to apply a single reproducible stroke with a plastic tube, equivalent to being struck by a man. In each of 10 anesthetized pigs, four strokes …

  6. Reproducibility of twenty-four-hour finger arterial blood pressure, variability and systemic hemodynamics

    NARCIS (Netherlands)

    Voogel, A. J.; van Montfrans, G. A.

    1997-01-01

    At present, non-invasive continuous monitoring of finger arterial blood pressure by the volume-clamp technique is considered the best approach to obtain reliable assessments of beat-to-beat blood pressure. However, data on the reproducibility (accuracy and precision) of prolonged recordings and of …

  7. Can the CMIP5 models reproduce interannual to interdecadal southern African summer rainfall variability and their teleconnections?

    Science.gov (United States)

    Dieppois, Bastien; Pohl, Benjamin; Crétat, Julien; Keenlyside, Noel; New, Mark

    2017-04-01

    This study examines for the first time the ability of 28 global climate models from the Coupled Model Intercomparison Project 5 (CMIP5) to reproduce southern African summer rainfall variability and its teleconnections with large-scale modes of climate variability across the dominant timescales. In observations, summer southern African rainfall exhibits three significant timescales of variability over the twentieth century: interdecadal (15-28 years), quasi-decadal (8-13 years), and interannual (2-8 years). Most CMIP5 simulations underestimate southern African summer rainfall variability at these three timescales, and this bias is proportionally stronger from high to low frequency. The inter-model spread is as important as the spread between the ensemble members of a given model, which suggests a strong influence of internal climate variability and/or large model uncertainties. The underestimated amplitude of rainfall variability at each timescale is linked to unrealistic spatial distributions of these fluctuations over the subcontinent in most CMIP5 models. This is, at least partially, due to a poor representation of the tropical/subtropical teleconnections, which are known to favour wet conditions over southern Africa in the observations. Most CMIP5 realisations (85%) fail to simulate sea-surface temperature (SST) anomalies related to a negative Pacific Decadal Oscillation during wetter conditions at the interdecadal timescale. At the quasi-decadal timescale, only one-third of the simulations display a negative Interdecadal Pacific Oscillation during wetter conditions, and these SST anomalies are anomalously shifted westward and poleward when compared to observed anomalies. Similar biases in simulating La Niña SST anomalies are identified in more than 50% of CMIP5 simulations at the interannual timescale. These biases in Pacific SST anomalies result in important shifts in the Walker circulation. This impacts southern African rainfall variability …

  8. A novel porcine model of ataxia telangiectasia reproduces neurological features and motor deficits of human disease.

    Science.gov (United States)

    Beraldi, Rosanna; Chan, Chun-Hung; Rogers, Christopher S; Kovács, Attila D; Meyerholz, David K; Trantzas, Constantin; Lambertz, Allyn M; Darbro, Benjamin W; Weber, Krystal L; White, Katherine A M; Rheeden, Richard V; Kruer, Michael C; Dacken, Brian A; Wang, Xiao-Jun; Davis, Bryan T; Rohret, Judy A; Struzynski, Jason T; Rohret, Frank A; Weimer, Jill M; Pearce, David A

    2015-11-15

    Ataxia telangiectasia (AT) is a progressive multisystem disorder caused by mutations in the AT-mutated (ATM) gene. AT is a neurodegenerative disease primarily characterized by cerebellar degeneration in children, leading to motor impairment. The disease progresses with other clinical manifestations, including oculocutaneous telangiectasia, immune disorders, and increased susceptibility to cancer and respiratory infections. Although genetic investigations and physiological models have established the linkage of ATM with AT onset, the mechanisms linking ATM to neurodegeneration remain undetermined, hindering therapeutic development. Several murine models of AT have been successfully generated showing some of the clinical manifestations of the disease; however, they do not fully recapitulate the hallmark neurological phenotype, highlighting the need for a more suitable animal model. We engineered a novel porcine model of AT to better phenocopy the disease and bridge the gap between human and current animal models. The initial characterization of AT pigs revealed early cerebellar lesions, including loss of Purkinje cells (PCs) and altered cytoarchitecture, suggesting a developmental etiology for AT, which could advocate for early therapies for AT patients. In addition, similar to patients, AT pigs show growth retardation and develop motor deficit phenotypes. By using the porcine system to model human AT, we established the first animal model showing PC loss and motor features of the human disease. The novel AT pig provides new opportunities to unmask functions and roles of ATM in AT disease and in physiological conditions. © The Author 2015. Published by Oxford University Press. All rights reserved.

  10. Reproducing Electric Field Observations during Magnetic Storms by means of Rigorous 3-D Modelling and Distortion Matrix Co-estimation

    Science.gov (United States)

    Püthe, Christoph; Manoj, Chandrasekharan; Kuvshinov, Alexey

    2015-04-01

    Electric fields induced in the conducting Earth during magnetic storms drive currents in power transmission grids, telecommunication lines and buried pipelines. These geomagnetically induced currents (GIC) can cause severe service disruptions. The prediction of GIC is thus of great importance for the public and industry. A key step in the prediction of the hazard to technological systems during magnetic storms is the calculation of the geoelectric field. To address this issue for mid-latitude regions, we developed a method that involves 3-D modelling of induction processes in a heterogeneous Earth and the construction of a model of the magnetospheric source. The latter is described by low-degree spherical harmonics; its temporal evolution is derived from observatory magnetic data. Time series of the electric field can be computed for every location on Earth's surface. The actual electric field, however, is known to be perturbed by galvanic effects, arising from very local near-surface heterogeneities or topography, which cannot be included in the conductivity model. Galvanic effects are commonly accounted for with a real-valued, time-independent distortion matrix, which linearly relates measured and computed electric fields. Using data from various magnetic storms that occurred between 2000 and 2003, we estimated distortion matrices for observatory sites onshore and on the ocean bottom. Strong correlations between modelled and measured fields validate our method. The distortion matrix estimates prove to be reliable, as they are accurately reproduced for different magnetic storms. We further show that 3-D modelling is crucial for a correct separation of galvanic and inductive effects and a precise prediction of electric field time series during magnetic storms. Since the required computational resources are negligible, our approach is suitable for a real-time prediction of GIC. For this purpose, a reliable forecast of the source field, e.g. based on data from satellites …
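
    The distortion-matrix step this record describes is a linear least-squares problem: observed horizontal electric fields are modelled as a fixed 2x2 matrix acting on the computed fields. The sketch below recovers such a matrix from synthetic storm-time series; the matrix and noise level are invented placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500                                        # time samples during a storm
    E_mod = rng.normal(size=(n, 2))                # computed (Ex, Ey), arbitrary units
    D_true = np.array([[1.2, -0.3], [0.1, 0.8]])   # galvanic distortion to recover
    E_obs = E_mod @ D_true.T + 0.05 * rng.normal(size=(n, 2))

    # Least-squares solution of E_obs ~ E_mod @ D.T
    X, *_ = np.linalg.lstsq(E_mod, E_obs, rcond=None)
    D_est = X.T
    print(np.round(D_est, 2))                      # close to D_true; stable across storms
    ```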

  11. From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics.

    Directory of Open Access Journals (Sweden)

    Alejandra González-Beltrán

    Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed. The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which the ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler. Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of the ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings. The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of an erratum. SOAPdenovo2 scripts, data, and results are available through the GigaScience Database: http://dx.doi.org/10.5524/100044; the workflows are available from GigaGalaxy: http://galaxy.cbiit.cuhk.edu.hk; and the representations using the ISA, NP, and RO models are available through the SOAPdenovo2 case study website http://isa-tools.github.io/soapdenovo2

  12. [The Autocad system for planimetric study of the optic disc in glaucoma: technique and reproducibility study].

    Science.gov (United States)

    Sánchez Pérez, A; Honrubia López, F M; Larrosa Poves, J M; Polo Llorens, V; Melcon Sánchez-Frieras, B

    2001-09-01

    To develop a lens planimetry technique for the optic disc using AutoCAD, and to determine the magnitude of variability of optic disc morphological measurements. We employed AutoCAD R14.0 (Autodesk) for image acquisition, contour delimitation by multiple-line fitting or ellipse fitting, image sectorization, and quantification of measurements (optic disc and excavation, vertical diameters, optic disc area, excavation area, neuroretinal sector area and beta atrophy area). Intraimage (operator) and interimage (total) reproducibility were studied by the coefficient of variability (CV) (n=10) in normal and myopic optic discs. This technique allows optic disc measurements to be obtained in 5 to 10 minutes. The total (interimage) variability of measurements introduced by one observer showed CVs ranging from 1.18 to 4.42; the operator (intraimage) variability showed CVs ranging from 0.30 to 4.21. Optic disc contour delimitation by ellipse fitting achieved better reproducibility than multiple-line fitting for all measurements. Computer-assisted AutoCAD planimetry is an interactive method for analysing the optic disc that is feasible to incorporate into clinical practice. Its reproducibility is comparable to that of other analysers in quantifying optic disc morphology. Ellipse fitting improves the delimitation of optic disc contours.

  13. Pharmacokinetic Modelling to Predict FVIII:C Response to Desmopressin and Its Reproducibility in Nonsevere Haemophilia A Patients.

    Science.gov (United States)

    Schütte, Lisette M; van Hest, Reinier M; Stoof, Sara C M; Leebeek, Frank W G; Cnossen, Marjon H; Kruip, Marieke J H A; Mathôt, Ron A A

    2018-04-01

    Nonsevere haemophilia A (HA) patients can be treated with desmopressin. The response of factor VIII activity (FVIII:C) differs between patients and is difficult to predict. Our aims were to describe the FVIII:C response after desmopressin and its reproducibility by population pharmacokinetic (PK) modelling. Retrospective data of 128 nonsevere HA patients (age 7-75 years) receiving an intravenous or intranasal dose of desmopressin were used. PK modelling of FVIII:C was performed by nonlinear mixed effect modelling. Reproducibility of the FVIII:C response was defined as less than 25% difference in peak FVIII:C between administrations. A total of 623 FVIII:C measurements from 142 desmopressin administrations were available; 14 patients had received two administrations on different occasions. The FVIII:C time profile was best described by a two-compartment model with first-order absorption and elimination. Interindividual variability of the estimated baseline FVIII:C, central volume of distribution and clearance was 37, 43 and 50%, respectively. The most recently measured FVIII:C (FVIII-recent) was significantly associated with the FVIII:C response to desmopressin (p …), with a median FVIII:C increase of 0.47 IU/mL (interquartile range: 0.32-0.65 IU/mL, n = 142). The FVIII:C response was reproducible in 6 out of 14 patients receiving two desmopressin administrations. The FVIII:C response to desmopressin in nonsevere HA patients was adequately described by a population PK model. Large variability in the FVIII:C response was observed, which could only partially be explained by FVIII-recent. The FVIII:C response was not reproducible in a small subset of patients. Therefore, monitoring FVIII:C around surgeries or bleeding might be considered. Research is needed to study this further.
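
    The "two-compartment model with first-order absorption and elimination" named in this record has a compact ODE form; the sketch below uses invented rate constants and volumes, not the study's population estimates.

    ```python
    import numpy as np
    from scipy.integrate import odeint

    ka, CL, V1, V2, Q = 1.5, 0.2, 3.0, 2.0, 0.1    # assumed: 1/h, L/h, L, L, L/h

    def rhs(y, t):
        depot, c1, c2 = y                          # depot amount; central/peripheral conc.
        dc1 = (ka * depot - CL * c1 - Q * c1 + Q * c2) / V1
        dc2 = (Q * c1 - Q * c2) / V2
        return [-ka * depot, dc1, dc2]

    t = np.linspace(0, 24, 200)                    # hours after a desmopressin dose
    y = odeint(rhs, [1.0, 0.0, 0.0], t)            # unit dose placed in the depot
    print(f"peak central response (arbitrary units): {y[:, 1].max():.3f}")
    ```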

  14. Reproducibility of a novel model of murine asthma-like pulmonary inflammation.

    Science.gov (United States)

    McKinley, L; Kim, J; Bolgos, G L; Siddiqui, J; Remick, D G

    2004-05-01

    Sensitization to cockroach allergens (CRA) has been implicated as a major cause of asthma, especially among inner-city populations. Endotoxin from Gram-negative bacteria has also been investigated for its role in attenuating or exacerbating the asthmatic response. We have created a novel model utilizing house dust extract (HDE) containing high levels of both CRA and endotoxin to induce pulmonary inflammation (PI) and airway hyperresponsiveness (AHR). A potential drawback of this model is that the HDE is in limited supply, and preparation of new HDE will not contain the exact components of the HDE used to define our model system. The present study involved testing HDEs collected from various homes for their ability to cause PI and AHR. Dust collected from five homes was extracted in phosphate-buffered saline overnight. The levels of CRA and endotoxin in the supernatants varied from 7.1 to 49.5 mg/ml of CRA and 1.7 to 6 µg/ml of endotoxin in the HDEs. Following immunization and two pulmonary exposures to HDE, all five HDEs induced AHR, PI and plasma IgE levels substantially higher than in normal mice. This study shows that HDE containing high levels of cockroach allergens and endotoxin collected from different sources can induce an asthma-like response in our murine model.

  15. Evaluating the Reliability and Reproducibility of the AO and Lauge-Hansen Classification Systems for Ankle Injuries.

    Science.gov (United States)

    Yin, Meng-Chen; Yuan, Xue-Fei; Ma, Jun-Ming; Xia, Ye; Wang, Tao; Xu, Xiao-Li; Yan, Yin-Jie; Xu, Jin-Hai; Ye, Jie; Tong, Zheng-Yi; Feng, Yan-Qi; Wang, Hong-Bo; Wu, Xue-Qun; Mo, Wen

    2015-07-01

    Ankle injuries are responsible for more than 5 million emergency department visits each year. The AO and Lauge-Hansen classification systems are widely used in the clinical diagnosis of ankle injuries. This study aimed to analyze the intraobserver reliability and interobserver reproducibility of the AO and Lauge-Hansen classification systems. In addition, the authors explored the differences among physicians' classification responses and evaluated the clinical value for diagnosis. Fifty-six patients with an ankle injury with complete clinical and radiologic data were enrolled. The definition of injury type, the index score typing methods, and the specific study criteria were explained in detail. Five observers, who were orthopedic surgeons, determined the classifications according to both the AO and Lauge-Hansen systems. The classification was repeated 1 month later. Cronbach's alpha and Cohen's kappa test were used to determine interobserver reliability and intraobserver reproducibility. The physicians conducted 560 classifications (56 cases × 5 physicians × 2 times per patient). Average inter- and intraobserver kappa values for the AO system were 0.708 and 0.608, respectively. Average inter- and intraobserver kappa values for the Lauge-Hansen system were 0.402 and 0.398, respectively. Cronbach's alpha coefficient was 96.7% for the AO system and 76.0% for the Lauge-Hansen system. The Lauge-Hansen classification system is a comprehensive yet cumbersome system. Comparatively, the AO classification system is easier to understand. This study shows that the AO classification system has more reliability and reproducibility, and thus has more value in clinical practice, than the Lauge-Hansen classification system. Copyright 2015, SLACK Incorporated.

  16. The perceptual influence of the cabin acoustics on the reproduced sound of a car audio system

    DEFF Research Database (Denmark)

    Kaplanis, Neofytos; Bech, Søren; Sakari, Tervo

    2015-01-01

    . In this study, a sensory evaluation methodology [Lokki et al., J. Acoust. Soc. Am. 132, 3148–2161 (2012)] was employed to identify the most relevant attributes that characterize the influence of the physical properties of a car cabin on the reproduced sound field. A series of in-situ measurements of a high...... a previous review [Kaplanis et al., in 55th Int. Conf. Aud. Eng. Soc. (2014)] and possible links to the acoustical properties of the car cabin are discussed. [This study is a part of Marie Curie Network on Dereverberation and Reverberation of Audio, Music, and Speech. EU-FP7 under agreement ITN-GA-2012-316969.]...

  17. Development and reproducibility evaluation of a Monte Carlo-based standard LINAC model for quality assurance of multi-institutional clinical trials.

    Science.gov (United States)

    Usmani, Muhammad Nauman; Takegawa, Hideki; Takashina, Masaaki; Numasaki, Hodaka; Suga, Masaki; Anetai, Yusuke; Kurosu, Keita; Koizumi, Masahiko; Teshima, Teruki

    2014-11-01

    Technical developments in radiotherapy (RT) have created a need for systematic quality assurance (QA) to ensure that clinical institutions deliver prescribed radiation doses consistent with the requirements of clinical protocols. For QA, an ideal dose verification system should be independent of the treatment-planning system (TPS). This paper describes the development and reproducibility evaluation of a Monte Carlo (MC)-based standard LINAC model as a preliminary requirement for independent verification of dose distributions. The BEAMnrc MC code is used for characterization of the 6-, 10- and 15-MV photon beams for a wide range of field sizes. The modeling of the LINAC head components is based on the specifications provided by the manufacturer. MC dose distributions are tuned to match Varian Golden Beam Data (GBD). For reproducibility evaluation, calculated beam data is compared with beam data measured at individual institutions. For all energies and field sizes, the MC and GBD agreed to within 1.0% for percentage depth doses (PDDs), 1.5% for beam profiles and 1.2% for total scatter factors (Scps.). Reproducibility evaluation showed that the maximum average local differences were 1.3% and 2.5% for PDDs and beam profiles, respectively. MC and institutions' mean Scps agreed to within 2.0%. An MC-based standard LINAC model developed to independently verify dose distributions for QA of multi-institutional clinical trials and routine clinical practice has proven to be highly accurate and reproducible and can thus help ensure that prescribed doses delivered are consistent with the requirements of clinical protocols. © The Author 2014. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
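
    The "agreed to within 1.0% for PDDs" criterion in this record is a point-by-point local difference between curves on a common depth grid; a minimal sketch with synthetic stand-in curves follows.

    ```python
    import numpy as np

    depth = np.linspace(1.5, 30.0, 50)                  # cm, beyond the dose maximum
    pdd_meas = 100 * np.exp(-0.045 * (depth - 1.5))     # stand-in measured PDD
    pdd_mc = pdd_meas * (1 + 0.005 * np.sin(depth))     # stand-in Monte Carlo PDD

    local_diff = 100 * (pdd_mc - pdd_meas) / pdd_meas   # % local difference
    print(f"max local difference: {np.abs(local_diff).max():.2f}%")
    ```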

  18. In vivo reproducibility of robotic probe placement for an integrated US-CT image-guided radiation therapy system

    Science.gov (United States)

    Lediju Bell, Muyinatu A.; Sen, H. Tutkun; Iordachita, Iulian; Kazanzides, Peter; Wong, John

    2014-03-01

    Radiation therapy is used to treat cancer by delivering high-dose radiation to a pre-defined target volume. Ultrasound (US) has the potential to provide real-time, image-guidance of radiation therapy to identify when a target moves outside of the treatment volume (e.g. due to breathing), but the associated probe-induced tissue deformation causes local anatomical deviations from the treatment plan. If the US probe is placed to achieve similar tissue deformations in the CT images required for treatment planning, its presence causes streak artifacts that will interfere with treatment planning calculations. To overcome these challenges, we propose robot-assisted placement of a real ultrasound probe, followed by probe removal and replacement with a geometrically-identical, CT-compatible model probe. This work is the first to investigate in vivo deformation reproducibility with the proposed approach. A dog's prostate, liver, and pancreas were each implanted with three 2.38-mm spherical metallic markers, and the US probe was placed to visualize the implanted markers in each organ. The real and model probes were automatically removed and returned to the same position (i.e. position control), and CT images were acquired with each probe placement. The model probe was also removed and returned with the same normal force measured with the real US probe (i.e. force control). Marker positions in CT images were analyzed to determine reproducibility, and a corollary reproducibility study was performed on ex vivo tissue. In vivo results indicate that tissue deformations with the real probe were repeatable under position control for the prostate, liver, and pancreas, with median 3D reproducibility of 0.3 mm, 0.3 mm, and 1.6 mm, respectively, compared to 0.6 mm for the ex vivo tissue. For the prostate, the mean 3D tissue displacement errors between the real and model probes were 0.2 mm under position control and 0.6 mm under force control, which are both within acceptable …
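
    The reproducibility numbers in this record are median 3D marker displacements between repeated placements; the sketch below shows the computation on invented marker coordinates.

    ```python
    import numpy as np

    first = np.array([[10.0, 22.1, 5.3], [14.2, 25.0, 6.1], [9.8, 20.5, 7.7]])   # mm
    repeat = first + np.array([[0.1, -0.2, 0.2], [0.3, 0.1, -0.1], [-0.2, 0.2, 0.1]])

    disp = np.linalg.norm(repeat - first, axis=1)       # per-marker 3D displacement
    print(f"median 3D reproducibility: {np.median(disp):.2f} mm")
    ```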

  19. Ensemble experiments using a nested LETKF system to reproduce intense vortices associated with tornadoes of 6 May 2012 in Japan

    Science.gov (United States)

    Seko, Hiromu; Kunii, Masaru; Yokota, Sho; Tsuyuki, Tadashi; Miyoshi, Takemasa

    2015-12-01

    Experiments simulating intense vortices associated with tornadoes that occurred on 6 May 2012 on the Kanto Plain, Japan, were performed with a nested local ensemble transform Kalman filter (LETKF) system. Intense vortices were reproduced by downscale experiments with a 12-member ensemble in which the initial conditions were obtained from the nested LETKF system analyses. The downscale experiments successfully generated intense vortices in three regions similar to the observed vortices, whereas only one tornado was reproduced by a deterministic forecast. The intense vorticity of the strongest tornado, which was observed in the southernmost region, was successfully reproduced by 10 of the 12 ensemble members. An examination of the results of the ensemble downscale experiments showed that the duration of intense vorticities tended to be longer when the vertical shear of the horizontal wind was larger and the lower airflow was more humid. Overall, the study results show that ensemble forecasts have the following merits: (1) probabilistic forecasts of the outbreak of intense vortices associated with tornadoes are possible; (2) the miss rate of outbreaks should decrease; and (3) environmental factors favoring outbreaks can be obtained by comparing the multiple possible scenarios of the ensemble forecasts.

  20. Monitoring microbiological changes in drinking water systems using a fast and reproducible flow cytometric method

    KAUST Repository

    Prest, Emmanuelle I E C

    2013-12-01

    Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate the bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% on all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (the so-called fluorescence fingerprint) was accomplished firstly through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large, brightly fluorescent, high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result) and allows accurate detection of even small changes in aquatic environments (detection above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool in the drinking water field, e.g. for rapid screening of microbial water quality and stability during water treatment and distribution in networks and premise plumbing. © 2013 Elsevier Ltd.
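
    The fingerprint this record proposes, cell concentration plus the percentage of high nucleic acid (HNA) events above a fixed gate, is straightforward to compute from event data; the events and gate threshold below are invented placeholders, not the study's instrument settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    sample_a = rng.normal(2.0, 0.3, 20000)     # log10 green fluorescence per event
    sample_b = rng.normal(2.3, 0.3, 20000)
    hna_gate = 2.2                             # fixed threshold separating LNA from HNA

    def fingerprint(events, volume_ml=0.05):
        conc = len(events) / volume_ml                 # cells per ml measured
        pct_hna = 100 * np.mean(events > hna_gate)     # % events above the gate
        return conc, pct_hna

    for name, s in (("A", sample_a), ("B", sample_b)):
        conc, pct = fingerprint(s)
        print(f"sample {name}: {conc:.0f} cells/ml, {pct:.1f}% HNA")
    ```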

  1. Reproducibility of the acute rejection diagnosis in human cardiac allografts. The Stanford Classification and the International Grading System

    DEFF Research Database (Denmark)

    Nielsen, H; Sørensen, Flemming Brandt; Nielsen, B

    1993-01-01

    Transplantation has become an accepted treatment of many cardiac end-stage diseases. Acute cellular rejection accounts for 15% to 20% of all graft failures. The first grading system of acute cellular rejection, the Stanford Classification, was introduced in 1979, and since then many other grading systems have evolved. Most recently, the International Grading System was introduced in The Journal of Heart and Lung Transplantation. In this study the interobserver reproducibility of both the Stanford Classification and the International Grading System is evaluated using Kappa statistics. Three … necrosis are used. These terms create some difficulties in understanding or interpreting the various grades. The main problem is to distinguish between grade 1A and grade 3A. Despite the difficulties, the grading system is easy to use, but a revision is needed.

  2. Validation of EURO-CORDEX regional climate models in reproducing the variability of precipitation extremes in Romania

    Science.gov (United States)

    Dumitrescu, Alexandru; Busuioc, Aristita

    2016-04-01

    EURO-CORDEX is the European branch of the international CORDEX initiative, which aims to provide improved regional climate change projections for Europe. The main objective of this paper is to document the performance of individual models in reproducing the variability of precipitation extremes in Romania. Here, an ensemble of three EURO-CORDEX regional climate models (RCMs; scenario RCP4.5) is analysed and inter-compared: DMI-HIRHAM5, KNMI-RACMO2.2 and MPI-REMO. Compared to previous studies, in which RCM validation for the Romanian climate was mainly based on the mean state at station scale, a more quantitative approach to precipitation extremes is proposed. In this respect, to allow a more reliable comparison with observations, a high-resolution daily precipitation gridded data set was used as the observational reference (CLIMHYDEX project). The comparison between the RCM outputs and observed grid point values was made by calculating three precipitation extremes indices, recommended by the Expert Team on Climate Change Detection and Indices (ETCCDI), for the 1976-2005 period: R10MM, the annual count of days when precipitation ≥ 10 mm; RX5DAY, the annual maximum 5-day precipitation; and R95P%, the fraction of annual total precipitation due to daily precipitation above the 95th percentile. The RCMs' capability to reproduce the mean state of these variables, as well as the main modes of their spatial variability (given by the first three EOF patterns), is analysed. The investigation confirms the ability of the RCMs to simulate the main features of precipitation extreme variability over Romania, but some deficiencies in reproducing their regional characteristics were found (for example, overestimation of the mean state, especially over the extra-Carpathian regions). This work has been realised within the research project "Changes in climate extremes and associated impact in hydrological events in Romania" (CLIMHYDEX), code PN II-ID-2011-2-0073, financed by the Romanian …
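
    The three ETCCDI indices used in this record can be computed in a few lines from daily precipitation; the series below is synthetic, and the 95th percentile is taken within-year for simplicity (ETCCDI prescribes a base-period climatology).

    ```python
    import numpy as np

    pr = np.random.default_rng(3).gamma(0.4, 6.0, size=365)   # daily precipitation, mm

    r10mm = int(np.sum(pr >= 10.0))                           # days with >= 10 mm

    # RX5DAY: maximum precipitation accumulated over any 5 consecutive days
    rx5day = max(pr[i:i + 5].sum() for i in range(len(pr) - 4))

    # R95P%: fraction of the annual total falling on days wetter than the
    # 95th percentile of wet days (wet day: >= 1 mm)
    wet = pr[pr >= 1.0]
    r95p_pct = 100 * pr[pr > np.percentile(wet, 95)].sum() / pr.sum()

    print(r10mm, round(rx5day, 1), round(r95p_pct, 1))
    ```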

  3. Using a 1-D model to reproduce the diurnal variability of SST

    DEFF Research Database (Denmark)

    Karagali, Ioanna; Høyer, Jacob L.; Donlon, Craig J.

    2017-01-01

    … The preferred approach to bridge the gap between in situ and remotely sensed measurements and to obtain diurnal warming estimates at large spatial scales is modelling of the upper ocean temperature. This study uses the one-dimensional General Ocean Turbulence Model (GOTM) to resolve diurnal signals identified from … forcing fields, and is able to resolve the daily SST variability seen in both satellite and in situ measurements. As such, and due to its low computational cost, it is proposed as a candidate model for diurnal variability estimates.

  4. Energy and nutrient deposition and excretion in the reproducing sow: model development and evaluation

    DEFF Research Database (Denmark)

    Hansen, A V; Strathe, A B; Theil, Peter Kappel

    2014-01-01

    Air and nutrient emissions from swine operations raise environmental concerns. During the reproduction phase, sows consume and excrete large quantities of nutrients. The objective of this study was to develop a mathematical model to describe energy and nutrient partitioning and to predict manure excretion and composition and methane emissions on a daily basis. The model was structured to contain gestation and lactation modules, which can be run separately or sequentially, with outputs from the gestation module used as inputs to the lactation module. In the gestation module, energy and protein … was related to predictions of body fat and protein loss from the lactation model. Nitrogen intake, urine N, fecal N, and milk N were predicted with an RMSPE, as a percentage of the observed mean, of 9.7, 17.9, 10.0, and 7.7%, respectively. The model provided a framework, but more refinements and improvements in accuracy …

  5. Do on/off time series models reproduce emerging stock market comovements?

    OpenAIRE

    Mohamed el hédi Arouri; Fredj Jawadi

    2011-01-01

    Using nonlinear modeling tools, this study investigates the comovements between the Mexican and world stock markets over the last three decades. While previous works highlight only some evidence of comovements, our paper aims to specify the different time-varying links and mechanisms characterizing the Mexican stock market through the comparison of two nonlinear error correction models (NECMs). Our findings point to strong evidence of time-varying and nonlinear mean-reversion and lin...
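
    A nonlinear error correction model of this kind lets the strength of mean reversion depend on the size of the disequilibrium; a minimal simulation sketch of one such (ESTAR-type) specification, with assumed coefficients rather than the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(5)
T, BETA = 500, 1.0                         # sample length, cointegrating coefficient

def estar_ecm(rho=-0.3, gamma=2.0):
    """y cointegrated with x; error-correction strength grows with squared disequilibrium."""
    x = np.cumsum(rng.normal(size=T))                  # random-walk world market index
    y = np.empty(T)
    y[0] = x[0]
    for t in range(1, T):
        z = y[t - 1] - BETA * x[t - 1]                 # lagged disequilibrium
        g = 1.0 - np.exp(-gamma * z ** 2)              # smooth transition weight in (0, 1)
        y[t] = y[t - 1] + rho * g * z + (x[t] - x[t - 1]) + rng.normal(0.0, 0.5)
    return x, y

x, y = estar_ecm()
print(f"mean |disequilibrium| = {np.mean(np.abs(y - BETA * x)):.2f}")
```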

  6. The Computable Catchment: An executable document for model-data software sharing, reproducibility and interactive visualization

    Science.gov (United States)

    Gil, Y.; Duffy, C.

    2015-12-01

    This paper proposes the concept of a "Computable Catchment" which is used to develop a collaborative platform for watershed modeling and data analysis. The object of the research is a sharable, executable document similar to a pdf, but one that includes documentation of the underlying theoretical concepts, interactive computational/numerical resources, linkage to essential data repositories and the ability for interactive model-data visualization and analysis. The executable document for each catchment is stored in the cloud with automatic provisioning and a unique identifier allowing collaborative model and data enhancements for historical hydroclimatic reconstruction and/or future landuse or climate change scenarios to be easily reconstructed or extended. The Computable Catchment adopts metadata standards for naming all variables in the model and the data. The a-priori or initial data is derived from national data sources for soils, hydrogeology, climate, and land cover available from the www.hydroterre.psu.edu data service (Leonard and Duffy, 2015). The executable document is based on Wolfram CDF or Computable Document Format with an interactive open-source reader accessible by any modern computing platform. The CDF file and contents can be uploaded to a website or simply shared as a normal document maintaining all interactive features of the model and data. The Computable Catchment concept represents one application for Geoscience Papers of the Future representing an extensible document that combines theory, models, data and analysis that are digitally shared, documented and reused among research collaborators, students, educators and decision makers.

  7. Evaluating the systemic right ventricle by CMR: the importance of consistent and reproducible delineation of the cavity

    Directory of Open Access Journals (Sweden)

    van Dijk Arie PJ

    2008-08-01

    Full Text Available Abstract Background The method used to delineate the boundary of the right ventricle (RV), relative to the trabeculations and papillary muscles in cardiovascular magnetic resonance (CMR) ventricular volume analysis, may matter more when these structures are hypertrophied than in individuals with normal cardiovascular anatomy. This study aimed to compare two methods of cavity delineation in patients with systemic RV. Methods Twenty-nine patients (mean age 34.7 ± 12.4 years) with a systemic RV (12 with congenitally corrected transposition of the great arteries (ccTGA) and 17 with atrially switched transposition (TGA)) underwent CMR. We compared measurements of systemic RV volumes and function using two analysis protocols. The RV trabeculations and papillary muscles were either included in the calculated blood volume, the boundary drawn immediately within the apparently compacted myocardial layer, or they were manually outlined and excluded. RV stroke volume (SV) calculated using each method was compared with the corresponding left ventricular (LV) SV. Additionally, we compared the differences in analysis time, and in intra- and inter-observer variability between the two methods. A paired samples t-test was used to test for differences in volumes, function and analysis time between the two methods. Differences in intra- and inter-observer reproducibility were tested using an extension of the Bland-Altman method. Results The inclusion of trabeculations and papillary muscles in the ventricular volume resulted in higher values for systemic RV end-diastolic volume (mean difference 28.7 ± 10.6 ml, p Conclusion The choice of method for systemic RV cavity delineation significantly affected volume measurements, given the CMR acquisition and analysis systems used. We recommend delineation outside the trabeculations for routine clinical measurements of systemic RV volumes as this approach took less time and gave more reproducible measurements.
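
    The agreement analysis described above rests on paired differences; a minimal sketch of a basic Bland-Altman computation (hypothetical volumes; the study itself uses an extension of this method):

```python
import numpy as np

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias, sd = diff.mean(), diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

incl = [210.0, 188.0, 243.0, 199.0, 225.0, 234.0]   # hypothetical RVEDV (ml), trabeculae included
excl = [182.0, 161.0, 212.0, 170.0, 200.0, 205.0]   # hypothetical RVEDV (ml), trabeculae excluded
bias, loa = bland_altman(incl, excl)
print(f"bias = {bias:.1f} ml, 95% LoA = {loa[0]:.1f} to {loa[1]:.1f} ml")
```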

  8. SpectraCam®: A new polarized hyperspectral imaging system for repeatable and reproducible in vivo skin quantification of melanin, total hemoglobin, and oxygen saturation.

    Science.gov (United States)

    Nkengne, A; Robic, J; Seroul, P; Gueheunneux, S; Jomier, M; Vie, K

    2018-02-01

    An accurate way to determine skin pigmentation is to acquire the spectral reflectance of a skin sample and to quantify chromophores by reverse calculation from physical models of light propagation. Therefore, we tested a new hyperspectral imaging device and software suite, the SpectraCam® system, and evaluated its accuracy in quantifying skin chromophores. Validation of the SpectraCam® system was performed by, firstly, comparing the known and the acquired reflectance spectra of color phantoms. Repeatability and reproducibility were then evaluated by two operators who performed acquisitions at different time points and compared the acquired reflectance spectra. The specificity of the system was tested by quantitative analysis of single-chromophore variation models: lentigo and pressure relief. Finally, we tested the ability of the SpectraCam® system to detect variations in chromophores in the eye region due to the daily application of a new anti-dark circle cosmetic product. The SpectraCam® system faithfully acquires the reflectance spectra of color phantoms (r² > 0.90). The skin reflectance spectra acquired by different operators at different times are highly repeatable (r² > 0.94) and reproducible (r² > 0.99). The SpectraCam® system can also produce qualitative maps that reveal local variations in skin chromophores or underlying structures such as blood vessels. The system is precise enough to detect melanin variation in lentigo or total hemoglobin and oxygen saturation variations upon pressure relief. It is also sensitive enough to detect a decrease in melanin in the eye region due to the application of an anti-dark circle cosmetic product. The SpectraCam® system proves to be rapid and produces high-resolution data encompassing a large field of view. It is a robust hyperspectral imaging system that quantifies melanin, total hemoglobin, and oxygen saturation and is well adapted to cosmetic research. © 2017 John Wiley & Sons A/S. Published by John Wiley

  9. "High-precision, reconstructed 3D model" of skull scanned by conebeam CT: Reproducibility verified using CAD/CAM data.

    Science.gov (United States)

    Katsumura, Seiko; Sato, Keita; Ikawa, Tomoko; Yamamura, Keiko; Ando, Eriko; Shigeta, Yuko; Ogawa, Takumi

    2016-01-01

    Computed tomography (CT) scanning has recently been introduced into forensic medicine and dentistry. However, the presence of metal restorations in the dentition can adversely affect the quality of three-dimensional reconstruction from CT scans. In this study, we aimed to evaluate the reproducibility of a "high-precision, reconstructed 3D model" obtained from a conebeam CT scan of dentition, a method that might be particularly helpful in forensic medicine. We took conebeam CT and helical CT images of three dry skulls marked with 47 measuring points; reconstructed three-dimensional images; and measured the distances between the points in the 3D images with a computer-aided design/computer-aided manufacturing (CAD/CAM) marker. We found that in comparison with the helical CT, conebeam CT is capable of reproducing measurements closer to those obtained from the actual samples. In conclusion, our study indicated that the image-reproduction from a conebeam CT scan was more accurate than that from a helical CT scan. Furthermore, the "high-precision reconstructed 3D model" facilitates reliable visualization of full-sized oral and maxillofacial regions in both helical and conebeam CT scans. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. A novel porcine model of ataxia telangiectasia reproduces neurological features and motor deficits of human disease

    OpenAIRE

    Beraldi, Rosanna; Chan, Chun-Hung; Rogers, Christopher S.; Kovács, Attila D.; Meyerholz, David K.; Trantzas, Constantin; Lambertz, Allyn M.; Darbro, Benjamin W.; Weber, Krystal L.; White, Katherine A.M.; Rheeden, Richard V.; Kruer, Michael C.; Dacken, Brian A.; Wang, Xiao-Jun; Davis, Bryan T.

    2015-01-01

    Ataxia telangiectasia (AT) is a progressive multisystem disorder caused by mutations in the AT-mutated (ATM) gene. AT is a neurodegenerative disease primarily characterized by cerebellar degeneration in children leading to motor impairment. The disease progresses with other clinical manifestations including oculocutaneous telangiectasia, immune disorders, increased susceptibly to cancer and respiratory infections. Although genetic investigations and physiological models have established the l...

  11. Establishing a Reproducible Hypertrophic Scar following Thermal Injury: A Porcine Model

    Directory of Open Access Journals (Sweden)

    Scott J. Rapp, MD

    2015-02-01

    Conclusions: Deep partial-thickness thermal injury to the back of domestic swine produces an immature hypertrophic scar by 10 weeks following the burn, with scar thickness appearing to vary with location along the dorsal axis. With minimal pig-to-pig variation, we describe our technique to provide a testable immature scar model.

  12. A Versatile and Reproducible Multi-Frequency Electrical Impedance Tomography System

    Directory of Open Access Journals (Sweden)

    James Avery

    2017-01-01

    Full Text Available A highly versatile Electrical Impedance Tomography (EIT) system, nicknamed the ScouseTom, has been developed. The system allows control over current amplitude, frequency, number of electrodes, injection protocol and data processing. Current is injected using a Keithley 6221 current source, and voltages are recorded with a 24-bit EEG system with a minimum bandwidth of 3.2 kHz. Custom PCBs interface with a PC to control the measurement process, electrode addressing and triggering of external stimuli. The performance of the system was characterised using resistor phantoms to represent human scalp recordings, with an SNR of 77.5 dB, stable across a four-hour recording and from 20 Hz to 20 kHz. In studies of both haemorrhage using scalp electrodes, and evoked activity using epicortical electrode mats in rats, it was possible to reconstruct images matching established literature at known areas of onset. Data collected using scalp electrodes in humans matched known tissue impedance spectra and were stable over frequency. The experimental procedure is software controlled and is readily adaptable to new paradigms. Where possible, commercial or open-source components were used, to minimise the complexity of reproduction. The hardware designs and software for the system have been released under an open-source licence, encouraging contributions and allowing for rapid replication.
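
    The headline SNR figure for such a system comes straight from repeated boundary-voltage frames; a minimal sketch of the computation (synthetic data; not the ScouseTom processing chain):

```python
import numpy as np

def snr_db(frames):
    """Per-channel SNR of repeated boundary-voltage frames: mean over std, in dB."""
    v = np.asarray(frames, float)                        # shape: frames x channels
    return 20.0 * np.log10(np.abs(v.mean(axis=0)) / v.std(axis=0, ddof=1))

rng = np.random.default_rng(9)
frames = 1e-3 * (1.0 + 1e-4 * rng.normal(size=(500, 32)))  # ~80 dB synthetic recording
print(f"median channel SNR = {np.median(snr_db(frames)):.1f} dB")
```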

  13. Reproducibility of optical quality parameters measured at objective and subjective best focuses in a double-pass system

    Directory of Open Access Journals (Sweden)

    Ai-Lian Hu

    2015-10-01

    Full Text Available AIM: To evaluate intra-session repeatability and reproducibility of optical quality parameters measured at objective and subjective best focuses in a double-pass system. METHODS: Thirty Chinese healthy adults (19 to 40 years old) meeting our inclusion criterion were enrolled in the study. After a basic eye examination, two methods of optical quality measurement, based on subjective and objective best focuses, were performed using the Optical Quality Analysis System (OQAS) with an artificial pupil diameter of 4.0 mm. RESULTS: With each method, three consecutive measurements of the following parameters: the modulation transfer function cutoff frequency (MTFcutoff), the Strehl2D ratio, the OQAS values (OVs) at contrasts of 100%, 20%, 9% and the objective scatter index (OSI), were performed by an experienced examiner. The repeatability of each method was evaluated by the repeatability limit (RL) and the coefficient of repeatability (COR). Reproducibility of the two methods was evaluated by the intra-class correlation coefficient (ICC) and the 95% limits of agreement (Bland and Altman analysis). Thirty subjects, seven females and twenty-three males, from whom 15 right eyes and 15 left eyes were selected randomly for recruitment in the study. The RLs (percentage) for the six parameters measured at objective focus and subjective focus ranged from 8.44% to 15.13% and 10.85% to 16.26%, respectively. The CORs for the two measurement methods ranged from 8.27% to 14.83% and 10.63% to 15.93%, respectively. With regard to reproducibility, the ICCs for the six parameters of OQAS ranged from 0.024 to 0.276. The 95% limits of agreement obtained for the six parameters (in comparison of the two methods) ranged from -0.57 to 42.18 (MTFcutoff), -0.01 to 0.23 (Strehl2D ratio), -0.02 to 1.40 (OV100%), -0.10 to 1.75 (OV20%), -0.14 to 1.80 (OV9%) and -1.46 to 0.18 (OSI). CONCLUSION: Measurements provided by OQAS with either method showed good repeatability. However, the results obtained from the
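
    The repeatability statistics named above can be reproduced from triplicate measurements; a minimal sketch of the coefficient of repeatability and a simple one-way ICC (hypothetical data; the paper's exact formulas may differ):

```python
import numpy as np

def repeatability(measurements):
    """COR and one-way ICC(1,1) from a subjects x repeats array."""
    m = np.asarray(measurements, float)
    n, k = m.shape
    msb = k * np.sum((m.mean(axis=1) - m.mean()) ** 2) / (n - 1)              # between subjects
    msw = np.sum((m - m.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))    # within subjects
    cor = 1.96 * np.sqrt(2.0 * msw)                 # coefficient of repeatability
    icc = (msb - msw) / (msb + (k - 1) * msw)
    return cor, icc

rng = np.random.default_rng(1)
true = rng.normal(30.0, 5.0, size=(30, 1))          # 30 subjects, e.g. MTF cutoff values
data = true + rng.normal(0.0, 1.5, size=(30, 3))    # three repeated measurements each
cor, icc = repeatability(data)
print(f"COR = {cor:.2f}, ICC = {icc:.2f}")
```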

  14. A computational model incorporating neural stem cell dynamics reproduces glioma incidence across the lifespan in the human population.

    Directory of Open Access Journals (Sweden)

    Roman Bauer

    Full Text Available Glioma is the most common form of primary brain tumor. Demographically, the risk of occurrence increases until old age. Here we present a novel computational model to reproduce the probability of glioma incidence across the lifespan. Previous mathematical models explaining glioma incidence are framed in a rather abstract way, and do not directly relate to empirical findings. To decrease this gap between theory and experimental observations, we incorporate recent data on cellular and molecular factors underlying gliomagenesis. Since evidence implicates the adult neural stem cell as the likely cell-of-origin of glioma, we have incorporated empirically-determined estimates of neural stem cell number, cell division rate, mutation rate and oncogenic potential into our model. We demonstrate that our model yields results which match actual demographic data in the human population. In particular, this model accounts for the observed peak incidence of glioma at approximately 80 years of age, without the need to assert differential susceptibility throughout the population. Overall, our model supports the hypothesis that glioma is caused by randomly-occurring oncogenic mutations within the neural stem cell population. Based on this model, we assess the influence of the (experimentally indicated) decrease in the number of neural stem cells and increase of cell division rate during aging. Our model provides multiple testable predictions, and suggests that different temporal sequences of oncogenic mutations can lead to tumorigenesis. Finally, we conclude that four or five oncogenic mutations are sufficient for the formation of glioma.
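
    The paper's central mechanism, randomly occurring oncogenic hits accumulating in a pool of stem cells, can be caricatured in a few lines; a toy Monte Carlo sketch with made-up cell numbers and mutation rates, not the published model:

```python
import numpy as np

rng = np.random.default_rng(42)
N_CELLS, RATE, HITS = 5000, 2e-3, 4     # assumed stem-cell pool, hits per cell-year, hits needed

def tumour_onset_ages(n_people):
    """Waiting time for one cell to collect HITS hits is gamma-distributed
    (sum of HITS exponential waits); onset age is the minimum over the pool."""
    per_cell = rng.gamma(shape=HITS, scale=1.0 / RATE, size=(n_people, N_CELLS))
    return per_cell.min(axis=1)

ages = tumour_onset_ages(1000)
for decade in range(40, 100, 10):
    frac = np.mean((ages >= decade) & (ages < decade + 10))
    print(f"onset in [{decade},{decade + 10}) years: {frac:.3%}")
```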

  15. [Renaissance of training in general surgery in Cambodia: a unique experience or reproducible model].

    Science.gov (United States)

    Dumurgier, C; Baulieux, J

    2005-01-01

    Is the new surgical training program at the University of Phnom Penh, Cambodia a unique experience, or can it serve as a model for developing countries? This report describes the encouraging first results of this didactic and hands-on surgical program. Based on their findings, the authors recommend not only continuing the program in Phnom Penh but also proposing slightly modified versions to new medical universities not currently offering specialization in surgery.

  16. Evaluation of Nitinol staples for the Lapidus arthrodesis in a reproducible biomechanical model

    Directory of Open Access Journals (Sweden)

    Nicholas Alexander Russell

    2015-12-01

    Full Text Available While the Lapidus procedure is a widely accepted technique for treatment of hallux valgus, the optimal fixation method to maintain joint stability remains controversial. The purpose of this study was to evaluate the biomechanical properties of new Shape Memory Alloy staples arranged in different configurations in a repeatable 1st tarsometatarsal arthrodesis model. Ten sawbones models of the whole foot (n=5 per group) were reconstructed using a single dorsal staple or two staples in a delta configuration. Each construct was mechanically tested in dorsal four-point bending, medial four-point bending, dorsal three-point bending and plantar cantilever bending with the staples activated at 37°C. The peak load, stiffness and plantar gapping were determined for each test. Pressure sensors were used to measure the contact force and area of the joint footprint in each group. There was a significant (p < 0.05) increase in peak load in the two-staple constructs compared to the single-staple constructs for all testing modalities. Stiffness also increased significantly in all tests except dorsal four-point bending. Pressure sensor readings showed a significantly higher contact force at time zero and contact area following loading in the two-staple constructs (p < 0.05). Both groups completely recovered any plantar gapping following unloading and restored their initial contact footprint. The biomechanical integrity and repeatability of the models was demonstrated with no construct failures due to hardware or model breakdown. Shape memory alloy staples provide fixation with the ability to dynamically apply and maintain compression across a simulated arthrodesis following a range of loading conditions.

  17. Can lagrangian models reproduce the migration time of European eel obtained from otolith analysis?

    Science.gov (United States)

    Rodríguez-Díaz, L.; Gómez-Gesteira, M.

    2017-12-01

    European eel can be found in the Bay of Biscay after a long migration across the Atlantic. The duration of migration, which takes place at the larval stage, is of primary importance for understanding eel ecology and, hence, its survival. This duration is still a controversial matter, since it can range from 7 months to more than 4 years depending on the estimation method. The minimum migration duration estimated from our Lagrangian model is similar to the duration obtained from the microstructure of eel otoliths, which is typically on the order of 7-9 months. The Lagrangian model proved sensitive to different conditions, such as spatial and temporal resolution, release depth, release area and initial distribution. In general, migration was faster when the release depth decreased and the model resolution increased. On average, the fastest migration was obtained when only advective horizontal movement was considered. However, in some cases even faster migration was obtained when locally oriented random migration was taken into account.
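
    A Lagrangian migration estimate of this kind amounts to integrating virtual particles through a velocity field; a minimal advection sketch in which an idealized steady flow stands in for the ocean-model currents:

```python
import numpy as np

def advect(x0, y0, u, v, dt_days=1.0, n_steps=400):
    """Euler-integrate one particle through a velocity field given in km/day."""
    traj = np.empty((n_steps + 1, 2))
    traj[0] = x0, y0
    for i in range(n_steps):
        x, y = traj[i]
        traj[i + 1] = x + u(x, y) * dt_days, y + v(x, y) * dt_days
    return traj

# Idealized eastward jet, standing in for the currents that carry eel larvae
u = lambda x, y: 30.0 * np.exp(-(y / 500.0) ** 2)   # km/day
v = lambda x, y: 5.0 * np.sin(x / 2000.0)
traj = advect(0.0, 100.0, u, v)
print(f"distance covered: {np.hypot(*(traj[-1] - traj[0])):.0f} km in 400 days")
```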

  18. Acute multi-sgRNA knockdown of KEOPS complex genes reproduces the microcephaly phenotype of the stable knockout zebrafish model.

    Directory of Open Access Journals (Sweden)

    Tilman Jobst-Schwan

    Full Text Available Until recently, morpholino oligonucleotides have been widely employed in zebrafish as an acute and efficient loss-of-function assay. However, off-target effects and reproducibility issues when compared to stable knockout lines have compromised their further use. Here we employed an acute CRISPR/Cas approach using multiple single guide RNAs simultaneously targeting different positions in two exemplar genes (osgep or tprkb) to increase the likelihood of generating mutations on both alleles in the injected F0 generation and to achieve a similar effect as morpholinos but with the reproducibility of stable lines. This multi-sgRNA approach resulted in median likelihoods for at least one mutation on each allele of >99% and sgRNA-specific insertion/deletion profiles as revealed by deep sequencing. Immunoblot showed a significant reduction for Osgep and Tprkb proteins. For both genes, the acute multi-sgRNA knockout recapitulated the microcephaly phenotype and reduction in survival that we observed previously in stable knockout lines, though milder in the acute multi-sgRNA knockout. Finally, we quantify the degree of mutagenesis by deep sequencing, and provide a mathematical model to quantitate the chance for a biallelic loss-of-function mutation. Our findings can be generalized to acute and stable CRISPR/Cas targeting for any zebrafish gene of interest.
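
    The quoted ">99% likelihood of at least one mutation on each allele" follows from combining independent per-guide efficiencies; a minimal sketch of that arithmetic (the per-allele efficiencies are assumed for illustration, not taken from the paper):

```python
def biallelic_lof_probability(efficiencies):
    """P(at least one mutation on each allele), guides independent across alleles."""
    p_escape = 1.0
    for e in efficiencies:
        p_escape *= 1.0 - e            # allele dodges every guide
    return (1.0 - p_escape) ** 2       # both alleles must be hit

# Four guides at an assumed 70% per-allele cutting efficiency each
print(f"{biallelic_lof_probability([0.7, 0.7, 0.7, 0.7]):.4f}")   # ~0.98
```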

  19. Realizing the Living Paper using the ProvONE Model for Reproducible Research

    Science.gov (United States)

    Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.

    2015-12-01

    Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on prior work of colleagues. The open source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software giving proper credit, with less repetition, and with confidence in the relationship to the original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first-class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise. The

  20. A discrete particle model reproducing collective dynamics of a bee swarm.

    Science.gov (United States)

    Bernardi, Sara; Colombi, Annachiara; Scianna, Marco

    2018-02-01

    In this article, we present a microscopic discrete mathematical model describing collective dynamics of a bee swarm. More specifically, each bee is set to move according to individual strategies and social interactions, the former involving the desire to reach a target destination, the latter accounting for repulsive/attractive stimuli and for alignment processes. The insects tend in fact to remain sufficiently close to the rest of the population, while avoiding collisions, and they are able to track and synchronize their movement to the flight of a given set of neighbors within their visual field. The resulting collective behavior of the bee cloud therefore emerges from non-local short/long-range interactions. Differently from similar approaches present in the literature, we here test different alignment mechanisms (i.e., based either on an Euclidean or on a topological neighborhood metric), which have an impact also on the other social components characterizing insect behavior. A series of numerical realizations then shows the phenomenology of the swarm (in terms of pattern configuration, collective productive movement, and flight synchronization) in different regions of the space of free model parameters (i.e., strength of attractive/repulsive forces, extension of the interaction regions). In this respect, constraints in the possible variations of such coefficients are here given both by reasonable empirical observations and by analytical results on some stability characteristics of the defined pairwise interaction kernels, which have to assure a realistic crystalline configuration of the swarm. An analysis of the effect of unconscious random fluctuations of bee dynamics is also provided. Copyright © 2018 Elsevier Ltd. All rights reserved.
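
    The ingredients listed above (attraction to a target, short-range repulsion, alignment with visible neighbours) map directly onto a per-bee velocity update; a minimal sketch with made-up coefficients and a plain Euclidean neighbourhood, not the authors' kernels:

```python
import numpy as np

rng = np.random.default_rng(3)
N, R_REP, R_ALI = 50, 1.0, 5.0              # bees; repulsion and alignment radii
W_TGT, W_REP, W_ALI = 0.3, 0.8, 0.5         # assumed interaction weights
pos = rng.normal(0.0, 3.0, (N, 2))
vel = rng.normal(0.0, 0.1, (N, 2))
target = np.array([50.0, 0.0])

def step(pos, vel, dt=0.1):
    new_vel = vel.copy()
    for i in range(N):
        d = pos - pos[i]                                # vectors to the other bees
        dist = np.linalg.norm(d, axis=1)
        to_target = (target - pos[i]) / np.linalg.norm(target - pos[i])
        new_vel[i] += W_TGT * dt * to_target            # individual strategy
        close, seen = (dist > 0) & (dist < R_REP), (dist > 0) & (dist < R_ALI)
        if close.any():
            new_vel[i] -= W_REP * dt * d[close].mean(axis=0)              # repulsion
        if seen.any():
            new_vel[i] += W_ALI * dt * (vel[seen].mean(axis=0) - vel[i])  # alignment
    return pos + new_vel * dt, 0.98 * new_vel           # mild drag bounds the speeds

for _ in range(500):
    pos, vel = step(pos, vel)
print("swarm centre after 500 steps:", pos.mean(axis=0).round(1))
```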

  1. Sprague-Dawley rats are a sustainable and reproducible animal model for induction and study of oral submucous fibrosis

    Directory of Open Access Journals (Sweden)

    Shilpa Maria

    2015-01-01

    Full Text Available Background: Oral submucous fibrosis (OSF) is a chronic debilitating disease predominantly affecting the oral cavity and oropharynx. Characteristic histological traits of OSF include epithelial atrophy, inflammation, and a generalized submucosal fibrosis. Several studies and epidemiological surveys provide substantial evidence that areca nut is the main etiological factor for OSF. Hesitance of patients to undergo biopsy procedures, together with clinicians becoming increasingly reluctant to take biopsies in cases of OSF, has prompted researchers to develop animal models to study the disease process. Materials and Methods: The present study evaluates the efficacy, sustainability, and reproducibility of using Sprague-Dawley (SD) rats as a possible model in the induction and progression of OSF. Buccal mucosa of SD rats was injected with areca nut and pan masala solutions on alternate days over a period of 48 weeks. The control group was treated with saline. The influence of areca nut and pan masala on the oral epithelium and connective tissue was evaluated by light microscopy. Results: Oral submucous fibrosis-like lesions were seen in both the areca nut and pan masala treated groups. The histological changes observed included atrophic epithelium, partial or complete loss of rete ridges, juxta-epithelial hyalinization, inflammation and accumulation of dense bundles of collagen fibers subepithelially. Conclusions: Histopathological changes in SD rats following treatment with areca nut and pan masala solutions bear a close resemblance to those seen in humans with OSF. SD rats appear to be a cheap, efficient, sustainable and reproducible model for the induction and development of OSF.

  2. The diverse broad-band light-curves of Swift GRBs reproduced with the cannonball model

    CERN Document Server

    Dado, Shlomo; De Rújula, A

    2009-01-01

    Two radiation mechanisms, inverse Compton scattering (ICS) and synchrotron radiation (SR), suffice within the cannonball (CB) model of long gamma ray bursts (LGRBs) and X-ray flashes (XRFs) to provide a very simple and accurate description of their observed prompt emission and afterglows. Simple as they are, the two mechanisms and the burst environment generate the rich structure of the light curves at all frequencies and times. This is demonstrated for 33 selected Swift LGRBs and XRFs, which are well sampled from early time until late time and well represent the entire diversity of the broad band light curves of Swift LGRBs and XRFs. Their prompt gamma-ray and X-ray emission is dominated by ICS of glory light. During their fast decline phase, ICS is taken over by SR which dominates their broad band afterglow. The pulse shape and spectral evolution of the gamma-ray peaks and the early-time X-ray flares, and even the delayed optical `humps' in XRFs, are correctly predicted. The canonical and non-canonical X-ra...

  3. Comparative analysis of 5 lung cancer natural history and screening models that reproduce outcomes of the NLST and PLCO trials.

    Science.gov (United States)

    Meza, Rafael; ten Haaf, Kevin; Kong, Chung Yin; Erdogan, Ayca; Black, William C; Tammemagi, Martin C; Choi, Sung Eun; Jeon, Jihyoun; Han, Summer S; Munshi, Vidit; van Rosmalen, Joost; Pinsky, Paul; McMahon, Pamela M; de Koning, Harry J; Feuer, Eric J; Hazelton, William D; Plevritis, Sylvia K

    2014-06-01

    The National Lung Screening Trial (NLST) demonstrated that low-dose computed tomography screening is an effective way of reducing lung cancer (LC) mortality. However, optimal screening strategies have not been determined to date and it is uncertain whether lighter smokers than those examined in the NLST may also benefit from screening. To address these questions, it is necessary to first develop LC natural history models that can reproduce NLST outcomes and simulate screening programs at the population level. Five independent LC screening models were developed using common inputs and calibration targets derived from the NLST and the Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO). Imputation of missing information regarding smoking, histology, and stage of disease for a small percentage of individuals and diagnosed LCs in both trials was performed. Models were calibrated to LC incidence, mortality, or both outcomes simultaneously. Initially, all models were calibrated to the NLST and validated against PLCO. Models were found to validate well against individuals in PLCO who would have been eligible for the NLST. However, all models required further calibration to PLCO to adequately capture LC outcomes in PLCO never-smokers and light smokers. Final versions of all models produced incidence and mortality outcomes in the presence and absence of screening that were consistent with both trials. The authors developed 5 distinct LC screening simulation models based on the evidence in the NLST and PLCO. The results of their analyses demonstrated that the NLST and PLCO have produced consistent results. The resulting models can be important tools to generate additional evidence to determine the effectiveness of lung cancer screening strategies using low-dose computed tomography. © 2014 American Cancer Society.

  4. A Reliable and Reproducible Model for Assessing the Effect of Different Concentrations of α-Solanine on Rat Bone Marrow Mesenchymal Stem Cells

    Directory of Open Access Journals (Sweden)

    Adriana Ordóñez-Vásquez

    2017-01-01

    Full Text Available Alpha-solanine (α-solanine) is a glycoalkaloid present in potato (Solanum tuberosum). It has been of particular interest because of its toxicity and potential teratogenic effects that include abnormalities of the central nervous system, such as exencephaly, encephalocele, and anophthalmia. Various types of cell culture have been used as experimental models to determine the effect of α-solanine on cell physiology. The morphological changes in mesenchymal stem cells upon exposure to α-solanine have not been established. This study aimed to describe a reliable and reproducible model for assessing the structural changes induced by exposure of mouse bone marrow mesenchymal stem cells (MSCs) to different concentrations of α-solanine for 24 h. The results demonstrate that nonlethal concentrations of α-solanine (2–6 μM) changed the morphology of the cells, including an increase in the number of nucleoli, suggesting elevated protein synthesis, and the formation of spicules. In addition, treatment with α-solanine reduced the number of adherent cells and the formation of colonies in culture. Immunophenotypic characterization and staining of MSCs are proposed as a reproducible method that allows description of cells exposed to the glycoalkaloid, α-solanine.

  5. Implementation of an experimental pilot reproducing the fouling of the exhaust gas recirculation system in diesel engines

    Directory of Open Access Journals (Sweden)

    Crepeau Gérald

    2012-04-01

    Full Text Available The European emission standards EURO 5 and EURO 6 define more stringent acceptable limits for exhaust emissions of new vehicles. The Exhaust Gas Recirculation (EGR) system is a partial but essential solution for lowering the emission of nitrogen oxides and soot particulates. Yet, due to a more intensive use than in the past, the fouling of the EGR system is increased. Ensuring the reliability of the EGR system becomes a main challenge. In partnership with PSA Peugeot Citroën, we designed an experimental setup that mimics an operating EGR system. Its distinctive features are (1) its ability to reproduce precisely the operating conditions and (2) its ability to measure the temperature field on the heat exchanger surface with an infrared camera, detecting in real time the evolution of the fouling deposit from its thermal resistance. Numerical codes are used in conjunction with this experimental setup to determine the evolution of the fouling thickness from its thermal resistance.
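
    The rig infers deposit growth from its thermal resistance; converting a measured fouling resistance into an equivalent deposit thickness is a one-line planar-wall approximation, shown here with an assumed soot-layer conductivity:

```python
K_DEPOSIT = 0.05   # W/(m K): assumed conductivity of an EGR soot/hydrocarbon layer

def fouling_thickness_mm(r_fouling):
    """Planar-wall approximation: thickness = R_f * k (r_fouling in m2 K/W)."""
    return r_fouling * K_DEPOSIT * 1000.0

for rf in (0.0005, 0.002, 0.005):
    print(f"R_f = {rf} m2K/W -> deposit ~ {fouling_thickness_mm(rf):.2f} mm")
```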

  6. AO group, AO subgroup, Garden and Pauwels classification systems of femoral neck fractures: are they reliable and reproducible?

    Science.gov (United States)

    Gašpar, Drago; Crnković, Tomislav; Durović, Dražen; Podsednik, Dinko; Slišurić, Ferdinand

    2012-08-01

    To determine which of the AO group, AO subgroup, Garden and Pauwels classification systems for femoral neck fractures is more reliable and reproducible in predicting the method of treatment, the radiological prediction of nonunion, and the prediction of outcomes. Five observers classified 77 randomly selected anterior-posterior (AP) and lateral view preoperative radiographs of femoral neck fractures according to the AO group, AO subgroup, Garden and Pauwels classification systems. The procedure was repeated on the same radiographs after three months. The first classification was used to calculate interobserver agreement by kappa value between observers, while the first and second classifications served to calculate the intraobserver kappa value for each examiner. The overall mean interobserver kappa values per classification system were: AO 0.44, AO subgroup 0.17, Garden 0.41 and Pauwels 0.19. The mean intraobserver kappa values were: AO group 0.56, AO subgroup 0.38, Garden 0.49 and Pauwels 0.38. Garden and AO group are useful for the division of femoral neck fractures into those without displacement and with displacement, but they are not for clinical use. AO subgroup and Pauwels classification are not recommended for further use.
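
    The kappa statistics reported above correct raw agreement for chance; a minimal sketch of Cohen's kappa for two observers (hypothetical grades, not the study's data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same cases."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n ** 2
    return (p_obs - p_exp) / (1.0 - p_exp)

obs1 = [1, 2, 2, 3, 4, 4, 3, 1, 2, 4]   # hypothetical Garden grades, observer 1
obs2 = [1, 2, 3, 3, 4, 3, 3, 1, 2, 4]   # hypothetical Garden grades, observer 2
print(f"kappa = {cohens_kappa(obs1, obs2):.2f}")   # 0.74 here
```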

  7. Can CFMIP2 models reproduce the leading modes of cloud vertical structure in the CALIPSO-GOCCP observations?

    Science.gov (United States)

    Wang, Fang; Yang, Song

    2018-02-01

    Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP), i.e. the tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds in middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclones and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models from phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by CFMIP2 models. The effects of CVS modes on relative cloud radiative forcing (RSCRF/RLCRF) (RSCRF being calculated at the surface while RLCRF at the top of atmosphere) are studied using the principal component regression method. Results show that CFMIP2 models tend to overestimate (underestimate or simulate with the opposite sign) the RSCRF/RLCRF radiative effects (REs) of ECCM (THCM and SACM) per unit global mean OPC compared to observations. These RE biases may be attributed to two factors: one is the underestimation (overestimation) of low/middle clouds (high clouds), equivalently stronger (weaker) REs per unit low/middle (high) cloud, in simulated global mean cloud profiles; the other is eigenvector biases in the CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to the improvement of CVS, especially cloud parameterization associated with particular physical processes (e.g. downwelling regimes with the Hadley circulation, extratropical storm tracks and others), which
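
    The analysis chain used here, an EOF/PC decomposition of cloud profiles followed by regression of cloud radiative forcing onto the leading PCs, is easy to prototype; a minimal sketch on synthetic data (not the CFMIP2/GOCCP processing):

```python
import numpy as np

rng = np.random.default_rng(7)
n_samples, n_levels = 500, 40
profiles = rng.normal(size=(n_samples, n_levels))    # stand-in cloud-fraction profiles

# EOFs from the SVD of the anomaly matrix; PCs are the projections onto them
anom = profiles - profiles.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
pcs = anom @ vt[:3].T                                # three leading modes
print("variance explained:", (s[:3] ** 2 / (s ** 2).sum()).round(3))

# Principal-component regression: radiative forcing regressed on the leading PCs
crf = pcs @ np.array([1.5, -0.7, 0.3]) + rng.normal(0.0, 0.5, n_samples)
coef, *_ = np.linalg.lstsq(pcs, crf - crf.mean(), rcond=None)
print("regression coefficients for PC1-3:", coef.round(2))
```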

  8. QSAR model reproducibility and applicability: a case study of rate constants of hydroxyl radical reaction models applied to polybrominated diphenyl ethers and (benzo-)triazoles.

    Science.gov (United States)

    Roy, Partha Pratim; Kovarich, Simona; Gramatica, Paola

    2011-08-01

    The crucial importance of the three central OECD principles for quantitative structure-activity relationship (QSAR) model validation is highlighted in a case study of tropospheric degradation of volatile organic compounds (VOCs) by OH, applied to two CADASTER chemical classes (PBDEs and (benzo-)triazoles). The application of any QSAR model to chemicals without experimental data largely depends on model reproducibility by the user. The reproducibility of an unambiguous algorithm (OECD Principle 2) is guaranteed by redeveloping MLR models based both on an updated version of the DRAGON software for molecular descriptor calculation and on some freely available online descriptors. The Genetic Algorithm has confirmed its ability to always select the most informative descriptors, independently of the input pool of variables. The ability of the GA-selected descriptors to model chemicals not used in model development is verified by three different splittings (random by response, K-ANN and K-means clustering), thus ensuring the external predictivity of the new models, independently of the training/prediction set composition (OECD Principle 5). The relevance of checking the structural applicability domain becomes very evident on comparing the predictions for CADASTER chemicals, using the new models proposed herein, with those obtained by EPI Suite. Copyright © 2011 Wiley Periodicals, Inc.
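
    The external-validation step that the OECD principles demand can be illustrated with an ordinary MLR model and a held-out prediction set; a minimal sketch with synthetic descriptors standing in for DRAGON output (a simple random split rather than the paper's three splitting schemes):

```python
import numpy as np

rng = np.random.default_rng(11)
n, p = 60, 6                                    # molecules, selected descriptors
X = rng.normal(size=(n, p))                     # stand-in for DRAGON-style descriptors
y = X @ rng.normal(size=p) + rng.normal(0.0, 0.3, n)   # synthetic -log k(OH)

idx = rng.permutation(n)                        # random training/prediction split
train, test = idx[:45], idx[45:]
Xt = np.column_stack([np.ones(train.size), X[train]])
beta, *_ = np.linalg.lstsq(Xt, y[train], rcond=None)   # MLR fit on the training set

y_hat = np.column_stack([np.ones(test.size), X[test]]) @ beta
press = np.sum((y[test] - y_hat) ** 2)
tss = np.sum((y[test] - y[train].mean()) ** 2)
print(f"external Q2 = {1.0 - press / tss:.3f}")        # predictivity on unseen chemicals
```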

  9. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle’s Capacity to Generate Force

    Science.gov (United States)

    Call, Jarrod A.; Lowe, Dawn A.

    2018-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allows researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury. PMID:27492161

  10. Eccentric Contraction-Induced Muscle Injury: Reproducible, Quantitative, Physiological Models to Impair Skeletal Muscle's Capacity to Generate Force.

    Science.gov (United States)

    Call, Jarrod A; Lowe, Dawn A

    2016-01-01

    In order to investigate the molecular and cellular mechanisms of muscle regeneration an experimental injury model is required. Advantages of eccentric contraction-induced injury are that it is a controllable, reproducible, and physiologically relevant model to cause muscle injury, with injury being defined as a loss of force generating capacity. While eccentric contractions can be incorporated into conscious animal study designs such as downhill treadmill running, electrophysiological approaches to elicit eccentric contractions and examine muscle contractility, for example before and after the injurious eccentric contractions, allows researchers to circumvent common issues in determining muscle function in a conscious animal (e.g., unwillingness to participate). Herein, we describe in vitro and in vivo methods that are reliable, repeatable, and truly maximal because the muscle contractions are evoked in a controlled, quantifiable manner independent of subject motivation. Both methods can be used to initiate eccentric contraction-induced injury and are suitable for monitoring functional muscle regeneration hours to days to weeks post-injury.

  11. A rat model of post-traumatic stress disorder reproduces the hippocampal deficits seen in the human syndrome

    Directory of Open Access Journals (Sweden)

    Sonal eGoswami

    2012-06-01

    Full Text Available Despite recent progress, the causes and pathophysiology of post-traumatic stress disorder (PTSD) remain poorly understood, partly because of ethical limitations inherent to human studies. One approach to circumvent this obstacle is to study PTSD in a valid animal model of the human syndrome. In one such model, extreme and long-lasting behavioral manifestations of anxiety develop in a subset of Lewis rats after exposure to an intense predatory threat that mimics the type of life-and-death situation known to precipitate PTSD in humans. This study aimed to assess whether the hippocampus-associated deficits observed in the human syndrome are reproduced in this rodent model. Prior to predatory threat, different groups of rats were each tested on one of three object recognition memory tasks that varied in the types of contextual cues (i.e., requiring the hippocampus or not) the rats could use to identify novel items. After task completion, the rats were subjected to predatory threat and, one week later, tested on the elevated plus maze. Based on their exploratory behavior in the plus maze, rats were then classified as resilient or PTSD-like and their performance on the pre-threat object recognition tasks compared. The performance of PTSD-like rats was inferior to that of resilient rats, but only when subjects relied on an allocentric frame of reference to identify novel items, a process thought to be critically dependent on the hippocampus. Therefore, these results suggest that even prior to trauma, PTSD-like rats show a deficit in hippocampal-dependent functions, as reported in twin studies of human PTSD.

  12. Reproducing the organic matter model of anthropogenic dark earth of Amazonia and testing the ecotoxicity of functionalized charcoal compounds

    Directory of Open Access Journals (Sweden)

    Carolina Rodrigues Linhares

    2012-05-01

    Full Text Available The objective of this work was to obtain organic compounds similar to the ones found in the organic matter of anthropogenic dark earth of Amazonia (ADE) using a chemical functionalization procedure on activated charcoal, as well as to determine their ecotoxicity. Based on the study of the organic matter from ADE, an organic model was proposed and an attempt to reproduce it was described. Activated charcoal was oxidized with sodium hypochlorite at different concentrations. Nuclear magnetic resonance was performed to verify whether the spectra of the obtained products were similar to those of humic acids from ADE. The similarity between spectra indicated that the obtained products were polycondensed aromatic structures with carboxyl groups: a soil amendment that can contribute to soil fertility and to its sustainable use. An ecotoxicological test with Daphnia similis was performed on the more soluble fraction (fulvic acids) of the produced soil amendment. Aryl chloride was formed during the synthesis of the organic compounds from activated charcoal functionalization and partially removed through a purification process. However, it is probable that some aryl chloride remained in the final product, since the ecotoxicological test indicated that the chemically functionalized soil amendment is moderately toxic.

  13. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  14. Reproducibility and accuracy of linear measurements on dental models derived from cone-beam computed tomography compared with digital dental casts

    NARCIS (Netherlands)

    Waard, O. de; Rangel, F.A.; Fudalej, P.S.; Bronkhorst, E.M.; Kuijpers-Jagtman, A.M.; Breuning, K.H.

    2014-01-01

    INTRODUCTION: The aim of this study was to determine the reproducibility and accuracy of linear measurements on 2 types of dental models derived from cone-beam computed tomography (CBCT) scans: CBCT images, and Anatomodels (InVivoDental, San Jose, Calif); these were compared with digital models

  15. [Reproducing and evaluating a rabbit model of multiple organ dysfunction syndrome after cardiopulmonary resuscitation resulted from asphyxia].

    Science.gov (United States)

    Zhang, Dong; Li, Nan; Chen, Ying; Wang, Yu-shan

    2013-02-01

    To evaluate the reproduction of a model of post-resuscitation multiple organ dysfunction syndrome (PR-MODS) after cardiac arrest (CA) in the rabbit, in order to provide new methods for post-CA treatment. Thirty-five rabbits were randomly divided into three groups: the sham group (n=5), the 7-minute asphyxia group (n=15), and the 8-minute asphyxia group (n=15). The asphyxia CA model was reproduced with tracheal occlusion. After cardiopulmonary resuscitation (CPR), the rate of return of spontaneous circulation (ROSC), the mortality at different time points and the incidence of systemic inflammatory response syndrome (SIRS) were observed in the two asphyxia groups. Creatine kinase isoenzyme (CK-MB), alanine aminotransferase (ALT), creatinine (Cr), glucose (Glu) and arterial partial pressure of oxygen (PaO2) levels in blood were measured in the two asphyxia groups before CPR and 12, 24 and 48 hours after ROSC. The surviving rabbits were euthanized at 48 hours after ROSC, and the heart, brain, lung, kidney, liver, and intestine were harvested for pathological examination under light microscopy. PR-MODS after CA was defined based on the function of the main organs and their pathological changes. (1) The incidence of ROSC was 100.0% in the 7-minute asphyxia group and 86.7% in the 8-minute asphyxia group, respectively (P>0.05). The 6-hour mortality in the 8-minute asphyxia group was significantly higher than that in the 7-minute asphyxia group (46.7% vs. 6.7%, P<0.05). (2) There was a variety of organ dysfunctions in surviving rabbits after ROSC, including chemosis, respiratory distress, hypotension, abdominal distension, weakened or absent bowel peristalsis and oliguria. (3) There was no SIRS or associated changes in major organ function in the sham group. SIRS was observed at 12-24 hours after ROSC in the two asphyxia groups. CK-MB was increased significantly at 12 hours after ROSC compared with that before asphyxia (7-minute asphyxia group: 786.88±211.84 U/L vs. 468.20±149.45 U/L, 8

  16. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, and evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produces published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which are relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  17. The use of a robotic tibial rotation device and an electromagnetic tracking system to accurately reproduce the clinical dial test.

    Science.gov (United States)

    Stinton, S K; Siebold, R; Freedberg, H; Jacobs, C; Branch, T P

    2016-03-01

    The purpose of this study was to: (1) determine whether a robotic tibial rotation device and an electromagnetic tracking system could accurately reproduce the clinical dial test at 30° of knee flexion; (2) compare rotation data captured at the footplates of the robotic device to tibial rotation data measured using an electromagnetic sensor on the proximal tibia. Thirty-two unilateral ACL-reconstructed patients were examined using a robotic tibial rotation device that mimicked the dial test. The data reported in this study are only from the healthy legs of these patients. Torque was applied through footplates and was measured using servomotors. Lower leg motion was measured at the foot using the motors. Tibial motion was also measured through an electromagnetic tracking system and a sensor on the proximal tibia. Load-deformation curves representing rotational motion of the foot and tibia were compared using Pearson's correlation coefficients. Off-axis motions, including medial-lateral translation and anterior-posterior translation, were also measured using the electromagnetic system. The robotic device and electromagnetic system were able to provide axial rotation data and translational data for the tibia during the dial test. Motion measured at the foot was not correlated with motion of the tibial tubercle in internal rotation or in external rotation. The position of the tibial tubercle was 26.9° ± 11.6° more internally rotated than the foot at a torque of 0 Nm. Medial-lateral translation and anterior-posterior translation were combined to show the path of the tubercle in the coronal plane during tibial rotation. The information captured during a manual dial test includes both rotation of the tibia and proximal tibia translation. All of this information can be captured using a robotic tibial axial rotation device with an electromagnetic tracking system. The pathway of the tibial tubercle during tibial axial rotation can provide additional information about knee

  18. Attempting to train a digital human model to reproduce human subject reach capabilities in an ejection seat aircraft

    NARCIS (Netherlands)

    Zehner, G.F.; Hudson, J.A.; Oudenhuijzen, A.

    2006-01-01

    From 1997 through 2002, the Air Force Research Lab and TNO Defence, Security and Safety (Business Unit Human Factors) were involved in a series of tests to quantify the accuracy of five Human Modeling Systems (HMSs) in determining accommodation limits of ejection seat aircraft. The results of these

  19. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility; specifically, it asks whether bitwise reproducible computation is possible, whether computational research in DOE improves its publication process, and whether reproducible results can be achieved apart from the peer review process.

  20. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected from examination to examination or reproducibility of results is very poor. On the other hand, a defect can be detected on each of subsequent examinations for higher reliability and still have poor reproducibility of results

  1. Opening Reproducible Research

    Science.gov (United States)

    Nüst, Daniel; Konkol, Markus; Pebesma, Edzer; Kray, Christian; Klötgen, Stephanie; Schutzeichel, Marc; Lorenz, Jörg; Przibytzin, Holger; Kussmann, Dirk

    2016-04-01

    Open access is not only a form of publishing such that research papers become available to the large public free of charge, it also refers to a trend in science that the act of doing research becomes more open and transparent. When science transforms to open access we not only mean access to papers, research data being collected, or data being generated, but also access to the data used and the procedures carried out in the research paper. Increasingly, scientific results are generated by numerical manipulation of data that were already collected, and may involve simulation experiments that are completely carried out computationally. Reproducibility of research findings, the ability to repeat experimental procedures and confirm previously found results, is at the heart of the scientific method (Pebesma, Nüst and Bivand, 2012). As opposed to the collection of experimental data in labs or nature, computational experiments lend themselves very well for reproduction. Some of the reasons why scientists do not publish data and computational procedures that allow reproduction will be hard to change, e.g. privacy concerns in the data, fear of embarrassment or of losing a competitive advantage. Other reasons, however, involve technical aspects, and include the lack of standard procedures to publish such information and the lack of benefits after publishing them. We aim to resolve these two technical aspects. We propose a system that supports the evolution of scientific publications from static papers into dynamic, executable research documents. The DFG-funded experimental project Opening Reproducible Research (ORR) aims for the main aspects of open access, by improving the exchange of, by facilitating productive access to, and by simplifying reuse of research results that are published over the Internet. Central to the project is a new form for creating and providing research results, the executable research compendium (ERC), which not only enables third parties to

  2. Reproducibility of Quantitative Brain Imaging Using a PET-Only and a Combined PET/MR System

    Directory of Open Access Journals (Sweden)

    Martin L. Lassen

    2017-07-01

    The purpose of this study was to test the feasibility of migrating a quantitative brain imaging protocol from a positron emission tomography (PET)-only system to an integrated PET/MR system. Potential differences in both absolute radiotracer concentration as well as in the derived kinetic parameters as a function of PET system choice have been investigated. Five healthy volunteers underwent dynamic (R)-[11C]verapamil imaging on the same day using a GE-Advance (PET-only) and a Siemens Biograph mMR system (PET/MR). PET-emission data were reconstructed using a transmission-based attenuation correction (AC) map (PET-only), whereas a standard MR-DIXON as well as a low-dose CT AC map was applied to PET/MR emission data. Kinetic modeling based on arterial blood sampling was performed using a 1-tissue-2-rate-constant compartment model, yielding kinetic parameters (K1 and k2) and distribution volume (VT). Differences for parametric values obtained in the PET-only and the PET/MR systems were analyzed using a 2-way Analysis of Variance (ANOVA). Comparison of DIXON-based AC (PET/MR) with emission data derived from the PET-only system revealed average inter-system differences of −33 ± 14% (p < 0.05) for the K1 parameter and −19 ± 9% (p < 0.05) for k2. Using a CT-based AC for PET/MR resulted in slightly lower systematic differences of −16 ± 18% for K1 and −9 ± 10% for k2. The average differences in VT were −18 ± 10% (p < 0.05) for DIXON- and −8 ± 13% for CT-based AC. Significant systematic differences were observed for kinetic parameters derived from emission data obtained from PET/MR and PET-only imaging due to the different standard AC methods employed. Therefore, a transfer of imaging protocols from PET-only to PET/MR systems is not straightforward without application of proper correction methods. Clinical Trial Registration: www.clinicaltrialsregister.eu, identifier 2013-001724-19
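
    A minimal sketch may help make the kinetic analysis above concrete. The fragment below fits a 1-tissue-2-rate-constant compartment model, in which the tissue curve is the convolution of the arterial input function with K1*exp(-k2*t) and the distribution volume is VT = K1/k2. The synthetic input function and all numerical values are illustrative assumptions, not data or code from the study.

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.linspace(0, 60, 241)           # minutes, uniform sampling grid
    cp = 10.0 * t * np.exp(-t / 2.0)      # hypothetical arterial input function

    def one_tissue_model(t, K1, k2):
        """Tissue activity of a 1T2k model via discrete convolution."""
        dt = t[1] - t[0]
        irf = K1 * np.exp(-k2 * t)                  # impulse response function
        return np.convolve(cp, irf)[:len(t)] * dt   # causal convolution

    # Simulate noisy "measured" data, then recover K1, k2 and VT = K1/k2.
    rng = np.random.default_rng(0)
    ct_meas = one_tissue_model(t, 0.12, 0.10) + rng.normal(0, 0.05, len(t))
    (K1, k2), _ = curve_fit(one_tissue_model, t, ct_meas, p0=[0.1, 0.1])
    print(f"K1={K1:.3f}/min, k2={k2:.3f}/min, VT={K1 / k2:.2f}")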

  3. A sensitive and reproducible in vivo imaging mouse model for evaluation of drugs against late-stage human African trypanosomiasis.

    Science.gov (United States)

    Burrell-Saward, Hollie; Rodgers, Jean; Bradley, Barbara; Croft, Simon L; Ward, Theresa H

    2015-02-01

    To optimize the Trypanosoma brucei brucei GVR35 VSL-2 bioluminescent strain as an innovative drug evaluation model for late-stage human African trypanosomiasis. An IVIS® Lumina II imaging system was used to detect bioluminescent T. b. brucei GVR35 parasites in mice to evaluate parasite localization and disease progression. Drug treatment was assessed using qualitative bioluminescence imaging and real-time quantitative PCR (qPCR). We have shown that drug dose-response can be evaluated using bioluminescence imaging and confirmed quantification of tissue parasite load using qPCR. The model was also able to detect drug relapse earlier than the traditional blood film detection and even in the absence of any detectable peripheral parasites. We have developed and optimized a new, efficient method to evaluate novel anti-trypanosomal drugs in vivo and reduce the current 180 day drug relapse experiment to a 90 day model. The non-invasive in vivo imaging model reduces the time required to assess preclinical efficacy of new anti-trypanosomal drugs. © The Author 2014. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  4. Dynamic contrast-enhanced computed tomography in metastatic nasopharyngeal carcinoma: reproducibility analysis and observer variability of the distributed parameter model.

    Science.gov (United States)

    Ng, Quan-Sing; Thng, Choon Hua; Lim, Wan Teck; Hartono, Septian; Thian, Yee Liang; Lee, Puor Sherng; Tan, Daniel Shao-Weng; Tan, Eng Huat; Koh, Tong San

    2012-01-01

    To determine the reproducibility and observer variability of distributed parameter analysis of dynamic contrast-enhanced computed tomography (DCE-CT) data in metastatic nasopharyngeal carcinoma, and to compare 2 approaches of region-of-interest (ROI) analysis. Following ethical approval and informed consent, 17 patients with nasopharyngeal carcinoma underwent paired DCE-CT examinations on a 64-detector scanner, measuring tumor blood flow (F, mL/100 mL/min), permeability surface area product (PS, mL/100 mL/min), fractional intravascular blood volume (v1, mL/100 mL), and fractional extracellular-extravascular volume (v2, mL/100 mL). Tumor parameters were derived by fitting (i) the ROI-averaged concentration-time curve, and (ii) the median value of parameters from voxel-level concentration-time curves. Measurement reproducibility and inter- and intraobserver variability were estimated using Bland-Altman statistics. Mean F, PS, v1, and v2 were 44.9, 20.4, 7.1, and 34.1 for ROI analysis, and 49.0, 18.7, 6.7, and 34.0 for voxel analysis, respectively. Within-subject coefficients of variation were 38.8%, 49.5%, 54.2%, and 35.9% for ROI analysis, and 15.0%, 35.1%, 33.0%, and 21.0% for voxel analysis, respectively. Repeatability coefficients were 48.2, 28.0, 10.7, and 33.9 for ROI analysis, and 20.3, 18.2, 6.1, and 19.8 for voxel analysis, respectively. Intra- and interobserver correlation coefficients ranged from 0.94 to 0.97 and 0.90 to 0.95 for voxel analysis, and from 0.73 to 0.87 and 0.72 to 0.94 for ROI analysis, respectively. Measurements of F and v2 appear more reproducible than PS and v1. Voxel-level analysis improves both reproducibility and observer variability compared with ROI-averaged analysis and may retain information about tumor spatial heterogeneity.
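
    The agreement statistics reported above (Bland-Altman bias and limits of agreement, within-subject coefficient of variation, repeatability coefficient) follow directly from paired test-retest values. A minimal Python sketch with invented blood-flow values, not the study's data:

    import numpy as np

    scan1 = np.array([40.1, 52.3, 38.7, 61.0, 45.5])   # hypothetical F, visit 1
    scan2 = np.array([43.8, 49.0, 41.2, 55.9, 47.1])   # hypothetical F, visit 2

    d = scan2 - scan1
    bias = d.mean()
    loa = (bias - 1.96 * d.std(ddof=1), bias + 1.96 * d.std(ddof=1))

    # Within-subject SD from paired replicates: wSD^2 = mean(d^2) / 2.
    wsd = np.sqrt(np.mean(d ** 2) / 2.0)
    wcv = wsd / np.concatenate([scan1, scan2]).mean()   # within-subject CV
    rc = 1.96 * np.sqrt(2.0) * wsd                      # repeatability coefficient

    print(f"bias={bias:.2f}, LoA=({loa[0]:.2f}, {loa[1]:.2f}), "
          f"wCV={100 * wcv:.1f}%, RC={rc:.2f}")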

  5. Developing a Collection of Composable Data Translation Software Units to Improve Efficiency and Reproducibility in Ecohydrologic Modeling Workflows

    Science.gov (United States)

    Olschanowsky, C.; Flores, A. N.; FitzGerald, K.; Masarik, M. T.; Rudisill, W. J.; Aguayo, M.

    2017-12-01

    Dynamic models of the spatiotemporal evolution of water, energy, and nutrient cycling are important tools to assess impacts of climate and other environmental changes on ecohydrologic systems. These models require spatiotemporally varying environmental forcings like precipitation, temperature, humidity, windspeed, and solar radiation. These input data originate from a variety of sources, including global and regional weather and climate models, global and regional reanalysis products, and geostatistically interpolated surface observations. Data translation measures, often subsetting in space and/or time and transforming and converting variable units, represent a seemingly mundane but critical step in application workflows. Translation steps can introduce errors and misrepresentations of data, slow execution, and interrupt data provenance. We leverage a workflow that subsets a large regional dataset derived from the Weather Research and Forecasting (WRF) model and prepares inputs to the ParFlow integrated hydrologic model to demonstrate the impact of translation-tool software quality on scientific workflow results and performance. We propose that such workflows will benefit from a community-approved collection of data transformation components. The components should be self-contained, composable units of code. This design pattern enables automated parallelization and software verification, improving performance and reliability. Ensuring that individual translation components are self-contained and target minute tasks increases reliability. The small code size of each component enables effective unit and regression testing. The components can be automatically composed for efficient execution. An efficient data translation framework should be written to minimize data movement. Composing components within a single streaming process reduces data movement. Each component will typically have a low arithmetic intensity, meaning that it requires about the same number of
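
    As a sketch of the design pattern argued for above, the fragment below composes small, self-contained translation units into a single streaming pipeline, so records flow through every step without intermediate files. The unit names (subset_time, kelvin_to_celsius) and the record layout are invented for illustration; they are not components of the actual workflow.

    from typing import Callable, Iterable, Iterator

    Record = dict  # e.g. {"time": ..., "temp": ...}

    def subset_time(t0: float, t1: float) -> Callable[[Iterable[Record]], Iterator[Record]]:
        """Translation unit: keep records whose time falls in [t0, t1)."""
        def step(records: Iterable[Record]) -> Iterator[Record]:
            return (r for r in records if t0 <= r["time"] < t1)
        return step

    def kelvin_to_celsius(records: Iterable[Record]) -> Iterator[Record]:
        """Translation unit: convert the temperature field in place."""
        for r in records:
            yield {**r, "temp": r["temp"] - 273.15}

    def compose(*steps):
        """Chain translation units into one streaming pipeline."""
        def pipeline(records):
            for step in steps:
                records = step(records)
            return records
        return pipeline

    forcing = [{"time": t, "temp": 280.0 + t} for t in range(10)]
    translate = compose(subset_time(2, 6), kelvin_to_celsius)
    for rec in translate(forcing):
        print(rec)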

  6. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests, and we describe experimental designs and enumerate the controls needed to improve reproducibility in the investigation and reporting of behavioral phenotypes.

  7. Reproducibility study of [{sup 18}F]FPP(RGD){sub 2} uptake in murine models of human tumor xenografts

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Edwin; Liu, Shuangdong; Chin, Frederick; Cheng, Zhen [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Gowrishankar, Gayatri; Yaghoubi, Shahriar [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Wedgeworth, James Patrick [Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Berndorff, Dietmar; Gekeler, Volker [Bayer Schering Pharma AG, Global Drug Discovery, Berlin (Germany); Gambhir, Sanjiv S. [Stanford University, Molecular Imaging Program at Stanford, Department of Radiology, School of Medicine, Stanford, CA (United States); Stanford University, Molecular Imaging Program at Stanford, Department of Bioengineering, School of Medicine, Stanford, CA (United States); Canary Center at Stanford for Cancer Early Detection, Nuclear Medicine, Departments of Radiology and Bioengineering, Molecular Imaging Program at Stanford, Stanford, CA (United States)

    2011-04-15

    An {sup 18}F-labeled PEGylated arginine-glycine-aspartic acid (RGD) dimer [{sup 18}F]FPP(RGD){sub 2} has been used to image tumor {alpha}{sub v}{beta}{sub 3} integrin levels in preclinical and clinical studies. Serial positron emission tomography (PET) studies may be useful for monitoring antiangiogenic therapy response or for drug screening; however, the reproducibility of serial scans has not been determined for this PET probe. The purpose of this study was to determine the reproducibility of the integrin {alpha}{sub v}{beta}{sub 3}-targeted PET probe, [{sup 18}F ]FPP(RGD){sub 2} using small animal PET. Human HCT116 colon cancer xenografts were implanted into nude mice (n = 12) in the breast and scapular region and grown to mean diameters of 5-15 mm for approximately 2.5 weeks. A 3-min acquisition was performed on a small animal PET scanner approximately 1 h after administration of [{sup 18}F]FPP(RGD){sub 2} (1.9-3.8 MBq, 50-100 {mu}Ci) via the tail vein. A second small animal PET scan was performed approximately 6 h later after reinjection of the probe to assess for reproducibility. Images were analyzed by drawing an ellipsoidal region of interest (ROI) around the tumor xenograft activity. Percentage injected dose per gram (%ID/g) values were calculated from the mean or maximum activity in the ROIs. Coefficients of variation and differences in %ID/g values between studies from the same day were calculated to determine the reproducibility. The coefficient of variation (mean {+-}SD) for %ID{sub mean}/g and %ID{sub max}/g values between [{sup 18}F]FPP(RGD){sub 2} small animal PET scans performed 6 h apart on the same day were 11.1 {+-} 7.6% and 10.4 {+-} 9.3%, respectively. The corresponding differences in %ID{sub mean}/g and %ID{sub max}/g values between scans were -0.025 {+-} 0.067 and -0.039 {+-} 0.426. Immunofluorescence studies revealed a direct relationship between extent of {alpha}{sub {nu}}{beta}{sub 3} integrin expression in tumors and tumor vasculature
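
    The reproducibility metric used above reduces to simple arithmetic: %ID/g is the ROI activity concentration divided by the injected dose, and the per-tumor coefficient of variation is the SD of the scan pair divided by its mean. A sketch with invented values, ignoring decay correction:

    import numpy as np

    # hypothetical ROI activity (kBq/g) and injected dose (kBq) per animal
    roi_scan1 = np.array([55.0, 61.2, 48.9, 70.3])
    roi_scan2 = np.array([58.1, 57.9, 52.4, 66.8])
    injected = np.array([2800.0, 3100.0, 2600.0, 3500.0])

    pid1 = 100.0 * roi_scan1 / injected          # %ID/g, first scan
    pid2 = 100.0 * roi_scan2 / injected          # %ID/g, second scan

    pairs = np.stack([pid1, pid2])
    cov = pairs.std(axis=0, ddof=1) / pairs.mean(axis=0)   # per-tumor CoV
    print(f"CoV per tumor: {np.round(100 * cov, 1)} %")
    print(f"mean CoV: {100 * cov.mean():.1f} +/- {100 * cov.std(ddof=1):.1f} %")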

  8. Enzymatic profiles of Enterobacter sakazakii and related species with special reference to the alpha-glucosidase reaction and reproducibility of the test system.

    Science.gov (United States)

    Muytjens, H L; van der Ros-van de Repe, J; van Druten, H A

    1984-01-01

    The enzymatic profiles of Enterobacter sakazakii, Enterobacter cloacae, Enterobacter aerogenes, and Enterobacter agglomerans were determined with the API ZYM system (API System S.A., La Balme Les Grottes, France). Each assay was performed three times. A simple formula was derived and applied to assess the reproducibility of the API ZYM tests. In addition, a separate alpha-glucosidase test was performed. All E. sakazakii isolates produced alpha-glucosidase, in contrast to the other Enterobacter isolates. No phosphoamidase activity was detected in any of the E. sakazakii isolates, whereas it was present in 72% of E. cloacae, 89% of E. agglomerans, and 100% of E. aerogenes isolates. It was concluded that detection of alpha-glucosidase permits rapid and reliable differentiation between E. sakazakii and other Enterobacter species. The reproducibilities of alpha-glucosidase and phosphoamidase reactions were estimated to be 89 and 81%, respectively. PMID:6386874

  9. A reliable and reproducible method for the lipase assay in an AOT/isooctane reversed micellar system: modification of the copper-soap colorimetric method.

    Science.gov (United States)

    Kwon, Chang Woo; Park, Kyung-Min; Choi, Seung Jun; Chang, Pahn-Shick

    2015-09-01

    The copper-soap method, which is based on the absorbance of a fatty acid-copper complex at 715 nm, is a widely used colorimetric assay to determine lipase activity in reversed micellar systems. However, the absorbance of the bis(2-ethylhexyl) sodium sulfosuccinate (AOT)-copper complex prevents its use in an AOT/isooctane reversed micellar system. An extraction step was therefore added to the original procedure to remove AOT and eliminate interference from the AOT-copper complex. Among the solvents tested, acetonitrile was determined to be the most suitable because it allows for the generation of a reproducible calibration curve with oleic acid that is independent of the AOT concentration. Based on the validation data, the modified method, which does not suffer interference from the AOT-copper complex, could be a useful method with enhanced accuracy and reproducibility for the lipase assay. Copyright © 2015 Elsevier Ltd. All rights reserved.
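
    The calibration step described above amounts to a linear fit of absorbance at 715 nm against known oleic acid standards, inverted to quantify released fatty acid in samples. A minimal sketch; all readings below are invented, not values from the paper:

    import numpy as np

    oleic_umol = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # standards
    absorbance = np.array([0.02, 0.11, 0.21, 0.40, 0.79])   # A715 readings

    slope, intercept = np.polyfit(oleic_umol, absorbance, 1)

    def fatty_acid_released(a715: float) -> float:
        """Invert the calibration line: micromoles of free fatty acid."""
        return (a715 - intercept) / slope

    print(f"slope={slope:.3f} A/umol, intercept={intercept:.3f}")
    print(f"sample at A715=0.33 -> {fatty_acid_released(0.33):.2f} umol")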

  10. Synchronized mammalian cell culture: part II--population ensemble modeling and analysis for development of reproducible processes.

    Science.gov (United States)

    Jandt, Uwe; Barradas, Oscar Platas; Pörtner, Ralf; Zeng, An-Ping

    2015-01-01

    The consideration of inherent population inhomogeneities of mammalian cell cultures becomes increasingly important for systems biology studies and for developing more stable and efficient processes. However, variations of cellular properties belonging to different sub-populations, and their potential effects on cellular physiology and the kinetics of culture productivity under bioproduction conditions, have not yet been a major focus of research. Culture heterogeneity is strongly determined by the advance of the cell cycle. The assignment of cell-cycle-specific cellular variations to large-scale process conditions can be optimally determined based on the combination of (partially) synchronized cultivation under otherwise physiological conditions and subsequent population-resolved model adaptation. The first step has been achieved using the physical selection method of countercurrent flow centrifugal elutriation, recently established in our group for different mammalian cell lines, as presented in Part I of this paper series. In this second part, we demonstrate the successful adaptation and application of a cell-cycle-dependent population balance ensemble model to describe and understand synchronized bioreactor cultivations performed with two model mammalian cell lines, AGE1.HNAAT and CHO-K1. Numerical adaptation of the model to experimental data allows for detection of phase-specific parameters and for determination of significant variations between different phases and different cell lines. It shows that special care must be taken with regard to the sampling frequency in such oscillation cultures to minimize phase shift (jitter) artifacts. Based on predictions of the long-term oscillation behavior of a culture depending on its start conditions, optimal elutriation setup trade-offs between high cell yields and high synchronization efficiency are proposed. © 2014 American Institute of Chemical Engineers.

  11. Pangea breakup and northward drift of the Indian subcontinent reproduced by a numerical model of mantle convection.

    Science.gov (United States)

    Yoshida, Masaki; Hamano, Yozo

    2015-02-12

    Since around 200 Ma, the most notable event in the process of the breakup of Pangea has been the high speed (up to 20 cm yr(-1)) of the northward drift of the Indian subcontinent. Our numerical simulations of 3-D spherical mantle convection approximately reproduced the process of continental drift from the breakup of Pangea at 200 Ma to the present-day continental distribution. These simulations revealed that a major factor in the northward drift of the Indian subcontinent was the large-scale cold mantle downwelling that developed spontaneously in the North Tethys Ocean, attributed to the overall shape of Pangea. The strong lateral mantle flow caused by the high-temperature anomaly beneath Pangea, due to the thermal insulation effect, enhanced the acceleration of the Indian subcontinent during the early stage of the Pangea breakup. The large-scale hot upwelling plumes from the lower mantle, initially located under Africa, might have contributed to the formation of the large-scale cold mantle downwelling in the North Tethys Ocean.

  12. Three-dimensional surgical modelling with an open-source software protocol: study of precision and reproducibility in mandibular reconstruction with the fibula free flap.

    Science.gov (United States)

    Ganry, L; Quilichini, J; Bandini, C M; Leyder, P; Hersant, B; Meningaud, J P

    2017-08-01

    Very few surgical teams currently use totally independent and free solutions to perform three-dimensional (3D) surgical modelling for osseous free flaps in reconstructive surgery. This study assessed the precision and technical reproducibility of a 3D surgical modelling protocol using free open-source software in mandibular reconstruction with fibula free flaps and surgical guides. Precision was assessed through comparisons of the 3D surgical guide to the sterilized 3D-printed guide, determining accuracy to the millimetre level. Reproducibility was assessed in three surgical cases by volumetric comparison to the millimetre level. For the 3D surgical modelling, a difference of less than 0.1 mm was observed, and almost no deformations (<0.1 mm) were found; the precision of the free flap modelling was between 0.1 mm and 0.4 mm, and the average precision of the complete reconstructed mandible was less than 1 mm. The open-source software protocol demonstrated high accuracy without complications. However, the precision of the surgical case depends on the surgeon's 3D surgical modelling. Therefore, surgeons need training in the use of this protocol before applying it to surgical cases; this constitutes a limitation. Further studies should address the transfer of expertise. Copyright © 2017 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. The Proximal Medial Sural Nerve Biopsy Model: A Standardised and Reproducible Baseline Clinical Model for the Translational Evaluation of Bioengineered Nerve Guides

    Directory of Open Access Journals (Sweden)

    Ahmet Bozkurt

    2014-01-01

    Autologous nerve transplantation (ANT) is the clinical gold standard for the reconstruction of peripheral nerve defects. A large number of bioengineered nerve guides have been tested under laboratory conditions as an alternative to the ANT. The step from experimental studies to the implementation of the device in the clinical setting is often substantial and the outcome is unpredictable. This is mainly linked to the heterogeneity of clinical peripheral nerve injuries, which is very different from standardized animal studies. In search of a reproducible human model for the implantation of bioengineered nerve guides, we propose the reconstruction of sural nerve defects after routine nerve biopsy as a first or baseline study. Our concept uses the medial sural nerve of patients undergoing diagnostic nerve biopsy (≥2 cm). The biopsy-induced nerve gap was immediately reconstructed by implantation of the novel microstructured nerve guide, Neuromaix, as part of an ongoing first-in-human study. Here we present (i) a detailed list of inclusion and exclusion criteria, (ii) a detailed description of the surgical procedure, and (iii) a follow-up concept with multimodal sensory evaluation techniques. The proximal medial sural nerve biopsy model can serve as a preliminary or baseline nerve lesion model. In a subsequent step, newly developed nerve guides could be tested in more unpredictable and challenging clinical peripheral nerve lesions (e.g., following trauma), which have reduced comparability due to the different nature of the injuries (e.g., site of injury and length of nerve gap).

  14. Reproducibility in Seismic Imaging

    Directory of Open Access Journals (Sweden)

    González-Verdejo O.

    2012-04-01

    Within the field of exploration seismology, there is interest at the national level in integrating reproducibility into applied, educational, and research activities related to seismic processing and imaging. This reproducibility implies the description and organization of the elements involved in numerical experiments, so that a researcher, teacher, or student can study, verify, repeat, and modify them independently. In this work, we document and adapt reproducibility in seismic processing and imaging to spread this concept and its benefits, and to encourage the use of open-source software in this area within our academic and professional environment. We present an enhanced seismic imaging example, of interest in both academic and professional environments, using Mexican seismic data. As a result of this research, we show that it is possible to assimilate, adapt, and transfer technology at low cost, using open-source software and following a reproducible research scheme.

  15. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Science.gov (United States)

    Nada, Rania M; Maal, Thomas J J; Breuning, K Hero; Bergé, Stefaan J; Mostafa, Yehya A; Kuijpers-Jagtman, Anne Marie

    2011-02-09

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.
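
    The agreement measure used in this study, the mean absolute distance between two registered surface models, can be sketched as a nearest-neighbour query between point clouds. The synthetic points below stand in for real CBCT surface meshes and are assumptions for illustration only:

    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)
    model_a = rng.uniform(0, 50, size=(5000, 3))                # surface points (mm)
    model_b = model_a + rng.normal(0, 0.3, size=model_a.shape)  # registered copy

    tree = cKDTree(model_b)
    dist, _ = tree.query(model_a)        # closest-point distance per vertex
    print(f"mean absolute distance: {dist.mean():.2f} mm (SD {dist.std():.2f})")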

  16. Reproducibility of Quantitative Brain Imaging Using a PET-Only and a Combined PET/MR System

    DEFF Research Database (Denmark)

    Lassen, Martin L; Muzik, Otto; Beyer, Thomas

    2017-01-01

    The purpose of this study was to test the feasibility of migrating a quantitative brain imaging protocol from a positron emission tomography (PET)-only system to an integrated PET/MR system. Potential differences in both absolute radiotracer concentration as well as in the derived kinetic...

  17. Assessment of a numerical model to reproduce event‐scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Measures, R.; Hicks, D. M.; Brasington, J.

    2016-01-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers. PMID:27708477

  18. Assessment of a numerical model to reproduce event-scale erosion and deposition distributions in a braided river

    Science.gov (United States)

    Williams, R. D.; Measures, R.; Hicks, D. M.; Brasington, J.

    2016-08-01

    Numerical morphological modeling of braided rivers, using a physics-based approach, is increasingly used as a technique to explore controls on river pattern and, from an applied perspective, to simulate the impact of channel modifications. This paper assesses a depth-averaged nonuniform sediment model (Delft3D) to predict the morphodynamics of a 2.5 km long reach of the braided Rees River, New Zealand, during a single high-flow event. Evaluation of model performance primarily focused upon using high-resolution Digital Elevation Models (DEMs) of Difference, derived from a fusion of terrestrial laser scanning and optical empirical bathymetric mapping, to compare observed and predicted patterns of erosion and deposition and reach-scale sediment budgets. For the calibrated model, this was supplemented with planform metrics (e.g., braiding intensity). Extensive sensitivity analysis of model functions and parameters was executed, including consideration of numerical scheme for bed load component calculations, hydraulics, bed composition, bed load transport and bed slope effects, bank erosion, and frequency of calculations. Total predicted volumes of erosion and deposition corresponded well to those observed. The difference between predicted and observed volumes of erosion was less than the factor of two that characterizes the accuracy of the Gaeuman et al. bed load transport formula. Grain size distributions were best represented using two φ intervals. For unsteady flows, results were sensitive to the morphological time scale factor. The approach of comparing observed and predicted morphological sediment budgets shows the value of using natural experiment data sets for model testing. Sensitivity results are transferable to guide Delft3D applications to other rivers.
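
    The DEM-of-Difference bookkeeping used to evaluate the model above can be sketched in a few lines: difference the pre- and post-event DEMs, mask changes below a detection threshold, and integrate positive and negative cells into deposition and erosion volumes. The grids, cell size, and 0.1 m threshold below are illustrative assumptions, not the study's data:

    import numpy as np

    cell_area = 1.0            # m^2 per DEM cell
    rng = np.random.default_rng(2)
    dem_pre = rng.uniform(100, 101, size=(200, 300))
    dem_post = dem_pre + rng.normal(0, 0.15, size=dem_pre.shape)

    dod = dem_post - dem_pre
    dod[np.abs(dod) < 0.1] = 0.0           # mask change below detection limit

    deposition = dod[dod > 0].sum() * cell_area   # m^3
    erosion = -dod[dod < 0].sum() * cell_area     # m^3
    print(f"deposition={deposition:.1f} m^3, erosion={erosion:.1f} m^3, "
          f"net={deposition - erosion:+.1f} m^3")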

  19. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or ...

  20. Repeatability and Reproducibility of Corneal Biometric Measurements Using the Visante Omni and a Rabbit Experimental Model of Post-Surgical Corneal Ectasia

    Science.gov (United States)

    Liu, Yu-Chi; Konstantopoulos, Aris; Riau, Andri K.; Bhayani, Raj; Lwin, Nyein C.; Teo, Ericia Pei Wen; Yam, Gary Hin Fai; Mehta, Jodhbir S.

    2015-01-01

    Purpose: To investigate the repeatability and reproducibility of the Visante Omni in obtaining topography measurements of rabbit corneas and to develop a post-surgical model of corneal ectasia. Methods: Eight rabbits were used to study repeatability and reproducibility by assessing the intra- and interobserver bias and limits of agreement. Another nine rabbits, which underwent laser in situ keratomileusis (LASIK) of different dioptric (D) corrections, were used for the development of the ectasia model. All eyes were examined with the Visante Omni, and corneal ultrastructure was evaluated with transmission electron microscopy (TEM). Results: There was no significant intra- or interobserver difference for mean steep and flat keratometry (K) values of simulated K, anterior, and posterior elevation measurements. Eyes that underwent −5 D LASIK had a significant increase in mean amplitude of astigmatism and posterior surface elevation with time (P for trend < 0.05), yielding a model of corneal ectasia that was gradual in development and simulated the human condition. Translational Relevance: The results provide the foundations for the future evaluation of novel treatment modalities for post-surgical ectasia and keratoconus. PMID:25938004

  1. A right to reproduce?

    Science.gov (United States)

    Emson, H E

    1992-10-31

    Conscious control of the environment by Homo sapiens has brought almost total release from the ecological controls that limit the populations of all other species. After a mere 10,000 years, humans have brought the planet close to collapse, and all the debate in the world seems unlikely to save it. A combination of uncontrolled breeding and rapacity is propelling us down the slippery slope first envisioned by Malthus, dragging the rest of the planet along. Only the conscious, and most likely voluntary, reimposition of controls on breeding will reduce the overgrowth of humans, and we have far to go in that direction. According to the United Nations Universal Declaration of Human Rights (1948, articles 16[I] and 16[III]), "Men and women of full age without any limitation due to race, nationality or religion have the right to marry and to found a family ... the family is the natural and fundamental group unit of society." The rhetoric of rights without the balancing of responsibilities is wrong in health care, and even more wrong in the context of world population. The mind-set of dominance over and exploitation of the rest of creation has meant human reluctance to admit participation in a system where every part is interdependent. We must balance the right to reproduce with its responsible use, valuing interdependence, understanding, and respect, with a duty not to unbalance, damage, or destroy. It is long overdue that we discard every statement of right that is unmatched by the equivalent duty and responsibility.

  2. Isokinetic eccentric exercise as a model to induce and reproduce pathophysiological alterations related to delayed onset muscle soreness

    DEFF Research Database (Denmark)

    Lund, Henrik; Vestergaard-Poulsen, P; Kanstrup, I.L.

    1998-01-01

    Physiological alterations following unaccustomed eccentric exercise in an isokinetic dynamometer of the right m. quadriceps until exhaustion were studied, in order to create a model in which the physiological responses to physiotherapy could be measured. In experiment I (exp. I), seven selected p...

  3. Modeling of System Families

    National Research Council Canada - National Science Library

    Feiler, Peter

    2007-01-01

    .... The Society of Automotive Engineers (SAE) Architecture Analysis & Design Language (AADL) is an industry-standard, architecture-modeling notation specifically designed to support a component- based approach to modeling embedded systems...

  4. Modelling Railway Interlocking Systems

    DEFF Research Database (Denmark)

    Lindegaard, Morten Peter; Viuf, P.; Haxthausen, Anne Elisabeth

    2000-01-01

    In this report we present a model of interlocking systems, and describe how the model may be validated by simulation. Station topologies are modelled by graphs in which the nodes denote track segments, and the edges denote connectivity for train traffic. Points and signals are modelled by annotations ...

  5. Reproducibility of the acute rejection diagnosis in human cardiac allografts. The Stanford Classification and the International Grading System

    DEFF Research Database (Denmark)

    Nielsen, H; Sørensen, Flemming Brandt; Nielsen, B

    1993-01-01

    Transplantation has become an accepted treatment of many cardiac end-stage diseases. Acute cellular rejection accounts for 15% to 20% of all graft failures. The first grading system of acute cellular rejection, the Stanford Classification, was introduced in 1979, and since then many other grading...

  6. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Vol. 539, No. 7628 (2016), p. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords: reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  7. Minimum Information about a Cardiac Electrophysiology Experiment (MICEE): Standardised Reporting for Model Reproducibility, Interoperability, and Data Sharing

    Science.gov (United States)

    Quinn, TA; Granite, S; Allessie, MA; Antzelevitch, C; Bollensdorff, C; Bub, G; Burton, RAB; Cerbai, E; Chen, PS; Delmar, M; DiFrancesco, D; Earm, YE; Efimov, IR; Egger, M; Entcheva, E; Fink, M; Fischmeister, R; Franz, MR; Garny, A; Giles, WR; Hannes, T; Harding, SE; Hunter, PJ; Iribe, G; Jalife, J; Johnson, CR; Kass, RS; Kodama, I; Koren, G; Lord, P; Markhasin, VS; Matsuoka, S; McCulloch, AD; Mirams, GR; Morley, GE; Nattel, S; Noble, D; Olesen, SP; Panfilov, AV; Trayanova, NA; Ravens, U; Richard, S; Rosenbaum, DS; Rudy, Y; Sachs, F; Sachse, FB; Saint, DA; Schotten, U; Solovyova, O; Taggart, P; Tung, L; Varró, A; Volders, PG; Wang, K; Weiss, JN; Wettwer, E; White, E; Wilders, R; Winslow, RL; Kohl, P

    2011-01-01

    Cardiac experimental electrophysiology is in need of a well-defined Minimum Information Standard for recording, annotating, and reporting experimental data. As a step toward establishing this, we present a draft standard, called Minimum Information about a Cardiac Electrophysiology Experiment (MICEE). The ultimate goal is to develop a useful tool for cardiac electrophysiologists which facilitates and improves dissemination of the minimum information necessary for reproduction of cardiac electrophysiology research, allowing for easier comparison and utilisation of findings by others. It is hoped that this will enhance the integration of individual results into experimental, computational, and conceptual models. In its present form, this draft is intended for assessment and development by the research community. We invite the reader to join this effort, and, if deemed productive, implement the Minimum Information about a Cardiac Electrophysiology Experiment standard in their own work. PMID:21745496

  8. Reproducibility of haemodynamical simulations in a subject-specific stented aneurysm model--a report on the Virtual Intracranial Stenting Challenge 2007.

    Science.gov (United States)

    Radaelli, A G; Augsburger, L; Cebral, J R; Ohta, M; Rüfenacht, D A; Balossino, R; Benndorf, G; Hose, D R; Marzo, A; Metcalfe, R; Mortier, P; Mut, F; Reymond, P; Socci, L; Verhegghe, B; Frangi, A F

    2008-07-19

    This paper presents the results of the Virtual Intracranial Stenting Challenge (VISC) 2007, an international initiative whose aim was to establish the reproducibility of state-of-the-art haemodynamical simulation techniques in subject-specific stented models of intracranial aneurysms (IAs). IAs are pathological dilatations of the cerebral artery walls, which are associated with high mortality and morbidity rates due to subarachnoid haemorrhage following rupture. The deployment of a stent as flow diverter has recently been indicated as a promising treatment option, which has the potential to protect the aneurysm by reducing the action of haemodynamical forces and facilitating aneurysm thrombosis. The direct assessment of changes in aneurysm haemodynamics after stent deployment is hampered by limitations in existing imaging techniques and currently requires resorting to numerical simulations. Numerical simulations also have the potential to assist in the personalized selection of an optimal stent design prior to intervention. However, from the current literature it is difficult to assess the level of technological advancement and the reproducibility of haemodynamical predictions in stented patient-specific models. The VISC 2007 initiative engaged in the development of a multicentre-controlled benchmark to analyse differences induced by diverse grid generation and computational fluid dynamics (CFD) technologies. The challenge also represented an opportunity to provide a survey of available technologies currently adopted by international teams from both academic and industrial institutions for constructing computational models of stented aneurysms. The results demonstrate the ability of current strategies in consistently quantifying the performance of three commercial intracranial stents, and contribute to reinforce the confidence in haemodynamical simulation, thus taking a step forward towards the introduction of simulation tools to support diagnostics and

  9. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre and post treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones could be considered as an accurate and a reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.

  10. RSMASS system model development

    International Nuclear Information System (INIS)

    Marshall, A.C.; Gallup, D.R.

    1998-01-01

    RSMASS system mass models have been used for more than a decade to make rapid estimates of space reactor power system masses. This paper reviews the evolution of the RSMASS models and summarizes present capabilities. RSMASS has evolved from a simple model used to make rough estimates of space reactor and shield masses into a versatile space reactor power system model. RSMASS uses unique reactor and shield models that permit rapid mass optimization calculations for a variety of space reactor power and propulsion systems. The RSMASS-D upgrade of the original model includes algorithms for the balance of the power system, a number of reactor and shield modeling improvements, and an automatic mass optimization scheme. The RSMASS-D suite of codes covers a very broad range of reactor and power conversion system options as well as propulsion and bimodal reactor systems. Reactor choices include in-core and ex-core thermionic reactors, liquid metal cooled reactors, particle bed reactors, and prismatic configuration reactors. Power conversion options include thermoelectric, thermionic, Stirling, Brayton, and Rankine approaches. Program output includes all major component masses and dimensions, efficiencies, and a description of the design parameters for a mass-optimized system. In the past, RSMASS has been used as an aid to identify and select promising concepts for space power applications. The RSMASS modeling approach has been demonstrated to be a valuable tool for guiding optimization of the power system design; consequently, the model is useful during system design and development as well as during the selection process. An improved in-core thermionic reactor system model, RSMASS-T, is now under development. The current development of the RSMASS-T code represents the next evolutionary stage of the RSMASS models. RSMASS-T includes many modeling improvements and is planned to be more user-friendly. RSMASS-T will be released as a fully documented, certified code at the end of

  11. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude quotations for both artificial and natural reflectors was studied for several combinations of instrument and search unit, all being of the same type. This study shows that, in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (95% confidence interval). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study.

  12. Reproducibility of the three-dimensional endoscopic staging system for nasal polyposis

    Directory of Open Access Journals (Sweden)

    Marcelo Castro Alves de Sousa

    2009-12-01

    Nasal polyposis is a chronic inflammatory process of the nasal mucosa, characterized by multiple and bilateral nasal polyps. Different drugs have been used in its treatment, and in order to study the results of different treatment modalities, some kind of staging is necessary. AIM: To present a new endoscopic staging method, based on nasal endoscopy and on three-dimensional assessment of the polyps, and to compare its reproducibility with that of two other systems already established in the literature. STUDY DESIGN: Cohort study. MATERIAL AND METHODS: Three examiners assessed the exams of 20 patients with nasal polyposis at different times: before, and at 15 and 30 days after the start of treatment with prednisone at a dose of 1 mg/kg/day for 7 days. The degree of agreement among the examiners was evaluated for each method, using multiple Kappa statistics. RESULTS: All three methods proved reproducible, although the proposed method showed the lowest agreement among examiners. CONCLUSION: The proposed staging system proved reproducible, despite showing lower agreement than the other two staging systems.

  13. Systemic resilience model

    International Nuclear Information System (INIS)

    Lundberg, Jonas; Johansson, Björn JE

    2015-01-01

    It has been realized that resilience as a concept involves several contradictory definitions, for instance resilience as agile adjustment versus resilience as robust resistance to situations. Our analysis of resilience concepts and models suggests that, beyond simplistic definitions, it is possible to draw up a systemic resilience model (SyRes) that maintains these opposing characteristics without contradiction. We outline six functions in a systemic model, drawing primarily on resilience engineering and disaster response: anticipation, monitoring, response, recovery, learning, and self-monitoring. The model consists of four areas: Event-based constraints, Functional Dependencies, Adaptive Capacity and Strategy. The paper describes dependencies between constraints, functions and strategies. We argue that models such as SyRes should be useful both for envisioning new resilience methods and metrics, as well as for engineering and evaluating resilient systems. - Highlights: • The SyRes model resolves contradictions between previous resilience definitions. • SyRes is a core model for envisioning and evaluating resilience metrics and models. • SyRes describes six functions in a systemic model. • They are anticipation, monitoring, response, recovery, learning, self-monitoring. • The model describes dependencies between constraints, functions and strategies

  14. Selected System Models

    Science.gov (United States)

    Schmidt-Eisenlohr, F.; Puñal, O.; Klagges, K.; Kirsche, M.

    Apart from the general issue of modeling the channel, the PHY and the MAC of wireless networks, there are specific modeling assumptions that are considered for different systems. In this chapter we consider three specific wireless standards and highlight modeling options for them. These are IEEE 802.11 (as example for wireless local area networks), IEEE 802.16 (as example for wireless metropolitan networks) and IEEE 802.15 (as example for body area networks). Each section on these three systems discusses also at the end a set of model implementations that are available today.

  15. Modeling cellular systems

    CERN Document Server

    Matthäus, Franziska; Pahle, Jürgen

    2017-01-01

    This contributed volume comprises research articles and reviews on topics connected to the mathematical modeling of cellular systems. These contributions cover signaling pathways, stochastic effects, cell motility and mechanics, pattern formation processes, as well as multi-scale approaches. All authors attended the workshop on "Modeling Cellular Systems" which took place in Heidelberg in October 2014. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  16. Retrospective Correction of Physiological Noise: Impact on Sensitivity, Specificity, and Reproducibility of Resting-State Functional Connectivity in a Reading Network Model.

    Science.gov (United States)

    Krishnamurthy, Venkatagiri; Krishnamurthy, Lisa C; Schwam, Dina M; Ealey, Ashley; Shin, Jaemin; Greenberg, Daphne; Morris, Robin D

    2018-03-01

    It is well accepted that physiological noise (PN) obscures the detection of neural fluctuations in resting-state functional connectivity (rsFC) magnetic resonance imaging. However, a clear consensus for an optimal PN correction (PNC) methodology and how it can impact the rsFC signal characteristics is still lacking. In this study, we probe the impact of three PNC methods: RETROICOR (Glover et al., 2000), ANATICOR (Jo et al., 2010), and RVTMBPM (Bianciardi et al., 2009). Using a reading network model, we systematically explore the effects of PNC optimization on sensitivity, specificity, and reproducibility of rsFC signals. In terms of specificity, ANATICOR was found to be effective in removing local white matter (WM) fluctuations and also resulted in aggressive removal of expected cortical-to-subcortical functional connections. The ability of RETROICOR to remove PN was equivalent to removal of simulated random PN such that it artificially inflated the connection strength, thereby decreasing sensitivity. RVTMBPM maintained specificity and sensitivity by balanced removal of vasodilatory PN and local WM nuisance edges. Another aspect of this work was exploring the effects of PNC on identifying reading group differences. Most PNC methods accounted for between-subject PN variability, resulting in reduced intersession reproducibility. This effect facilitated the detection of the most consistent group differences. RVTMBPM was most effective in detecting significant group differences due to its inherent sensitivity to removing spatially structured and temporally repeating PN arising from dense vasculature. Finally, results suggest that combining all three PNC methods resulted in "overcorrection" by removing signal along with noise.
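
    Whatever regressors a given PNC method produces (cardiac/respiratory phase terms for RETROICOR, local white-matter signals for ANATICOR, RVT and heart-rate convolutions for RVTMBPM), the common core is nuisance regression: assemble the time courses into a design matrix and remove their least-squares fit from each voxel. A generic sketch with random placeholder regressors, not the output of any of the three methods:

    import numpy as np

    rng = np.random.default_rng(3)
    n_t, n_vox, n_reg = 300, 1000, 8
    Y = rng.normal(size=(n_t, n_vox))              # voxel time series
    X = np.column_stack([np.ones(n_t),             # intercept + nuisance set
                         rng.normal(size=(n_t, n_reg))])

    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)   # fit nuisance model
    Y_clean = Y - X @ beta                         # residuals = "denoised" data
    print(f"variance removed: {100 * (1 - Y_clean.var() / Y.var()):.1f}%")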

  17. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined

  18. Agent-based models of cellular systems.

    Science.gov (United States)

    Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Tesei, Luca

    2013-01-01

    Software agents are particularly suitable for engineering models and simulations of cellular systems. In a very natural and intuitive manner, individual software components are therein delegated to reproduce "in silico" the behavior of individual components of alive systems at a given level of resolution. Individuals' actions and interactions among individuals allow complex collective behavior to emerge. In this chapter we first introduce the readers to software agents and multi-agent systems, reviewing the evolution of agent-based modeling of biomolecular systems in the last decade. We then describe the main tools, platforms, and methodologies available for programming societies of agents, possibly profiting also of toolkits that do not require advanced programming skills.
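
    A toy sketch of the pattern the chapter describes: each cell is an individual software agent with local state and simple rules, and population-level behavior emerges from repeated steps. The energy and division rules below are invented purely for illustration:

    import random

    class CellAgent:
        def __init__(self, energy=5):
            self.energy = energy

        def step(self, crowding):
            """One action: metabolize, then possibly divide."""
            self.energy += random.choice([-1, 0, 1, 1])   # slight net gain
            if self.energy >= 8 and crowding < 100:       # invented division rule
                self.energy //= 2
                return CellAgent(self.energy)             # daughter agent
            return None

    random.seed(4)
    population = [CellAgent() for _ in range(10)]
    for _ in range(50):
        newborns = [c.step(len(population)) for c in population]
        population = [c for c in population if c.energy > 0] + \
                     [c for c in newborns if c is not None]
    print(f"emergent population size after 50 steps: {len(population)}")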

  19. Modelling of wastewater systems

    DEFF Research Database (Denmark)

    Bechmann, Henrik

    In this thesis, models of pollution fluxes in the inlet to 2 Danish wastewater treatment plants (WWTPs), as well as of suspended solids (SS) concentrations in the aeration tanks of an alternating WWTP and in the effluent from the aeration tanks, are developed. The flux models describe the COD (Chemical Oxygen Demand) flux and the SS flux in the inlet to the WWTP; COD is measured by means of a UV absorption sensor, while SS is measured by a turbidity sensor. These models include a description of the deposited amounts of COD and SS, respectively, in the sewer system, and they can thus be used to quantify these deposits. The SS model is furthermore used to analyze and quantify the effect of the Aeration Tank Settling (ATS) operating mode, which is used during rain events, and to propose a control algorithm for the phase lengths during ATS operation. The models are mainly formulated as state space models in continuous time ...

  20. Inhibition of basophil activation by histamine: a sensitive and reproducible model for the study of the biological activity of high dilutions.

    Science.gov (United States)

    Sainte-Laudy, J; Belon, Ph

    2009-10-01

    (another human basophil activation marker). Results were expressed as the mean fluorescence intensity of the CD203c-positive population (MFI-CD203c) and as an activation index calculated by an algorithm. For the mouse basophil model, histamine was measured spectrofluorimetrically. The main result obtained over 28 years of work was the demonstration of a reproducible inhibition of human basophil activation by high dilutions of histamine, with the effect peaking in the range of 15-17CH. The effect was not significant when histamine was replaced by histidine (a histamine precursor) or when cimetidine (a histamine H2 receptor antagonist) was added to the incubation medium. These results were confirmed by flow cytometry. Using the latter technique, we also showed that 4-methylhistamine (an H2 agonist) induced a similar effect, in contrast to 1-methylhistamine, an inactive histamine metabolite. Using the mouse model, we showed that histamine high dilutions, in the same range of dilutions, inhibited histamine release. Successively, using different models of human and murine basophil activation, we demonstrated that high dilutions of histamine, in the range of 15-17CH, induce a reproducible biological effect. This phenomenon has been confirmed by a multi-center study using the HBDT model and by at least three independent laboratories using flow cytometry. The specificity of the observed effect was confirmed, versus water controls at the same dilution level, by the absence of biological activity of inactive compounds such as histidine and 1-methylhistamine, and by the reversibility of this effect in the presence of a histamine H2 receptor antagonist.

  1. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low-energy mass spectrometry. Although uncertainties as low as 1% have been obtained for accelerator-measured isotope ratios under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics, and the reproducibility is generally unknown since the samples are measured only once.

  2. Roles of chemical complexity and evolutionary theory in some hepatic and intestinal enzymatic systems in chemical reproducibility and clinical efficiency of herbal derivatives.

    Science.gov (United States)

    Di Pierro, Francesco

    2014-01-01

    Despite their great marketing success, most physicians attribute poor efficacy to herbals. This perception stems from two issues inherent to the field. The first is the poor phytochemical reproducibility obtained during the production of herbal extracts, as extracts are often standardized only in their titer rather than throughout the whole manufacturing process. The second problem is linked to the evolution of important enzymatic systems: cytochromes and ABC proteins. Both are enzyme classes with detoxifying properties and seem to have evolved from the molecular mould provided by active plant substances. During evolution, as still happens today, polyphenols, saponins, terpenes, and alkaloids were ingested together with food. They possess no nutritional value but seem to carry a potential pharmacological activity. Cytochromes and ABC proteins, which evolved over time to detoxify food from plant chemical "actives," now seem to limit the action of herbal derivatives. Understanding these two phenomena may explain the origin of the widespread scepticism of physicians about herbal medicine, and suggests that, after correct herbal standardization, the use of antagonists of cytochrome and ABC systems could make it possible to recover their pharmacological potential.

  3. Modeling Complex Systems

    CERN Document Server

    Boccara, Nino

    2010-01-01

    Modeling Complex Systems, 2nd Edition, explores the process of modeling complex systems, providing examples from such diverse fields as ecology, epidemiology, sociology, seismology, and economics. It illustrates how models of complex systems are built and provides indispensable mathematical tools for studying their dynamics. This vital introductory text is useful for advanced undergraduate students in various scientific disciplines, and serves as an important reference book for graduate students and young researchers. This enhanced second edition includes:
    - recent research results and bibliographic references
    - extra footnotes which provide biographical information on cited scientists who have made significant contributions to the field
    - new and improved worked-out examples to aid a student's comprehension of the content
    - exercises to challenge the reader and complement the material
    Nino Boccara is also the author of Essentials of Mathematica: With Applications to Mathematics and Physics (Springer, 2007).

  4. Modeling Complex Systems

    International Nuclear Information System (INIS)

    Schreckenberg, M

    2004-01-01

    This book by Nino Boccara presents a compilation of model systems commonly termed 'complex'. It starts with a definition of the systems under consideration and of how to build up a model to describe their complex dynamics. The subsequent chapters are devoted to various categories of mean-field-type models (differential and recurrence equations, chaos) and of agent-based models (cellular automata, networks and power-law distributions). Each chapter is supplemented by a number of exercises and their solutions. The table of contents looks a little arbitrary, but the author took the most prominent model systems investigated over the years (and up until now there has been no unified theory covering the various aspects of complex dynamics). The model systems are explained by looking at a number of applications in various fields. The book is written as a textbook for interested students as well as serving as a comprehensive reference for experts. It is an ideal source for topics to be presented in a lecture on the dynamics of complex systems. This is the first book on this 'wide' topic and I have long awaited such a book (in fact I planned to write it myself, but this is much better than I could ever have written it!). Only section 6, on cellular automata, is a little too limited to the author's point of view, and one would have expected more about the famous Domany-Kinzel model (and more accurate citation!). In my opinion this is one of the best textbooks published during the last decade, and even experts can learn a lot from it. Hopefully there will be an updated edition after, say, five years, since this field is growing so quickly. The price is too high for students but this, unfortunately, is the normal case today. Nevertheless I think it will be a great success! (book review)

  5. Modeling the earth system

    Energy Technology Data Exchange (ETDEWEB)

    Ojima, D. [ed.]

    1992-12-31

    The 1990 Global Change Institute (GCI) on Earth System Modeling is the third of a series organized by the Office for Interdisciplinary Earth Studies to look in depth at particular issues critical to developing a better understanding of the earth system. The 1990 GCI on Earth System Modeling was organized around three themes: defining critical gaps in the knowledge of the earth system, developing simplified working models, and validating comprehensive system models. This book is divided into three sections that reflect these themes. Each section begins with a set of background papers offering a brief tutorial on the subject, followed by working group reports developed during the institute. These reports summarize the joint ideas and recommendations of the participants and bring to bear the interdisciplinary perspective that imbued the institute. Since the conclusion of the 1990 Global Change Institute, research programs, nationally and internationally, have moved forward to implement a number of the recommendations made at the institute, and many of the participants have maintained collegial interactions to develop research projects addressing the needs identified during the two weeks in Snowmass.

  6. Information Systems Efficiency Model

    Directory of Open Access Journals (Sweden)

    Milos Koch

    2017-07-01

    This contribution discusses the basic concept of creating a new model for the efficiency and effectiveness assessment of company information systems. Present trends in this field are taken into account, and the attributes of measuring optimal solutions for a company's ICT (implementation, functionality, service, innovation, safety, relationships, costs, etc.) are retained. The proposed assessment model builds on our experience with formerly implemented and employed methods, which we have modified over time and adapted to companies' needs, as well as to the requirements of our research conducted through the ZEFIS portal. The most noteworthy of them is the HOS method, which we have discussed in a number of forums. Its main feature is that it respects the complexity of an information system in correlation with the balanced state of its individual parts.

  7. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

    Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations: they do not capture 3-D data, they are time-consuming, and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software method featuring an easy-to-use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability we deployed four independent researchers to analyse tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%), in contrast to Osteolytica, which demonstrated minimal variability (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from age- and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses the entirety of the bone volume (as opposed to selected 2-D images) …

  8. Prognostic Performance and Reproducibility of the 1973 and 2004/2016 World Health Organization Grading Classification Systems in Non-muscle-invasive Bladder Cancer: A European Association of Urology Non-muscle Invasive Bladder Cancer Guidelines Panel Systematic Review.

    Science.gov (United States)

    Soukup, Viktor; Čapoun, Otakar; Cohen, Daniel; Hernández, Virginia; Babjuk, Marek; Burger, Max; Compérat, Eva; Gontero, Paolo; Lam, Thomas; MacLennan, Steven; Mostafid, A Hugh; Palou, Joan; van Rhijn, Bas W G; Rouprêt, Morgan; Shariat, Shahrokh F; Sylvester, Richard; Yuan, Yuhong; Zigeuner, Richard

    2017-11-01

    Tumour grade is an important prognostic indicator in non-muscle-invasive bladder cancer (NMIBC). Histopathological classifications are limited by interobserver variability (reproducibility), which may have prognostic implications. European Association of Urology NMIBC guidelines suggest concurrent use of both 1973 and 2004/2016 World Health Organization (WHO) classifications. To compare the prognostic performance and reproducibility of the 1973 and 2004/2016 WHO grading systems for NMIBC. A systematic literature search was undertaken incorporating Medline, Embase, and the Cochrane Library. Studies were critically appraised for risk of bias (QUIPS). For prognosis, the primary outcome was progression to muscle-invasive or metastatic disease. Secondary outcomes were disease recurrence, and overall and cancer-specific survival. For reproducibility, the primary outcome was interobserver variability between pathologists. Secondary outcome was intraobserver variability (repeatability) by the same pathologist. Of 3593 articles identified, 20 were included in the prognostic review; three were eligible for the reproducibility review. Increasing tumour grade in both classifications was associated with higher disease progression and recurrence rates. Progression rates in grade 1 patients were similar to those in low-grade patients; progression rates in grade 3 patients were higher than those in high-grade patients. Survival data were limited. Reproducibility of the 2004/2016 system was marginally better than that of the 1973 system. Two studies on repeatability showed conflicting results. Most studies had a moderate to high risk of bias. Current grading classifications in NMIBC are suboptimal. The 1973 system identifies more aggressive tumours. Intra- and interobserver variability was slightly less in the 2004/2016 classification. We could not confirm that the 2004/2016 classification outperforms the 1973 classification in prediction of recurrence and progression. This article …

  9. The construction of general basis functions in reweighting ensemble dynamics simulations: Reproduce equilibrium distribution in complex systems from multiple short simulation trajectories

    Science.gov (United States)

    Zhang, Chuan-Biao; Ming, Li; Xin, Zhou

    2015-12-01

    Ensemble simulations, which use multiple short independent trajectories from dispersive initial conformations, rather than the single long trajectory used in traditional simulations, are expected to sample complex systems such as biomolecules much more efficiently. Re-weighted ensemble dynamics (RED) is designed to combine these short trajectories to reconstruct the global equilibrium distribution. In the RED, a number of conformational functions, called basis functions, are applied to relate these trajectories to each other; a detailed-balance-based linear equation is then built, whose solution provides the weights of these trajectories in the equilibrium distribution. Thus, the sufficient and efficient selection of basis functions is critical to the practical application of RED. Here, we review and present a few possible ways to construct basis functions for applying the RED in complex molecular systems. In particular, for systems with less a priori knowledge, we can generally use the root mean squared deviation (RMSD) among conformations to split the whole conformational space into a set of cells, and then use the RMSD-based cell functions as basis functions. We demonstrate the application of the RED in typical systems, including a two-dimensional toy model, the lattice Potts model, and a short peptide system. The results indicate that the RED, with these constructions of basis functions, not only samples complex systems more efficiently but also provides a general way to understand the metastable structure of conformational space. Project supported by the National Natural Science Foundation of China (Grant No. 11175250).
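
    A minimal sketch of the RMSD-based-cell construction described above: greedy clustering of conformations into cells, and indicator basis functions over those cells. The clustering rule and parameters are simplified assumptions, not the authors' exact procedure (which also solves a detailed-balance-based linear equation for the trajectory weights).

        import numpy as np

        def rmsd(a, b):
            """RMSD between two conformations of shape (n_atoms, 3).
            (No superposition here; real use would align structures first.)"""
            return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))

        def build_cells(frames, cutoff):
            """Greedy RMSD clustering: a frame joins the first cell whose
            representative is within `cutoff`, else it founds a new cell."""
            centers, labels = [], []
            for f in frames:
                for i, c in enumerate(centers):
                    if rmsd(f, c) < cutoff:
                        labels.append(i)
                        break
                else:
                    centers.append(f)
                    labels.append(len(centers) - 1)
            return centers, np.array(labels)

        def basis_functions(labels, n_cells):
            """Indicator basis functions: phi_k(x) = 1 iff x falls in cell k."""
            phi = np.zeros((len(labels), n_cells))
            phi[np.arange(len(labels)), labels] = 1.0
            return phi

        rng = np.random.default_rng(0)
        frames = rng.normal(size=(100, 10, 3))    # 100 frames of a 10-atom toy
        centers, labels = build_cells(frames, cutoff=2.0)
        phi = basis_functions(labels, len(centers))
        print(len(centers), "cells; occupancies:", phi.sum(axis=0))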

  10. Production process reproducibility and product quality consistency of transient gene expression in HEK293 cells with anti-PD1 antibody as the model protein.

    Science.gov (United States)

    Ding, Kai; Han, Lei; Zong, Huifang; Chen, Junsheng; Zhang, Baohong; Zhu, Jianwei

    2017-03-01

    Demonstration of reproducibility and consistency of process and product quality is one of the most crucial issues in using transient gene expression (TGE) technology for biopharmaceutical development. In this study, we challenged the production consistency of TGE by expressing nine batches of recombinant IgG antibody in human embryonic kidney 293 cells to evaluate reproducibility including viable cell density, viability, apoptotic status, and antibody yield in cell culture supernatant. Product quality including isoelectric point, binding affinity, secondary structure, and thermal stability was assessed as well. In addition, major glycan forms of antibody from different batches of production were compared to demonstrate glycosylation consistency. Glycan compositions of the antibody harvested at different time periods were also measured to illustrate N-glycan distribution over the culture time. From the results, it has been demonstrated that different TGE batches are reproducible from lot to lot in overall cell growth, product yield, and product qualities including isoelectric point, binding affinity, secondary structure, and thermal stability. Furthermore, major N-glycan compositions are consistent among different TGE batches and conserved during cell culture time.

  11. Energy-dissipation-model for metallurgical multi-phase-systems

    Energy Technology Data Exchange (ETDEWEB)

    Mavrommatis, K.T. [Rheinisch-Westfaelische Technische Hochschule Aachen, Aachen (Germany)

    1996-12-31

    Entropy production in real processes is directly associated with the dissipation of energy. Both are potential measures of the progress of irreversible processes taking place in metallurgical systems. Many of these processes in multi-phase systems can therefore be modelled on the basis of the associated energy dissipation. As this quantity can often be estimated from first principles using very simple assumptions, the evolution of an overall measure of system behaviour can be studied by constructing an energy-dissipation-based model of the system. In this work, a formulation of this concept, the Energy-Dissipation Model (EDM), is given for metallurgical multi-phase systems. Specific examples are studied to illustrate the concept, and its benefits as well as its range of validity are shown. This concept may be understood as a complement to the usual CFD modelling of complex systems, working at a more abstract level while reproducing essential attributes of complex metallurgical systems. (author)

  12. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals, found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researchers' degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  13. Simulation of the hydrodynamic conditions of the eye to better reproduce the drug release from hydrogel contact lenses: experiments and modeling.

    Science.gov (United States)

    Pimenta, A F R; Valente, A; Pereira, J M C; Pereira, J C F; Filipe, H P; Mata, J L G; Colaço, R; Saramago, B; Serro, A P

    2016-12-01

    Currently, most in vitro drug release studies for ophthalmic applications are carried out under static sink conditions. Although this procedure is simple and useful for comparative studies, it does not adequately describe the drug release kinetics in the eye, given the small tear volume and low flow rates found in vivo. In this work, a microfluidic cell was designed and used to mimic the continuous, volumetric flow rate of tear fluid and its low volume. The suitable operation of the cell, in terms of uniformity and symmetry of flux, was verified using a numerical model based on the Navier-Stokes and continuity equations. The release profile of a model system (a hydroxyethyl methacrylate-based hydrogel (HEMA/PVP) for soft contact lenses (SCLs) loaded with diclofenac) obtained with the microfluidic cell was compared with that obtained under static conditions, showing that the kinetics of release under dynamic conditions is slower. The application of the numerical model demonstrated that the designed cell can be used to simulate drug release over the whole range of human eye tear film volumes, and made it possible to estimate the drug concentration in the volume of liquid in direct contact with the hydrogel. Knowledge of this concentration, which is significantly different from that measured in the experimental tests during the first hours of release, is critical to predict the toxicity of the drug release system and its in vivo efficacy. In conclusion, the use of the microfluidic cell in conjunction with the numerical model should be a valuable tool to design and optimize new therapeutic drug-loaded SCLs.
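
    The qualitative finding (slower release under tear-like volume and flow than under static sink conditions) can be reproduced with a crude two-compartment model, sketched below. All rate constants, volumes, and the saturation term are invented for illustration; the paper's actual model solves the Navier-Stokes and continuity equations.

        import numpy as np

        # Crude two-compartment release model (all values invented): a loaded
        # lens releases drug first-order into a surrounding fluid volume V;
        # release slows as the local concentration C approaches a
        # saturation-like ceiling Cs, while a flow Q continuously flushes drug.
        k_rel = 0.5    # 1/h, first-order release constant
        M0 = 100.0     # ug, initial drug load in the hydrogel
        Cs = 50.0      # ug/mL, concentration ceiling in the fluid

        def released(V, Q, t_end=24.0, dt=1e-3):
            M, C = M0, 0.0          # drug left in lens; fluid concentration
            for _ in np.arange(0.0, t_end, dt):
                flux = k_rel * M * max(0.0, 1.0 - C / Cs)   # ug/h
                M -= flux * dt
                C = max(C + (flux - Q * C) / V * dt, 0.0)
            return M0 - M

        static = released(V=10.0, Q=0.0)      # large stirred vial: near-sink
        dynamic = released(V=7e-3, Q=0.06)    # tear-like volume and flow rate
        print("released after 24 h: static %.0f ug, dynamic %.0f ug"
              % (static, dynamic))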

  14. Modeling and estimating system availability

    International Nuclear Information System (INIS)

    Gaver, D.P.; Chu, B.B.

    1976-11-01

    Mathematical models to infer the availability of various types of more or less complicated systems are described. The analyses presented are probabilistic in nature and consist of three parts: a presentation of various analytic models for availability; a means of deriving approximate probability limits on system availability; and a means of statistical inference of system availability from sparse data, using a jackknife procedure. Various low-order redundant systems are used as examples, but extension to more complex systems is not difficult.
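
    For the third part (inference from sparse data via a jackknife), a minimal sketch of a leave-one-out jackknife applied to an availability estimate is shown below; the cycle data are invented, and this is only one simple way to set up such an estimate.

        import numpy as np

        # Point availability from paired up/down durations (hours, invented),
        # with a leave-one-cycle-out jackknife for bias and standard error.
        up = np.array([410., 520., 130., 880., 300., 640.])
        down = np.array([6., 12., 3., 20., 9., 7.])

        def availability(u, d):
            return u.sum() / (u.sum() + d.sum())

        n = len(up)
        a_hat = availability(up, down)
        reps = np.array([availability(np.delete(up, i), np.delete(down, i))
                         for i in range(n)])           # leave-one-out replicates
        a_jack = n * a_hat - (n - 1) * reps.mean()     # bias-corrected estimate
        se = np.sqrt((n - 1) / n * ((reps - reps.mean()) ** 2).sum())
        print("availability %.4f, jackknife %.4f +/- %.4f" % (a_hat, a_jack, se))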

  15. Enhancing reproducibility: Failures from Reproducibility Initiatives underline core challenges.

    Science.gov (United States)

    Mullane, Kevin; Williams, Michael

    2017-08-15

    Efforts to address reproducibility concerns in biomedical research include: initiatives to improve journal publication standards and peer review; increased attention to publishing methodological details that enable experiments to be reconstructed; guidelines on standards for study design, implementation, analysis and execution; meta-analyses of multiple studies within a field to synthesize a common conclusion; and the formation of consortia to adopt uniform protocols and internally reproduce data. Another approach to addressing reproducibility is Reproducibility Initiatives (RIs): well-intended, high-profile, systematically peer-vetted initiatives that are intended to replace the traditional process of scientific self-correction. Outcomes from the RIs reported to date have called the usefulness of this approach into question, particularly when the RI outcome differs from other independent self-correction studies that have reproduced the original finding. As a failed RI attempt is a single outcome distinct from the original study, it cannot provide any definitive conclusions; this necessitates additional studies that the RI approach has neither the ability nor the intent of conducting, making it a questionable replacement for self-correction. A failed RI attempt also has the potential to damage the reputation of the author of the original finding. Reproduction is frequently confused with replication, an issue that is more than semantic, with the former denoting "similarity" and the latter an "exact copy" - an impossible outcome in research because of known and unknown technical, environmental and motivational differences between the original and reproduction studies. To date, the RI framework has negatively impacted efforts to improve reproducibility, confounding attempts to determine whether a research finding is real. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. A method to isolate bacterial communities and characterize ecosystems from food products: Validation and utilization as a reproducible chicken meat model.

    Science.gov (United States)

    Rouger, Amélie; Remenant, Benoit; Prévost, Hervé; Zagorec, Monique

    2017-04-17

    Influenced by production and storage processes and by seasonal changes, the diversity of meat product microbiotas can be very variable. Because microbiotas influence meat quality and safety, characterizing and understanding their dynamics during processing and storage is important for proposing innovative and efficient storage conditions. Challenge tests are usually performed using meat from the same batch, inoculated at high levels with one or a few strains. Such experiments do not reflect the true microbial situation, and the global ecosystem is not taken into account. Our purpose was to constitute live stocks of chicken meat microbiotas to create standard and reproducible ecosystems. We searched for the best method to collect contaminating bacterial communities from chicken cuts to store as frozen aliquots. We tested several methods to extract DNA from these stored communities for subsequent PCR amplification. We determined the best moment during the product shelf life to collect bacteria in sufficient amounts. Results showed that the rinsing method, combined with the Mobio DNA extraction kit, was the most reliable way to collect bacteria and obtain DNA for subsequent PCR amplification. Then, 23 different chicken meat microbiotas were collected using this procedure. Microbiota aliquots were stored at -80°C without important loss of viability. Their characterization by cultural methods confirmed the large variability (richness and abundance) of bacterial communities present on chicken cuts. Four of these bacterial communities were used to estimate their ability to regrow on meat matrices. Challenge tests performed on sterile matrices showed that these microbiotas were successfully inoculated and could overgrow the natural microbiota of chicken meat. They can therefore be used to perform reproducible challenge tests mimicking a true meat ecosystem, enabling the possibility to test the influence of various processing or storage conditions on complex meat …

  17. Modelling of reverberation enhancement systems

    OpenAIRE

    Rouch, Jeremy; Schmich, Isabelle; Galland, Marie-Annick

    2012-01-01

    Electroacoustic enhancement systems are increasingly specified by acoustic consultants to address requests for multi-purpose use of performance halls. However, there is still a lack of simple models to predict the effect induced by these systems on the acoustic field. Two models are introduced to establish the impulse responses of a room equipped with a reverberation enhancement system. These models are based on passive impulse responses according to the modified …

  18. MODELLING OF MATERIAL FLOW SYSTEMS

    OpenAIRE

    PÉTER TELEK

    2012-01-01

    Material flow systems are, in general, very complex processes. During the design, building, and operation of complex systems, many different problems arise. If these complex processes can be described by a simple model, the tasks become clearer, more adaptable, and easier to solve. As material flow systems are very varied, using models is an important aid in creating uniform methods and solutions. This paper shows the details of the application possibilities of modelling in the ma...

  19. Dynamic Modeling of ALS Systems

    Science.gov (United States)

    Jones, Harry

    2002-01-01

    The purpose of dynamic modeling and simulation of Advanced Life Support (ALS) systems is to help design them. Static steady state systems analysis provides basic information and is necessary to guide dynamic modeling, but static analysis is not sufficient to design and compare systems. ALS systems must respond to external input variations and internal off-nominal behavior. Buffer sizing, resupply scheduling, failure response, and control system design are aspects of dynamic system design. We develop two dynamic mass flow models and use them in simulations to evaluate systems issues, optimize designs, and make system design trades. One model is of nitrogen leakage in the space station, the other is of a waste processor failure in a regenerative life support system. Most systems analyses are concerned with optimizing the cost/benefit of a system at its nominal steady-state operating point. ALS analysis must go beyond the static steady state to include dynamic system design. All life support systems exhibit behavior that varies over time. ALS systems must respond to equipment operating cycles, repair schedules, and occasional off-nominal behavior or malfunctions. Biological components, such as bioreactors, composters, and food plant growth chambers, usually have operating cycles or other complex time behavior. Buffer sizes, material stocks, and resupply rates determine dynamic system behavior and directly affect system mass and cost. Dynamic simulation is needed to avoid the extremes of costly over-design of buffers and material reserves or system failure due to insufficient buffers and lack of stored material.
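
    A toy version of the buffer-sizing question raised above: a single consumable buffer under a scheduled processor failure, simulated as a dynamic mass balance. All numbers are illustrative, not taken from the ALS models.

        # One consumable buffer (say, stored water) under a scheduled
        # processor outage: does the chosen tank size ride out the failure?
        dt = 1.0                        # hours per step
        horizon = 24 * 14               # two weeks
        consumption = 10.0              # kg/h, crew demand
        production = 10.0               # kg/h, nominal processor output
        buffer_size = 600.0             # kg, candidate tank capacity
        level = buffer_size / 2.0       # start half full

        outage = range(24 * 5, 24 * 7)  # a 48-hour processor failure

        levels = []
        for t in range(horizon):
            prod = 0.0 if t in outage else production
            level += (prod - consumption) * dt
            level = min(max(level, 0.0), buffer_size)   # tank limits
            levels.append(level)

        print("minimum buffer level: %.0f kg" % min(levels))
        if min(levels) <= 0.0:
            print("buffer too small: demand unmet during the outage")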

  20. Modeling soft interface dominated systems

    NARCIS (Netherlands)

    Lamorgese, A.; Mauri, R.; Sagis, L.M.C.

    2017-01-01

    The two main continuum frameworks used for modeling the dynamics of soft multiphase systems are the Gibbs dividing surface model and the diffuse interface model. In the former, the interface is modeled as a two-dimensional surface, and excess properties such as a surface density or surface energy …

  1. Validation of systems biology models

    NARCIS (Netherlands)

    Hasdemir, D.

    2015-01-01

    The paradigm shift from qualitative to quantitative analysis of biological systems has brought a substantial number of modeling approaches to the stage of molecular biology research. These include, but are certainly not limited to, nonlinear kinetic models, static network models, and models obtained by the …

  2. From Numeric Models to Granular System Modeling

    Directory of Open Access Journals (Sweden)

    Witold Pedrycz

    2015-03-01

    To make this study self-contained, we briefly recall the key concepts of granular computing and demonstrate how this conceptual framework and its algorithmic fundamentals give rise to granular models. We discuss several representative formal setups used in describing and processing information granules, including fuzzy sets, rough sets, and interval calculus. Key architectures of models dwell upon relationships among information granules. We demonstrate how information granularity and its optimization can be regarded as an important design asset to be exploited in system modeling, giving rise to granular models. In this regard, an important category of rule-based models, along with their granular enrichments, is studied in detail.

  3. Coastal Modeling System Advanced Topics

    Science.gov (United States)

    2012-06-18

    Course materials for a Coastal Modeling System workshop (18-22 June 2012), covering debugging and problem solving, model calibration, and post-processing. The Coastal Modeling System (CMS) is an integrated wave, current, and morphology change model in the Surface-water Modeling System (SMS), operational at 10 …

  4. Safeguards system effectiveness modeling

    International Nuclear Information System (INIS)

    Bennett, H.A.; Boozer, D.D.; Chapman, L.D.; Daniel, S.L.; Engi, D.; Hulme, B.L.; Varnado, G.B.

    1976-01-01

    A general methodology for the comparative evaluation of physical protection system effectiveness at nuclear facilities is presently under development. The approach is applicable to problems of sabotage or theft at fuel cycle facilities. The overall methodology and the primary analytic techniques used to assess system effectiveness are briefly outlined

  5. Safeguards system effectiveness modeling

    International Nuclear Information System (INIS)

    Bennett, H.A.; Boozer, D.D.; Chapman, L.D.; Daniel, S.L.; Engi, D.; Hulme, B.L.; Varnado, G.B.

    1976-01-01

    A general methodology for the comparative evaluation of physical protection system effectiveness at nuclear facilities is presently under development. The approach is applicable to problems of sabotage or theft at fuel cycle facilities. In this paper, the overall methodology and the primary analytic techniques used to assess system effectiveness are briefly outlined

  6. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method, as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken and how it actually is conducted has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in the earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal …

  7. Immortalized keratinocytes derived from patients with epidermolytic ichthyosis reproduce the disease phenotype: a useful in vitro model for testing new treatments.

    Science.gov (United States)

    Chamcheu, J C; Pihl-Lundin, I; Mouyobo, C E; Gester, T; Virtanen, M; Moustakas, A; Navsaria, H; Vahlquist, A; Törmä, H

    2011-02-01

    Epidermolytic ichthyosis (EI) is a skin fragility disorder caused by mutations in genes encoding suprabasal keratins 1 and 10. While the aetiology of EI is known, model systems are needed for pathophysiological studies and development of novel therapies. To generate immortalized keratinocyte lines from patients with EI for studies of EI cell pathology and the effects of chemical chaperones as putative therapies. We derived keratinocytes from three patients with EI and one healthy control and established immortalized keratinocytes using human papillomavirus 16-E6/E7. Growth and differentiation characteristics, ability to regenerate organotypic epidermis, keratin expression, formation of cytoskeletal aggregates, and responses to heat shock and chemical chaperones were assessed. The cell lines EH11 (K1_p.Val176_Lys197del), EH21 (K10_p.156Arg>Gly), EH31 (K10_p.Leu161_Asp162del) and NKc21 (wild-type) currently exceed 160 population doublings and differentiate when exposed to calcium. At resting state, keratin aggregates were detected in 9% of calcium-differentiated EH31 cells, but not in any other cell line. Heat stress further increased this proportion to 30% and also induced aggregates in 3% of EH11 cultures. Treatment with trimethylamine N-oxide and 4-phenylbutyrate (4-PBA) reduced the fraction of aggregate-containing cells and affected the mRNA expression of keratins 1 and 10 while 4-PBA also modified heat shock protein 70 (HSP70) expression. Furthermore, in situ proximity ligation assay suggested a colocalization between HSP70 and keratins 1 and 10. Reconstituted epidermis from EI cells cornified but EH21 and EH31 cells produced suprabasal cytolysis, closely resembling the in vivo phenotype. These immortalized cell lines represent a useful model for studying EI biology and novel therapies. © 2011 The Authors. BJD © 2011 British Association of Dermatologists.

  8. European cold winter 2009-2010: How unusual in the instrumental record and how reproducible in the ARPEGE-Climat model?

    Science.gov (United States)

    Ouzeau, G.; Cattiaux, J.; Douville, H.; Ribes, A.; Saint-Martin, D.

    2011-06-01

    Boreal winter 2009-2010 made headlines for cold anomalies in many countries of the northern mid-latitudes. Northern Europe was severely hit by this harsh winter in line with a record persistence of the negative phase of the North Atlantic Oscillation (NAO). In the present study, we first provide a wider perspective on how unusual this winter was by using the recent 20th Century Reanalysis. A weather regime analysis shows that the frequency of the negative NAO was unprecedented since winter 1939-1940, which is then used as a dynamical analog of winter 2009-2010 to demonstrate that the latter might have been much colder without the background global warming observed during the twentieth century. We then use an original nudging technique in ensembles of global atmospheric simulations driven by observed sea surface temperature (SST) and radiative forcings to highlight the relevance of the stratosphere for understanding if not predicting such anomalous winter seasons. Our results demonstrate that an improved representation of the lower stratosphere is necessary to reproduce not only the seasonal mean negative NAO signal, but also its intraseasonal distribution and the corresponding increased probability of cold waves over northern Europe.

  9. Fluctuation-Driven Neural Dynamics Reproduce Drosophila Locomotor Patterns.

    Directory of Open Access Journals (Sweden)

    Andrea Maesani

    2015-11-01

    The neural mechanisms determining the timing of even simple actions, such as when to walk or rest, are largely mysterious. One intriguing, but untested, hypothesis posits a role for ongoing activity fluctuations in neurons of central action selection circuits that drive animal behavior from moment to moment. To examine how fluctuating activity can contribute to action timing, we paired high-resolution measurements of freely walking Drosophila melanogaster with data-driven neural network modeling and dynamical systems analysis. We generated fluctuation-driven network models whose outputs (locomotor bouts) matched those measured from sensory-deprived Drosophila. From these models, we identified those that could also reproduce a second, unrelated dataset: the complex time course of odor-evoked walking for genetically diverse Drosophila strains. Dynamical models that best reproduced both Drosophila basal and odor-evoked locomotor patterns exhibited specific characteristics. First, ongoing fluctuations were required. In a stochastic resonance-like manner, these fluctuations allowed neural activity to escape stable equilibria and to exceed a threshold for locomotion. Second, odor-induced shifts of equilibria in these models caused a depression in locomotor frequency following olfactory stimulation. Our models predict that activity fluctuations in action selection circuits cause behavioral output to more closely match sensory drive and may therefore enhance navigation in complex sensory environments. Together these data reveal how simple neural dynamics, when coupled with activity fluctuations, can give rise to complex patterns of animal behavior.
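
    The core mechanism described above (fluctuations letting activity escape a stable equilibrium and cross a locomotion threshold) can be sketched with a one-dimensional Langevin model of a double-well system. This is only a stand-in for the paper's data-driven network models, with invented parameters.

        import numpy as np

        rng = np.random.default_rng(1)
        dt, T = 1e-3, 200.0
        steps = int(T / dt)
        sigma = 0.5            # fluctuation strength
        threshold = 0.8        # activity above this counts as walking

        def drift(x):
            # Double-well: stable equilibria at x = -1 (rest) and x = +1 (walk)
            return x - x ** 3

        x = -1.0               # start at the rest equilibrium
        walking = np.empty(steps, dtype=bool)
        for i in range(steps):
            x += drift(x) * dt + sigma * np.sqrt(dt) * rng.normal()
            walking[i] = x > threshold

        # Bout statistics: contiguous supra-threshold episodes.
        starts = np.flatnonzero(walking[1:] & ~walking[:-1])
        print("locomotor bouts:", len(starts),
              "| fraction of time walking: %.2f" % walking.mean())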

  10. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m∙min-1 cutting speed and 0....

  11. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m•min−1 cutting speed and 0....

  12. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  13. Modelling of Indoor Environments Using Lindenmayer Systems

    Science.gov (United States)

    Peter, M.

    2017-09-01

    Documentation of the "as-built" state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. the extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter or data gaps as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Thus reconstruction is often supported by general rules for the perpendicularity and parallelism that are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules like symmetry and repetition of e.g. room sizes and corridor widths. In the context of the reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems - which originally have been developed for the computer-based modelling of plant growth - to model and reproduce the layout of indoor environments in 2D.
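
    A minimal Lindenmayer-system sketch in the spirit of the paper: parallel string rewriting generating a repetitive corridor-and-rooms layout. The alphabet and rules are invented for illustration (not the authors' grammar), and the geometric interpretation step (turtle graphics) is omitted.

        # Parallel string rewriting: every symbol is replaced simultaneously
        # in each iteration. Symbols: C corridor segment, r room slot,
        # R room, d door, [] a branch off the corridor.
        rules = {
            "C": "CrC",    # a corridor segment spawns a room slot and more corridor
            "r": "[Rd]",   # a room slot becomes a room with a door
        }

        def rewrite(axiom, rules, iterations):
            s = axiom
            for _ in range(iterations):
                s = "".join(rules.get(ch, ch) for ch in s)
            return s

        print(rewrite("C", rules, 3))
        # -> 'CrC[Rd]CrC[Rd]CrC[Rd]CrC': a corridor with repeated rooms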

  14. MODELLING OF INDOOR ENVIRONMENTS USING LINDENMAYER SYSTEMS

    Directory of Open Access Journals (Sweden)

    M. Peter

    2017-09-01

    Documentation of the “as-built” state of building interiors has gained a lot of interest in recent years. Various data acquisition methods exist, e.g. the extraction from photographed evacuation plans using image processing or, most prominently, indoor mobile laser scanning. Due to clutter or data gaps as well as errors during data acquisition and processing, automatic reconstruction of CAD/BIM-like models from these data sources is not a trivial task. Thus reconstruction is often supported by general rules for the perpendicularity and parallelism that are predominant in man-made structures. Indoor environments of large, public buildings, however, often also follow higher-level rules like symmetry and repetition of e.g. room sizes and corridor widths. In the context of the reconstruction of city elements (e.g. street networks) or building elements (e.g. façade layouts), formal grammars have been put to use. In this paper, we describe the use of Lindenmayer systems - which originally have been developed for the computer-based modelling of plant growth - to model and reproduce the layout of indoor environments in 2D.

  15. Mechanical Systems, Classical Models

    CERN Document Server

    Teodorescu, Petre P

    2007-01-01

    All phenomena in nature are characterized by motion; this is an essential property of matter, having infinitely many aspects. Motion can be mechanical, physical, chemical or biological, leading to various sciences of nature, mechanics being one of them. Mechanics deals with the objective laws of mechanical motion of bodies, the simplest form of motion. In the study of a science of nature, mathematics plays an important role. Mechanics was the first science of nature to be expressed in terms of mathematics, by considering various mathematical models associated with phenomena of the surrounding nature. Thus, its development was influenced by the use of a strong mathematical tool; on the other hand, we must observe that mechanics also influenced the introduction and development of many mathematical notions. In this respect, the guideline of the present book is precisely the mathematical model of mechanics. A special accent is put on the solving methodology as well as on the mathematical tools used: vectors, …

  16. Stochastic Modelling of Energy Systems

    DEFF Research Database (Denmark)

    Andersen, Klaus Kaae

    2001-01-01

    In this thesis, dynamic models of typical components in Danish heating systems are considered. Emphasis is placed on describing and evaluating mathematical methods for the identification of such models, and on the presentation of component models for practical applications. The thesis consists of seven research papers (case studies) together with a summary report. Each case study takes its starting point in typical heating system components, and both the applied mathematical modelling methods and the application aspects are considered. The summary report gives an introduction to the scope of component models, such as heat exchanger and valve models, adequate for system simulations. Furthermore, the thesis demonstrates and discusses the advantages and disadvantages of using statistical methods in conjunction with physical knowledge in establishing adequate component models of heating systems.

  17. ITK: enabling reproducible research and open science

    Science.gov (United States)

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  18. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  19. Model Reduction of Hybrid Systems

    DEFF Research Database (Denmark)

    Shaker, Hamid Reza

    High-technological solutions of today are characterized by complex dynamical models. A lot of these models have an inherent hybrid/switching structure. Hybrid/switched systems are powerful models for distributed embedded systems design, where discrete controls are applied to continuous processes. … A method for model reduction of switched systems is based on the switching generalized gramians; with this method, the reduced-order switched system is guaranteed to be stable for all switching signals. This framework uses stability conditions based on switching quadratic Lyapunov functions, which are less conservative than stability conditions based on common quadratic Lyapunov functions. The stability conditions used in this method are very useful in model reduction and design problems because they contain slack variables. Similar conditions hold for a class of switched nonlinear …
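
    The gramian-based reduction mentioned above builds on classical balanced truncation. The sketch below shows only the plain (non-switched) LTI case, which the switching generalized gramians extend; the system matrices are random stand-ins, not the thesis's examples.

        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

        rng = np.random.default_rng(0)
        n, k = 6, 2                                     # full / reduced order
        A = -np.eye(n) + 0.3 * rng.normal(size=(n, n))  # a stable test system
        assert np.all(np.linalg.eigvals(A).real < 0)    # verify stability
        B = rng.normal(size=(n, 1))
        C = rng.normal(size=(1, n))

        # Gramians: A P + P A' = -B B'  and  A' Q + Q A = -C' C
        P = solve_continuous_lyapunov(A, -B @ B.T)
        Q = solve_continuous_lyapunov(A.T, -C.T @ C)
        P, Q = (P + P.T) / 2, (Q + Q.T) / 2             # symmetrize numerically

        # Balancing transformation from Cholesky factors and an SVD.
        Lp, Lq = cholesky(P, lower=True), cholesky(Q, lower=True)
        U, s, Vt = svd(Lq.T @ Lp)                       # s: Hankel singular values
        T = Lp @ Vt.T @ np.diag(s ** -0.5)
        Tinv = np.diag(s ** -0.5) @ U.T @ Lq.T

        Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T      # balanced realization
        Ar, Br, Cr = Ab[:k, :k], Bb[:k], Cb[:, :k]      # keep k dominant states
        print("Hankel singular values:", np.round(s, 4))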

  20. Evaluation of Land Surface Models in Reproducing Satellite-Derived LAI over the High-Latitude Northern Hemisphere. Part I: Uncoupled DGVMs

    Directory of Open Access Journals (Sweden)

    Ning Zeng

    2013-10-01

    Leaf Area Index (LAI) represents the total surface area of leaves above a unit area of ground and is a key variable in any vegetation model, as well as in climate models. New high resolution LAI satellite data are now available covering a period of several decades. This provides a unique opportunity to validate LAI estimates from multiple vegetation models. The objective of this paper is to compare new, satellite-derived LAI measurements with modeled output for the Northern Hemisphere. We compare monthly LAI output from eight land surface models from the TRENDY compendium with satellite data from an Artificial Neural Network (ANN) applied to the latest version (third generation) of GIMMS AVHRR NDVI data over the period 1986–2005. Our results show that all the models overestimate the mean LAI, particularly over the boreal forest. We also find that seven out of the eight models overestimate the length of the active vegetation-growing season, mostly due to a late dormancy as a result of a late summer phenology. Finally, we find that the models report a much larger positive trend in LAI over this period than the satellite observations suggest, which translates into a higher trend in the growing season length. These results highlight the need to incorporate a larger number of more accurate plant functional types in all models and, in particular, to improve the phenology of deciduous trees.

  1. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using the programs can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and includes many num…
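
    A flavor of the kind of calculation such programs automate: the ladder (backward/forward sweep) technique on a toy single-phase radial feeder with one constant-power load. All values are illustrative, not examples from the book.

        import numpy as np

        V_source = 7200 + 0j              # V, substation bus voltage
        Z_line = 0.3 + 0.6j               # ohm, series line impedance
        S_load = 1.5e6 + 0.75e6j          # VA, constant-PQ load

        V_load = V_source                 # flat start
        for _ in range(20):
            I_load = np.conj(S_load / V_load)    # backward sweep: load current
            V_load = V_source - Z_line * I_load  # forward sweep: new voltage
        print("load voltage: %.1f V at %.2f deg"
              % (abs(V_load), np.degrees(np.angle(V_load))))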

  2. Hydronic distribution system computer model

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, J.W.; Strasser, J.J.

    1994-10-01

    A computer model of a hot-water boiler and its associated hydronic thermal distribution loop has been developed at Brookhaven National Laboratory (BNL). It is intended to be incorporated as a submodel in a comprehensive model of residential-scale thermal distribution systems developed at Lawrence Berkeley. This will give the combined model the capability of modeling forced-air and hydronic distribution systems in the same house using the same supporting software. This report describes the development of the BNL hydronics model, initial results and internal consistency checks, and its intended relationship to the LBL model. A method of interacting with the LBL model that does not require physical integration of the two codes is described. This will provide capability now, with reduced up-front cost, as long as the number of runs required is not large.

  3. Simulation and modeling of systems of systems

    National Research Council Canada - National Science Library

    Cantot, Pascal; Luzeaux, Dominique

    2011-01-01


  4. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes … models derived from the quantified attributes predict overall preference well. The findings allow for some generalizations within musical program genres regarding the perception of and preference for certain spatial reproduction modes, but for limited generalizations across selections from different musical genres.

  5. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    Full Text Available The present study deals with the properties of five different metals/alloys (Al-12Si, Cu-10Sn and 316L: face-centered cubic structure; CoCrMo and commercially pure Ti (CP-Ti): hexagonal close-packed structure) fabricated by selective laser melting. The room-temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (of ~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  6. Intrusion detection: systems and models

    Science.gov (United States)

    Sherif, J. S.; Dearmond, T. G.

    2002-01-01

    This paper puts forward a review of the state of the art and the applicability of intrusion detection systems and models. It also presents a classification of the literature pertaining to intrusion detection.

  7. Plant and safety system model

    International Nuclear Information System (INIS)

    Beltracchi, Leo

    1999-01-01

    The design and development of a digital computer-based safety system for a nuclear power plant is a complex process. The process of design and product development must result in a final product free of critical errors; operational safety of nuclear power plants must not be compromised. This paper focuses on the development of a safety system model to assist designers, developers, and regulators in establishing and evaluating requirements for a digital computer-based safety system. The model addresses hardware, software, and human elements for use in the requirements definition process. The purpose of the safety system model is to assist and serve as a guide to humans in the cognitive reasoning process of establishing requirements. The goals in the use of the model are to: (1) enhance the completeness of the requirements and (2) reduce the number of errors associated with the requirements definition phase of a project

  8. Mathematical modeling of aeroelastic systems

    Science.gov (United States)

    Velmisov, Petr A.; Ankilov, Andrey V.; Semenova, Elizaveta P.

    2017-12-01

    In the paper, the stability of elastic elements of a class of structures interacting with a gas or liquid flow is investigated. The definition of stability of an elastic body corresponds to Lyapunov's concept of stability of dynamical systems. As examples, mathematical models of flow channels (models of vibration devices) in subsonic flow and mathematical models of a protective surface in supersonic flow are considered. The models are described by coupled systems of partial differential equations. An analytic investigation of stability is carried out on the basis of the construction of Lyapunov-type functionals; a numerical investigation is carried out on the basis of the Galerkin method. Various models of the gas-liquid medium (compressible, incompressible) and various models of a deformable body (linearly elastic and nonlinearly elastic) are considered.

  9. How well do environmental archives of atmospheric mercury deposition in the Arctic reproduce rates and trends depicted by atmospheric models and measurements?

    Science.gov (United States)

    Goodsite, M E; Outridge, P M; Christensen, J H; Dastoor, A; Muir, D; Travnikov, O; Wilson, S

    2013-05-01

    This review compares the reconstruction of atmospheric Hg deposition rates and historical trends over recent decades in the Arctic, inferred from Hg profiles in natural archives such as lake and marine sediments, peat bogs and glacial firn (permanent snowpack), against those predicted by three state-of-the-art atmospheric models based on global Hg emission inventories from 1990 onwards. Model veracity was first tested against atmospheric Hg measurements. Most of the natural archive and atmospheric data came from the Canadian-Greenland sectors of the Arctic, whereas spatial coverage was poor in other regions. In general, for the Canadian-Greenland Arctic, models provided good agreement with atmospheric gaseous elemental Hg (GEM) concentrations and trends measured instrumentally. However, there are few instrumented deposition data with which to test the model estimates of Hg deposition, and these data suggest models over-estimated deposition fluxes under Arctic conditions. Reconstructed GEM data from glacial firn on Greenland Summit showed the best agreement with the known decline in global Hg emissions after about 1980, and were corroborated by archived aerosol filter data from Resolute, Nunavut. The relatively stable or slowly declining firn and model GEM trends after 1990 were also corroborated by real-time instrument measurements at Alert, Nunavut, after 1995. However, Hg fluxes and trends in northern Canadian lake sediments and a southern Greenland peat bog did not exhibit good agreement with model predictions of atmospheric deposition since 1990, the Greenland firn GEM record, direct GEM measurements, or trends in global emissions since 1980. Various explanations are proposed to account for these discrepancies between atmosphere and archives, including problems with the accuracy of archive chronologies, climate-driven changes in Hg transfer rates from air to catchments, waters and subsequently into sediments, and post-depositional diagenesis in peat bogs

  10. ABSTRACT MODELS FOR SYSTEM VIRTUALIZATION

    Directory of Open Access Journals (Sweden)

    M. G. Koveshnikov

    2015-05-01

    Full Text Available The paper is dedicated to securing system objects (system files and user, system or application configuration files) against unauthorized access, including denial-of-service attacks. We suggest a method and develop abstract system virtualization models, which are used to research attack scenarios for different virtualization modes. An estimate of the effectiveness of the system-tool virtualization technology is given. The suggested technology is based on redirecting access requests to system objects shared among access subjects. Whole and partial system virtualization modes have been modeled. The difference between them is the following: in the whole virtualization mode, copies of all accessed system objects are created and subjects' requests are redirected to them, including the corresponding application objects; in the partial virtualization mode, copies are created only for part of the system, for example, only system objects for applications. The effectiveness of alternative solutions is evaluated for different attack scenarios. We consider a proprietary, approved technical solution which implements the system virtualization method for the Microsoft Windows OS family. The administrative simplicity and capabilities of correspondingly designed system-object security tools are illustrated with this example. The practical significance of the suggested security method has been confirmed.

  11. Aerodynamic and Mechanical System Modelling

    DEFF Research Database (Denmark)

    Jørgensen, Martin Felix

    This thesis deals with mechanical multibody-systems applied to the drivetrain of a 500 kW wind turbine. Particular focus has been on gearbox modelling of wind turbines. The main part of the present project involved programming multibody systems to investigate the connection between forces, moments...

  12. Chow-Liu trees are sufficient predictive models for reproducing key features of functional networks of periictal EEG time-series.

    Science.gov (United States)

    Steimer, Andreas; Zubler, Frédéric; Schindler, Kaspar

    2015-09-01

    Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20-30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks however are purely descriptive models and thus are conceptually unable to predict fundamental features of iEEG time-series, e.g., in the context of therapeutical brain stimulation. In this paper we present some first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time-series. More specifically, we learn distinct graphical models (so-called Chow-Liu (CL) trees) as models for the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks, based on thresholding of the absolute value Pearson correlation coefficient (CC) matrix. Using various measures, the thus obtained networks are then compared to those which were derived in the classical way from the empirical CC-matrix. In the high threshold limit we find (a) an excellent agreement between the two networks and (b) key features of periictal networks as they have previously been reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation, by setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL-trees as simple, spatially predictive models for periictal iEEG data. Moreover, we suggest straightforward generalizations of the CL approach for modeling also the temporal features of iEEG signals.
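
    A minimal sketch of the Chow-Liu idea applied to multichannel signals, under the simplifying assumption of Gaussian marginals (so pairwise mutual information reduces to -0.5·log(1 - r²)); the function names and the thresholding step are illustrative, not the authors' implementation:

```python
# Sketch: Chow-Liu tree from pairwise mutual information, plus the classical
# functional network obtained by thresholding the absolute CC matrix.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_edges(signals):
    """signals: array of shape (n_channels, n_samples)."""
    r = np.corrcoef(signals)                        # empirical CC matrix
    mi = -0.5 * np.log(1.0 - np.clip(r ** 2, 0.0, 0.999999))
    w = mi.max() - mi + 1e-9                        # invert: max-MI tree == MST
    np.fill_diagonal(w, 0.0)                        # zero entries = no self-edges
    tree = minimum_spanning_tree(w)                 # Chow-Liu tree structure
    rows, cols = tree.nonzero()
    return list(zip(rows.tolist(), cols.tolist())), r

def functional_network(r, threshold=0.7):
    """Classical functional network: threshold the absolute CC matrix."""
    adj = np.abs(r) >= threshold
    np.fill_diagonal(adj, False)
    return adj
```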

  13. A Neuron Model Based Ultralow Current Sensor System for Bioapplications

    Directory of Open Access Journals (Sweden)

    A. K. M. Arifuzzman

    2016-01-01

    Full Text Available An ultralow current sensor system based on the Izhikevich neuron model is presented in this paper. The Izhikevich neuron model has been used for its superior computational efficiency and greater biological plausibility over other well-known neuron spiking models. Of the many biological neuron spiking features, regular spiking, chattering, and neostriatal spiny projection spiking have been reproduced by adjusting the parameters associated with the model at hand. This paper also presents a modified interpretation of the regular spiking feature, in which the firing pattern is similar to that of regular spiking but offers an improved dynamic range. The sensor current ranges between 2 pA and 8 nA and exhibits linearity in the range of 0.9665 to 0.9989 for different spiking features. The efficacy of the sensor system in detecting low amounts of current, along with its high linearity, makes it very suitable for biomedical applications.
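
    The Izhikevich model itself is compact enough to sketch. The following integration uses the published "regular spiking" parameters (a, b, c, d) = (0.02, 0.2, -65, 8); the input current and step size are illustrative choices, not the sensor circuit's operating point:

```python
# Sketch of the Izhikevich neuron model (regular spiking parameters).
import numpy as np

def izhikevich(I=10.0, dt=0.5, t_max=1000.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    n = int(t_max / dt)
    v, u = -65.0, b * -65.0            # membrane potential and recovery variable
    spikes, trace = [], np.empty(n)
    for i in range(n):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                  # spike cutoff and reset
            trace[i] = 30.0
            v, u = c, u + d
            spikes.append(i * dt)
        else:
            trace[i] = v
    return trace, spikes
```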

  14. Adjustments in the Almod 3W2 code models for reproducing the net load trip test in Angra I nuclear power plant

    International Nuclear Information System (INIS)

    Camargo, C.T.M.; Madeira, A.A.; Pontedeiro, A.C.; Dominguez, L.

    1986-09-01

    The traces recorded during the net load trip test at the Angra I NPP provided the opportunity to make fine adjustments to the ALMOD 3W2 code models. The changes are described and the results are compared against real plant data. (Author) [pt

  15. Preface: the hydra model system.

    Science.gov (United States)

    Galliot, Brigitte

    2012-01-01

    The freshwater Hydra polyp emerged as a model system in 1741, when Abraham Trembley not only discovered its amazing regenerative potential, but also demonstrated that experimental manipulations pave the way to research in biology. Since then, Hydra has flourished as a potent and fruitful model system to help answer questions linked to cell and developmental biology, such as the setting up of an organizer to regenerate a complex missing structure, the establishment and maintenance of polarity in a multicellular organism, the development of mathematical models to explain the robust developmental rules observed in this animal, the maintenance of stemness and multipotency in a highly dynamic environment, and the plasticity of differentiated cells, to name but a few. However, the Hydra model system is not restricted to cell and developmental biology; during the past 270 years it has also been heavily used to investigate the relationships between Hydra and its environment, opening new horizons concerning neurophysiology, innate immunity, ecosystems, ecotoxicology, symbiosis...

  16. Modeling Multi-Level Systems

    CERN Document Server

    Iordache, Octavian

    2011-01-01

    This book is devoted to the modeling of multi-level complex systems, a challenging domain for engineers, researchers and entrepreneurs confronted with the transition from learning and adaptability to evolvability and autonomy for technologies, devices and problem-solving methods. Chapter 1 introduces multi-scale and multi-level systems and highlights their presence in different domains of science and technology. Methodologies such as random systems, non-Archimedean analysis and category theory, and specific techniques such as model categorification and integrative closure, are presented in chapter 2. Chapters 3 and 4 describe polystochastic models, PSM, and their developments. The categorical formulation of integrative closure offers the general PSM framework, which serves as a flexible guideline for a large variety of multi-level modeling problems. Focusing on chemical engineering, pharmaceutical and environmental case studies, chapters 5 to 8 analyze mixing, turbulent dispersion and entropy production for multi-scale sy...

  17. Analog VLSI-based modeling of the primate oculomotor system.

    Science.gov (United States)

    Horiuchi, T K; Koch, C

    1999-01-01

    One way to understand a neurobiological system is by building a simulacrum that replicates its behavior in real time using similar constraints. Analog very large-scale integrated (VLSI) electronic circuit technology provides such an enabling technology. We here describe a neuromorphic system that is part of a long-term effort to understand the primate oculomotor system. It requires both fast sensory processing and fast motor control to interact with the world. A one-dimensional hardware model of the primate eye has been built that simulates the physical dynamics of the biological system. It is driven by two different analog VLSI chips, one mimicking cortical visual processing for target selection and tracking and another modeling brain stem circuits that drive the eye muscles. Our oculomotor plant demonstrates both smooth pursuit movements, driven by a retinal velocity error signal, and saccadic eye movements, controlled by retinal position error, and can reproduce several behavioral, stimulation, lesion, and adaptation experiments performed on primates.

  18. An International Ki67 Reproducibility Study

    Science.gov (United States)

    2013-01-01

    Background In breast cancer, immunohistochemical assessment of proliferation using the marker Ki67 has potential use in both research and clinical management. However, lack of consistency across laboratories has limited Ki67's value. A working group was assembled to devise a strategy to harmonize Ki67 analysis and increase scoring concordance. Toward that goal, we conducted a Ki67 reproducibility study. Methods Eight laboratories received 100 breast cancer cases arranged into 1-mm core tissue microarrays—one set stained by the participating laboratory and one set stained by the central laboratory, both using antibody MIB-1. Each laboratory scored Ki67 as the percentage of positively stained invasive tumor cells using its own method. Six laboratories repeated scoring of 50 locally stained cases on 3 different days. Sources of variation were analyzed using random effects models with log2-transformed measurements. Reproducibility was quantified by the intraclass correlation coefficient (ICC), and approximate two-sided 95% confidence intervals (CIs) for the true intraclass correlation coefficients in these experiments were provided. Results Intralaboratory reproducibility was high (ICC = 0.94; 95% CI = 0.93 to 0.97). Interlaboratory reproducibility was only moderate (central staining: ICC = 0.71, 95% CI = 0.47 to 0.78; local staining: ICC = 0.59, 95% CI = 0.37 to 0.68). The geometric mean of Ki67 values for each laboratory across the 100 cases ranged from 7.1% to 23.9% with central staining and from 6.1% to 30.1% with local staining. Factors contributing to interlaboratory discordance included tumor region selection, counting method, and subjective assessment of staining positivity. Formal counting methods gave more consistent results than visual estimation. Conclusions Substantial variability in Ki67 scoring was observed among some of the world's most experienced laboratories. Ki67 values and cutoffs for clinical decision-making cannot be transferred between laboratories without
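
    For readers unfamiliar with the reproducibility metric, a one-way random-effects ICC on log2-transformed scores can be computed as sketched below; this is a simplified stand-in for the paper's random-effects models, not the authors' exact analysis:

```python
# Sketch: one-way random-effects ICC(1,1) on log2-transformed Ki67 scores.
# ratings: array of shape (n_cases, n_repeats); values are percentages > 0.
import numpy as np

def icc_oneway(ratings):
    y = np.log2(ratings)
    n, k = y.shape
    grand = y.mean()
    ms_between = k * np.sum((y.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((y - y.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```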

  19. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow-motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
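
    The slow-motion smoothing step lends itself to a short sketch: a Lévy-stable fractional diffusion filter applied in Fourier space attenuates high frequencies as exp(-t·|ξ|^(2β)). The parameter values below are illustrative assumptions, not Carasso's:

```python
# Sketch: fractional diffusion smoothing of an image via the FFT.
import numpy as np

def fractional_diffusion_smooth(img, beta=0.5, t=0.1):
    ny, nx = img.shape
    ky = np.fft.fftfreq(ny)[:, None]
    kx = np.fft.fftfreq(nx)[None, :]
    xi2 = (2 * np.pi) ** 2 * (kx ** 2 + ky ** 2)   # |xi|^2 on the FFT grid
    kernel = np.exp(-t * xi2 ** beta)              # Levy-stable attenuation
    return np.real(np.fft.ifft2(np.fft.fft2(img) * kernel))

# A suite of progressively smoother images provides the audit trail, e.g.:
# suite = [fractional_diffusion_smooth(img, t=k * 0.05) for k in range(1, 6)]
```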

  20. Systematic Methodology for Reproducible Optimizing Batch Operation

    DEFF Research Database (Denmark)

    Bonné, Dennis; Jørgensen, Sten Bay

    2006-01-01

    This contribution presents a systematic methodology for rapid acquisition of discrete-time state space model representations of batch processes based on their historical operation data. These state space models are parsimoniously parameterized as a set of local, interdependent models. The contribution furthermore presents how the asymptotic convergence of Iterative Learning Control is combined with the closed-loop performance of Model Predictive Control to form a robust and asymptotically stable optimal controller for ensuring reliable and reproducible operation of batch processes. This controller may also be used for optimizing control. The modeling and control performance is demonstrated on a fed-batch protein cultivation example. The presented methodologies lend themselves directly to application as Process Analytical Technologies (PAT).

  1. System Convergence in Transport Modelling

    DEFF Research Database (Denmark)

    Rich, Jeppe; Nielsen, Otto Anker; Cantarella, Guilio E.

    2010-01-01

    A fundamental premise of most applied transport models is the existence and uniqueness of an equilibrium solution that balances demand x(t) and supply t(x). The demand consists of the people that travel in the transport system and on the defined network, whereas the supply consists of the resulting level-of-service attributes (e.g., travel time and cost) offered to travellers. An important source of complexity is congestion, which causes increasing demand to affect travel time in a non-linear way. Transport models most often involve separate models for traffic assignment and demand modelling...

  2. Nutrient cycle benchmarks for earth system land model

    Science.gov (United States)

    Zhu, Q.; Riley, W. J.; Tang, J.; Zhao, L.

    2017-12-01

    Projecting future biosphere-climate feedbacks using Earth system models (ESMs) relies heavily on robust modeling of land surface carbon dynamics. More importantly, soil nutrient (particularly, nitrogen (N) and phosphorus (P)) dynamics strongly modulate carbon dynamics, such as plant sequestration of atmospheric CO2. Prevailing ESM land models all consider nitrogen as a potentially limiting nutrient, and several consider phosphorus. However, including nutrient cycle processes in ESM land models potentially introduces large uncertainties that could be identified and addressed by improved observational constraints. We describe the development of two nutrient cycle benchmarks for ESM land models: (1) nutrient partitioning between plants and soil microbes inferred from 15N and 33P tracers studies and (2) nutrient limitation effects on carbon cycle informed by long-term fertilization experiments. We used these benchmarks to evaluate critical hypotheses regarding nutrient cycling and their representation in ESMs. We found that a mechanistic representation of plant-microbe nutrient competition based on relevant functional traits best reproduced observed plant-microbe nutrient partitioning. We also found that for multiple-nutrient models (i.e., N and P), application of Liebig's law of the minimum is often inaccurate. Rather, the Multiple Nutrient Limitation (MNL) concept better reproduces observed carbon-nutrient interactions.
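
    The difference between Liebig's law of the minimum and the MNL concept is easy to make concrete; the toy functions below (with made-up limitation factors) show how the two diverge when both nutrients are scarce:

```python
# Toy comparison of Liebig's law of the minimum vs. a multiplicative
# multiple-nutrient-limitation (MNL) form; scalings are illustrative only.
def liebig(gpp_potential, f_n, f_p):
    return gpp_potential * min(f_n, f_p)    # only the scarcest nutrient limits

def mnl(gpp_potential, f_n, f_p):
    return gpp_potential * f_n * f_p        # co-limitation by both nutrients

# With f_n = 0.8 and f_p = 0.9, Liebig gives 0.80 of the potential flux,
# while MNL gives 0.72 -- the forms diverge whenever both nutrients are scarce.
```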

  3. Prediction of lung tumour position based on spirometry and on abdominal displacement: Accuracy and reproducibility

    International Nuclear Information System (INIS)

    Hoisak, Jeremy D.P.; Sixel, Katharina E.; Tirona, Romeo; Cheung, Patrick C.F.; Pignol, Jean-Philippe

    2006-01-01

    Background and purpose: A simulation investigating the accuracy and reproducibility of a tumour motion prediction model over clinical time frames is presented. The model is formed from surrogate and tumour motion measurements, and used to predict the future position of the tumour from surrogate measurements alone. Patients and methods: Data were acquired from five non-small cell lung cancer patients, on 3 days. Measurements of respiratory volume by spirometry and abdominal displacement by a real-time position tracking system were acquired simultaneously with X-ray fluoroscopy measurements of superior-inferior tumour displacement. A model of tumour motion was established and used to predict future tumour position, based on surrogate input data. The calculated position was compared against true tumour motion as seen on fluoroscopy. Three different imaging strategies, pre-treatment, pre-fraction and intrafractional imaging, were employed in establishing the fitting parameters of the prediction model. The impact of each imaging strategy upon accuracy and reproducibility was quantified. Results: When establishing the predictive model using pre-treatment imaging, four of five patients exhibited poor interfractional reproducibility for either surrogate in subsequent sessions. Simulating the formulation of the predictive model prior to each fraction resulted in improved interfractional reproducibility. The accuracy of the prediction model was only improved in one of five patients when intrafractional imaging was used. Conclusions: Employing a prediction model established from measurements acquired at planning resulted in localization errors. Pre-fractional imaging improved the accuracy and reproducibility of the prediction model. Intrafractional imaging was of less value, suggesting that the accuracy limit of a surrogate-based prediction model is reached with once-daily imaging
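
    A minimal sketch of a surrogate-based prediction model of the kind described: fit the tumour's superior-inferior position as a linear function of the two surrogates during imaging, then predict from the surrogates alone. The variable names and the linear form are illustrative assumptions, not the study's exact model:

```python
# Sketch: linear surrogate model for superior-inferior tumour position.
import numpy as np

def fit_model(volume, abdomen, tumour_si):
    """volume: spirometry signal; abdomen: displacement; tumour_si: fluoroscopy."""
    X = np.column_stack([volume, abdomen, np.ones_like(volume)])
    coef, *_ = np.linalg.lstsq(X, tumour_si, rcond=None)
    return coef

def predict(coef, volume, abdomen):
    return coef[0] * volume + coef[1] * abdomen + coef[2]
```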

  4. Repeatability and reproducibility of Population Viability Analysis (PVA) and the implications for threatened species management

    Directory of Open Access Journals (Sweden)

    Clare Morrison

    2016-08-01

    Full Text Available Conservation triage focuses on prioritizing species, populations or habitats based on urgency, biodiversity benefits, recovery potential as well as cost. Population Viability Analysis (PVA) is frequently used in population-focused conservation prioritizations. The critical nature of many of these management decisions requires that PVA models are repeatable and reproducible to reliably rank species and/or populations quantitatively. This paper assessed the repeatability and reproducibility of a subset of previously published PVA models. We attempted to rerun baseline models from 90 publicly available PVA studies published between 2000 and 2012 using the two most common PVA modelling software programs, VORTEX and RAMAS-GIS. Forty percent (n = 36) failed, 50% (n = 45) were both repeatable and reproducible, and 10% (n = 9) had missing baseline models. Repeatability was not linked to taxa, IUCN category, PVA program version used, year published or the quality of publication outlet, suggesting that the problem is systemic within the discipline. Complete and systematic presentation of PVA parameters and results is needed to ensure that the scientific input into conservation planning is both robust and reliable, thereby increasing the chances of making decisions that are both beneficial and defensible. The implications for conservation triage may be far-reaching if population viability models cannot be reproduced with confidence, thus undermining their intended value.

  5. Executive Information Systems' Multidimensional Models

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available Executive Information Systems are designed to improve the quality of strategic-level management in an organization through a new type of technology and several techniques for extracting, transforming, processing, integrating and presenting data in such a way that the organizational knowledge filters can easily associate with this data and turn it into information for the organization. These technologies are known as Business Intelligence tools. In order to build analytic reports for Executive Information Systems (EIS) in an organization, we need to design a multidimensional model based on the organization's business model. This paper presents some multidimensional models that can be used in EIS development and proposes a new model that is suitable for strategic business requests.

  6. Transposition of a Process-Based Model, Flumy: from Meandering Fluvial Systems to Channelized Turbidite Systems

    Science.gov (United States)

    Lemay, M.; Cojan, I.; Rivoirard, J.; Grimaud, J. L.; Ors, F.

    2017-12-01

    Channelized turbidite systems are among the most important hydrocarbon reservoirs, yet building realistic turbidite reservoir models is still a challenge. Flumy was first developed to simulate the long-term evolution of aggrading meandering fluvial systems in order to build facies reservoir models. In this study, Flumy has been transposed to channelized turbidite systems. The channel migration linear model of Imran et al. (1999), dedicated to subaqueous flows, has been implemented. The whole model has been calibrated taking into account the differences in channel morphology, avulsion frequency, and aggradation and migration rates. This calibration and the comparison of the model to natural systems rely on: i) the channel planform morphology, characterized by the meander wavelength, amplitude, and sinuosity; ii) the channel trajectory and the resulting stratigraphic architecture, described using the indexes of Jobe et al. (2016). Flumy succeeds in reproducing turbidite channel planform morphology, as shown by the mean sinuosity of 1.7 and the wavelength-to-width and amplitude-to-width ratios of around 4 and 1, respectively. First-order meander architecture, characterized by the ratios of meander belt width to channel width and meander belt thickness to channel depth, and the deduced stratigraphic mobility number (the ratio between lateral and vertical channel displacements), is also well reproduced: 2.5, 3.8, and 0.6, respectively. Both lateral and downstream normalized channel migrations are around 3.5 times lower than in fluvial systems. All these values are coherent with the observations. On the other hand, the channel trajectory observed on seismic cross-sections (hockey-stick geometry) is not fully reproduced: the local stratigraphic mobility number is divided upward by 3, whereas a factor of up to 10 is expected. This behavior is generally explained in the literature by an increasing aggradation rate through time and/or flow stripping at the outer bend that decreases...

  7. Numerical Modeling of Microelectrochemical Systems

    DEFF Research Database (Denmark)

    Adesokan, Bolaji James

    ...for the reactants in the bulk electrolyte that are traveling waves. The first paper presents the mathematical model which describes an electrochemical system and simulates an electroanalytical technique called cyclic voltammetry. The model is governed by a system of advection–diffusion equations with a nonlinear reaction term at the boundary. We investigate the effect of flow rates, scan rates, and concentration on the cyclic voltammetry. We establish that high flow rates lead to reduced hysteresis in the cyclic voltammetry curves and that increasing scan rates lead to more pronounced current peaks. The final part of the paper shows that the response current in a cyclic voltammetry increases proportionally to the electrolyte concentration. In the second paper we present an experiment of an electrochemical system in a microfluidic system and compare the result to the numerical solutions. We investigate how the position...

  8. System Code Models and Capabilities

    International Nuclear Information System (INIS)

    Bestion, D.

    2008-01-01

    System thermalhydraulic codes such as RELAP, TRACE, CATHARE or ATHLET are now commonly used for reactor transient simulations. The whole methodology of code development is described, including the derivation of the system of equations, the analysis of experimental data to obtain closure relations, and the validation process. The characteristics of the models are briefly presented, starting with the basic assumptions, the system of equations and the derivation of closure relationships. Extensive work was devoted during the last three decades to the improvement and validation of these models, which resulted in some homogenisation of the different codes although they were separately developed. The so-called two-fluid model is the common basis of these codes, and it is shown how it can describe both thermal and mechanical nonequilibrium. A review of some important physical models illustrates the main capabilities and limitations of system codes. Attention is drawn to the role of flow regime maps, the various methods for developing closure laws, and the role of interfacial area and turbulence in interfacial and wall transfers. More details are given for interfacial friction laws and their relation with drift flux models. Prediction of choked flow and counter-current flow limitation (CCFL) is also addressed. Based on some limitations of the present generation of codes, perspectives for the future are drawn.

  9. Experimental Modeling of Dynamic Systems

    DEFF Research Database (Denmark)

    Knudsen, Morten Haack

    2006-01-01

    An engineering course, Simulation and Experimental Modeling, has been developed that is based on a method for direct estimation of physical parameters in dynamic systems. Compared with classical system identification, the method appears to be easier to understand, apply, and combine with physical insight. It is based on a sensitivity approach that is useful for choice of model structure, for experiment design, and for accuracy verification. The method is implemented in the Matlab toolkit Senstools. The method and the presentation have been developed with generally preferred learning styles in mind...

  10. Compositional Modeling of Biological Systems

    OpenAIRE

    Zámborszky, Judit

    2010-01-01

    Molecular interactions are wired in a fascinating way, resulting in complex behavior of biological systems. Theoretical modeling provides us a useful framework for understanding the dynamics and the function of such networks. The complexity of biological systems calls for conceptual tools that manage the combinatorial explosion of the set of possible interactions. A suitable conceptual tool to attack complexity is compositionality, already successfully used in the process algebra field ...

  11. The ternary sorption system U(VI)-phosphate-silica explained by spectroscopy and thermodynamic modelling

    Energy Technology Data Exchange (ETDEWEB)

    Foerstendorf, Harald; Stockmann, Madlen; Heim, Karsten; Mueller, Katharina; Brendler, Vinzenz [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Surface Processes; Comarmond, M.J.; Payne, T.E. [Australian Nuclear Science and Technology Organisation, Lucas Heights (Australia); Steudtner, Robin [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Inst. of Resource Ecology

    2017-06-01

    Spectroscopic data of sorption processes can potentially have a direct impact on Surface Complexation Modelling (SCM) approaches. Based on spectroscopic data for the ternary sorption system U(VI)/phosphate/silica, which strongly suggest the formation of a precipitate as the predominant surface process, SCM calculations accurately reproduced results from classical batch experiments.

  12. b-c system approach to minimal models. The genus-zero case

    International Nuclear Information System (INIS)

    Bonora, L.; Matone, M.; Toppan, F.; Wu Ke

    1989-01-01

    We present a method based on real-weight b-c systems to study conformal minimal models. In particular, it reproduces the results of the Coulomb gas approach, while shedding light on the topological origin of the charge at infinity. (orig.)

  13. The ternary sorption system U(VI)-phosphate-silica explained by spectroscopy and thermodynamic modelling

    International Nuclear Information System (INIS)

    Foerstendorf, Harald; Stockmann, Madlen; Heim, Karsten; Mueller, Katharina; Brendler, Vinzenz; Steudtner, Robin

    2017-01-01

    Spectroscopic data of sorption processes can potentially have a direct impact on Surface Complexation Modelling (SCM) approaches. Based on spectroscopic data for the ternary sorption system U(VI)/phosphate/silica, which strongly suggest the formation of a precipitate as the predominant surface process, SCM calculations accurately reproduced results from classical batch experiments.

  14. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and...

  15. A Mouse Model That Reproduces the Developmental Pathways and Site Specificity of the Cancers Associated With the Human BRCA1 Mutation Carrier State

    Directory of Open Access Journals (Sweden)

    Ying Liu

    2015-10-01

    Full Text Available Predisposition to breast and extrauterine Müllerian carcinomas in BRCA1 mutation carriers is due to a combination of cell-autonomous consequences of BRCA1 inactivation on cell cycle homeostasis, superimposed on cell-nonautonomous hormonal factors magnified by the effects of BRCA1 mutations on hormonal changes associated with the menstrual cycle. We used the Müllerian inhibiting substance type 2 receptor (Mis2r) promoter and a truncated form of the Follicle stimulating hormone receptor (Fshr) promoter to introduce conditional knockouts of Brca1 and p53 not only in mouse mammary and Müllerian epithelia, but also in organs that control the estrous cycle. Sixty percent of the double mutant mice developed invasive Müllerian and mammary carcinomas. Mice carrying heterozygous mutations in Brca1 and p53 also developed invasive tumors, albeit at a lesser rate (30%), in which the wild-type alleles were no longer present due to loss of heterozygosity. While mice carrying heterozygous mutations in both genes developed mammary tumors, none of the mice carrying only a heterozygous p53 mutation developed such tumors (P < 0.0001), attesting to a role for Brca1 mutations in tumor development. This mouse model is attractive for investigating cell-nonautonomous mechanisms associated with cancer predisposition in BRCA1 mutation carriers and the merit of chemo-preventive drugs targeting such mechanisms.

  16. Parametric Modeling for Fluid Systems

    Science.gov (United States)

    Pizarro, Yaritzmar Rosario; Martinez, Jonathan

    2013-01-01

    Fluid Systems involves different projects that require parametric modeling, i.e., a model that maintains consistent relationships between elements as it is manipulated. One of these projects is the Neo Liquid Propellant Testbed, which is part of Rocket U. As part of Rocket U (Rocket University), engineers at NASA's Kennedy Space Center in Florida have the opportunity to develop critical flight skills as they design, build and launch high-powered rockets. To build the Neo testbed, hardware from the Space Shuttle Program was repurposed. Modeling for Neo included fittings, valves, frames and tubing, among others. These models help in the review process, to make sure regulations are being followed. Another fluid systems project that required modeling is Plant Habitat's TCUI test project. Plant Habitat is a plan to develop a large growth chamber to learn the effects of long-duration microgravity exposure on plants in space. Work for this project included the design and modeling of a duct vent for a flow test. Parametric modeling for these projects was done using Creo Parametric 2.0.

  17. Model checking embedded system designs

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    2002-01-01

    Model checking has established itself as a successful tool-supported technique for the verification and debugging of various hardware and software systems [16]. Not only in academia, but also in industry, this technique is increasingly being regarded as a promising and practical proposition,

  18. GENERIC model for multiphase systems

    NARCIS (Netherlands)

    Sagis, L.M.C.

    2010-01-01

    GENERIC is a nonequilibrium thermodynamic formalism in which the dynamic behavior of a system is described by a single compact equation involving two types of brackets: a Poisson bracket and a dissipative bracket. This formalism has proved to be a very powerful instrument to model the dynamic

  19. An extensible analysable system model

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, Rene Rydhof

    2008-01-01

    Analysing real-world systems for vulnerabilities with respect to security and safety threats is a difficult undertaking, not least due to a lack of availability of formalisations for those systems. While both formalisations and analyses can be found for artificial systems such as software, this does not hold for real physical systems. Approaches such as threat modelling try to target the formalisation of the real-world domain, but still are far from the rigid techniques available in security research. Many currently available approaches to assurance of critical infrastructure security are based on (quite successful) ad-hoc techniques. We believe they can be significantly improved beyond the state-of-the-art by pairing them with static analyses techniques. In this paper we present an approach to both formalising those real-world systems, as well as providing an underlying semantics, which...

  20. Systems modeling for laser IFE

    Science.gov (United States)

    Meier, W. R.; Raffray, A. R.; Sviatoslavsky, I. N.

    2006-06-01

    A systems model of a laser-driven IFE power plant is being developed to assist in design trade-offs and optimization. The focus to date has been on modeling the fusion chamber, blanket and power conversion system. A self-consistent model has been developed to determine key chamber and thermal cycle parameters (e.g., chamber radius, structure and coolant temperatures, cycle efficiency, etc.) as a function of the target yield and pulse repetition rate. Temperature constraints on the tungsten armor, ferritic steel wall, and structure/coolant interface are included in evaluating the potential design space. Results are presented for a lithium cooled first wall coupled with a Brayton power cycle. LLNL work performed under the auspices of the US Department of Energy by the University of California LLNL under Contract W-7405-Eng-48.

  1. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    Science.gov (United States)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of the University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions according to the skill scores defined by Perkins et al. in 2013 for evaluation of AR4 climate models. We also investigate techniques for improving bias correction of values in the tails of the distributions. These techniques include binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized Pareto distribution.
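
    A minimal sketch of kernel-density distribution mapping: estimate the model and observed CDFs with Gaussian kernels, then map each model value through the model CDF and back through the inverse observed CDF. This is a simplified stand-in, not the NARCCAP production code; grid size and padding are illustrative choices:

```python
# Sketch: KDE-based quantile mapping (the core of KDDM bias correction).
import numpy as np
from scipy.stats import gaussian_kde

def kddm(model_train, obs_train, model_values):
    grid = np.linspace(min(model_train.min(), obs_train.min()) - 1.0,
                       max(model_train.max(), obs_train.max()) + 1.0, 512)
    cdf_m = np.cumsum(gaussian_kde(model_train)(grid)); cdf_m /= cdf_m[-1]
    cdf_o = np.cumsum(gaussian_kde(obs_train)(grid)); cdf_o /= cdf_o[-1]
    p = np.interp(model_values, grid, cdf_m)   # model value -> probability
    return np.interp(p, cdf_o, grid)           # probability -> observed scale
```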

  2. Cotangent Models for Integrable Systems

    Science.gov (United States)

    Kiesenhofer, Anna; Miranda, Eva

    2017-03-01

    We associate cotangent models to a neighbourhood of a Liouville torus in symplectic and Poisson manifolds, focusing on b-Poisson/b-symplectic manifolds. The semilocal equivalence with such models uses the corresponding action-angle theorems in these settings: the theorem of Liouville-Mineur-Arnold for symplectic manifolds and an action-angle theorem for regular Liouville tori in Poisson manifolds (Laurent-Gengoux et al., Int. Math. Res. Notices IMRN 8: 1839-1869, 2011). Our models comprise regular Liouville tori of Poisson manifolds but also consider the Liouville tori on the singular locus of a b-Poisson manifold. For this latter class of Poisson structures we define a twisted cotangent model. The equivalence with this twisted cotangent model is given by an action-angle theorem recently proved by the authors and Scott (Math. Pures Appl. (9) 105(1):66-85, 2016). This viewpoint of cotangent models provides new machinery to construct examples of integrable systems, which are especially valuable in the b-symplectic case, where not many sources of examples are known. At the end of the paper we introduce non-degenerate singularities as lifted cotangent models on b-symplectic manifolds and discuss some generalizations of these models to general Poisson manifolds.

  3. Normative 3D acetabular orientation measurements by the low-dose EOS imaging system in 102 asymptomatic subjects in standing position: Analyses by side, gender, pelvic incidence and reproducibility.

    Science.gov (United States)

    Thelen, T; Thelen, P; Demezon, H; Aunoble, S; Le Huec, J-C

    2017-04-01

    Three-dimensional (3D) acetabular orientation is a fundamental topic in orthopedic surgery. Computed tomography (CT) allows 3D measurement of native acetabular orientation, but with a substantial radiation dose. The EOS imaging system was developed to perform this kind of evaluation, but has not been validated in this indication with specific attention to the acetabulum. We therefore performed a prospective study using EOS to assess: (1) the reproducibility of the 3D acetabular orientation measures; (2) normative asymptomatic acetabular morphology in standing position, according to side and gender; and (3) the relationship between acetabular anteversion and pelvic incidence. The low-dose EOS imaging system is a reproducible method for measuring 3D acetabular orientation in standing position. In a previous prospective study of spine sagittal balance, 165 asymptomatic volunteers were examined on whole-body EOS biplanar X-ray; 102 had appropriate images for pelvic and acetabular analysis, with an equal sex ratio (53 female, 49 male). These EOS images were reviewed using sterEOS 3D software, allowing automatic measurement of acetabular parameters (anteversion and inclination) and pelvic parameters (pelvic incidence, pelvic tilt and sacral slope) in an anatomical (anterior pelvic plane: APP) and a functional reference plane (patient vertical plane: PVP). Both intra- and inter-observer analysis showed good agreement (ICC > 0.90); Bland-Altman plot distributions were good. Acetabular anatomical anteversion and inclination relative to the APP (AAAPP and AIAPP, respectively) were significantly greater in women than in men, with no effect of side (right AAAPP: women 21.3° ± 3.4° vs. men 16.1° ± 3.3°). Pelvic incidence (grouped as <44°, 44-62° and >62°) correlated significantly with functional acetabular orientation in standing position: PVP functional anteversion decreased by 5° relative to APP anteversion with incidence <44°, matched APP anteversion with incidence 44-62°, and was greater by 4° relative to APP anteversion with incidence >62°. The use of a...

  4. Surface wind mixing in the Regional Ocean Modeling System (ROMS)

    Science.gov (United States)

    Robertson, Robin; Hartlipp, Paul

    2017-12-01

    Mixing at the ocean surface is key for atmosphere-ocean interactions and the distribution of heat, energy, and gases in the upper ocean. Winds are the primary force for surface mixing. To properly simulate upper ocean dynamics and the flux of these quantities within the upper ocean, models must reproduce mixing in the upper ocean. To evaluate the performance of the Regional Ocean Modeling System (ROMS) in replicating surface mixing, the results of four different vertical mixing parameterizations were compared against observations, using the surface mixed layer depth, the temperature fields, and observed diffusivities for comparisons. The vertical mixing parameterizations investigated were the Mellor-Yamada level 2.5 turbulent closure (MY), Large-McWilliams-Doney KPP (LMD), Nakanishi-Niino (NN), and generic length scale (GLS) schemes. This was done for one temperate site in deep water in the Eastern Pacific and three shallow-water sites in the Baltic Sea. The model reproduced the surface mixed layer depth reasonably well for all sites; however, the temperature fields were reproduced well for the deep site, but not for the shallow Baltic Sea sites. In the Baltic Sea, the models overmixed the water column after a few days. Vertical temperature diffusivities were higher than those observed and did not show the temporal fluctuations present in the observations. The best performance was by NN and MY; however, MY became unstable in two of the shallow simulations with high winds. The performance of GLS was nearly as good as that of NN and MY. LMD had the poorest performance, as it generated temperature diffusivities that were too high and induced too much mixing. Further observational comparisons are needed to evaluate the effects of different stratification and wind conditions and the limitations of the vertical mixing parameterizations.
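
    Mixed layer depth, used here as a comparison metric, is commonly diagnosed with a threshold criterion; a minimal sketch follows, where the 0.2 °C threshold is an assumed, conventional choice, not necessarily the study's:

```python
# Sketch: diagnose mixed layer depth as the shallowest depth where temperature
# departs from the near-surface value by more than a threshold.
import numpy as np

def mixed_layer_depth(depth, temp, dT=0.2):
    """depth: positive-down (m); temp: degC; both ordered surface -> bottom."""
    below = np.abs(temp - temp[0]) > dT
    return depth[np.argmax(below)] if below.any() else depth[-1]
```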

  5. Letter regarding 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics' by Patrizi et al. and research reproducibility.

    Science.gov (United States)

    2017-04-01

    The reporting of research in a manner that allows reproduction in subsequent investigations is important for scientific progress. Several details of the recent study by Patrizi et al., 'Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics', are absent from the published manuscript and make reproduction of findings impossible. As new and complex technologies with great promise for ergonomics develop, new but surmountable challenges for reporting investigations using these technologies in a reproducible manner arise. Practitioner Summary: As with traditional methods, scientific reporting of new and complex ergonomics technologies should be performed in a manner that allows reproduction in subsequent investigations and supports scientific advancement.

  6. Evaluation of the agonist PET radioligand [¹¹C]GR103545 to image kappa opioid receptor in humans: kinetic model selection, test-retest reproducibility and receptor occupancy by the antagonist PF-04455242.

    Science.gov (United States)

    Naganawa, Mika; Jacobsen, Leslie K; Zheng, Ming-Qiang; Lin, Shu-Fei; Banerjee, Anindita; Byon, Wonkyung; Weinzimmer, David; Tomasi, Giampaolo; Nabulsi, Nabeel; Grimwood, Sarah; Badura, Lori L; Carson, Richard E; McCarthy, Timothy J; Huang, Yiyun

    2014-10-01

    Kappa opioid receptors (KOR) are implicated in several brain disorders. In this report, a first-in-human positron emission tomography (PET) study was conducted with the potent and selective KOR agonist tracer, [(11)C]GR103545, to determine an appropriate kinetic model for analysis of PET imaging data and assess the test-retest reproducibility of model-derived binding parameters. The non-displaceable distribution volume (V(ND)) was estimated from a blocking study with naltrexone. In addition, KOR occupancy of PF-04455242, a selective KOR antagonist that is active in preclinical models of depression, was also investigated. For determination of a kinetic model and evaluation of test-retest reproducibility, 11 subjects were scanned twice with [(11)C]GR103545. Seven subjects were scanned before and 75 min after oral administration of naltrexone (150 mg). For the KOR occupancy study, six subjects were scanned at baseline and 1.5 h and 8 h after an oral dose of PF-04455242 (15 mg, n=1 and 30 mg, n=5). Metabolite-corrected arterial input functions were measured and all scans were 150 min in duration. Regional time-activity curves (TACs) were analyzed with 1- and 2-tissue compartment models (1TC and 2TC) and the multilinear analysis (MA1) method to derive regional volume of distribution (V(T)). Relative test-retest variability (TRV), absolute test-retest variability (aTRV) and intra-class coefficient (ICC) were calculated to assess test-retest reproducibility of regional VT. Occupancy plots were computed for blocking studies to estimate occupancy and V(ND). The half maximal inhibitory concentration (IC50) of PF-04455242 was determined from occupancies and drug concentrations in plasma. [(11)C]GR103545 in vivo K(D) was also estimated. Regional TACs were well described by the 2TC model and MA1. However, 2TC VT was sometimes estimated with high standard error. Thus MA1 was the model of choice. Test-retest variability was ~15%, depending on the outcome measure. The blocking
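
    The occupancy estimation described here is typically done with an occupancy (Lassen) plot: the regional decrease in VT after drug is regressed on baseline VT, the slope estimating occupancy and the x-intercept estimating VND. A minimal sketch, where the regional VT arrays are illustrative inputs, not the study's data:

```python
# Sketch: occupancy (Lassen) plot from regional distribution volumes.
import numpy as np

def occupancy_plot(vt_baseline, vt_blocked):
    delta = vt_baseline - vt_blocked            # regional decrease in VT
    slope, intercept = np.polyfit(vt_baseline, delta, 1)
    occupancy = slope                           # fractional receptor occupancy
    v_nd = -intercept / slope                   # x-intercept: non-displaceable VT
    return occupancy, v_nd
```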

  7. A Low-Cost Anthropometric Walking Robot for Reproducing Gait Lab Data

    Directory of Open Access Journals (Sweden)

    Rogério Eduardo da Silva Santana

    2008-01-01

    Full Text Available Human gait analysis is one of the resources that may be used in the study and treatment of pathologies of the locomotor system. This paper deals with the modelling and control aspects of the design, construction and testing of a biped walking robot conceived to reproduce, to a limited extent, the human gait. The robot dimensions have been chosen in order to guarantee anthropomorphic proportions and thus to help health professionals in gait studies. The robot has been assembled with low-cost components and can reproduce, in an assisted way, real gait patterns generated from data previously acquired in gait laboratories. Part of the simulated and experimental results are presented to demonstrate the ability of the biped robot to reproduce normal and pathological human gait.

  8. Graph modeling systems and methods

    Science.gov (United States)

    Neergaard, Mike

    2015-10-13

    An apparatus and a method for vulnerability and reliability modeling are provided. The method generally includes constructing a graph model of a physical network using a computer, the graph model including a plurality of terminating vertices to represent nodes in the physical network, a plurality of edges to represent transmission paths in the physical network, and a non-terminating vertex to represent a non-nodal vulnerability along a transmission path in the physical network. The method additionally includes evaluating the vulnerability and reliability of the physical network using the constructed graph model, wherein the vulnerability and reliability evaluation includes a determination of whether each terminating and non-terminating vertex represents a critical point of failure. The method can be utilized to evaluate a wide variety of networks, including power grid infrastructures, communication network topologies, and fluid distribution systems.
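
    A minimal sketch of the graph-model idea using networkx, flagging articulation points (vertices whose removal disconnects the graph) as candidate critical points of failure; the edge list is a made-up example, not from the patent:

```python
# Sketch: graph model of a physical network with critical-point detection.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("plant", "sub_a"), ("sub_a", "sub_b"),
                  ("sub_b", "feeder_1"), ("sub_b", "feeder_2")])

critical = set(nx.articulation_points(G))   # removal disconnects the graph
for node in G.nodes:
    print(node, "critical" if node in critical else "redundant")
```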

  9. Probabilistic models for feedback systems.

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Matthew D.; Boggs, Paul T.

    2011-02-01

In previous work, we developed a Bayesian-based methodology to analyze the reliability of hierarchical systems. The output of the procedure is a statistical distribution of the reliability, thus allowing many questions to be answered. The principal advantage of the approach is that along with an estimate of the reliability, we can also provide statements of confidence in the results. The model is quite general: it allows flexible representations of all of the distributions involved, incorporates prior knowledge into the models, allows errors in the 'engineered' nodes of a system to be determined by the data, and leads to the ability to determine optimal testing strategies. In this report, we provide the preliminary steps necessary to extend this approach to systems with feedback. Feedback is an essential component of 'complexity' and provides interesting challenges in modeling the time-dependent action of a feedback loop. We provide a mechanism for doing this and analyze a simple case. We then consider some extensions to more interesting examples with local control affecting the entire system. Finally, a discussion of the status of the research is also included.

  10. Stochastic Modelling of Hydrologic Systems

    DEFF Research Database (Denmark)

    Jonsdottir, Harpa

    2007-01-01

In this PhD project several stochastic modelling methods are studied and applied to various subjects in hydrology. The research was prepared at Informatics and Mathematical Modelling at the Technical University of Denmark. The thesis is divided into two parts. The first part contains an introduction and an overview of the papers published. Then an introduction to basic concepts in hydrology along with a description of hydrological data is given. Finally an introduction to stochastic modelling is given. The second part contains the research papers, in which the stochastic methods are described; at the time of publication these methods represented new contributions to hydrology. The second part also contains additional description of software used and a brief introduction to stiff systems. The system in one of the papers is stiff.

  11. Reproducibility between conventional and digital periapical radiography for bone height measurement

    Directory of Open Access Journals (Sweden)

    Miguel Simancas Pallares

    2015-10-01

Conclusions. Reproducibility between methods, including in the subgroup analysis, was poor; agreement between conventional and digital periapical radiography for bone height measurement is therefore minimal. Use of these methods in periodontics should be based on full knowledge of the technical features and advantages of each system.

  12. Discrete modelling of drapery systems

    Science.gov (United States)

    Thoeni, Klaus; Giacomini, Anna

    2016-04-01

Drapery systems are an efficient and cost-effective measure in preventing and controlling rockfall hazards on rock slopes. The simplest form consists of a row of ground anchors along the top of the slope connected to a horizontal support cable from which a wire mesh is suspended down the face of the slope. Such systems are generally referred to as simple or unsecured draperies (Badger and Duffy 2012). Variations such as secured draperies, where a pattern of ground anchors is incorporated within the field of the mesh, and hybrid systems, where the upper part of an unsecured drapery is elevated to intercept rockfalls originating upslope of the installation, are becoming more and more popular. This work presents a discrete element framework for the simulation of unsecured drapery systems and their variations. The numerical model is based on the classical discrete element method (DEM) and implemented in the open-source framework YADE (Šmilauer et al., 2010). The model takes all relevant interactions between block, drapery and slope into account (Thoeni et al., 2014) and was calibrated and validated based on full-scale experiments (Giacomini et al., 2012). The block is modelled as a rigid clump made of spherical particles, which allows any shape to be approximated. The drapery is represented by a set of spherical particles with remote interactions. The behaviour of the remote interactions is governed by the constitutive behaviour of the wire and generally corresponds to a piecewise linear stress-strain relation (Thoeni et al., 2013). The same concept is used to model wire ropes. The rock slope is represented by rigid triangular elements to which material properties (e.g., normal coefficient of restitution, friction angle) are assigned. The capabilities of the developed model to simulate drapery systems and to estimate the residual hazard involved with such systems are shown. References Badger, T.C., Duffy, J.D. (2012) Drapery systems. In: Turner, A.K., Schuster R
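A piecewise linear stress-strain law of the kind used for the remote wire interactions can be sketched as follows; the curve points, ultimate strain and wire cross-section are illustrative stand-ins, not the values calibrated in the cited experiments.

```python
import numpy as np

# Illustrative stress-strain points for a steel mesh wire (made-up values).
STRAIN_PTS = np.array([0.0, 0.002, 0.02, 0.05])    # [-]
STRESS_PTS = np.array([0.0, 400.0, 500.0, 550.0])  # [MPa]

def wire_stress(strain):
    """Tensile stress of the wire; a slack (compressed) wire carries no load."""
    if strain <= 0.0:
        return 0.0
    if strain >= STRAIN_PTS[-1]:
        return 0.0  # wire treated as broken beyond the ultimate strain
    return float(np.interp(strain, STRAIN_PTS, STRESS_PTS))

def remote_force(dist, rest_len, area_mm2):
    """Axial force [N] of a remote interaction between two mesh particles."""
    strain = (dist - rest_len) / rest_len
    return wire_stress(strain) * area_mm2  # MPa * mm^2 = N

print(remote_force(dist=1.01, rest_len=1.0, area_mm2=7.0))  # ~3.1 kN
```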

  13. Validation of a Perceptual Distraction Model in a Complex Personal Sound Zone System

    DEFF Research Database (Denmark)

    Rämö, Jussi; Marsh, Steven; Bech, Søren

    2016-01-01

This paper evaluates a previously proposed perceptual model predicting a user's perceived distraction caused by interfering audio programmes. The distraction model was originally trained using a simple sound reproduction system for music-on-music interference situations and it has not been formally tested using more complex sound systems. A listening experiment was conducted to evaluate the performance of the model, using a music target and a speech interferer reproduced by a complex personal sound-zone system. The model was found to successfully predict the perceived distraction of a more complex...

  14. Organ-on-a-Chip Technology for Reproducing Multiorgan Physiology.

    Science.gov (United States)

    Lee, Seung Hwan; Sung, Jong Hwan

    2018-01-01

    In the drug development process, the accurate prediction of drug efficacy and toxicity is important in order to reduce the cost, labor, and effort involved. For this purpose, conventional 2D cell culture models are used in the early phase of drug development. However, the differences between the in vitro and the in vivo systems have caused the failure of drugs in the later phase of the drug-development process. Therefore, there is a need for a novel in vitro model system that can provide accurate information for evaluating the drug efficacy and toxicity through a closer recapitulation of the in vivo system. Recently, the idea of using microtechnology for mimicking the microscale tissue environment has become widespread, leading to the development of "organ-on-a-chip." Furthermore, the system is further developed for realizing a multiorgan model for mimicking interactions between multiple organs. These advancements are still ongoing and are aimed at ultimately developing "body-on-a-chip" or "human-on-a-chip" devices for predicting the response of the whole body. This review summarizes recently developed organ-on-a-chip technologies, and their applications for reproducing multiorgan functions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. JSim, an open-source modeling system for data analysis.

    Science.gov (United States)

    Butterworth, Erik; Jardine, Bartholomew E; Raymond, Gary M; Neal, Maxwell L; Bassingthwaighte, James B

    2013-01-01

JSim is a simulation system for developing models, designing experiments, and evaluating hypotheses on physiological and pharmacological systems through the testing of model solutions against data. It is designed for interactive, iterative manipulation of the model code, handling of multiple data sets and parameter sets, and for making comparisons among different models running simultaneously or separately. Interactive use is supported by a large collection of graphical user interfaces for model writing and compilation diagnostics, defining input functions, model runs, selection of algorithms solving ordinary and partial differential equations, run-time multidimensional graphics, parameter optimization (8 methods), sensitivity analysis, and Monte Carlo simulation for defining confidence ranges. JSim uses the Mathematical Modeling Language (MML), a declarative syntax specifying algebraic and differential equations. Imperative constructs written in other languages (MATLAB, FORTRAN, C++, etc.) are accessed through procedure calls. MML syntax is simple, basically defining the parameters and variables, then writing the equations in a straightforward, easily read and understood mathematical form. This makes JSim good for teaching modeling as well as for model analysis for research. For high-throughput applications, JSim can be run as a batch job. JSim can automatically translate models from the repositories for Systems Biology Markup Language (SBML) and CellML models. Stochastic modeling is supported. MML supports assigning physical units to constants and variables and automates checking dimensional balance as the first step in verification testing. Automatic unit scaling follows, e.g. seconds to minutes, if needed. The JSim Project File sets a standard for reproducible modeling analysis: it includes in one file everything for analyzing a set of experiments: the data, the models, the data fitting, and evaluation of parameter confidence ranges. JSim is open source; it

  16. Computational models of complex systems

    CERN Document Server

    Dabbaghian, Vahid

    2014-01-01

    Computational and mathematical models provide us with the opportunities to investigate the complexities of real world problems. They allow us to apply our best analytical methods to define problems in a clearly mathematical manner and exhaustively test our solutions before committing expensive resources. This is made possible by assuming parameter(s) in a bounded environment, allowing for controllable experimentation, not always possible in live scenarios. For example, simulation of computational models allows the testing of theories in a manner that is both fundamentally deductive and experimental in nature. The main ingredients for such research ideas come from multiple disciplines and the importance of interdisciplinary research is well recognized by the scientific community. This book provides a window to the novel endeavours of the research communities to present their works by highlighting the value of computational modelling as a research tool when investigating complex systems. We hope that the reader...

  17. Model reduction of parametrized systems

    CERN Document Server

    Ohlberger, Mario; Patera, Anthony; Rozza, Gianluigi; Urban, Karsten

    2017-01-01

The special volume offers a global guide to new concepts and approaches concerning the following topics: reduced basis methods, proper orthogonal decomposition, proper generalized decomposition, approximation theory related to model reduction, learning theory and compressed sensing, stochastic and high-dimensional problems, system-theoretic methods, nonlinear model reduction, reduction of coupled problems/multiphysics, optimization and optimal control, state estimation and control, reduced order models and domain decomposition methods, Krylov-subspace and interpolatory methods, and applications to real industrial and complex problems. The book represents the state of the art in the development of reduced order methods. It contains contributions from internationally respected experts, guaranteeing a wide range of expertise and topics. Further, it reflects an important effort, carried out over the last 12 years, to build a growing research community in this field. Though not a textbook, some of the chapters ca...

  18. Modeling software systems by domains

    Science.gov (United States)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  19. Model Reduction of Nonlinear Aeroelastic Systems Experiencing Hopf Bifurcation

    KAUST Repository

    Abdelkefi, Abdessattar

    2013-06-18

In this paper, we employ the normal form to derive a reduced-order model that reproduces the nonlinear dynamical behavior of aeroelastic systems that undergo Hopf bifurcation. As an example, we consider a rigid two-dimensional airfoil that is supported by nonlinear springs in the pitch and plunge directions and subjected to nonlinear aerodynamic loads. We apply the center manifold theorem on the governing equations to derive the normal form, which constitutes a simplified representation of the aeroelastic system near flutter onset (the manifestation of the Hopf bifurcation). Then, we use the normal form to identify a self-excited oscillator governed by a time-delay ordinary differential equation that approximates the dynamical behavior while reducing the dimension of the original system. Results obtained from this oscillator show a great capability to properly predict limit cycle oscillations that take place beyond flutter onset, as compared with the original aeroelastic system.
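For orientation, the generic Hopf normal form in polar coordinates (the standard textbook form; the paper derives problem-specific coefficients via the center manifold reduction) reads

```latex
\dot{r} = \mu r + \alpha r^{3}, \qquad \dot{\theta} = \omega + \beta r^{2},
```

so that for a supercritical bifurcation ($\alpha < 0$) a stable limit cycle of amplitude $r^{*} = \sqrt{-\mu/\alpha}$ emerges as the bifurcation parameter $\mu$ crosses zero; near flutter onset, $\mu$ can be read as the distance from the flutter speed.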

  20. Genome Modeling System: A Knowledge Management Platform for Genomics.

    Directory of Open Access Journals (Sweden)

    Malachi Griffith

    2015-07-01

Full Text Available In this work, we present the Genome Modeling System (GMS), an analysis information management system capable of executing automated genome analysis pipelines at a massive scale. The GMS framework provides detailed tracking of samples and data coupled with reliable and repeatable analysis pipelines. The GMS also serves as a platform for bioinformatics development, allowing a large team to collaborate on data analysis, or an individual researcher to leverage the work of others effectively within its data management system. Rather than separating ad-hoc analysis from rigorous, reproducible pipelines, the GMS promotes systematic integration between the two. As a demonstration of the GMS, we performed an integrated analysis of whole genome, exome and transcriptome sequencing data from a breast cancer cell line (HCC1395) and matched lymphoblastoid line (HCC1395BL). These data are available for users to test the software, complete tutorials and develop novel GMS pipeline configurations. The GMS is available at https://github.com/genome/gms.

  1. Quantitative comparison of Zeiss-Humphrey model 840 and Rion UX-02 systems of ultrasound biomicroscopy.

    Science.gov (United States)

    Kobayashi, H; Kobayashi, K

    1999-05-01

Our objective was to estimate the agreement between two different ultrasound biomicroscopes (UBMs) and to evaluate the clinical implications of the measurements obtained. We measured the anterior chamber depth, trabecular-iris angle, angle opening distance at 250 and 500 µm from the scleral spur, iris thickness and scleral-iris angle using the Humphrey UBM model 840 and Rion UBM UX-02 in 25 eyes of 25 normal volunteers. No significant difference was found in the mean values of any parameters measured by the Humphrey and Rion systems. Correlation coefficients of greater than 90% were observed for the parameters studied. Each system showed high reproducibility for all measured parameters. There were significant differences between the two systems in coefficients of variation for all parameters measured except the anterior chamber depth. The parameters measured with the Humphrey and Rion systems showed correlation coefficients of greater than 90%. The Humphrey system showed better reproducibility than the Rion system.

  2. Properties of damped Ly α absorption systems and star-forming galaxies in semi-analytic models at z = 2

    NARCIS (Netherlands)

    Berry, Michael; Somerville, Rachel S.; Gawiser, Eric; Maller, Ariyeh H.; Popping, Gergö; Trager, Scott C.

    2016-01-01

    We investigate predictions from semi-analytic cosmological models of galaxy formation for the properties of star-forming galaxies (SFGs) and damped Ly α absorption systems (DLAS), and the relationship between these two populations. Our models reproduce fairly well the observed distributions of

  3. Quark/gluon jet discrimination: a reproducible analysis using R

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The power to discriminate between light-quark jets and gluon jets would have a huge impact on many searches for new physics at CERN and beyond. This talk will present a walk-through of the development of a prototype machine learning classifier for differentiating between quark and gluon jets at experiments like those at the Large Hadron Collider at CERN. A new fast feature selection method that combines information theory and graph analytics will be outlined. This method has found new variables that promise significant improvements in discrimination power. The prototype jet tagger is simple, interpretable, parsimonious, and computationally extremely cheap, and therefore might be suitable for use in trigger systems for real-time data processing. Nested stratified k-fold cross validation was used to generate robust estimates of model performance. The data analysis was performed entirely in the R statistical programming language, and is fully reproducible. The entire analysis workflow is data-driven, automated a...

  4. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    OpenAIRE

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-01-01

The Soil and Water Assessment Tool (SWAT) is a well established, distributed, eco-hydrologic model. However, using the case study of an intensively irrigated agricultural watershed, it was shown that none of the model versions are able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version to correctly simulate the main hydrological processes. Crop yield, total streamfl...

  5. Cognitive models embedded in system simulation models

    International Nuclear Information System (INIS)

    Siegel, A.I.; Wolf, J.J.

    1982-01-01

If we are to discuss and consider cognitive models, we must first come to grips with two questions: (1) What is cognition? (2) What is a model? Presumably, the answers to these questions can provide a basis for defining a cognitive model. Accordingly, this paper first places these two questions into perspective. Then, cognitive models are set within the context of computer simulation models and a number of computer simulations of cognitive processes are described. Finally, pervasive issues are discussed vis-a-vis cognitive modeling in the computer simulation context.

  6. Thermodynamic modeling of complex systems

    DEFF Research Database (Denmark)

    Liang, Xiaodong

Contrary to earlier theories, the oil is not only present on the surface, but also in great volumes both in the water column and on the seafloor, which indicates that we do not know enough about how oil behaves in water and interacts with it. Sonar detection is one of the most important and necessary ... after an oil spill. Engineering thermodynamics could be applied in state-of-the-art sonar products through advanced artificial technology, if the speed of sound, solubility and density of oil-seawater systems could be satisfactorily modelled. The addition of methanol or glycols into unprocessed well streams in subsea pipelines is necessary to inhibit gas hydrate formation, and offshore reservoirs often mean complicated temperature and pressure conditions. Accurate description of the phase behavior and thermophysical properties of complex systems containing petroleum fluids and polar...

  7. Studies of Catalytic Model Systems

    DEFF Research Database (Denmark)

    Holse, Christian

of the Cu/ZnO nanoparticles is highly relevant to industrial methanol synthesis, for which the direct interaction of Cu and ZnO nanocrystals synergistically boosts the catalytic activity. The dynamical behavior of the nanoparticles under reducing and oxidizing environments was studied by means of ex situ X... as the nanoparticles are reduced. The Cu/ZnO nanoparticles are tested on a µ-reactor platform and prove to be active towards methanol synthesis, making it an excellent model system for further investigations into activity-dependent morphology changes.

  8. A user-friendly earth system model of low complexity: the ESCIMO system dynamics model of global warming towards 2100

    Science.gov (United States)

    Randers, Jorgen; Golüke, Ulrich; Wenstøp, Fred; Wenstøp, Søren

    2016-11-01

    We have made a simple system dynamics model, ESCIMO (Earth System Climate Interpretable Model), which runs on a desktop computer in seconds and is able to reproduce the main output from more complex climate models. ESCIMO represents the main causal mechanisms at work in the Earth system and is able to reproduce the broad outline of climate history from 1850 to 2015. We have run many simulations with ESCIMO to 2100 and beyond. In this paper we present the effects of introducing in 2015 six possible global policy interventions that cost around USD 1000 billion per year - around 1 % of world GDP. We tentatively conclude (a) that these policy interventions can at most reduce the global mean surface temperature - GMST - by up to 0.5 °C in 2050 and up to 1.0 °C in 2100 relative to no intervention. The exception is injection of aerosols into the stratosphere, which can reduce the GMST by more than 1.0 °C in a decade but creates other serious problems. We also conclude (b) that relatively cheap human intervention can keep global warming in this century below +2 °C relative to preindustrial times. Finally, we conclude (c) that run-away warming is unlikely to occur in this century but is likely to occur in the longer run. The ensuing warming is slow, however. In ESCIMO, it takes several hundred years to lift the GMST to +3 °C above preindustrial times through gradual self-reinforcing melting of the permafrost. We call for research to test whether more complex climate models support our tentative conclusions from ESCIMO.
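To illustrate the system-dynamics flavor of such low-complexity models, the sketch below integrates a generic zero-dimensional energy balance with an explicit Euler step per year. This is not ESCIMO's actual equation set, and all parameter values and the CO2 path are illustrative assumptions only.

```python
import math

# Toy zero-dimensional energy-balance model; parameters are illustrative.
S0 = 1361.0       # solar constant [W/m^2]
ALBEDO = 0.30
SIGMA = 5.67e-8   # Stefan-Boltzmann constant [W/m^2/K^4]
EPS = 0.61        # effective emissivity, tuned so T ~ 288 K at equilibrium
C = 4.0e8         # effective heat capacity of the surface layer [J/m^2/K]
F2X = 3.7         # radiative forcing per CO2 doubling [W/m^2]

def forcing(co2_ppm, co2_pre=280.0):
    """Logarithmic greenhouse forcing relative to preindustrial CO2."""
    return F2X * math.log(co2_ppm / co2_pre, 2)

T, dt = 288.0, 3.15e7                      # initial GMST [K], one year [s]
for year in range(2015, 2101):
    co2 = 400.0 * 1.005 ** (year - 2015)   # assumed 0.5 %/yr concentration growth
    imbalance = S0 / 4 * (1 - ALBEDO) + forcing(co2) - EPS * SIGMA * T ** 4
    T += dt * imbalance / C                # one explicit Euler step per year
print(f"warming 2015-2100 in this toy run: {T - 288.0:+.2f} K")
```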

  9. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...

  10. Reservoir Model Information System: REMIS

    Science.gov (United States)

    Lee, Sang Yun; Lee, Kwang-Wu; Rhee, Taehyun; Neumann, Ulrich

    2009-01-01

    We describe a novel data visualization framework named Reservoir Model Information System (REMIS) for the display of complex and multi-dimensional data sets in oil reservoirs. It is aimed at facilitating visual exploration and analysis of data sets as well as user collaboration in an easier way. Our framework consists of two main modules: the data access point module and the data visualization module. For the data access point module, the Phrase-Driven Grammar System (PDGS) is adopted for helping users facilitate the visualization of data. It integrates data source applications and external visualization tools and allows users to formulate data query and visualization descriptions by selecting graphical icons in a menu or on a map with step-by-step visual guidance. For the data visualization module, we implemented our first prototype of an interactive volume viewer named REMVR to classify and to visualize geo-spatial specific data sets. By combining PDGS and REMVR, REMIS assists users better in describing visualizations and exploring data so that they can easily find desired data and explore interesting or meaningful relationships including trends and exceptions in oil reservoir model data.

  11. Modeling integrated water user decisions in intermittent supply systems

    Science.gov (United States)

    Rosenberg, David E.; Tarawneh, Tarek; Abdel-Khaleq, Rania; Lund, Jay R.

    2007-07-01

    We apply systems analysis to estimate household water use in an intermittent supply system considering numerous interdependent water user behaviors. Some 39 household actions include conservation; improving local storage or water quality; and accessing sources having variable costs, availabilities, reliabilities, and qualities. A stochastic optimization program with recourse decisions identifies the infrastructure investments and short-term coping actions a customer can adopt to cost-effectively respond to a probability distribution of piped water availability. Monte Carlo simulations show effects for a population of customers. Model calibration reproduces the distribution of billed residential water use in Amman, Jordan. Parametric analyses suggest economic and demand responses to increased availability and alternative pricing. It also suggests potential market penetration for conservation actions, associated water savings, and subsidies to entice further adoption. We discuss new insights to size, target, and finance conservation.
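The two-stage structure described here (infrastructure investment now, coping actions after supply is revealed) can be sketched with a toy recourse problem; every number below is invented for illustration and unrelated to the Amman calibration.

```python
# First stage: choose a storage-tank size before weekly piped supply is known.
# Second stage (recourse): buy tanker water to cover any remaining shortfall.
DEMAND = 10.0        # weekly household demand [m^3]
TANK_COST = 1.0      # amortized weekly cost per m^3 of storage capacity
TANKER_PRICE = 4.0   # price per m^3 of tanker water
SCENARIOS = [        # (piped supply [m^3/week], probability)
    (10.0, 0.3), (6.0, 0.5), (2.0, 0.2),
]

def expected_cost(tank_m3):
    cost = TANK_COST * tank_m3                   # first-stage cost
    for supply, prob in SCENARIOS:
        usable = min(supply, tank_m3)            # can only bank what the tank holds
        shortfall = max(0.0, DEMAND - usable)
        cost += prob * TANKER_PRICE * shortfall  # recourse action
    return cost

cost, size = min((expected_cost(0.5 * i), 0.5 * i) for i in range(41))
print(f"optimal tank: {size:.1f} m^3, expected weekly cost: {cost:.1f}")
```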

  12. Modeling learning technology systems as business systems

    NARCIS (Netherlands)

    Avgeriou, Paris; Retalis, Symeon; Papaspyrou, Nikolaos

    2003-01-01

    The design of Learning Technology Systems, and the Software Systems that support them, is largely conducted on an intuitive, ad hoc basis, thus resulting in inefficient systems that defectively support the learning process. There is now justifiable, increasing effort in formalizing the engineering

  13. Bond graph modeling of centrifugal compression systems

    OpenAIRE

    Uddin, Nur; Gravdahl, Jan Tommy

    2015-01-01

    A novel approach to model unsteady fluid dynamics in a compressor network by using a bond graph is presented. The model is intended in particular for compressor control system development. First, we develop a bond graph model of a single compression system. Bond graph modeling offers a different perspective to previous work by modeling the compression system based on energy flow instead of fluid dynamics. Analyzing the bond graph model explains the energy flow during compressor surge. Two pri...

  14. Structuring Problem Analysis for Embedded Systems Modelling

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.; Lucas, Yan

    Our interest is embedded systems validation as part of the model-driven approach. To design a model, the modeller needs to obtain knowledge about the system and decide what is relevant to model and how. A part of the modelling activities is inherently informal - it cannot be formalised in such a way

  15. Hyperactivity and hypoactivity in a rat model of Huntington's disease: the systemic 3-nitropropionic acid model.

    Science.gov (United States)

    Borlongan, C V; Koutouzis, T K; Freeman, T B; Hauser, R A; Cahill, D W; Sanberg, P R

    1997-08-01

The present study proposes the use of systemic 3-nitropropionic acid (3-NP) treatment in rats as a model of Huntington's disease (HD). The systemic 3-NP model involves chronic administration of low-dose intraperitoneal (i.p.) injections of 3-NP to rats once every 4 days over a period of time. Evidence from our experimental studies suggests that manipulating the number of injections can result in either increased nocturnal spontaneous locomotor activity (hyperactivity) or nocturnal akinesia (hypoactivity) [1]. For example, two injections of 3-NP (one injection every 4 days) result in hyperactivity, while four or more injections of 3-NP lead to hypoactivity [1]. The locomotor activity is recorded by Digiscan locomotor activity monitors [11]. The observation of these two types of locomotor activity is unique, since no excitotoxin model has replicated a two-stage progression of HD-like behavioral alteration. Most studies using excitotoxins like quinolinic acid (QA) and kainic acid (KA) have only reproduced the hyperactivity stage [4,5,7]. With the systemic 3-NP model, investigations into at least two stages of the disease are made possible. This allows for better assessment of intervention strategies, such as neural transplants, across different stages of the disease. The systemic 3-NP rat model is believed to be an improved animal model of HD.

  16. Annotating with Propp's Morphology of the Folktale: Reproducibility and Trainability

    NARCIS (Netherlands)

    Fisseni, B.; Kurji, A.; Löwe, B.

    2014-01-01

    We continue the study of the reproducibility of Propp’s annotations from Bod et al. (2012). We present four experiments in which test subjects were taught Propp’s annotation system; we conclude that Propp’s system needs a significant amount of training, but that with sufficient time investment, it

  17. Environment and industrial economy: Challenge of reproducibility

    International Nuclear Information System (INIS)

    Rullani, E.

    1992-01-01

Historically and methodologically counterposed until now, the environmentalist and the economic approaches to environmental problems need to be integrated in a new approach that considers, on one side, the relevance of ecological equilibria for economic systems and, on the other side, the economic dimension (in terms of investments and transformations in the production system) of any attempt to achieve a better environment. In order to achieve this integration, both approaches are compelled to give up some cultural habits that have characterized them and have contributed to over-emphasizing the opposition between them. The article shows that both approaches can converge into a new one, in which the environment is no longer only a holistic, non-negotiable natural external limit to human activity (as in the environmentalist approach), nor simply a scarce and exhaustible resource (as economics tends to consider it); the environment should instead become part of the reproducibility sphere, or, in other words, it must be regarded as part of the output that the economic system provides. This new approach, due to scientific and technological advances, is made possible for an increasing class of environmental problems. In order to do this, an evolution is required that can convert environmental goals into investment and technological innovation goals, and communicate to firms the value society assigns to environmental resources. This value, the author suggests, should correspond to the reproduction cost. Various examples of this new approach are analyzed and discussed.

  18. Are classifications of proximal radius fractures reproducible?

    Directory of Open Access Journals (Sweden)

    dos Santos João BG

    2009-10-01

Full Text Available Abstract Background Fractures of the proximal radius need to be classified in an appropriate and reproducible manner. The aim of this study was to assess the reliability of the three most widely used classification systems. Methods Elbow radiograph images of patients with proximal radius fractures were classified according to the Mason, Morrey, and Arbeitsgemeinschaft für Osteosynthesefragen/Association for the Study of Internal Fixation (AO/ASIF) classifications by four observers with different levels of experience with this subject to assess their intra- and inter-observer agreement. Each observer analyzed the images on three different occasions on a computer, with the numerical sequence randomly altered. Results We found that intra-observer agreement of the Mason and Morrey classifications was satisfactory (κ = 0.582 and 0.554, respectively), while the AO/ASIF classification had poor intra-observer agreement (κ = 0.483). Inter-observer agreement was higher in the Mason (κ = 0.429-0.560) and Morrey (κ = 0.319-0.487) classifications than in the AO/ASIF classification (κ = 0.250-0.478), which showed poor reliability. Conclusion Inter- and intra-observer agreement of the Mason and Morrey classifications showed overall satisfactory reliability when compared to the AO/ASIF system. The Mason classification is the most reliable system.
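The agreement values reported above are chance-corrected kappa statistics. The study involves four observers and repeated readings, so its kappas are computed over more than one rater pair, but the two-rater case below shows the principle; the ratings are invented.

```python
import numpy as np

def cohens_kappa(r1, r2, categories):
    """Chance-corrected agreement between two raters (Cohen's kappa)."""
    n = len(r1)
    idx = {c: i for i, c in enumerate(categories)}
    cm = np.zeros((len(categories), len(categories)))
    for a, b in zip(r1, r2):
        cm[idx[a], idx[b]] += 1
    po = np.trace(cm) / n                        # observed agreement
    pe = (cm.sum(0) * cm.sum(1)).sum() / n ** 2  # agreement expected by chance
    return (po - pe) / (1 - pe)

# e.g. two observers assigning Mason types I-III to ten radiographs
obs1 = [1, 1, 2, 3, 2, 1, 3, 2, 1, 2]
obs2 = [1, 2, 2, 3, 2, 1, 3, 1, 1, 2]
print(f"kappa = {cohens_kappa(obs1, obs2, [1, 2, 3]):.3f}")  # 0.688 here
```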

  19. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as the workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluated...

  20. Examination of reproducibility in microbiological degredation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy source...

  1. Modeling Power Systems as Complex Adaptive Systems

    Energy Technology Data Exchange (ETDEWEB)

    Chassin, David P.; Malard, Joel M.; Posse, Christian; Gangopadhyaya, Asim; Lu, Ning; Katipamula, Srinivas; Mallow, J V.

    2004-12-30

    Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that the price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This report explores the state-of-the-art physical analogs for understanding the behavior of some econophysical systems and deriving stable and robust control strategies for using them. We review and discuss applications of some analytic methods based on a thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood. We apply these methods to the question of how power markets can be expected to behave under a variety of conditions.

  2. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  3. A discrete model to study reaction-diffusion-mechanics systems.

    Science.gov (United States)

    Weise, Louis D; Nash, Martyn P; Panfilov, Alexander V

    2011-01-01

This article introduces a discrete reaction-diffusion-mechanics (dRDM) model to study the effects of deformation on reaction-diffusion (RD) processes. The dRDM framework employs a FitzHugh-Nagumo type RD model coupled to a mass-lattice model that undergoes finite deformations. The dRDM model describes a material whose elastic properties are described by a generalized Hooke's law for finite deformations (Seth material). Numerically, the dRDM approach combines a finite difference approach for the RD equations with a Verlet integration scheme for the equations of the mass-lattice system. Using this framework, results on self-organized pacemaking activity that had previously been found with a continuous RD mechanics model were reproduced. Mechanisms that determine the period of pacemakers and its dependency on the medium size are identified. Finally, it is shown how the drift direction of pacemakers in RDM systems is related to the spatial distribution of deformation and curvature effects.
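The reaction-diffusion half of the scheme can be sketched as an explicit finite-difference FitzHugh-Nagumo step. The coupled mass-lattice mechanics, which the paper advances with Verlet integration, is omitted here, and all parameters are generic textbook values rather than those of the paper.

```python
import numpy as np

# 1-D FitzHugh-Nagumo cable, explicit finite differences, zero-flux ends.
N, D, dx, dt = 200, 1.0, 0.5, 0.01
a, b, eps = 0.1, 0.5, 0.01
u, v = np.zeros(N), np.zeros(N)
u[:10] = 1.0                                  # stimulate the left end

for step in range(5000):                      # integrate to t = 50
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2
    lap[0] = (u[1] - u[0]) / dx ** 2          # zero-flux boundary
    lap[-1] = (u[-2] - u[-1]) / dx ** 2
    u_new = u + dt * (D * lap + u * (1 - u) * (u - a) - v)
    v += dt * eps * (b * u - v)               # slow recovery variable
    u = u_new
print("wavefront near x =", np.argmax(u > 0.5) * dx)
```

In the full dRDM scheme each grid point would additionally carry mass-lattice coordinates advanced by a Verlet step, with the deformation feeding back into the diffusion operator.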

  4. A discrete model to study reaction-diffusion-mechanics systems.

    Directory of Open Access Journals (Sweden)

    Louis D Weise

Full Text Available This article introduces a discrete reaction-diffusion-mechanics (dRDM) model to study the effects of deformation on reaction-diffusion (RD) processes. The dRDM framework employs a FitzHugh-Nagumo type RD model coupled to a mass-lattice model that undergoes finite deformations. The dRDM model describes a material whose elastic properties are described by a generalized Hooke's law for finite deformations (Seth material). Numerically, the dRDM approach combines a finite difference approach for the RD equations with a Verlet integration scheme for the equations of the mass-lattice system. Using this framework, results on self-organized pacemaking activity that had previously been found with a continuous RD mechanics model were reproduced. Mechanisms that determine the period of pacemakers and its dependency on the medium size are identified. Finally, it is shown how the drift direction of pacemakers in RDM systems is related to the spatial distribution of deformation and curvature effects.

  5. Visual computing model for immune system and medical system.

    Science.gov (United States)

    Gong, Tao; Cao, Xinxue; Xiong, Qin

    2015-01-01

The natural immune system is an intelligent, self-organizing and adaptive system, which has a variety of immune cells with different types of immune mechanisms. The mutual cooperation between the immune cells shows the intelligence of this immune system, and modeling this immune system is of great significance in medical science and engineering. In order to build a model of this immune system that is easier to understand through visualization than traditional mathematical models, a visual computing model of the immune system was proposed in this paper and used to design a medical system incorporating the immune system. Visual simulations of the immune system were performed to test the visual effect. The experimental results of the simulations show that the visual modeling approach can provide a more effective way of analyzing this immune system than traditional mathematical equations alone.

  6. Model Information Exchange System (MIXS).

    Science.gov (United States)

    2013-08-01

    Many travel demand forecast models operate at state, regional, and local levels. While they share the same physical network in overlapping geographic areas, they use different and uncoordinated modeling networks. This creates difficulties for models ...

  7. Two sustainable energy system analysis models

    DEFF Research Database (Denmark)

    Lund, Henrik; Goran Krajacic, Neven Duic; da Graca Carvalho, Maria

    2005-01-01

This paper presents a comparative study of two energy system analysis models, both designed with the purpose of analysing electricity systems with a substantial share of fluctuating renewable energy.

  8. Models for multimegawatt space power systems

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.

    1990-06-01

    This report describes models for multimegawatt, space power systems which Sandia's Advanced Power Systems Division has constructed to help evaluate space power systems for SDI's Space Power Office. Five system models and models for associated components are presented for both open (power system waste products are exhausted into space) and closed (no waste products) systems: open, burst mode, hydrogen cooled nuclear reactor -- turboalternator system; open, hydrogen-oxygen combustion turboalternator system; closed, nuclear reactor powered Brayton cycle system; closed, liquid metal Rankine cycle system; and closed, in-core, reactor therminonic system. The models estimate performance and mass for the components in each of these systems. 17 refs., 8 figs., 15 tabs.

  9. Nonlinear modeling of crystal system transition of black phosphorus using continuum-DFT model

    Science.gov (United States)

    Setoodeh, A. R.; Farahmand, H.

    2018-01-01

In this paper, the nonlinear behavior of black phosphorus crystals is investigated in tandem with dispersion-corrected density functional theory (DFT-D) analysis under uniaxial loadings. Given the anisotropic behavior of black phosphorus due to its morphological anisotropy, a hyperelastic anisotropic (HA) model named continuum-DFT is established to predict the nonlinear behavior of the material. In this respect, uniaxial Cauchy stresses are applied on both the DFT-D and HA models along the zig-zag and armchair directions. Simultaneously, a transition of the crystal system is recognized at about 4.5 GPa of applied uniaxial tensile stress along the zig-zag direction in the DFT-D simulation in the nonlinear region. In order to develop the nonlinear continuum model, the unknown constants are determined with an optimized least-squares technique. In this regard, the continuum model is obtained to reproduce the Cauchy stress–stretch and strain energy density–stretch results of the DFT-D simulation. Consequently, the modified HA model is introduced to characterize the nonlinear behavior of black phosphorus along the zig-zag direction. More importantly, the specific transition of the crystal system is successfully predicted by the new modified continuum-DFT model. The results reveal that the multiscale continuum-DFT model is well suited to replicate the nonlinear behavior of black phosphorus along the zig-zag and armchair directions.

  10. Nonlinear modeling of crystal system transition of black phosphorus using continuum-DFT model.

    Science.gov (United States)

    Setoodeh, A R; Farahmand, H

    2018-01-24

In this paper, the nonlinear behavior of black phosphorus crystals is investigated in tandem with dispersion-corrected density functional theory (DFT-D) analysis under uniaxial loadings. Given the anisotropic behavior of black phosphorus due to its morphological anisotropy, a hyperelastic anisotropic (HA) model named continuum-DFT is established to predict the nonlinear behavior of the material. In this respect, uniaxial Cauchy stresses are applied on both the DFT-D and HA models along the zig-zag and armchair directions. Simultaneously, a transition of the crystal system is recognized at about 4.5 GPa of applied uniaxial tensile stress along the zig-zag direction in the DFT-D simulation in the nonlinear region. In order to develop the nonlinear continuum model, the unknown constants are determined with an optimized least-squares technique. In this regard, the continuum model is obtained to reproduce the Cauchy stress-stretch and strain energy density-stretch results of the DFT-D simulation. Consequently, the modified HA model is introduced to characterize the nonlinear behavior of black phosphorus along the zig-zag direction. More importantly, the specific transition of the crystal system is successfully predicted by the new modified continuum-DFT model. The results reveal that the multiscale continuum-DFT model is well suited to replicate the nonlinear behavior of black phosphorus along the zig-zag and armchair directions.

  11. Advancement of Global-scale River Hydrodynamics Modelling and Its Potential Applications to Earth System Models

    Science.gov (United States)

    Yamazaki, D.

    2015-12-01

Global river routing models have been developed to represent freshwater discharge from land to ocean in Earth System Models. Early global river models simulated river discharge along a prescribed river network map using a linear-reservoir assumption. Recently, in parallel with advances in remote sensing and computational power, many advanced global river models have started to represent floodplain inundation assuming sub-grid floodplain topography. Some of them further pursue a physically appropriate representation of river and floodplain dynamics, and have succeeded in using hydrodynamic flow equations to realistically simulate channel/floodplain and upstream/downstream interactions. State-of-the-art global river hydrodynamic models can reproduce flood stage (e.g. inundated areas and water levels) well, in addition to river discharge. Flood stage simulation by global river models can potentially be coupled with land surface processes in Earth System Models. For example, evaporation from inundated water areas is not negligible for land-atmosphere interactions in arid areas (such as the Niger River). Surface water levels and groundwater levels are correlated with each other in flat topography, and this interaction could dominate the wetting and drying of many small lakes in flatland and could also affect biogeochemical processes in these lakes. These land/surface water interactions have not been implemented in Earth System Models, but they have a potential impact on the global climate and carbon cycle. In the AGU presentation, recent advancements in global river hydrodynamic modelling, including super-high resolution river topography datasets, will be introduced. The potential applications of river and surface water modules within Earth System Models will also be discussed.

  12. Modeling traceability in system of systems

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Erata, Ferhat

    2017-01-01

    An important aspect in SoS is the realization of the concerns in different systems that work together. Identifying and locating these concerns is important to orchestrate the overall activities and hereby to achieve the overall goal of the SoS. Moreover, concerns in SoS are rarely stable and need

  13. Modeling of Nonlinear Systems using Genetic Algorithm

    Science.gov (United States)

    Hayashi, Kayoko; Yamamoto, Toru; Kawada, Kazuo

In this paper, a new modeling scheme using a Genetic Algorithm (GA) is proposed. The GA is an evolutionary computational method that simulates the mechanisms of heredity and evolution of living things, and it is utilized in optimization and in searching for optimized solutions. Most process systems have nonlinearities, so it is necessary to model such systems accurately. However, it is difficult to build a suitable model for nonlinear systems, because most nonlinear systems have a complex structure. The newly proposed modeling method for nonlinear systems therefore uses a GA. According to the proposed scheme, the optimal structure and parameters of the nonlinear model are generated automatically.
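A minimal GA of this kind, fitting the parameters of a toy nonlinear model to data, can be sketched as follows; the model form, GA settings and data are all illustrative assumptions, not those of the paper.

```python
import random

random.seed(1)
# Toy data from y = a*x/(b + x) with a=2.0, b=0.5; the GA must recover (a, b).
data = [(0.1 * i, 2.0 * (0.1 * i) / (0.5 + 0.1 * i)) for i in range(1, 50)]

def fitness(ind):
    a, b = ind
    return -sum((y - a * x / (b + x)) ** 2 for x, y in data)  # higher is better

pop = [[random.uniform(0.0, 5.0), random.uniform(0.01, 5.0)] for _ in range(40)]
for gen in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                                          # selection
    children = []
    while len(children) < 30:
        p1, p2 = random.sample(elite, 2)
        child = [random.choice(pair) for pair in zip(p1, p2)]  # crossover
        if random.random() < 0.3:                              # mutation
            i = random.randrange(2)
            child[i] *= random.uniform(0.8, 1.2)
        children.append(child)
    pop = elite + children
print("best (a, b):", [round(g, 3) for g in max(pop, key=fitness)])
```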

  14. Model Driven Development of Data Sensitive Systems

    DEFF Research Database (Denmark)

    Olsen, Petur

    2014-01-01

to the values of variables. This thesis strives to improve model-driven development of such data-sensitive systems. This is done by addressing three research questions. In the first, we combine state-based modeling and abstract interpretation in order to ease modeling of data-sensitive systems while allowing efficient model-checking and model-based testing. In the second, we develop automatic abstraction learning used together with model learning, in order to allow fully automatic learning of data-sensitive systems and thus learning of larger systems. In the third, we develop an approach for modeling and model-based ... detection, pushing error detection to earlier stages of development. The complexity of modeling and the size of systems which can be analyzed are severely limited when introducing data variables. The state space grows exponentially in the number of variables and the domain size of the variables...

  15. CFD Modeling of Flow and Ion Exchange Kinetics in a Rotating Bed Reactor System

    DEFF Research Database (Denmark)

    Larsson, Hilde Kristina; Schjøtt Andersen, Patrick Alexander; Byström, Emil

    2017-01-01

A rotating bed reactor (RBR) has been modeled using computational fluid dynamics (CFD). The flow pattern in the RBR was investigated and the flow through the porous material in it was quantified. A simplified geometry representing the more complex RBR geometry was introduced, and the simplified model was able to reproduce the main characteristics of the flow. Alternative reactor shapes were investigated, and it was concluded that the use of baffles has a very large impact on the flow through the porous material. The simulations suggested, therefore, that even faster reaction rates could be achieved by making the baffles deeper. Two-phase simulations were performed, which managed to reproduce the deflection of the gas–liquid interface in an unbaffled system. A chemical reaction was implemented in the model, describing the ion-exchange phenomena in the porous material using four different...

  16. Modeling of a Hydraulic Braking System

    OpenAIRE

    Lundin, Christopher

    2015-01-01

    The objective of this thesis is to derive an analytical model representing a reduced form of a mine hoist hydraulic braking system. Based primarily on fluid mechanical and mechanical physical modeling, along with a number of simplifying assumptions, the analytical model will be derived and expressed in the form of a system of differential equations including a set of static functions. The obtained model will be suitable for basic simulation and analysis of system dynamics, with the aim to cap...

  17. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

Full Text Available Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays in and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  18. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  19. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

Full Text Available This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  20. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin

    2010-01-01

Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model and check the running status of the system, which offers a debugging capability at a higher level of abstraction. The framework intends to contribute a tool to the Eclipse community, especially suitable for model-driven development of embedded systems.

  1. Validation of newly designed regional earth system model (RegESM) for Mediterranean Basin

    Science.gov (United States)

    Turuncoglu, Ufuk Utku; Sannino, Gianmaria

    2017-05-01

We present a validation analysis of a regional earth system modeling system (RegESM) for the Mediterranean Basin. The configuration used here includes two active components: a regional climate model (RegCM4) and an ocean modeling system (ROMS). To assess the performance of the coupled modeling system in representing the climate of the basin, the results of the coupled simulation (C50E) are compared to the results obtained by a standalone atmospheric simulation (R50E) as well as several observation datasets. Although there is a persistent cold bias in fall and winter, which was also seen in previous studies, the model reproduces the inter-annual variability and the seasonal cycle of sea surface temperature (SST) in generally good agreement with the available observations. The analysis of the near-surface wind distribution and the main circulation of the sea indicates that the coupled model can reproduce the main characteristics of the Mediterranean Sea surface and intermediate-layer circulation, as well as the seasonal variability of wind speed and direction, when compared with the available observational datasets. The results also reveal that the simulated near-surface wind speed and direction perform poorly in the Gulf of Lion and surrounding regions, which also contributes to the large positive SST bias in the region, due to the insufficient horizontal resolution of the atmospheric component of the coupled modeling system. The simulated seasonal climatologies of the surface heat flux components are also consistent with the CORE.2 and NOCS datasets, apart from an overestimation of net long-wave radiation and latent heat flux (or evaporation, E), although large observational uncertainty is found in these variables. The coupled model also tends to improve the latent heat flux by providing a better representation of the air-sea interaction as well as the total heat flux budget over the sea. Both models are also able to reproduce the temporal evolution of

  2. Evaluation of AirGIS: a GIS-based air pollution and human exposure modelling system

    DEFF Research Database (Denmark)

    Ketzel, Matthias; Berkowicz, Ruwim; Hvidberg, Martin

    2011-01-01

    This study describes in brief the latest extensions of the Danish Geographic Information System (GIS)-based air pollution and human exposure modelling system (AirGIS), which has been developed in Denmark since 2001, and gives results of an evaluation with measured air pollution data. The system shows, in general, a good performance for long-term averages (annual and monthly), for short-term averages (hourly and daily), as well as when reproducing spatial variation in air pollution concentrations. Some shortcomings and future perspectives of the system are discussed too.

  3. Hydrological modelling in forested systems

    Science.gov (United States)

    This chapter provides a brief overview of forest hydrology modelling approaches for answering important global research and management questions. Many hundreds of hydrological models have been applied globally across multiple decades to represent and predict forest hydrological p...

  4. Towards Modelling of Hybrid Systems

    DEFF Research Database (Denmark)

    Wisniewski, Rafal

    2006-01-01

    The article is an attempt to use methods of category theory and topology for analysis of hybrid systems. We use the notion of a directed topological space; it is a topological space together with a set of privileged paths. Dynamical systems are examples of directed topological spaces. A hybrid...... system consists of a number of dynamical systems that are glued together according to information encoded in the discrete part of the system. We develop a definition of a hybrid system as a functor from the category generated by a transition system to the category of directed topological spaces. Its...... directed homotopy colimit (geometric realization) is a single directed topological space. The behavior of hybrid systems can be then understood in terms of the behavior of dynamical systems through the directed homotopy colimit....

  5. Service systems concepts, modeling, and programming

    CERN Document Server

    Cardoso, Jorge; Poels, Geert

    2014-01-01

    This SpringerBrief explores the internal workings of service systems. The authors propose a lightweight semantic model for an effective representation to capture the essence of service systems. Key topics include modeling frameworks, service descriptions and linked data, creating service instances, tool support, and applications in enterprises.Previous books on service system modeling and various streams of scientific developments used an external perspective to describe how systems can be integrated. This brief introduces the concept of white-box service system modeling as an approach to mo

  6. Transparent, reproducible and reusable research in pharmacoepidemiology

    NARCIS (Netherlands)

    Gardarsdottir, Helga; Sauer, Brian C.; Liang, Huifang; Ryan, Patrick; Klungel, Olaf; Reynolds, Robert

    2012-01-01

    Background: Epidemiological research has been criticized as being unreliable. Scientific evidence is strengthened when the study procedures of important findings are transparent, open for review, and easily reproduced by different investigators and in various settings. Studies often have common

  7. Thou Shalt Be Reproducible! A Technology Perspective

    Science.gov (United States)

    Mair, Patrick

    2016-01-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies. PMID:27471486
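
    To make the dynamic-report idea concrete, the sketch below renders computed statistics directly into a manuscript fragment, so the reported numbers can never drift from the analysis. The article advocates R-based environments for this; the Python sketch (with an invented dataset and output file name) only mirrors the principle:

```python
# Dynamic-report principle: statistics are computed and injected into the
# manuscript at build time, keeping text and analysis in sync.
import statistics

data = [4.1, 3.8, 5.0, 4.6, 4.3]          # stand-in for an archived dataset
stats = {
    "n": len(data),
    "mean": round(statistics.mean(data), 2),
    "sd": round(statistics.stdev(data), 2),
}

template = (
    "## Results\n"
    "The sample (N = {n}) had a mean score of {mean} (SD = {sd}).\n"
)

with open("report.md", "w") as fh:         # hypothetical output file
    fh.write(template.format(**stats))
```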

  8. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  9. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  10. Balmorel open source energy system model

    DEFF Research Database (Denmark)

    Wiese, Frauke; Bramstoft, Rasmus; Koduvere, Hardi

    2018-01-01

    As the world progresses towards a cleaner energy future with more variable renewable energy sources, energy system models are required to deal with new challenges. This article describes design, development and applications of the open source energy system model Balmorel, which is a result of a l...... transport of local biomass as part of the optimisation and speeding up the model....

  11. Modeling, Control and Coordination of Helicopter Systems

    CERN Document Server

    Ren, Beibei; Chen, Chang; Fua, Cheng-Heng; Lee, Tong Heng

    2012-01-01

    Modeling, Control and Coordination of Helicopter Systems provides a comprehensive treatment of helicopter systems, ranging from related nonlinear flight dynamic modeling and stability analysis to advanced control design for single helicopter systems, and also covers issues related to the coordination and formation control of multiple helicopter systems to achieve high-performance tasks. Ensuring stability in helicopter flight is a challenging problem for nonlinear control design and development. This book is a valuable reference on modeling, control and coordination of helicopter systems, providing readers with practical solutions for the problems that still plague helicopter system design and implementation. Readers will gain a complete picture of helicopters at the systems level, as well as a better understanding of the technical intricacies involved. This book also: Presents a complete picture of modeling, control and coordination for helicopter systems Provides a modeling platform for a general class of ro...

  12. Systems Biology—Biomedical Modeling

    OpenAIRE

    Sobie, Eric A.; Lee, Young-Seon; Jenkins, Sherry L.; Iyengar, Ravi

    2011-01-01

    Because of the complexity inherent in biological systems, many researchers frequently rely on a combination of global analysis and computational approaches to gain insight into both (i) how interacting components can produce complex system behaviors, and (ii) how changes in conditions may alter these behaviors. Because the biological details of a particular system are generally not taught along with the quantitative approaches that enable hypothesis generation and analysis of the system, we d...

  13. Modeling and simulation of systems using Matlab and Simulink

    CERN Document Server

    Chaturvedi, Devendra K

    2009-01-01

    Introduction to Systems: System; Classification of Systems; Linear Systems; Time-Varying vs. Time-Invariant Systems; Lumped vs. Distributed Parameter Systems; Continuous- and Discrete-Time Systems; Deterministic vs. Stochastic Systems; Hard and Soft Systems; Analysis of Systems; Synthesis of Systems; Introduction to System Philosophy; System Thinking; Large and Complex Applied System Engineering: A Generic Modeling. Systems Modeling: Introduction; Need of System Modeling; Modeling Methods for Complex Systems; Classification of Models; Characteristics of Models; Modeling. Mathematical Modeling of Physical Systems: Formulation of State Space Model of Systems; Physical Systems Theory; System Components and Interconnections; Computation of Parameters of a Component; Single Port and Multiport Systems; Techniques of System Analysis; Basics of Linear Graph Theoretic Approach; Formulation of System Model for Conceptual System; Formulation of System Model for Physical Systems; Topological Restrictions; Development of State Model of Degenerative System; Solution of Stat...

  14. Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Bliguet, Marie Le; Kjær, Andreas

    2010-01-01

    This paper describes how relay interlocking systems as used by the Danish railways can be formally modelled and verified. Such systems are documented by circuit diagrams describing their static layout. It is explained how to derive a state transition system model for the dynamic behaviour of a re...

  15. Compositional Modelling of Stochastic Hybrid Systems

    NARCIS (Netherlands)

    Strubbe, S.N.

    2005-01-01

    In this thesis we present a modelling framework for compositional modelling of stochastic hybrid systems. Hybrid systems consist of a combination of continuous and discrete dynamics. The state space of a hybrid system is hybrid in the sense that it consists of a continuous component and a discrete

  16. Models of complex attitude systems

    DEFF Research Database (Denmark)

    Sørensen, Bjarne Taulo

    Existing research on public attitudes towards agricultural production systems is largely descriptive, abstracting from the processes through which members of the general public generate their evaluations of such systems. The present paper adopts a systems perspective on such evaluations, understanding them as embedded into a wider attitude system that consists of attitudes towards objects of different abstraction levels, ranging from personal value orientations over general socio-political attitudes to evaluations of specific characteristics of agricultural production systems. It is assumed that evaluative affect propagates through the system in such a way that the system becomes evaluatively consistent and operates as a schema for the generation of evaluative judgments. In the empirical part of the paper, the causal structure of an attitude system from which people derive their evaluations of pork...

  17. Second order kinetic modeling of headspace solid phase microextraction of flavors released from selected food model systems.

    Science.gov (United States)

    Zhang, Jiyuan; Cheong, Mun-Wai; Yu, Bin; Curran, Philip; Zhou, Weibiao

    2014-09-04

    The application of headspace solid-phase microextraction (HS-SPME) has been widely used in various fields as a simple and versatile method, yet it remains challenging in quantification. In order to improve the reproducibility in quantification, a mathematical model with its roots in psychological modeling and chemical reactor modeling was developed, describing the kinetic behavior of aroma-active compounds extracted by SPME from two different food model systems, i.e., a semi-solid food and a liquid food. The model accounted for both adsorption and release of the analytes from the SPME fiber, which occurred simultaneously but were counter-directed. The model had four parameters, and their estimated values were found to be more reproducible than the direct measurement of the compounds themselves by instrumental analysis. With the relative standard deviation (RSD) of each parameter less than 5% and root mean square error (RMSE) less than 0.15, the model proved robust in estimating the release of a wide range of low molecular weight acetates at three environmental temperatures, i.e., 30, 40 and 60 °C. Further insights into the SPME behavior of small-molecule analytes were also obtained through the kinetic parameters and the model itself.
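
    The abstract does not give the functional form, so the sketch below uses a plausible four-parameter stand-in (simultaneous, counter-directed uptake and release terms) purely to show how such parameters would be estimated and their RSDs derived; it is not the model of Zhang et al.:

```python
# Fit a four-parameter adsorption/release profile to synthetic HS-SPME data
# and report per-parameter relative standard deviations.
import numpy as np
from scipy.optimize import curve_fit

def headspace_signal(t, A, k_ads, B, k_rel):
    # net amount on the fibre: counter-directed adsorption and release
    return A * (1 - np.exp(-k_ads * t)) - B * (1 - np.exp(-k_rel * t))

t = np.linspace(0, 60, 25)                         # extraction time, min
rng = np.random.default_rng(1)
y = headspace_signal(t, 10.0, 0.15, 3.0, 0.03) + rng.normal(0, 0.1, t.size)

popt, pcov = curve_fit(headspace_signal, t, y, p0=[8, 0.1, 2, 0.01])
rsd = 100 * np.sqrt(np.diag(pcov)) / np.abs(popt)  # RSD (%) per parameter
print(popt, rsd)
```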

  18. Second Order Kinetic Modeling of Headspace Solid Phase Microextraction of Flavors Released from Selected Food Model Systems

    Directory of Open Access Journals (Sweden)

    Jiyuan Zhang

    2014-09-01

    Full Text Available The application of headspace solid-phase microextraction (HS-SPME) has been widely used in various fields as a simple and versatile method, yet it remains challenging in quantification. In order to improve the reproducibility in quantification, a mathematical model with its roots in psychological modeling and chemical reactor modeling was developed, describing the kinetic behavior of aroma-active compounds extracted by SPME from two different food model systems, i.e., a semi-solid food and a liquid food. The model accounted for both adsorption and release of the analytes from the SPME fiber, which occurred simultaneously but were counter-directed. The model had four parameters, and their estimated values were found to be more reproducible than the direct measurement of the compounds themselves by instrumental analysis. With the relative standard deviation (RSD) of each parameter less than 5% and root mean square error (RMSE) less than 0.15, the model proved robust in estimating the release of a wide range of low molecular weight acetates at three environmental temperatures, i.e., 30, 40 and 60 °C. Further insights into the SPME behavior of small-molecule analytes were also obtained through the kinetic parameters and the model itself.

  19. Modeling Control Situations in Power System Operations

    DEFF Research Database (Denmark)

    Saleem, Arshad; Lind, Morten; Singh, Sri Niwas

    2010-01-01

    Increased interconnection and loading of the power system along with deregulation has brought new challenges for electric power system operation, control and automation. Traditional power system models used in intelligent operation and control are highly dependent on the task purpose. Thus, a model for intelligent operation and control must represent system features, so that information from measurements can be related to possible system states and to control actions. These general modeling requirements are well understood, but it is, in general, difficult to translate them into a model because of the lack of explicit principles for model construction. This paper presents a work on using explicit means-ends model-based reasoning about complex control situations, which results in maintaining consistent perspectives and selecting appropriate control actions for goal-driven agents. An example of power system...

  20. Data Science Innovations That Streamline Development, Documentation, Reproducibility, and Dissemination of Models in Computational Thermodynamics: An Application of Image Processing Techniques for Rapid Computation, Parameterization and Modeling of Phase Diagrams

    Science.gov (United States)

    Ghiorso, M. S.

    2014-12-01

    Computational thermodynamics (CT) represents a collection of numerical techniques that are used to calculate quantitative results from thermodynamic theory. In the Earth sciences, CT is most often applied to estimate the equilibrium properties of solutions, to calculate phase equilibria from models of the thermodynamic properties of materials, and to approximate irreversible reaction pathways by modeling these as a series of local equilibrium steps. The thermodynamic models that underlie CT calculations relate the energy of a phase to temperature, pressure and composition. These relationships are not intuitive and they are seldom well constrained by experimental data; often, intuition must be applied to generate a robust model that satisfies the expectations of use. As a consequence of this situation, the models and databases that support CT applications in geochemistry and petrology are tedious to maintain as new data and observations arise. What is required to make the process more streamlined and responsive is a computational framework that permits the rapid generation of observable outcomes from the underlying data/model collections, and importantly, the ability to update and re-parameterize the constitutive models through direct manipulation of those outcomes. CT procedures that take models/data to the experiential reference frame of phase equilibria involve function minimization, gradient evaluation, the calculation of implicit lines, curves and surfaces, contour extraction, and other related geometrical measures. All these procedures are the mainstay of image processing analysis. Since the commercial escalation of video game technology, open source image processing libraries have emerged (e.g., VTK) that permit real time manipulation and analysis of images. These tools find immediate application to CT calculations of phase equilibria by permitting rapid calculation and real time feedback between model outcome and the underlying model parameters.
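
    As one concrete instance of the image-processing analogy: a phase boundary is an implicit curve, the zero-level contour of a Gibbs energy difference evaluated on a (T, P) grid, and open source libraries extract such contours directly. A toy sketch; the energy surface below is invented, not a real thermodynamic model:

```python
# Treat delta-G(T, P) between two phases as an "image" and extract the
# phase boundary as its zero-level contour with scikit-image.
import numpy as np
from skimage import measure

T = np.linspace(300.0, 1500.0, 200)       # temperature, K
P = np.linspace(0.0, 3.0e9, 200)          # pressure, Pa
TT, PP = np.meshgrid(T, P, indexing="ij")

# toy Gibbs energy difference between phase A and phase B (J/mol)
dG = 1.2e3 - 1.5 * (TT - 800.0) + 2.0e-6 * (PP - 1.0e9)

# zero contour = computed phase boundary, in (row, col) index coordinates
contours = measure.find_contours(dG, 0.0)
for c in contours:
    Tb = np.interp(c[:, 0], np.arange(T.size), T)   # map indices back
    Pb = np.interp(c[:, 1], np.arange(P.size), P)   # to physical units
    print(Tb[0], Pb[0], "...", Tb[-1], Pb[-1])
```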

  1. Holonic Models for Traffic Control Systems

    Science.gov (United States)

    Ciufudean, Calin; Filote, Constantin

    This paper proposes a new timed-place Petri net model for traffic control systems, specifically railway traffic control systems. This model can be interpreted as a holonic one, and contains three modules: a Transport Planning Module, a Transport Control Module and a Priority Control Module. For railway traffic systems we introduce a strategy in a timed-place Petri net model to solve collision and traffic jam problems.
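
    For readers unfamiliar with the formalism: a Petri net is places holding tokens and transitions that fire when all their input places are marked, so mutual exclusion (e.g. one train per track segment) falls out of the token flow. A minimal sketch with invented place names, not the authors' model:

```python
# Untimed Petri net mechanics: a transition fires when every input place
# holds a token, consuming them and producing tokens downstream.
marking = {"train_waiting": 1, "track_free": 1, "train_on_track": 0,
           "train_arrived": 0}

transitions = {
    "enter_track": {"in": ["train_waiting", "track_free"],
                    "out": ["train_on_track"]},
    "leave_track": {"in": ["train_on_track"],
                    "out": ["train_arrived", "track_free"]},
}

def enabled(t):
    return all(marking[p] > 0 for p in transitions[t]["in"])

def fire(t):
    assert enabled(t), f"{t} is not enabled"
    for p in transitions[t]["in"]:
        marking[p] -= 1
    for p in transitions[t]["out"]:
        marking[p] += 1

fire("enter_track")             # train claims the track segment
print(enabled("enter_track"))   # False: no token in track_free, no collision
fire("leave_track")
print(marking)
```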

  2. Hydrological modeling in forested systems

    Science.gov (United States)

    H.E. Golden; G.R. Evenson; S. Tian; Devendra Amatya; Ge Sun

    2015-01-01

    Characterizing and quantifying interactions among components of the forest hydrological cycle is complex and usually requires a combination of field monitoring and modelling approaches (Weiler and McDonnell, 2004; National Research Council, 2008). Models are important tools for testing hypotheses, understanding hydrological processes and synthesizing experimental data...

  3. Comment on "Most computational hydrology is not reproducible, so is it really science?" by Christopher Hutton et al.

    Science.gov (United States)

    Añel, Juan A.

    2017-03-01

    Nowadays, the majority of the scientific community is not aware of the risks and problems associated with an inadequate use of computer systems for research, particularly regarding the reproducibility of scientific results. Such reproducibility can be compromised by the lack of clear standards and insufficient methodological description of the computational details involved in an experiment. In addition, the inappropriate application or ignorance of copyright laws can have undesirable effects on access to aspects of great importance in the design of experiments and therefore in the interpretation of results. Plain Language Summary: This article highlights several important issues to ensure the scientific reproducibility of results within the current scientific framework, going beyond simple documentation. Several specific examples are discussed in the field of hydrological modeling.

  4. Life-Cycle Models for Survivable Systems

    National Research Council Canada - National Science Library

    Linger, Richard

    2002-01-01

    .... Current software development life-cycle models are not focused on creating survivable systems, and exhibit shortcomings when the goal is to develop systems with a high degree of assurance of survivability...

  5. Model Updating Nonlinear System Identification Toolbox Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology (ZONA) proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology that utilizes flight data with...

  6. Digital clocks: simple Boolean models can quantitatively describe circadian systems.

    Science.gov (United States)

    Akman, Ozgur E; Watterson, Steven; Parton, Andrew; Binns, Nigel; Millar, Andrew J; Ghazal, Peter

    2012-09-07

    The gene networks that comprise the circadian clock modulate biological function across a range of scales, from gene expression to performance and adaptive behaviour. The clock functions by generating endogenous rhythms that can be entrained to the external 24-h day-night cycle, enabling organisms to optimally time biochemical processes relative to dawn and dusk. In recent years, computational models based on differential equations have become useful tools for dissecting and quantifying the complex regulatory relationships underlying the clock's oscillatory dynamics. However, optimizing the large parameter sets characteristic of these models places intense demands on both computational and experimental resources, limiting the scope of in silico studies. Here, we develop an approach based on Boolean logic that dramatically reduces the parametrization, making the state and parameter spaces finite and tractable. We introduce efficient methods for fitting Boolean models to molecular data, successfully demonstrating their application to synthetic time courses generated by a number of established clock models, as well as experimental expression levels measured using luciferase imaging. Our results indicate that despite their relative simplicity, logic models can (i) simulate circadian oscillations with the correct, experimentally observed phase relationships among genes and (ii) flexibly entrain to light stimuli, reproducing the complex responses to variations in daylength generated by more detailed differential equation formulations. Our work also demonstrates that logic models have sufficient predictive power to identify optimal regulatory structures from experimental data. By presenting the first Boolean models of circadian circuits together with general techniques for their optimization, we hope to establish a new framework for the systematic modelling of more complex clocks, as well as other circuits with different qualitative dynamics. In particular, we anticipate
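
    The core of the approach is easy to state: each gene is on or off, and the network is iterated by synchronous application of logic rules, which makes the state and parameter spaces finite. A toy three-gene negative-feedback loop (not one of the paper's fitted clock models) already oscillates and can be nudged by a light input:

```python
# Synchronous Boolean network: a -| c feedback loop gives a period-6
# oscillation; a light input forces gene a on, crudely mimicking entrainment.
def step(state, light=False):
    a, b, c = state
    return ((not c) or light,   # a is repressed by c, induced by light
            a,                  # b is activated by a
            b)                  # c is activated by b

state = (True, False, False)
for t in range(12):                       # free-running oscillation
    print(t, tuple(int(x) for x in state))
    state = step(state)

state = step(state, light=True)           # a light pulse resets the phase
```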

  7. Digital clocks: simple Boolean models can quantitatively describe circadian systems

    Science.gov (United States)

    Akman, Ozgur E.; Watterson, Steven; Parton, Andrew; Binns, Nigel; Millar, Andrew J.; Ghazal, Peter

    2012-01-01

    The gene networks that comprise the circadian clock modulate biological function across a range of scales, from gene expression to performance and adaptive behaviour. The clock functions by generating endogenous rhythms that can be entrained to the external 24-h day–night cycle, enabling organisms to optimally time biochemical processes relative to dawn and dusk. In recent years, computational models based on differential equations have become useful tools for dissecting and quantifying the complex regulatory relationships underlying the clock's oscillatory dynamics. However, optimizing the large parameter sets characteristic of these models places intense demands on both computational and experimental resources, limiting the scope of in silico studies. Here, we develop an approach based on Boolean logic that dramatically reduces the parametrization, making the state and parameter spaces finite and tractable. We introduce efficient methods for fitting Boolean models to molecular data, successfully demonstrating their application to synthetic time courses generated by a number of established clock models, as well as experimental expression levels measured using luciferase imaging. Our results indicate that despite their relative simplicity, logic models can (i) simulate circadian oscillations with the correct, experimentally observed phase relationships among genes and (ii) flexibly entrain to light stimuli, reproducing the complex responses to variations in daylength generated by more detailed differential equation formulations. Our work also demonstrates that logic models have sufficient predictive power to identify optimal regulatory structures from experimental data. By presenting the first Boolean models of circadian circuits together with general techniques for their optimization, we hope to establish a new framework for the systematic modelling of more complex clocks, as well as other circuits with different qualitative dynamics. In particular, we

  8. An expert system for dispersion model interpretation

    International Nuclear Information System (INIS)

    Skyllingstad, E.D.; Ramsdell, J.V.

    1988-10-01

    A prototype expert system designed to diagnose dispersion model uncertainty is described in this paper with application to a puff transport model. The system obtains qualitative information from the model user and through an expert-derived knowledge base, performs a rating of the current simulation. These results can then be used in combination with dispersion model output for deciding appropriate evacuation measures. Ultimately, the goal of this work is to develop an expert system that may be operated accurately by an individual uneducated in meteorology or dispersion modeling. 5 refs., 3 figs

  9. Semantic models for adaptive interactive systems

    CERN Document Server

    Hussein, Tim; Lukosch, Stephan; Ziegler, Jürgen; Calvary, Gaëlle

    2013-01-01

    Providing insights into methodologies for designing adaptive systems based on semantic data, and introducing semantic models that can be used for building interactive systems, this book showcases many of the applications made possible by the use of semantic models.Ontologies may enhance the functional coverage of an interactive system as well as its visualization and interaction capabilities in various ways. Semantic models can also contribute to bridging gaps; for example, between user models, context-aware interfaces, and model-driven UI generation. There is considerable potential for using

  10. Models of the venous system

    DEFF Research Database (Denmark)

    Mehlsen, J

    2000-01-01

    Models of the venous system require at least three elements: a resistor, a capacitor and an inductor, with the latter being of more importance in the venous than in the arterial system. Non-linearities must be considered in pressure/flow relations in the small venules, during venous collapse, or low flow conditions...
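
    The three-element statement maps directly onto a pair of ODEs: the inductor carries dQ/dt, the resistor a term proportional to flow Q, and the capacitor stores volume. A minimal sketch with purely illustrative, non-physiological parameter values:

```python
# Lumped R-C-L model of venous inflow into a compliant compartment:
#   L*dQ/dt = P_in(t) - R*Q - V/C,   dV/dt = Q
import numpy as np
from scipy.integrate import solve_ivp

R, C, L = 0.05, 20.0, 0.005    # resistance, compliance, inertance (toy units)

def p_in(t):
    return 5.0 + 2.0 * np.sin(2 * np.pi * t)   # oscillatory driving pressure

def rhs(t, y):
    V, Q = y                                   # stored volume and flow
    dV = Q
    dQ = (p_in(t) - R * Q - V / C) / L
    return [dV, dQ]

sol = solve_ivp(rhs, (0.0, 10.0), [100.0, 0.0], max_step=0.01)
print(sol.y[1, -1])            # flow after transients have settled
```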

  11. Stochastic Models of Polymer Systems

    Science.gov (United States)

    2016-01-01

    Published in non-peer-reviewed journals (N/A for none): The dynamics of stochastic gradient algorithms (submitted); Noisy Hegselmann-Krause systems. (1) We studied stochastic gradient algorithms for big data applications. (2) We studied stochastic dynamics of polymer systems in the mean field limit. (3) We studied noisy Hegselmann-Krause systems.
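
    Of the systems named, the noisy Hegselmann-Krause model is the most self-contained to sketch: agents repeatedly move to the mean opinion of all agents within a confidence bound, plus noise. Parameter values below are illustrative only:

```python
# Noisy Hegselmann-Krause opinion dynamics on 50 agents.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, 50)      # initial opinions
eps, sigma = 0.2, 0.01             # confidence bound, noise strength

for _ in range(100):
    neighbors = np.abs(x[:, None] - x[None, :]) <= eps   # confidence sets
    x = (neighbors * x[None, :]).sum(axis=1) / neighbors.sum(axis=1)
    x = np.clip(x + rng.normal(0.0, sigma, x.size), 0.0, 1.0)

print(np.sort(x))                  # opinions collapse into a few clusters
```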

  12. wrv: An R Package for Groundwater Flow Model Construction, Wood River Valley Aquifer System, Idaho

    Science.gov (United States)

    Fisher, J. C.

    2014-12-01

    Groundwater models are one of the main tools used in the hydrogeological sciences to assess resources and to simulate possible effects from future water demands and changes in climate. The hydrological inputs to groundwater models can be numerous and can vary in both time and space. Difficulties associated with model construction are often related to extensive datasets and cumbersome data processing tasks. To mitigate these difficulties, a graphical user interface (GUI) is often employed to aid the input of data for creating models. Unfortunately, GUI software presents an obstacle to reproducibility, a cornerstone of research. The considerable effort required to document processing steps in a GUI program, and the rapid obsolescence of these steps with subsequent versions of the software, have prompted modelers to explicitly write down processing steps as source code to make them 'easily' reproducible. This research describes the R package wrv, a collection of datasets and functions for pre- and post-processing the numerical groundwater flow model of the Wood River Valley aquifer system, south-central Idaho. R largely facilitates reproducible modeling with the package vignette: a document that is a combination of content and source code. The code is run when the vignette is built, and all data analysis output (such as figures and tables) is created on the fly and inserted into the final document. The wrv package includes two vignettes that explain and run steps that (1) create package datasets from raw data files located on a publicly accessible repository, and (2) create and run the groundwater flow model. MODFLOW-USG, the numerical groundwater model used in this study, is executed from the vignette, and model output is returned for exploratory analyses. The ability of R to perform all processing steps in a single workflow is attributed to its comprehensive list of features, which include geographic information system and time series functionality.

  13. Switching model photovoltaic pumping system

    Science.gov (United States)

    Anis, Wagdy R.; Abdul-Sadek Nour, M.

    Photovoltaic (PV) pumping systems are widely used due to their simplicity, high reliability and low cost. A directly-coupled PV pumping system is the most reliable and least-cost PV system. The d.c. motor-pump group is not, however, working at its optimum operating point. A battery-buffered PV pumping system introduces a battery between the PV array and the d.c. motor-pump group to ensure that the motor-pump group is operating at its optimum point. The size of the battery storage depends on system economics. If the battery is fully charged while solar radiation is available, the battery will discharge through the load while the PV array is disconnected. Hence, a power loss takes place. To overcome the above-mentioned difficulty, a switched-mode PV pumping system is proposed. When solar radiation is available and the battery is fully charged, the battery is disconnected and the d.c. motor-pump group is directly coupled to the PV array. To avoid excessive operating voltage for the motor, a part of the PV array is switched off to reduce the voltage. As a result, the energy loss is largely eliminated. Detailed analysis of the proposed system shows that the discharged water increases by about 10% when compared with a conventional battery-buffered system. The system transient performance just after the switching moment shows that the system returns to a steady state in a short period. The variations in the system parameters lie within 1% of the rated values.
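
    The proposed control law is a small state machine: battery-buffered operation while the battery can absorb charge, direct coupling once it is full, and array-section shedding when direct coupling would overdrive the motor. A sketch of that logic; threshold values and names are assumptions, not the paper's design:

```python
# Switching rule for the PV pumping system, as interpreted from the abstract.
V_MOTOR_MAX = 110.0     # rated motor voltage (toy value)

def control(soc, irradiance, v_array_full, v_array_reduced):
    if irradiance <= 0:
        return {"source": "battery", "array_sections": 0}
    if soc < 1.0:                       # battery not full: buffer normally
        return {"source": "battery_buffered", "array_sections": "all"}
    # battery full: couple pump directly, shed sections to respect the motor
    if v_array_full <= V_MOTOR_MAX:
        return {"source": "direct", "array_sections": "all"}
    return {"source": "direct", "array_sections": "reduced",
            "voltage": v_array_reduced}

print(control(soc=1.0, irradiance=800.0,
              v_array_full=125.0, v_array_reduced=105.0))
```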

  14. Multiple system modelling of waste management

    International Nuclear Information System (INIS)

    Eriksson, Ola; Bisaillon, Mattias

    2011-01-01

    Highlights: → Linking of models will provide a more complete, correct and credible picture of the systems. → The linking procedure is easy to perform and also leads to activation of project partners. → The simulation procedure is a bit more complicated and calls for the ability to run both models. - Abstract: Due to increased environmental awareness, the planning and performance of waste management have become more and more complex. Waste management has therefore long been subject to different types of modelling. Another field with a long experience of modelling and a systems perspective is energy systems. The two modelling traditions have developed side by side, but so far there are very few attempts to combine them. Waste management systems can be linked together with energy systems through incineration plants. The waste management system can be modelled at quite a detailed level, whereas surrounding systems are modelled in a more simplistic way. This is a problem, as previous studies have shown that assumptions on the surrounding system often tend to be important for the conclusions. In this paper it is shown how two models, one for the district heating system (MARTES) and another one for the waste management system (ORWARE), can be linked together. The strengths and weaknesses of model linking are discussed in comparison with simplistic assumptions about effects in the energy and waste management systems. It is concluded that the linking of models will provide a more complete, correct and credible picture of the consequences of different simultaneous changes in the systems. The linking procedure is easy to perform and also leads to activation of project partners. However, the simulation procedure is a bit more complicated and calls for the ability to run both models.

  15. Modeling of power electronic systems with EMTP

    Science.gov (United States)

    Tam, Kwa-Sur; Dravid, Narayan V.

    1989-01-01

    In view of the potential impact of power electronics on power systems, there is need for a computer modeling/analysis tool to perform simulation studies on power systems with power electronic components as well as to educate engineering students about such systems. The modeling of the major power electronic components of the NASA Space Station Freedom Electric Power System is described along with ElectroMagnetic Transients Program (EMTP) and it is demonstrated that EMTP can serve as a very useful tool for teaching, design, analysis, and research in the area of power systems with power electronic components. EMTP modeling of power electronic circuits is described and simulation results are presented.

  16. System Level Modelling and Performance Estimation of Embedded Systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer

    The advances seen in the semiconductor industry within the last decade have brought the possibility of integrating evermore functionality onto a single chip forming functionally highly advanced embedded systems. These integration possibilities also imply that as the design complexity increases, so...... an efficient system level design methodology, a modelling framework for performance estimation and design space exploration at the system level is required. This thesis presents a novel component based modelling framework for system level modelling and performance estimation of embedded systems. The framework...... is performed by having the framework produce detailed quantitative information about the system model under investigation. The project is part of the national Danish research project, Danish Network of Embedded Systems (DaNES), which is funded by the Danish National Advanced Technology Foundation. The project...

  17. ReproPhylo: An Environment for Reproducible Phylogenomics.

    Directory of Open Access Journals (Sweden)

    Amir Szitenberg

    2015-09-01

    Full Text Available The reproducibility of experiments is key to the scientific process, and particularly necessary for accurate reporting of analyses in data-rich fields such as phylogenomics. We present ReproPhylo, a phylogenomic analysis environment developed to ensure experimental reproducibility, to facilitate the handling of large-scale data, and to assist methodological experimentation. Reproducibility, and instantaneous repeatability, is built into the ReproPhylo system and does not require user intervention or configuration because it stores the experimental workflow as a single, serialized Python object containing explicit provenance and environment information. This 'single file' approach ensures the persistence of provenance across iterations of the analysis, with changes automatically managed by the version control program Git. This file, together with the Git repository, forms the primary reproducibility output of the program. In addition, ReproPhylo produces an extensive human-readable report and generates a comprehensive experimental archive file, both of which are suitable for submission with publications. The system facilitates thorough experimental exploration of both parameters and data. ReproPhylo is a platform independent CC0 Python module and is easily installed as a Docker image or a WinPython self-sufficient package, with a Jupyter Notebook GUI, or as a slimmer version in a Galaxy distribution.
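
    The 'single serialized object under version control' pattern is worth seeing in miniature. The sketch below is a drastic simplification, not ReproPhylo's API: it pickles a dict of parameters, provenance and environment, then commits the file with Git (assuming a repository already exists):

```python
# Serialize an experiment's state and provenance into one file, then
# version it with Git so every iteration of the analysis stays recoverable.
import pickle, platform, subprocess, time

experiment = {
    "params": {"aligner": "mafft", "bootstrap": 100},   # assumed parameters
    "provenance": {"timestamp": time.time(),
                   "python": platform.python_version()},
    "results": None,
}

with open("experiment.pkl", "wb") as fh:
    pickle.dump(experiment, fh)

subprocess.run(["git", "add", "experiment.pkl"], check=True)
subprocess.run(["git", "commit", "-m", "analysis iteration"], check=True)
```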

  18. Network model of security system

    Directory of Open Access Journals (Sweden)

    Adamczyk Piotr

    2016-01-01

    Full Text Available The article presents the concept of building a network security model and its application in the process of risk analysis. It indicates the possibility of a new definition of the role of network models in safety analysis. Special attention was paid to the development of an algorithm describing the process of identifying assets, vulnerabilities and threats in a given context. The aim of the article is to present how this algorithm reduces the complexity of the problem by eliminating from the base model those components that have no links with other components; as a result, it was possible to build a real network model corresponding to reality.
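
    The elimination step has a direct graph-theoretic reading: build a graph of assets, vulnerabilities and threats, then drop every node with no edges before analysing risk paths. A minimal sketch with invented node names:

```python
# Prune unlinked components from a toy security model graph with networkx.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("asset:db_server", "vuln:unpatched_os"),
    ("vuln:unpatched_os", "threat:remote_exploit"),
    ("asset:hr_records", "vuln:weak_access_control"),
])
G.add_node("asset:decommissioned_printer")   # no links to anything

G.remove_nodes_from(list(nx.isolates(G)))    # drop unlinked components
print(sorted(G.nodes))                       # the reduced network model
```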

  19. Modeling Adaptive Behavior for Systems Design

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1994-01-01

    Field studies in modern work systems and analysis of recent major accidents have pointed to a need for better models of the adaptive behavior of individuals and organizations operating in a dynamic and highly competitive environment. The paper presents a discussion of some key characteristics...... of the predictive models required for the design of work support systems, that is, information systems serving as the human-work interface. Three basic issues are in focus: 1.) some fundamental problems in the analysis and modeling of modern dynamic work systems caused by the adaptive nature of human behavior; 2.) the basic difference between the models of system functions used in engineering and design and those evolving from basic research within the various academic disciplines; and finally 3.) the models and methods required for closed-loop, feedback system design....

  20. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A Version Management System (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection, and merging. Traditional VMSs are file-based and consider software systems as a set of text files. File-based VMSs are not adequate for performing software configuration management activities such as version control of software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when using models as the central artifact. The goal of this work is to present a generic framework for model-based VMS which can be used to overcome the problems of traditional file-based VMSs and provide model versioning services. (author)

  1. Very Large System Dynamics Models - Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Jacob J. Jacobson; Leonard Malczynski

    2008-10-01

    This paper provides lessons learned from developing several large system dynamics (SD) models. System dynamics modeling practice emphasizes the need to keep models small so that they are manageable and understandable. This practice is generally reasonable and prudent; however, there are times when large SD models are necessary. This paper outlines two large SD projects that were done at two Department of Energy National Laboratories, the Idaho National Laboratory and Sandia National Laboratories. This paper summarizes the models and then discusses some of the valuable lessons learned during these two modeling efforts.

  2. Test-driven modeling of embedded systems

    DEFF Research Database (Denmark)

    Munck, Allan; Madsen, Jan

    2015-01-01

    To benefit maximally from model-based systems engineering (MBSE), trustworthy high-quality models are required. From the software disciplines it is known that test-driven development (TDD) can significantly increase the quality of the products. Using a test-driven approach with MBSE may have...... a similar positive effect on the quality of the system models and the resulting products and may therefore be desirable. To define a test-driven model-based systems engineering (TD-MBSE) approach, we must define this approach for numerous subdisciplines such as modeling of requirements, use cases......, scenarios, behavior, architecture, etc. In this paper we present a method that utilizes the formalism of timed automata with formal and statistical model checking techniques to apply TD-MBSE to the modeling of system architecture and behavior. The results obtained from applying it to an industrial case...

  3. Precipitation-runoff modeling system; user's manual

    Science.gov (United States)

    Leavesley, G.H.; Lichty, R.W.; Troutman, B.M.; Saindon, L.G.

    1983-01-01

    The concepts, structure, theoretical development, and data requirements of the precipitation-runoff modeling system (PRMS) are described. The precipitation-runoff modeling system is a modular-design, deterministic, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow, sediment yields, and general basin hydrology. Basin response to normal and extreme rainfall and snowmelt can be simulated to evaluate changes in water balance relationships, flow regimes, flood peaks and volumes, soil-water relationships, sediment yields, and groundwater recharge. Parameter-optimization and sensitivity analysis capabilities are provided to fit selected model parameters and evaluate their individual and joint effects on model output. The modular design provides a flexible framework for continued model system enhancement and hydrologic modeling research and development. (Author's abstract)

  4. An Empirical Model for Energy Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rosewater, David Martin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Scott, Paul [TransPower, Poway, CA (United States)

    2016-03-17

    Improved models of energy storage systems are needed to enable the electric grid’s adaptation to increasing penetration of renewables. This paper develops a generic empirical model of energy storage system performance agnostic of type, chemistry, design or scale. Parameters for this model are calculated using test procedures adapted from the US DOE Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage. We then assess the accuracy of this model for predicting the performance of the TransPower GridSaver – a 1 MW rated lithium-ion battery system that underwent laboratory experimentation and analysis. The developed model predicts a range of energy storage system performance based on the uncertainty of estimated model parameters. Finally, this model can be used to better understand the integration and coordination of energy storage on the electric grid.

  5. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
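
    The counting-statistics limit mentioned here is easy to make concrete: if N radioisotope atoms are counted, the Poisson relative uncertainty is 1/sqrt(N), so the 0.5% precision reported requires on the order of 40,000 14C counts before any instrumental factors even enter:

```python
# Counts needed for a target relative precision under Poisson statistics.
import math

for target in (0.01, 0.005, 0.003):          # target relative precision
    n_required = math.ceil(1.0 / target**2)  # N such that 1/sqrt(N) = target
    print(f"{target:.1%} precision -> at least {n_required} 14C counts")
```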

  6. Construction and Application of an LP Farm Model with an Integrated Life Cycle Assessment for the Determination of Sustainable Milk Production Systems

    OpenAIRE

    Mohring, Anke; Zimmermann, Albert

    2005-01-01

    The increasingly stringent conditions underlying Swiss dairy production demand sustainable milk production systems that are economically optimized but also meet the ecological requirements of society. To determine such systems, a comparative-static LP model was constructed at farm level. Realistic production systems are reproduced in the model by means of binary variables. A Life Cycle Assessment (LCA) was integrated into the model to determine the environmental impacts of the farm. To this ...

  7. Brief History of Agricultural Systems Modeling

    Science.gov (United States)

    Jones, James W.; Antle, John M.; Basso, Bruno O.; Boote, Kenneth J.; Conant, Richard T.; Foster, Ian; Godfray, H. Charles J.; Herrero, Mario; Howitt, Richard E.; Janssen, Sandor

    2016-01-01

    Agricultural systems science generates knowledge that allows researchers to consider complex problems or take informed agricultural decisions. The rich history of this science exemplifies the diversity of systems and scales over which they operate and have been studied. Modeling, an essential tool in agricultural systems science, has been accomplished by scientists from a wide range of disciplines, who have contributed concepts and tools over more than six decades. As agricultural scientists now consider the next generation models, data, and knowledge products needed to meet the increasingly complex systems problems faced by society, it is important to take stock of this history and its lessons to ensure that we avoid re-invention and strive to consider all dimensions of associated challenges. To this end, we summarize here the history of agricultural systems modeling and identify lessons learned that can help guide the design and development of next generation of agricultural system tools and methods. A number of past events combined with overall technological progress in other fields have strongly contributed to the evolution of agricultural system modeling, including development of process-based bio-physical models of crops and livestock, statistical models based on historical observations, and economic optimization and simulation models at household and regional to global scales. Characteristics of agricultural systems models have varied widely depending on the systems involved, their scales, and the wide range of purposes that motivated their development and use by researchers in different disciplines. Recent trends in broader collaboration across institutions, across disciplines, and between the public and private sectors suggest that the stage is set for the major advances in agricultural systems science that are needed for the next generation of models, databases, knowledge products and decision support systems. The lessons from history should be considered

  8. Formal Modeling and Analysis of Timed Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Niebert, Peter

    This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts...... of two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal method for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real......-time systems, discrete time systems, timed languages, and real-time operating systems....

  9. Reproducibility of pacing profiles in elite swimmers.

    Science.gov (United States)

    Skorski, Sabrina; Faude, Oliver; Caviezel, Seraina; Meyer, Tim

    2014-03-01

    To analyze the reproducibility of pacing in elite swimmers during competitions and to compare heats and finals within 1 event. Finals and heats of 158 male swimmers (age 22.8 ± 2.9 y) from 29 nations were analyzed in 2 competitions (downloaded from swimrankings.net). Of these, 134 were listed in the world's top 50 in 2010; the remaining 24 were finalists of the Pan Pacific Games or European Championships. The level of both competitions for the analysis had to be at least national championships (7.7 ± 5.4 wk apart). Standard error of measurement expressed as percentage of the subject's mean score (CV) with 90% confidence limits (CL) for each 50-m split time and for total times were calculated. In addition, mixed general modeling was used to determine standard deviations between and within swimmers. CV for total time in finals ranged between 0.8% and 1.3% (CL 0.6-2.2%). Regarding split times, 200-m freestyle showed a consistent pacing over all split times (CV 0.9-1.6%). During butterfly, backstroke, and 400-m freestyle, CVs were low in the first 3 and 7 sections, respectively (CV 0.9-1.7%), with greater variability in the last section (1.9-2.2%). In breaststroke, values were higher in all sections (CV 1.2-2.3%). Within-subject SDs for changes between laps were between 0.9% and 2.6% in all finals. Split-time variability for finals and heats ranged between 0.9% and 2.5% (CL 0.3-4.9%). Pacing profiles are consistent between different competitions. Variability of pacing seems to be a result of the within-subject variation rather than a result of different competitions.

  10. Grey Box Modelling of Hydrological Systems

    DEFF Research Database (Denmark)

    Thordarson, Fannar Ørn

    The main topic of the thesis is grey box modelling of hydrologic systems, as well as the formulation and assessment of their embedded uncertainties. A grey box model is a combination of a white box model, a physically-based model that is traditionally formulated using deterministic ordinary differential...... the lack of fit in state space formulation, and further support decisions for a model expansion. By using stochastic differential equations to formulate the dynamics of the hydrological system, either the complexity of the model can be increased by including the necessary hydrological processes...... in the model, or formulation of process noise can be considered so that it meets the physical limits of the hydrological system and gives an adequate description of the embedded uncertainty in model structure. The thesis consists of two parts: a summary report and a part which contains six scientific papers...
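
    The key mechanism, replacing a deterministic ODE with a stochastic differential equation whose diffusion term absorbs structural uncertainty, can be shown in a few lines. A toy linear-reservoir sketch simulated with the Euler-Maruyama scheme (illustrative parameters, not from the thesis):

```python
# Grey-box pattern: white-box drift (linear reservoir) plus process noise,
# integrated with the Euler-Maruyama scheme.
import numpy as np

k, sigma = 0.3, 0.05        # outflow rate (physical), process noise (grey)
dt, n = 0.1, 500
rng = np.random.default_rng(3)

S = np.empty(n)             # storage state of the reservoir
S[0] = 1.0
for i in range(1, n):
    inflow = 0.2 if i < 250 else 0.0                 # simple forcing
    drift = inflow - k * S[i - 1]                    # dS/dt of the white box
    dW = rng.normal(0.0, np.sqrt(dt))                # Wiener increment
    S[i] = max(S[i - 1] + drift * dt + sigma * dW, 0.0)

print(S[::100])             # sampled trajectory with embedded uncertainty
```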

  11. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

    The Guided System Development framework contributes to more secure communication systems by aiding the development of such systems. The framework features a simple modelling language, step-wise refinement from models to implementation, interfaces to security verification tools, and code generation from...... the verified specification. The refinement process thus carries security properties from the model to the implementation. Our approach also supports verification of systems previously developed and deployed. Internally, the reasoning in our framework is based on the Beliefs and Knowledge tool, a verification...

  12. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed tilt cases and below 8% for all one axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement, and other deviations resulting from seasonal biases in the irradiation models and one axis tracking issues are discussed in detail.

  13. Coupling population dynamics with earth system models: the POPEM model.

    Science.gov (United States)

    Navarro, Andrés; Moreno, Raúl; Jiménez-Alcázar, Alfonso; Tapiador, Francisco J

    2017-09-16

    Precise modeling of CO2 emissions is important for environmental research. This paper presents a new model of human population dynamics that can be embedded into ESMs (Earth System Models) to improve climate modeling. Through a system dynamics approach, we develop a cohort-component model that successfully simulates historical population dynamics with fine spatial resolution (about 1°×1°). The population projections are used to improve the estimates of CO2 emissions, thus transcending the bulk approach of existing models and allowing more realistic non-linear effects to feature in the simulations. The module, dubbed POPEM (from Population Parameterization for Earth Models), is compared with current emission inventories and validated against UN aggregated data. Finally, it is shown that the module can be used to advance toward fully coupling the social and natural components of the Earth system, an emerging research path for environmental science and pollution research.
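
    The cohort-component method itself is compact: each step ages the population one cohort, applies survival rates, and adds births from age-specific fertility. A toy sketch with illustrative rates and no sex structure or migration, unlike the real POPEM:

```python
# Cohort-component projection: age, survive, and add births each step.
import numpy as np

pop = np.array([100.0, 95.0, 90.0, 70.0, 40.0])     # persons per cohort
survival = np.array([0.99, 0.99, 0.98, 0.90])       # cohort-to-cohort survival
fertility = np.array([0.0, 0.08, 0.12, 0.02, 0.0])  # births per person

def project(pop, years):
    for _ in range(years):
        births = float(fertility @ pop)
        aged = pop[:-1] * survival           # survivors move up one cohort
        pop = np.concatenate(([births], aged))
    return pop

print(project(pop, 10))                      # population vector in 10 steps
```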

  14. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    Full Text Available Abstract This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  15. Spinal Cord Injury Model System Information Network

    Science.gov (United States)

    The University of Alabama at Birmingham Spinal Cord Injury Model System (UAB-SCIMS) maintains this Information Network ...

  16. Statistical Model Checking for Stochastic Hybrid Systems

    DEFF Research Database (Denmark)

    David, Alexandre; Du, Dehui; Larsen, Kim Guldstrand

    2012-01-01

    This paper presents novel extensions and applications of the UPPAAL-SMC model checker. The extensions allow for statistical model checking of stochastic hybrid systems. We show how our race-based stochastic semantics extends to networks of hybrid systems, and indicate the integration technique ap...
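
    Statistical model checking estimates the probability that a property holds by running many stochastic simulations and treating the outcomes as Bernoulli trials. The sketch below is a generic Monte Carlo illustration of that idea, not UPPAAL-SMC's algorithm or API; the toy "hybrid" run, its threshold, and the time bound are invented.

        import random

        def smc_estimate(simulate_run, n_runs=10_000, seed=1):
            """Estimate P(property holds) by Monte Carlo over simulated runs,
            with a 95% normal-approximation confidence interval."""
            random.seed(seed)
            hits = sum(simulate_run() for _ in range(n_runs))
            p = hits / n_runs
            half_width = 1.96 * (p * (1 - p) / n_runs) ** 0.5
            return p, (p - half_width, p + half_width)

        # Toy stochastic run: does a noisy ramp cross a threshold within 10 s?
        def run():
            x, t = 0.0, 0.0
            while t < 10.0:
                x += 0.1 + random.gauss(0, 0.2)   # discrete-time approximation of the dynamics
                t += 0.1
                if x >= 8.0:
                    return True
            return False

        print(smc_estimate(run))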

  17. Systemic, Ecological Model for Rehabilitation Counseling.

    Science.gov (United States)

    Hershenson, David B.

    1998-01-01

    Presents a reformulation of Hershenson's theoretical model for rehabilitation counseling in systemic and ecological terms. This macrosystem is organized by precipitating event of disability and is composed of four systems: consumer, functional, provider, and contextual. Discusses the use of the model for selecting rehabilitation interventions…

  18. Modeling complex work systems - method meets reality

    NARCIS (Netherlands)

    van der Veer, Gerrit C.; Hoeve, Machteld; Lenting, Bert

    1996-01-01

    Modeling an existing task situation is often a first phase in the (re)design of information systems. For complex systems design, this model should consider both the people and the organization involved, the work, and situational aspects. Groupware Task Analysis (GTA) as part of a method for the

  19. System dynamics modelling of situation awareness

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2015-11-01

    Full Text Available The feedback loops and delays in the Command and Control system also contribute to the complex dynamic behavior. This paper will build on existing situation awareness models to develop a System Dynamics model to support a qualitative investigation through...

  20. Dynamic modeling of the INAPRO aquaponic system

    NARCIS (Netherlands)

    Karimanzira, Divas; Keesman, Karel J.; Kloas, Werner; Baganz, Daniela; Rauschenbach, Thomas

    2016-01-01

    The use of modeling techniques to analyze aquaponics systems is demonstrated with an example of dynamic modeling for the production of Nile tilapia (Oreochromis niloticus) and tomatoes (Solanum lycopersicon) using the innovative double recirculating aquaponic system ASTAF-PRO. For the management

  1. Formal heterogeneous system modeling with SystemC

    DEFF Research Database (Denmark)

    Niaki, Seyed Hosein Attarzadeh; Jakobsen, Mikkel Koefoed; Sulonen, Tero

    2012-01-01

    Electronic System Level (ESL) design of embedded systems proposes raising the abstraction level of the design entry to cope with the increasing complexity of such systems. To exploit the benefits of ESL, design languages should allow specification of models which are a) heterogeneous, to describe...... to focus on specifying the pure functional aspects. A key advantage is that the formalism is used to export the structure and behavior of the models via introspection as an abstract representation for further analysis and synthesis....

  2. Regression Models for Repairable Systems

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr

    2015-01-01

    Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords: Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf

  3. Mathematical Modeling Of Life-Support Systems

    Science.gov (United States)

    Seshan, Panchalam K.; Ganapathi, Balasubramanian; Jan, Darrell L.; Ferrall, Joseph F.; Rohatgi, Naresh K.

    1994-01-01

    Generic hierarchical model of life-support system developed to facilitate comparisons of options in design of system. Model represents combinations of interdependent subsystems supporting microbes, plants, fish, and land animals (including humans). Generic model enables rapid configuration of variety of specific life support component models for tradeoff studies culminating in single system design. Enables rapid evaluation of effects of substituting alternate technologies and even entire groups of technologies and subsystems. Used to synthesize and analyze life-support systems ranging from relatively simple, nonregenerative units like aquariums to complex closed-loop systems aboard submarines or spacecraft. Model, called Generic Modular Flow Schematic (GMFS), coded in such chemical-process-simulation languages as Aspen Plus and expressed as three-dimensional spreadsheet.

  4. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  5. Modeling and analysis of stochastic systems

    CERN Document Server

    Kulkarni, Vidyadhar G

    2011-01-01

    Based on the author's more than 25 years of teaching experience, Modeling and Analysis of Stochastic Systems, Second Edition covers the most important classes of stochastic processes used in the modeling of diverse systems, from supply chains and inventory systems to genetics and biological systems. For each class of stochastic process, the text includes its definition, characterization, applications, transient and limiting behavior, first passage times, and cost/reward models. Along with reorganizing the material, this edition revises and adds new exercises and examples. New to the second edi

  6. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    Ag nanocubes could be synthesized highly reproducibly by conducting the polyol synthesis with an HCl etchant under dark conditions, because the photodecomposition/photoreduction of the AgCl nanoparticles formed in the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth could be promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was conducted under light conditions, due to the photoreduction of AgCl to Ag.

  7. Formal heterogeneous system modeling with SystemC

    DEFF Research Database (Denmark)

    Niaki, Seyed Hosein Attarzadeh; Jakobsen, Mikkel Koefoed; Sulonen, Tero

    2012-01-01

    Electronic System Level (ESL) design of embedded systems proposes raising the abstraction level of the design entry to cope with the increasing complexity of such systems. To exploit the benefits of ESL, design languages should allow specification of models which are a) heterogeneous, to describe...

  8. A hybrid conceptual-fuzzy inference streamflow modelling for the Letaba River system in South Africa

    Science.gov (United States)

    Katambara, Zacharia; Ndiritu, John G.

    There has been considerable water resources development in South Africa and other regions in the world in order to meet ever-increasing water demands. These developments have not been matched by a similar development of hydrological monitoring systems, and hence there is inadequate data for managing the developed water resources systems. The Letaba River system (Fig. 1) is a typical case of such a system in South Africa. The available water on this river is over-allocated, and reliable daily streamflow modelling of the Letaba River that adequately incorporates the main components and processes would be an invaluable aid to optimal operation of the system. This study describes the development of a calibrated hybrid conceptual-fuzzy-logic model and explores its capability in reproducing the natural processes and human effects on the daily streamflow in the Letaba River. The model performance is considered satisfactory in view of the complexity of the system and the inadequacy of relevant data. Performance in modelling streamflow improves towards the downstream and matches that of a stand-alone fuzzy-logic model. The hybrid model obtains realistic estimates of the major system components and processes, including the capacities of the farm dams and storage weirs and their trajectories. This suggests that for complex data-scarce river systems, hybrid conceptual-fuzzy-logic modelling may be used for more detailed and dependable operational and planning analysis than stand-alone fuzzy modelling. Further work will include developing and testing other hybrid model configurations.
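
    For readers unfamiliar with the fuzzy-logic half of such a hybrid, the sketch below shows a minimal Mamdani-style inference step in Python: two invented rules map a rainfall input to a streamflow estimate via triangular memberships and centroid defuzzification. The rule base, membership breakpoints, and units are illustrative assumptions, not the calibrated Letaba model.

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with breakpoints a < b < c."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def fuzzy_flow(rain_mm):
            """Two illustrative rules: low rain -> low flow, high rain -> high flow.
            Mamdani inference with centroid defuzzification over a flow universe."""
            flow = np.linspace(0, 100, 501)            # hypothetical flow universe [m^3/s]
            w_low = tri(rain_mm, -10.0, 0.0, 30.0)     # activation of the "low rain" rule
            w_high = tri(rain_mm, 10.0, 50.0, 90.0)    # activation of the "high rain" rule
            agg = np.maximum(np.minimum(w_low, tri(flow, 0, 10, 40)),
                             np.minimum(w_high, tri(flow, 30, 80, 100)))
            return float((flow * agg).sum() / agg.sum()) if agg.sum() > 0 else 0.0

        print(fuzzy_flow(25.0))   # crisp flow estimate for 25 mm of rain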

  9. Systems Engineering Model for ART Energy Conversion

    Energy Technology Data Exchange (ETDEWEB)

    Mendez Cruz, Carmen Margarita [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rochau, Gary E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wilson, Mollye C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    The near-term objective of the EC team is to establish an operating, commercially scalable Recompression Closed Brayton Cycle (RCBC) to be constructed for the NE-STEP demonstration system (demo) with the lowest risk possible. A systems engineering approach is recommended to ensure adequate requirements gathering, documentation, and modeling that supports technology development relevant to advanced reactors while supporting crosscut interests in potential applications. A holistic systems engineering model was designed for the ART Energy Conversion program by leveraging Concurrent Engineering, Balance Model, Simplified V Model, and Project Management principles. The resulting model supports the identification and validation of lifecycle Brayton systems requirements, and allows designers to detail system-specific components relevant to the current stage in the lifecycle, while maintaining a holistic view of all system elements.

  10. Modeling aluminum-air battery systems

    Science.gov (United States)

    Savinell, R. F.; Willis, M. S.

    The performance of a complete aluminum-air battery system was studied with a flowsheet model built from unit models of each battery system component. A plug flow model for heat transfer was used to estimate the amount of heat transferred from the electrolyte to the air stream. The effect of shunt currents on battery performance was found to be insignificant. Using the flowsheet simulator to analyze a 100-cell battery system now under development demonstrated that load current, aluminate concentration, and electrolyte temperature are the dominant variables controlling system performance. System efficiency was found to decrease as both load current and aluminate concentration increase. The flowsheet model illustrates the interdependence of separate units on overall system performance.

  11. Experimental challenges to reproduce seismic fault motion

    Science.gov (United States)

    Shimamoto, T.

    2011-12-01

    This presentation briefly reviews scientific and technical development in the studies of intermediate- to high-velocity frictional properties of faults and summarizes the remaining technical challenges in reproducing nucleation-to-growth processes of large earthquakes in the laboratory. Nearly 10 high-velocity or low-to-high-velocity friction apparatuses have been built in the last several years around the world, and it has now become possible to produce sub-plate velocities to seismic slip rates in a single machine. Despite the spread of high-velocity friction studies, reproducing seismic fault motion at high P and T conditions covering the entire seismogenic zone is still a big challenge. Previous studies focused on (1) frictional melting, (2) thermal pressurization, and (3) high-velocity gouge behavior without frictional melting. The frictional melting process was solved as a Stefan problem, with very good agreement with experimental results. Thermal pressurization has been solved theoretically based on measured transport properties and has been included successfully in the modeling of earthquake generation. High-velocity gouge experiments in the last several years have revealed that a wide variety of gouges exhibit dramatic weakening at high velocities (e.g., Di Toro et al., 2011, Nature). Most gouge experiments were done under dry conditions, partly to separate gouge friction from the involvement of thermal pressurization. However, recent studies demonstrated that dehydration or degassing due to mineral decomposition can occur during seismic fault motion. Those results not only provided a new way of looking at natural fault zones in search of geological evidence of seismic fault motion, but also indicated that thermal pressurization and gouge weakening can occur simultaneously even in initially dry gouge. Thus experiments with controlled pore pressure are needed. I have struggled to make a pressure vessel for wet high-velocity experiments in the last several years. A technical

  12. Model-Based Design for Embedded Systems

    CERN Document Server

    Nicolescu, Gabriela

    2009-01-01

    Model-based design allows teams to start the design process from a high-level model that is gradually refined through abstraction levels to ultimately yield a prototype. This book describes the main facets of heterogeneous system design. It focuses on multi-core methodological issues, real-time analysis, and modeling and validation

  13. Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis

    Science.gov (United States)

    Xiao, Di; Wang, Jun

    2012-10-01

    The continuum percolation system is developed to model a random stock price process in this work. Recent empirical research has demonstrated various statistical features of stock price changes, the financial model aiming at understanding price fluctuations needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model, the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of continuum percolation is applied to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model by some analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and the comparisons of return behaviors between the actual data and the simulation data are exhibited.
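
    Of the analysis methods named, Zipf (rank-size) analysis is the easiest to make concrete: sort absolute normalized returns in decreasing order and fit a power law to size versus rank. A minimal sketch follows; the heavy-tailed Student-t sample is an invented stand-in for the model's simulated returns, not data from the paper.

        import numpy as np

        def zipf_exponent(returns):
            """Rank-size (Zipf) analysis of absolute normalized returns:
            fit log(size) = c - s * log(rank) and return the slope s."""
            sizes = np.sort(np.abs(returns))[::-1]
            sizes = sizes[sizes > 0]
            ranks = np.arange(1, sizes.size + 1)
            slope, _ = np.polyfit(np.log(ranks), np.log(sizes), 1)
            return -slope

        rng = np.random.default_rng(42)
        simulated = rng.standard_t(df=3, size=5000)   # heavy-tailed stand-in for model returns
        print(f"Zipf exponent: {zipf_exponent(simulated):.2f}")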

  14. Critically Important Object Security System Element Model

    Directory of Open Access Journals (Sweden)

    I. V. Khomyackov

    2012-03-01

    Full Text Available A stochastic model of critically important object security system element has been developed. The model includes mathematical description of the security system element properties and external influences. The state evolution of the security system element is described by the semi-Markov process with finite states number, the semi-Markov matrix and the initial semi-Markov process states probabilities distribution. External influences are set with the intensity of the Poisson thread.
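
    As a rough illustration of the ingredients named (a finite-state semi-Markov process plus a Poisson stream of external influences), here is a minimal simulation sketch. The state set, transition weights, sojourn means, and attack rate are invented placeholders, not the paper's parameterization.

        import random

        STATES = ["operational", "degraded", "failed"]
        P = {"operational": [0.0, 0.9, 0.1],   # row: transition probabilities on leaving a state
             "degraded":    [0.7, 0.0, 0.3],
             "failed":      [1.0, 0.0, 0.0]}
        MEAN_SOJOURN = {"operational": 100.0, "degraded": 20.0, "failed": 5.0}

        def simulate(horizon=1000.0, attack_rate=0.01, seed=7):
            """Simulate the element's state evolution; external influences arrive
            as a Poisson stream that forces a transition to 'degraded'."""
            random.seed(seed)
            t, state, history = 0.0, "operational", []
            while t < horizon:
                sojourn = random.expovariate(1.0 / MEAN_SOJOURN[state])
                attack = random.expovariate(attack_rate)      # time to next Poisson event
                if state == "operational" and attack < sojourn:
                    t, state = t + attack, "degraded"
                else:
                    t += sojourn
                    state = random.choices(STATES, weights=P[state])[0]
                history.append((t, state))
            return history

        print(simulate()[:5])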

  15. A model for international border management systems.

    Energy Technology Data Exchange (ETDEWEB)

    Duggan, Ruth Ann

    2008-09-01

    To effectively manage the security or control of its borders, a country must understand its border management activities as a system. Using its systems engineering and security foundations as a Department of Energy National Security Laboratory, Sandia National Laboratories has developed such an approach to modeling and analyzing border management systems. This paper describes the basic model and its elements developed under Laboratory Directed Research and Development project 08-684.

  16. Composting in small laboratory pilots: Performance and reproducibility

    International Nuclear Information System (INIS)

    Lashermes, G.; Barriuso, E.; Le Villio-Poitrenaud, M.; Houot, S.

    2012-01-01

    Highlights: ► We design an innovative small-scale composting device including six 4-l reactors. ► We investigate the performance and reproducibility of composting on a small scale. ► Thermophilic conditions are established by self-heating in all replicates. ► Biochemical transformations, organic matter losses and stabilisation are realistic. ► The organic matter evolution exhibits good reproducibility for all six replicates. - Abstract: Small-scale reactors (4 l) were used for monitoring O2 consumption and CO2 emissions, and for characterising the biochemical evolution of organic matter. A good reproducibility was found for the six replicates, with coefficients of variation for all parameters generally lower than 19%. An intense self-heating ensured the existence of a spontaneous thermophilic phase in all reactors. The average loss of total organic matter (TOM) was 46% of the initial content. Compared to the initial mixture, the hot water soluble fraction decreased by 62%, the hemicellulose-like fraction by 68%, the cellulose-like fraction by 50% and the lignin-like fractions by 12% in the final compost. The TOM losses, compost stabilisation and evolution of the biochemical fractions were similar to those observed in large reactors or on-site experiments, except for the lignin degradation, which was less important than in full-scale systems. The reproducibility of the process and the quality of the final compost make it possible to propose the use of this experimental device for research requiring a mass reduction of the initial composted waste mixtures.

  17. A strategic review of electricity systems models

    International Nuclear Information System (INIS)

    Foley, A.M.; O Gallachoir, B.P.; McKeogh, E.J.; Hur, J.; Baldick, R.

    2010-01-01

    Electricity systems models are software tools used to manage electricity demand and the electricity systems, to trade electricity and for generation expansion planning purposes. Various portfolios and scenarios are modelled in order to compare the effects of decision making in policy and on business development plans in electricity systems so as to best advise governments and industry on the least cost economic and environmental approach to electricity supply, while maintaining a secure supply of sufficient quality electricity. The modelling techniques developed to study vertically integrated state monopolies are now applied in liberalised markets where the issues and constraints are more complex. This paper reviews the changing role of electricity systems modelling in a strategic manner, focussing on the modelling response to key developments, the move away from monopoly towards liberalised market regimes and the increasing complexity brought about by policy targets for renewable energy and emissions. The paper provides an overview of electricity systems modelling techniques, discusses a number of key proprietary electricity systems models used in the USA and Europe and provides an information resource to the electricity analyst not currently readily available in the literature on the choice of model to investigate different aspects of the electricity system. (author)

  18. Development of a system emulating the global carbon cycle in Earth system models

    Directory of Open Access Journals (Sweden)

    K. Tachiiri

    2010-08-01

    Full Text Available Recent studies have indicated that the uncertainty in the global carbon cycle may have a significant impact on the climate. Since state-of-the-art models are too computationally expensive for it to be possible to explore their parametric uncertainty in anything approaching a comprehensive fashion, we have developed a simplified system for investigating this problem. By combining the strong points of general circulation models (GCMs), which contain detailed and complex processes, and Earth system models of intermediate complexity (EMICs), which are quick and capable of large ensembles, we have developed a loosely coupled model (LCM) which can represent the outputs of a GCM-based Earth system model using much smaller computational resources. We address the problem of the relatively poor representation of precipitation within our EMIC, which prevents us from directly coupling it to a vegetation model, by coupling it to a precomputed transient simulation using a full GCM. The LCM consists of three components: an EMIC (MIROC-lite), which consists of a 2-D energy balance atmosphere coupled to a low resolution 3-D GCM ocean (COCO) including an ocean carbon cycle (an NPZD-type marine ecosystem model); a state-of-the-art vegetation model (Sim-CYCLE); and a database of daily temperature, precipitation, and other necessary climatic fields to drive Sim-CYCLE from a precomputed transient simulation from a state-of-the-art AOGCM. The transient warming of the climate system is calculated from MIROC-lite, with the global temperature anomaly used to select the most appropriate annual climatic field from the precomputed AOGCM simulation which, in this case, is a 1% pa increasing CO2 concentration scenario.

    By adjusting the effective climate sensitivity (equivalent to the equilibrium climate sensitivity for an energy balance model) of MIROC-lite, the transient warming of the LCM could be adjusted to closely follow the low sensitivity (with an equilibrium
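
    The coupling step described above (using the EMIC's global temperature anomaly to pick the most appropriate annual field from the precomputed AOGCM run) reduces to a nearest-neighbour lookup. A minimal sketch, with an invented anomaly library standing in for the 1% pa CO2 run:

        import numpy as np

        def select_climate_year(target_anomaly, library_anomalies):
            """Return the index of the precomputed AOGCM year whose global
            temperature anomaly is closest to the EMIC's current anomaly."""
            return int(np.argmin(np.abs(np.asarray(library_anomalies) - target_anomaly)))

        library = np.linspace(0.0, 4.0, 140)       # hypothetical per-year anomalies [K]
        print(select_climate_year(1.37, library))  # index of the nearest-anomaly year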

  19. Human performance modeling for system of systems analytics.

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, Kevin R.; Lawton, Craig R.; Basilico, Justin Derrick; Longsine, Dennis E. (INTERA, Inc., Austin, TX); Forsythe, James Chris; Gauthier, John Henry; Le, Hai D.

    2008-10-01

    A Laboratory-Directed Research and Development project was initiated in 2005 to investigate Human Performance Modeling in a System of Systems analytic environment. SAND2006-6569 and SAND2006-7911 document interim results from this effort; this report documents the final results. The problem is difficult because of the number of humans involved in a System of Systems environment and the generally poorly defined nature of the tasks that each human must perform. A two-pronged strategy was followed: one prong was to develop human models using a probability-based method similar to that first developed for relatively well-understood, probability-based performance modeling; the other prong was to investigate more state-of-the-art human cognition models. The probability-based modeling resulted in a comprehensive addition of human-modeling capability to the existing SoSAT computer program. The cognitive modeling resulted in an increased understanding of what is necessary to incorporate cognition-based models into a System of Systems analytic environment.

  20. Reliability models for Space Station power system

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kim, Y.; Wagner, H.

    1987-01-01

    This paper presents a methodology for the reliability evaluation of Space Station power system. The two options considered are the photovoltaic system and the solar dynamic system. Reliability models for both of these options are described along with the methodology for calculating the reliability indices.
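
    Reliability models of this kind ultimately reduce to combining component reliabilities through series and parallel (redundant) structures. The sketch below shows that arithmetic on an invented component layout; the numbers are not from the paper and do not describe either Space Station option.

        def series(*r):      # all components must work
            out = 1.0
            for x in r:
                out *= x
            return out

        def parallel(*r):    # at least one component must work (redundancy)
            out = 1.0
            for x in r:
                out *= (1.0 - x)
            return 1.0 - out

        # Hypothetical option: two redundant array strings feeding a battery
        # and a power management unit in series (all reliabilities invented).
        r_system = series(parallel(0.95, 0.95), 0.98, 0.99)
        print(f"system reliability: {r_system:.4f}")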

  1. System identification application using Hammerstein model

    Indian Academy of Sciences (India)

    Saban Ozer

    because of its advanced theoretical background [3–5, 10]. However, many systems in real life have nonlinear behaviour ... To describe a polynomial non-linear system with memory, the Volterra series expansion has been the ... suppression and adaptive noise suppression [19]. 2.3 Hammerstein model. Many systems can be ...
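
    Although the record above is only a fragment, the Hammerstein structure it names is concrete: a static nonlinearity followed by linear dynamics. A minimal identification sketch using over-parameterized least squares on simulated data is below; the polynomial coefficients, first-order dynamics, and noise level are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        u = rng.uniform(-1, 1, 2000)                      # input signal
        x = 1.0 * u + 0.5 * u**2 - 0.3 * u**3             # static polynomial nonlinearity
        y = np.zeros_like(x)                              # linear block: y[k] = 0.6 y[k-1] + x[k-1]
        for k in range(1, len(x)):
            y[k] = 0.6 * y[k - 1] + x[k - 1]
        y += 0.01 * rng.standard_normal(len(y))           # measurement noise

        # Regress y[k] on y[k-1] and powers of u[k-1] (over-parameterization)
        Phi = np.column_stack([y[:-1], u[:-1], u[:-1]**2, u[:-1]**3])
        theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
        print(theta)   # should recover approximately [0.6, 1.0, 0.5, -0.3]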

  2. Using Interaction Scenarios to Model Information Systems

    DEFF Research Database (Denmark)

    Bækgaard, Lars; Bøgh Andersen, Peter

    The purpose of this paper is to define and discuss a set of interaction primitives that can be used to model the dynamics of socio-technical activity systems, including information systems, in a way that emphasizes structural aspects of the interaction that occurs in such systems. The primitives...

  3. Reproducibility of Computer-Aided Detection Marks in Digital Mammography

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Kim, Sun Mi; Im, Jung Gi; Cha, Joo Hee

    2007-01-01

    To evaluate the performance and reproducibility of a computer-aided detection (CAD) system in mediolateral oblique (MLO) digital mammograms taken serially, without release of breast compression. A CAD system was applied preoperatively to the full-field digital mammograms of two MLO views taken without release of breast compression in 82 patients (age range: 33-83 years; mean age: 49 years) with previously diagnosed breast cancers. The total number of visible lesion components in the 82 patients was 101: 66 masses and 35 microcalcifications. We analyzed the sensitivity and reproducibility of the CAD marks. The sensitivity of the CAD system for first MLO views was 71% (47/66) for masses and 80% (28/35) for microcalcifications. The sensitivity of the CAD system for second MLO views was 68% (45/66) for masses and 17% (6/35) for microcalcifications. In 84 ipsilateral serial MLO image sets (two patients had bilateral cancers), identical images, regardless of the existence of CAD marks, were obtained for 35% (29/84), and identical images with CAD marks were obtained for 29% (23/78). For contralateral MLO images, identical images regardless of the existence of CAD marks were obtained for 65% (52/80), and identical images with CAD marks were obtained for 28% (11/39). The reproducibility of CAD marks for the true positive masses in serial MLO views was 84% (42/50), and that for the true positive microcalcifications was 0% (0/34). The CAD system in digital mammograms showed a high sensitivity for detecting masses and microcalcifications. However, the reproducibility of microcalcification marks was very low in MLO views taken serially without release of breast compression. Minute positional changes and patient movement can alter the images and have a significant effect on the algorithm utilized by the CAD system for detecting microcalcifications.

  4. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. 
Nathan; Pitts, Michael C.; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric-Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  5. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research

  6. REPRODUCIBILITY OF CHILDHOOD RESPIRATORY SYMPTOM QUESTIONS

    NARCIS (Netherlands)

    BRUNEKREEF, B; GROOT, B; RIJCKEN, B; HOEK, G; STEENBEKKERS, A; DEBOER, A

    The reproducibility of answers to childhood respiratory symptom questions was investigated by administering two childhood respiratory symptom questionnaires twice, with a one month interval, to the same population of Dutch school children. The questionnaires were completed by the parents of 410

  7. Reply to the comment of S. Rayne on "QSAR model reproducibility and applicability: A case study of rate constants of hydroxyl radical reaction models applied to polybrominated diphenyl ethers and (benzo-)triazoles".

    Science.gov (United States)

    Gramatica, Paola; Kovarich, Simona; Roy, Partha Pratim

    2013-07-30

    We appreciate the interest of Dr. Rayne on our article and we completely agree that the dataset of (benzo-)triazoles, which were screened by the hydroxyl radical reaction quantitative structure-activity relationship (QSAR) model, was not only composed of benzo-triazoles but also included some simpler triazoles (without the condensed benzene ring), such as the chemicals listed by Dr. Rayne, as well as some related heterocycles (also few not aromatic). We want to clarify that in this article (as well as in other articles in which the same dataset was screened), for conciseness, the abbreviations (B)TAZs and BTAZs were used as general (and certainly too simplified) notations meaning an extended dataset of benzo-triazoles, triazoles, and related compounds. Copyright © 2013 Wiley Periodicals, Inc.

  8. Data Processing Workflows to Support Reproducible Data-driven Research in Hydrology

    Science.gov (United States)

    Goodall, J. L.; Essawy, B.; Xu, H.; Rajasekar, A.; Moore, R. W.

    2015-12-01

    Geoscience analyses often require the use of existing data sets that are large, heterogeneous, and maintained by different organizations. A particular challenge in creating reproducible analyses using these data sets is automating the workflows required to transform raw datasets into model specific input files and finally into publication ready visualizations. Data grids, such as the Integrated Rule-Oriented Data System (iRODS), are architectures that allow scientists to access and share large data sets that are geographically distributed on the Internet, but appear to the scientist as a single file management system. The DataNet Federation Consortium (DFC) project is built on iRODS and aims to demonstrate data and computational interoperability across scientific communities. This paper leverages iRODS and the DFC to demonstrate how hydrological modeling workflows can be encapsulated as workflows using the iRODS concept of Workflow Structured Objects (WSO). An example use case is presented for automating hydrologic model post-processing routines that demonstrates how WSOs can be created and used within the DFC to automate the creation of data visualizations from large model output collections. By co-locating the workflow used to create the visualization with the data collection, the use case demonstrates how data grid technology aids in reuse, reproducibility, and sharing of workflows within scientific communities.
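
    The post-processing use case (turning a collection of model output files into publication-ready figures) can be pictured with a short stand-alone sketch. To be clear, this is not iRODS or DFC code and does not use the WSO API; it is a hypothetical local stand-in for the kind of step a WSO would encapsulate, and the directory layout and two-column CSV format are assumptions.

        from pathlib import Path
        import numpy as np
        import matplotlib
        matplotlib.use("Agg")            # headless rendering for batch jobs
        import matplotlib.pyplot as plt

        def visualize_collection(collection_dir, out_dir):
            """Render one PNG per model output file in a collection."""
            out = Path(out_dir)
            out.mkdir(parents=True, exist_ok=True)
            for csv in sorted(Path(collection_dir).glob("*.csv")):
                data = np.loadtxt(csv, delimiter=",")   # assumed columns: time, discharge
                plt.figure()
                plt.plot(data[:, 0], data[:, 1])
                plt.xlabel("time")
                plt.ylabel("discharge")
                plt.savefig(out / (csv.stem + ".png"))
                plt.close()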

  9. Reproducibility of quantitative high-throughput BI-RADS features extracted from ultrasound images of breast cancer.

    Science.gov (United States)

    Hu, Yuzhou; Qiao, Mengyun; Guo, Yi; Wang, Yuanyuan; Yu, Jinhua; Li, Jiawei; Chang, Cai

    2017-07-01

    Digital Breast Imaging Reporting and Data System (BI-RADS) features extracted from ultrasound images are essential in computer-aided diagnosis, prediction, and prognosis of breast cancer. This study focuses on the reproducibility of quantitative high-throughput BI-RADS features in the presence of variations due to different segmentation results, various ultrasound machine models, and multiple ultrasound machine settings. Dataset 1 consists of 399 patients with invasive breast cancer and is used as the training set to measure the reproducibility of features, while dataset 2 consists of 138 other patients and is a validation set used to evaluate the diagnosis performance of the final reproducible features. Four hundred and sixty high-throughput BI-RADS features are designed and quantized according to the BI-RADS lexicon. The Concordance Correlation Coefficient (CCC) and Deviation (Dev) are used to assess the effect of the segmentation methods, and the Between-class Distance (BD) is used to study the influence of the machine models. In addition, the features jointly shared by the two methodologies are further investigated for their behaviour under multiple machine settings. Subsequently, the absolute value of the Pearson Correlation Coefficient (R_abs) is applied for redundancy elimination. Finally, the features that are reproducible and not redundant are preserved as the stable feature set. A Support Vector Machine (SVM) classifier with 10-fold cross-validation is employed to verify the diagnostic ability. One hundred and fifty-three features were found to have high reproducibility (CCC > 0.9 and low Dev); the studied sources of variation affected the remaining BI-RADS features to various degrees. Our 46 reproducible features were robust to these factors and were capable of distinguishing benign and malignant breast tumors. © 2017 American Association of Physicists in Medicine.
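
    The reproducibility screen hinges on Lin's Concordance Correlation Coefficient, which, unlike plain Pearson correlation, penalizes both location and scale shifts between two measurements of the same quantity. A minimal sketch (the vectors are invented stand-ins for one feature computed under two segmentation results):

        import numpy as np

        def concordance_cc(x, y):
            """Lin's concordance correlation coefficient between two measurements."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()               # population variances
            cov = ((x - mx) * (y - my)).mean()
            return 2 * cov / (vx + vy + (mx - my) ** 2)

        a = np.array([1.0, 2.1, 3.0, 4.2, 5.1])
        b = np.array([1.1, 2.0, 3.2, 4.0, 5.3])
        print(f"CCC = {concordance_cc(a, b):.3f}")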

  10. Model Validation for Simulations of Vehicle Systems

    Science.gov (United States)

    2012-08-01

    jackknife”, Annals of Statistics, 7:1-26, 1979. [45] B. Efron and G. Gong, “A leisurely look at the bootstrap, the jackknife, and cross-validation”, The ... Sandia National Laboratories and a battery model developed in the Automotive Research Center, a US Army Center of Excellence for modeling and simulation of ground vehicle systems ...

  11. Models of cuspy triaxial stellar systems. IV: Rotating systems

    OpenAIRE

    Carpintero, D. D.; Muzzio, J. C.

    2016-01-01

    We built two self-consistent models of triaxial, cuspy, rotating stellar systems adding rotation to non-rotating models presented in previous papers of this series. The final angular velocity of the material is not constant and varies with the distance to the center and with the height over the equator of the systems, but the figure rotation is very uniform in both cases. Even though the addition of rotation to the models modifies their original semiaxes ratios, the final rotating models are ...

  12. Spatial Models and Networks of Living Systems

    DEFF Research Database (Denmark)

    Juul, Jeppe Søgaard

    When studying the dynamics of living systems, insight can often be gained by developing a mathematical model that can predict future behaviour of the system or help classify system characteristics. However, in living cells, organisms, and especially groups of interacting individuals, a large number of different factors influence the time development of the system. This often makes it challenging to construct a mathematical model from which to draw conclusions. One traditional way of capturing the dynamics in a mathematical model is to formulate a set of coupled differential equations for the essential ... with interactions defined by network topology. In this thesis I first describe three different biological models of ageing and cancer, in which spatial structure is important for the system dynamics. I then turn to describe characteristics of ecosystems consisting of three cyclically interacting species ...
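
    The "three cyclically interacting species" setup is the classic rock-paper-scissors ecology, which is easy to write down as coupled differential equations. The sketch below integrates a minimal, well-mixed version with forward Euler; the rate, initial densities, and step size are invented, and the thesis's spatial and network structure is deliberately omitted.

        import numpy as np

        def cyclic_lotka_volterra(n=(0.4, 0.3, 0.3), steps=5000, dt=0.01, alpha=1.0):
            """Rock-paper-scissors dynamics: species A beats B, B beats C, C beats A.
            dn_i/dt = alpha * n_i * (n_prey - n_predator)."""
            n = np.array(n, float)
            trajectory = [n.copy()]
            for _ in range(steps):
                prey = np.roll(n, -1)        # the species each one beats
                predator = np.roll(n, 1)     # the species each one loses to
                n = n + dt * alpha * n * (prey - predator)
                trajectory.append(n.copy())
            return np.array(trajectory)

        traj = cyclic_lotka_volterra()
        print(traj[-1])   # densities oscillate; their sum is (approximately) conserved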

  13. Atomic Weights Confirm Bipolar Model of Oscillations in a Chain System

    Directory of Open Access Journals (Sweden)

    Ries A.

    2013-10-01

    Full Text Available We apply the bipolar model of oscillations in a chain system to the data set of standard atomic weights. 90% of these masses could be reproduced by this model and were expressed in continued fraction form, where all numerators are Euler’s number and the sum of the free link and all partial denominators yields zero. All outliers were either radioactive or polynuclidic elements whose isotopic compositions as found in samples on Earth might not be fully representative for the mean values when considering samples from all parts of the universe.
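
    The mass formula described here is a continued fraction whose numerators are all Euler's number e. A minimal evaluator is sketched below; the free link p0 and the partial denominators in the example call are hypothetical placeholders, not values fitted to any atomic weight.

        import math

        def euler_continued_fraction(p0, denominators):
            """Evaluate p0 + e/(d1 + e/(d2 + e/(...))) with Euler's number e
            as every numerator, as in the bipolar oscillation model."""
            value = denominators[-1]
            for d in reversed(denominators[:-1]):
                value = d + math.e / value
            return p0 + math.e / value

        # Hypothetical free link and partial denominators (not fitted values):
        print(euler_continued_fraction(0.0, [6.0, -6.0, 12.0]))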

  14. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    Full Text Available This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed on 324 students, whereas reproducibility was studied on a different random sample of 162 who were exposed to the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys who completed the test-retest study. The DEE derived from the questionnaire was compared with the laboratory measurement results of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICC between the mean DEE as estimated by the questionnaire and the direct and indirect Peak VO2 was 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 it was 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80), respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake.

  15. Mechatronic Systems Design Methods, Models, Concepts

    CERN Document Server

    Janschek, Klaus

    2012-01-01

    In this textbook, fundamental methods for model-based design of mechatronic systems are presented in a systematic, comprehensive form. The method framework presented here comprises domain-neutral methods for modeling and performance analysis: multi-domain modeling (energy/port/signal-based), simulation (ODE/DAE/hybrid systems), robust control methods, stochastic dynamic analysis, and quantitative evaluation of designs using system budgets. The model framework is composed of analytical dynamic models for important physical and technical domains of realization of mechatronic functions, such as multibody dynamics, digital information processing and electromechanical transducers. Building on the modeling concept of a technology-independent generic mechatronic transducer, concrete formulations for electrostatic, piezoelectric, electromagnetic, and electrodynamic transducers are presented. More than 50 fully worked out design examples clearly illustrate these methods and concepts and enable independent study of th...

  16. FSM Model of a Simple Photovoltaic System

    Directory of Open Access Journals (Sweden)

    Martina Latkova

    2015-01-01

    Full Text Available The paper describes a simulation model of a simple photovoltaic system, intended as a tool for testing the use of finite state machines for simulations representing the long-term operation of renewable energy sources. The mathematical model of the photovoltaic system is described first. It is then used to build a finite state machine model that calculates the power output of the photovoltaic system for changing values of solar irradiance and temperature. Data measured on a real photovoltaic installation are used to verify the model's accuracy through comparison with a previously created and verified Matlab model. The finite state machine model presented in this paper was created using the Ptolemy II software.
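
    The per-state calculation such a model performs can be approximated by a standard simplified PV power equation: linear scaling with irradiance plus a temperature derate relative to standard test conditions. A minimal sketch follows; the rated power and temperature coefficient are generic crystalline-silicon values, not parameters of the paper's installation.

        def pv_power(irradiance_w_m2, cell_temp_c,
                     p_stc_w=250.0, gamma_per_c=-0.004):
            """Simplified PV power model: linear in irradiance, with a
            temperature derate relative to STC (1000 W/m^2, 25 degC).
            Parameters are illustrative, not fitted."""
            return p_stc_w * (irradiance_w_m2 / 1000.0) \
                           * (1.0 + gamma_per_c * (cell_temp_c - 25.0))

        print(pv_power(800.0, 45.0))   # about 184 W for one module in this toy case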

  17. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  18. LHC Orbit Correction Reproducibility and Related Machine Protection

    OpenAIRE

    Baer, T; Fuchsberger, K; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure an adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the...

  19. A MICROCOMPUTER MODEL FOR IRRIGATION SYSTEM EVALUATION

    OpenAIRE

    Williams, Jeffery R.; Buller, Orlan H.; Dvorak, Gary J.; Manges, Harry L.

    1988-01-01

    ICEASE (Irrigation Cost Estimator and System Evaluator) is a microcomputer model designed and developed to meet the need for conducting economic evaluation of adjustments to irrigation systems and management techniques to improve the use of irrigated water. ICEASE can calculate the annual operating costs for irrigation systems and has five options that can be used to economically evaluate improvements in the pumping plant or the way the irrigation system is used for crop production.

  20. MODEL DRIVEN DEVELOPMENT OF ONLINE BANKING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Bresfelean Vasile Paul

    2011-07-01

    Full Text Available In the case of online applications, the software development cycle differs from the routine. The online environment, the variety of users, the manageability of the mass of information they create, reusability, and accessibility from different devices are all factors in these systems' complexity. The use of a model-driven approach brings several advantages that ease the development process. Working prototypes, which simplify the client relationship and serve as the basis of model tests, can easily be generated from models describing the system. These systems make it possible for a bank's clients to perform their desired actions from anywhere. The user has the possibility of accessing information or making transactions.

  1. Modelling and parameter estimation of dynamic systems

    CERN Document Server

    Raol, JR; Singh, J

    2004-01-01

    Parameter estimation is the process of using observations from a system to develop mathematical models that adequately represent the system dynamics. The assumed model consists of a finite set of parameters, the values of which are calculated using estimation techniques. Most of the techniques that exist are based on least-square minimization of error between the model response and actual system response. However, with the proliferation of high speed digital computers, elegant and innovative techniques like filter error method, H-infinity and Artificial Neural Networks are finding more and mor
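
    The core idea (least-squares minimization of the error between model response and measured system response) fits in a few lines for a linear discrete-time model. The sketch below recovers the two parameters of an invented first-order system from noisy simulated data; the true values, input signal, and noise level are illustrative assumptions.

        import numpy as np

        # Estimate a and b in y[k] = a*y[k-1] + b*u[k-1] from noisy observations.
        rng = np.random.default_rng(3)
        a_true, b_true = 0.8, 0.5
        u = rng.standard_normal(500)
        y = np.zeros(500)
        for k in range(1, 500):
            y[k] = a_true * y[k - 1] + b_true * u[k - 1]
        y_meas = y + 0.05 * rng.standard_normal(500)   # measurement noise

        Phi = np.column_stack([y_meas[:-1], u[:-1]])
        (a_hat, b_hat), *_ = np.linalg.lstsq(Phi, y_meas[1:], rcond=None)
        print(a_hat, b_hat)   # close to 0.8 and 0.5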

  2. System Model of Daily Sediment Yield

    Science.gov (United States)

    Sharma, T. C.; Dickinson, W. T.

    1980-06-01

    Input-output systems concepts have been applied to the modeling of daily runoff-sediment yield of the Thames River in southern Ontario, Canada. Spectral and correlation techniques have been used to construct a parsimonious model of daily sediment yields. It is shown that a linear discrete dynamic model is possible in terms of the log-transformed daily runoff and sediment yield sequences. The fluvial system of the Thames River watershed exhibits a weak memory on a daily basis, and the noise component corrupting the watershed fluvial system resembles a white noise process.

  3. Externalizing Behaviour for Analysing System Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof

    2013-01-01

    System models have recently been introduced to model organisations and evaluate their vulnerability to threats and especially insider threats. Especially for the latter these models are very suitable, since insiders can be assumed to have more knowledge about the attacked organisation than outsiders ... a difficult, if not impossible task to change behaviours. Especially when considering social engineering or the human factor in general, the ability to use different kinds of behaviours is essential. In this work we present an approach to make the behaviour a separate component in system models, and explore how to integrate...

  4. A pebbles accretion model with chemistry and implications for the Solar system

    Science.gov (United States)

    Ali-Dib, Mohamad

    2017-02-01

    We investigate the chemical composition of the Solar system's giant planet atmospheres using a physical formation model with chemistry. The model incorporates disc evolution, pebble and gas accretion, type I and II migration, simplified disc photoevaporation, and Solar system chemical measurements. We track the chemical compositions of the formed giant planets and compare them to the observed values. Two categories of models are studied: with and without disc chemical enrichment via photoevaporation (PE). Predictions for the oxygen and nitrogen abundances, core masses, and total amount of heavy elements in the planets are made for each case. We find that in the case without disc PE, both Jupiter and Saturn will have a small residual core and comparable total amounts of heavy elements in their envelopes. We predict oxygen abundance enrichments of the same order as carbon, phosphorus, and sulfur for both planets. Cometary nitrogen abundances do not allow us to easily reproduce Jupiter's nitrogen observations. In the case with disc PE, less core erosion is needed to reproduce the chemical composition of the atmospheres, so both planets will end up with possibly more massive residual cores and a higher total mass of heavy elements. It is also significantly easier to reproduce Jupiter's nitrogen abundance. No single disc was found to form both Jupiter and Saturn with all their constraints in the case without photoevaporation. No model was able to fit the constraints on Uranus and Neptune, hinting towards a more complicated formation mechanism for these planets. The predictions of these models should be compared to the upcoming Juno measurements to better understand the origins of the Solar system giant planets.

  5. Model transformation based information system modernization

    Directory of Open Access Journals (Sweden)

    Olegas Vasilecas

    2013-03-01

    Full Text Available Information systems become dated increasingly quickly because of the rapidly changing business environment. Usually, small changes are not sufficient to adapt complex legacy information systems to changing business needs. New functionality should be installed with the requirement of putting business data at the smallest possible risk. This paper analyses information system modernization problems and proposes a method for information system modernization. It involves transformation of program code into an abstract syntax tree metamodel (ASTM) and model-based transformation from ASTM into the knowledge discovery metamodel (KDM). The method is validated on an example for the SQL language.

  6. An automatic system using mobile-agent software to model the calculation process of a chemical vapor deposition film deposition simulator.

    Science.gov (United States)

    Takahashi, Takahiro; Fukui, Noriyuki; Arakawa, Masamoto; Funatsu, Kimito; Ema, Yoshinori

    2011-09-01

    We have developed an automatic modeling system for the calculation processes of a simulator that reproduces experimental results of chemical vapor deposition (CVD), in order to decrease the simulator's calculation cost. Replacing the simulator with the mathematical models proposed by the system will contribute towards decreasing the calculation costs of predicting the experimental results. The system consists of a mobile agent and two software resources in computer networks, that is, generalized modeling software and a simulator reproducing cross-sections of the films deposited on substrates with micrometer- or nanometer-sized trenches. The mobile agent autonomously creates appropriate models by moving to and then operating the software resources. The models are calculated by partial least squares regression (PLS), quadratic PLS (QPLS) and error back propagation (BP) methods using artificial neural networks (ANNs), and are expressed as mathematical formulas to reproduce the calculated results of the simulator. The models show good reproducibility and predictability for both the uniformity and the filling properties of the films calculated by the simulator. The models using the BP method yield the best performance. The filling property data are more suitable for modeling than the film uniformity data.
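
    The pattern here (fitting a cheap regression model such as PLS to a simulator's input-output samples, then predicting instead of re-running the simulator) is easy to sketch. Below, a stand-in function plays the expensive simulator and scikit-learn's PLSRegression is the surrogate; the process variables, their ranges, and the response are invented and unrelated to the actual CVD code.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def fake_simulator(X):
            """Cheap stand-in for an expensive film-deposition simulator."""
            pressure, temperature, flow = X.T
            return (0.9 - 0.002 * pressure + 0.001 * temperature * flow)[:, None]

        rng = np.random.default_rng(1)
        X_train = rng.uniform([10, 300, 1], [100, 900, 10], size=(200, 3))
        y_train = fake_simulator(X_train)

        surrogate = PLSRegression(n_components=2).fit(X_train, y_train)
        X_new = np.array([[55.0, 600.0, 5.0]])
        print(surrogate.predict(X_new))   # fast estimate instead of a simulator run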

  7. Reproducibility and Practical Adoption of GEOBIA with Open-Source Software in Docker Containers

    Directory of Open Access Journals (Sweden)

    Christian Knoth

    2017-03-01

    Full Text Available Geographic Object-Based Image Analysis (GEOBIA) mostly uses proprietary software, but the interest in Free and Open-Source Software (FOSS) for GEOBIA is growing. This interest stems not only from cost savings, but also from benefits concerning reproducibility and collaboration. Technical challenges hamper practical reproducibility, especially when multiple software packages are required to conduct an analysis. In this study, we use containerization to package a GEOBIA workflow in a well-defined FOSS environment. We explore the approach using two software stacks to perform an exemplary analysis detecting destruction of buildings in bi-temporal images of a conflict area. The analysis combines feature extraction techniques with segmentation and object-based analysis to detect changes using automatically-defined local reference values and to distinguish disappeared buildings from non-target structures. The resulting workflow is published as FOSS comprising both the model and data in a ready-to-use Docker image and a user interface for interaction with the containerized workflow. The presented solution advances GEOBIA in the following aspects: higher transparency of methodology; easier reuse and adaption of workflows; better transferability between operating systems; complete description of the software environment; and easy application of workflows by image analysis experts and non-experts. As a result, it promotes not only the reproducibility of GEOBIA, but also its practical adoption.

  8. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found that uncertainties in the models for POA irradiance and effective irradiance are the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
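
    The propagation scheme described above, sampling each submodel's empirical residual distribution and pushing the samples through the model chain, can be sketched in a few lines; the chain below is a drastically simplified stand-in with invented residual widths and coefficients, not Sandia's models.

```python
# Monte Carlo propagation of empirical residual distributions through a
# toy two-stage model chain (POA irradiance -> effective irradiance -> power).
import numpy as np

rng = np.random.default_rng(1)
n_samples = 10_000

# In the real analysis these residuals come from comparing each submodel
# to measurements; here they are synthetic placeholders.
poa_residuals = rng.normal(0.0, 8.0, size=500)
eff_residuals = rng.normal(0.0, 5.0, size=500)

poa = 800.0 + rng.choice(poa_residuals, size=n_samples)      # W/m^2, plane-of-array
eff = 0.95 * poa + rng.choice(eff_residuals, size=n_samples)  # effective irradiance
dc_power = 0.2 * eff                                          # stand-in DC power model

rel_unc = dc_power.std() / dc_power.mean()
print(f"relative uncertainty in predicted power: {rel_unc:.3%}")
```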

  9. Semiautomated, Reproducible Batch Processing of Soy

    Science.gov (United States)

    Thoerne, Mary; Byford, Ivan W.; Chastain, Jack W.; Swango, Beverly E.

    2005-01-01

    A computer-controlled apparatus processes batches of soybeans into one or more of a variety of food products, under conditions that can be chosen by the user and reproduced from batch to batch. Examples of products include soy milk, tofu, okara (an insoluble protein and fiber byproduct of soy milk), and whey. Most processing steps take place without intervention by the user. This apparatus was developed for use in research on the processing of soy. It is also a prototype of other soy-processing apparatuses for research, industrial, and home use. Prior soy-processing equipment includes household devices that automatically produce soy milk but do not automatically produce tofu. The designs of prior soy-processing equipment require users to transfer intermediate solid soy products and to press them manually and, hence, under conditions that are not consistent from batch to batch. Prior designs do not afford choices of processing conditions: users cannot use previously developed soy-processing equipment to investigate the effects of variations of techniques used to produce soy milk (e.g., cold grinding, hot grinding, and pre-cook blanching) and of such process parameters as cooking times and temperatures, grinding times, soaking times and temperatures, rinsing conditions, and sizes of particles generated by grinding. In contrast, the present apparatus is amenable to such investigations. The apparatus (see figure) includes a processing tank and a jacketed holding or coagulation tank. The processing tank can be capped by either of two different heads and can contain either of two different insertable mesh baskets. The first head includes a grinding blade and heating elements. The second head includes an automated press piston. One mesh basket, designated the okara basket, has oblong holes with a size equivalent to about 40 mesh [40 openings per inch (about 16 openings per centimeter)]. The second mesh basket, designated the tofu basket, has holes of 70 mesh [70 openings

  10. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for comprehensive and innovative evaluation of climate models, with the synergistic use of global satellite observations, in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and have implemented the methodology as a web-service-based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named the Climate Model Diagnostic Analyzer (CMDA) and is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) a diagnostic analysis methodology; (2) web-service-based, cloud-enabled technology; and (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution functions, conditional sampling, and time-lagged correlation maps. We have implemented the
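
    One of the named diagnostic techniques, random-forest feature importance ranking, can be illustrated generically as below; the climate-like variable names and the synthetic relationship are assumptions for the example only, not CMDA's data or code.

```python
# Generic random-forest feature importance ranking on synthetic
# climate-like variables; the target is driven mostly by the first one.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
names = ["sst", "cloud_fraction", "precip", "wind_speed"]
X = rng.normal(size=(500, 4))
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=500)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
for name, imp in sorted(zip(names, rf.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name:15s} {imp:.3f}")
```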

  11. Transforming Graphical System Models to Graphical Attack Models

    DEFF Research Database (Denmark)

    Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, Rene Rydhof

    2016-01-01

    Manually identifying possible attacks on an organisation is a complex undertaking; many different factors must be considered, and the resulting attack scenarios can be complex and hard to maintain as the organisation changes. System models provide a systematic representation of organisations that helps in structuring attack identification and can integrate physical, virtual, and social components. These models form a solid basis for guiding the manual identification of attack scenarios. Their main benefit, however, is in the analytic generation of attacks. In this work we present a systematic approach to transforming graphical system models to graphical attack models in the form of attack trees. Based on an asset in the model, our transformations result in an attack tree that represents attacks by all possible actors in the model, after which the actor in question has obtained the asset.

  12. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  13. CTBT integrated verification system evaluation model supplement

    International Nuclear Information System (INIS)

    EDENBURN, MICHAEL W.; BUNTING, MARCUS; PAYNE, ARTHUR C. JR.; TROST, LAWRENCE C.

    2000-01-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.

  14. Rapid Discrimination Among Putative Mechanistic Models of Biochemical Systems.

    Science.gov (United States)

    Lomnitz, Jason G; Savageau, Michael A

    2016-08-31

    An overarching goal in molecular biology is to gain an understanding of the mechanistic basis underlying biochemical systems. Success is critical if we are to predict effectively the outcome of drug treatments and the development of abnormal phenotypes. However, data from most experimental studies are typically noisy and sparse. This allows multiple potential mechanisms to account for experimental observations, and often devising experiments to test each is not feasible. Here, we introduce a novel strategy that discriminates among putative models based on their repertoire of qualitatively distinct phenotypes, without relying on knowledge of specific values for rate constants and binding constants. As an illustration, we apply this strategy to two synthetic gene circuits exhibiting anomalous behaviors. Our results show that the conventional models, based on their well-characterized components, cannot account for the experimental observations. We examine a total of 40 alternative hypotheses and show that only 5 have the potential to reproduce the experimental data, and one can do so with biologically relevant parameter values.

  15. A distributed snow-evolution modeling system (SnowModel)

    Science.gov (United States)

    Glen E. Liston; Kelly. Elder

    2006-01-01

    SnowModel is a spatially distributed snow-evolution modeling system designed for application in landscapes, climates, and conditions where snow occurs. It is an aggregation of four submodels: MicroMet defines meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowPack simulates snow depth and water-equivalent evolution, and SnowTran-3D...

  16. Exploring the Role of Information Professionals in Improving Research Reproducibility: A Case Study in Geosciences

    Science.gov (United States)

    Yan, A.; West, J.

    2016-12-01

    The validity of geosciences research is of great significance to the general public and to policy-makers. In an earlier study, we surveyed 136 faculty and graduate students in the geosciences. The results indicated that nearly 80% of respondents who had ever attempted to reproduce a published study had failed at least once, suggesting a general lack of research reproducibility in the geosciences. Although there is much enthusiasm for the creation of technologies such as workflow systems, literate programming, and cloud-based systems to facilitate reproducibility, much less emphasis has been placed on the information services essential for meaningful use of these tools. Library and Information Science (LIS) has a rich tradition of providing customized services for research communities. LIS professionals such as academic librarians have made strong contributions to resource location, software recommendation, data curation, metadata guidance, project management, submission review and author training. In particular, university libraries have been actively developing tools and offering guidelines, consultations, and training on the Data Management Plans (DMPs) required by the National Science Foundation (NSF). Effective data management is a significant first step towards reproducible research. We therefore argue that LIS professionals may be well positioned to assist researchers in making their research reproducible. In this study, we aim to answer the question: how can LIS professionals assist geoscience researchers in making their research capable of being reproduced? We first synthesize different definitions of "reproducibility" and provide a conceptual framework of "reproducibility" in the geosciences to resolve some of the misunderstandings around related terminology. Using a case study approach, we then examine 1) university librarians' technical skills, domain knowledge, professional activities, together with their awareness of, readiness for, and attitudes towards research reproducibility and

  17. Model Updating Nonlinear System Identification Toolbox Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology by adopting the flight data with state-of-the-art...

  18. Power system coherency and model reduction

    CERN Document Server

    Chow, Joe H

    2014-01-01

    This book provides a comprehensive treatment for understanding interarea modes in large power systems and obtaining reduced-order models using the coherency concept and selective modal analysis method.

  19. Regional Ocean Modeling System (ROMS): Samoa

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 7-day, 3-hourly forecast for the region surrounding the islands of Samoa at approximately 3-km resolution. While considerable...

  20. Systemic Therapy: A New Brief Intervention Model.

    Science.gov (United States)

    Searight, H. Russell; Openlander, Patrick

    1984-01-01

    Describes a newly developing mode of problem-oriented brief therapy. The systemic therapy model emphasizes the interactional context of clients' problems and represents an efficient intervention paradigm. (Author/JAC)

  1. Regional Ocean Modeling System (ROMS): CNMI

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 7-day, 3-hourly forecast for the region surrounding the Commonwealth of the Northern Mariana Islands (CNMI) at approximately...

  2. Regional Ocean Modeling System (ROMS): Guam

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Regional Ocean Modeling System (ROMS) 6-day, 3-hourly forecast for the region surrounding Guam at approximately 2-km resolution. While considerable effort has been...

  3. A Telecommunications Industry Primer: A Systems Model.

    Science.gov (United States)

    Obermier, Timothy R.; Tuttle, Ronald H.

    2003-01-01

    Describes the Telecommunications Systems Model to help technical educators and students understand the increasingly complex telecommunications infrastructure. Specifically looks at ownership and regulatory status, service providers, transport medium, network protocols, and end-user services. (JOW)

  4. Modeling Approaches and Systems Related to Structured Modeling.

    Science.gov (United States)

    1987-02-01

    Lasdon and Maturana for surveys of several modern systems. CAMPS (Lucas and Mitra) -- Computer Assisted Mathematical Programming System ... 583-589. MATURANA, S. "Comparative Analysis of Mathematical Modeling Systems," informal note, Graduate School of Management, UCLA, February

  5. Conceptual Modelling of Complex Production Systems

    Directory of Open Access Journals (Sweden)

    Nenad Perši

    2008-12-01

    Full Text Available Complex system dynamics, structure and behaviour performances call for a wide range of methods, algorithms and tools to reach a model capable of finding optimal performing parameters. In the modelling process, it is up to the analyst to select the appropriate combination of methods, algorithms and tools to express significant system performances. Such a methodology for designing complex systems should be based upon conceptual modelling to perform a sensitive analysis of different system levels and views, allowing system representations for developing computer models. Complex systems, such as business systems with a continuous-discrete production process, require a well-organised supply chain that is highly reactive to production assortment changes. Aligning two different production components distinctive in their behaviour is especially delicate at the production parameters transition point. Such system performances require distinctive design methods that can follow the double nature of the production process behaviour in accordance with the entity dynamics caused by assortment changes. Consequently, such systems need different conceptual presentations if their purpose is to be realized from different views and aspects.

  6. Modeling of High Precision Positioning System

    Directory of Open Access Journals (Sweden)

    Giedrius Augustinavičius

    2014-02-01

    Full Text Available This paper presents the modeling of a flexure-based precision positioning system for micro-positioning uses. The positioning system features a monolithic architecture, flexure-based joints and ultra-fine adjustment screws. Its workspace has been evaluated via analytical approaches. The reduction mechanism is optimally designed. The mathematical model of the positioning system has been derived and verified by resorting to finite element analysis (FEA). The established analytical and FEA models are helpful for a reliable architecture optimization and performance improvement of the positioning system.

  7. Ellipsoidal bag model for heavy quark system

    International Nuclear Information System (INIS)

    Bi Pinzhen; Fudan Univ., Shanghai

    1991-01-01

    The ellipsoidal bag model is used to describe heavy quark systems such as Q anti-Q, Q anti-Q g and Q₂ anti-Q₂. Instead of a two-step model, these states are described by a uniform picture. The potential derived from the ellipsoidal bag for Q anti-Q is almost equivalent to the Cornell potential. For a Q₂ anti-Q₂ system with large quark pair separation, an improvement of 70 MeV is obtained compared with the spherical bag. (orig.)

  8. Hybrid Energy System Modeling in Modelica

    Energy Technology Data Exchange (ETDEWEB)

    William R. Binder; Christiaan J. J. Paredis; Humberto E. Garcia

    2014-03-01

    In this paper, a Hybrid Energy System (HES) configuration is modeled in Modelica. Hybrid Energy Systems (HES) have as their defining characteristic the use of one or more energy inputs, combined with the potential for multiple energy outputs. Compared to traditional energy systems, HES provide additional operational flexibility so that high variability in both energy production and consumption levels can be absorbed more effectively. This is particularly important when including renewable energy sources, whose output levels are inherently variable, determined by nature. The specific HES configuration modeled in this paper includes two energy inputs: a nuclear plant and a series of wind turbines. In addition, the system produces two energy outputs: electricity and synthetic fuel. The models are verified through simulations of the individual components and of the system as a whole. The simulations are performed for a range of component sizes, operating conditions, and control schemes.

  9. Model Reduction of Fuzzy Logic Systems

    Directory of Open Access Journals (Sweden)

    Zhandong Yu

    2014-01-01

    Full Text Available This paper deals with the problem of ℒ2-ℒ∞ model reduction for continuous-time nonlinear uncertain systems. The approach of the construction of a reduced-order model is presented for high-order nonlinear uncertain systems described by the T-S fuzzy systems, which not only approximates the original high-order system well with an ℒ2-ℒ∞ error performance level γ but also translates it into a linear lower-dimensional system. Then, the model approximation is converted into a convex optimization problem by using a linearization procedure. Finally, a numerical example is presented to show the effectiveness of the proposed method.

  10. System model development for nuclear thermal propulsion

    International Nuclear Information System (INIS)

    Walton, J.T.; Perkins, K.R.; Buksa, J.J.; Worley, B.A.; Dobranich, D.

    1992-01-01

    A critical enabling technology in the evolutionary development of nuclear thermal propulsion (NTP) is the ability to predict the system performance under a variety of operating conditions. Since October 1991, the US Department of Energy (DOE), Department of Defense (DOD) and NASA have initiated critical technology development efforts for NTP systems to be used on Space Exploration Initiative (SEI) missions to the Moon and Mars. This paper presents the strategy and progress of an interagency NASA/DOE/DOD team for NTP system modeling. It is the intent of the interagency team to develop several levels of computer programs to simulate various NTP systems. The interagency team was formed for this task to use the best capabilities available and to assure appropriate peer review. The vision and strategy of the interagency team for developing NTP system models are discussed in this paper, and a review of the progress on the Level 1 interagency model is also presented.

  11. Economic model of pipeline transportation systems

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.

    1977-07-29

    The objective of the work reported here was to develop a model which could be used to assess the economic effects of energy-conservative technological innovations upon the pipeline industry. The model is a dynamic simulator which accepts inputs of two classes: the physical description (design parameters, fluid properties, and financial structures) of the system to be studied, and the postulated market (throughput and price) projection. The model consists of two independent submodels: the fluidics model, which simulates the physical behavior of the system, and the financial model, which operates upon the output of the fluidics model to calculate the economic outputs. Any of a number of existing fluidics models can be used in addition to the one developed as a part of this study. The financial model, known as the Systems, Science and Software (S³) Financial Projection Model, contains user options whereby pipeline-peculiar characteristics can be removed and/or modified, so that the model can be applied to virtually any kind of business enterprise. The several dozen outputs are of two classes: the energetics and the economics. The energetics outputs of primary interest are the energy intensity, also called unit energy consumption, and the total energy consumed. The primary economics outputs are the long-run average cost, profit, cash flow, and return on investment.

  12. System Identification, Environmental Modelling, and Control System Design

    CERN Document Server

    Garnier, Hugues

    2012-01-01

    System Identification, Environmental Modelling, and Control System Design is dedicated to Professor Peter Young on the occasion of his seventieth birthday. Professor Young has been a pioneer in systems and control, and over the past 45 years he has influenced many developments in this field. This volume comprises a collection of contributions by leading experts in system identification, time-series analysis, environmetric modelling and control system design – modern research in topics that reflect important areas of interest in Professor Young's research career. Recent theoretical developments in and relevant applications of these areas are explored, treating the various subjects broadly and in depth. The authoritative and up-to-date research presented here will be of interest to academic researchers in control and in disciplines related to environmental research, particularly those dealing with water systems. The tutorial style in which many of the contributions are composed also makes the book suitable as ...

  13. Automated statistical modeling of analytical measurement systems

    International Nuclear Information System (INIS)

    Jacobson, J.J.

    1992-01-01

    The statistical modeling of analytical measurement systems at the Idaho Chemical Processing Plant (ICPP) has been completely automated through computer software. The statistical modeling of analytical measurement systems is one part of a complete quality control program used by the Remote Analytical Laboratory (RAL) at the ICPP. The quality control program is an integration of automated data input, measurement system calibration, database management, and statistical process control. The quality control program and statistical modeling program meet the guidelines set forth by the American Society for Testing and Materials and the American National Standards Institute. A statistical model is a set of mathematical equations describing any systematic bias inherent in a measurement system and the precision of a measurement system. A statistical model is developed from data generated by the analysis of control standards. Control standards are samples which are made up at precise known levels by an independent laboratory and submitted to the RAL. The RAL analysts who process control standards do not know the values of those control standards. The objective behind statistical modeling is to describe real process samples in terms of their bias and precision and to verify that a measurement system is operating satisfactorily. The processing of control standards gives us this ability
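
    A minimal sketch of how a statistical model of bias and precision might be derived from control-standard data follows; the certified levels, the measured values, and the simple linear-bias form are invented assumptions for illustration, not the ICPP's actual model.

```python
# Fit a linear bias model (measured vs. known) to control-standard data;
# the residual spread estimates the measurement system's precision.
import numpy as np

known = np.array([1.0, 2.0, 5.0, 10.0, 20.0])        # certified standard levels
measured = np.array([1.05, 2.08, 5.15, 10.4, 20.6])  # lab results for those standards

slope, intercept = np.polyfit(known, measured, 1)
residuals = measured - (slope * known + intercept)
precision = residuals.std(ddof=2)  # ddof=2: two fitted parameters

print(f"bias model: measured = {slope:.3f} * known + {intercept:.3f}")
print(f"precision (residual SD): {precision:.3f}")
```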

  14. Accuracy, reproducibility, and time efficiency of dental measurements using different technologies.

    Science.gov (United States)

    Grünheid, Thorsten; Patel, Nishant; De Felippe, Nanci L; Wey, Andrew; Gaillard, Philippe R; Larson, Brent E

    2014-02-01

    Historically, orthodontists have taken dental measurements on plaster models. Technological advances now allow orthodontists to take these measurements on digital models. In this study, we aimed to assess the accuracy, reproducibility, and time efficiency of dental measurements taken on 3 types of digital models. emodels (GeoDigm, Falcon Heights, Minn), SureSmile models (OraMetrix, Richardson, Tex), and AnatoModels (Anatomage, San Jose, Calif) were made for 30 patients. Mesiodistal tooth-width measurements taken on these digital models were timed and compared with those on the corresponding plaster models, which were used as the gold standard. Accuracy and reproducibility were assessed using the Bland-Altman method. Differences in time efficiency were tested for statistical significance with 1-way analysis of variance. Measurements on SureSmile models were the most accurate, followed by those on emodels and AnatoModels. Measurements taken on SureSmile models were also the most reproducible. Measurements taken on SureSmile models and emodels were significantly faster than those taken on AnatoModels and plaster models. Tooth-width measurements on digital models can be as accurate as, and might be more reproducible and significantly faster than, those taken on plaster models. Of the models studied, the SureSmile models provided the best combination of accuracy, reproducibility, and time efficiency of measurement.
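
    The Bland-Altman method used above reduces to computing the mean difference (bias) between paired measurements and its 95% limits of agreement; the sketch below shows the arithmetic on invented tooth-width pairs, not the study's data.

```python
# Bland-Altman agreement statistics for paired measurements:
# bias (mean difference) and 95% limits of agreement.
import numpy as np

plaster = np.array([7.1, 6.8, 10.2, 9.9, 5.4, 7.6])  # tooth widths, mm
digital = np.array([7.0, 6.9, 10.1, 10.0, 5.5, 7.5])

diff = digital - plaster
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)  # half-width of the limits of agreement

print(f"bias: {bias:.3f} mm")
print(f"95% limits of agreement: [{bias - loa:.3f}, {bias + loa:.3f}] mm")
```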

  15. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer. The test procedure combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin irritant substance or product is included in each test as a positive control … high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated twice, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility.

  16. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  17. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests such as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ-values varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  18. Modelling the Replication Management in Information Systems

    Directory of Open Access Journals (Sweden)

    Cezar TOADER

    2017-01-01

    Full Text Available In the modern economy, the benefits of Web services are significant because they facilitate the automation of activities in the framework of Internet-distributed businesses, as well as cooperation between organizations through interconnection processes running in their computer systems. This paper presents the development stages of a model for a reliable information system. It describes the communication between the processes within the distributed system, based on message exchange, and also presents the problem of distributed agreement among processes. A list of objectives for fault-tolerant systems is defined and a framework model for distributed systems is proposed. This framework distinguishes between management operations and execution operations. The proposed model promotes the use of a central process especially designed for the coordination and control of other application processes. The execution phases and the protocols for the management and the execution components are presented. This model of a reliable system could be a foundation for an entire class of distributed system models based on the management of the replication process.

  19. Generic Sensor Failure Modeling for Cooperative Systems

    Directory of Open Access Journals (Sweden)

    Georg Jäger

    2018-03-01

    Full Text Available The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application's fault tolerance and thereby promises maintainability of such a system's safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data from a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques.

  20. Generic Sensor Failure Modeling for Cooperative Systems

    Science.gov (United States)

    Jäger, Georg; Zug, Sebastian

    2018-01-01

    The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application's fault tolerance and thereby promises maintainability of such a system's safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data from a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques. PMID:29558435

  1. Open System Models of Isotopic Evolution in Earth's Silicate Reservoirs

    Science.gov (United States)

    Kumari, S.; Paul, D.; Stracke, A.

    2016-12-01

    The present-day elemental and isotopic composition of Earth's terrestrial reservoirs can be used as a geochemical constraint to study the evolution of the crust-mantle system. A flexible open-system evolutionary model of the Earth, comprising continental crust (CC), upper depleted mantle (UM; the source of mid-ocean ridge basalts, MORB), and a lower mantle (LM) reservoir with an isolated reservoir that is the source of ocean island basalts (OIB), and incorporating key radioactive isotope systematics (Rb-Sr, Sm-Nd, and U-Th-Pb), is solved numerically at a 1 Ma time step over 4.55 Ga, the age of the Earth. The best possible model-derived solution is the one that reproduces the present-day concentrations as well as the isotopic ratios in the terrestrial reservoirs, constrained from published data. Various crustal growth scenarios (continuous versus episodic and early versus late) and their effects on the evolution of isotope systematics in the silicate reservoirs have been evaluated. Modeling results suggest that a whole mantle that is compositionally similar to the present-day MORB source is not consistent with observational constraints. However, a heterogeneous mantle model, in which the present-day UM is 60% of the total mantle mass and the lower mantle is non-chondritic, reproduces the estimated isotopic ratios and abundances in Earth's silicate reservoirs. Our results show that the mode of crustal growth strongly affects the isotopic evolution of the silicate Earth; only an exponential crustal growth pattern satisfactorily explains the chemical and isotopic evolution of the crust-mantle system. One notable feature of successful models is an early depletion of incompatible elements (and a rapid decrease in the Th/U ratio, κ, in the UM) within the initial 500 Ma, as a result of the early formation of continental crust. Assuming a slightly younger age of the Earth (4.45 Ga), our model better satisfies the Pb-isotope systematics in the respective silicate reservoirs, particularly in the UM, and explains the origin of several OIBs.
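
    To make the bookkeeping of such an open-system box model concrete, here is a toy two-reservoir Rb-Sr sketch stepped at 1 Ma; all inventories, the extraction flux, and the restriction to a single parent-daughter pair are illustrative assumptions, not the parameters of the published model.

```python
# Toy two-reservoir (crust / upper mantle) box model: 87Rb decays to 87Sr
# in each reservoir, and crust extraction preferentially moves incompatible
# Rb out of the mantle; stepped explicitly at 1 Ma for 4.55 Ga.
LAMBDA_RB87 = 1.42e-11  # 87Rb decay constant, 1/yr
dt = 1e6                # 1 Ma step, yr
steps = 4550            # 4.55 Ga

rb_cc, sr_cc = 1.0, 1.0      # arbitrary parent/daughter inventories, crust
rb_um, sr_um = 10.0, 10.0    # arbitrary inventories, upper mantle
extraction = 1e-4            # fraction of mantle Rb moved to crust per step

for _ in range(steps):
    decayed_cc = rb_cc * LAMBDA_RB87 * dt
    decayed_um = rb_um * LAMBDA_RB87 * dt
    rb_cc, sr_cc = rb_cc - decayed_cc, sr_cc + decayed_cc
    rb_um, sr_um = rb_um - decayed_um, sr_um + decayed_um
    moved = rb_um * extraction  # incompatible-element transfer to the crust
    rb_um -= moved
    rb_cc += moved

print(f"final crust Rb/Sr:  {rb_cc / sr_cc:.4f}")
print(f"final mantle Rb/Sr: {rb_um / sr_um:.4f}")
```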

  2. Investigation of the Intra- and Interlaboratory Reproducibility of a Small Scale Standardized Supersaturation and Precipitation Method.

    Science.gov (United States)

    Plum, Jakob; Madsen, Cecilie M; Teleki, Alexandra; Bevernage, Jan; da Costa Mathews, Claudia; Karlsson, Eva M; Carlert, Sara; Holm, Rene; Müller, Thomas; Matthews, Wayne; Sayers, Alice; Ojala, Krista; Tsinsman, Konstantin; Lingamaneni, Ram; Bergström, Christel As; Rades, Thomas; Müllertz, Anette

    2017-12-04

    The high number of poorly water-soluble compounds in drug development has increased the need for enabling formulations to improve oral bioavailability. One frequently applied approach is to induce supersaturation at the absorptive site, e.g., the small intestine, increasing the amount of dissolved compound available for absorption. However, due to the stochastic nature of nucleation, supersaturating drug delivery systems may lead to inter- and intrapersonal variability. The ability to define a feasible range with respect to the supersaturation level is a crucial factor for a successful formulation. Therefore, an in vitro method is needed with which the ability of a compound to supersaturate can be determined in a reproducible way. Hence, this study investigates the reproducibility of an in vitro small-scale standardized supersaturation and precipitation method (SSPM). First, an intralaboratory reproducibility study of felodipine was conducted, after which seven partners contributed data for three model compounds, aprepitant, felodipine, and fenofibrate, to determine the interlaboratory reproducibility of the SSPM. The first part of the SSPM determines the apparent degrees of supersaturation (aDS) to investigate for each compound. Each partner independently determined the maximum possible aDS and induced 100, 87.5, 75, and 50% of their determined maximum possible aDS in the SSPM. The concentration-time profile of the supersaturation and following precipitation was obtained in order to determine the induction time (t_ind) for detectable precipitation. The data showed that the absolute values of t_ind and aDS were not directly comparable between partners; however, upon linearization of the data a reproducible rank ordering of the three model compounds was obtained based on the β-value, which was defined as the slope of the ln(t_ind) versus ln(aDS)^-2 plot. Linear regression of this plot showed that aprepitant had the highest β-value, 15.1, while felodipine and
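
    The linearization described above amounts to an ordinary least-squares fit of ln(t_ind) against ln(aDS)^-2, with the slope read off as the β-value; a minimal sketch with invented induction-time data follows.

```python
# Linearize induction-time data and extract the beta-value as the slope
# of ln(t_ind) versus (ln aDS)^-2; the numbers are synthetic stand-ins.
import numpy as np

aDS = np.array([2.0, 2.5, 3.0, 4.0])            # apparent degrees of supersaturation
t_ind = np.array([3500.0, 900.0, 350.0, 80.0])  # induction times, s

x = np.log(aDS) ** -2
y = np.log(t_ind)
beta, intercept = np.polyfit(x, y, 1)

print(f"beta-value (slope): {beta:.2f}")
```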

  3. Energy model in regional energy system

    International Nuclear Information System (INIS)

    Mura, P.G.; Baccoli, R.; Carlini, U.; Innamorati, R.; Mariotti, S.

    2005-01-01

    This report presents a computational model for the analysis of energy, material and mass fluxes in a complex energy system at the regional scale. Specifically, a calculation model of electric power generation is described for forecasting emissions of CO2, SOx, NOx, particulate matter, ash, limestone and chalk [it]

  4. Immune System Model Calibration by Genetic Algorithm

    NARCIS (Netherlands)

    Presbitero, A.; Krzhizhanovskaya, V.; Mancini, E.; Brands, R.; Sloot, P.

    2016-01-01

    We aim to develop a mathematical model of the human immune system for advanced individualized healthcare where medication plan is fine-tuned to fit a patient's conditions through monitored biochemical processes. One of the challenges is calibrating model parameters to satisfy existing experimental

  5. Analytical system dynamics modeling and simulation

    CERN Document Server

    Fabien, Brian C

    2008-01-01

    This book, offering a modeling technique based on Lagrange's energy method, includes 125 worked examples. Using this technique enables one to model and simulate systems as diverse as a six-link, closed-loop mechanism or a transistor power amplifier.

  6. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under the open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  7. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
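
    As a minimal illustration of GP-based system identification in the spirit of this monograph, the sketch below regresses the next output of an invented first-order system on lagged input/output values with a Gaussian process; the system, the kernel choice, and the data are assumptions for the example only.

```python
# One-step-ahead GP model of an invented first-order dynamic system:
# regress y[k+1] on the regressors (y[k], u[k]).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
u = rng.uniform(-1, 1, size=300)  # input sequence
y = np.zeros(301)
for k in range(300):              # y[k+1] = 0.8 y[k] + 0.3 u[k] + noise
    y[k + 1] = 0.8 * y[k] + 0.3 * u[k] + 0.01 * rng.normal()

X = np.column_stack([y[:-1], u])  # regressors: y[k], u[k]
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y[1:])

mean, std = gp.predict(X[:5], return_std=True)  # predictions with uncertainty
print(mean, std)
```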

  8. Human performance modeling for system of systems analytics: soldier fatigue.

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, Craig R.; Campbell, James E.; Miller, Dwight Peter

    2005-10-01

    The military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives, as can be seen in the Department of Defense's (DoD) Defense Modeling and Simulation Office's (DMSO) Master Plan (DoD 5000.59-P 1995). Toward this goal, the military is currently spending millions of dollars on programs devoted to HPM in various military contexts. Examples include the Human Performance Modeling Integration (HPMI) program within the Air Force Research Laboratory, which focuses on integrating HPMs with constructive models of systems (e.g., cockpit simulations), and the Navy's Human Performance Center (HPC) established in September 2003. Nearly all of these initiatives focus on the interface between humans and a single system. This is insufficient in the era of highly complex network-centric SoS. This report presents research and development in the area of HPM in a system-of-systems (SoS). Specifically, this report addresses modeling soldier fatigue and the potential impacts soldier fatigue can have on SoS performance.

  9. In vivo evaluation of inter-operator reproducibility of digital dental and conventional impression techniques

    Science.gov (United States)

    Kamimura, Emi; Tanaka, Shinpei; Takaba, Masayuki; Tachi, Keita; Baba, Kazuyoshi

    2017-01-01

    Purpose: The aim of this study was to evaluate and compare the inter-operator reproducibility of three-dimensional (3D) images of teeth captured by a digital impression technique with that of a conventional impression technique in vivo. Materials and methods: Twelve participants with complete natural dentition were included in this study. A digital impression of the mandibular molars of each participant was made by two operators with different levels of clinical experience, 3 or 16 years, using an intra-oral scanner (Lava COS, 3M ESPE). A silicone impression was also made by the same operators using the double-mix impression technique (Imprint3, 3M ESPE). Stereolithography (STL) data were directly exported from the Lava COS system, while STL data of a plaster model made from the silicone impression were captured by a three-dimensional (3D) laboratory scanner (D810, 3shape). The STL datasets recorded by the two operators were compared using 3D evaluation software and superimposed using the best-fit-algorithm method (least-squares method, PolyWorks, InnovMetric Software) for each impression technique. Inter-operator reproducibility, evaluated as the average discrepancy between corresponding 3D data, was compared between the two techniques (Wilcoxon signed-rank test). Results: Visual inspection of the superimposed datasets revealed that discrepancies between repeated digital impressions were smaller than those observed with silicone impressions. This was confirmed by statistical analysis, which revealed a significantly smaller average inter-operator discrepancy for the digital impression technique (0.014 ± 0.02 mm) than for the conventional impression technique (0.023 ± 0.01 mm). Conclusion: The results of this in vivo study suggest that the inter-operator reproducibility of a digital impression technique may be better than that of a conventional impression technique and is independent of the clinical experience of the operator. PMID:28636642

  10. Using the Regional Ocean Modelling System (ROMS) to improve the sea surface temperature predictions of the MERCATOR Ocean System

    Directory of Open Access Journals (Sweden)

    Pedro Costa

    2012-09-01

    Full Text Available Global models are generally capable of reproducing the observed trends in the globally averaged sea surface temperature (SST). However, the global models do not perform as well on regional scales. Here, we present an ocean forecast system based on the Regional Ocean Modelling System (ROMS); the boundary conditions come from the MERCATOR ocean system for the North Atlantic (1/6° horizontal resolution). The system covers the region of the northwestern Iberian Peninsula with a horizontal resolution of 1/36°, forced with the Weather Research and Forecasting Model (WRF) and the Soil and Water Assessment Tool (SWAT). The results from the regional ocean model are validated using real-time SST observations from the MeteoGalicia, INTECMAR and Puertos del Estado real-time observational networks. The validation results reveal that over a one-year period the mean absolute error of the SST is less than 1°C, and several sources of measured data reveal that the errors decrease near the coast. This improvement is related to the inclusion of local forcing not present in the boundary-condition model.

  11. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies

  12. Predictive Model of Systemic Toxicity (SOT)

    Science.gov (United States)

    In an effort to ensure chemical safety in light of regulatory advances away from reliance on animal testing, USEPA and L’Oréal have collaborated to develop a quantitative systemic toxicity prediction model. Prediction of human systemic toxicity has proved difficult and remains a ...

  13. CTBT Integrated Verification System Evaluation Model

    Energy Technology Data Exchange (ETDEWEB)

    Edenburn, M.W.; Bunting, M.L.; Payne, A.C. Jr.

    1997-10-01

    Sandia National Laboratories has developed a computer based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994, by Sandia's Monitoring Systems and Technology Center and has been funded by the US Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, top-level, modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection) and location accuracy of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. This report describes version 1.2 of IVSEM.

  14. Modeling a Longitudinal Relational Research Data Systems

    Science.gov (United States)

    Olsen, Michelle D. Hunt

    2010-01-01

    A study was conducted to propose a research-based model for a longitudinal data research system that addressed recommendations from a synthesis of literature related to: (1) needs reported by the U.S. Department of Education, (2) the twelve mandatory elements that define federally approved state longitudinal data systems (SLDS), (3) the…

  15. Economic Models and Algorithms for Distributed Systems

    CERN Document Server

    Neumann, Dirk; Altmann, Jorn; Rana, Omer F

    2009-01-01

    Distributed computing models for sharing resources such as Grids, Peer-to-Peer systems, or voluntary computing are becoming increasingly popular. This book intends to discover fresh avenues of research and amendments to existing technologies, aiming at the successful deployment of commercial distributed systems

  16. Installed water resource modelling systems for catchment ...

    African Journals Online (AJOL)

    Following international trends there are a growing number of modelling systems being installed for integrated water resource management, in Southern Africa. Such systems are likely to be installed for operational use in ongoing learning, research, strategic planning and consensus-building amongst stakeholders in the ...

  17. Knowledge Management System Model for Learning Organisations

    Science.gov (United States)

    Amin, Yousif; Monamad, Roshayu

    2017-01-01

    Based on the literature of knowledge management (KM), this paper reports on the progress of developing a new knowledge management system (KMS) model with a component architecture distributed over the widely recognised socio-technical system (STS) aspects, to guide developers in selecting the most applicable components to support their KM…

  18. Eclectic Model in the Malaysian Education System

    Science.gov (United States)

    Othman, Nooraini; Mohamad, Khairul Azmi; Ilmuwan, Yayasan

    2011-01-01

    The present work aims at analysing the adoption of an eclectic model in the Malaysian education system. The analysis specifically looks at it from the angle of Islam and the Muslims. Malaysia has a long history of education system development, from before to after the independence of the country. From what was initially traditional, modernity later came to…

  19. Towards reproducibility of research by reuse of IT best practices

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Reproducibility of any research gives much higher credibility both to research results and to the researchers. This is true for any kind of research including computer science, where a lot of tools and approaches have been developed to ensure reproducibility. In this talk I will focus on basic and seemingly simple principles, which sometimes look too obvious to follow, but help researchers build beautiful and reliable systems that produce consistent, measurable results. My talk will cover, among other things, the problem of embedding machine learning techniques into analysis strategy. I will also speak about the most common pitfalls in this process and how to avoid them. In addition, I will demonstrate the research environment based on the principles that I will have outlined. About the speaker Andrey Ustyuzhanin (36) is Head of CERN partnership program at Yandex. He is involved in the development of event indexing and event filtering services which Yandex has been providing for the LHCb experiment sinc...

  20. SWAT application in intensive irrigation systems: Model modification, calibration and validation

    Science.gov (United States)

    Dechmi, Farida; Burguete, Javier; Skhiri, Ahmed

    2012-11-01

    The Soil and Water Assessment Tool (SWAT) is a well-established, distributed, eco-hydrologic model. However, using the case study of an intensively irrigated agricultural watershed, it was shown that none of the model versions is able to appropriately reproduce the total streamflow in such a system when the irrigation source is outside the watershed. The objective of this study was to modify the SWAT2005 version to correctly simulate the main hydrological processes. Calibration and validation of crop yield, total streamflow, total suspended sediment (TSS) losses and phosphorus load were performed using field survey information and water quantity and quality data recorded during the years 2008 and 2009 in the Del Reguero irrigated watershed in Spain. The goodness of the calibration and validation results was assessed using five statistical measures, including the Nash-Sutcliffe efficiency (NSE). Results indicated that the average annual crop yield and actual evapotranspiration estimates were quite satisfactory. On a monthly basis, the values of NSE were 0.90 (calibration) and 0.80 (validation), indicating that the modified model could reproduce the observed streamflow accurately. The TSS losses were also satisfactorily estimated (NSE = 0.72 and 0.52 for the calibration and validation steps). The monthly temporal patterns and all the statistical parameters indicated that the modified SWAT-IRRIG model adequately predicted the total phosphorus (TP) loading. Therefore, the model could be used to assess the impacts of different best management practices on non-point phosphorus losses in irrigated systems.
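
    The Nash-Sutcliffe efficiency used above to judge the calibration is a one-line statistic; a small sketch with invented observed/simulated streamflows follows.

```python
# Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2).
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency of simulated vs. observed values."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / \
                 np.sum((observed - observed.mean()) ** 2)

obs = np.array([1.2, 3.4, 2.2, 5.6, 4.1, 2.0])  # synthetic monthly streamflow
sim = np.array([1.0, 3.1, 2.5, 5.2, 4.4, 2.3])
print(f"NSE = {nse(obs, sim):.2f}")
```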

  1. Guest Editorial-Special Collection Topic: Statistical Systems Theory in Cancer Modeling, Diagnosis, and Therapy.

    Science.gov (United States)

    Dougherty, Edward R; Boulesteix, Anne-Laure; Dalton, Lori A; Zhang, Michelle

    2018-01-01

    Cancer is a systems disease involving mutations and altered regulation. This supplement treats cancer research as it pertains to 3 systems issues of an inherently statistical nature: regulatory modeling and information processing, diagnostic classification, and therapeutic intervention and control. Topics of interest include (but are not limited to) multiscale modeling, gene/protein transcriptional regulation, dynamical systems, pharmacokinetic/pharmacodynamic modeling, compensatory regulation, feedback, apoptotic and proliferative control, copy number-expression interaction, integration of different feature types, error estimation, and reproducibility. We are especially interested in how the above issues relate to the extremely high-dimensional data sets and small- to moderate-sized data sets typically involved in cancer research, for instance, their effect on statistical power, inference accuracy, and multiple comparisons.

  3. An Advanced HIL Simulation Battery Model for Battery Management System Testing

    DEFF Research Database (Denmark)

    Barreras, Jorge Varela; Fleischer, Christian; Christensen, Andreas Elkjær

    2016-01-01

    Developers and manufacturers of battery management systems (BMSs) require extensive testing of controller hardware (HW) and software (SW), such as the analog front-end and the performance of generated control code. In comparison with tests conducted on real batteries, tests conducted on a state-of-the-art hardware-in-the-loop (HIL) simulator can be more cost- and time-effective, easier to reproduce, and safer beyond the normal range of operation, especially at early stages in the development process or during fault insertion. In this paper, an HIL simulation battery model is developed for purposes of BMS testing on a commercial HIL simulator. A multicell electrothermal Li-ion battery (LIB) model is integrated in a system-level simulation. Then, the LIB system model is converted to C code and run in real time with the HIL simulator. Finally, in order to demonstrate the capabilities of the setup...
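
    As an illustration of the kind of cell model such an HIL setup integrates (the paper's multicell electrothermal model is considerably richer), here is a minimal first-order Thevenin equivalent-circuit sketch in Python; every parameter value and the linear OCV curve are invented:

        # First-order Thevenin cell: OCV(SoC) in series with R0 and one RC pair,
        # integrated with forward Euler. Discharge current is positive.
        def simulate_cell(current_a, dt=1.0, capacity_ah=2.5,
                          r0=0.015, r1=0.020, c1=1500.0, soc=0.9):
            v_rc = 0.0
            for i in current_a:
                soc -= i * dt / (capacity_ah * 3600)       # coulomb counting
                v_rc += dt * (i / c1 - v_rc / (r1 * c1))   # RC-pair dynamics
                ocv = 3.0 + 1.2 * soc                      # crude linear OCV(SoC)
                yield ocv - i * r0 - v_rc                  # terminal voltage

        for v in simulate_cell([1.0] * 5):
            print(round(v, 4))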

  4. Modelling of functional systems of managerial accounting

    Directory of Open Access Journals (Sweden)

    O.V. Fomina

    2017-12-01

    Full Text Available The modern stage of managerial accounting development takes place under the powerful influence of managerial innovations. The article aims at developing an integration model of budgeting and the system of balanced indicators within managerial accounting, so as to increase the relevance of decisions made by managers at different levels of management. As a result of the study, the author proposes a highly pragmatic integration model of budgeting and the system of balanced indicators in managerial accounting. It is realised through a system for gathering, consolidating, analysing, and interpreting financial and non-financial information, and it increases the relevance of managerial decisions by coordinating and purposefully orienting both the strategic and operative resources of an enterprise. The effective integration of the system components makes it possible to distribute limited resources rationally, taking into account prospective purposes and strategic initiatives, to carry…

  5. Degradation Modelling for Health Monitoring Systems

    International Nuclear Information System (INIS)

    Stetter, R; Witczak, M

    2014-01-01

    Condition monitoring plays an increasingly important role for technical processes in order to improve the reliability, availability, maintenance, and lifetime of equipment. With increasing demands for efficiency and product quality, plus progress in the integration of automatic control systems in high-cost mechatronic and safety-critical processes, the field of health monitoring is gaining interest. A closely related research field is concerned with estimating the remaining useful life. A central question in these fields is the modelling of degradation: degradation is a process of gradual and irreversible accumulation of damage which finally results in a failure of the system. This paper is based on a current research project and explores various degradation modelling techniques. The results are explained on the basis of an industrial product - a system for the generation of health status information for pump systems. The output of this fuzzy-logic-based system is a single number indicating the current health of a pump system.
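
    To make the notion of degradation concrete, a minimal Python sketch of irreversible damage accumulation mapped to a single health number is given below; it is in the spirit of, but not taken from, the paper, and the threshold and increment distribution are invented:

        import random

        # Gradual, irreversible damage accumulation: increments never decrease,
        # and the health index in [0, 1] reaches 0 at the failure threshold.
        random.seed(1)
        damage, failure_threshold = 0.0, 10.0
        for hour in range(0, 5000, 500):
            damage += random.uniform(0.0, 1.0)    # irreversible increment
            health = max(0.0, 1.0 - damage / failure_threshold)
            print(f"t={hour:4d} h  health={health:.2f}")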

  6. World energy projection system: Model documentation

    International Nuclear Information System (INIS)

    1992-06-01

    The World Energy Projection System (WEPS) is an accounting framework that incorporates projections from independently documented models and assumptions about the future energy intensity of economic activity (the ratio of total energy consumption to gross domestic product) and about the rate at which incremental energy requirements are met by hydropower, geothermal, coal, and natural gas, to produce the projections of world energy consumption published annually by the Energy Information Administration (EIA) in the International Energy Outlook (IEO) (Figure 1). Two independently documented models presented in Figure 1, the Oil Market Simulation (OMS) model and the World Integrated Nuclear Evaluation System (WINES), provide the projections of oil and nuclear power consumption published in the IEO. Output from a third independently documented model, the International Coal Trade Model (ICTM), is not published in the IEO but is used in WEPS as a supply check on the projections of world coal consumption produced by WEPS and published in the IEO. A WEPS model of natural gas production documented in this report provides the same type of implicit supply check on the WEPS projections of world natural gas consumption published in the IEO. Two additional models are included in Figure 1, the OPEC Capacity model and the Non-OPEC Oil Production model. These WEPS models provide inputs to the OMS model and are documented in this report.
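
    The accounting relationship at the heart of such a framework - consumption as energy intensity times GDP, with both trending over time - can be sketched in a few lines of Python; all figures and rates below are invented, not EIA numbers:

        # Energy consumption projected as (energy intensity) x (GDP), with
        # intensity declining at an assumed annual rate. Figures are invented.
        gdp, gdp_growth = 30.0, 0.030               # trillion $, 3 %/yr
        intensity, intensity_decline = 8.0, 0.012   # quad Btu per trillion $
        for year in range(2025, 2031):
            print(year, round(intensity * gdp, 1), "quadrillion Btu")
            gdp *= 1 + gdp_growth
            intensity *= 1 - intensity_decline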

  7. OFFl Models: Novel Schema for Dynamical Modeling of Biological Systems.

    Directory of Open Access Journals (Sweden)

    C Brandon Ogbunugafor

    Full Text Available Flow diagrams are a common tool used to help build and interpret models of dynamical systems, often in biological contexts such as consumer-resource models and similar compartmental models. Typically, their usage is intuitive and informal. Here, we present a formalized version of flow diagrams as a kind of weighted directed graph which follows a strict grammar, which translates into a system of ordinary differential equations (ODEs) by a single unambiguous rule, and which has an equivalent representation as a relational database. (We abbreviate this schema of "ODEs and formalized flow diagrams" as OFFL.) Drawing a diagram within this strict grammar encourages a mental discipline on the part of the modeler, in which all dynamical processes of a system are thought of as interactions between dynamical species that draw parcels from one or more source species and deposit them into target species according to a set of transformation rules. From these rules, the net rate of change for each species can be derived. The modeling schema can therefore be understood as both an epistemic and a practical heuristic for modeling, serving both as an organizational framework for the model-building process and as a mechanism for deriving ODEs. All steps of the schema beyond the initial scientific (intuitive, creative) abstraction of natural observations into model variables are algorithmic and easily carried out by a computer, thus enabling the future development of a dedicated software implementation. Such tools would empower the modeler to consider significantly more complex models than practical limitations might otherwise have precluded, since the modeling framework itself manages that complexity on the modeler's behalf. In this report, we describe the chief motivations for OFFL, carefully outline its implementation, and utilize a range of classic examples from ecology and epidemiology to showcase its features.
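
    The translation rule described - every arc draws a parcel from a source species and deposits it into a target species, and net rates of change follow by summing over arcs - can be sketched as follows (an SIR-style example in Python; the paper's own grammar and relational-database representation are richer):

        import numpy as np
        from scipy.integrate import odeint

        # Flow diagram as arcs: (source, target, rate_function). Each arc
        # subtracts its flow from the source and adds it to the target.
        arcs = [
            ("S", "I", lambda y: 0.3 * y["S"] * y["I"]),   # infection
            ("I", "R", lambda y: 0.1 * y["I"]),            # recovery
        ]
        species = ["S", "I", "R"]

        def rhs(state, t):
            y = dict(zip(species, state))
            dy = dict.fromkeys(species, 0.0)
            for src, dst, rate in arcs:
                flow = rate(y)
                dy[src] -= flow
                dy[dst] += flow
            return [dy[s] for s in species]

        traj = odeint(rhs, [0.99, 0.01, 0.0], np.linspace(0, 100, 5))
        print(traj.round(3))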

  8. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, together with model-compiling techniques. This has been applied to build a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques in building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of these techniques were simply borrowed from the literature, but we had to modify or invent others which did not exist. We also discuss two important problems which are often underestimated in the artificial intelligence literature: size and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, whereas model-based approaches in general present serious drawbacks on this point.

  10. Models for large superconducting toroidal magnet systems

    International Nuclear Information System (INIS)

    Arendt, F.; Brechna, H.; Erb, J.; Komarek, P.; Krauth, H.; Maurer, W.

    1976-01-01

    Prior to the design of large GJ toroidal magnet systems it is appropriate to procure small-scale models, which can simulate the pertinent properties of such systems and allow their relevant phenomena to be investigated. The important feature of a model is to show under which circumstances the system performance can be extrapolated to large magnets. Based on parameters such as the maximum magnetic field, the current density, and the maximum tolerable magneto-mechanical stresses, a simple method of designing model magnets is presented. It is shown how pertinent design parameters change when the toroidal dimensions are altered. In addition, some conductor cost estimates are given, based on reactor power output and wall loading.

  11. Advanced AEM by Comprehensive Analysis and Modeling of System Drift

    Science.gov (United States)

    Schiller, Arnulf; Klune, Klaus; Schattauer, Ingrid

    2010-05-01

    The quality of the assessment of risks arising from environmental hazards strongly depends on the spatial and temporal distribution of the data collected in a survey area. Natural hazards generally emerge over wide areas, as is the case with volcanoes or landslides. Conventional surface measurements are restricted to a few lines or locations and often cannot be conducted in difficult terrain, so they only give a spatially and temporally limited data set and therefore limit the reliability of risk analysis. Aero-geophysical measurements potentially provide a valuable tool for completing the data set, as they can be performed over a wide area, even above difficult terrain, within a short time. A most desirable opportunity in the course of such measurements is the ascertainment of the dynamics of such potentially hazardous environmental processes. This necessitates repeated and reproducible measurements. Current HEM systems cannot accomplish this adequately due to their system-immanent drift and, in some cases, poor signal-to-noise ratio. To develop comprehensive concepts for advancing state-of-the-art HEM systems into a valuable tool for data acquisition in risk assessment or hydrological problems, different studies have been undertaken; these form the contents of the presented work, conducted in the course of the project HIRISK (Helicopter-Based Electromagnetic System for Advanced Environmental Risk Assessment - FWF L-354 N10, supported by the Austrian Science Fund). The methodology is based upon two paths: A - comprehensive experimental testing on an existing HEM system serving as an experimental platform; B - the setup of a numerical model which is continuously refined according to the results of the experimental data. The model then serves to simulate the experimental as well as alternative configurations and to analyze them with respect to their drift behavior. Finally, concepts for minimizing the drift are derived and tested. Different test series - stationary on the ground as well…
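
    One elementary ingredient of such drift analysis - estimating a slow trend from repeated readings over a fixed reference and removing it - can be sketched in Python; the data below are invented, and the project's numerical model is far more detailed than this:

        import numpy as np

        # Repeated readings over the same ground reference; fit and remove
        # a linear drift by least squares. Values are purely illustrative.
        t = np.array([0.0, 10.0, 20.0, 30.0, 40.0])              # minutes
        reading = np.array([100.2, 101.1, 102.3, 103.0, 104.1])  # ppm
        slope, intercept = np.polyfit(t, reading, 1)
        corrected = reading - slope * t                          # de-drifted
        print(round(slope, 3), corrected.round(2))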

  12. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES) M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacities increase, DES M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on…
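
    The core mechanism of DES - a simulation clock that jumps between scheduled events held in a future-event list - fits in a few lines of Python; the toy model below (every arrival is served immediately, as with infinite servers) is illustrative and not taken from the book:

        import heapq
        import random

        # Minimal discrete-event loop: pop the earliest event, advance the
        # clock, handle the event, and possibly schedule new events.
        random.seed(0)
        fel, clock, in_system = [], 0.0, 0
        heapq.heappush(fel, (random.expovariate(1.0), "arrival"))
        while fel and clock < 20.0:
            clock, kind = heapq.heappop(fel)
            if kind == "arrival":
                in_system += 1
                heapq.heappush(fel, (clock + random.expovariate(1.0), "arrival"))
                heapq.heappush(fel, (clock + random.expovariate(1.2), "departure"))
            else:
                in_system -= 1
            print(f"t={clock:5.2f}  {kind:9s}  in_system={in_system}")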

  13. Modeling the heart and the circulatory system

    CERN Document Server

    2015-01-01

    The book comprises contributions by some of the most respected scientists in the field of mathematical modeling and numerical simulation of the human cardiocirculatory system. The contributions cover a wide range of topics, from the preprocessing of clinical data to the development of mathematical equations, their numerical solution, and both in-vivo and in-vitro validation. They discuss the flow in the systemic arterial tree and the complex electro-fluid-mechanical coupling in the human heart. Many examples of patient-specific simulations are presented. This book is addressed to all scientists interested in the mathematical modeling and numerical simulation of the human cardiocirculatory system.

  14. Semi-active control of magnetorheological elastomer base isolation system utilising learning-based inverse model

    Science.gov (United States)

    Gu, Xiaoyu; Yu, Yang; Li, Jianchun; Li, Yancheng

    2017-10-01

    Magnetorheological elastomer (MRE) base isolation has attracted considerable attention over the last two decades thanks to its self-adaptability and high-authority controllability in the semi-active control realm. Due to the inherent nonlinearity and hysteresis of the devices, it is challenging to obtain a mathematical model that reasonably describes the inverse dynamics of MRE base isolators and hence to realise control synthesis of the MRE base isolation system. Two aims have been achieved in this paper: i) development of an inverse model for the MRE base isolator based on an optimal general regression neural network (GRNN); ii) numerical and experimental validation of a real-time semi-active controlled MRE base isolation system utilising an LQR controller and the GRNN inverse model. The superiority of the GRNN inverse model lies in its need for fewer input variables, its faster training process, and its prompt calculation response, which make it suitable for online training and real-time control. The control system is integrated with a three-storey shear building model, and the control performance of the MRE base isolation system is compared with the bare building, a passive-on isolation system, and a passive-off isolation system. Testing results show that the proposed GRNN inverse model is able to reproduce the desired control force accurately and that the MRE base isolation system can effectively suppress the structural responses when compared to the passive isolation systems.
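
    At its core, a GRNN prediction is a Gaussian-kernel-weighted average of training targets, which is what makes training fast and evaluation prompt; a minimal Python sketch follows (the training pairs, input layout, and smoothing factor are invented, and the paper additionally optimises the network):

        import numpy as np

        def grnn_predict(x, X_train, y_train, sigma=0.5):
            # General regression neural network (Specht, 1991):
            # kernel-weighted average of training targets.
            d2 = np.sum((X_train - x) ** 2, axis=1)
            w = np.exp(-d2 / (2.0 * sigma ** 2))
            return np.dot(w, y_train) / np.sum(w)

        # Inverse-model flavour: (desired force, displacement) -> coil current.
        X = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.4], [3.0, 0.5]])
        y = np.array([0.0, 0.8, 1.7, 2.9])   # amps; values are invented
        print(round(grnn_predict(np.array([1.5, 0.3]), X, y), 3))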

  15. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    The majority of modern software and hardware systems are reactive systems, where input provided by the user (possibly another system) and the output of the system are exchanged continuously throughout the (possibly) indefinite execution of the system. Natural examples include control systems, mobi…, energy consumption, latency, mean-time to failure, and cost. For systems integrated in mass-market products, the ability to quantify trade-offs between performance and robustness, under given technical and economic constraints, is of strategic importance… by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for reactive systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation…, in terms of a new mathematical basis for systems modeling which can encompass behavioural properties as well as environmental constraints. They continue by pointing out that continuous performance and robustness measures are paramount when dealing with physical resource levels such as clock frequency…

  16. An Integrated Ecological Modeling System for Assessing ...

    Science.gov (United States)

    We demonstrate a novel, spatially explicit assessment of the current condition of aquatic ecosystem services, with a limited sensitivity analysis for the atmospheric contaminant mercury. The Integrated Ecological Modeling System (IEMS) forecasts water quality and quantity, habitat suitability for aquatic biota, fish biomasses, population densities, productivities, and contamination by methylmercury across headwater watersheds. We applied this IEMS to the Coal River Basin (CRB), West Virginia (USA), an 8-digit hydrologic-unit watershed, by simulating a network of 97 stream segments using the SWAT watershed model, a watershed mercury loading model, the WASP water quality model, the PiSCES fish community estimation model, a fish habitat suitability model, the BASS fish community and bioaccumulation model, and an ecoservices post-processor. Model application was facilitated by automated data retrieval and model setup, plus updated model wrappers and interfaces for data transfers between these models from a prior study. This companion study evaluates baseline predictions of the ecoservices provided for 1990-2010 for the population of streams in the CRB and serves as a foundation for future model development. Published in the journal Ecological Modelling. Highlights: • Demonstrate a spatially explicit IEMS for multiple scales. • Design a flexible IEMS for…

  17. Modelling dependable systems using hybrid Bayesian networks

    International Nuclear Information System (INIS)

    Neil, Martin; Tailor, Manesh; Marquez, David; Fenton, Norman; Hearty, Peter

    2008-01-01

    A hybrid Bayesian network (BN) is one that incorporates both discrete and continuous nodes. In our extensive applications of BNs for system dependability assessment, the models are invariably hybrid and the need for efficient and accurate computation is paramount. We apply a new iterative algorithm that efficiently combines dynamic discretisation with robust propagation algorithms on junction tree structures to perform inference in hybrid BNs. We illustrate its use in the field of dependability with two examples of reliability estimation. First we estimate the reliability of a simple single system, and then we implement a hierarchical Bayesian model. In the hierarchical model we compute the reliability of two unknown subsystems from data collected on historically similar subsystems and then input the result into a reliability block model to compute system-level reliability. We conclude that dynamic discretisation can be used as an alternative to analytical or Monte Carlo methods with high precision and can be applied to a wide range of dependability problems.
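
    The hierarchical step described - learning subsystem reliabilities from historically similar data and feeding them into a reliability block - can be caricatured in Python with a conjugate Beta-Binomial update and a series block; note that the paper's dynamic discretisation is different, more general machinery, and all counts below are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        # Beta posteriors for two subsystems from invented (successes, trials)
        # data on historically similar units, with uniform Beta(1, 1) priors.
        post_a = rng.beta(1 + 48, 1 + 2, 10_000)   # subsystem A: 48/50
        post_b = rng.beta(1 + 19, 1 + 1, 10_000)   # subsystem B: 19/20
        system = post_a * post_b                   # series reliability block
        print(round(system.mean(), 3), np.percentile(system, [5, 95]).round(3))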

  18. Stochastic Modelling Of The Repairable System

    Directory of Open Access Journals (Sweden)

    Andrzejczak Karol

    2015-11-01

    Full Text Available All reliability models involving random time factors form stochastic processes. In this paper we recall the definitions of the most common point processes used for the modelling of repairable systems. In particular, the paper presents stochastic processes as models of reliability systems to support maintenance-related decisions. We consider the simplest one-unit system with negligible repair or replacement time: the unit operates and is repaired or replaced at failure, the time required for repair or replacement being negligible. When the repair or replacement is completed, the unit becomes as good as new and resumes operation. The stochastic modelling of recoverable systems constitutes an excellent method of supporting maintenance-related decision-making processes and enables their more rational use.
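
    The one-unit model described - operate, fail, renew instantly "as good as new", repeat - is a renewal process, and quantities such as the expected number of failures over a horizon are easy to estimate by simulation; a minimal Python sketch with invented Weibull lifetime parameters:

        import random

        # Renewal process with negligible repair time: estimate the expected
        # number of failures in (0, T] by Monte Carlo.
        random.seed(2)
        T, runs = 1000.0, 5000              # horizon in hours, replications
        scale, shape = 200.0, 1.5           # invented Weibull lifetime params
        total = 0
        for _ in range(runs):
            t = random.weibullvariate(scale, shape)
            while t <= T:
                total += 1
                t += random.weibullvariate(scale, shape)
        print("mean failures in", T, "h:", total / runs)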

  19. A system model for water management.

    Science.gov (United States)

    Schenk, Colin; Roquier, Bastien; Soutter, Marc; Mermoud, André

    2009-03-01

    Although generally accepted as a necessary step to improve water management and planning, integrated water resources management (IWRM) methodology does not provide a clear definition of what should be integrated. The various water-related issues that IWRM might encompass are well documented in the literature, but they are generally addressed separately. Therefore, water management lacks a holistic, systems-based description, with a special emphasis on the interrelations between issues. This article presents such a system model for water management, including a graphical representation and textual descriptions of the various water issues, their components, and their interactions. This model is seen as an aide-memoire and a generic reference, providing background knowledge helping to elicit actual system definitions, in possible combination with other participatory systems approaches. The applicability of the model is demonstrated through its application to two test case studies.

  20. Intelligent Mechatronic Systems Modeling, Control and Diagnosis

    CERN Document Server

    Merzouki, Rochdi; Pathak, Pushparaj Mani; Ould Bouamama, Belkacem

    2013-01-01

    Acting as a support resource for practitioners and professionals looking to advance their understanding of complex mechatronic systems, Intelligent Mechatronic Systems explains their design and recent developments from first principles to practical applications. Detailed descriptions of the mathematical models of complex mechatronic systems, developed from fundamental physical relationships, are built on to develop innovative solutions, with particular emphasis on physical model-based control strategies. Following a concurrent engineering approach, supported by industrial case studies, and drawing on the practical experience of the authors, Intelligent Mechatronic Systems covers a range of topics and includes: • An explanation of a common graphical tool for integrated design and its uses from modeling and simulation to control synthesis • Introductions to key concepts such as different means of achieving fault tolerance, robust overwhelming control, and force and impedance control • Dedicated chapters ...