WorldWideScience

Sample records for model systematically underestimates

  1. Quantifying Greenland freshwater flux underestimates in climate models

    Science.gov (United States)

    Little, Christopher M.; Piecuch, Christopher G.; Chaudhuri, Ayan H.

    2016-05-01

    Key processes regulating the mass balance of the Greenland Ice Sheet (GIS) are not represented in current-generation climate models. Here, using output from 19 different climate models forced with a high-end business-as-usual emissions pathway, we compare modeled freshwater fluxes (FWF) to a parameterization based on midtropospheric temperature. By the mid-21st century, parameterized GIS FWF is 478 ± 215 km3 yr-1 larger than modeled—over 3 times the 1992-2011 rate of GIS mass loss. By the late 21st century, ensemble mean parameterized GIS FWF anomalies are comparable to FWF anomalies over the northern North Atlantic Ocean, equivalent to approximately 11 cm of global mean sea level rise. The magnitude and spread of these underestimates underscore the need for assessments of the coupled response of the ocean to increased FWF that recognize: (1) the widely varying freshwater budgets of each model and (2) uncertainty in the relationship between GIS FWF and atmospheric temperature.
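The sea-level figures quoted above follow from a simple unit conversion. The sketch below uses a textbook value for the global ocean area (not a number from the paper) and the approximation that 1 km3 of freshwater has a mass of about 1 Gt:

```python
# Rough arithmetic behind the quoted sea-level figures: convert a freshwater
# flux anomaly in km^3/yr into a rate of global mean sea-level (GMSL) rise.
# The ocean area is a standard approximation, not a value from the paper.

OCEAN_AREA_M2 = 3.62e14   # global ocean surface area (~3.62e8 km^2)

def fwf_to_gmsl_rate(fwf_km3_per_yr: float) -> float:
    """Convert a freshwater flux anomaly (km^3/yr) to mm/yr of GMSL rise."""
    volume_m3_per_yr = fwf_km3_per_yr * 1e9         # km^3 -> m^3
    return volume_m3_per_yr / OCEAN_AREA_M2 * 1e3   # m/yr -> mm/yr

# The mid-century underestimate of 478 km^3/yr corresponds to roughly 1.3 mm/yr:
rate = fwf_to_gmsl_rate(478.0)
```

Sustained over decades and combined with the other FWF anomalies described in the abstract, rates of this order are consistent in magnitude with the roughly 11 cm late-century figure quoted above.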

  2. Prevention of a systematic underestimation of antioxidant activity in competition assays. The impact of unspecific reactions of the reactive species.

    Science.gov (United States)

    Beljaars, Christiaan P; Balk, Jiska M; Bast, Aalt; Haenen, Guido R M M

    2010-02-12

    In antioxidant competition assays, an antioxidant (A) and a detector compound (D) compete for a reactive species (R). In the evaluation of these assays, it is tacitly assumed that all of R is captured by either D or A. Due to the - by definition - high reactivity of R, unspecific reactions of R are likely to occur and neglecting these reactions will result in a systematic underestimation of antioxidant activity. It was shown that in the standard hydroxyl radical scavenging assay this was indeed the case; the inaccurate mathematical evaluation resulted in an underestimation of antioxidant activity of 25% in this competition assay. The systematic underestimation of antioxidant activity can be prevented by using an adjusted Stern-Volmer equation that takes into account that only part of R is captured by D or A.
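The bias the abstract describes can be reproduced with a toy calculation. In the sketch below, all pseudo-first-order rate constants are invented, and they are chosen so that the neglected unspecific pathway produces exactly the 25% underestimation quoted above:

```python
# Toy illustration (invented rate constants, not the paper's data) of why
# neglecting unspecific reactions of the reactive species R biases the
# classical Stern-Volmer evaluation of a competition assay.
kD_D = 3.0   # pseudo-first-order capture rate by the detector, k_D * [D]
kA_A = 2.0   # pseudo-first-order capture rate by the antioxidant, k_A * [A]
k_u  = 1.0   # unspecific consumption of R (ignored in the classical analysis)

# Measured signal ratio S0/S follows the adjusted Stern-Volmer relation:
ratio = 1 + kA_A / (kD_D + k_u)

# Classical evaluation back-calculates antioxidant activity assuming k_u = 0:
kA_classical = (ratio - 1) * kD_D

# Adjusted evaluation accounts for the unspecific pathway and recovers kA_A:
kA_adjusted = (ratio - 1) * (kD_D + k_u)

# Relative bias of the classical evaluation: k_u / (kD_D + k_u) = 25% here.
underestimation = 1 - kA_classical / kA_adjusted
```

The size of the bias depends only on how much of R is lost to unspecific reactions relative to capture by the detector, which is why a highly reactive R makes the classical assumption risky.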

  3. Systematic Underestimation of Earthquake Magnitudes from Large Intracontinental Reverse Faults: Historical Ruptures Break Across Segment Boundaries

    Science.gov (United States)

    Rubin, C. M.

    1996-01-01

    Because most large-magnitude earthquakes along reverse faults have such irregular and complicated rupture patterns, reverse-fault segments defined on the basis of geometry alone may not be very useful for estimating sizes of future seismic sources. Most modern large ruptures of historical earthquakes generated by intracontinental reverse faults have involved geometrically complex rupture patterns. Ruptures across surficial discontinuities and complexities such as stepovers and cross-faults are common. Specifically, segment boundaries defined on the basis of discontinuities in surficial fault traces, pronounced changes in the geomorphology along strike, or the intersection of active faults commonly have not proven to be major impediments to rupture. Assuming that the seismic rupture will initiate and terminate at adjacent major geometric irregularities will commonly lead to underestimation of magnitudes of future large earthquakes.

  4. How systematic age underestimation can impede understanding of fish population dynamics: Lessons learned from a Lake Superior cisco stock

    Science.gov (United States)

    Yule, D.L.; Stockwell, J.D.; Black, J.A.; Cullis, K.I.; Cholwek, G.A.; Myers, J.T.

    2008-01-01

    Systematic underestimation of fish age can impede understanding of recruitment variability and adaptive strategies (like longevity) and can bias estimates of survivorship. We suspected that previous estimates of annual survival (S; range = 0.20-0.44) for Lake Superior ciscoes Coregonus artedi developed from scale ages were biased low. To test this hypothesis, we estimated the total instantaneous mortality rate of adult ciscoes from the Thunder Bay, Ontario, stock by use of cohort-based catch curves developed from commercial gill-net catches and otolith-aged fish. Mean S based on otolith ages was greater for adult females (0.80) than for adult males (0.75), but these differences were not significant. Applying the results of a study of agreement between scale and otolith ages, we modeled a scale age for each otolith-aged fish to reconstruct catch curves. Using modeled scale ages, estimates of S (0.42 for females, 0.36 for males) were comparable with those reported in past studies. We conducted a November 2005 acoustic and midwater trawl survey to estimate the abundance of ciscoes when the fish were being harvested for roe. Estimated exploitation rates were 0.085 for females and 0.025 for males, and the instantaneous rates of fishing mortality were 0.089 for females and 0.025 for males. The instantaneous rates of natural mortality were 0.131 and 0.265 for females and males, respectively. Using otolith ages, we found that strong year-classes at large during November 2005 were caught in high numbers as age-1 fish in previous annual bottom trawl surveys, whereas weak or absent year-classes were not. For decades, large-scale fisheries on the Great Lakes were allowed to operate because ciscoes were assumed to be short lived and to have regular recruitment. We postulate that the collapse of these fisheries was linked in part to a misunderstanding of cisco biology driven by scale-ageing error. © Copyright by the American Fisheries Society 2008.
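The rates reported above are internally consistent with the standard fisheries relations Z = -ln(S) and Z = F + M (instantaneous total, fishing, and natural mortality). A quick check, using only numbers quoted in the abstract:

```python
import math

def total_mortality(S: float) -> float:
    """Instantaneous total mortality Z from annual survival S: Z = -ln(S)."""
    return -math.log(S)

# Otolith-based survival estimates from the abstract:
Z_f = total_mortality(0.80)   # females: ~0.223
Z_m = total_mortality(0.75)   # males:   ~0.288

# Subtracting the reported instantaneous fishing mortality (F) recovers the
# reported natural mortality (M = Z - F) to within rounding:
M_f = Z_f - 0.089             # ~0.134 vs reported 0.131
M_m = Z_m - 0.025             # ~0.263 vs reported 0.265
```

This kind of consistency check is useful precisely because ageing error propagates into S, and from there into every partition of Z.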

  5. Evidence for link between modelled trends in Antarctic sea ice and underestimated westerly wind changes

    Science.gov (United States)

    Purich, Ariaan; Cai, Wenju; England, Matthew H.; Cowan, Tim

    2016-02-01

    Despite global warming, total Antarctic sea ice coverage increased over 1979-2013. However, the majority of Coupled Model Intercomparison Project phase 5 models simulate a decline. Mechanisms causing this discrepancy have so far remained elusive. Here we show that weaker trends in the intensification of the Southern Hemisphere westerly wind jet simulated by the models may contribute to this disparity. During austral summer, a strengthened jet leads to increased upwelling of cooler subsurface water and strengthened equatorward transport, conducive to increased sea ice. As the majority of models underestimate summer jet trends, this cooling process is underestimated compared with observations and is insufficient to offset warming in the models. Through the sea ice-albedo feedback, models produce a high-latitude surface ocean warming and sea ice decline, contrasting the observed net cooling and sea ice increase. A realistic simulation of observed wind changes may be crucial for reproducing the recent observed sea ice increase.

  6. Underestimation of boreal soil carbon stocks by mathematical soil carbon models linked to soil nutrient status

    Science.gov (United States)

    Ťupek, Boris; Ortiz, Carina A.; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-08-01

    Inaccurate estimation of the largest terrestrial carbon pool, the soil organic carbon (SOC) stock, is the major source of uncertainty in simulating the feedback of climate warming on ecosystem-atmosphere carbon dioxide exchange by process-based ecosystem and soil carbon models. Although the models need to simplify complex environmental processes of soil carbon sequestration, in a large mosaic of environments a missing key driver could lead to a modeling bias in predictions of SOC stock change. We aimed to evaluate SOC stock estimates of process-based models (Yasso07, Q, and CENTURY soil sub-model v4) against a massive Swedish forest soil inventory data set (3230 samples) organized by a recursive partitioning method into distinct soil groups with underlying SOC stock development linked to physicochemical conditions. For two-thirds of measurements all models predicted accurate SOC stock levels regardless of the detail of input data, e.g., whether they ignored or included soil properties. However, in fertile sites with high N deposition, high cation exchange capacity, or moderately increased soil water content, the Yasso07 and Q models underestimated SOC stocks. In comparison to Yasso07 and Q, accounting for site-specific soil characteristics (e.g., clay content and topsoil mineral N) by CENTURY improved SOC stock estimates for sites with high clay content, but not for sites with high N deposition. Our analysis suggested that the soils with poorly predicted SOC stocks, as characterized by high nutrient status and well-sorted parent material, indeed have had other predominant drivers of SOC stabilization lacking in the models, presumably mycorrhizal organic uptake and organo-mineral stabilization processes. Our results imply that the role of soil nutrient status as a regulator of organic matter mineralization has to be re-evaluated, since correct SOC stocks are decisive for predicting future SOC change and soil CO2 efflux.

  7. Terrestrial biosphere models underestimate photosynthetic capacity and CO2 assimilation in the Arctic.

    Science.gov (United States)

    Rogers, Alistair; Serbin, Shawn P; Ely, Kim S; Sloan, Victoria L; Wullschleger, Stan D

    2017-09-06

    Terrestrial biosphere models (TBMs) are highly sensitive to model representation of photosynthesis, in particular the parameters maximum carboxylation rate and maximum electron transport rate at 25°C (Vc,max.25 and Jmax.25, respectively). Many TBMs do not include representation of Arctic plants, and those that do rely on understanding and parameterization from temperate species. We measured photosynthetic CO2 response curves and leaf nitrogen (N) content in species representing the dominant vascular plant functional types found on the coastal tundra near Barrow, Alaska. The activation energies associated with the temperature response functions of Vc,max and Jmax were 17% lower than commonly used values. When scaled to 25°C, Vc,max.25 and Jmax.25 were two- to five-fold higher than the values used to parameterize current TBMs. This high photosynthetic capacity was attributable to a high leaf N content and the high fraction of N invested in Rubisco. Leaf-level modeling demonstrated that current parameterization of TBMs resulted in a two-fold underestimation of the capacity for leaf-level CO2 assimilation in Arctic vegetation. This study highlights the poor representation of Arctic photosynthesis in TBMs, and provides the critical data necessary to improve our ability to project the response of the Arctic to global environmental change. No claim to original US Government works. © 2017 New Phytologist Trust.
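The role of the activation energy can be illustrated with the standard Arrhenius scaling used to normalize photosynthetic parameters to 25°C. The measured value and activation energies below are illustrative stand-ins, not the paper's data; the point is simply that a 17% change in Ea materially changes the value inferred at 25°C from a cold-leaf measurement:

```python
import math

R = 8.314  # universal gas constant, J mol^-1 K^-1

def scale_to_25C(v_meas: float, t_leaf_C: float, Ea: float) -> float:
    """Arrhenius-scale a rate measured at leaf temperature up to 25 °C:
    v25 = v(T) / exp(Ea/R * (1/298.15 - 1/T))."""
    T = t_leaf_C + 273.15
    return v_meas / math.exp(Ea / R * (1 / 298.15 - 1 / T))

# Hypothetical Vc,max measured on a cold Arctic leaf at 10 °C:
v_meas = 20.0                 # umol m^-2 s^-1 (invented)
Ea_common = 65_000.0          # a commonly used activation energy, J/mol (invented)
Ea_arctic = 0.83 * Ea_common  # 17% lower, as reported for Arctic species

v25_common = scale_to_25C(v_meas, 10.0, Ea_common)  # larger upward correction
v25_arctic = scale_to_25C(v_meas, 10.0, Ea_arctic)  # smaller upward correction
```

Because field measurements in the Arctic are made on cold leaves, the choice of Ea directly controls the Vc,max.25 a modeler infers, which is why both the activation energies and the 25°C values matter for TBM parameterization.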

  8. Sampling in Atypical Endometrial Hyperplasia: Which Method Results in the Lowest Underestimation of Endometrial Cancer? A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Bourdel, Nicolas; Chauvet, Pauline; Tognazza, Enrica; Pereira, Bruno; Botchorishvili, Revaz; Canis, Michel

    2016-01-01

    Our objective was to identify the most accurate method of endometrial sampling for the diagnosis of complex atypical hyperplasia (CAH), and the related risk of underestimation of endometrial cancer. We conducted a systematic literature search in PubMed and EMBASE (January 1999-September 2013) to identify all registered articles on this subject. Studies were selected with a 2-step method. First, titles and abstracts were analyzed by 2 reviewers, and 69 relevant articles were selected for full reading. Then, the full articles were evaluated to determine whether full inclusion criteria were met. We selected 27 studies, taking into consideration the comparison between histology of endometrial hyperplasia obtained by diagnostic tests of interest (uterine curettage, hysteroscopically guided biopsy, or hysteroscopic endometrial resection) and subsequent results of hysterectomy. Analysis of the studies reviewed focused on 1106 patients with a preoperative diagnosis of atypical endometrial hyperplasia. The mean risk of finding endometrial cancer at hysterectomy after atypical endometrial hyperplasia diagnosed by uterine curettage was 32.7% (95% confidence interval [CI], 26.2-39.9), with a risk of 45.3% (95% CI, 32.8-58.5) after hysteroscopically guided biopsy and 5.8% (95% CI, 0.8-31.7) after hysteroscopic resection. In total, the risk of underestimation of endometrial cancer reaches a very high rate in patients with CAH using the classic method of evaluation (i.e., uterine curettage or hysteroscopically guided biopsy). This rate of underdiagnosed endometrial cancer leads to the risk of inappropriate surgical procedures (31.7% of tubal conservation in the data available and no abdominal exploration in 24.6% of the cases). Hysteroscopic resection seems to reduce the risk of underdiagnosed endometrial cancer.

  9. Underestimation of Project Costs

    Science.gov (United States)

    Jones, Harry W.

    2015-01-01

    Large projects almost always exceed their budgets. Estimating cost is difficult and estimated costs are usually too low. Three different reasons are suggested: bad luck, overoptimism, and deliberate underestimation. Project management can usually point to project difficulty and complexity, technical uncertainty, stakeholder conflicts, scope changes, unforeseen events, and other not really unpredictable bad luck. Project planning is usually over-optimistic, so the likelihood and impact of bad luck is systematically underestimated. Project plans reflect optimism and hope for success in a supposedly unique new effort rather than rational expectations based on historical data. Past project problems are claimed to be irrelevant because "This time it's different." Some bad luck is inevitable and reasonable optimism is understandable, but deliberate deception must be condemned. In a competitive environment, project planners and advocates often deliberately underestimate costs to help gain project approval and funding. Project benefits, cost savings, and probability of success are exaggerated and key risks ignored. Project advocates have incentives to distort information and conceal difficulties from project approvers. One naively suggested cure is more openness, honesty, and group adherence to shared overall goals. A more realistic alternative is threatening overrun projects with cancellation. Neither approach seems to solve the problem. A better method to avoid the delusions of over-optimism and the deceptions of biased advocacy is to base the project cost estimate on the actual costs of a large group of similar projects. Over-optimism and deception can continue beyond the planning phase and into project execution. Hard milestones based on verified tests and demonstrations can provide a reality check.
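The recommended remedy, basing the estimate on the actual costs of a large group of similar projects, is essentially reference class forecasting. A minimal sketch, with invented historical overrun ratios:

```python
# Sketch of a reference-class uplift: rather than trusting a bottom-up
# estimate, inflate it by an empirical overrun quantile drawn from the actual
# outcomes of comparable past projects. The ratios below are invented.

def reference_class_estimate(base_estimate: float,
                             historical_overrun_ratios: list[float],
                             acceptable_overrun_risk: float = 0.2) -> float:
    """Uplift a cost estimate so the chance of exceeding it is at most
    `acceptable_overrun_risk`, judged against historical actual/estimated
    cost ratios for similar projects."""
    ratios = sorted(historical_overrun_ratios)
    # index of the (1 - risk) empirical quantile
    idx = min(len(ratios) - 1,
              int((1 - acceptable_overrun_risk) * len(ratios)))
    return base_estimate * ratios[idx]

# Invented ratios (actual cost / estimated cost) for ten similar projects:
history = [0.9, 1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.7, 2.0, 2.4]

# Accepting a 20% overrun risk on a 100-unit estimate yields a budget of 200:
budget = reference_class_estimate(100.0, history, acceptable_overrun_risk=0.2)
```

The design choice is that the uplift comes entirely from outcome data, so neither the planner's optimism nor an advocate's incentives can lower it.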

  10. The gastrointestinal tract: an underestimated organ as demonstrated in an experimental LVAD pig model.

    Science.gov (United States)

    Miyama, M; Dihmis, W C; Deleuze, P H; Uozaki, Y; Bambang, S L; Pasteau, F; Rostaqui, N; Loisance, D Y

    1996-03-01

    Although hemodynamic stability and renal function are important and are monitored closely in patients with implanted left ventricular assist devices (LVAD), the gastrointestinal tract may be underestimated in the early postoperative period with regard to adequate perfusion. We investigated renal, intestinal, and whole body metabolic changes in response to variations in LVAD flow and inspired oxygen concentration (FiO2). Left ventricular assist devices were implanted in 10 adult pigs (weight, 55 +/- 1.76 kg). Renal vein (RV), superior mesenteric vein (SMV), and pulmonary artery (PA) blood oxygen saturation and lactate concentration were measured and used as tissue perfusion markers. These measurements were made at baseline and after changes in LVAD flow or FiO2. Oxygen saturation in the PA, SMV, and RV decreased significantly after a reduction in LVAD flow (p < 0.05), with a greater reduction in the SMV than in the PA and RV (p < 0.05 at LVAD flow 3.5 L/min; p < 0.01 at LVAD flow 2.0 and 1.0 L/min). The lactate concentration in the PA and SMV increased significantly (p < 0.01) with decreased flow, with a greater increase in the SMV than in the PA (p < 0.05), whereas it remained unchanged in the RV. Oxygen saturation in the PA, SMV, and RV decreased significantly after a reduction in FiO2 (p < 0.05). Lactate concentration in the PA, SMV, and RV increased significantly at FiO2 of 0.10 (p < 0.05). Lactate concentration in the PA and SMV was significantly higher than that in the RV at FiO2 of 0.10 (p < 0.01). The results show that the gastrointestinal tract is at high risk during low perfusion or low FiO2, whereas the kidneys' metabolic function appears to be less disturbed. In clinical practice, this emphasizes the need to ensure adequate blood flow and respiratory function, especially after extubation, in patients with implanted LVAD. This might avoid intestinal ischemia and subsequent endotoxemia. Gastrointestinal tonometry may help in the assessment of intestinal

  11. Models are likely to underestimate increase in heavy rainfall in the extratropical regions with high rainfall intensity

    Science.gov (United States)

    Borodina, Aleksandra; Fischer, Erich M.; Knutti, Reto

    2017-07-01

    Model projections of regional changes in heavy rainfall are uncertain. On timescales of a few decades, internal variability plays an important role and therefore poses a challenge to detecting a robust model response of heavy rainfall to rising temperatures. We use spatial aggregation to reduce the major role of internal variability and evaluate the heavy rainfall response to warming temperatures against observations. We show that in the regions with high rainfall intensity and for which gridded observations exist, most of the models underestimate the historical scaling of heavy rainfall and the land fraction with significant positive heavy rainfall scalings during the historical period. The historical behavior is correlated with the projected heavy rainfall intensification across models, allowing us to apply an observational constraint, i.e., to calibrate multimodel ensembles with observations in order to narrow the range of projections. The constraint suggests a substantially stronger intensification of future heavy rainfall than the multimodel mean.
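The observational-constraint step can be sketched as an ordinary least-squares regression across the model ensemble, evaluated at the observed value. All numbers below are invented for illustration:

```python
import statistics

# One value per climate model: an observable historical quantity and the
# corresponding projected change. Values are invented for illustration.
historical_scaling = [2.0, 3.0, 4.0, 5.0, 6.0]   # historical heavy-rainfall scaling
projected_change   = [4.0, 5.5, 7.0, 8.5, 10.0]  # projected intensification

mx = statistics.mean(historical_scaling)
my = statistics.mean(projected_change)

# Ordinary least-squares fit of projection against historical behavior:
num = sum((x - mx) * (y - my)
          for x, y in zip(historical_scaling, projected_change))
den = sum((x - mx) ** 2 for x in historical_scaling)
slope = num / den
intercept = my - slope * mx

observed_scaling = 5.5  # hypothetical observed value, above most models

constrained = intercept + slope * observed_scaling  # constrained projection
unconstrained = my                                  # raw multimodel mean
```

Because most models sit below the observed value, the constrained projection exceeds the raw multimodel mean, which is exactly the qualitative result the abstract reports.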

  12. The neglect of cliff instability can underestimate warming period melting in Antarctic ice sheet models

    CERN Document Server

    Ruckert, Kelsey L; Pollard, Dave; Guan, Yawen; Wong, Tony E; Forest, Chris E; Keller, Klaus

    2016-01-01

    The response of the Antarctic ice sheet (AIS) to changing climate forcings is an important driver of sea-level changes. Anthropogenic climate changes may drive a sizeable AIS tipping point response with subsequent increases in coastal flooding risks. Many studies analyzing flood risks use simple models to project the future responses of the AIS and its sea-level contributions. These analyses have provided important new insights, but they are often silent on the effects of potentially important processes such as Marine Ice Sheet Instability (MISI) or Marine Ice Cliff Instability (MICI). These approximations can be well justified and result in more parsimonious and transparent model structures. This raises the question of how this approximation impacts hindcasts and projections. Here, we calibrate a previously published AIS model, which neglects the effects of MICI, using a combination of observational constraints and a Bayesian inversion method. Specifically, we approximate the effects of missing MICI by comparing ou...

  13. Underestimation of nuclear fuel burnup – theory, demonstration and solution in numerical models

    Directory of Open Access Journals (Sweden)

    Gajda Paweł

    2016-01-01

    Monte Carlo methodology provides the reference statistical solution of neutron transport criticality problems in nuclear systems. Estimated reaction rates can be applied as an input to the Bateman equations that govern the isotopic evolution of reactor materials. Because the statistical solution of the Boltzmann equation is computationally expensive, it is in practice applied to time steps of limited length. In this paper we show that a simple staircase step model leads to underprediction of numerical fuel burnup (Fissions per Initial Metal Atom, FIMA). Theoretical considerations indicate that this error is inversely proportional to the length of the time step and originates from the variation of heating per source neutron. The bias can be diminished by applying a predictor-corrector step model. A set of burnup simulations with various step lengths and coupling schemes has been performed. The SERPENT code version 1.17 has been applied to the model of a typical fuel assembly from a pressurized water reactor. In the reference case, FIMA reaches 6.24%, equivalent to about 60 GWD/tHM of industrial burnup. Discrepancies up to 1% have been observed depending on the time step model, and theoretical predictions are consistent with numerical results. The conclusions presented in this paper are important for research and development concerning the nuclear fuel cycle, including in the context of Gen4 systems.
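The staircase-versus-predictor-corrector effect can be demonstrated on a toy constant-power depletion problem with a single fissile nuclide. This is a deliberate simplification (the paper's SERPENT assembly model is far more detailed), but it shows the same sign of bias:

```python
import math

# Toy constant-power depletion: as the fissile nuclide burns away, the flux
# must rise to keep power constant, so the exact solution depletes N linearly.
# A "staircase" scheme freezes the flux at its beginning-of-step value and
# therefore underpredicts burnup; predictor-corrector narrows the gap.

N0 = 1.0      # initial fissile atom density (normalized)
power = 0.3   # fixed fission rate per unit time (normalized units)

def deplete(N: float, flux: float, dt: float) -> float:
    """Deplete with a frozen flux: dN/dt = -flux * N."""
    return N * math.exp(-flux * dt)

# Exact constant-power solution after one step of unit length:
N_exact = N0 - power

# Staircase: flux evaluated once at beginning of step, then held constant.
flux0 = power / N0
N_stair = deplete(N0, flux0, 1.0)

# Predictor-corrector: predictor uses flux0; the flux is re-evaluated at the
# predicted end-of-step composition; the corrector averages both solutions.
flux1 = power / N_stair
N_pc = 0.5 * (N_stair + deplete(N0, flux1, 1.0))

def fima(N: float) -> float:
    """Fissions per initial metal atom for this one-nuclide toy model."""
    return (N0 - N) / N0
```

Here `fima(N_stair) < fima(N_pc) < fima(N_exact)`: the staircase scheme systematically underpredicts burnup, and the predictor-corrector recovers most of the deficit, mirroring the paper's conclusion.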

  14. Underestimation of soil carbon stocks by Yasso07, Q, and CENTURY models in boreal forest linked to overlooking site fertility

    Science.gov (United States)

    Ťupek, Boris; Ortiz, Carina; Hashimoto, Shoji; Stendahl, Johan; Dahlgren, Jonas; Karltun, Erik; Lehtonen, Aleksi

    2016-04-01

    The soil organic carbon (SOC) stock changes estimated by most process-based soil carbon models (e.g., Yasso07, Q, and CENTURY), needed for reporting changes in soil carbon amounts to the United Nations Framework Convention on Climate Change (UNFCCC) and for mitigation of anthropogenic CO2 emissions by soil carbon management, can be biased if, in a large mosaic of environments, the models are missing a key factor driving SOC sequestration. To our knowledge, soil nutrient status as a missing driver of these models was not tested in previous studies, although it is known that models fail to reconstruct the spatial variation of SOC and that soil nutrient status drives ecosystem carbon use efficiency and soil carbon sequestration. We evaluated SOC stock estimates of the Yasso07, Q, and CENTURY process-based models against field data from the Swedish Forest Soil National Inventories (3230 samples), organized by a recursive partitioning method (RPART) into distinct soil groups with underlying SOC stock development linked to physicochemical conditions. These models worked for most soils with approximately average SOC stocks, but could not reproduce the higher measured SOC stocks in our application. The Yasso07 and Q models, which used only climate and litterfall input data and ignored soil properties, generally agreed with two thirds of the measurements. However, in comparison with measurements grouped along the gradient of soil nutrient status, we found that the models underestimated SOC stocks for Swedish boreal forest soils with higher site fertility. Accounting for soil texture (clay, silt, and sand content) and structure (bulk density) in the CENTURY model showed no improvement in carbon stock estimates, as CENTURY deviated in a similar manner. We highlight the mechanisms by which the models deviate from the measurements and ways of considering soil nutrient status in further model development. Our analysis suggested that the models indeed lack other predominant drivers of SOC stabilization.

  15. Systematic Multiscale Modeling of Polymers

    Science.gov (United States)

    Faller, Roland; Huang, David; Bayramoglu, Beste; Moule, Adam

    2011-03-01

    The systematic coarse-graining of heterogeneous soft matter systems is an area of current research. We show how the Iterative Boltzmann Inversion systematically develops models for polymers in different environments. We present the scheme and a few applications. We study polystyrene in various environments and compare the different models from the melt, the solution and polymer brushes to validate accuracy and efficiency. We then apply the technique to a complex system needed as the active layer in polymer-based solar cells. Nano-scale morphological information is difficult to obtain experimentally. On the other hand, atomistic computer simulations are only feasible for studying systems not much larger than an exciton diffusion length. Thus, we develop a coarse-grained (CG) simulation model, in which collections of atoms from an atomistic model are mapped onto a smaller number of ``superatoms.'' We study mixtures of poly(3-hexylthiophene) and C60. By comparing the results of atomistic and CG simulations, we demonstrate that the model, parametrized at one temperature and two mixture compositions, accurately reproduces the system structure at other points of the phase diagram. We use the CG model to characterize the microstructure as a function of polymer:fullerene mole fraction and polymer chain length for systems approaching the scale of photovoltaic devices.
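The Iterative Boltzmann Inversion update at the heart of this scheme is compact enough to sketch directly. The snippet applies one bin-wise update of the pair potential; in a real application each iteration requires a full molecular dynamics run of the coarse-grained model to obtain its radial distribution function g(r):

```python
import math

# Core IBI update: refine a tabulated pair potential until the coarse-grained
# model reproduces a target radial distribution function g(r).
# V_{n+1}(r) = V_n(r) + kT * ln( g_n(r) / g_target(r) ), bin by bin.

kB_T = 1.0  # energies in units of k_B * T

def ibi_update(V: list[float],
               g_cg: list[float],
               g_target: list[float]) -> list[float]:
    """One Iterative Boltzmann Inversion step on a tabulated potential."""
    return [v + kB_T * math.log(gc / gt)
            for v, gc, gt in zip(V, g_cg, g_target)]

# Where the CG model is too structured (g_cg > g_target), the potential is
# raised (made more repulsive) at that distance, and vice versa:
V_new = ibi_update([0.0, 0.0], g_cg=[1.2, 0.8], g_target=[1.0, 1.0])
```

The fixed point of this iteration is a potential whose equilibrium structure matches the target g(r), which is why the method is well suited to transferring structural information from atomistic to coarse-grained models.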

  16. Cost Underestimation in Public Works Projects

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren L.

    This article presents results from the first statistically significant study of cost escalation in transportation infrastructure projects. Based on a sample of 258 transportation infrastructure projects worth $90 billion (U.S.), it is found with overwhelming statistical significance that the cost estimates used to decide whether important infrastructure should be built are highly and systematically misleading. The result is continuous cost escalation of billions of dollars. The sample used in the study is the largest of its kind, allowing for the first time statistically valid conclusions regarding questions of cost underestimation and escalation for different project types, different geographical regions, and different historical periods. Four kinds of explanation of cost underestimation are examined: technical, economic, psychological, and political. Underestimation cannot be explained by error...

  17. Underestimating Costs in Public Works Projects

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren L.

    2002-01-01

    This article presents results from the first statistically significant study of cost escalation in transportation infrastructure projects. Based on a sample of 258 transportation infrastructure projects worth $90 billion (U.S.), it is found with overwhelming statistical significance that the cost estimates used to decide whether important infrastructure should be built are highly and systematically misleading. The result is continuous cost escalation of billions of dollars. The sample used in the study is the largest of its kind, allowing for the first time statistically valid conclusions regarding questions of cost underestimation and escalation for different project types, different geographical regions, and different historical periods. Four kinds of explanation of cost underestimation are examined: technical, economic, psychological, and political. Underestimation cannot be explained by error...

  18. Testing Model Atmospheres for Young Very-low-mass Stars and Brown Dwarfs in the Infrared: Evidence for Significantly Underestimated Dust Opacities

    Science.gov (United States)

    Tottle, Jonathan; Mohanty, Subhanjoy

    2015-05-01

    We test state-of-the-art model atmospheres for young very-low-mass stars and brown dwarfs in the infrared, by comparing the predicted synthetic photometry over 1.2-24 μm to the observed photometry of M-type spectral templates in star-forming regions. We find that (1) in both early and late young M types, the model atmospheres imply effective temperatures (Teff) several hundred Kelvin lower than predicted by the standard pre-main sequence (PMS) spectral type-Teff conversion scale (based on theoretical evolutionary models). It is only in the mid-M types that the two temperature estimates agree. (2) The Teff discrepancy in the early M types (corresponding to stellar masses ≳ 0.4 M⊙ at ages of a few Myr) probably arises from remaining uncertainties in the treatment of atmospheric convection within the atmospheric models, whereas in the late M types it is likely due to an underestimation of dust opacity. (3) The empirical and model-atmosphere J-band bolometric corrections are both roughly flat, and similar to each other, over the M-type Teff range. Thus the model atmospheres yield reasonably accurate bolometric luminosities (Lbol), but lead to underestimations of mass and age relative to evolutionary expectations (especially in the late M types) due to lower Teff. We demonstrate this for a large sample of young Cha I and Taurus sources. (4) The trends in the atmospheric model J-Ks colors, and their deviations from the data, are similar at PMS and main sequence ages, suggesting that the model dust opacity errors we postulate here for young ages also apply at field ages.

  19. First results from the Goddard High-Resolution Spectrograph - Evidence for photospheric microturbulence in early O stars - Are surface gravities systematically underestimated?

    Science.gov (United States)

    Hubeny, I.; Heap, S. R.; Altner, B.

    1991-01-01

    GHRS spectra of two very hot stars provide evidence for the presence of microturbulence in their photospheres. In attempting to reproduce the observed spectra, theoretical models have been built in which the microturbulence is allowed to modify not only the Doppler line widths (classical 'spectroscopic' microturbulence), but also the turbulent pressure (thus mimicking a 'physical' turbulence). It is found that a corresponding modification of the temperature-pressure stratification influences the hydrogen and helium line profiles to the extent that the surface gravities of early O stars determined without considering microturbulence are too low by 0.1-0.15 dex. Thus, including microturbulence would reduce, or resolve completely, a long-standing discrepancy between evolutionary and spectroscopic stellar masses.

  20. Development and evaluation of a prediction model for underestimated invasive breast cancer in women with ductal carcinoma in situ at stereotactic large core needle biopsy.

    Directory of Open Access Journals (Sweden)

    Suzanne C E Diepstraten

    BACKGROUND: We aimed to develop a multivariable model for prediction of underestimated invasiveness in women with ductal carcinoma in situ at stereotactic large core needle biopsy, which can be used to select patients for sentinel node biopsy at primary surgery. METHODS: From the literature, we selected potential preoperative predictors of underestimated invasive breast cancer. Data of patients with nonpalpable breast lesions who were diagnosed with ductal carcinoma in situ at stereotactic large core needle biopsy, drawn from the prospective COBRA (Core Biopsy after RAdiological localization) and COBRA2000 cohort studies, were used to fit the multivariable model and assess its overall performance, discrimination, and calibration. RESULTS: 348 women with large core needle biopsy-proven ductal carcinoma in situ were available for analysis. In 100 (28.7%) patients invasive carcinoma was found at subsequent surgery. Nine predictors were included in the model. In the multivariable analysis, the predictors with the strongest association were lesion size (OR 1.12 per cm, 95% CI 0.98-1.28), number of cores retrieved at biopsy (OR per core 0.87, 95% CI 0.75-1.01), presence of lobular cancerization (OR 5.29, 95% CI 1.25-26.77), and microinvasion (OR 3.75, 95% CI 1.42-9.87). The overall performance of the multivariable model was poor, with an explained variation of 9% (Nagelkerke's R2), mediocre discrimination with an area under the receiver operating characteristic curve of 0.66 (95% confidence interval 0.58-0.73), and fairly good calibration. CONCLUSION: The evaluation of our multivariable prediction model in a large, clinically representative study population proves that routine clinical and pathological variables are not suitable to select patients with large core needle biopsy-proven ductal carcinoma in situ for sentinel node biopsy during primary surgery.
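For readers unfamiliar with how such a model turns odds ratios into an individual risk estimate, the sketch below assembles a logistic predictor from the four quoted ORs. The intercept and patient values are hypothetical; only the odds ratios come from the abstract:

```python
import math

# Logistic-regression coefficients are the natural logs of the odds ratios.
# Only the ORs are taken from the abstract; the intercept is invented.
log_or = {
    "lesion_size_per_cm":    math.log(1.12),
    "cores_per_core":        math.log(0.87),
    "lobular_cancerization": math.log(5.29),
    "microinvasion":         math.log(3.75),
}
intercept = -2.0  # hypothetical baseline log-odds

def predicted_risk(size_cm: float, n_cores: int,
                   lobular: bool, microinv: bool) -> float:
    """Logistic model: risk = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    logit = (intercept
             + log_or["lesion_size_per_cm"] * size_cm
             + log_or["cores_per_core"] * n_cores
             + log_or["lobular_cancerization"] * lobular
             + log_or["microinvasion"] * microinv)
    return 1 / (1 + math.exp(-logit))

# Two hypothetical patients: risk rises with lesion size, lobular
# cancerization and microinvasion, and falls with more cores retrieved.
low = predicted_risk(size_cm=1.0, n_cores=12, lobular=False, microinv=False)
high = predicted_risk(size_cm=5.0, n_cores=6, lobular=True, microinv=True)
```

Note that the sign of the "cores retrieved" coefficient (OR below 1) encodes the intuition that more tissue sampled leaves less room for an undetected invasive component.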

  1. A scaling theory for the size distribution of emitted dust aerosols suggests climate models underestimate the size of the global dust cycle.

    Science.gov (United States)

    Kok, Jasper F

    2011-01-18

    Mineral dust aerosols impact Earth's radiation budget through interactions with clouds, ecosystems, and radiation, which constitutes a substantial uncertainty in understanding past and predicting future climate changes. One of the causes of this large uncertainty is that the size distribution of emitted dust aerosols is poorly understood. The present study shows that regional and global circulation models (GCMs) overestimate the emitted fraction of clay aerosols (< 2 μm diameter) relative to measurements. Because clay aerosols produce a strong radiative cooling, this overestimation biases climate predictions in dusty regions. On a global scale, the dust cycle in most GCMs is tuned to match radiative measurements, such that the overestimation of the radiative cooling of a given quantity of emitted dust has likely caused GCMs to underestimate the global dust emission rate. This implies that the deposition flux of dust and its fertilizing effects on ecosystems may be substantially larger than thought.

  2. Tidal marsh accretion processes in the San Francisco Bay-Delta - are our models underestimating the historic and future importance of plant-mediated organic accretion?

    Science.gov (United States)

    Windham-Myers, L.; Drexler, J. Z.; Byrd, K. B.; Schile, L. M.

    2012-12-01

    as well as titanium, iron, potassium and manganese. It remains unclear whether the hydrologic conditions associated with mineral inputs or the mineral inputs themselves promote decomposition and favor the accumulation of mineral fractions at the expense of organic fractions. As suspended sediment concentrations are currently decreasing in the SFBay-Delta, organic accretion may be enhanced or at least required for sustaining marsh elevations. These data suggest that a) potential organic accretion may be underestimated during calibration of peat accretion models with recent mineral-rich watershed conditions, and b) plant physiology and biochemistry are significant factors in the future and historic development of coastal peatlands.

  3. A Bayesian model to correct underestimated 3-D wind speeds from sonic anemometers increases turbulent components of the surface energy balance

    Science.gov (United States)

    Frank, John M.; Massman, William J.; Ewers, Brent E.

    2016-12-01

    Sonic anemometers are the principal instruments in micrometeorological studies of turbulence and ecosystem fluxes. Common designs underestimate vertical wind measurements because they lack a correction for transducer shadowing, with no consensus on a suitable correction. We reanalyze a subset of data collected during field experiments in 2011 and 2013 featuring two or four CSAT3 sonic anemometers. We introduce a Bayesian analysis to resolve the three-dimensional correction by optimizing differences between anemometers mounted both vertically and horizontally. A grid of 512 points (˜ ±5° resolution in wind location) is defined on a sphere around the sonic anemometer, from which the shadow correction for each transducer pair is derived from a set of 138 unique state variables describing the quadrants and borders. Using the Markov chain Monte Carlo (MCMC) method, the Bayesian model proposes new values for each state variable, recalculates the fast-response data set, summarizes the 5 min wind statistics, and accepts the proposed new values based on the probability that they make measurements from vertical and horizontal anemometers more equivalent. MCMC chains were constructed for three different prior distributions describing the state variables: no shadow correction, the Kaimal correction for transducer shadowing, and double the Kaimal correction, all initialized with 10 % uncertainty. The final posterior correction did not depend on the prior distribution and revealed both self- and cross-shadowing effects from all transducers. After correction, the vertical wind velocity and sensible heat flux increased ˜ 10 % with ˜ 2 % uncertainty, which was significantly higher than the Kaimal correction. We applied the posterior correction to eddy-covariance data from various sites across North America and found that the turbulent components of the energy balance (sensible plus latent heat flux) increased on average between 8 and 12 %, with an average 95 % credible
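
    The acceptance step described above can be illustrated with a drastically reduced toy: a Metropolis chain estimates a single shadowing attenuation factor by accepting proposals that make a "shadowed" and a reference anemometer more equivalent. The scalar factor, noise levels, prior range, and chain settings are all invented for illustration; the actual model optimizes 138 state variables over a spherical grid.

```python
import math
import random

random.seed(0)

# Synthetic vertical-wind series seen by two co-located sonic anemometers.
# C_TRUE is a hypothetical single attenuation factor standing in for the
# paper's full set of shadow-correction state variables.
C_TRUE = 0.90
w_true = [random.gauss(0.0, 1.0) for _ in range(500)]
w_shadowed = [C_TRUE * w + random.gauss(0.0, 0.05) for w in w_true]
w_reference = [w + random.gauss(0.0, 0.05) for w in w_true]

def log_likelihood(c):
    """Gaussian log-likelihood that corrected winds match the reference."""
    s2 = 2.0 * 0.05 ** 2          # combined measurement-noise variance
    return -sum((ws / c - wr) ** 2
                for ws, wr in zip(w_shadowed, w_reference)) / (2.0 * s2)

# Metropolis chain over the correction factor c, flat prior on [0.5, 1.5].
c, ll = 1.0, log_likelihood(1.0)
samples = []
for _ in range(4000):
    proposal = c + random.gauss(0.0, 0.02)
    if 0.5 <= proposal <= 1.5:
        ll_prop = log_likelihood(proposal)
        # Accept with probability min(1, exp(ll_prop - ll)).
        if ll_prop >= ll or random.random() < math.exp(ll_prop - ll):
            c, ll = proposal, ll_prop
    samples.append(c)

posterior_mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
```

    The posterior mean recovers the attenuation factor, and (as in the paper) the inferred correction does not depend on where the chain starts.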

  4. Systematic modelling and simulation of refrigeration systems

    DEFF Research Database (Denmark)

    Rasmussen, Bjarne D.; Jakobsen, Arne

    1998-01-01

    The task of developing a simulation model of a refrigeration system can be very difficult and time consuming. In order for this process to be effective, a systematic method for developing the system model is required. This method should aim at guiding the developer to clarify the purpose of the simulation, to select appropriate component models and to set up the equations in a well-arranged way. In this paper the outline of such a method is proposed and examples showing the use of this method for simulation of refrigeration systems are given.

  6. Systematic approach to MIS model creation

    Directory of Open Access Journals (Sweden)

    Macura Perica

    2004-01-01

    Full Text Available In this paper, by applying basic principles of the general theory of systems (the systematic approach), we formulate a model of a marketing information system. The basis for the research was the basic characteristics of the systematic approach and of the marketing system. The informational basis for management of the marketing system, i.e. of the marketing instruments, is presented by listing the most important information for decision making per individual marketing mix instrument. In the projected model of the marketing information system, information listed in this way forms a basis for establishing databases (databases of product, price, distribution, and promotion). The paper gives the basic preconditions for the formulation and functioning of the model. The model is presented by explicating the elements of its structure (environment, databases, operators, information-system analysts, decision makers - managers, i.e. input, process, output, feedback and the relations between these elements) which are necessary for its optimal functioning. In addition, the basic elements for implementation of the model into a business system are given, as well as the conditions for its efficient functioning and development.

  7. Childhood asthma prediction models: a systematic review.

    Science.gov (United States)

    Smit, Henriette A; Pinart, Mariona; Antó, Josep M; Keil, Thomas; Bousquet, Jean; Carlsen, Kai H; Moons, Karel G M; Hooft, Lotty; Carlsen, Karin C Lødrup

    2015-12-01

    Early identification of children at risk of developing asthma at school age is crucial, but the usefulness of childhood asthma prediction models in clinical practice is still unclear. We systematically reviewed all existing prediction models to identify preschool children with asthma-like symptoms at risk of developing asthma at school age. Studies were included if they developed a new prediction model or updated an existing model in children aged 4 years or younger with asthma-like symptoms, with assessment of asthma done between 6 and 12 years of age. 12 prediction models were identified in four types of cohorts of preschool children: those with health-care visits, those with parent-reported symptoms, those at high risk of asthma, or children in the general population. Four basic models included non-invasive, easy-to-obtain predictors only, notably family history, allergic disease comorbidities or precursors of asthma, and severity of early symptoms. Eight extended models included additional clinical tests, mostly specific IgE determination. Some models could better predict asthma development and other models could better rule out asthma development, but the predictive performance of no single model stood out in both aspects simultaneously. This finding suggests that there is a large proportion of preschool children with wheeze for which prediction of asthma development is difficult.

  8. Towards Systematic Benchmarking of Climate Model Performance

    Science.gov (United States)

    Gleckler, P. J.

    2014-12-01

    The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. 
Making the results from routine

  9. Using data assimilation for systematic model improvement

    Science.gov (United States)

    Lang, Matthew S.; van Leeuwen, Peter Jan; Browne, Phil

    2016-04-01

    In Numerical Weather Prediction parameterisations are used to simulate missing physics in the model. These can be due to a lack of scientific understanding or a lack of computing power available to address all the known physical processes. Parameterisations are sources of large uncertainty in a model as parameter values used in these parameterisations cannot be measured directly and hence are often not well known, and the parameterisations themselves are approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation, such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential data assimilation methods to estimate errors in the numerical models at each space-time point for each model equation. These errors are then fitted to predetermined functional forms of missing physics or parameterisations that are based upon prior information. The method picks out the functional form, or the combination of functional forms, that best fits the error structure. The prior information typically takes the form of expert knowledge. We applied the method to a one-dimensional advection model with additive model error, and it is shown that the method can accurately estimate parameterisations, with consistent error estimates. It is also demonstrated that state augmentation is not successful. The results indicate that this new method is a powerful tool in systematic model improvement.
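
    The fitting step described above can be sketched in a few lines: given model errors estimated at each grid point, fit each candidate functional form by least squares and pick the dominant one. The synthetic error field and candidate basis here are invented for illustration (noiseless, so the fit is exact).

```python
import math

# Synthetic per-grid-point model errors, as a data-assimilation increment
# might expose them. The "true" missing physics is 0.8*sin(x) (hypothetical).
xs = [i * 2.0 * math.pi / 100 for i in range(100)]
errors = [0.8 * math.sin(x) for x in xs]

# Candidate functional forms of the missing parameterisation (prior knowledge).
basis = {"sin(x)": math.sin, "cos(x)": math.cos, "linear": lambda x: x}

def fit_coefficient(f):
    """Least-squares coefficient of one candidate form fitted to the errors."""
    num = sum(f(x) * e for x, e in zip(xs, errors))
    den = sum(f(x) ** 2 for x in xs)
    return num / den

coeffs = {name: fit_coefficient(f) for name, f in basis.items()}
best = max(coeffs, key=lambda name: abs(coeffs[name]))   # dominant form
```

    In this noiseless toy, the sine term is recovered with its exact coefficient while the orthogonal candidates fit near zero, which is the sense in which the method "picks out" the correct functional form.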

  10. Systematic model building with flavor symmetries

    Energy Technology Data Exchange (ETDEWEB)

    Plentinger, Florian

    2009-12-19

    The observation of neutrino masses and lepton mixing has highlighted the incompleteness of the Standard Model of particle physics. In conjunction with this discovery, new questions arise: why are the neutrino masses so small, what form does their mass hierarchy take, why is the mixing in the quark and lepton sectors so different, and what is the structure of the Higgs sector. In order to address these issues and to predict future experimental results, different approaches are considered. One particularly interesting possibility is Grand Unified Theories such as SU(5) or SO(10). GUTs are vertical symmetries since they unify the SM particles into multiplets and usually predict new particles which can naturally explain the smallness of the neutrino masses via the seesaw mechanism. On the other hand, horizontal symmetries, i.e., flavor symmetries, acting on the generation space of the SM particles, are also promising. They can serve as an explanation for the quark and lepton mass hierarchies as well as for the different mixings in the quark and lepton sectors. In addition, flavor symmetries are significantly involved in the Higgs sector and predict certain forms of mass matrices. This high predictivity makes GUTs and flavor symmetries interesting for both theorists and experimentalists. These extensions of the SM can also be combined with theories such as supersymmetry or extra dimensions. In addition, they usually have implications for the observed matter-antimatter asymmetry of the universe or can provide a dark matter candidate. In general, they also predict the lepton flavor violating rare decays μ → eγ, τ → μγ, and τ → eγ, which are strongly bounded by experiments but might be observed in the future. In this thesis, we combine all of these approaches, i.e., GUTs, the seesaw mechanism and flavor symmetries. 
Moreover, our aim is to develop and carry out a systematic model building approach with flavor symmetries and

  11. Parental underestimates of child weight: a meta-analysis.

    Science.gov (United States)

    Lundahl, Alyssa; Kidwell, Katherine M; Nelson, Timothy D

    2014-03-01

    Parental perceptions of their children's weight play an important role in obesity prevention and treatment. The objective of this study was to determine the proportion of parents worldwide who underestimate their children's weight and moderators of such misperceptions. Original studies published to January 2013 were chosen through literature searches in PUBMED, PSYCHINFO, and CINAHL databases. References of retrieved articles were also searched for relevant studies. Studies were published in English and assessed parental perceptions of children's weight and then compared perceptions to recognized standards for defining overweight based on anthropometric measures. Data were extracted on study-level constructs, child- and parent-characteristics, procedural characteristics, and parental underestimates separately for normal-weight and overweight/obese samples. Pooled effect sizes were calculated using random-effects models and adjusted for publication bias. Moderators were explored using mixed-effect models. A total of 69 articles (representing 78 samples; n = 15,791) were included in the overweight/obese meta-analysis. Adjusted effect sizes revealed that 50.7% (95% confidence interval 31.1%-70.2%) of parents underestimate their overweight/obese children's weight. Significant moderators of this effect included child's age and BMI. A total of 52 articles (representing 59 samples; n = 64,895) were included in the normal-weight meta-analysis. Pooled effect sizes indicated that 14.3% (95% confidence interval 11.7%-17.4%) of parents underestimate their children's normal-weight status. Significant moderators of this effect included child gender, parent weight, and the method (visual versus nonvisual) in which perception was assessed. Half of parents underestimated their children's overweight/obese status and a significant minority underestimated children's normal weight. 
Pediatricians are well positioned to make efforts to remedy parental underestimates and promote adoption
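
    The pooling step described above can be sketched with a standard random-effects (DerSimonian-Laird) meta-analysis of logit-transformed proportions. The per-study counts below are invented for illustration; only the method is the one named in the abstract.

```python
import math

# Hypothetical per-study data: (parents underestimating, sample size).
studies = [(55, 100), (40, 90), (70, 120), (30, 80), (60, 95)]

# Logit-transform each proportion; within-study variance is 1/k + 1/(n-k).
ys, vs = [], []
for k, n in studies:
    ys.append(math.log(k / (n - k)))
    vs.append(1.0 / k + 1.0 / (n - k))

# DerSimonian-Laird estimate of the between-study variance tau^2.
w = [1.0 / v for v in vs]
y_fixed = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, ys))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c)

# Random-effects pooled logit, back-transformed to a proportion.
w_re = [1.0 / (v + tau2) for v in vs]
pooled_logit = sum(wi * yi for wi, yi in zip(w_re, ys)) / sum(w_re)
pooled_prop = 1.0 / (1.0 + math.exp(-pooled_logit))
```

    The between-study variance tau² inflates each study's weight denominator, so heterogeneous studies are down-weighted less aggressively than under a fixed-effect model.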

  12. A scaling theory for the size distribution of emitted dust aerosols suggests climate models underestimate the size of the global dust cycle

    CERN Document Server

    Kok, Jasper F

    2010-01-01

    Mineral dust aerosols impact Earth's radiation budget through interactions with clouds, ecosystems, and radiation, which constitutes a substantial uncertainty in understanding past and predicting future climate changes. One of the causes of this large uncertainty is that the size distribution of emitted dust aerosols is poorly understood. The present study shows that regional and global circulation models (GCMs) overestimate the emitted fraction of clay aerosols (< 2 μm diameter) by a factor of ~2-8 relative to measurements. This discrepancy is resolved by deriving a simple theoretical expression of the emitted dust size distribution that is in excellent agreement with measurements. This expression is based on the physics of the scale-invariant fragmentation of brittle materials, which is shown to be applicable to dust emission. Because clay aerosols produce a strong radiative cooling, the overestimation of the clay fraction causes GCMs to also overestimate the radiative cooling of a given quantity o...
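
    To make the "emitted clay fraction" concrete, the sketch below integrates a schematic volume size distribution patterned after the brittle-fragmentation scaling described above and reads off the sub-2 μm share. The functional form and all parameter values here are illustrative assumptions, not the paper's fitted expression.

```python
import math

# Schematic emitted-dust volume size distribution (illustrative stand-in for
# the fragmentation-scaling expression; D_S, SIGMA_S, LAM are hypothetical).
D_S, SIGMA_S, LAM = 3.4, 3.0, 12.0     # diameters in μm

def dV_dlnD(d):
    """Unnormalized volume of emitted dust per log-diameter interval."""
    frag = 1.0 + math.erf(math.log(d / D_S) / (math.sqrt(2.0) * math.log(SIGMA_S)))
    return d * frag * math.exp(-((d / LAM) ** 3))

# Midpoint integration on a log-spaced grid from 0.1 to 20 μm.
n, d_min, d_max = 2000, 0.1, 20.0
step = (math.log(d_max) - math.log(d_min)) / n
grid = [math.exp(math.log(d_min) + (i + 0.5) * step) for i in range(n)]
total = sum(dV_dlnD(d) * step for d in grid)
clay = sum(dV_dlnD(d) * step for d in grid if d < 2.0)
clay_fraction = clay / total          # emitted volume fraction below 2 μm
```

    With parameters like these, most of the emitted volume sits above 2 μm, which is the measurement-side picture the abstract contrasts with the larger clay fractions in GCMs.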

  13. Systematic identification of crystallization kinetics within a generic modelling framework

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Gernaey, Krist

    2012-01-01

    A systematic development of constitutive models within a generic modelling framework has been developed for use in design, analysis and simulation of crystallization operations. The framework contains a tool for model identification connected with a generic crystallizer modelling tool-box, a tool...

  14. Conceptual Model for Systematic Construction Waste Management

    OpenAIRE

    Abd Rahim Mohd Hilmi Izwan; Kasim Narimah

    2017-01-01

    Development of the construction industry generates construction waste, which contributes to environmental issues. Weak compliance with construction waste management, especially on construction sites, has also contributed to the large volumes of waste ending up in landfills and illegal dumping areas. This indicates that construction projects need systematic construction waste management. To date, a comprehensive set of criteria for construction waste management, particularly for const...

  15. The effect of uncertainty and systematic errors in hydrological modelling

    Science.gov (United States)

    Steinsland, I.; Engeland, K.; Johansen, S. S.; Øverleir-Petersen, A.; Kolberg, S. A.

    2014-12-01

    The aims of hydrological model identification and calibration are to find the best possible set of process parametrization and parameter values that transform inputs (e.g. precipitation and temperature) to outputs (e.g. streamflow). These models enable us to make predictions of streamflow. Several sources of uncertainties have the potential to hamper the possibility of a robust model calibration and identification. In order to grasp the interaction between model parameters, inputs and streamflow, it is important to account for both systematic and random errors in inputs (e.g. precipitation and temperatures) and streamflows. By random errors we mean errors that are independent from time step to time step, whereas by systematic errors we mean errors that persist for a longer period. Both random and systematic errors are important in the observation and interpolation of precipitation and temperature inputs. Important random errors come from the measurements themselves and from the network of gauges. Important systematic errors originate from the under-catch in precipitation gauges and from unknown spatial trends that are approximated in the interpolation. For streamflow observations, the water level recordings might give random errors, whereas the rating curve contributes mainly a systematic error. In this study we want to answer the question "What is the effect of random and systematic errors in inputs and observed streamflow on estimated model parameters and streamflow predictions?". To answer this, we systematically test the effect of including uncertainties in inputs and streamflow during model calibration and simulation in the distributed HBV model operating on daily time steps for the Osali catchment in Norway. The case study is based on observations with carefully quantified uncertainty, and increased uncertainties and systematic errors are introduced realistically, for example by removing a precipitation gauge from the network. We find that the systematic errors in
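
    The distinction between random and systematic input errors can be demonstrated with a toy rainfall-runoff model Q = r·P: random noise averages out during calibration, while a systematic under-catch is absorbed into the calibrated parameter. The runoff coefficient, under-catch factor, and noise level below are hypothetical.

```python
import random

random.seed(1)

R_TRUE = 0.5            # hypothetical true runoff coefficient
UNDERCATCH = 0.9        # gauges systematically catch 90 % of true precipitation

p_true = [random.uniform(5.0, 20.0) for _ in range(200)]
q_obs = [R_TRUE * p for p in p_true]          # "observed" streamflow

def calibrate(p_obs):
    """Least-squares runoff coefficient for the toy model Q = r * P."""
    return sum(p * q for p, q in zip(p_obs, q_obs)) / sum(p * p for p in p_obs)

# Random input errors largely average out; the systematic under-catch does not.
p_random = [p * random.gauss(1.0, 0.1) for p in p_true]
p_systematic = [p * UNDERCATCH for p in p_true]

r_random = calibrate(p_random)          # close to R_TRUE
r_systematic = calibrate(p_systematic)  # biased to exactly R_TRUE / UNDERCATCH
```

    The calibrated parameter compensates for the persistent input bias, which is why model parameters estimated from under-caught precipitation cannot be interpreted as physical quantities.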

  16. Systematic underestimation of the age of selected alleles

    Directory of Open Access Journals (Sweden)

    Joanna L. Kelley

    2012-08-01

    Full Text Available A common interpretation of genome-wide selection scans is that the dispersal of anatomically modern humans out of Africa and into diverse environments led to a number of genetic adaptations. If so, patterns of polymorphism from non-African individuals should show the signature of adaptations dating to 40 to 100 Kya, coinciding with the main exodus from Africa. However, scans of polymorphism data from a few populations have yielded conflicting results about the chronology of local, population-specific adaptations. In particular, a number of papers report very recent ages for selected alleles in humans, which postdate the development of agriculture 10 Kya, and suggest that adaptive differences among human populations are much more recent. I present analysis of simulations suggesting a downward bias in methods commonly used to estimate the age of alleles. These findings indicate that an estimate of the time to the most recent common ancestor (tMRCA) obtained using standard methods (used as a proxy for the age of an allele) of less than 10 Kya is consistent with an allele that actually became selected before the onset of agriculture and potentially as early as 50 Kya. These findings suggest that the genomic scans for selection may be consistent with selective pressures tied to the Out of Africa expansion of modern human populations.

  17. Towards systematic exploration of multi-Higgs-doublet models

    CERN Document Server

    Ivanov, I P

    2015-01-01

    Conservative bSM models with rich scalar sector, such as multi-Higgs-doublet models, can easily accommodate the SM-like properties of the 125 GeV scalar observed at the LHC. Possessing a variety of bSM signals, they are worth investigating in fuller detail. Systematic study of these models is hampered by the highly multi-dimensional parameter space and by mathematical challenges. I outline some directions along which multi-Higgs-doublet models in the vicinity of a large discrete symmetry can be systematically explored.

  18. Is cryptosporidiosis an underestimated disease in cats?

    OpenAIRE

    L da Silveira-Neto; S Inácio; Oliveira, L. N.; KDS Bresciani

    2015-01-01

    Studies on the occurrence of Cryptosporidium spp. in cats are still scarce. In this literature review, we address epidemiological and clinical aspects, as well as diagnostic methods, therapeutic behaviour, and control and prevention measures for this disease in cats, with the aim of investigating whether cryptosporidiosis is an underestimated disease in the laboratory routine and in small animal medical clinics.

  19. Systematic reviews of animal models: methodology versus epistemology.

    Science.gov (United States)

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  20. Systematic Reviews of Animal Models: Methodology versus Epistemology

    Directory of Open Access Journals (Sweden)

    Ray Greek, Andre Menache

    2013-01-01

    Full Text Available Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  1. A Unified Framework for Systematic Model Improvement

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2003-01-01

    A unified framework for improving the quality of continuous time models of dynamic systems based on experimental data is presented. The framework is based on an interplay between stochastic differential equation (SDE) modelling, statistical tests and multivariate nonparametric regression...

  2. Systematic experimental based modeling of a rotary piezoelectric ultrasonic motor

    DEFF Research Database (Denmark)

    Mojallali, Hamed; Amini, Rouzbeh; Izadi-Zamanabadi, Roozbeh

    2007-01-01

    In this paper, a new method for equivalent circuit modeling of a traveling wave ultrasonic motor is presented. The free stator of the motor is modeled by an equivalent circuit containing complex circuit elements. A systematic approach for identifying the elements of the equivalent circuit...

  3. Systematic parameter inference in stochastic mesoscopic modeling

    Science.gov (United States)

    Lei, Huan; Yang, Xiu; Li, Zhen; Karniadakis, George Em

    2017-02-01

    We propose a method to efficiently determine the optimal coarse-grained force field in mesoscopic stochastic simulations of Newtonian fluid and polymer melt systems modeled by dissipative particle dynamics (DPD) and energy conserving dissipative particle dynamics (eDPD). The response surfaces of various target properties (viscosity, diffusivity, pressure, etc.) with respect to model parameters are constructed based on the generalized polynomial chaos (gPC) expansion using simulation results on sampling points (e.g., individual parameter sets). To alleviate the computational cost to evaluate the target properties, we employ the compressive sensing method to compute the coefficients of the dominant gPC terms given the prior knowledge that the coefficients are "sparse". The proposed method shows comparable accuracy with the standard probabilistic collocation method (PCM) while it imposes a much weaker restriction on the number of the simulation samples, especially for systems with a high dimensional parametric space. Full access to the response surfaces within the confidence range enables us to infer the optimal force parameters given the desirable values of target properties at the macroscopic scale. Moreover, it enables us to investigate the intrinsic relationship between the model parameters, identify possible degeneracies in the parameter space, and optimize the model by eliminating model redundancies. The proposed method provides an efficient alternative approach for constructing mesoscopic models by inferring model parameters to recover target properties of the physical systems (e.g., from experimental measurements), where those force field parameters and formulation cannot be derived from the microscopic level in a straightforward way.
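
    The sparse-recovery step can be mimicked with a greedy solver: below, orthogonal matching pursuit (a stand-in for the compressive-sensing method the abstract names) recovers a few dominant surrogate coefficients from a small number of samples. The design matrix, sparsity level, and coefficient values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "response surface": a target property depends sparsely on a few basis
# terms of the model parameters (all values below are hypothetical).
n_samples, n_terms = 30, 20
A = rng.standard_normal((n_samples, n_terms))   # sampled design matrix
true_coef = np.zeros(n_terms)
true_coef[[2, 7, 11]] = [1.5, -2.0, 0.7]        # sparse dominant terms
y = A @ true_coef                               # simulated target property

def omp(A, y, n_nonzero):
    """Orthogonal matching pursuit: greedy sparse recovery of coefficients."""
    residual, support = y.copy(), []
    coef = np.zeros(A.shape[1])
    for _ in range(n_nonzero):
        # Pick the column most correlated with the current residual...
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # ...then refit all selected columns by least squares.
        sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ sol
    coef[support] = sol
    return coef

coef = omp(A, y, 3)
```

    With noiseless data and far fewer samples (30) than basis terms (20 here, but the same idea scales to hundreds), the sparse support and coefficient values are recovered, which is the weaker sample-count restriction the abstract highlights.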

  4. Systematic parameter inference in stochastic mesoscopic modeling

    CERN Document Server

    Lei, Huan; Li, Zhen; Karniadakis, George

    2016-01-01

    We propose a method to efficiently determine the optimal coarse-grained force field in mesoscopic stochastic simulations of Newtonian fluid and polymer melt systems modeled by dissipative particle dynamics (DPD) and energy conserving dissipative particle dynamics (eDPD). The response surfaces of various target properties (viscosity, diffusivity, pressure, etc.) with respect to model parameters are constructed based on the generalized polynomial chaos (gPC) expansion using simulation results on sampling points (e.g., individual parameter sets). To alleviate the computational cost to evaluate the target properties, we employ the compressive sensing method to compute the coefficients of the dominant gPC terms given the prior knowledge that the coefficients are sparse. The proposed method shows comparable accuracy with the standard probabilistic collocation method (PCM) while it imposes a much weaker restriction on the number of the simulation samples especially for systems with high dimensional parametric space....

  5. The Social Relations Model in Family Studies: A Systematic Review

    Science.gov (United States)

    Eichelsheim, Veroni I.; Dekovic, Maja; Buist, Kirsten L.; Cook, William L.

    2009-01-01

    The Social Relations Model (SRM) allows for examination of family relations on three different levels: the individual level (actor and partner effects), the dyadic level (relationship effects), and the family level (family effect). The aim of this study was to present a systematic review of SRM family studies and identify general patterns in the…

  6. Fishermen's underestimation of risk

    DEFF Research Database (Denmark)

    Knudsen, Fabienne; Grøn, Sisse

    2009-01-01

    Fishermen's underestimation of risk. Background: In order to understand the effect of footwear and flooring on slips, trips and falls, the first author visited 4 fishing boats. An important spinoff of the study was to get an in situ insight into the way fishermen perceive risk. Objectives: The presentation will analyse fishermen's risk perception, its causes and consequences. Methods: The first author participated in 3 voyages at sea on fishing vessels (from 1 to 10 days each and from 2 to 4 crewmembers), where interviews and participant observation were undertaken. A 4th fishing boat was visited at dock and the crew interviewed. Based on notes, diary, interviews and photo/video documentation, records related to the fishermen's risk perception were compiled and then analysed by means of theories of risk perception. Results: Fishermen tend partly to underrate the risks they are running, partly…

  7. Systematic improvement of molecular representations for machine learning models

    CERN Document Server

    Huang, Bing

    2016-01-01

    The predictive accuracy of Machine Learning (ML) models of molecular properties depends on the choice of the molecular representation. We introduce a hierarchy of representations based on uniqueness and target similarity criteria. To systematically control target similarity, we rely on interatomic many body expansions including Bonding, Angular, and higher order terms (BA). Addition of higher order contributions systematically increases similarity to the potential energy function as well as predictive accuracy of the resulting ML models. Numerical evidence is presented for the performance of BAML models trained on molecular properties pre-calculated at electron-correlated and density functional theory levels of theory for thousands of small organic molecules. Properties studied include enthalpies and free energies of atomization, heat capacity, zero-point vibrational energies, dipole moment, polarizability, HOMO/LUMO energies and gap, ionization potential, electron affinity, and electronic excitations. After tr...

  8. Fishermen's underestimation of risk

    DEFF Research Database (Denmark)

    Knudsen, Fabienne; Grøn, Sisse

    2009-01-01

    Fishermen's underestimation of risk. Background: In order to understand the effect of footwear and flooring on slips, trips and falls, the first author visited 4 fishing boats. An important spinoff of the study was to get an in situ insight into the way fishermen perceive risk. Objectives: The presentation will analyse fishermen's risk perception, its causes and consequences. Methods: The first author participated in 3 voyages at sea on fishing vessels (from 1 to 10 days each and from 2 to 4 crewmembers), where interviews and participant observation were undertaken. A 4th fishing boat was visited at dock and the crew interviewed. Based on notes, diary, interviews and photo/video documentation, records related to the fishermen's risk perception were compiled and then analysed by means of theories of risk perception. Results: Fishermen tend partly to underrate the risks they are running, partly…

  9. Business model framework applications in health care: A systematic review.

    Science.gov (United States)

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-01-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  10. Improved Systematic Pointing Error Model for the DSN Antennas

    Science.gov (United States)

    Rochblatt, David J.; Withington, Philip M.; Richter, Paul H.

    2011-01-01

    New pointing models have been developed for large reflector antennas whose construction is founded on an elevation-over-azimuth mount. At JPL, the new models were applied to the Deep Space Network (DSN) 34-meter antenna subnet for correction of their systematic pointing errors; this achieved significant improvement in performance at Ka-band (32-GHz) and X-band (8.4-GHz). The new models provide pointing improvements relative to the traditional models by a factor of two to three, which translates to approximately 3-dB performance improvement at Ka-band. For radio science experiments where blind pointing performance is critical, the new innovation provides a new enabling technology. The model extends the traditional physical models with higher-order mathematical terms, thereby increasing the resolution of the model for a better fit to the underlying systematic imperfections that are the cause of antenna pointing errors. The philosophy of the traditional model was that all mathematical terms in the model must be traced to a physical phenomenon causing antenna pointing errors. The traditional physical terms are: antenna axis tilts, gravitational flexure, azimuth collimation, azimuth encoder fixed offset, azimuth and elevation skew, elevation encoder fixed offset, residual refraction, azimuth encoder scale error, and antenna pointing de-rotation terms for beam waveguide (BWG) antennas. Besides the addition of spherical harmonics terms, the new models differ from the traditional ones in that the coefficients for the cross-elevation and elevation corrections are completely independent and may be different, while in the traditional model, some of the terms are identical. In addition, the new software allows for all-sky or mission-specific model development, and can utilize the previously used model as an a priori estimate for the development of the updated models.
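The fitting step described above, estimating the coefficients of physical and higher-order terms from measured pointing residuals, reduces to a linear least-squares problem. The sketch below uses synthetic data and a simplified, assumed term set, not the actual DSN model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sketch: fit a few classical pointing-model terms plus one
# higher-order harmonic to cross-elevation error measurements.
az = rng.uniform(0, 2 * np.pi, 200)   # azimuth   [rad]
el = rng.uniform(0.1, 1.4, 200)       # elevation [rad]

# Design matrix: encoder offset, axis-tilt-like terms, gravitational
# flexure (cos el), and a higher-order harmonic extension.
A = np.column_stack([
    np.ones_like(az),        # fixed encoder offset
    np.sin(az) * np.sin(el), # axis tilt term
    np.cos(az) * np.sin(el), # axis tilt term
    np.cos(el),              # gravitational flexure
    np.sin(2 * az),          # higher-order harmonic term
])
true_coef = np.array([0.010, 0.004, -0.003, 0.006, 0.002])  # degrees, invented
err = A @ true_coef + rng.normal(0, 1e-4, az.size)          # "measured" residuals

coef_hat, *_ = np.linalg.lstsq(A, err, rcond=None)
rms_before = np.sqrt(np.mean(err**2))
rms_after = np.sqrt(np.mean((err - A @ coef_hat)**2))
```

Adding higher-order terms to the design matrix, as the abstract describes, improves the fit in exactly this sense: the post-fit RMS drops relative to the raw residuals.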

  11. Analysis and Correction of Systematic Height Model Errors

    Science.gov (United States)

    Jacobsen, K.

    2016-06-01

    The geometry of digital height models (DHM) determined with optical satellite stereo combinations depends upon the image orientation, influenced by the satellite camera, the system calibration and the attitude registration. As a standard these days, the image orientation is available in the form of rational polynomial coefficients (RPC). Usually a bias correction of the RPC based on ground control points is required. In most cases the bias correction requires an affine transformation, sometimes only shifts, in image or object space. For some satellites and some cases, e.g. caused by a small base length, such an image orientation does not lead to the possible accuracy of height models. As reported e.g. by Yong-hua et al. 2015 and Zhang et al. 2015, especially the Chinese stereo satellite ZiYuan-3 (ZY-3) has a limited calibration accuracy and only a 4 Hz attitude recording, which may not be satisfying. Zhang et al. 2015 tried to improve the attitude based on the color sensor bands of ZY-3, but the color images are not always available, as is detailed satellite orientation information. There is a tendency of systematic deformation in a Pléiades tri-stereo combination with small base length. The small base length enlarges small systematic errors in object space. But also in some other satellite stereo combinations systematic height model errors have been detected. The largest influence is the unsatisfactory leveling of the height models, but low-frequency height deformations can also be seen. A tilt of the DHM can in theory be eliminated by ground control points (GCP), but often the GCP accuracy and distribution are not optimal, not allowing a correct leveling of the height model. In addition, a model deformation at GCP locations may lead to a not optimal DHM leveling. Supported by reference height models, better accuracy has been reached. As reference height model, the Shuttle Radar Topography Mission (SRTM) digital surface model (DSM) or the new AW3D30 DSM, based on ALOS PRISM images, are …
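The leveling step described above, removing a systematic DHM tilt against a reference surface, reduces in its simplest form to fitting a plane to height differences. A minimal sketch with synthetic data (not the paper's software), assuming differences are taken against GCPs or a reference DSM such as SRTM:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic height differences: DHM minus reference, contaminated by a
# systematic tilt plus a vertical shift, with 0.3 m random noise.
x = rng.uniform(0, 10_000, 500)           # easting  [m]
y = rng.uniform(0, 10_000, 500)           # northing [m]
tilt = 2e-4 * x - 1e-4 * y + 1.5          # systematic error: tilt + shift [m]
dh = tilt + rng.normal(0, 0.3, x.size)    # observed height differences [m]

# Fit the plane dh = a*x + b*y + c by least squares and remove it.
A = np.column_stack([x, y, np.ones_like(x)])
(a, b, c), *_ = np.linalg.lstsq(A, dh, rcond=None)
dh_leveled = dh - (a * x + b * y + c)     # leveled height differences
```

After the plane is removed, only the random noise and any low-frequency deformation remain, which matches the abstract's point that a tilt is correctable while lower-frequency errors need denser reference data.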

  12. ANALYSIS AND CORRECTION OF SYSTEMATIC HEIGHT MODEL ERRORS

    Directory of Open Access Journals (Sweden)

    K. Jacobsen

    2016-06-01

    Full Text Available The geometry of digital height models (DHM) determined with optical satellite stereo combinations depends upon the image orientation, influenced by the satellite camera, the system calibration and the attitude registration. As a standard these days, the image orientation is available in the form of rational polynomial coefficients (RPC). Usually a bias correction of the RPC based on ground control points is required. In most cases the bias correction requires an affine transformation, sometimes only shifts, in image or object space. For some satellites and some cases, e.g. caused by a small base length, such an image orientation does not lead to the possible accuracy of height models. As reported e.g. by Yong-hua et al. 2015 and Zhang et al. 2015, especially the Chinese stereo satellite ZiYuan-3 (ZY-3) has a limited calibration accuracy and only a 4 Hz attitude recording, which may not be satisfying. Zhang et al. 2015 tried to improve the attitude based on the color sensor bands of ZY-3, but the color images are not always available, as is detailed satellite orientation information. There is a tendency of systematic deformation in a Pléiades tri-stereo combination with small base length. The small base length enlarges small systematic errors in object space. But also in some other satellite stereo combinations systematic height model errors have been detected. The largest influence is the unsatisfactory leveling of the height models, but low-frequency height deformations can also be seen. A tilt of the DHM can in theory be eliminated by ground control points (GCP), but often the GCP accuracy and distribution are not optimal, not allowing a correct leveling of the height model. In addition, a model deformation at GCP locations may lead to a not optimal DHM leveling. Supported by reference height models, better accuracy has been reached. As reference height model, the Shuttle Radar Topography Mission (SRTM) digital surface model (DSM) or the new AW3D30 DSM, based on ALOS …

  13. Systematic Uncertainties in High-Energy Hadronic Interaction Models

    Science.gov (United States)

    Zha, M.; Knapp, J.; Ostapchenko, S.

    2003-07-01

    Hadronic interaction models for cosmic ray energies are uncertain since our knowledge of hadronic interactions is extrapolated from accelerator experiments at much lower energies. At present most high-energy models are based on Gribov-Regge theory of multi-Pomeron exchange, which provides a theoretical framework to evaluate cross-sections and particle production. While experimental data constrain some of the model parameters, others are not well determined and are therefore a source of systematic uncertainties. In this paper we evaluate the variation of results obtained with the QGSJET model, when modifying parameters relating to three major sources of uncertainty: the form of the parton structure function, the role of diffractive interactions, and the string hadronisation. Results on inelastic cross sections, on secondary particle production and on the air shower development are discussed.

  14. Modelling the transmission of healthcare associated infections: a systematic review

    Science.gov (United States)

    2013-01-01

    Background Dynamic transmission models are increasingly being used to improve our understanding of the epidemiology of healthcare-associated infections (HCAI). However, there has been no recent comprehensive review of this emerging field. This paper summarises how mathematical models have informed the field of HCAI and how methods have developed over time. Methods MEDLINE, EMBASE, Scopus, CINAHL plus and Global Health databases were systematically searched for dynamic mathematical models of HCAI transmission and/or the dynamics of antimicrobial resistance in healthcare settings. Results In total, 96 papers met the eligibility criteria. The main research themes considered were evaluation of infection control effectiveness (64%), variability in transmission routes (7%), the impact of movement patterns between healthcare institutes (5%), the development of antimicrobial resistance (3%), and strain competitiveness or co-colonisation with different strains (3%). Methicillin-resistant Staphylococcus aureus was the most commonly modelled HCAI (34%), followed by vancomycin-resistant enterococci (16%). Other common HCAIs, e.g. Clostridium difficile, were rarely investigated (3%). Very few models have been published on HCAI from low- or middle-income countries. The first HCAI models looked at antimicrobial resistance in hospital settings using compartmental deterministic approaches. Stochastic models (which include the role of chance in the transmission process) are becoming increasingly common. Model calibration (inference of unknown parameters by fitting models to data) and sensitivity analysis are comparatively uncommon, occurring in 35% and 36% of studies respectively, but their application is increasing. Only 5% of models compared their predictions to external data. Conclusions Transmission models have been used to understand complex systems and to predict the impact of control policies. Methods have generally improved, with an increased use of stochastic models, and …
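A minimal example of the stochastic, chance-driven transmission models the review highlights: a toy colonisation model for a single hospital ward. All rates and the ward size are invented for illustration; none are from the reviewed studies.

```python
import random

random.seed(42)

# Toy stochastic ward model: each day, each susceptible patient acquires
# colonisation with probability beta * colonised / N; each colonised
# patient clears with probability mu or is discharged (and replaced by
# a susceptible) with probability `discharge`.
N, beta, mu, discharge = 20, 0.08, 0.05, 0.1

def simulate(days, colonised0=2):
    colonised = colonised0
    history = [colonised]
    for _ in range(days):
        p_acq = beta * colonised / N
        new = sum(random.random() < p_acq for _ in range(N - colonised))
        cleared = sum(random.random() < mu for _ in range(colonised))
        out = sum(random.random() < discharge for _ in range(colonised))
        colonised = max(0, min(N, colonised + new - cleared - out))
        history.append(colonised)
    return history

# Because the process is stochastic, outcomes vary between runs -- the
# "role of chance" the review contrasts with deterministic models.
runs = [simulate(200)[-1] for _ in range(50)]
```

Replacing the random draws with expected values would recover a deterministic compartmental model, the other model class the review describes.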

  15. Systematic evaluation of atmospheric chemistry-transport model CHIMERE

    Science.gov (United States)

    Khvorostyanov, Dmitry; Menut, Laurent; Mailler, Sylvain; Siour, Guillaume; Couvidat, Florian; Bessagnet, Bertrand; Turquety, Solene

    2017-04-01

    Regional-scale atmospheric chemistry-transport models (CTM) are used to develop air quality regulatory measures, to support environmentally sensitive decisions in industry, and to address a variety of scientific questions involving atmospheric composition. Model performance evaluation with measurement data is critical to understand their limits and the degree of confidence in model results. The CHIMERE CTM (http://www.lmd.polytechnique.fr/chimere/) is a French national tool for operational forecast and decision support and is widely used in the international research community in various areas of atmospheric chemistry and physics, climate, and environment (http://www.lmd.polytechnique.fr/chimere/CW-articles.php). This work presents the model evaluation framework applied systematically to new CHIMERE CTM versions in the course of continuous model development. The framework uses three of the four CTM evaluation types identified by the Environmental Protection Agency (EPA) and the American Meteorological Society (AMS): operational, diagnostic, and dynamic. It allows comparison of the overall model performance across subsequent model versions (operational evaluation), identification of specific processes and/or model inputs that could be improved (diagnostic evaluation), and testing of the model's sensitivity to changes in air quality, such as emission reductions and meteorological events (dynamic evaluation). The observation datasets currently used for the evaluation are: EMEP (surface concentrations), AERONET (optical depths), and WOUDC (ozone sounding profiles). The framework is implemented as an automated processing chain and allows interactive exploration of the results via a web interface.
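The operational-evaluation scores such a framework typically computes, bias, RMSE, and correlation between modelled and observed concentrations, fit in a few lines. The numbers below are invented for illustration, not EMEP data:

```python
import math

# Paired modelled and observed surface concentrations (illustrative values).
obs   = [12.0, 15.5, 9.8, 20.1, 17.3, 11.2]   # observations, e.g. ug/m3
model = [13.1, 14.0, 11.0, 18.5, 19.0, 10.4]  # model output at the same sites

n = len(obs)
bias = sum(m - o for m, o in zip(model, obs)) / n          # mean bias
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)

mo, mm = sum(obs) / n, sum(model) / n
cov = sum((o - mo) * (m - mm) for o, m in zip(obs, model))
corr = cov / math.sqrt(sum((o - mo) ** 2 for o in obs)
                       * sum((m - mm) ** 2 for m in model))  # Pearson r
```

Tracking these scores across model versions is exactly the "operational evaluation" the abstract refers to; diagnostic and dynamic evaluation build on the same comparisons stratified by process or scenario.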

  16. Systematic integration of experimental data and models in systems biology

    Directory of Open Access Journals (Sweden)

    Simeonidis Evangelos

    2010-11-01

    Full Text Available Abstract Background The behaviour of biological systems can be deduced from their mathematical models. However, multiple sources of data in diverse forms are required in the construction of a model in order to define its components and their biochemical reactions, and corresponding parameters. Automating the assembly and use of systems biology models is dependent upon data integration processes involving the interoperation of data and analytical resources. Results Taverna workflows have been developed for the automated assembly of quantitative parameterised metabolic networks in the Systems Biology Markup Language (SBML). An SBML model is built in a systematic fashion by the workflows, which start with the construction of a qualitative network using data from a MIRIAM-compliant genome-scale model of yeast metabolism. This is followed by parameterisation of the SBML model with experimental data from two repositories, the SABIO-RK enzyme kinetics database and a database of quantitative experimental results. The models are then calibrated and simulated in workflows that call out to COPASIWS, the web service interface to the COPASI software application for analysing biochemical networks. These systems biology workflows were evaluated for their ability to construct a parameterised model of yeast glycolysis. Conclusions Distributed information about metabolic reactions that have been described to MIRIAM standards enables the automated assembly of quantitative systems biology models of metabolic networks based on user-defined criteria. Such data integration processes can be implemented as Taverna workflows to provide a rapid overview of the components and their relationships within a biochemical system.

  17. Models Predicting Success of Infertility Treatment: A Systematic Review

    Science.gov (United States)

    Zarinara, Alireza; Zeraati, Hojjat; Kamali, Koorosh; Mohammad, Kazem; Shahnazari, Parisa; Akhondi, Mohammad Mehdi

    2016-01-01

    Background: Infertile couples are faced with problems that affect their marital life. Infertility treatment is expensive and time consuming and is occasionally simply not possible. Prediction models for infertility treatment have been proposed, and prediction of treatment success is a new field in infertility treatment. Because prediction of treatment success is a new need for infertile couples, this paper reviewed previous studies to capture a general picture of the applicability of the models. Methods: This study was conducted as a systematic review at Avicenna Research Institute in 2015. Six databases were searched based on WHO definitions and MeSH keywords. Papers about prediction models in infertility were evaluated. Results: Eighty-one papers were eligible for the study. Papers covered years after 1986, and studies were designed retrospectively and prospectively. IVF prediction models account for the largest share of the papers. The most common predictors were age, duration of infertility, and ovarian and tubal problems. Conclusion: A prediction model can be clinically applied if it can be statistically evaluated and has good validation for treatment success. To achieve better results, the physician's and the couple's estimation of the treatment success rate should be based on history, examination and clinical tests. Models must be checked for theoretical approach and appropriate validation. The advantages of applying prediction models are the decrease in cost and time, avoiding painful treatment of patients, assessment of the treatment approach for physicians, and decision making for health managers. The selection of the approach for designing and using these models is inevitable. PMID:27141461

  18. Background model systematics for the Fermi GeV excess

    CERN Document Server

    Calore, Francesca; Weniger, Christoph

    2014-01-01

    The possible gamma-ray excess in the inner Galaxy and the Galactic center (GC) suggested by Fermi-LAT observations has triggered a large number of studies. It has been interpreted as a variety of different phenomena such as a signal from WIMP dark matter annihilation, gamma-ray emission from a population of millisecond pulsars, or emission from cosmic rays injected in a sequence of burst-like events or continuously at the GC. We present the first comprehensive study of model systematics coming from the Galactic diffuse emission in the inner part of our Galaxy and their impact on the inferred properties of the excess emission at Galactic latitudes $2^\\circ<|b|<20^\\circ$ and 300 MeV to 500 GeV. We study both theoretical and empirical model systematics, which we deduce from a large range of Galactic diffuse emission models and a principal component analysis of residuals in numerous test regions along the Galactic plane. We show that the hypothesis of an extended spherical excess emission with a uniform ene...
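The principal component analysis of residuals mentioned above can be sketched with plain SVD on synthetic residual maps. This is illustrative only; the actual test-region definitions and diffuse-emission residuals are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic residual maps from many test regions, stacked as rows: one
# shared systematic pattern with region-dependent amplitude, plus noise.
n_regions, n_pixels = 60, 100
pattern = np.sin(np.linspace(0, 3 * np.pi, n_pixels))   # shared systematic mode
amps = rng.normal(0, 1.0, n_regions)                    # per-region amplitude
residuals = np.outer(amps, pattern) + rng.normal(0, 0.1, (n_regions, n_pixels))

# PCA via SVD of the mean-centered residual matrix.
centered = residuals - residuals.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)        # variance fraction per component
pc1 = vt[0]                            # leading systematic residual pattern
```

The leading components recovered this way characterise the empirical model systematics, which is the role the principal component analysis plays in the abstract's approach.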

  19. Systematic assignment of thermodynamic constraints in metabolic network models

    Directory of Open Access Journals (Sweden)

    Heinemann Matthias

    2006-11-01

    Full Text Available Abstract Background The availability of genome sequences for many organisms enabled the reconstruction of several genome-scale metabolic network models. Currently, significant efforts are put into the automated reconstruction of such models. For this, several computational tools have been developed that particularly assist in identifying and compiling the organism-specific lists of metabolic reactions. In contrast, the last step of the model reconstruction process, which is the definition of the thermodynamic constraints in terms of reaction directionalities, still needs to be done manually. No computational method exists that allows for an automated and systematic assignment of reaction directions in genome-scale models. Results We present an algorithm that – based on thermodynamics, network topology and heuristic rules – automatically assigns reaction directions in metabolic models such that the reaction network is thermodynamically feasible with respect to the production of energy equivalents. It first exploits all available experimentally derived Gibbs energies of formation to identify irreversible reactions. As these thermodynamic data are not available for all metabolites, in a next step, further reaction directions are assigned on the basis of network topology considerations and thermodynamics-based heuristic rules. Briefly, the algorithm identifies reaction subsets from the metabolic network that are able to convert low-energy co-substrates into their high-energy counterparts and thus net produce energy. Our algorithm aims at disabling such thermodynamically infeasible cyclic operation of reaction subnetworks by assigning reaction directions based on a set of thermodynamics-derived heuristic rules. We demonstrate our algorithm on a genome-scale metabolic model of E. coli. The introduced systematic direction assignment yielded 130 irreversible reactions (out of 920 total reactions), which corresponds to about 70% of all irreversible …
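The first, thermodynamics-based step of such a direction assignment, marking a reaction irreversible when its Gibbs energy keeps a definite sign even within its uncertainty, can be sketched as follows. The reaction names, energies, and threshold are hypothetical, not taken from the paper:

```python
# Hypothetical reaction Gibbs energies with uncertainties (kJ/mol).
dG0 = {
    "hexokinase": (-17.0, 2.0),
    "phosphoglucose_isomerase": (2.5, 3.0),
    "pyruvate_kinase": (-24.0, 3.5),
}

def assign_direction(dg, sigma, threshold=0.0):
    """Irreversible forward if even dg + sigma stays below the threshold,
    irreversible backward if dg - sigma stays above it; else reversible."""
    if dg + sigma < threshold:
        return "irreversible_forward"
    if dg - sigma > threshold:
        return "irreversible_backward"
    return "reversible"

directions = {r: assign_direction(dg, s) for r, (dg, s) in dG0.items()}
```

Reactions whose energy band straddles zero stay reversible at this stage; the abstract's algorithm then resolves those using network topology and heuristic rules, which are not sketched here.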

  20. The underestimated potential of solar energy to mitigate climate change

    Science.gov (United States)

    Creutzig, Felix; Agoston, Peter; Goldschmidt, Jan Christoph; Luderer, Gunnar; Nemet, Gregory; Pietzcker, Robert C.

    2017-09-01

    The Intergovernmental Panel on Climate Change's fifth assessment report emphasizes the importance of bioenergy and carbon capture and storage for achieving climate goals, but it does not identify solar energy as a strategically important technology option. That is surprising given the strong growth, large resource, and low environmental footprint of photovoltaics (PV). Here we explore how models have consistently underestimated PV deployment and identify the reasons for underlying bias in models. Our analysis reveals that rapid technological learning and technology-specific policy support were crucial to PV deployment in the past, but that future success will depend on adequate financing instruments and the management of system integration. We propose that with coordinated advances in multiple components of the energy system, PV could supply 30-50% of electricity in competitive markets.
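The "rapid technological learning" the authors identify is conventionally modeled with Wright's law: cost falls by a fixed fraction for every doubling of cumulative deployed capacity. The 20% learning rate below is a commonly cited assumption for PV modules, not a figure from this paper:

```python
import math

# Wright's law: cost(Q) = cost(Q0) * (Q / Q0) ** (-b), where each doubling
# of cumulative capacity Q cuts cost by the learning rate.
learning_rate = 0.20                    # assumed: 20% cost drop per doubling
b = -math.log2(1 - learning_rate)       # learning exponent

def cost(cumulative, c0=1.0, q0=1.0):
    """Unit cost after cumulative capacity grows from q0 to `cumulative`."""
    return c0 * (cumulative / q0) ** (-b)

# Ten doublings of cumulative capacity -> cost falls to 0.8**10 ~ 11% of start.
c_after = cost(2 ** 10)
```

Models that understate deployment also understate these doublings, and hence overstate future PV cost, which is one mechanism behind the bias the abstract describes.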

  1. A systematic review of predictive modeling for bronchiolitis.

    Science.gov (United States)

    Luo, Gang; Nkoy, Flory L; Gesteland, Per H; Glasgow, Tiffany S; Stone, Bryan L

    2014-10-01

    Bronchiolitis is the most common cause of illness leading to hospitalization in young children. At present, many bronchiolitis management decisions are made subjectively, leading to significant practice variation among hospitals and physicians caring for children with bronchiolitis. To standardize care for bronchiolitis, researchers have proposed various models to predict the disease course to help determine a proper management plan. This paper reviews the existing state of the art of predictive modeling for bronchiolitis. Predictive modeling for respiratory syncytial virus (RSV) infection is covered whenever appropriate, as RSV accounts for about 70% of bronchiolitis cases. A systematic review was conducted through a PubMed search up to April 25, 2014. The literature on predictive modeling for bronchiolitis was retrieved using a comprehensive search query, which was developed through an iterative process. Search results were limited to human subjects, the English language, and children (birth to 18 years). The literature search returned 2312 references in total. After manual review, 168 of these references were determined to be relevant and are discussed in this paper. We identify several limitations and open problems in predictive modeling for bronchiolitis, and provide some preliminary thoughts on how to address them, with the hope to stimulate future research in this domain. Many problems remain open in predictive modeling for bronchiolitis. Future studies will need to address them to achieve optimal predictive models. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. A systematic review of animal models for Staphylococcus aureus osteomyelitis

    Directory of Open Access Journals (Sweden)

    W Reizner

    2014-03-01

    Full Text Available Staphylococcus aureus (S. aureus) osteomyelitis is a significant complication for orthopaedic patients undergoing surgery, particularly with fracture fixation and arthroplasty. Given the difficulty in studying S. aureus infections in human subjects, animal models serve an integral role in exploring the pathogenesis of osteomyelitis, and aid in determining the efficacy of prophylactic and therapeutic treatments. Animal models should mimic the clinical scenarios seen in patients as closely as possible to permit the experimental results to be translated to the corresponding clinical care. To help understand existing animal models of S. aureus, we conducted a systematic search of PubMed and Ovid MEDLINE to identify in vivo animal experiments that have investigated the management of S. aureus osteomyelitis in the context of fractures and metallic implants. In this review, experimental studies are categorised by animal species and are further classified by the setting of the infection. Study methods are summarised and the relevant advantages and disadvantages of each species and model are discussed. While no ideal animal model exists, the understanding of a model’s strengths and limitations should assist clinicians and researchers to appropriately select an animal model to translate the conclusions to the clinical setting.

  3. Maturity Models in Supply Chain Sustainability: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Elisabete Correia

    2017-01-01

    Full Text Available A systematic literature review of supply chain maturity models with sustainability concerns is presented. The objective is to give insights into methodological issues related to maturity models, namely the research objectives; the research methods used to develop, validate and test them; the scope; and the main characteristics associated with their design. The literature review was performed based on journal articles and conference papers from 2000 to 2015 using the SCOPUS, Emerald Insight, EBSCO and Web of Science databases. Most of the analysed papers have as main objective the development of maturity models and their validation. The case study is the methodology that is most widely used by researchers to develop and validate maturity models. From the sustainability perspective, the scope of the analysed maturity models is the Triple Bottom Line (TBL) and the environmental dimension, focusing on a specific process (eco-design and new product development) and without a broad SC perspective. The dominant characteristics associated with the design of the maturity models are the maturity grids and a continuous representation. In addition, results do not allow identifying a trend for a specific number of maturity levels. The comprehensive review, analysis, and synthesis of the maturity model literature represent an important contribution to the organization of this research area, making it possible to clarify some confusion that exists about concepts, approaches and components of maturity models in sustainability. Various aspects associated with the maturity models (i.e., research objectives, research methods, scope and characteristics of the design of models) are explored to contribute to the evolution and significance of this multidimensional area.

  4. Cost Underestimation in Public Works Projects

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren L.

    This article presents results from the first statistically significant study of cost escalation in transportation infrastructure projects. Based on a sample of 258 transportation infrastructure projects worth $90 billion (U.S.), it is found with overwhelming statistical significance that the cost estimates used to decide whether important infrastructure should be built are highly and systematically misleading. The result is continuous cost escalation of billions of dollars. The sample used in the study is the largest of its kind, allowing for the first time statistically valid conclusions regarding … honest numbers should not trust the cost estimates and cost-benefit analyses produced by project promoters and their analysts. Independent estimates and analyses are needed, as are institutional checks and balances to curb deception…

  5. Underestimating Costs in Public Works Projects

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren L.

    2002-01-01

    This article presents results from the first statistically significant study of cost escalation in transportation infrastructure projects. Based on a sample of 258 transportation infrastructure projects worth $90 billion (U.S.), it is found with overwhelming statistical significance that the cost estimates used to decide whether important infrastructure should be built are highly and systematically misleading. The result is continuous cost escalation of billions of dollars. The sample used in the study is the largest of its kind, allowing for the first time statistically valid conclusions regarding … honest numbers should not trust the cost estimates and cost-benefit analyses produced by project promoters and their analysts. Independent estimates and analyses are needed, as are institutional checks and balances to curb deception…

  6. Systematic multiscale models for deep convection on mesoscales

    Energy Technology Data Exchange (ETDEWEB)

    Klein, Rupert [Freie Universitaet Berlin and Potsdam Institute for Climate Impact Research, FB Mathematik and Informatik, Berlin (Germany); Majda, Andrew J. [New York University, Courant Institute of Mathematical Sciences, New York, NY (United States)

    2006-11-15

This paper builds on recent developments of a unified asymptotic approach to meteorological modeling [ZAMM, 80: 765-777, 2000; SIAM Proc. App. Math. 116, 227-289, 2004], which was used successfully in the development of systematic multiscale models for the tropics in Majda and Klein [J. Atmosph. Sci. 60: 393-408, 2003], Majda and Biello [PNAS, 101: 4736-4741, 2004], and Biello and Majda [J. Atmosph. Sci. 62: 1694-1720, 2005]. Here we account for typical bulk microphysics parameterizations of moist processes within this framework. The key steps are careful nondimensionalization of the bulk microphysics equations and the choice of appropriate distinguished limits for the various nondimensional small parameters that appear. We are then in a position to study scale interactions in the atmosphere involving moist physics. We demonstrate this by developing two systematic multiscale models that are motivated by our interest in mesoscale organized convection. The emphasis here is on multiple length scales but common time scales. The first of these models describes the short-time evolution of slender, deep convective hot towers with horizontal scale of ~1 km interacting with the linearized momentum balance on length and time scales of (10 km/3 min). We expect this model to describe how convective inhibition may be overcome near the surface, how the onset of deep convection triggers convective-scale gravity waves, and that it will also yield new insight into how such local convective events may conspire to create larger-scale strong storms. The second model addresses the next larger range of length and time scales (10 km, 100 km, and 20 min) and exhibits mathematical features that are strongly reminiscent of mesoscale organized convection. In both cases, the asymptotic analysis reveals how the stiffness of condensation/evaporation processes induces highly nonlinear dynamics. Besides providing new theoretical insights, the derived models may also serve as a

  7. A Systematic Literature Review of Agile Maturity Model Research

    Directory of Open Access Journals (Sweden)

    Vaughan Henriques

    2017-02-01

Full Text Available Background/Aim/Purpose: A commonly implemented software process improvement framework is the capability maturity model integrated (CMMI). Existing literature indicates higher levels of CMMI maturity could result in a loss of agility due to its organizational focus. To maintain agility, research has focussed attention on agile maturity models. The objective of this paper is to find the common research themes and conclusions in agile maturity model research. Methodology: This research adopts a systematic approach to agile maturity model research, using Google Scholar, Science Direct, and IEEE Xplore as sources. In total 531 articles were initially found matching the search criteria, which were filtered to 39 articles by applying specific exclusion criteria. Contribution: The article highlights the trends in agile maturity model research, specifically bringing to light the lack of research providing validation of such models. Findings: Two major themes emerge, being the coexistence of agile and CMMI and the development of agile principle-based maturity models. The research trend indicates an increase in agile maturity model articles, particularly in the latter half of the last decade, with concentrations of research coinciding with version updates of CMMI. While there is general consensus around higher CMMI maturity levels being incompatible with true agility, there is evidence of the two coexisting when agile is introduced into already highly matured environments. Future Research: Future research direction for this topic should include how to attain higher levels of CMMI maturity using only agile methods, how governance is addressed in agile environments, and whether existing agile maturity models relate to improved project success.

  8. Systematic approach to verification and validation: High explosive burn models

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Laboratory; Scovel, Christina A. [Los Alamos National Laboratory

    2012-04-16

    , run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models; for example, equations of state models and material strength models.

  9. Using Laser Scanners to Augment the Systematic Error Pointing Model

    Science.gov (United States)

    Wernicke, D. R.

    2016-08-01

    The antennas of the Deep Space Network (DSN) rely on precise pointing algorithms to communicate with spacecraft that are billions of miles away. Although the existing systematic error pointing model is effective at reducing blind pointing errors due to static misalignments, several of its terms have a strong dependence on seasonal and even daily thermal variation and are thus not easily modeled. Changes in the thermal state of the structure create a separation from the model and introduce a varying pointing offset. Compensating for this varying offset is possible by augmenting the pointing model with laser scanners. In this approach, laser scanners mounted to the alidade measure structural displacements while a series of transformations generate correction angles. Two sets of experiments were conducted in August 2015 using commercially available laser scanners. When compared with historical monopulse corrections under similar conditions, the computed corrections are within 3 mdeg of the mean. However, although the results show promise, several key challenges relating to the sensitivity of the optical equipment to sunlight render an implementation of this approach impractical. Other measurement devices such as inclinometers may be implementable at a significantly lower cost.

  10. Testing flow diversion in animal models: a systematic review.

    Science.gov (United States)

    Fahed, Robert; Raymond, Jean; Ducroux, Célina; Gentric, Jean-Christophe; Salazkin, Igor; Ziegler, Daniela; Gevry, Guylaine; Darsaut, Tim E

    2016-04-01

Flow diversion (FD) is increasingly used to treat intracranial aneurysms. We sought to systematically review published studies to assess the quality of reporting and summarize the results of FD in various animal models. Databases were searched to retrieve all animal studies on FD from 2000 to 2015. Extracted data included species and aneurysm models, aneurysm and neck dimensions, type of flow diverter, occlusion rates, and complications. Articles were evaluated using a checklist derived from the Animal Research: Reporting of In Vivo Experiments (ARRIVE) guidelines. Forty-two articles reporting the results of FD in nine different aneurysm models were included. The rabbit elastase-induced aneurysm model was the most commonly used, with 3-month occlusion rates of 73.5% (95% CI [61.9-82.6%]). FD of surgical sidewall aneurysms, constructed in rabbits or canines, resulted in high occlusion rates (100% [65.5-100%]). FD resulted in modest occlusion rates (15.4% [8.9-25.1%]) when tested in six complex canine aneurysm models designed to reproduce more difficult clinical contexts (large necks, bifurcation, or fusiform aneurysms). Adverse events, including branch occlusion, were rarely reported. There were no hemorrhagic complications. Articles complied with 20.8 ± 3.9 of 41 ARRIVE items; only a small number used randomization (3/42 articles [7.1%]) or a control group (13/42 articles [30.9%]). Preclinical studies on FD have shown various results. Occlusion of elastase-induced aneurysms was common after FD. The model is not challenging but is standardized in many laboratories. Failures of FD can be reproduced in less standardized but more challenging surgical canine constructions. The quality of reporting could be improved.
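
The pooled rates above are quoted as point estimates with 95% confidence intervals. As a minimal sketch of how such an interval for a single proportion can be obtained — using the Wilson score interval, which is not necessarily the pooling method the review itself applied — with illustrative counts (50 of 68 occluded, chosen only to land near the quoted 73.5%):

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Illustrative counts: 50 of 68 aneurysms occluded at 3 months
lo, hi = wilson_ci(50, 68)
print(f"{50 / 68:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

Unlike the naive normal-approximation interval, the Wilson interval stays inside [0, 1] even for rates near 100%, which matters for the sidewall-aneurysm result quoted above.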

  11. Stem cells in animal asthma models: a systematic review.

    Science.gov (United States)

    Srour, Nadim; Thébaud, Bernard

    2014-12-01

    Asthma control frequently falls short of the goals set in international guidelines. Treatment options for patients with poorly controlled asthma despite inhaled corticosteroids and long-acting β-agonists are limited, and new therapeutic options are needed. Stem cell therapy is promising for a variety of disorders but there has been no human clinical trial of stem cell therapy for asthma. We aimed to systematically review the literature regarding the potential benefits of stem cell therapy in animal models of asthma to determine whether a human trial is warranted. The MEDLINE and Embase databases were searched for original studies of stem cell therapy in animal asthma models. Nineteen studies were selected. They were found to be heterogeneous in their design. Mesenchymal stromal cells were used before sensitization with an allergen, before challenge with the allergen and after challenge, most frequently with ovalbumin, and mainly in BALB/c mice. Stem cell therapy resulted in a reduction of bronchoalveolar lavage fluid inflammation and eosinophilia as well as Th2 cytokines such as interleukin-4 and interleukin-5. Improvement in histopathology such as peribronchial and perivascular inflammation, epithelial thickness, goblet cell hyperplasia and smooth muscle layer thickening was universal. Several studies showed a reduction in airway hyper-responsiveness. Stem cell therapy decreases eosinophilic and Th2 inflammation and is effective in several phases of the allergic response in animal asthma models. Further study is warranted, up to human clinical trials. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  12. A Method for Systematic Improvement of Stochastic Grey-Box Models

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2004-01-01

    A systematic framework for improving the quality of continuous time models of dynamic systems based on experimental data is presented. The framework is based on an interplay between stochastic differential equation modelling, statistical tests and nonparametric modelling and provides features...

  13. Background model systematics for the Fermi GeV excess

    Energy Technology Data Exchange (ETDEWEB)

    Calore, Francesca; Cholis, Ilias; Weniger, Christoph

    2015-03-01

The possible gamma-ray excess in the inner Galaxy and the Galactic center (GC) suggested by Fermi-LAT observations has triggered a large number of studies. It has been interpreted as a variety of different phenomena such as a signal from WIMP dark matter annihilation, gamma-ray emission from a population of millisecond pulsars, or emission from cosmic rays injected in a sequence of burst-like events or continuously at the GC. We present the first comprehensive study of model systematics coming from the Galactic diffuse emission in the inner part of our Galaxy and their impact on the inferred properties of the excess emission at Galactic latitudes 2° < |b| < 20° and 300 MeV to 500 GeV. We study both theoretical and empirical model systematics, which we deduce from a large range of Galactic diffuse emission models and a principal component analysis of residuals in numerous test regions along the Galactic plane. We show that the hypothesis of an extended spherical excess emission with a uniform energy spectrum is compatible with the Fermi-LAT data in our region of interest at 95% CL. Assuming that this excess is the extended counterpart of the one seen in the inner few degrees of the Galaxy, we derive a lower limit of 10.0° (95% CL) on its extension away from the GC. We show that, in light of the large correlated uncertainties that affect the subtraction of the Galactic diffuse emission in the relevant regions, the energy spectrum of the excess is equally compatible with both a simple broken power law with break energy E(break) = 2.1 ± 0.2 GeV, and with spectra predicted by the self-annihilation of dark matter, implying in the case of b b̄ final states a dark matter mass of m(χ) = 49 (+6.4/−5.4) GeV.
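
The broken power law quoted above can be written as dN/dE ∝ (E/E_break)^(−n1) below the break and (E/E_break)^(−n2) above it, continuous at E_break. A sketch using the abstract's E_break = 2.1 GeV; the slopes n1, n2 and the normalization N0 are illustrative placeholders, not the paper's fitted values:

```python
import numpy as np

def broken_power_law(E, E_break=2.1, n1=1.4, n2=2.6, N0=1.0):
    """dN/dE falling as E^-n1 below E_break and E^-n2 above it (E in GeV).
    E_break is taken from the abstract; n1, n2, N0 are illustrative only."""
    E = np.asarray(E, dtype=float)
    return np.where(E < E_break,
                    N0 * (E / E_break) ** (-n1),
                    N0 * (E / E_break) ** (-n2))

E = np.array([0.5, 2.1, 10.0])      # GeV
print(broken_power_law(E))
```

Normalizing both branches to E/E_break makes the spectrum automatically continuous at the break, so only the two slopes and one overall normalization need fitting.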

  14. EARLINET dust observations vs. BSC-DREAM8b modeled profiles: 12-year-long systematic comparison at Potenza, Italy

    Science.gov (United States)

    Mona, L.; Papagiannopoulos, N.; Basart, S.; Baldasano, J.; Binietoglou, I.; Cornacchia, C.; Pappalardo, G.

    2014-08-01

In this paper, we report the first systematic comparison of 12-year modeled dust extinction profiles vs. Raman lidar measurements. We use the BSC-DREAM8b model, one of the most widely used dust regional models in the Mediterranean, and Potenza EARLINET lidar profiles for Saharan dust cases, the largest one-site database of dust extinction profiles. A total of 310 dust cases were compared for the May 2000-July 2012 period. The model reconstructs the measured layers well: profiles are correlated within 5% of significance for 60% of the cases, and the dust layer center of mass as measured by lidar and modeled by BSC-DREAM8b differs on average by 0.3 ± 1.0 km. Events with a dust optical depth lower than 0.1 account for 70% of uncorrelated profiles. Although there is good agreement in terms of profile shape and the order of magnitude of extinction values, the model overestimates the occurrence of dust layer tops above 10 km. Comparison with extinction profiles measured by the Raman lidar shows that BSC-DREAM8b typically underestimates the dust extinction coefficient, in particular below 3 km. The lowest model-observation differences (below 17%) correspond to a lidar ratio at 532 nm of 60 ± 13 sr and an Ångström exponent at 355/532 nm of 0.1 ± 0.6. These are in agreement with values typically observed and modeled for pure desert dust. However, the highest differences (higher than 85%) are typically related to greater Ångström values (0.5 ± 0.6), denoting smaller particles. All these aspects indicate that the level of agreement decreases with an increase in mixing/modification processes.
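
The dust-layer center of mass compared above is, in essence, the extinction-weighted mean height of the profile. A minimal sketch on a synthetic Gaussian layer (grid, layer height, and width are all illustrative, not EARLINET data):

```python
import numpy as np

# Synthetic dust extinction profile on a uniform height grid (illustrative values)
z = np.arange(0.0, 10.0, 0.1)                    # height above station, km
alpha = np.exp(-0.5 * ((z - 3.0) / 0.8) ** 2)    # Gaussian dust layer centered at 3 km

# Extinction-weighted mean height = dust layer center of mass
com = float(np.sum(alpha * z) / np.sum(alpha))
print(f"layer center of mass: {com:.2f} km")
```

Applying the same weighted mean to a measured and a modeled profile on a common grid gives the per-case center-of-mass difference that the comparison statistics above summarize.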

  15. U.S. Deaths from Cervical Cancer May Be Underestimated

    Science.gov (United States)

    ... page: https://medlineplus.gov/news/fullstory_163192.html U.S. Deaths From Cervical Cancer May Be Underestimated Rates ... women were factored out, Rositch's team found that U.S. cervical cancer deaths are 77 percent higher among ...

  16. Systematic development of reduced reaction mechanisms for dynamic modeling

    Science.gov (United States)

    Frenklach, M.; Kailasanath, K.; Oran, E. S.

    1986-01-01

A method for systematically developing a reduced chemical reaction mechanism for dynamic modeling of chemically reactive flows is presented. The method is based on the postulate that if a reduced reaction mechanism faithfully describes the time evolution of both thermal and chain reaction processes characteristic of a more complete mechanism, then the reduced mechanism will describe the chemical processes in a chemically reacting flow with approximately the same degree of accuracy. Here this postulate is tested by producing a series of mechanisms of reduced accuracy, which are derived from a full detailed mechanism for methane-oxygen combustion. These mechanisms were then tested in a series of reactive flow calculations in which a large-amplitude sinusoidal perturbation is applied to a system that is initially quiescent and whose temperature is high enough to start ignition processes. Comparison of the results for systems with and without convective flow shows that this approach produces reduced mechanisms that are useful for calculations of explosions and detonations. Extensions and applicability to flames are discussed.

  17. Bayesian analysis of an anisotropic universe model: systematics and polarization

    CERN Document Server

    Groeneboom, Nicolaas E; Wehus, Ingunn Kathrine; Eriksen, Hans Kristian

    2009-01-01

We revisit the anisotropic universe model previously developed by Ackerman, Carroll and Wise (ACW), and generalize both the theoretical and computational framework to include polarization and various forms of systematic effects. We apply our new tools to simulated WMAP data in order to understand the potential impact of asymmetric beams, noise mis-estimation and potential Zodiacal light emission. We find that none of these has any significant impact on the results. We next show that the previously reported ACW signal is also present in the 1-year WMAP temperature sky map presented by Liu & Li, where data cuts are more aggressive. Finally, we reanalyze the 5-year WMAP data taking into account a previously neglected (-i)^{l-l'} term in the signal covariance matrix. We still find a strong detection of a preferred direction in the temperature map. Including multipoles up to l=400, the anisotropy amplitude for the W-band is found to be g = 0.29 ± 0.031, nonzero at 9 sigma. However, the corresponding preferred direc...

  18. Systematic flood modelling to support flood-proof urban design

    Science.gov (United States)

    Bruwier, Martin; Mustafa, Ahmed; Aliaga, Daniel; Archambeau, Pierre; Erpicum, Sébastien; Nishida, Gen; Zhang, Xiaowei; Pirotton, Michel; Teller, Jacques; Dewals, Benjamin

    2017-04-01

Urban flood risk is influenced by many factors such as hydro-meteorological drivers, existing drainage systems as well as vulnerability of population and assets. The urban fabric itself has also a complex influence on inundation flows. In this research, we performed a systematic analysis of how various characteristics of urban patterns control inundation flow within the urban area and upstream of it. An urban generator tool was used to generate over 2,250 synthetic urban networks of 1 km². This tool is based on the procedural modelling presented by Parish and Müller (2001), which was adapted to generate a broader variety of urban networks. Nine input parameters were used to control the urban geometry. Three of them define the average length, orientation and curvature of the streets. Two orthogonal major roads, for which the width constitutes the fourth input parameter, work as constraints to generate the urban network. The width of secondary streets is given by the fifth input parameter. Each parcel generated by the street network, based on a parcel mean area parameter, can be either a park or a building parcel depending on the park ratio parameter. Three setback parameters constrain the exact location of the building within a building parcel. For each synthetic urban network, detailed two-dimensional inundation maps were computed with a hydraulic model. The computational efficiency was enhanced by means of a porosity model. This enables the use of a coarser computational grid, while preserving information on the detailed geometry of the urban network (Sanders et al. 2008). These porosity parameters reflect not only the void fraction, which influences the storage capacity of the urban area, but also the influence of buildings on flow conveyance (dynamic effects). A sensitivity analysis was performed based on the inundation maps to highlight the respective impact of each input parameter characterizing the urban networks. The findings of the study pinpoint

  19. Underestimated effect sizes in GWAS: fundamental limitations of single SNP analysis for dichotomous phenotypes.

    Directory of Open Access Journals (Sweden)

    Sven Stringer

Full Text Available Complex diseases are often highly heritable. However, for many complex traits only a small proportion of the heritability can be explained by observed genetic variants in traditional genome-wide association (GWA) studies. Moreover, for some of those traits few significant SNPs have been identified. Single SNP association methods test for association at a single SNP, ignoring the effect of other SNPs. We show using a simple multi-locus odds model of complex disease that moderate to large effect sizes of causal variants may be estimated as relatively small effect sizes in single SNP association testing. This underestimation effect is most severe for diseases influenced by numerous risk variants. We relate the underestimation effect to the concept of non-collapsibility found in the statistics literature. As described, continuous phenotypes generated with linear genetic models are not affected by this underestimation effect. Since many GWA studies apply single SNP analysis to dichotomous phenotypes, previously reported results potentially underestimate true effect sizes, thereby impeding identification of true effect SNPs. Therefore, when a multi-locus model of disease risk is assumed, a multi SNP analysis may be more appropriate.
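
The underestimation effect described above (non-collapsibility of the odds ratio) can be reproduced in a small simulation: generate disease status from a multi-locus logistic model, then estimate one SNP's effect from its marginal 2x2 table while ignoring the other loci — the marginal log-odds ratio comes out below the conditional per-SNP effect used to generate the data. This is a sketch, not the authors' code; all parameter values (baseline risk, allele frequency, effect size) are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, beta = 500_000, 10, 0.4        # individuals, causal SNPs, conditional log-OR
                                     # (all parameter values are illustrative)

g = rng.binomial(1, 0.3, size=(n, m))                 # carrier status at each SNP
p = 1.0 / (1.0 + np.exp(-(-2.0 + beta * g.sum(axis=1))))
y = rng.binomial(1, p)                                # disease status

# Marginal single-SNP log-OR for SNP 0 from its 2x2 table, ignoring the other loci
a = np.sum((g[:, 0] == 1) & (y == 1)); b = np.sum((g[:, 0] == 1) & (y == 0))
c = np.sum((g[:, 0] == 0) & (y == 1)); d = np.sum((g[:, 0] == 0) & (y == 0))
marginal = np.log((a * d) / (b * c))

print(f"conditional log-OR = {beta}, marginal single-SNP log-OR = {marginal:.3f}")
```

Because the other nine loci are not conditioned on, the marginal estimate is systematically attenuated even though the SNPs are independent; with more risk loci or larger effects the attenuation grows, matching the abstract's claim that the effect is most severe for highly polygenic diseases.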

  20. Estimating systematic continuous-time trends in recidivism using a non-gaussian panel data model

    NARCIS (Netherlands)

    Koopman, S.J.; Ooms, M.; Montfort, van K.; Geest, van der W.

    2008-01-01

    We model panel data of crime careers of juveniles from a Dutch Judicial Juvenile Institution. The data are decomposed into a systematic and an individual-specific component, of which the systematic component reflects the general time-varying conditions including the criminological climate. Within a

  1. Telephone surveys underestimate cigarette smoking among African-Americans

    Directory of Open Access Journals (Sweden)

Hope Landrine

    2013-09-01

Full Text Available Background. This study tested the hypothesis that data from random digit-dial telephone surveys underestimate the prevalence of cigarette smoking among African-American adults. Method. A novel community-sampling method was used to obtain a statewide, random sample of N = 2118 California (CA) African-American/Black adults, surveyed door-to-door. This Black community sample was compared to the Blacks in the CA Health Interview Survey (N = 2315), a statewide, random digit-dial telephone survey conducted simultaneously. Results. Smoking prevalence was significantly higher among community (33%) than among telephone-survey (19%) Blacks, even after controlling for sample differences in demographics. Conclusions. Telephone surveys underestimate smoking among African-Americans and probably underestimate other health risk behaviors as well. Alternative methods are needed to obtain accurate data on African-American health behaviors and on the magnitude of racial disparities in them.
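
The reported community-vs-telephone gap (33% of 2118 vs 19% of 2315) can be checked with a standard two-sample z-test for proportions. A sketch with counts reconstructed by rounding the reported percentages — an approximation, not the authors' actual analysis, which additionally controlled for demographics:

```python
from math import sqrt, erfc

def two_prop_z(k1, n1, k2, n2):
    """Two-sample z-test for equality of two proportions (pooled standard error)."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))     # two-sided
    return z, p_value

# Counts reconstructed from the reported 33% of 2118 and 19% of 2315 (approximate)
z, p = two_prop_z(round(0.33 * 2118), 2118, round(0.19 * 2315), 2315)
print(f"z = {z:.1f}, two-sided p = {p:.1e}")
```

Even this unadjusted comparison puts the difference far beyond conventional significance, consistent with the abstract's conclusion that the gap survives controlling for demographics.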

  2. Critical appraisal and data extraction for systematic reviews of prediction modelling studies: the CHARMS checklist.

    Directory of Open Access Journals (Sweden)

    Karel G M Moons

    2014-10-01

    Full Text Available Carl Moons and colleagues provide a checklist and background explanation for critically appraising and extracting data from systematic reviews of prognostic and diagnostic prediction modelling studies. Please see later in the article for the Editors' Summary.

  3. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    Science.gov (United States)

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  4. Black carbon in the Arctic: the underestimated role of gas flaring and residential combustion emissions

    Directory of Open Access Journals (Sweden)

    A. Stohl

    2013-09-01

annual mean Arctic BC surface concentrations due to residential combustion by 68% when using daily emissions. A large part (93%) of this systematic increase can be captured also when using monthly emissions; the increase is compensated by a decreased BC burden at lower latitudes. In a comparison with BC measurements at six Arctic stations, we find that using daily-varying residential combustion emissions and introducing gas flaring emissions leads to large improvements of the simulated Arctic BC, both in terms of mean concentration levels and simulated seasonality. Case studies based on BC and carbon monoxide (CO) measurements from the Zeppelin observatory appear to confirm flaring as an important BC source that can produce pollution plumes in the Arctic with a high BC / CO enhancement ratio, as expected for this source type. BC measurements taken during a research ship cruise in the White, Barents and Kara seas north of the region with strong flaring emissions reveal very high concentrations of the order of 200–400 ng m−3. The model underestimates these concentrations substantially, which indicates that the flaring emissions (and probably also other emissions in northern Siberia) are rather under- than overestimated in our emission data set. Our results suggest that it may not be "vertical transport that is too strong or scavenging rates that are too low" and "opposite biases in these processes" in the Arctic and elsewhere in current aerosol models, as suggested in a recent review article (Bond et al., Bounding the role of black carbon in the climate system: a scientific assessment, J. Geophys. Res., 2013), but missing emission sources and lacking time resolution of the emission data that are causing opposite model biases in simulated BC concentrations in the Arctic and in the mid-latitudes.

  5. Black carbon in the Arctic: the underestimated role of gas flaring and residential combustion emissions

    Science.gov (United States)

    Stohl, A.; Klimont, Z.; Eckhardt, S.; Kupiainen, K.; Shevchenko, V. P.; Kopeikin, V. M.; Novigatsky, A. N.

    2013-09-01

    BC surface concentrations due to residential combustion by 68% when using daily emissions. A large part (93%) of this systematic increase can be captured also when using monthly emissions; the increase is compensated by a decreased BC burden at lower latitudes. In a comparison with BC measurements at six Arctic stations, we find that using daily-varying residential combustion emissions and introducing gas flaring emissions leads to large improvements of the simulated Arctic BC, both in terms of mean concentration levels and simulated seasonality. Case studies based on BC and carbon monoxide (CO) measurements from the Zeppelin observatory appear to confirm flaring as an important BC source that can produce pollution plumes in the Arctic with a high BC / CO enhancement ratio, as expected for this source type. BC measurements taken during a research ship cruise in the White, Barents and Kara seas north of the region with strong flaring emissions reveal very high concentrations of the order of 200-400 ng m-3. The model underestimates these concentrations substantially, which indicates that the flaring emissions (and probably also other emissions in northern Siberia) are rather under- than overestimated in our emission data set. Our results suggest that it may not be "vertical transport that is too strong or scavenging rates that are too low" and "opposite biases in these processes" in the Arctic and elsewhere in current aerosol models, as suggested in a recent review article (Bond et al., Bounding the role of black carbon in the climate system: a scientific assessment, J. Geophys. Res., 2013), but missing emission sources and lacking time resolution of the emission data that are causing opposite model biases in simulated BC concentrations in the Arctic and in the mid-latitudes.

  6. Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis

    Science.gov (United States)

    2016-07-31

Technical Note by Matthew ...; reporting period 04 January 2016 to 31 July 2016; distribution unlimited.

  7. Factors associated with parental underestimation of child's weight status.

    Science.gov (United States)

    Warkentin, Sarah; Mais, Laís A; Latorre, Maria do Rosário D O; Carnell, Susan; Taddei, José Augusto A C

    2017-08-18

    The aim of this study was to examine the prevalence of parental misperception of child weight status, and identify socioeconomic, anthropometric, behavioral and dietary factors associated with underestimation. Cross-sectional study. Data was collected in 14 Brazilian private schools. Parents of children aged 2-8 years (n=976) completed a self-reported questionnaire assessing their perception of their child's weight status, and sociodemographic, anthropometric, behavioral and dietary information. To measure the agreement between parental perception about child weight status and actual child weight status, the Kappa coefficient was estimated, and to investigate associations between parental underestimation and independent variables, chi-squared tests were performed, followed by multiple logistic regression, considering p≤0.05 for statistical significance. Overall, 48.05% of the parents incorrectly classified their child's weight. Specifically, 45.08% underestimated their child's weight status, with just 3% of parents overestimating. Children with higher body mass index (OR=2.03; p<0.001) and boys (OR=1.70; p<0.001) were more likely to have their weight status underestimated by parents. Since awareness of weight problems is essential for prevention and treatment, clinical practitioners should help parents at high risk of misperception to correctly evaluate their child's weight status. Copyright © 2017 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.
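
Agreement between parental perception and measured weight status was quantified above with the Kappa coefficient. A generic sketch of Cohen's kappa on a hypothetical 3x3 measured-vs-perceived table — the counts below are made up for illustration (chosen only to total n = 976 with heavy underestimation of overweight children) and are not the study's data:

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a square agreement table."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    p_obs = np.trace(t) / n                                  # observed agreement
    p_exp = (t.sum(axis=0) * t.sum(axis=1)).sum() / n**2     # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical table: rows = measured status, cols = parental perception
# (under / normal / overweight); counts are illustrative, not the study's data
table = [[30, 15, 0],
         [10, 480, 5],
         [1, 415, 20]]
k = cohens_kappa(table)
print(f"kappa = {k:.2f}")
```

Kappa corrects raw percent agreement for agreement expected by chance, so a table in which most parents rate overweight children as normal (as in this illustration) yields a kappa near zero even when over half the classifications match.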

  8. Statistical Inference Models for Image Datasets with Systematic Variations.

    Science.gov (United States)

    Kim, Won Hwa; Bendlin, Barbara B; Chung, Moo K; Johnson, Sterling C; Singh, Vikas

    2015-06-01

Statistical analysis of longitudinal or cross sectional brain imaging data to identify effects of neurodegenerative diseases is a fundamental task in various studies in neuroscience. However, when there are systematic variations in the images due to parameter changes such as changes in the scanner protocol, hardware changes, or when combining data from multi-site studies, the statistical analysis becomes problematic. Motivated by this scenario, the goal of this paper is to develop a unified statistical solution to the problem of systematic variations in statistical image analysis. Based in part on recent literature in harmonic analysis on diffusion maps, we propose an algorithm which compares operators that are resilient to the systematic variations. These operators are derived from the empirical measurements of the image data and provide an efficient surrogate to capturing the actual changes across images. We also establish a connection between our method and the design of wavelets in non-Euclidean space. To evaluate the proposed ideas, we present various experimental results on detecting changes in simulations as well as show how the method offers improved statistical power in the analysis of real longitudinal PIB-PET imaging data acquired from participants at risk for Alzheimer's disease (AD).

  9. Statistical Inference Models for Image Datasets with Systematic Variations

    Science.gov (United States)

    Kim, Won Hwa; Bendlin, Barbara B.; Chung, Moo K.; Johnson, Sterling C.; Singh, Vikas

    2016-01-01

    Statistical analysis of longitudinal or cross-sectional brain imaging data to identify effects of neurodegenerative diseases is a fundamental task in various studies in neuroscience. However, when there are systematic variations in the images due to parameter changes such as changes in the scanner protocol, hardware changes, or when combining data from multi-site studies, the statistical analysis becomes problematic. Motivated by this scenario, the goal of this paper is to develop a unified statistical solution to the problem of systematic variations in statistical image analysis. Based in part on recent literature in harmonic analysis on diffusion maps, we propose an algorithm which compares operators that are resilient to the systematic variations. These operators are derived from the empirical measurements of the image data and provide an efficient surrogate for capturing the actual changes across images. We also establish a connection between our method and the design of wavelets in non-Euclidean space. To evaluate the proposed ideas, we present various experimental results on detecting changes in simulations as well as show how the method offers improved statistical power in the analysis of real longitudinal PIB-PET imaging data acquired from participants at risk for Alzheimer’s disease (AD). PMID:26989336

  10. Orthodontic measurements on digital study models compared with plaster models: a systematic review.

    Science.gov (United States)

    Fleming, P S; Marinho, V; Johal, A

    2011-02-01

    The aim of this study is to evaluate the validity of the use of digital models to assess tooth size, arch length, irregularity index, arch width and crowding versus measurements generated on hand-held plaster models with digital callipers in patients with and without malocclusion. Studies comparing linear and angular measurements obtained on digital and standard plaster models were identified by searching multiple databases including MEDLINE, LILACS, BBO, ClinicalTrials.gov, the National Research Register and Pro-Quest Dissertation Abstracts and Thesis database, without restrictions relating to publication status or language of publication. Two authors were involved in study selection, quality assessment and the extraction of data. Items from the Quality Assessment of Studies of Diagnostic Accuracy included in Systematic Reviews checklist were used to assess the methodological quality of included studies. No meta-analysis was conducted. Comparisons between measurements of digital and plaster models made directly within studies were reported, and the difference between the (repeated) measurement means for digital and plaster models were considered as estimates. Seventeen relevant studies were included. Where reported, overall, the absolute mean differences between direct and indirect measurements on plaster and digital models were minor and clinically insignificant. Orthodontic measurements with digital models were comparable to those derived from plaster models. The use of digital models as an alternative to conventional measurement on plaster models may be recommended, although the evidence identified in this review is of variable quality. © 2010 John Wiley & Sons A/S.

  11. Systematic approach for the identification of process reference models

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2009-02-01

    Full Text Available Process models are used in different application domains to capture knowledge on the process flow. Process reference models (PRM) are used to capture reusable process models, which should simplify the identification process of process models...

  12. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    -friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend and/or adopt a model. This is based on the idea of model reuse, which emphasizes the use...... and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer...... aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation; 4) model transfer – export/import to/from other application for further extension and application – several types of formats, such as XML...

  13. An information theory approach to minimise correlated systematic uncertainty in modelling resonance parameters

    Energy Technology Data Exchange (ETDEWEB)

    Krishna Kumar, P.T. [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology, 2-12-1, O-Okayama, Meguro-Ku, Tokyo 152-8550 (Japan)], E-mail: gstptk@yahoo.co.in; Sekimoto, Hiroshi [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology, 2-12-1, O-Okayama, Meguro-Ku, Tokyo 152-8550 (Japan)], E-mail: hsekimot@nr.titech.ac.jp

    2009-02-15

    Covariance matrix elements depict the statistical and systematic uncertainties in reactor parameter measurements. Efforts have so far been devoted only to minimising the statistical uncertainty through repeated measurements, while the dominant systematic uncertainty has either been neglected or randomized. In recent years, efforts have been devoted to simulating the resonance parameter uncertainty information through covariance matrices in the code SAMMY. However, the code does not have any provision to check the reliability of the simulated covariance data. We propose a new entropy-based information theory approach to reduce the systematic uncertainty in the correlation matrix elements so that resonance parameters with minimum systematic uncertainty can be modelled. We apply our information theory approach to generating the resonance parameters of {sup 156}Gd with reduced systematic uncertainty and demonstrate the superiority of our technique over the principal component analysis method.

  14. Understanding Systematics in ZZ Ceti Model Fitting to Enable Differential Seismology

    Science.gov (United States)

    Fuchs, J. T.; Dunlap, B. H.; Clemens, J. C.; Meza, J. A.; Dennihy, E.; Koester, D.

    2017-03-01

    We are conducting a large spectroscopic survey of over 130 Southern ZZ Cetis with the Goodman Spectrograph on the SOAR Telescope. Because it employs a single instrument with high UV throughput, this survey will both improve the signal-to-noise of the sample of SDSS ZZ Cetis and provide a uniform dataset for model comparison. We are paying special attention to systematics in the spectral fitting and quantify three of those systematics here. We show that relative positions in the log g -Teff plane are consistent for these three systematics.

  15. Understanding Systematics in ZZ Ceti Model Fitting to Enable Differential Seismology

    CERN Document Server

    Fuchs, J T; Clemens, J C; Meza, J A; Dennihy, E; Koester, D

    2016-01-01

    We are conducting a large spectroscopic survey of over 130 Southern ZZ Cetis with the Goodman Spectrograph on the SOAR Telescope. Because it employs a single instrument with high UV throughput, this survey will both improve the signal-to-noise of the sample of SDSS ZZ Cetis and provide a uniform dataset for model comparison. We are paying special attention to systematics in the spectral fitting and quantify three of those systematics here. We show that relative positions in the $\\log{g}$-$T_{\\rm eff}$ plane are consistent for these three systematics.

  16. Data from selective harvests underestimate temporal trends in quantitative traits.

    Science.gov (United States)

    Pelletier, Fanie; Festa-Bianchet, Marco; Jorgenson, Jon T

    2012-10-23

    Human harvests can select against phenotypes favoured by natural selection, and natural resource managers should evaluate possible artificial selection on wild populations. Because the required genetic data are extremely difficult to gather, however, managers typically rely on harvested animals to document temporal trends. It is usually unknown whether these data are unbiased. We explore our ability to detect a decline in horn size of bighorn sheep (Ovis canadensis) by comparing harvested males with all males in a population where evolutionary changes owing to trophy hunting were previously reported. Hunting records underestimated the temporal decline, partly because of an increasing proportion of rams that could not be harvested because their horns were smaller than the threshold set by hunting regulations. If harvests are selective, temporal trends measured from harvest records will underestimate the magnitude of changes in wild populations.
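
The bias described above is easy to reproduce in a toy simulation: if only animals above a legal size threshold enter the harvest records, the recorded decline in mean horn size is smaller than the true population decline, because truncation props up the late-period mean. A rough sketch with hypothetical numbers (the threshold, means and spread are invented, not the bighorn data):

```python
import random

random.seed(42)

THRESHOLD = 70.0  # hypothetical legal minimum horn length (cm)

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical population: true mean horn size declines by 10 cm over the study.
early = [random.gauss(80, 8) for _ in range(500)]
late  = [random.gauss(70, 8) for _ in range(500)]

true_decline = mean(early) - mean(late)

# "Harvest records" see only rams above the legal threshold.
harvest_decline = (mean([x for x in early if x > THRESHOLD])
                   - mean([x for x in late if x > THRESHOLD]))

print(round(true_decline, 1), round(harvest_decline, 1))
assert harvest_decline < true_decline  # truncated records shrink the apparent trend
```

The later the period, the larger the fraction of rams below the threshold, so the harvested sample becomes progressively less representative — the same mechanism the authors report.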

  17. What Pacific tsunamis tell us about underestimating hazard and risk

    Science.gov (United States)

    Goff, J. R.

    2011-12-01

    occurred in prehistory than have occurred in historic time. Larger events have occurred in prehistory than have been modelled in any probabilistic tsunami hazard assessment. This talk presents work that draws together much of the geological, TEK and archaeological evidence for such events and makes one key point - palaeotsunamis have had significant impact upon prehistoric coastal settlements and culture. We are currently underestimating the hazard and this does little to help in our assessment of the risk.

  18. The electron donating capacity of biochar is dramatically underestimated

    Science.gov (United States)

    Prévoteau, Antonin; Ronsse, Frederik; Cid, Inés; Boeckx, Pascal; Rabaey, Korneel

    2016-09-01

    Biochars have gathered considerable interest for agronomic and engineering applications. In addition to their high sorption ability, biochars have been shown to accept or donate considerable amounts of electrons to/from their environment via abiotic or microbial processes. Here, we measured the electron accepting (EAC) and electron donating (EDC) capacities of wood-based biochars pyrolyzed at three different highest treatment temperatures (HTTs: 400, 500, 600 °C) via hydrodynamic electrochemical techniques using a rotating disc electrode. EACs and EDCs varied with HTT in accordance with a previous report, with a maximal EAC at 500 °C (0.4 mmol(e-).gchar-1) and a large decrease of EDC with HTT. However, while we measured EAC values similar to those in the preceding study, we show that the EDCs have been underestimated by at least 1 order of magnitude, up to 7 mmol(e-).gchar-1 for an HTT of 400 °C. We attribute this underestimation to unnoticed slow kinetics of electron transfer from biochars to the dissolved redox mediators used in the monitoring. The EDC of other soil organic constituents such as humic substances may also have been underestimated. These results imply that the redox properties of biochars may have a much bigger impact on soil biogeochemical processes than previously conjectured.

  19. Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Niko Speybroeck

    2013-11-01

    Full Text Available Background: The emergence and evolution of socioeconomic inequalities in health involve multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new frameworks for socioeconomic inequalities in health.

  20. Rank-Defect Adjustment Model for Survey-Line Systematic Errors in Marine Survey Net

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In this paper, the structure of systematic and random errors in a marine survey net is discussed in detail and the adjustment method for observations of a marine survey net is studied, in which the rank-defect characteristic is identified for the first time. On the basis of the survey-line systematic error model, the formulae of the rank-defect adjustment model are deduced according to modern adjustment theory. An example calculation with real observed data is carried out to demonstrate the efficiency of this adjustment model. Moreover, it is proved that the semi-systematic error correction method currently used in marine gravimetry in China is a special case of the adjustment model presented in this paper.
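
The rank defect mentioned in the abstract can be illustrated in a few lines: when each observation is modelled as a station value plus a per-line systematic bias, a constant can be moved freely between the two parameter groups without changing any predicted observation, so the normal equations are singular until a datum constraint (e.g. forcing the line biases to sum to zero) is added. A minimal sketch with invented numbers:

```python
# Hypothetical crossover-adjustment setup: each observation at station s on
# survey line l is modelled as station value g_s plus a per-line bias b_l.
stations  = {"P1": 9.81234, "P2": 9.81301}    # "true" station values
line_bias = {"L1": 0.00020, "L2": -0.00015}   # per-line systematic errors

def predicted(stations, line_bias, obs_design):
    return [stations[s] + line_bias[l] for s, l in obs_design]

# Both lines visit both stations (crossover observations).
design = [("P1", "L1"), ("P2", "L1"), ("P1", "L2"), ("P2", "L2")]

# Shift every line bias by +c and every station value by -c: the predicted
# observations are unchanged, so the data cannot separate the two groups.
c = 0.005
shifted_stations = {s: v - c for s, v in stations.items()}
shifted_bias     = {l: b + c for l, b in line_bias.items()}

p1 = predicted(stations, line_bias, design)
p2 = predicted(shifted_stations, shifted_bias, design)
assert all(abs(a - b) < 1e-9 for a, b in zip(p1, p2))  # rank defect of one
```

This one-dimensional ambiguity is exactly why a free adjustment of such a net needs one extra condition (a datum) before a unique solution exists.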

  1. The Chain Information Model: a systematic approach for food product development

    NARCIS (Netherlands)

    Benner, M.

    2005-01-01

    The chain information model has been developed to increase the success rate of new food products. The uniqueness of this approach is that it approaches the problem from a chain perspective and starts with the consumer. The model can be used to analyse the production chain in a systematic way. This

  2. A Demonstration of a Systematic Item-Reduction Approach Using Structural Equation Modeling

    Science.gov (United States)

    Larwin, Karen; Harvey, Milton

    2012-01-01

    Establishing model parsimony is an important component of structural equation modeling (SEM). Unfortunately, little attention has been given to developing systematic procedures to accomplish this goal. To this end, the current study introduces an innovative application of the jackknife approach first presented in Rensvold and Cheung (1999). Unlike…

  3. The Chain Information Model: a systematic approach for food product development

    NARCIS (Netherlands)

    Benner, M.

    2005-01-01

    The chain information model has been developed to increase the success rate of new food products. The uniqueness of this approach is that it approaches the problem from a chain perspective and starts with the consumer. The model can be used to analyse the production chain in a systematic way. This r

  4. A Digital Tool Set for Systematic Model Design in Process-Engineering Education

    Science.gov (United States)

    van der Schaaf, Hylke; Tramper, Johannes; Hartog, Rob J.M.; Vermue, Marian

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material exists to meet this learning objective. For these…

  5. A digital tool set for systematic model design in process-engineering education

    NARCIS (Netherlands)

    Schaaf, van der H.; Tramper, J.; Hartog, R.J.M.; Vermuë, M.H.

    2006-01-01

    One of the objectives of the process technology curriculum at Wageningen University is that students learn how to design mathematical models in the context of process engineering, using a systematic problem analysis approach. Students find it difficult to learn to design a model and little material

  6. Systematic assignment of thermodynamic constraints in metabolic network models

    NARCIS (Netherlands)

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    Background: The availability of genome sequences for many organisms enabled the reconstruction of several genome-scale metabolic network models. Currently, significant efforts are put into the automated reconstruction of such models. For this, several computational tools have been developed that par

  7. Systematic evaluation of land use regression models for NO₂

    NARCIS (Netherlands)

    Wang, M.|info:eu-repo/dai/nl/345480279; Beelen, R.M.J.|info:eu-repo/dai/nl/30483100X; Eeftens, M.R.|info:eu-repo/dai/nl/315028300; Meliefste, C.; Hoek, G.|info:eu-repo/dai/nl/069553475; Brunekreef, B.|info:eu-repo/dai/nl/067548180

    2012-01-01

    Land use regression (LUR) models have become popular to explain the spatial variation of air pollution concentrations. Independent evaluation is important. We developed LUR models for nitrogen dioxide (NO(2)) using measurements conducted at 144 sampling sites in The Netherlands. Sites were randomly

  8. Model uncertainty and systematic risk in US banking

    NARCIS (Netherlands)

    Baele, L.T.M.; De Bruyckere, Valerie; De Jonghe, O.G.; Vander Vennet, Rudi

    2015-01-01

    This paper uses Bayesian Model Averaging to examine the driving factors of equity returns of US Bank Holding Companies. BMA has as an advantage over OLS that it accounts for the considerable uncertainty about the correct set (model) of bank risk factors. We find that out of a broad set of 12 risk fa

  9. Systematic modeling for free stators of rotary - Piezoelectric ultrasonic motors

    DEFF Research Database (Denmark)

    Mojallali, Hamed; Amini, Rouzbeh; Izadi-Zamanabadi, Roozbeh

    2007-01-01

    An equivalent circuit model with complex elements is presented in this paper to describe the free stator model of traveling wave piezoelectric motors. The mechanical, dielectric and piezoelectric losses associated with the vibrator are considered by introducing the imaginary part to the equivalent...

  10. Understanding in vivo modelling of depression in non-human animals: a systematic review protocol

    DEFF Research Database (Denmark)

    Bannach-Brown, Alexandra; Liao, Jing; Wegener, Gregers

    2016-01-01

    The aim of this study is to systematically collect all published preclinical non-human animal literature on depression to provide an unbiased overview of existing knowledge. A systematic search will be carried out in PubMed and Embase. Studies will be included if they use non-human animal...... experimental model(s) to induce or mimic a depressive-like phenotype. Data that will be extracted include the model or method of induction; species and gender of the animals used; the behavioural, anatomical, electrophysiological, neurochemical or genetic outcome measure(s) used; risk of bias....../quality of reporting; and any intervention(s) tested. There were no exclusion criteria based on language or date of publication. Automation techniques will be used, where appropriate, to reduce the human reviewer time. Meta-analyses will be conducted if feasible. This broad systematic review aims to gain a better...

  11. Risk models to predict hypertension: a systematic review.

    Directory of Open Access Journals (Sweden)

    Justin B Echouffo-Tcheugui

    Full Text Available BACKGROUND: As well as being a risk factor for cardiovascular disease, hypertension is also a health condition in its own right. Risk prediction models may be of value in identifying those individuals at risk of developing hypertension who are likely to benefit most from interventions. METHODS AND FINDINGS: To synthesize existing evidence on the performance of these models, we searched MEDLINE and EMBASE; examined bibliographies of retrieved articles; contacted experts in the field; and searched our own files. Dual review of identified studies was conducted. Included studies had to report on the development, validation, or impact analysis of a hypertension risk prediction model. For each publication, information was extracted on study design and characteristics, predictors, model discrimination, calibration and reclassification ability, validation and impact analysis. Eleven studies reporting on 15 different hypertension risk prediction models were identified. Age, sex, body mass index, diabetes status, and blood pressure variables were the most common predictor variables included in models. Most risk models had acceptable-to-good discriminatory ability (C-statistic > 0.70) in the derivation sample. Calibration was less commonly assessed, but overall acceptable. Two hypertension risk models, the Framingham and Hopkins, have been externally validated, displaying acceptable-to-good discrimination, with C-statistics ranging from 0.71 to 0.81. Lack of individual-level data precluded analyses of the risk models in subgroups. CONCLUSIONS: The discrimination ability of existing hypertension risk prediction tools is acceptable, but the impact of using these tools on prescriptions and outcomes of hypertension prevention is unclear.
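
The C-statistic reported throughout the review above has a simple nonparametric reading: it is the probability that a randomly chosen case received a higher predicted risk than a randomly chosen non-case, with ties counting one half. A minimal sketch with hypothetical risks and outcomes (not data from the reviewed models):

```python
def c_statistic(risks, outcomes):
    """C-statistic (AUC): fraction of case/non-case pairs in which the case
    received the higher predicted risk; ties count as half a concordance."""
    cases    = [r for r, y in zip(risks, outcomes) if y == 1]
    controls = [r for r, y in zip(risks, outcomes) if y == 0]
    concordant = sum((c > nc) + 0.5 * (c == nc) for c in cases for nc in controls)
    return concordant / (len(cases) * len(controls))

# Hypothetical predicted hypertension risks and observed outcomes (1 = developed).
risks    = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
outcomes = [1,   1,   0,   1,   0,   0]
print(round(c_statistic(risks, outcomes), 3))   # prints 0.889
```

A value of 0.5 is no better than chance; the 0.71-0.81 range quoted for the externally validated models means roughly three in four case/non-case pairs are ranked correctly.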

  12. Excised Abdominoplasty Material as a Systematic Plastic Surgical Training Model

    Directory of Open Access Journals (Sweden)

    M. Erol Demirseren

    2012-01-01

    Full Text Available Achieving a level of technical skill and confidence in surgical operations is the main goal of plastic surgical training. Operating rooms were traditionally accepted as the practical teaching venues of the apprenticeship model. However, the growing patient population, time constraints, and ethical and legal considerations have made practical training outside the operating room a necessity in plastic surgical training. There are several plastic surgical teaching models and simulators that are very useful for practical training before entering the operating room and for the evaluation of plastic surgery residents. The full-thickness skin with its vascular network excised in abdominoplasty procedures is an easily obtainable real human tissue that can be used as a training model in plastic surgery.

  13. A Systematic Modelling Framework for Phase Transfer Catalyst Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata; Sales-Cruz, Mauricio; Hyung Kim, Sun

    2016-01-01

    in an aqueous phase. These reacting systems are receiving increased attention as novel organic synthesis options due to their flexible operation, higher product yields, and ability to avoid hazardous or expensive solvents. Major considerations in the design and analysis of PTC systems are physical and chemical...... equilibria, as well as kinetic mechanisms and rates. This paper presents a modelling framework for design and analysis of PTC systems that requires a minimum amount of experimental data to develop and employ the necessary thermodynamic and reaction models and embeds them into a reactor model for simulation....... The application of the framework is made to two cases in order to highlight the performance and issues of activity coefficient models for predicting design and operation and the effects when different organic solvents are employed....

  14. Individuals underestimate moderate and vigorous intensity physical activity.

    Directory of Open Access Journals (Sweden)

    Karissa L Canning

    Full Text Available BACKGROUND: It is unclear whether the common physical activity (PA) intensity descriptors used in PA guidelines worldwide align with the associated percent heart rate maximum method used for prescribing relative PA intensities consistently between sexes, ethnicities, age categories and across body mass index (BMI) classifications. OBJECTIVES: The objectives of this study were to determine whether individuals properly select light, moderate and vigorous intensity PA using the intensity descriptions in PA guidelines and determine if there are differences in estimation across sex, ethnicity, age and BMI classifications. METHODS: 129 adults were instructed to walk/jog at a "light," "moderate" and "vigorous effort" in a randomized order. The PA intensities were categorized as being below, at or above the following %HRmax ranges: 50-63% for light, 64-76% for moderate and 77-93% for vigorous effort. RESULTS: On average, people correctly estimated light effort as 51.5±8.3%HRmax but underestimated moderate effort as 58.7±10.7%HRmax and vigorous effort as 69.9±11.9%HRmax. Participants walked at a light intensity (57.4±10.5%HRmax) when asked to walk at a pace that provided health benefits, wherein 52% of participants walked at a light effort pace, 19% walked at a moderate effort and 5% walked at a vigorous effort pace. These results did not differ by sex, ethnicity or BMI class. However, younger adults underestimated moderate and vigorous intensity more so than middle-aged adults (P<0.05). CONCLUSION: When the common PA guideline descriptors were aligned with the associated %HRmax ranges, the majority of participants underestimated the intensity of PA that is needed to obtain health benefits. Thus, new subjective descriptions for moderate and vigorous intensity may be warranted to aid individuals in correctly interpreting PA intensities.
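
Using the %HRmax bands quoted in the abstract (50-63% light, 64-76% moderate, 77-93% vigorous), the study's classification of a measured intensity as below, at or above a target band can be sketched as follows (the function name and structure are illustrative, not the authors' code):

```python
# %HRmax bands from the study: light 50-63, moderate 64-76, vigorous 77-93.
BANDS = {"light": (50, 63), "moderate": (64, 76), "vigorous": (77, 93)}

def classify(target, measured_pct_hrmax):
    """Return whether a measured %HRmax falls below, at or above the target band."""
    lo, hi = BANDS[target]
    if measured_pct_hrmax < lo:
        return "below"
    if measured_pct_hrmax > hi:
        return "above"
    return "at"

# Mean values reported in the abstract:
print(classify("light", 51.5))      # prints "at"    -- light effort judged correctly
print(classify("moderate", 58.7))   # prints "below" -- moderate effort underestimated
print(classify("vigorous", 69.9))   # prints "below" -- vigorous effort underestimated
```

Plugging in the reported means shows the pattern directly: only "light" lands inside its band, while "moderate" and "vigorous" fall below theirs.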

  15. Methods for determining the degree of underestimation or overrating of shares using PER analysis

    Directory of Open Access Journals (Sweden)

    Andreea Vasiliu

    2013-04-01

    Full Text Available The multiples method of company valuation has started to gain increasing credibility among specialists compared to traditional business valuation methods. There are many studies, both theoretical and empirical, that focus on this topic, especially on the accuracy of determining multiples and choosing the peer group (comparable company group selection). The objective of this research is to evaluate the shares listed on the Bucharest Stock Exchange by using the multiples method, more precisely, to determine the degree of underestimation or overrating of shares using PER analysis. The research methodology is constructivist, with an explanatory orientation. The methods and techniques used are quantitative analysis, the ARIMA model, correlation and regression. For data collection we used information from authorized institutions such as the Bucharest Stock Exchange, the National Agency of Fiscal Administration and the National Institute of Statistics. The research findings can be summarized as follows: calculating the growth rate using information from the balance sheet leads to an underestimation of the shares; PER leads to an overvaluation of shares; and compound averages with regression analysis provide the most plausible method of determining the degree of underestimation or overrating of shares for listed companies. The study contributes to the development of company valuation methods using multiples.
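
The core of the PER multiples method described above is simple: a peer-group average PER times the company's earnings per share gives an implied fair price, and the gap between market price and implied price measures under- or overvaluation. A minimal sketch with invented figures (not Bucharest Stock Exchange data):

```python
def per_valuation(market_price, eps, peer_pers):
    """Implied fair price from the peer group's average PER times EPS.
    Misvaluation > 0 means the share is overrated; < 0 means underestimated."""
    peer_per = sum(peer_pers) / len(peer_pers)
    implied = peer_per * eps
    misvaluation = (market_price - implied) / implied
    return implied, misvaluation

# Hypothetical listed company: market price 12.0, earnings per share 1.5,
# and a peer group trading at PERs of 10, 12 and 14.
implied, mis = per_valuation(12.0, 1.5, [10, 12, 14])
print(implied, round(mis * 100, 1))   # prints 18.0 -33.3 -> about 33% undervalued
```

In practice the quality of the result hinges entirely on the peer-group selection and on which average (simple, compound, regression-based) is used — the accuracy issues the study investigates.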

  16. Priapism and glucose-6-phosphate dehydrogenase deficiency: An underestimated correlation?

    Directory of Open Access Journals (Sweden)

    Aldo Franco De Rose

    2016-10-01

    Full Text Available Priapism is a rare clinical condition characterized by a persistent erection unrelated to sexual excitement. The etiology is often idiopathic. Three cases of priapism in patients with glucose-6-phosphate dehydrogenase (G6PD) deficiency have been described in the literature. We present the case of a 39-year-old man with glucose-6-phosphate dehydrogenase deficiency who presented to our department with non-ischemic priapism without an arteriolacunar fistula. We suggest that glucose-6-phosphate dehydrogenase deficiency could be an underestimated risk factor for priapism.

  17. Using logic model methods in systematic review synthesis: describing complex pathways in referral management interventions.

    Science.gov (United States)

    Baxter, Susan K; Blank, Lindsay; Woods, Helen Buckley; Payne, Nick; Rimmer, Melanie; Goyder, Elizabeth

    2014-05-10

    There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond that obtained via conventional review methods. This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short term outcomes, moderating and mediating factors and long term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes identified that may result from these interventions were changed physician or patient knowledge, beliefs or attitudes and also interventions related to changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. PROSPERO registration number: CRD42013004037.

  18. Underestimating nearby nature: affective forecasting errors obscure the happy path to sustainability.

    Science.gov (United States)

    Nisbet, Elizabeth K; Zelenski, John M

    2011-09-01

    Modern lifestyles disconnect people from nature, and this may have adverse consequences for the well-being of both humans and the environment. In two experiments, we found that although outdoor walks in nearby nature made participants much happier than indoor walks did, participants made affective forecasting errors, such that they systematically underestimated nature's hedonic benefit. The pleasant moods experienced on outdoor nature walks facilitated a subjective sense of connection with nature, a construct strongly linked with concern for the environment and environmentally sustainable behavior. To the extent that affective forecasts determine choices, our findings suggest that people fail to maximize their time in nearby nature and thus miss opportunities to increase their happiness and relatedness to nature. Our findings suggest a happy path to sustainability, whereby contact with nature fosters individual happiness and environmentally responsible behavior.

  19. Interdependence of Model Systematic Biases in the Tropical Atlantic and the Tropical Pacific

    Science.gov (United States)

    Demissie, Teferi; Shonk, Jon; Toniazzo, Thomas; Woolnough, Steve; Guilyardi, Eric

    2017-04-01

    The tropical climatology represented in simulations with General Circulation Models (GCMs) is affected by significant systematic biases despite the huge investments in model development over the past 20 years. In this study, coupled seasonal hindcasts performed with EC-Earth and ECMWF System 4 are analyzed to understand the development of systematic biases in the tropical Atlantic and Pacific oceans. These models use similar atmosphere and ocean components (IFS and NEMO, respectively). We focus on hindcasts initialized in February and May. We discuss possible mechanisms for the evolution and origin of rapidly developing systematic biases over the tropical Atlantic during boreal spring. In addition, we look for evidence of the interrelation of systematic biases in the Atlantic and Pacific, and investigate whether the errors in one ocean basin affect those in the other. We perform an upper-atmosphere wave analysis by Fourier filtering for certain ranges of temporal frequencies and zonal wavenumbers. Our results indicate common systematic biases in EC-Earth and System 4 attributable purely to the atmosphere component. Biases develop in the Atlantic basin independently of external influences, while a possible effect of such biases on the eastern Pacific is found.

  20. A Systematic Evaluation Model for Solar Cell Technologies

    Directory of Open Access Journals (Sweden)

    Chang-Fu Hsu

    2014-01-01

    Full Text Available Fossil fuels, including coal, petroleum, and natural gas, together with nuclear energy, are currently the primary electricity sources. However, with the depletion of fossil fuels, global warming, nuclear crises, and increasing environmental consciousness, the demand for renewable energy resources has skyrocketed. Solar energy is one of the most popular renewable energy resources for meeting global energy demands. Even though there are abundant studies on various solar technology developments, there is a lack of studies on solar technology evaluation and selection. Therefore, this research develops a model using interpretive structural modeling (ISM), the benefits, opportunities, costs, and risks concept (BOCR), and fuzzy analytic network process (FANP) to aggregate experts' opinions in evaluating currently available solar cell technologies. A case study in a photovoltaics (PV) firm is used to examine the practicality of the proposed model in selecting the most suitable technology for the firm in manufacturing new products.
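
    The core of ANP/AHP-family methods like the one above is deriving priority weights from pairwise comparison matrices and then aggregating benefit, opportunity, cost, and risk scores. The sketch below is a heavily simplified crisp stand-in (the paper's FANP fuzzifies the judgments); both function names are illustrative.

```python
def priority_weights(pairwise, iters=100):
    """Principal-eigenvector priorities from a pairwise comparison matrix,
    via power iteration -- the classic AHP step underlying ANP/FANP.
    """
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [v / s for v in w]              # normalize to sum to 1
    return w

def bocr_score(benefits, opportunities, costs, risks):
    """Multiplicative BOCR aggregation: alternatives with higher benefits
    and opportunities, and lower costs and risks, score higher."""
    return (benefits * opportunities) / (costs * risks)
```

    For a consistent 2x2 matrix saying alternative A is three times as preferable as B, the weights come out 0.75 and 0.25.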

  1. A systematic review of health manpower forecasting models.

    NARCIS (Netherlands)

    Martins-Coelho, G.; Greuningen, M. van; Barros, H.; Batenburg, R.

    2011-01-01

    Context: Health manpower planning (HMP) aims at matching health manpower (HM) supply to the population’s health requirements. To achieve this, HMP needs information on future HM supply and requirement (S&R). This is estimated by several different forecasting models (FMs). In this paper, we review

  2. Does WISC-IV Underestimate the Intelligence of Autistic Children?

    Science.gov (United States)

    Nader, Anne-Marie; Courchesne, Valérie; Dawson, Michelle; Soulières, Isabelle

    2016-05-01

    Wechsler Intelligence Scale for Children (WISC) is widely used to estimate autistic intelligence (Joseph in The neuropsychology of autism. Oxford University Press, Oxford, 2011; Goldstein et al. in Assessment of autism spectrum disorders. Guilford Press, New York, 2008; Mottron in J Autism Dev Disord 34(1):19-27, 2004). However, previous studies suggest that while WISC-III and Raven's Progressive Matrices (RPM) provide similar estimates of non-autistic intelligence, autistic children perform significantly better on RPM (Dawson et al. in Psychol Sci 18(8):657-662, doi: 10.1111/j.1467-9280.2007.01954.x , 2007). The latest WISC version introduces substantial changes in subtests and index scores; thus, we asked whether WISC-IV still underestimates autistic intelligence. Twenty-five autistic and 22 typical children completed WISC-IV and RPM. Autistic children's RPM scores were significantly higher than their WISC-IV FSIQ, but there was no significant difference in typical children. Further, autistic children showed a distinctively uneven WISC-IV index profile, with a "peak" in the new Perceptual Reasoning Index. In spite of major changes, WISC-IV FSIQ continues to underestimate autistic intelligence.

  3. Consequences of Underestimating Impalement Bicycle Handlebar Injuries in Children.

    Science.gov (United States)

    Ramos-Irizarry, Carmen T; Swain, Shakeva; Troncoso-Munoz, Samantha; Duncan, Malvina

    Impalement bicycle handlebar trauma injuries are rare; however, on initial assessment, they have the potential of being underestimated. We reviewed our prospective trauma database of 3,894 patients for all bicycle injuries from January 2010 to May 2015. Isolated pedal bike injuries were reported in 2.6% (N = 101) of the patients who were admitted to the trauma service. Fifteen patients suffered direct handlebar trauma. Patients were grouped into blunt trauma (n = 12) and impalement trauma (n = 3). We examined gender, age, injury severity score (ISS), Glasgow Coma Scale score, use of protective devices, need for surgical intervention, need for intensive care (ICU), and hospital length of stay. Mean age was 9.6 years. All children with penetrating injuries were males. Mean ISS was less than 9 in both groups. None of the children were wearing bicycle helmets. Three patients who sustained blunt injuries required ICU care due to associated injuries. All of the children with impalement injuries required several surgical interventions. These injuries included a traumatic direct inguinal hernia, a medial groin and thigh laceration with resultant femoral hernia, and a lateral deep thigh laceration. Impalement bicycle handlebar injuries must be thoroughly evaluated, with a similar importance given to blunt injuries. A high index of suspicion must be maintained when examining children with handlebar impalement injuries, as their injuries are at risk of being missed or underestimated.

  4. Topic Modeling in Sentiment Analysis: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Toqir Ahmad Rana

    2016-06-01

    Full Text Available With the expansion and acceptance of the World Wide Web, sentiment analysis has become a progressively popular research area in information retrieval and web data analysis. Due to the huge amount of user-generated content on blogs, forums, social media, etc., sentiment analysis has attracted researchers in both academia and industry, since it deals with the extraction of opinions and sentiments. In this paper, we present a review of topic modeling, especially LDA-based techniques, in sentiment analysis. We present a detailed analysis of diverse approaches and techniques, and compare the accuracy of different systems among them. The results of different approaches have been summarized, analyzed and presented in a sophisticated fashion. This work is an effort to explore different topic modeling techniques in the context of sentiment analysis and to impart a comprehensive comparison among them.
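
    The LDA family of topic models the review surveys assigns each token a latent topic and iteratively resamples those assignments from doc-topic and topic-word counts. A tiny collapsed Gibbs sampler, shown below, illustrates the mechanism; it is a pedagogical toy (real systems use optimized libraries such as gensim), and all names here are illustrative.

```python
import random
from collections import defaultdict

def lda_gibbs(docs, n_topics, n_iter=100, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed-Gibbs LDA over docs given as lists of tokens.
    Returns per-topic word counts; a toy sketch, not a production sampler."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]   # token topics
    ndk = [[0] * n_topics for _ in docs]                 # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]    # topic-word counts
    nk = [0] * n_topics                                  # topic totals
    for di, d in enumerate(docs):
        for wi, w in enumerate(d):
            t = z[di][wi]
            ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    for _ in range(n_iter):
        for di, d in enumerate(docs):
            for wi, w in enumerate(d):
                t = z[di][wi]                            # remove current token
                ndk[di][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                # unnormalized conditional P(topic | everything else)
                probs = [(ndk[di][k] + alpha) * (nkw[k][w] + beta)
                         / (nk[k] + V * beta) for k in range(n_topics)]
                r = rng.random() * sum(probs)
                acc, t = 0.0, n_topics - 1
                for k, p in enumerate(probs):
                    acc += p
                    if r <= acc:
                        t = k
                        break
                z[di][wi] = t                            # add it back
                ndk[di][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return nkw
```

    Sentiment-oriented LDA variants extend exactly this sampler with an extra sentiment variable per token.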

  5. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, have not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines.
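
    Full PBPK models couple many organ compartments with physiological flows; the minimal sketch below shows only the simplest member of that family, a one-compartment model with first-order oral absorption, integrated by forward Euler. It is a toy stand-in for the models the review catalogues, and the parameter names (ka, ke, vd) and values are illustrative.

```python
def one_compartment_oral(dose, ka, ke, vd, t_end=24.0, dt=0.01):
    """Toy one-compartment pharmacokinetic model with first-order
    absorption (rate ka, 1/h) and elimination (rate ke, 1/h).
    dose in mg, vd (volume of distribution) in L.
    Returns (times in h, plasma concentrations in mg/L)."""
    gut, central = dose, 0.0
    times, conc = [], []
    t = 0.0
    while t <= t_end:
        times.append(t)
        conc.append(central / vd)
        absorbed = ka * gut * dt          # drug leaving the gut depot
        eliminated = ke * central * dt    # first-order elimination
        gut -= absorbed
        central += absorbed - eliminated
        t += dt
    return times, conc
```

    The resulting curve rises to a single peak and then decays exponentially, matching the analytic Bateman solution for these parameters to within the Euler step error.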

  6. Effects of waveform model systematics on the interpretation of GW150914

    CERN Document Server

    Abbott, B P; Abbott, T D; Abernathy, M R; Acernese, F; Ackley, K; Adams, C; Adams, T; Addesso, P; Adhikari, R X; Adya, V B; Affeldt, C; Agathos, M; Agatsuma, K; Aggarwal, N; Aguiar, O D; Aiello, L; Ain, A; Ajith, P; Allen, B; Allocca, A; Altin, P A; Ananyeva, A; Anderson, S B; Anderson, W G; Appert, S; Arai, K; Araya, M C; Areeda, J S; Arnaud, N; Arun, K G; Ascenzi, S; Ashton, G; Ast, M; Aston, S M; Astone, P; Aufmuth, P; Aulbert, C; Avila-Alvarez, A; Babak, S; Bacon, P; Bader, M K M; Baker, P T; Baldaccini, F; Ballardin, G; Ballmer, S W; Barayoga, J C; Barclay, S E; Barish, B C; Barker, D; Barone, F; Barr, B; Barsotti, L; Barsuglia, M; Barta, D; Bartlett, J; Bartos, I; Bassiri, R; Basti, A; Batch, J C; Baune, C; Bavigadda, V; Bazzan, M; Beer, C; Bejger, M; Belahcene, I; Belgin, M; Bell, A S; Berger, B K; Bergmann, G; Berry, C P L; Bersanetti, D; Bertolini, A; Betzwieser, J; Bhagwat, S; Bhandare, R; Bilenko, I A; Billingsley, G; Billman, C R; Birch, J; Birney, R; Birnholtz, O; Biscans, S; Bisht, A; Bitossi, M; Biwer, C; Bizouard, M A; Blackburn, J K; Blackman, J; Blair, C D; Blair, D G; Blair, R M; Bloemen, S; Bock, O; Boer, M; Bogaert, G; Bohe, A; Bondu, F; Bonnand, R; Boom, B A; Bork, R; Boschi, V; Bose, S; Bouffanais, Y; Bozzi, A; Bradaschia, C; Brady, P R; Braginsky, V B; Branchesi, M; Brau, J E; Briant, T; Brillet, A; Brinkmann, M; Brisson, V; Brockill, P; Broida, J E; Brooks, A F; Brown, D A; Brown, D D; Brown, N M; Brunett, S; Buchanan, C C; Buikema, A; Bulik, T; Bulten, H J; Buonanno, A; Buskulic, D; Buy, C; Byer, R L; Cabero, M; Cadonati, L; Cagnoli, G; Cahillane, C; Bustillo, J Calder'on; Callister, T A; Calloni, E; Camp, J B; Cannon, K C; Cao, H; Cao, J; Capano, C D; Capocasa, E; Carbognani, F; Caride, S; Diaz, J Casanueva; Casentini, C; Caudill, S; Cavagli`a, M; Cavalier, F; Cavalieri, R; Cella, G; Cepeda, C B; Baiardi, L Cerboni; Cerretani, G; Cesarini, E; Chamberlin, S J; Chan, M; Chao, S; Charlton, P; Chassande-Mottin, E; Cheeseboro, B D; Chen, H 
Y; Chen, Y; Cheng, H -P; Chincarini, A; Chiummo, A; Chmiel, T; Cho, H S; Cho, M; Chow, J H; Christensen, N; Chu, Q; Chua, A J K; Chua, S; Chung, S; Ciani, G; Clara, F; Clark, J A; Cleva, F; Cocchieri, C; Coccia, E; Cohadon, P -F; Colla, A; Collette, C G; Cominsky, L; Constancio, M; Conti, L; Cooper, S J; Corbitt, T R; Cornish, N; Corsi, A; Cortese, S; Costa, C A; Coughlin, M W; Coughlin, S B; Coulon, J -P; Countryman, S T; Couvares, P; Covas, P B; Cowan, E E; Coward, D M; Cowart, M J; Coyne, D C; Coyne, R; Creighton, J D E; Creighton, T D; Cripe, J; Crowder, S G; Cullen, T J; Cumming, A; Cunningham, L; Cuoco, E; Canton, T Dal; Danilishin, S L; D'Antonio, S; Danzmann, K; Dasgupta, A; Costa, C F Da Silva; Dattilo, V; Dave, I; Davier, M; Davies, G S; Davis, D; Daw, E J; Day, B; Day, R; De, S; DeBra, D; Debreczeni, G; Degallaix, J; De Laurentis, M; Del'eglise, S; Del Pozzo, W; Denker, T; Dent, T; Dergachev, V; De Rosa, R; DeRosa, R T; DeSalvo, R; Devenson, J; Devine, R C; Dhurandhar, S; D'iaz, M C; Di Fiore, L; Di Giovanni, M; Di Girolamo, T; Di Lieto, A; Di Pace, S; Di Palma, I; Di Virgilio, A; Doctor, Z; Dolique, V; Donovan, F; Dooley, K L; Doravari, S; Dorrington, I; Douglas, R; 'Alvarez, M Dovale; Downes, T P; Drago, M; Drever, R W P; Driggers, J C; Du, Z; Ducrot, M; Dwyer, S E; Edo, T B; Edwards, M C; Effler, A; Eggenstein, H -B; Ehrens, P; Eichholz, J; Eikenberry, S S; Eisenstein, R A; Essick, R C; Etienne, Z; Etzel, T; Evans, M; Evans, T M; Everett, R; Factourovich, M; Fafone, V; Fair, H; Fairhurst, S; Fan, X; Farinon, S; Farr, B; Farr, W M; Fauchon-Jones, E J; Favata, M; Fays, M; Fehrmann, H; Fejer, M M; Galiana, A Fern'andez; Ferrante, I; Ferreira, E C; Ferrini, F; Fidecaro, F; Fiori, I; Fiorucci, D; Fisher, R P; Flaminio, R; Fletcher, M; Fong, H; Forsyth, S S; Fournier, J -D; Frasca, S; Frasconi, F; Frei, Z; Freise, A; Frey, R; Frey, V; Fries, E M; Fritschel, P; Frolov, V V; Fulda, P; Fyffe, M; Gabbard, H; Gadre, B U; Gaebel, S M; Gair, J R; Gammaitoni, L; 
Gaonkar, S G; Garufi, F; Gaur, G; Gayathri, V; Gehrels, N; Gemme, G; Genin, E; Gennai, A; George, J; Gergely, L; Germain, V; Ghonge, S; Ghosh, Abhirup; Ghosh, Archisman; Ghosh, S; Giaime, J A; Giardina, K D; Giazotto, A; Gill, K; Glaefke, A; Goetz, E; Goetz, R; Gondan, L; Gonz'alez, G; Castro, J M Gonzalez; Gopakumar, A; Gorodetsky, M L; Gossan, S E; Gosselin, M; Gouaty, R; Grado, A; Graef, C; Granata, M; Grant, A; Gras, S; Gray, C; Greco, G; Green, A C; Groot, P; Grote, H; Grunewald, S; Guidi, G M; Guo, X; Gupta, A; Gupta, M K; Gushwa, K E; Gustafson, E K; Gustafson, R; Hacker, J J; Hall, B R; Hall, E D; Hammond, G; Haney, M; Hanke, M M; Hanks, J; Hanna, C; Hannam, M D; Hanson, J; Hardwick, T; Harms, J; Harry, G M; Harry, I W; Hart, M J; Hartman, M T; Haster, C -J; Haughian, K; Healy, J; Heidmann, A; Heintze, M C; Heitmann, H; Hello, P; Hemming, G; Hendry, M; Heng, I S; Hennig, J; Henry, J; Heptonstall, A W; Heurs, M; Hild, S; Hoak, D; Hofman, D; Holt, K; Holz, D E; Hopkins, P; Hough, J; Houston, E A; Howell, E J; Hu, Y M; Huerta, E A; Huet, D; Hughey, B; Husa, S; Huttner, S H; Huynh-Dinh, T; Indik, N; Ingram, D R; Inta, R; Isa, H N; Isac, J -M; Isi, M; Isogai, T; Iyer, B R; Izumi, K; Jacqmin, T; Jani, K; Jaranowski, P; Jawahar, S; Jim'enez-Forteza, F; Johnson, W W; Jones, D I; Jones, R; Jonker, R J G; Ju, L; Junker, J; Kalaghatgi, C V; Kalogera, V; Kandhasamy, S; Kang, G; Kanner, J B; Karki, S; Karvinen, K S; Kasprzack, M; Katsavounidis, E; Katzman, W; Kaufer, S; Kaur, T; Kawabe, K; K'ef'elian, F; Keitel, D; Kelley, D B; Kennedy, R; Key, J S; Khalili, F Y; Khan, I; Khan, S; Khan, Z; Khazanov, E A; Kijbunchoo, N; Kim, Chunglee; Kim, J C; Kim, Whansun; Kim, W; Kim, Y -M; Kimbrell, S J; King, E J; King, P J; Kirchhoff, R; Kissel, J S; Klein, B; Kleybolte, L; Klimenko, S; Koch, P; Koehlenbeck, S M; Koley, S; Kondrashov, V; Kontos, A; Korobko, M; Korth, W Z; Kowalska, I; Kozak, D B; Kr"amer, C; Kringel, V; Krishnan, B; Kr'olak, A; Kuehn, G; Kumar, P; Kumar, R; Kuo, L; 
Kutynia, A; Lackey, B D; Landry, M; Lang, R N; Lange, J; Lantz, B; Lanza, R K; Lartaux-Vollard, A; Lasky, P D; Laxen, M; Lazzarini, A; Lazzaro, C; Leaci, P; Leavey, S; Lebigot, E O; Lee, C H; Lee, H K; Lee, H M; Lee, K; Lehmann, J; Lenon, A; Leonardi, M; Leong, J R; Leroy, N; Letendre, N; Levin, Y; Li, T G F; Libson, A; Littenberg, T B; Liu, J; Lockerbie, N A; Lombardi, A L; London, L T; Lord, J E; Lorenzini, M; Loriette, V; Lormand, M; Losurdo, G; Lough, J D; Lovelace, G; L"uck, H; Lundgren, A P; Lynch, R; Ma, Y; Macfoy, S; Machenschalk, B; MacInnis, M; Macleod, D M; Magana-Sandoval, F; Majorana, E; Maksimovic, I; Malvezzi, V; Man, N; Mandic, V; Mangano, V; Mansell, G L; Manske, M; Mantovani, M; Marchesoni, F; Marion, F; M'arka, S; M'arka, Z; Markosyan, A S; Maros, E; Martelli, F; Martellini, L; Martin, I W; Martynov, D V; Mason, K; Masserot, A; Massinger, T J; Masso-Reid, M; Mastrogiovanni, S; Matichard, F; Matone, L; Mavalvala, N; Mazumder, N; McCarthy, R; McClelland, D E; McCormick, S; McGrath, C; McGuire, S C; McIntyre, G; McIver, J; McManus, D J; McRae, T; McWilliams, S T; Meacher, D; Meadors, G D; Meidam, J; Melatos, A; Mendell, G; Mendoza-Gandara, D; Mercer, R A; Merilh, E L; Merzougui, M; Meshkov, S; Messenger, C; Messick, C; Metzdorff, R; Meyers, P M; Mezzani, F; Miao, H; Michel, C; Middleton, H; Mikhailov, E E; Milano, L; Miller, A L; Miller, A; Miller, B B; Miller, J; Millhouse, M; Minenkov, Y; Ming, J; Mirshekari, S; Mishra, C; Mitra, S; Mitrofanov, V P; Mitselmakher, G; Mittleman, R; Moggi, A; Mohan, M; Mohapatra, S R P; Montani, M; Moore, B C; Moore, C J; Moraru, D; Moreno, G; Morriss, S R; Mours, B; Mow-Lowry, C M; Mueller, G; Muir, A W; Mukherjee, Arunava; Mukherjee, D; Mukherjee, S; Mukund, N; Mullavey, A; Munch, J; Muniz, E A M; Murray, P G; Mytidis, A; Napier, K; Nardecchia, I; Naticchioni, L; Nelemans, G; Nelson, T J N; Neri, M; Nery, M; Neunzert, A; Newport, J M; Newton, G; Nguyen, T T; Nielsen, A B; Nissanke, S; Nitz, A; Noack, A; Nocera, F; 
Nolting, D; Normandin, M E N; Nuttall, L K; Oberling, J; Ochsner, E; Oelker, E; Ogin, G H; Oh, J J; Oh, S H; Ohme, F; Oliver, M; Oppermann, P; Oram, Richard J; O'Reilly, B; O'Shaughnessy, R; Ottaway, D J; Overmier, H; Owen, B J; Pace, A E; Page, J; Pai, A; Pai, S A; Palamos, J R; Palashov, O; Palomba, C; Pal-Singh, A; Pan, H; Pankow, C; Pannarale, F; Pant, B C; Paoletti, F; Paoli, A; Papa, M A; Paris, H R; Parker, W; Pascucci, D; Pasqualetti, A; Passaquieti, R; Passuello, D; Patricelli, B; Pearlstone, B L; Pedraza, M; Pedurand, R; Pekowsky, L; Pele, A; Penn, S; Perez, C J; Perreca, A; Perri, L M; Pfeiffer, H P; Phelps, M; Piccinni, O J; Pichot, M; Piergiovanni, F; Pierro, V; Pillant, G; Pinard, L; Pinto, I M; Pitkin, M; Poe, M; Poggiani, R; Popolizio, P; Post, A; Powell, J; Prasad, J; Pratt, J W W; Predoi, V; Prestegard, T; Prijatelj, M; Principe, M; Privitera, S; Prodi, G A; Prokhorov, L G; Puncken, O; Punturo, M; Puppo, P; P"urrer, M; Qi, H; Qin, J; Qiu, S; Quetschke, V; Quintero, E A; Quitzow-James, R; Raab, F J; Rabeling, D S; Radkins, H; Raffai, P; Raja, S; Rajan, C; Rakhmanov, M; Rapagnani, P; Raymond, V; Razzano, M; Re, V; Read, J; Regimbau, T; Rei, L; Reid, S; Reitze, D H; Rew, H; Reyes, S D; Rhoades, E; Ricci, F; Riles, K; Rizzo, M; Robertson, N A; Robie, R; Robinet, F; Rocchi, A; Rolland, L; Rollins, J G; Roma, V J; Romano, J D; Romano, R; Romie, J H; Rosi'nska, D; Rowan, S; R"udiger, A; Ruggi, P; Ryan, K; Sachdev, S; Sadecki, T; Sadeghian, L; Sakellariadou, M; Salconi, L; Saleem, M; Salemi, F; Samajdar, A; Sammut, L; Sampson, L M; Sanchez, E J; Sandberg, V; Sanders, J R; Sassolas, B; Sathyaprakash, B S; Saulson, P R; Sauter, O; Savage, R L; Sawadsky, A; Schale, P; Scheuer, J; Schmidt, E; Schmidt, J; Schmidt, P; Schnabel, R; Schofield, R M S; Sch"onbeck, A; Schreiber, E; Schuette, D; Schutz, B F; Schwalbe, S G; Scott, J; Scott, S M; Sellers, D; Sengupta, A S; Sentenac, D; Sequino, V; Sergeev, A; Setyawati, Y; Shaddock, D A; Shaffer, T J; Shahriar, M S; 
Shapiro, B; Shawhan, P; Sheperd, A; Shoemaker, D H; Shoemaker, D M; Siellez, K; Siemens, X; Sieniawska, M; Sigg, D; Silva, A D; Singer, A; Singer, L P; Singh, A; Singh, R; Singhal, A; Sintes, A M; Slagmolen, B J J; Smith, B; Smith, J R; Smith, R J E; Son, E J; Sorazu, B; Sorrentino, F; Souradeep, T; Spencer, A P; Srivastava, A K; Staley, A; Steinke, M; Steinlechner, J; Steinlechner, S; Steinmeyer, D; Stephens, B C; Stevenson, S P; Stone, R; Strain, K A; Straniero, N; Stratta, G; Strigin, S E; Sturani, R; Stuver, A L; Summerscales, T Z; Sun, L; Sunil, S; Sutton, P J; Swinkels, B L; Szczepa'nczyk, M J; Tacca, M; Talukder, D; Tanner, D B; T'apai, M; Taracchini, A; Taylor, R; Theeg, T; Thomas, E G; Thomas, M; Thomas, P; Thorne, K A; Thrane, E; Tippens, T; Tiwari, S; Tiwari, V; Tokmakov, K V; Toland, K; Tomlinson, C; Tonelli, M; Tornasi, Z; Torrie, C I; T"oyr"a, D; Travasso, F; Traylor, G; Trifir`o, D; Trinastic, J; Tringali, M C; Trozzo, L; Tse, M; Tso, R; Turconi, M; Tuyenbayev, D; Ugolini, D; Unnikrishnan, C S; Urban, A L; Usman, S A; Vahlbruch, H; Vajente, G; Valdes, G; van Bakel, N; van Beuzekom, M; Brand, J F J van den; Broeck, C Van Den; Vander-Hyde, D C; van der Schaaf, L; van Heijningen, J V; van Veggel, A A; Vardaro, M; Varma, V; Vass, S; Vas'uth, M; Vecchio, A; Vedovato, G; Veitch, J; Veitch, P J; Venkateswara, K; Venugopalan, G; Verkindt, D; Vetrano, F; Vicer'e, A; Viets, A D; Vinciguerra, S; Vine, D J; Vinet, J -Y; Vitale, S; Vo, T; Vocca, H; Vorvick, C; Voss, D V; Vousden, W D; Vyatchanin, S P; Wade, A R; Wade, L E; Wade, M; Walker, M; Wallace, L; Walsh, S; Wang, G; Wang, H; Wang, M; Wang, Y; Ward, R L; Warner, J; Was, M; Watchi, J; Weaver, B; Wei, L -W; Weinert, M; Weinstein, A J; Weiss, R; Wen, L; Wessels, P; Westphal, T; Wette, K; Whelan, J T; Whiting, B F; Whittle, C; Williams, D; Williams, R D; Williamson, A R; Willis, J L; Willke, B; Wimmer, M H; Winkler, W; Wipf, C C; Wittel, H; Woan, G; Woehler, J; Worden, J; Wright, J L; Wu, D S; Wu, G; Yam, W; 
Yamamoto, H; Yancey, C C; Yap, M J; Yu, Hang; Yu, Haocun; Yvert, M; Zadrożny, A; Zangrando, L; Zanolin, M; Zendri, J -P; Zevin, M; Zhang, L; Zhang, M; Zhang, T; Zhang, Y; Zhao, C; Zhou, M; Zhou, Z; Zhu, S J; Zhu, X J; Zucker, M E; Zweizig, J; Boyle, M; Chu, T; Hemberger, D; Hinder, I; Kidder, L E; Ossokine, S; Scheel, M; Szilagyi, B; Teukolsky, S; Vano-Vinuales, A

    2016-01-01

    Parameter estimates of GW150914 were obtained using Bayesian inference, based on three semi-analytic waveform models for binary black hole coalescences. These waveform models differ from each other in their treatment of black hole spins, and all three models make some simplifying assumptions, notably to neglect sub-dominant waveform harmonic modes and orbital eccentricity. Furthermore, while the models are calibrated to agree with waveforms obtained by full numerical solutions of Einstein's equations, any such calibration is accurate only to some non-zero tolerance and is limited by the accuracy of the underlying phenomenology, availability, quality, and parameter-space coverage of numerical simulations. This paper complements the original analyses of GW150914 with an investigation of the effects of possible systematic errors in the waveform models on estimates of its source parameters. To test for systematic errors we repeat the original Bayesian analyses on mock signals from numerical simulations of a serie...

  7. Prediction models for cardiovascular disease risk in the general population : Systematic review

    NARCIS (Netherlands)

    Damen, Johanna A A G; Hooft, Lotty; Schuit, Ewoud; Debray, Thomas P A; Collins, Gary S.; Tzoulaki, Ioanna; Lassale, Camille M.; Siontis, George C M; Chiocchia, Virginia; Roberts, Corran; Schlüssel, Michael Maia; Gerry, Stephen; Black, James A.; Heus, Pauline; Van Der Schouw, Yvonne T.; Peelen, Linda M.; Moons, Karel G M

    2016-01-01

    OBJECTIVE: To provide an overview of prediction models for risk of cardiovascular disease (CVD) in the general population. DESIGN: Systematic review. DATA SOURCES: Medline and Embase until June 2013. ELIGIBILITY CRITERIA FOR STUDY SELECTION: Studies describing the development or external validation

  8. A Systematic Approach to Modelling Change Processes in Construction Projects

    Directory of Open Access Journals (Sweden)

    Ibrahim Motawa

    2012-11-01

    Full Text Available Modelling change processes within construction projects is essential to implement changes efficiently. Incomplete information on the project variables at the early stages of projects leads to inadequate knowledge of future states and imprecision arising from ambiguity in project parameters. This lack of knowledge is considered among the main sources of changes in construction. Change identification and evaluation, in addition to predicting its impacts on project parameters, can help in minimising the disruptive effects of changes. This paper presents a systematic approach to modelling the change process within construction projects that helps improve change identification and evaluation. The approach represents the key decisions required to implement changes. The requirements of an effective change process are presented first. The variables defined for efficient change assessment and diagnosis are then presented. Assessment of construction changes requires an analysis of the project characteristics that lead to change and also analysis of the relationship between the change causes and effects. The paper concludes that, at the early stages of a project, projects with a high likelihood of change occurrence should have a control mechanism over the project characteristics that have high influence on the project. It also concludes, for the relationship between change causes and effects, that the multiple causes of change should be modelled in a way that enables evaluating the change effects more accurately. The proposed approach is the framework for tackling such conclusions and can be used for evaluating change cases depending on the available information at the early stages of construction projects.

  9. Effects of waveform model systematics on the interpretation of GW150914

    Science.gov (United States)

    Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Ananyeva, A.; Anderson, S. B.; Anderson, W. G.; Appert, S.; Arai, K.; Araya, M. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Avila-Alvarez, A.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; E Barclay, S.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Beer, C.; Bejger, M.; Belahcene, I.; Belgin, M.; Bell, A. S.; Berger, B. K.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Billman, C. R.; Birch, J.; Birney, R.; Birnholtz, O.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blackman, J.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Boer, M.; Bogaert, G.; Bohe, A.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; E Brau, J.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; E Broida, J.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Brunett, S.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cabero, M.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T. A.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, H.; Cao, J.; Capano, C. 
D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Cheeseboro, B. D.; Chen, H. Y.; Chen, Y.; Cheng, H.-P.; Chincarini, A.; Chiummo, A.; Chmiel, T.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, A. J. K.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Cocchieri, C.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conti, L.; Cooper, S. J.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Covas, P. B.; E Cowan, E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; E Creighton, J. D.; Creighton, T. D.; Cripe, J.; Crowder, S. G.; Cullen, T. J.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dasgupta, A.; Da Silva Costa, C. F.; Dattilo, V.; Dave, I.; Davier, M.; Davies, G. S.; Davis, D.; Daw, E. J.; Day, B.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Devenson, J.; Devine, R. C.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Girolamo, T.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Doctor, Z.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Dorrington, I.; Douglas, R.; Dovale Álvarez, M.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; E Dwyer, S.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Eisenstein, R. A.; Essick, R. C.; Etienne, Z.; Etzel, T.; Evans, M.; Evans, T. 
M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Farinon, S.; Farr, B.; Farr, W. M.; Fauchon-Jones, E. J.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Fernández Galiana, A.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fong, H.; Forsyth, S. S.; Fournier, J.-D.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fries, E. M.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H.; Gadre, B. U.; Gaebel, S. M.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gaur, G.; Gayathri, V.; Gehrels, N.; Gemme, G.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghonge, S.; Ghosh, Abhirup; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gorodetsky, M. L.; E Gossan, S.; Gosselin, M.; Gouaty, R.; Grado, A.; Graef, C.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; E Gushwa, K.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Healy, J.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Henry, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hofman, D.; Holt, K.; E Holz, D.; Hopkins, P.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Isogai, T.; Iyer, B. 
R.; Izumi, K.; Jacqmin, T.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; Junker, J.; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Karki, S.; Karvinen, K. S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kéfélian, F.; Keitel, D.; Kelley, D. B.; Kennedy, R.; Key, J. S.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, Chunglee; Kim, J. C.; Kim, Whansun; Kim, W.; Kim, Y.-M.; Kimbrell, S. J.; King, E. J.; King, P. J.; Kirchhoff, R.; Kissel, J. S.; Klein, B.; Kleybolte, L.; Klimenko, S.; Koch, P.; Koehlenbeck, S. M.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Krämer, C.; Kringel, V.; Krishnan, B.; Królak, A.; Kuehn, G.; Kumar, P.; Kumar, R.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lang, R. N.; Lange, J.; Lantz, B.; Lanza, R. K.; Lartaux-Vollard, A.; Lasky, P. D.; Laxen, M.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lehmann, J.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Liu, J.; Lockerbie, N. A.; Lombardi, A. L.; London, L. T.; E Lord, J.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lovelace, G.; Lück, H.; Lundgren, A. P.; Lynch, R.; Ma, Y.; Macfoy, S.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martynov, D. V.; Mason, K.; Masserot, A.; Massinger, T. 
J.; Masso-Reid, M.; Mastrogiovanni, S.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; McCarthy, R.; E McClelland, D.; McCormick, S.; McGrath, C.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McRae, T.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E. L.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Metzdorff, R.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; E Mikhailov, E.; Milano, L.; Miller, A. L.; Miller, A.; Miller, B. B.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mours, B.; Mow-Lowry, C. M.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Muniz, E. A. M.; Murray, P. G.; Mytidis, A.; Napier, K.; Nardecchia, I.; Naticchioni, L.; Nelemans, G.; Nelson, T. J. N.; Neri, M.; Nery, M.; Neunzert, A.; Newport, J. M.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Noack, A.; Nocera, F.; Nolting, D.; Normandin, M. E. N.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O'Reilly, B.; O'Shaughnessy, R.; Ottaway, D. J.; Overmier, H.; Owen, B. J.; E Pace, A.; Page, J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perez, C. J.; Perreca, A.; Perri, L. M.; Pfeiffer, H. P.; Phelps, M.; Piccinni, O. 
J.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poe, M.; Poggiani, R.; Popolizio, P.; Post, A.; Powell, J.; Prasad, J.; Pratt, J. W. W.; Predoi, V.; Prestegard, T.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L. G.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Qiu, S.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rajan, C.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Rhoades, E.; Ricci, F.; Riles, K.; Rizzo, M.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, J. D.; Romano, R.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Sakellariadou, M.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sampson, L. M.; Sanchez, E. J.; Sandberg, V.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Scheuer, J.; Schmidt, E.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Schwalbe, S. G.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Setyawati, Y.; Shaddock, D. A.; Shaffer, T. J.; Shahriar, M. S.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sieniawska, M.; Sigg, D.; Silva, A. D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, B.; Smith, J. R.; E Smith, R. J.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Spencer, A. P.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stevenson, S. P.; Stone, R.; Strain, K. 
A.; Straniero, N.; Stratta, G.; E Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sunil, S.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Taracchini, A.; Taylor, R.; Theeg, T.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thrane, E.; Tippens, T.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Toland, K.; Tomlinson, C.; Tonelli, M.; Tornasi, Z.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Trinastic, J.; Tringali, M. C.; Trozzo, L.; Tse, M.; Tso, R.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Varma, V.; Vass, S.; Vasúth, M.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Venugopalan, G.; Verkindt, D.; Vetrano, F.; Viceré, A.; Viets, A. D.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D. V.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; E Wade, L.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Watchi, J.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Wen, L.; Weßels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; Whiting, B. F.; Whittle, C.; Williams, D.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Woehler, J.; Worden, J.; Wright, J. L.; Wu, D. S.; Wu, G.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, Hang; Yu, Haocun; Yvert, M.; Zadrożny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, L.; Zhang, M.; Zhang, T.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, S. J.; Zhu, X. 
J.; E Zucker, M.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration; Boyle, M.; Chu, T.; Hemberger, D.; Hinder, I.; E Kidder, L.; Ossokine, S.; Scheel, M.; Szilagyi, B.; Teukolsky, S.; Vano Vinuales, A.

    2017-05-01

    Parameter estimates of GW150914 were obtained using Bayesian inference, based on three semi-analytic waveform models for binary black hole coalescences. These waveform models differ from each other in their treatment of black hole spins, and all three models make some simplifying assumptions, notably to neglect sub-dominant waveform harmonic modes and orbital eccentricity. Furthermore, while the models are calibrated to agree with waveforms obtained by full numerical solutions of Einstein’s equations, any such calibration is accurate only to some non-zero tolerance and is limited by the accuracy of the underlying phenomenology, availability, quality, and parameter-space coverage of numerical simulations. This paper complements the original analyses of GW150914 with an investigation of the effects of possible systematic errors in the waveform models on estimates of its source parameters. To test for systematic errors we repeat the original Bayesian analysis on mock signals from numerical simulations of a series of binary configurations with parameters similar to those found for GW150914. Overall, we find no evidence for a systematic bias relative to the statistical error of the original parameter recovery of GW150914 due to modeling approximations or modeling inaccuracies. However, parameter biases are found to occur for some configurations disfavored by the data of GW150914: for binaries inclined edge-on to the detector over a small range of choices of polarization angles, and also for eccentricities greater than ≈0.05. For signals with higher signal-to-noise ratio than GW150914, or in other regions of the binary parameter space (lower masses, larger mass ratios, or higher spins), we expect that systematic errors in current waveform models may impact gravitational-wave measurements, making more accurate models desirable for future observations.
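The injection-and-recovery design described above — estimate parameters of a mock signal with a deliberately simplified model and compare any systematic offset to the statistical error — can be illustrated with a toy one-parameter example. This is a sketch only, not the LIGO analysis: the waveform family, noise level, and the neglected higher harmonic are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 400)   # time samples
sigma = 0.5                      # per-sample Gaussian noise level
A_true = 1.0                     # injected amplitude

def exact_waveform(A, f=5.0, eps=0.03):
    """'True' signal, including a weak sub-dominant harmonic."""
    return A * (np.sin(2 * np.pi * f * t) + eps * np.sin(4 * np.pi * f * t))

def approx_waveform(A, f=5.0):
    """Simplified template that neglects the sub-dominant harmonic."""
    return A * np.sin(2 * np.pi * f * t)

# Inject the exact signal into noise, then recover A with the approximate model
data = exact_waveform(A_true) + rng.normal(0.0, sigma, t.size)

A_grid = np.linspace(0.5, 1.5, 1001)
dA = A_grid[1] - A_grid[0]
logL = np.array([-0.5 * np.sum((data - approx_waveform(A)) ** 2) / sigma**2
                 for A in A_grid])
post = np.exp(logL - logL.max())
post /= post.sum() * dA                      # normalised posterior on the grid
mean = np.sum(A_grid * post) * dA
std = np.sqrt(np.sum((A_grid - mean) ** 2 * post) * dA)
bias = mean - A_true   # systematic offset, to be compared against `std`
```

The verdict mirrors the paper's criterion: the modeling error is benign as long as `bias` stays well inside the statistical error `std`.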

  10. Systematic U(1)_{B-L} extensions of loop-induced neutrino mass models with dark matter

    Science.gov (United States)

    Ho, Shu-Yu; Toma, Takashi; Tsumura, Koji

    2016-08-01

    We study the gauged U(1)_{B-L} extensions of models for neutrino masses and dark matter. In this class of models, tiny neutrino masses are radiatively induced through loop diagrams, while the stability of the dark matter is guaranteed by a remnant of the gauge symmetry. These models are systematically classified according to how lepton number conservation is violated. We present complete lists of the one-loop Z_2 and the two-loop Z_3 radiative seesaw models as examples of the classification. The anomaly cancellation conditions in these models are also discussed.

  11. Systematic comparison of trip distribution laws and models

    CERN Document Server

    Lenormand, Maxime; Ramasco, José J

    2016-01-01

    Trip distribution laws are basic to the travel demand characterization needed in transport and urban planning. Several approaches have been considered in recent years. One of them is the so-called gravity law, in which the number of trips is assumed to be related to the populations at origin and destination and to decrease with the distance between them. The mathematical expression of this law resembles Newton's law of gravity, which explains its name. Another popular approach is inspired by the theory of intervening opportunities and has been made concrete in the so-called radiation models. Individuals are supposed to travel until they find a job opportunity, so the spatial distributions of population and jobs naturally lead to a trip flow network. In this paper, we perform a thorough comparison between the gravity and the radiation approaches in their ability to estimate commuting flows. We test the gravity and the radiation laws against empirical trip data at different scales and coming from different countries. Diff...
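The two laws compared in this record have simple closed forms. A minimal sketch of both, under common textbook assumptions (a power-law deterrence function for gravity, each origin emitting one trip per inhabitant; all names and numbers here are illustrative, not the authors' implementation):

```python
import numpy as np

def gravity_flows(pop, dist, gamma=2.0):
    """Singly-constrained gravity law: T_ij ∝ P_i * P_j / d_ij**gamma,
    each origin's outflow normalised to its population. Off-diagonal
    distances must be positive."""
    pop = np.asarray(pop, dtype=float)
    dist = np.asarray(dist, dtype=float)
    with np.errstate(divide="ignore"):
        w = np.outer(pop, pop) / dist ** gamma
    np.fill_diagonal(w, 0.0)
    return pop[:, None] * w / w.sum(axis=1, keepdims=True)

def radiation_flows(pop, dist):
    """Parameter-free radiation law:
    T_ij = O_i * m_i * n_j / ((m_i + s_ij) * (m_i + n_j + s_ij)),
    where s_ij is the population strictly closer to i than j is
    (excluding i and j). Row sums stay below O_i in a finite system."""
    pop = np.asarray(pop, dtype=float)
    dist = np.asarray(dist, dtype=float)
    n = len(pop)
    T = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            closer = dist[i] < dist[i, j]   # includes i itself (distance 0)
            s = pop[closer].sum() - pop[i]  # remove the origin's own population
            T[i, j] = pop[i] * pop[i] * pop[j] / (
                (pop[i] + s) * (pop[i] + pop[j] + s))
    return T
```

Note the key structural difference the paper exploits: the gravity law needs a fitted exponent `gamma`, while the radiation law has no free parameter and depends on distance only through the ranking of intervening population.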

  12. Venous thromboembolism in cancer patients: an underestimated major health problem.

    Science.gov (United States)

    Khalil, Jihane; Bensaid, Badr; Elkacemi, Hanan; Afif, Mohamed; Bensaid, Younes; Kebdani, Tayeb; Benjaafar, Noureddine

    2015-06-20

    Venous thromboembolism (VTE) is a major health problem among patients with cancer, and its incidence in this population is increasing markedly. Although VTE is associated with high rates of mortality and morbidity in cancer patients, its severity is still underestimated by many oncologists. Thromboprophylaxis of VTE, now considered a standard of care, is still not prescribed in many institutions, and the appropriate treatment of an established VTE is not yet well known by many physicians and nurses in the cancer field. Patients are also not well informed about VTE and its consequences. Many studies and meta-analyses have addressed this question, as have many guidelines that dedicate a whole chapter to clarifying and comparing the treatment strategies adapted to this particular population. There is a general belief that the prevention and treatment of VTE cannot be optimized without full awareness by oncologists and patients. The aim of this article is to make VTE a clearer and better understood subject.

  13. Underestimating our influence over others' unethical behavior and decisions.

    Science.gov (United States)

    Bohns, Vanessa K; Roghanizad, M Mahdi; Xu, Amy Z

    2014-03-01

    We examined the psychology of "instigators," people who surround an unethical act and influence the wrongdoer (the "actor") without directly committing the act themselves. In four studies, we found that instigators of unethical acts underestimated their influence over actors. In Studies 1 and 2, university students enlisted other students to tell a "white lie" (Study 1) or commit a small act of vandalism (Study 2) after making predictions about how easy it would be to get their fellow students to do so. In Studies 3 and 4, online samples of participants responded to hypothetical vignettes, for example, about buying children alcohol and taking office supplies home for personal use. In all four studies, instigators failed to recognize the social pressure they exerted on actors through simple unethical suggestions, that is, the discomfort actors would experience in making a decision that was inconsistent with the instigator's suggestion.

  14. Morphology of rain water channelization in systematically varied model sandy soils

    OpenAIRE

    Wei, Y.; Cejas, C. M.; Barrois, R.; Dreyfus, R.; Durian, D. J.

    2014-01-01

    We visualize the formation of fingered flow in dry model sandy soils under different raining conditions using a quasi-2d experimental set-up, and systematically determine the impact of soil grain diameter and surface wetting property on water channelization phenomenon. The model sandy soils we use are random closely-packed glass beads with varied diameters and surface treatments. For hydrophilic sandy soils, our experiments show that rain water infiltrates into a shallow top layer of soil and...

  15. Developing population models: A systematic approach for pesticide risk assessment using herbaceous plants as an example.

    Science.gov (United States)

    Schmolke, Amelie; Kapo, Katherine E; Rueda-Cediel, Pamela; Thorbek, Pernille; Brain, Richard; Forbes, Valery

    2017-12-01

    Population models are used as tools in species management and conservation and are increasingly recognized as important tools in pesticide risk assessments. A wide variety of population model applications and resources on modeling techniques, evaluation and documentation can be found in the literature. In this paper, we add to these resources by introducing a systematic, transparent approach to developing population models. The decision guide that we propose is intended to help model developers systematically address data availability for their purpose and the steps that need to be taken in any model development. The resulting conceptual model includes the necessary complexity to address the model purpose on the basis of current understanding and available data. We provide specific guidance for the development of population models for herbaceous plant species in pesticide risk assessment and demonstrate the approach with an example of a conceptual model developed following the decision guide for herbicide risk assessment of Mead's milkweed (Asclepias meadii), a species listed as threatened under the US Endangered Species Act. The decision guide specific to herbaceous plants demonstrates the details, but the general approach can be adapted for other species groups and management objectives. Population models provide a tool to link population-level dynamics, species and habitat characteristics as well as information about stressors in a single approach. Developing such models in a systematic, transparent way will increase their applicability and credibility, reduce development efforts, and result in models that are readily available for use in species management and risk assessments. Copyright © 2017 Elsevier B.V. All rights reserved.
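As a minimal illustration of what such a conceptual model can become once parameterized, here is a hypothetical stage-structured (Lefkovitch) projection matrix for a generic perennial herb. All vital rates below are invented for illustration; they are not Mead's milkweed data or any output of the decision guide.

```python
import numpy as np

# Hypothetical stage-structured matrix: stages are seed bank, vegetative,
# and flowering plants. Entry A[r, c] moves stage c into stage r per year.
A = np.array([
    [0.10, 0.00, 50.0],   # seed-bank survival; seed production by flowerers
    [0.05, 0.40, 0.30],   # germination; vegetative stasis; regression
    [0.00, 0.20, 0.50],   # maturation; survival of flowering plants
])

def project(A, n0, years):
    """Project stage abundances forward: n_{t+1} = A @ n_t."""
    n = np.asarray(n0, dtype=float)
    for _ in range(years):
        n = A @ n
    return n

# Asymptotic population growth rate = dominant eigenvalue of A
growth_rate = max(abs(np.linalg.eigvals(A)))
```

A risk assessment would then compare `growth_rate` (or quasi-extinction probabilities from a stochastic variant) between scenarios with and without herbicide effects applied to the relevant vital rates.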

  16. Coronary calcium scores are systematically underestimated at a large chest size : A multivendor phantom study

    NARCIS (Netherlands)

    Willemink, Martin J.; Abramiuc, Bronislaw; den Harder, Annemarie M.; van der Werf, Niels R.; de Jong, Pim A.; Budde, Ricardo P. J.; Wildberger, Joachim E.; Vliegenthart, Rozemarijn; Willems, Tineke P.; Greuter, Marcel J. W.; Leiner, Tim

    2015-01-01

    Objective: To evaluate the effect of chest size on coronary calcium score (CCS) as assessed with new-generation CT systems from 4 major vendors. Methods: An anthropomorphic, small-sized (300 x 200 mm) chest phantom containing 100 small calcifications (diameters, 0.5-2.0 mm) was evaluated with and wi

  17. [Application analysis of Nursing Care Systematization according to Horta's Conceptual Model].

    Science.gov (United States)

    da Cunha, Sandra Maria Botelho; Barros, Alba Lúcia Botura Leite

    2005-01-01

    This study has as purpose to analyse the implementation of the Nursing Care Systematization in a private hospital in medical surgical units. Results evidenced that the Horta's Conceptual Model was present only in part of nursing hystory instrument, that the remaining phases of nursing process were not inter-related and that there was a lack of coherence of the prescribed actions in relation to the patient's health condition. From the results of the study it can be concluded that the model used for Nursing Care Systematization is eclectic, not obeying therefore, only to Horta's conceptual model; the totality of the data had not been collected in some phases of the nursing process; there is no correlation of the phases in the majority of analyzed patient records; diagnostic and planning phases do not comprise the phases of the nursing process as proposed by Horta.

  18. Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.

    Science.gov (United States)

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew

    2017-05-30

    Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggests an increased interest in simulation modelling in healthcare.

  19. Underestimated role of the secondary electron emission in the space

    Science.gov (United States)

    Nemecek, Zdenek; Richterova, Ivana; Safrankova, Jana; Pavlu, Jiri; Vaverka, Jakub; Nouzak, Libor

    2016-07-01

    Secondary electron emission (SEE) is one of many processes that charge the surfaces of bodies immersed in a plasma. To date, most theoretical and experimental treatments are based on the sixty-year-old description of the interaction of planar metallic surfaces with electrons; the effects of surface curvature, roughness, and the presence of clusters, as well as the influence of material conductance on different aspects of this interaction, are thus neglected. Dust grains or their clusters are found in many space environments - interstellar clouds, planetary atmospheres, comet tails and planetary rings are only typical examples. The grains are exposed to electrons of different energies and can acquire positive or negative charge during this interaction. We review the progress in experimental investigations and computer simulations of SEE from samples relevant to space achieved in the course of the last decade. We present a systematic study of well-defined systems that starts from spherical grains of various diameters and materials, and continues with clusters consisting of different numbers of small spherical grains, which can be considered examples of real irregularly shaped space grains. The charges acquired by the investigated objects as well as their secondary emission yields are calculated using the SEE model. We show that (1) the charge and surface potential of clusters exposed to the electron beam are influenced by the number of grains and by their geometry within a particular cluster, (2) the model results are in excellent agreement with the experiment, and (3) there is a large difference between the charging of a cluster levitating in free space and one attached to a planar surface. The calculation yields a reduction of the secondary electron emission yield of a surface covered by dust clusters by a factor of up to 1.5 with respect to the yield of a smooth surface. (4) These results are applied to the charging of
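The charging balance behind such results is often summarized by a yield curve δ(E): where δ > 1 a surface emits more electrons than it collects and tends to charge positively. A common closed form is the classical Sternglass universal curve; the sketch below uses illustrative parameters (not for a specific material or this paper's model) and shows how dividing the yield by the ≈1.5 cluster-coverage factor quoted above narrows the positive-charging energy window.

```python
import numpy as np

def sternglass_yield(E, delta_max=2.4, E_max=400.0):
    """Sternglass universal SEE yield curve for a flat surface:
    delta(E) = 7.4 * delta_max * (E/E_max) * exp(-2*sqrt(E/E_max)).
    delta_max and E_max (in eV) are illustrative placeholder values."""
    x = np.asarray(E, dtype=float) / E_max
    return 7.4 * delta_max * x * np.exp(-2.0 * np.sqrt(x))

def crossover_energies(yield_fn, E_lo=1.0, E_hi=20000.0, n=200000):
    """Energies where delta(E) crosses 1, found by sign changes on a
    dense grid; between the two crossovers the surface charges positively."""
    E = np.linspace(E_lo, E_hi, n)
    d = yield_fn(E) - 1.0
    idx = np.where(np.sign(d[:-1]) != np.sign(d[1:]))[0]
    return E[idx]

flat = crossover_energies(sternglass_yield)               # smooth surface
rough = crossover_energies(lambda E: sternglass_yield(E) / 1.5)  # dust-covered
```

With these numbers, the dust-covered surface's positive-charging window (`rough[0]` to `rough[1]`) sits strictly inside the smooth surface's window, i.e. cluster coverage can flip the sign of the equilibrium charge at energies near the crossovers.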

  20. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    Science.gov (United States)

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, that described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on model development, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended in building a multivariable model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any of the studies. We found use of inappropriate methods that could endanger the development of a model, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and accuracy of the probability estimates for predicting postoperative pancreatic fistula.
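The events-per-variable criterion used above is a direct ratio; a minimal helper (function name and example numbers are hypothetical, not from the reviewed studies):

```python
def events_per_variable(n_events, n_candidate_predictors):
    """Events per variable (EPV): outcome events divided by the number of
    candidate predictors *considered*, not just those kept in the final
    model. EPV < 10 is a common rule-of-thumb flag for overfitting risk
    in multivariable risk prediction models."""
    if n_candidate_predictors <= 0:
        raise ValueError("need at least one candidate predictor")
    return n_events / n_candidate_predictors
```

For example, a hypothetical cohort with 40 fistula events and 32 candidate predictors gives EPV = 1.25, far below the rule-of-thumb threshold of 10 that three of the seven reviewed studies also failed to meet.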

  1. Clinical information modeling processes for semantic interoperability of electronic health records: systematic review and inductive analysis.

    Science.gov (United States)

    Moreno-Conde, Alberto; Moner, David; Cruz, Wellington Dimas da; Santos, Marcelo R; Maldonado, José Alberto; Robles, Montserrat; Kalra, Dipak

    2015-07-01

    This systematic review aims to identify and compare the existing processes and methodologies that have been published in the literature for defining clinical information models (CIMs) that support the semantic interoperability of electronic health record (EHR) systems. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology, the authors reviewed papers published between 2000 and 2013 that covered the semantic interoperability of EHRs, found by searching the PubMed, IEEE Xplore, and ScienceDirect databases. Additionally, after selection of a final group of articles, an inductive content analysis was done to summarize the steps and methodologies followed in order to build the CIMs described in those articles. Three hundred and seventy-eight articles were screened and thirty-six were selected for full review. The articles selected for full review were analyzed to extract relevant information for the analysis and characterized according to the steps the authors had followed for clinical information modeling. Most of the reviewed papers lack a detailed description of the modeling methodologies used to create CIMs. A representative example is the lack of description related to the definition of terminology bindings and the publication of the generated models. However, this systematic review confirms that most clinical information modeling activities follow very similar steps for the definition of CIMs. Having a robust and shared methodology could improve their correctness, reliability, and quality. Independently of implementation technologies and standards, it is possible to find common patterns in methods for developing CIMs, suggesting the viability of defining a unified good practice methodology to be used by any clinical information modeler. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  2. Scaling analysis in modeling transport and reaction processes a systematic approach to model building and the art of approximation

    CERN Document Server

    Krantz, William B

    2007-01-01

    This book is unique as the first effort to expound on the subject of systematic scaling analysis. Not written for a specific discipline, the book targets any reader interested in transport phenomena and reaction processes. The book is logically divided into chapters on the use of systematic scaling analysis in fluid dynamics, heat transfer, mass transfer, and reaction processes. An integrating chapter is included that considers more complex problems involving combined transport phenomena. Each chapter includes several problems that are explained in considerable detail. These are followed by several worked examples for which the general outline for the scaling is given. Each chapter also includes many practice problems. This book is based on recognizing the value of systematic scaling analysis as a pedagogical method for teaching transport and reaction processes and as a research tool for developing and solving models and in designing experiments. Thus, the book can serve as both a textbook and a reference boo...
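The systematic scaling procedure the book teaches can be illustrated on transient one-dimensional diffusion (a standard textbook example, not taken from this book): introduce dimensionless variables, collect the dimensionless group, and choose the scale that makes it order one.

```latex
\partial_t c = D\,\partial_x^2 c, \qquad
c^* \equiv \frac{c}{c_0}, \quad
x^* \equiv \frac{x}{L}, \quad
t^* \equiv \frac{t}{\tau}
\;\;\Longrightarrow\;\;
\frac{\partial c^*}{\partial t^*}
  = \frac{D\tau}{L^2}\,\frac{\partial^2 c^*}{\partial x^{*2}}.
```

Choosing \(\tau = L^2/D\) makes the coefficient unity and identifies the diffusion time scale; the "art of approximation" then consists of discarding any term whose dimensionless group is shown to be small by the same procedure.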

  3. Systematic Multi‐Scale Model Development Strategy for the Fragrance Spraying Process and Transport

    DEFF Research Database (Denmark)

    Heitzig, M.; Rong, Y.; Gregson, C.;

    2012-01-01

    The fast and efficient development and application of reliable models with appropriate degree of detail to predict the behavior of fragrance aerosols are challenging problems of high interest to the related industries. A generic modeling template for the systematic derivation of specific fragrance aerosol models is proposed. The main benefits of the fragrance spraying template are the speed‐up of the model development/derivation process, the increase in model quality, and the provision of structured domain knowledge where needed. The fragrance spraying template is integrated in a generic computer‐aided modeling framework, which is structured based on workflows for different general modeling tasks. The benefits of the fragrance spraying template are highlighted by a case study related to the derivation of a fragrance aerosol model that is able to reflect measured dynamic droplet size distribution profiles...

  4. Procedure for the systematic orientation of digitised cranial models. Design and validation.

    Science.gov (United States)

    Bailo, M; Baena, S; Marín, J J; Arredondo, J M; Auría, J M; Sánchez, B; Tardío, E; Falcón, L

    2015-12-01

    Comparison of bony pieces requires that they be oriented systematically to ensure that homologous regions are compared. Few orientation methods are highly accurate; this is particularly true for methods applied to three-dimensional models obtained by surface scanning, a technique whose special features make it a powerful tool in forensic contexts. The aim of this study was to develop and evaluate a systematic, assisted orientation method for aligning three-dimensional cranial models relative to the Frankfurt Plane, which would produce accurate orientations independent of the operator and of anthropological expertise. The study sample comprised four crania of known age and sex. All the crania were scanned and reconstructed using an Eva Artec™ portable 3D surface scanner, and subsequently the positions of certain characteristic landmarks were determined by three different operators using the Rhinoceros 3D surface modelling software. Intra-observer analysis showed a tendency for orientation to be more accurate with the assisted method than with conventional manual orientation. Inter-observer analysis showed that experienced evaluators achieve results at least as accurate, if not more so, with the assisted method as with manual orientation, while inexperienced evaluators achieved more accurate orientation using the assisted method. The method tested is an innovative system capable of providing very precise, systematic and automated spatial orientations of virtual cranial models relative to standardised anatomical planes, independent of the operator and of operator experience.
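An assisted orientation of this kind ultimately reduces to a landmark-based change of basis. The sketch below builds a rotation that brings the Frankfurt plane (here taken as both porions plus the left orbitale, one common convention) into the horizontal plane; the function name, landmark choice, and coordinates are assumptions for illustration, not the authors' software.

```python
import numpy as np

def frankfurt_alignment(porion_l, porion_r, orbitale_l):
    """Rotation matrix whose rows are an orthonormal basis with the
    left-right porion axis as x and the Frankfurt-plane normal as z;
    applying it maps landmarks into the aligned frame."""
    p_l, p_r, orb = (np.asarray(p, dtype=float)
                     for p in (porion_l, porion_r, orbitale_l))
    x = p_r - p_l                       # left-right axis in the plane
    n = np.cross(x, orb - p_l)          # normal to the Frankfurt plane
    x /= np.linalg.norm(x)
    n /= np.linalg.norm(n)
    y = np.cross(n, x)                  # completes a right-handed basis
    return np.vstack([x, y, n])
```

Usage: with `R = frankfurt_alignment(...)`, orient the whole cranial mesh by `R @ (vertex - centroid)`; all three defining landmarks then share the same z coordinate, i.e. the Frankfurt plane is horizontal.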

  5. Application of a systematic finite-element model modification technique to dynamic analysis of structures

    Science.gov (United States)

    Robinson, J. C.

    1982-01-01

    A systematic finite-element model modification technique has been applied to two small problems and a model of the main wing box of a research drone aircraft. The procedure determines the sensitivity of the eigenvalues and eigenvector components to specific structural changes, calculates the required changes and modifies the finite-element model. Good results were obtained where large stiffness modifications were required to satisfy large eigenvalue changes. Sensitivity matrix conditioning problems required the development of techniques to ensure existence of a solution and accelerate its convergence. A method is proposed to assist the analyst in selecting stiffness parameters for modification.
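The eigenvalue-sensitivity step at the heart of such procedures is classically computed with the Fox-Kapoor formula; for a standard eigenproblem K φ = λ φ (unit mass matrix, unit-norm φ) it reads dλ/dp = φᵀ (∂K/∂p) φ. A minimal sketch on an invented 2-DOF spring chain, checked against a finite difference:

```python
import numpy as np

def eigenvalue_sensitivity(K, dK, mode=0):
    """Fox-Kapoor sensitivity d(lambda)/dp = phi^T (dK/dp) phi for a
    standard eigenproblem with distinct eigenvalues and unit-norm phi."""
    _, phi = np.linalg.eigh(K)
    v = phi[:, mode]
    return v @ dK @ v

def stiffness_2dof(k1, k2):
    """Stiffness matrix of a grounded two-spring chain (illustrative)."""
    return np.array([[k1 + k2, -k2], [-k2, k2]])

# Sensitivity of the lowest eigenvalue to spring k2, vs. finite difference
k1, k2, h = 3.0, 1.0, 1e-6
dK_dk2 = np.array([[1.0, -1.0], [-1.0, 1.0]])
analytic = eigenvalue_sensitivity(stiffness_2dof(k1, k2), dK_dk2, mode=0)
fd = (np.linalg.eigvalsh(stiffness_2dof(k1, k2 + h))[0]
      - np.linalg.eigvalsh(stiffness_2dof(k1, k2 - h))[0]) / (2 * h)
```

The modification procedure in the record inverts this relationship: given target eigenvalue changes, it assembles these sensitivities into a matrix and solves for the stiffness parameter changes, which is where the conditioning problems mentioned above arise.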

  6. An attempt to lower sources of systematic measurement error using Hierarchical Generalized Linear Modeling (HGLM).

    Science.gov (United States)

    Sideridis, George D; Tsaousis, Ioannis; Katsis, Athanasios

    2014-01-01

    The purpose of the present studies was to test the effects of systematic sources of measurement error on the parameter estimates of scales using the Rasch model. Studies 1 and 2 tested the effects of mood and affectivity. Study 3 evaluated the effects of fatigue. Last, studies 4 and 5 tested the effects of motivation on a number of parameters of the Rasch model (e.g., ability estimates). Results indicated that (a) the parameters of interest and the psychometric properties of the scales were substantially distorted in the presence of all systematic sources of error, and, (b) the use of HGLM provides a way of adjusting the parameter estimates in the presence of these sources of error. It is concluded that validity in measurement requires a thorough evaluation of potential sources of error and appropriate adjustments based on each occasion.
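The distortion mechanism the studies exploit is easy to see in the Rasch model itself, P(X=1 | θ, b) = exp(θ − b) / (1 + exp(θ − b)): a systematic person-side shift (e.g. fatigue lowering ability by a constant) is mathematically indistinguishable from every item becoming harder by the same amount, so uncorrected systematic error is absorbed into the parameter estimates. A small sketch (the shift value is invented):

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch model: P(X=1 | ability theta, item difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# A constant fatigue-induced drop in ability produces exactly the same
# response probabilities as all items becoming harder by the same amount.
theta = np.linspace(-3.0, 3.0, 61)
b, shift = 0.5, 0.8
p_fatigued = rasch_prob(theta - shift, b)
p_harder_items = rasch_prob(theta, b + shift)
```

This identifiability is why the studies model the error sources explicitly (via HGLM covariates) rather than hoping the Rasch estimates will separate them on their own.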

  7. Logarithmic discretization and systematic derivation of shell models in two-dimensional turbulence.

    Science.gov (United States)

    Gürcan, Ö D; Morel, P; Kobayashi, S; Singh, Rameswar; Xu, S; Diamond, P H

    2016-09-01

    A detailed systematic derivation of a logarithmically discretized model for two-dimensional turbulence is given, starting from the basic fluid equations and proceeding with a particular form of discretization of the wave-number space. We show that it is possible to keep all or a subset of the interactions, either local or disparate scale, and recover various limiting forms of shell models used in plasma and geophysical turbulence studies. The method makes no use of the conservation laws even though it respects the underlying conservation properties of the fluid equations. It gives a family of models ranging from shell models with nonlocal interactions to anisotropic shell models depending on the way the shells are constructed. Numerical integration of the model shows that energy and enstrophy equipartition seem to dominate over the dual cascade, which is a common problem of two-dimensional shell models.
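The starting point of any such derivation is the logarithmic discretization itself: shells k_n = k_0 λⁿ, with each shell carrying the spectrum integrated over its band. A minimal sketch (λ = 2 and the test spectrum are illustrative choices, not the paper's model):

```python
import numpy as np

def log_shells(k0=1.0, lam=2.0, n_shells=12):
    """Logarithmically discretized wavenumbers k_n = k0 * lam**n."""
    return k0 * lam ** np.arange(n_shells)

def shell_energies(spectrum, k):
    """Integrate a continuous 1-D spectrum E(k) over each shell
    [k_n, k_{n+1}) with the trapezoid rule on a fine sub-grid."""
    E = np.empty(len(k) - 1)
    for n in range(len(k) - 1):
        kk = np.linspace(k[n], k[n + 1], 200)
        f = spectrum(kk)
        E[n] = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(kk))
    return E

k = log_shells()
# E(k) ~ k^-3, the classical 2-D enstrophy-cascade slope: shell energies
# then fall by an exact factor of 4 per octave.
E_n = shell_energies(lambda q: q ** -3.0, k)
```

The point of the discretization is visible in the numbers: a power-law spectrum becomes a geometric sequence of shell energies, so a handful of shells covers many decades of wavenumber, which is what makes shell models so cheap compared to direct simulation.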

  8. Does I-131-MIBG underestimate skeletal disease burden in neuroblastoma?

    Directory of Open Access Journals (Sweden)

    Barai Sukanta

    2004-10-01

    Background: Controversy persists as to the need for both MIBG and bone scanning in the routine evaluation of neuroblastoma. Aim: To compare the efficacy of the I-131-metaiodobenzylguanidine (MIBG) scan against that of the conventional Tc99m-methylene diphosphonate (MDP) bone scan for the detection of skeletal deposits of neuroblastoma. Methods and Material: The study included 57 patients (36 boys, 21 girls; age range 1-14 years) with neuroblastoma who underwent both a bone scan with Tc99m-MDP and an I-131-MIBG scan within 15 days of each other, at presentation and during follow-up. Results: At presentation, 11 (19.2%) patients had evidence of skeletal metastases on the MDP scan, against 7 patients who showed bony secondaries on the MIBG scan. In the 7 patients with positive MIBG and MDP scans, the MDP scan detected 11 sites whereas the MIBG scan detected 7 sites. On follow-up, 3 patients with an initially abnormal MDP scan but a normal MIBG scan developed skeletal metastases detectable on the MIBG scan, whereas 3 of the 46 patients who had normal MDP and MIBG scans at presentation developed skeletal metastases detectable on the MDP scan. The MIBG scan was concordant in 2 of them but was normal in the third patient. Conclusion: I-131-MIBG underestimates skeletal disease burden in neuroblastoma. Therefore, the Tc99m-MDP bone scan should remain a part of the routine assessment of patients with neuroblastoma.

  9. Massive yet grossly underestimated global costs of invasive insects

    Science.gov (United States)

    Bradshaw, Corey J. A.; Leroy, Boris; Bellard, Céline; Roiz, David; Albert, Céline; Fournier, Alice; Barbet-Massin, Morgane; Salles, Jean-Michel; Simard, Frédéric; Courchamp, Franck

    2016-10-01

    Insects have presented human society with some of its greatest development challenges by spreading diseases, consuming crops and damaging infrastructure. Despite the massive human and financial toll of invasive insects, cost estimates of their impacts remain sporadic, spatially incomplete and of questionable quality. Here we compile a comprehensive database of economic costs of invasive insects. Taking all reported goods and service estimates, invasive insects cost a minimum of US$70.0 billion per year globally, while associated health costs exceed US$6.9 billion per year. Total costs rise as the number of estimates increases, although many of the worst costs have already been estimated (especially those related to human health). A lack of dedicated studies, especially for reproducible goods and service estimates, implies gross underestimation of global costs. Global warming as a consequence of climate change, rising human population densities and intensifying international trade will allow these costly insects to spread into new areas, but substantial savings could be achieved by increasing surveillance, containment and public awareness.

  10. Systematic Assessment of Neutron and Gamma Backgrounds Relevant to Operational Modeling and Detection Technology Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Daniel E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hornback, Donald Eric [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Jeffrey O. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Nicholson, Andrew D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Patton, Bruce W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peplow, Douglas E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Thomas Martin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ayaz-Maierhafer, Birsen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    This report summarizes the findings of a two-year effort to systematically assess neutron and gamma backgrounds relevant to operational modeling and detection technology implementation. The first year focused on reviewing the origins of background sources and their impact on measured rates in operational scenarios of interest. The second year focused on assessing detector and algorithm performance, as they pertain to operational requirements, against the various background sources and background levels.

  11. Confounding environmental colour and distribution shape leads to underestimation of population extinction risk.

    Directory of Open Access Journals (Sweden)

    Mike S Fowler

    Full Text Available The colour of environmental variability influences the size of population fluctuations when filtered through density dependent dynamics, driving extinction risk through dynamical resonance. Slow fluctuations (low frequencies) dominate in red environments, rapid fluctuations (high frequencies) in blue environments, and white environments are purely random (no frequencies dominate). Two methods are commonly employed to generate the coloured spatial and/or temporal stochastic (environmental) series used in combination with population (dynamical) feedback models: autoregressive [AR(1)] and sinusoidal (1/f) models. We show that changing environmental colour from white to red with 1/f models, and from white to red or blue with AR(1) models, generates coloured environmental series that are not normally distributed at finite time-scales, potentially confounding comparison with normally distributed white noise models. Increasing variability of sample skewness and kurtosis and decreasing mean kurtosis of these series alter the frequency distribution shape of the realised values of the coloured stochastic processes. These changes in distribution shape alter patterns in the probability of single and series of extreme conditions. We show that the reduced extinction risk for undercompensating (slow growing) populations in red environments previously predicted with traditional 1/f methods is an artefact of changes in the distribution shapes of the environmental series. This is demonstrated by comparison with coloured series controlled to be normally distributed using spectral mimicry. Changes in the distribution shape that arise using traditional methods lead to underestimation of extinction risk in normally distributed, red 1/f environments. AR(1) methods also underestimate extinction risks in traditionally generated red environments. This work synthesises previous results and provides further insight into the processes driving extinction risk in model populations.
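The distribution-shape diagnostics the study tracks (sample skewness and excess kurtosis of a coloured series) are easy to compute. A minimal sketch, assuming NumPy/SciPy; the AR(1) parameterisation (kappa = 0.8, n = 10,000) is illustrative, not the study's settings:

```python
import numpy as np
from scipy import stats

def ar1_series(kappa, n, rng):
    """AR(1) series x_t = kappa*x_{t-1} + sqrt(1-kappa^2)*e_t, scaled to
    unit stationary variance; kappa > 0 gives a red (slow) environment."""
    x = np.empty(n)
    x[0] = rng.standard_normal()
    eps = np.sqrt(1.0 - kappa**2) * rng.standard_normal(n)
    for t in range(1, n):
        x[t] = kappa * x[t - 1] + eps[t]
    return x

rng = np.random.default_rng(0)
red = ar1_series(0.8, 10_000, rng)   # red environmental series
white = rng.standard_normal(10_000)  # white-noise benchmark

# Colour shows up in the lag-1 autocorrelation...
lag1 = np.corrcoef(red[:-1], red[1:])[0, 1]
# ...while distribution shape is tracked by skewness and excess kurtosis
print(round(lag1, 2), round(stats.skew(red), 2), round(stats.kurtosis(red), 2))
```

Comparing such shape diagnostics between coloured and white series (and against normality-controlled series produced by spectral mimicry) is the kind of check used to separate colour effects from distribution-shape artefacts.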

  12. Oro-facial functions in experimental models of cerebral palsy: a systematic review.

    Science.gov (United States)

    Lacerda, D C; Ferraz-Pereira, K N; Bezerra de Morais, A T; Costa-de-Santana, B J R; Quevedo, O G; Manhães-de-Castro, R; Toscano, A E

    2017-04-01

    Children who suffer from cerebral palsy (CP) often present comorbidities in the form of oro-facial dysfunctions. Studies in animals have helped to elaborate potential therapies aimed at minimising the chronic disability of the syndrome. To systematically review the scientific literature regarding the possible effects that experimental models of CP can have on oro-facial functions, two independent authors conducted a systematic review in the electronic databases Medline, Scopus, CINAHL, Web of Science and Lilacs, using MeSH and DeCS terms in animal models. The motor and sensory parameters of sucking, chewing and swallowing were considered as primary outcomes; odour reactivity, controlled salivation, postural control, head mobility during feeding and the animal's ability to acquire food were secondary outcomes. Ten studies were included in the present review. Most studies used rabbits as experimental models of CP, which was induced by either hypoxia-ischemia, inflammation or intraventricular haemorrhage. Oro-facial functions were altered in all experimental models of CP. However, we found more modifications in hypoxia-ischemia models overall. On the other hand, the inflammation model was more effective at reproducing impaired coordination of sucking and swallowing. All of the CP experimental models that were assessed modified the oral functions in different animal species. However, further studies should be conducted in order to clarify the mechanisms underlying oro-facial damage and to optimise treatment strategies for children who suffer from CP. © 2017 John Wiley & Sons Ltd.

  13. Decoding Beta-Decay Systematics: A Global Statistical Model for Beta^- Halflives

    CERN Document Server

    Costiris, N J; Gernoth, K A; Clark, J W

    2008-01-01

    Statistical modeling of nuclear data provides a novel approach to nuclear systematics, complementary to established theoretical and phenomenological approaches based on quantum theory. Continuing previous studies in which global statistical modeling is pursued within the general framework of machine learning theory, we implement advances in training algorithms designed to improve generalization, in application to the problem of reproducing and predicting the halflives of nuclear ground states that decay 100% by the beta^- mode. More specifically, fully-connected, multilayer feedforward artificial neural network models are developed using the Levenberg-Marquardt optimization algorithm together with Bayesian regularization and cross-validation. The predictive performance of models emerging from extensive computer experiments is compared with that of traditional microscopic and phenomenological models as well as with the performance of other learning systems, including earlier neural network models as well as th...

  14. Current Developments in Dementia Risk Prediction Modelling: An Updated Systematic Review.

    Directory of Open Access Journals (Sweden)

    Eugene Y H Tang

    Full Text Available Accurate identification of individuals at high risk of dementia influences clinical care, inclusion criteria for clinical trials and development of preventative strategies. Numerous models have been developed for predicting dementia. To evaluate these models we undertook a systematic review in 2010 and updated this in 2014 due to the increase in research published in this area. Here we include a critique of the variables selected for inclusion and an assessment of model prognostic performance. Our previous systematic review was updated with a search from January 2009 to March 2014 in electronic databases (MEDLINE, Embase, Scopus, Web of Science). Articles examining risk of dementia in non-demented individuals and including measures of sensitivity, specificity or the area under the curve (AUC) or c-statistic were included. In total, 1,234 articles were identified from the search; 21 articles met inclusion criteria. New developments in dementia risk prediction include the testing of non-APOE genes, use of non-traditional dementia risk factors, incorporation of diet, physical function and ethnicity, and model development in specific subgroups of the population, including individuals with diabetes and those with different educational levels. Four models have been externally validated. Three studies considered time or cost implications of computing the model. There is no one model that is recommended for dementia risk prediction in population-based settings. Further, it is unlikely that one model will fit all. Consideration of the optimal features of new models should focus on methodology (setting/sample, model development and testing in a replication cohort) and the acceptability and cost of attaining the risk variables included in the prediction score. Further work is required to validate existing models or develop new ones in different populations as well as determine the ethical implications of dementia risk prediction, before applying the particular

  15. A systematic approach for comparing modeled biospheric carbon fluxes across regional scales

    Directory of Open Access Journals (Sweden)

    D. N. Huntzinger

    2011-06-01

    Full Text Available Given the large differences between biospheric model estimates of regional carbon exchange, there is a need to understand and reconcile the predicted spatial variability of fluxes across models. This paper presents a set of quantitative tools that can be applied to systematically compare flux estimates despite the inherent differences in model formulation. The presented methods include variogram analysis, variable selection, and geostatistical regression. These methods are evaluated in terms of their ability to assess and identify differences in spatial variability in flux estimates across North America among a small subset of models, as well as differences in the environmental drivers that best explain the spatial variability of predicted fluxes. The examined models are the Simple Biosphere model (SiB 3.0), the Carnegie Ames Stanford Approach (CASA), and CASA coupled with the Global Fire Emissions Database (CASA GFEDv2), and the analyses are performed on model-predicted net ecosystem exchange, gross primary production, and ecosystem respiration. Variogram analysis reveals consistent seasonal differences in spatial variability among modeled fluxes at a 1° × 1° spatial resolution. However, significant differences are observed in the overall magnitude of the carbon flux spatial variability across models, in both net ecosystem exchange and component fluxes. Results of the variable selection and geostatistical regression analyses suggest fundamental differences between the models in terms of the factors that explain the spatial variability of predicted flux. For example, carbon flux is more strongly correlated with percent land cover in CASA GFEDv2 than in SiB or CASA. Some of the differences in spatial patterns of estimated flux can be linked back to differences in model formulation, and would have been difficult to identify simply by comparing net fluxes between models. Overall, the systematic approach presented here provides a set of tools for comparing
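Of the comparison tools named, the empirical variogram is the simplest to illustrate. A toy 1-D sketch (NumPy assumed; the spatially correlated "flux" field and the lag set are invented for illustration, not model output):

```python
import numpy as np

rng = np.random.default_rng(7)
# Toy spatially correlated field standing in for gridded flux estimates
flux = np.cumsum(rng.normal(0.0, 1.0, 200))

def empirical_variogram(z, lags):
    """Semivariance gamma(h) = 0.5 * mean((z[i+h] - z[i])^2) on a regular grid."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])

lags = np.array([1, 2, 5, 10, 20])
gamma = empirical_variogram(flux, lags)
# A rising variogram means nearby estimates are more alike than distant ones;
# comparing gamma(h) curves across models contrasts their spatial variability
print(np.round(gamma, 2))
```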

  16. Underestimated Amoebic Appendicitis among HIV-1-Infected Individuals in Japan

    Science.gov (United States)

    Kobayashi, Taiichiro; Yano, Hideaki; Murata, Yukinori; Igari, Toru; Nakada-Tsukui, Kumiko; Yagita, Kenji; Nozaki, Tomoyoshi; Kaku, Mitsuo; Tsukada, Kunihisa; Gatanaga, Hiroyuki; Kikuchi, Yoshimi; Oka, Shinichi

    2016-01-01

    ABSTRACT Entamoeba histolytica is not a common causative agent of acute appendicitis. However, amoebic appendicitis can sometimes be severe and life threatening, mainly due to a lack of awareness. Also, its frequency, clinical features, and pathogenesis remain unclear. The study subjects were HIV-1-infected individuals who presented with acute appendicitis and later underwent appendectomy at our hospital between 1996 and 2014. Formalin-fixed paraffin-embedded preserved appendix specimens were reexamined by periodic acid-Schiff (PAS) staining and PCR to identify undiagnosed amoebic appendicitis. Appendectomies were performed in 57 patients with acute appendicitis. The seroprevalence of E. histolytica was 33% (14/43) from the available stored sera. Based on the medical records, only 3 cases were clinically diagnosed as amoebic appendicitis, including 2 diagnosed at the time of appendectomy and 1 case diagnosed by rereview of the appendix after the development of postoperative complications. Retrospective analyses using PAS staining and PCR identified 3 and 3 more cases, respectively. Thus, E. histolytica infection was confirmed in 9 cases (15.8%) in the present study. Apart from a significantly higher leukocyte count in E. histolytica-positive patients than in negative patients (median, 13,760 versus 10,385 cells/μl, respectively, P = 0.02), there were no other differences in the clinical features of the PCR-positive and -negative groups. In conclusion, E. histolytica infection was confirmed in 9 (15.8%) of the appendicitis cases. However, only 3, including one diagnosed after intestinal perforation, were diagnosed before the present analyses. These results strongly suggest there is frequently a failure to detect trophozoites in routine examination, resulting in an underestimation of the incidence of amoebic appendicitis. PMID:27847377

  17. THE TEXTBOOK AS A PRODUCT OF SCHOOL GEOGRAPHY: underestimated work?

    Directory of Open Access Journals (Sweden)

    José Eustáquio de Sene

    2014-01-01

    Full Text Available ABSTRACT: This article will address the textbook as a specific cultural production of school disciplines, taking as reference the theoretical debate that opposed the conceptions of "didactic transposition" (CHEVALLARD, 1997) and "school culture" (CHERVEL, 1990). Based on this debate, characteristic of the curriculum field, this article aims to understand why, historically, the textbook has been underestimated and even considered a "less important work" within the limits of the academy (BITTENCOURT, 2004). The examples used will always be from the discipline of Geography - both school and academic, as well as the relations between the two fields - bearing in mind its "multiplicity of paradigms" (LESTEGÁS, 2002). The analysis will also take into account the historic process of institutionalization of academic Geography based on "Layton's stages" (GOODSON, 2005).

  18. Underestimated Amoebic Appendicitis among HIV-1-Infected Individuals in Japan.

    Science.gov (United States)

    Kobayashi, Taiichiro; Watanabe, Koji; Yano, Hideaki; Murata, Yukinori; Igari, Toru; Nakada-Tsukui, Kumiko; Yagita, Kenji; Nozaki, Tomoyoshi; Kaku, Mitsuo; Tsukada, Kunihisa; Gatanaga, Hiroyuki; Kikuchi, Yoshimi; Oka, Shinichi

    2017-01-01

    Entamoeba histolytica is not a common causative agent of acute appendicitis. However, amoebic appendicitis can sometimes be severe and life threatening, mainly due to a lack of awareness. Also, its frequency, clinical features, and pathogenesis remain unclear. The study subjects were HIV-1-infected individuals who presented with acute appendicitis and later underwent appendectomy at our hospital between 1996 and 2014. Formalin-fixed paraffin-embedded preserved appendix specimens were reexamined by periodic acid-Schiff (PAS) staining and PCR to identify undiagnosed amoebic appendicitis. Appendectomies were performed in 57 patients with acute appendicitis. The seroprevalence of E. histolytica was 33% (14/43) from the available stored sera. Based on the medical records, only 3 cases were clinically diagnosed as amoebic appendicitis, including 2 diagnosed at the time of appendectomy and 1 case diagnosed by rereview of the appendix after the development of postoperative complications. Retrospective analyses using PAS staining and PCR identified 3 and 3 more cases, respectively. Thus, E. histolytica infection was confirmed in 9 cases (15.8%) in the present study. Apart from a significantly higher leukocyte count in E. histolytica-positive patients than in negative patients (median, 13,760 versus 10,385 cells/μl, respectively, P = 0.02), there were no other differences in the clinical features of the PCR-positive and -negative groups. In conclusion, E. histolytica infection was confirmed in 9 (15.8%) of the appendicitis cases. However, only 3, including one diagnosed after intestinal perforation, were diagnosed before the present analyses. These results strongly suggest there is frequently a failure to detect trophozoites in routine examination, resulting in an underestimation of the incidence of amoebic appendicitis. Copyright © 2016 Kobayashi et al.

  19. Body Mass Index Underestimates Adiposity in Persons With Multiple Sclerosis.

    Science.gov (United States)

    Pilutti, Lara A; Motl, Robert W

    2016-03-01

    To examine the relation between body mass index (BMI) and adiposity assessed by dual-energy x-ray absorptiometry in persons with multiple sclerosis (MS) and non-MS controls, as well as to determine the accuracy of standard and alternate BMI thresholds for obesity. Cross-sectional. University research laboratory. The sample included persons with MS (n=235) and controls (n=53) (N=288). Not applicable. Main outcome measures included BMI, whole body soft tissue composition (ie, percent body fat [%BF], fat mass, and lean soft tissue mass), bone mineral content, and bone mineral density. We observed significant strong associations between BMI and sex-specific %BF in persons with MS and non-MS controls, and BMI explained ∼40% of the variance in %BF in both MS and control samples. Receiver operating characteristic curve analyses indicated that the standard BMI threshold for obesity (ie, 30 kg/m²) had excellent specificity (93%-100%) but poor sensitivity (37%-44%) in persons with MS and non-MS controls. The BMI threshold that best identified %BF-defined obesity was 24.7 kg/m² in the MS sample and 25.1 kg/m² in the control sample. We determined a strong association between BMI and adiposity; however, the current BMI threshold for classifying obesity underestimates true adiposity in persons with MS. A similar relation was observed between BMI and obesity in non-MS controls. The non-MS sample included primarily middle-aged women, and similar BMI-%BF misclassifications have been reported in these samples. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
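The threshold analysis behind such results can be sketched with synthetic numbers (NumPy assumed; the BMI-to-%BF relationship, the %BF obesity cutoff and the sample are invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical sample: BMI loosely tracking percent body fat (%BF)
n = 500
bf = rng.normal(32, 8, n)                     # %BF
bmi = 0.35 * bf + 14 + rng.normal(0, 1.5, n)  # BMI
obese = bf >= 35                              # %BF-defined obesity (illustrative cutoff)

def sens_spec(threshold):
    """Sensitivity and specificity of BMI >= threshold vs %BF-defined obesity."""
    pred = bmi >= threshold
    sens = (pred & obese).sum() / obese.sum()
    spec = (~pred & ~obese).sum() / (~obese).sum()
    return sens, spec

# The standard 30 kg/m^2 cutoff: high specificity but low sensitivity
s30, p30 = sens_spec(30.0)

# Youden's J = sens + spec - 1 picks the best-balancing alternate threshold
grid = np.arange(20.0, 35.0, 0.1)
best = max(grid, key=lambda t: sum(sens_spec(t)) - 1)
print(round(s30, 2), round(p30, 2), round(float(best), 1))
```

With these invented numbers the optimal cutoff lands well below 30, mirroring the paper's qualitative finding that the standard threshold misses true adiposity.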

  20. A systematic evaluation of a multidisciplinary social work-lawyer elder mistreatment intervention model.

    Science.gov (United States)

    Rizzo, Victoria M; Burnes, David; Chalfy, Amy

    2015-01-01

    This study introduces a conceptually based, systematic evaluation process employing multivariate techniques to evaluate a multidisciplinary social work-lawyer intervention model (JASA-LEAP). Logistic regression analyses were used with a random sample of case records (n = 250) from three intervention sites. Client retention, program fidelity, and exposure to multidisciplinary services were significantly related to reduction in mistreatment risk at case closure. Female gender, married status, and living with perpetrator significantly predicted unfavorable outcomes. This study extends the elder mistreatment program evaluation literature beyond descriptive/bivariate evaluation strategies. Findings suggest that a multidisciplinary social work-lawyer elder mistreatment intervention model is a successful approach.

  1. Microscopic Calibration and Validation of Car-Following Models -- A Systematic Approach

    CERN Document Server

    Treiber, Martin

    2014-01-01

    Calibration and validation techniques are crucial in assessing the descriptive and predictive power of car-following models and their suitability for analyzing traffic flow. Using real and generated floating-car and trajectory data, we systematically investigate the following aspects: data requirements and preparation; the conceptual approach, including local maximum-likelihood and global LSE calibration with several objective functions; the influence of the data sampling rate and of measuring errors; the effect of data smoothing on the calibration result; and model performance in terms of fitting quality, robustness, parameter orthogonality, completeness and plausibility of parameter values.

  2. Economic Evaluations of Multicomponent Disease Management Programs with Markov Models: A Systematic Review.

    Science.gov (United States)

    Kirsch, Florian

    2016-12-01

    Disease management programs (DMPs) for chronic diseases are being increasingly implemented worldwide. To present a systematic overview of the economic effects of DMPs with Markov models. The quality of the models is assessed, the method by which the DMP intervention is incorporated into the model is examined, and the differences in the structure and data used in the models are considered. A literature search was conducted; the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement was followed to ensure systematic selection of the articles. Study characteristics, e.g., results, the intensity of the DMP and usual care, model design, time horizon, discount rates, utility measures, and cost-of-illness, were extracted from the reviewed studies. Model quality was assessed by two researchers with two different appraisals: one proposed by Philips et al. (Good practice guidelines for decision-analytic modelling in health technology assessment: a review and consolidation of quality assessment. Pharmacoeconomics 2006;24:355-71) and the other proposed by Caro et al. (Questionnaire to assess relevance and credibility of modeling studies for informing health care decision making: an ISPOR-AMCP-NPC Good Practice Task Force report. Value Health 2014;17:174-82). A total of 16 studies (9 on chronic heart disease, 2 on asthma, and 5 on diabetes) met the inclusion criteria. Five studies reported cost savings and 11 studies reported additional costs. In the quality assessment, the overall score of the models ranged from 39% to 65% on the first appraisal and from 34% to 52% on the second. Eleven models integrated effectiveness derived from a clinical trial or a meta-analysis of complete DMPs and only five models combined intervention effects from different sources into a DMP. The main limitations of the models are bad reporting practice and the variation in the selection of input parameters. Eleven of the 14 studies reported cost-effectiveness results of less than $30,000 per quality-adjusted life-year and
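The Markov machinery shared by these evaluations is compact. A toy three-state cohort model (Well/Sick/Dead) with invented transition probabilities, costs and utilities (not values from any reviewed study):

```python
import numpy as np

# Annual transition matrices: rows = current state, columns = next state
P_usual = np.array([[0.85, 0.10, 0.05],
                    [0.00, 0.80, 0.20],
                    [0.00, 0.00, 1.00]])
P_dmp = np.array([[0.90, 0.07, 0.03],   # DMP slows progression (illustrative)
                  [0.00, 0.85, 0.15],
                  [0.00, 0.00, 1.00]])
cost = np.array([500.0, 5000.0, 0.0])   # annual cost per state
cost_dmp_extra = 300.0                  # per-person programme cost while alive
utility = np.array([0.85, 0.60, 0.0])   # QALY weights per state
disc = 0.03                             # annual discount rate

def run(P, extra=0.0, years=20):
    """Run the cohort forward, accumulating discounted costs and QALYs."""
    state = np.array([1.0, 0.0, 0.0])   # whole cohort starts Well
    c = q = 0.0
    for year in range(years):
        d = 1.0 / (1 + disc) ** year
        c += d * (state @ cost + extra * state[:2].sum())
        q += d * (state @ utility)
        state = state @ P
    return c, q

c0, q0 = run(P_usual)
c1, q1 = run(P_dmp, extra=cost_dmp_extra)
icer = (c1 - c0) / (q1 - q0)            # incremental cost per QALY gained
print(round(icer))
```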

  3. Using Islands to Systematically Compare Satellite Observations to Models and Theory

    Science.gov (United States)

    Sherwood, S. C.; Robinson, F.; Gerstle, D.; Liu, C.; Kirshbaum, D. J.; Hernandez-Deckers, D.; Li, Y.

    2012-12-01

    Satellite observations are our most voluminous, and perhaps most important, source of information on atmospheric convective behavior. However, testing models against them is quite difficult, especially with satellites in low Earth orbits, due to several problems, including infrequent sampling, the chaotic nature of convection (which means actual storms will always differ from modeled ones even with perfect models), model initialization, and uncertain boundary conditions. This talk presents work using forcing by islands of different sizes as a strategy for overcoming these problems. We examine the systematic dependence of different characteristics of convection on island size, as a target for simple theories of convection and the sea breeze, and for CRMs (cloud resolving models). We find some nonintuitive trends of behavior with size -- some of which we can reproduce with the WRF CRM, and some of which we cannot.

  4. Measuring and modelling the effects of systematic non-adherence to mass drug administration.

    Science.gov (United States)

    Dyson, Louise; Stolk, Wilma A; Farrell, Sam H; Hollingsworth, T Déirdre

    2017-03-01

    It is well understood that the success or failure of a mass drug administration campaign critically depends on the level of coverage achieved. To that end, coverage levels are often closely scrutinised during campaigns, and the response to underperforming campaigns is to attempt to improve coverage. Modelling work has indicated, however, that the quality of the coverage achieved may also have a significant impact on the outcome: if the coverage achieved is likely to miss similar people every round, this can have a serious detrimental effect on the campaign outcome. We begin by reviewing the current modelling descriptions of this effect and introduce a new modelling framework that can be used to simulate a given level of systematic non-adherence. We formalise the likelihood that people may miss several rounds of treatment using the correlation in attendance between different rounds. Using two very simplified models of infection, for helminths and non-helminths respectively, we demonstrate that the modelling description used and the correlation included between treatment rounds can have a profound effect on the time to elimination of disease in a population. It is therefore clear that more detailed coverage data are required to accurately predict the time to disease elimination. We review published coverage data in which individuals are asked how many previous rounds they have attended, and show how this information may be used to assess the level of systematic non-adherence. We note that while the coverages in the data range from 40.5% to 95.5%, the correlations found lie in a fairly narrow range (between 0.2806 and 0.5351). This indicates that the level of systematic non-adherence may be similar even in data from different years, countries, diseases and administered drugs. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
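The correlation framing can be made concrete with a small simulation. One way (among several) to induce between-round correlation is a per-person beta-distributed attendance propensity; the parameters below are illustrative, NumPy assumed:

```python
import numpy as np

rng = np.random.default_rng(42)
n_people, n_rounds, coverage = 100_000, 5, 0.65

# Independent coverage: each person coin-flips attendance every round
independent = rng.random((n_people, n_rounds)) < coverage

# Systematic non-adherence: a fixed per-person propensity makes the same
# people tend to miss every round. Beta mean equals the coverage; smaller
# a+b gives a stronger between-round correlation (parameter choice illustrative).
a = 2.0
b = a * (1 - coverage) / coverage
p_i = rng.beta(a, b, n_people)
systematic = rng.random((n_people, n_rounds)) < p_i[:, None]

def round_corr(att):
    """Correlation between attendance in rounds 1 and 2."""
    return np.corrcoef(att[:, 0], att[:, 1])[0, 1]

def never_treated(att):
    """Fraction of the population missed by every round."""
    return (~att.any(axis=1)).mean()

print(round(round_corr(independent), 3), round(never_treated(independent), 4))
print(round(round_corr(systematic), 3), round(never_treated(systematic), 4))
```

Both scenarios hit the same per-round coverage, yet the correlated one leaves roughly an order of magnitude more people entirely untreated, which is why elimination timelines depend on the modelling description chosen.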

  5. Eating disorders among fashion models: a systematic review of the literature.

    Science.gov (United States)

    Zancu, Simona Alexandra; Enea, Violeta

    2016-06-02

    In the light of recent concerns regarding eating disorders among fashion models and the professional regulation of the fashion model occupation, an examination of the scientific evidence on this issue is necessary. The article reviews findings on the prevalence of eating disorders and body image concerns among professional fashion models. A systematic literature search was conducted using ProQUEST, EBSCO, PsycINFO, SCOPUS, and Gale Cengage electronic databases. The search returned very few studies on fashion models and eating disorders published between 1980 and 2015, with seven articles included in this review. Overall, the results of these studies do not indicate a higher prevalence of eating disorders among fashion models compared to non-models. Fashion models have a positive body image and generally do not report more dysfunctional eating behaviors than controls. However, fashion models are on average slightly underweight, with significantly lower BMI than controls, and give higher importance to appearance and thin body shape, and thus have a higher prevalence of partial-syndrome eating disorders than controls. Despite public concerns, research on eating disorders among professional fashion models is extremely scarce and the results cannot be generalized to all models. The existing research fails to clarify the matter of eating disorders among fashion models and, given the small number of studies, further research is needed.

  6. [Overweight and obesity in preschool children: an underestimated problem?].

    Science.gov (United States)

    Bielecka-Jasiocha, Joanna; Majcher, Anna; Pyrzak, Beata; Janczarska, Danuta; Rumińska, Małgorzata

    2009-01-01

    The spread of overweight and obesity is alarming in the face of metabolic syndrome development and its consequences. As obesity becomes a social norm, a lack of adequate attention seems to be noticed. In the development of obesity, special attention is focused on the preschool and pubertal periods, as they are considered critical for the development of obesity and its persistence into adulthood. We analyzed anthropometric parameters of 302 overweight and obese children, patients of the Department of Pediatrics and Endocrinology between 2004-2007. Children were aged from 1.5 y to 18.25 y. Overweight was diagnosed when BMI > or =1 SDS, obesity when BMI > or =2 SDS. 77% of boys and 86% of girls were obese. The mean value of BMI, expressed as SDS BMI, was +4.3 SDS (girls) and +4.5 SDS (boys) in children under 6 yrs, +3.03 SDS (girls) and +2.95 SDS (boys) in children between 6-14 yrs, and +3.95 SDS (girls) and +4.08 SDS (boys) in children above 14 yrs. The youngest group (i.e. under 6 yrs), although comparatively the most obese, was small: 7% of all girls and 5.6% of all boys. The oldest group (i.e. above 14 yrs) was numerous (45.6% of all girls and 27.8% of all boys) and comparatively very obese. Data on parents' weight status were available in 56% of cases: 31.2% of mothers and 41.5% of fathers were overweight, 33.3% of mothers and 50.8% of fathers were obese. These observations suggest that overweight and obesity may be underestimated and/or disregarded in preschool and pubertal children. This is alarming, as these two periods of life are critical in the development of obesity. Special attention should be paid to the prevention of obesity, especially in younger children, and to the early identification of overweight small children and/or children at risk of obesity.

  7. METABOLIC ACIDOSIS--AN UNDERESTIMATED PROBLEM AFTER KIDNEY TRANSPLANTATION?.

    Science.gov (United States)

    Katalinić, Lea; Blaslov, Kristina; Đanić-Hadžibegović, Ana; Gellineo, Lana; Kes, Petar; Jelaković, Bojan; Basić-Jukić, Nikolina

    2015-12-01

    and lower calcium levels. Nevertheless, metabolic acidosis remains a highly underestimated problem among nephrologists caring for transplant recipients. We suggest regular determination of the acid-base status in renal transplant recipients.

  8. Asymmetries of poverty: why global burden of disease valuations underestimate the burden of neglected tropical diseases.

    Directory of Open Access Journals (Sweden)

    Charles H King

    Full Text Available The disability-adjusted life year (DALY) initially appeared attractive as a health metric in the Global Burden of Disease (GBD) program, as it purports to be a comprehensive health assessment that encompasses premature mortality, morbidity, impairment, and disability. It was originally thought that the DALY would be useful in policy settings, reflecting normative valuations as a standardized unit of ill health. However, the design of the DALY and its use in policy estimates contain inherent flaws that result in systematic undervaluation of the importance of chronic diseases, such as many of the neglected tropical diseases (NTDs), in world health. The conceptual design of the DALY comes out of a perspective largely focused on individual risk rather than the ecology of disease, thus failing to acknowledge the implications of context on the burden of disease for the poor. It is nonrepresentative of the impact of poverty on disability, which results in significant underestimation of disability weights for chronic diseases such as the NTDs. Finally, the application of the DALY in policy estimates does not account for the nonlinear effects of poverty in the cost-utility analysis of disease control, effectively discounting the utility of comprehensively treating NTDs. The present DALY framework needs to be substantially revised if the GBD is to become a valid and useful system for determining health priorities.

  9. Systematic coarse-grained modeling of complexation between small interfering RNA and polycations

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Zonghui [Graduate Program in Applied Physics, Northwestern University, Evanston, Illinois 60208 (United States); Luijten, Erik, E-mail: luijten@northwestern.edu [Graduate Program in Applied Physics, Northwestern University, Evanston, Illinois 60208 (United States); Department of Materials Science and Engineering, Northwestern University, Evanston, Illinois 60208 (United States); Department of Engineering Sciences and Applied Mathematics, Northwestern University, Evanston, Illinois 60208 (United States); Department of Physics and Astronomy, Northwestern University, Evanston, Illinois 60208 (United States)

    2015-12-28

    All-atom molecular dynamics simulations can provide insight into the properties of polymeric gene-delivery carriers by elucidating their interactions and detailed binding patterns with nucleic acids. However, to explore nanoparticle formation through complexation of these polymers and nucleic acids and study their behavior at experimentally relevant time and length scales, a reliable coarse-grained model is needed. Here, we systematically develop such a model for the complexation of small interfering RNA (siRNA) and grafted polyethyleneimine copolymers, a promising candidate for siRNA delivery. We compare the predictions of this model with all-atom simulations and demonstrate that it is capable of reproducing detailed binding patterns, charge characteristics, and water release kinetics. Since the coarse-grained model accelerates the simulations by one to two orders of magnitude, it will make it possible to quantitatively investigate nanoparticle formation involving multiple siRNA molecules and cationic copolymers.

  10. Systematic review and retrospective validation of prediction models for weight loss after bariatric surgery.

    Science.gov (United States)

    Sharples, Alistair J; Mahawar, Kamal; Cheruvu, Chandra V N

    2017-08-12

    Patients often have less than realistic expectations of the weight loss they are likely to achieve after bariatric surgery. It would be useful to have a well-validated prediction tool that could give patients a realistic estimate of their expected weight loss. To perform a systematic review of the literature to identify existing prediction models and attempt to validate these models. University hospital, United Kingdom. A systematic review was performed. All English-language studies were included if they used data to create a prediction model for postoperative weight loss after bariatric surgery. These models were then tested on patients undergoing bariatric surgery between January 1, 2013 and December 31, 2014 within our unit. An initial literature search produced 446 results, of which only 4 were included in the final review. Our study population included 317 patients. Mean preoperative body mass index (BMI) was 46.1 ± 7.1 kg/m². For 257 (81.1%) patients, 12-month follow-up was available; mean BMI and percentage excess weight loss at 12 months were 33.0 ± 6.7 kg/m² and 66.1% ± 23.7%, respectively. All 4 prediction models significantly overestimated the amount of weight loss achieved by patients. The best-performing prediction model in our series produced a correlation coefficient (R²) of 0.61 and an area under the curve of 0.71 on receiver operating characteristic analysis. All prediction models overestimated weight loss after bariatric surgery in our cohort. There is a need to develop better procedures and patient-specific models for better patient counselling. Copyright © 2017 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.
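
The two headline validation metrics above (R² between predicted and observed weight loss, and the area under the ROC curve for a binary outcome) can be computed with short, dependency-free helpers. The sketch below is illustrative; function names and any data passed to them are ours, not the study's.

```python
# Hedged sketch: R^2 and a rank-based AUC, the two metrics used to validate
# the weight-loss prediction models. All inputs here are invented.

def r_squared(observed, predicted):
    """Coefficient of determination between observed and predicted values."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def auc(scores, labels):
    """Rank-based AUC: probability a positive case outranks a negative one."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```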

  11. In vitro biofilm models to study dental caries: a systematic review.

    Science.gov (United States)

    Maske, T T; van de Sande, F H; Arthur, R A; Huysmans, M C D N J M; Cenci, M S

    2017-09-01

    The aim of this systematic review is to characterize and discuss key methodological aspects of in vitro biofilm models for caries-related research, and to verify the reproducibility and dose-response behaviour of models based on their response to anti-caries and/or antimicrobial substances. Inclusion criteria were divided into Part I (PI): in vitro biofilm models that produce a cariogenic biofilm and/or caries-like lesions and allow pH fluctuations; and Part II (PII): models showing an effect of anti-caries and/or antimicrobial substances. Within PI, 72.9% of the models were dynamic biofilm models and 27.1% were batch models; within PII, 75.5% were dynamic and 24.5% were batch models. Dose-response validation was reported in 20.4% of the studies and reproducibility in 14.3%, and 32.7% were classified as having a high risk of bias. Several in vitro biofilm models are available for caries-related research; however, most models lack validation by dose-response and reproducibility experiments for each proposed protocol.

  12. Equation-free analysis of agent-based models and systematic parameter determination

    Science.gov (United States)

    Thomas, Spencer A.; Lloyd, David J. B.; Skeldon, Anne C.

    2016-12-01

    Agent-based models (ABMs) are increasingly used in social science, economics, mathematics, biology and computer science to describe time-dependent systems in circumstances where a description in terms of equations is difficult. Yet few tools are currently available for the systematic analysis of ABM behaviour. Numerical continuation and bifurcation analysis is a well-established tool for the study of deterministic systems. Recently, equation-free (EF) methods have been developed to extend numerical continuation techniques to systems where the dynamics are described at a microscopic scale and continuation of a macroscopic property of the system is considered. To date, the practical use of EF methods has been limited by: (1) the overhead of application-specific implementation; (2) the laborious configuration of problem-specific parameters; and (3) large ensemble sizes (potentially) leading to computationally restrictive run-times. In this paper we address these issues with our tool for the EF continuation of stochastic systems, which includes algorithms to systematically configure problem-specific parameters and enhance robustness to noise. Our tool is generic, can be applied to any 'black-box' simulator, and determines the essential EF parameters prior to EF analysis. Robustness is significantly improved using our convergence-constraint with corrector-repeat (C3R) method. This algorithm automatically detects outliers based on the dynamics of the underlying system, enabling both an order-of-magnitude reduction in ensemble size and continuation of systems at much higher levels of noise than classical approaches. We demonstrate our method with application to several ABMs, revealing parameter dependence, bifurcation and stability analysis of these complex systems, giving a deep understanding of the dynamical behaviour of the models in a way that is not otherwise easily obtainable. In each case we demonstrate our systematic parameter determination stage for

  13. A systematic literature review of open source software quality assessment models.

    Science.gov (United States)

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so they can be acceptable to practitioners, there is a need for clear discrimination among the existing models based on their specific properties. Based on this, the aim of this study is to perform a systematic literature review to investigate the properties of the existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies in this regard. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models, we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study will be a valuable contribution to the community, helping quality assessment model developers in formulating newer models and practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  14. Comparison of two stochastic techniques for reliable urban runoff prediction by modeling systematic errors

    DEFF Research Database (Denmark)

    Del Giudice, Dario; Löwe, Roland; Madsen, Henrik;

    2015-01-01

    In urban rainfall-runoff, commonly applied statistical techniques for uncertainty quantification mostly ignore systematic output errors originating from simplified models and erroneous inputs. Consequently, the resulting predictive uncertainty is often unreliable. Our objective is to present two approaches which provide probabilistic predictions of wastewater discharge in a similarly reliable way, both for periods ranging from a few hours up to more than 1 week ahead of time. The EBD produces more accurate predictions on long horizons but relies on computationally heavy MCMC routines for parameter inference.

  15. An evaluation of a model for the systematic documentation of hospital based health promotion activities: results from a multicentre study

    DEFF Research Database (Denmark)

    Tønnesen, Hanne; Christensen, Mette E; Groene, Oliver;

    2007-01-01

    The first step of handling health promotion (HP) in Diagnosis Related Groups (DRGs) is a systematic documentation and registration of the activities in the medical records. So far the possibility and tradition for systematic registration of clinical HP activities in the medical records and in patient ... The model consists of two parts; the first part includes motivational counselling (7 codes) and the second part comprehends intervention, rehabilitation and after-treatment (8 codes). The objective was to evaluate in an international study the usefulness, applicability and sufficiency of a simple model for the systematic ...

  16. Peak Vertical Ground Reaction Force during Two-Leg Landing: A Systematic Review and Mathematical Modeling

    Directory of Open Access Journals (Sweden)

    Wenxin Niu

    2014-01-01

    Full Text Available Objectives. (1) To systematically review peak vertical ground reaction force (PvGRF) during two-leg drop landing from specific drop heights (DH), (2) to construct a mathematical model describing correlations between PvGRF and DH, and (3) to analyze the effects of some factors on the pooled PvGRF regardless of DH. Methods. A computerized bibliographical search was conducted to extract PvGRF data on a single foot when participants landed with both feet from various DHs. An innovative mathematical model was constructed to analyze the effects of gender, landing type, shoes, ankle stabilizers, surface stiffness and sample frequency on PvGRF based on the pooled data. Results. Pooled PvGRF and DH data of 26 articles showed that the square-root function fits their relationship well. An experimental validation was also done on the regression equation for the medium frequency. The PvGRF was not significantly affected by surface stiffness, but was significantly higher in men than in women, in platform than in suspended landing, in the barefoot than in the shod condition, with ankle stabilizer than in the control condition, and at higher than at lower frequencies. Conclusions. The PvGRF and the square root of DH showed a linear relationship. The mathematical modeling method combined with systematic review is helpful for analyzing the influencing factors during the landing movement without considering DH.
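
The square-root relationship described above (PvGRF linear in √DH) reduces to ordinary least squares after transforming the drop height. The sketch below fits PvGRF = a + b·√DH; the data fed to it and the function name are ours, not the pooled values from the review.

```python
import math

# Hedged sketch of the review's square-root model: PvGRF = a + b * sqrt(DH),
# fitted by ordinary least squares on the transformed predictor x = sqrt(DH).

def fit_sqrt_model(dh, pvgrf):
    x = [math.sqrt(d) for d in dh]
    n = len(x)
    mx, my = sum(x) / n, sum(pvgrf) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, pvgrf)) / sum(
        (xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b
```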

  17. Systematic problems with using dark matter simulations to model stellar halos

    Energy Technology Data Exchange (ETDEWEB)

    Bailin, Jeremy [Department of Physics and Astronomy, University of Alabama, Box 870324, Tuscaloosa, AL 35487-0324 (United States); Bell, Eric F.; Valluri, Monica [Department of Astronomy, University of Michigan, 830 Dennison Building, 500 Church Street, Ann Arbor, MI 48109 (United States); Stinson, Greg S. [Max-Planck-Institut für Astronomie (MPIA), Königstuhl 17, D-69117 Heidelberg (Germany); Debattista, Victor P. [Jeremiah Horrocks Institute, University of Central Lancashire, Preston PR1 2HE (United Kingdom); Couchman, H. M. P.; Wadsley, James, E-mail: jbailin@ua.edu [Department of Physics and Astronomy, McMaster University, 1280 Main Street West, Hamilton, ON L8S 4M1 (Canada)

    2014-03-10

    The limits of available computing power have forced models for the structure of stellar halos to adopt one or both of the following simplifying assumptions: (1) stellar mass can be 'painted' onto dark matter (DM) particles in progenitor satellites; (2) pure DM simulations that do not form a luminous galaxy can be used. We estimate the magnitude of the systematic errors introduced by these assumptions using a controlled set of stellar halo models where we independently vary whether we look at star particles or painted DM particles, and whether we use a simulation in which a baryonic disk galaxy forms or a matching pure DM simulation that does not form a baryonic disk. We find that the 'painting' simplification reduces the halo concentration and internal structure, predominantly because painted DM particles have different kinematics from star particles even when both are buried deep in the potential well of the satellite. The simplification of using pure DM simulations reduces the concentration further, but increases the internal structure, and results in a more prolate stellar halo. These differences can be a factor of 1.5-7 in concentration (as measured by the half-mass radius) and 2-7 in internal density structure. Given this level of systematic uncertainty, one should be wary of overinterpreting differences between observations and the current generation of stellar halo models based on DM-only simulations when such differences are less than an order of magnitude.

  18. Systematic model for lean product development implementation in an automotive related company

    Directory of Open Access Journals (Sweden)

    Daniel Osezua Aikhuele

    2017-07-01

    Full Text Available Lean product development is a major innovative business strategy that employs sets of practices to achieve efficient, innovative and sustainable product development. Despite the many benefits and high hopes in the lean strategy, many companies are still struggling, and unable either to achieve or to sustain substantial positive results with their lean implementation efforts. However, as a first step towards addressing this issue, this paper proposes a systematic model that considers the administrative and implementation limitations of lean thinking practices in the product development process. The model, which is based on the integration of fuzzy Shannon's entropy and the Modified Technique for Order Preference by Similarity to the Ideal Solution (M-TOPSIS) for the implementation of lean product development practices with respect to different criteria, including management and leadership, financial capabilities, skills and expertise, and organization culture, provides a guide or roadmap for product development managers on the lean implementation route.
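
For readers unfamiliar with TOPSIS, the core ranking step can be sketched in a few lines. This is classical TOPSIS with externally supplied weights and all-benefit criteria; the fuzzy Shannon's entropy weighting and the M-TOPSIS modification used in the paper are not reproduced here.

```python
import math

# Classical TOPSIS sketch (not the paper's M-TOPSIS variant): rank
# alternatives (rows) against benefit criteria (columns) under given weights.

def topsis(matrix, weights):
    ncrit = len(weights)
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncrit)]
    weighted = [[weights[j] * row[j] / norms[j] for j in range(ncrit)]
                for row in matrix]
    best = [max(col) for col in zip(*weighted)]    # positive ideal solution
    worst = [min(col) for col in zip(*weighted)]   # negative ideal solution
    return [math.dist(r, worst) / (math.dist(r, best) + math.dist(r, worst))
            for r in weighted]
```

Here each column could stand for one of the criteria named above (management and leadership, financial capabilities, skills and expertise, organization culture) and each row for a candidate implementation practice; scores closer to 1 rank higher.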

  19. Systematic spectral analysis of GX 339-4: influence of Galactic background and reflection models

    CERN Document Server

    Clavel, M; Corbel, S; Coriat, M

    2016-01-01

    Black hole X-ray binaries display large outbursts, during which their properties are strongly variable. We develop a systematic spectral analysis of the 3-40 keV RXTE/PCA data in order to study the evolution of these systems and apply it to GX 339-4. Using the low count rate observations, we provide a precise model of the Galactic background at GX 339-4's location and discuss its possible impact on the source spectral parameters. At higher fluxes, the use of a Gaussian line to model the reflection component can lead to the detection of a high-temperature disk, in particular in the high-hard state. We demonstrate that this component is an artifact arising from an incomplete modeling of the reflection spectrum.

  20. Systematic Features of Axisymmetric Neutrino-Driven Core-Collapse Supernova Models in Multiple Progenitors

    CERN Document Server

    Nakamura, Ko; Kuroda, Takami; Kotake, Kei

    2014-01-01

    We present an overview of axisymmetric core-collapse supernova simulations employing a neutrino transport scheme based on the isotropic diffusion source approximation. Studying 101 solar-metallicity progenitors covering zero-age main-sequence masses from 10.8 to 75.0 solar masses, we systematically investigate how the differences in the structures of these multiple progenitors impact the hydrodynamic evolution. By following a long-term evolution over 1.0 s after bounce, most of the computed models exhibit neutrino-driven revival of the stalled bounce shock at about 200-800 ms postbounce, leading to the possibility of explosion. Pushing the boundaries of expectations from previous one-dimensional studies, our results show that the time of shock revival, the evolution of shock radii, and the diagnostic explosion energies are tightly correlated with the compactness parameter ξ, which characterizes the structure of the progenitors. Compared to models with low ξ, models with high ξ undergo high ram pressure from the accreting ma...

  1. Dynamic epidemiological models for dengue transmission: a systematic review of structural approaches.

    Science.gov (United States)

    Andraud, Mathieu; Hens, Niel; Marais, Christiaan; Beutels, Philippe

    2012-01-01

    Dengue is a vector-borne disease recognized as the major arbovirose, with four immunologically distant dengue serotypes coexisting in many endemic areas. Several mathematical models have been developed to understand the transmission dynamics of dengue, including the role of cross-reactive antibodies for the four different dengue serotypes. We aimed to review deterministic models of dengue transmission, in order to summarize the evolution of insights for, and provided by, such models, and to identify important characteristics for future model development. We identified relevant publications using PubMed and ISI Web of Knowledge, focusing on mathematical deterministic models of dengue transmission. Model assumptions were systematically extracted from each reviewed model structure, and were linked with their underlying epidemiological concepts. After defining common terms in vector-borne disease modelling, we generally categorised forty-two published models of interest into single-serotype and multi-serotype models. The multi-serotype models assumed either vector-host or direct host-to-host transmission (ignoring the vector component). For each approach, we discussed the underlying structural and parameter assumptions, threshold behaviour and the projected impact of interventions. In view of the expected availability of dengue vaccines, modelling approaches will increasingly focus on the effectiveness and cost-effectiveness of vaccination options. For this purpose, the level of representation of the vector and host populations seems pivotal. Since vector-host transmission models would be required for projections of combined vaccination and vector control interventions, we advocate their use as most relevant to advise health policy in the future. The limited understanding of the factors which influence dengue transmission as well as limited data availability remain important concerns when applying dengue models to real-world decision problems.
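
A minimal single-serotype vector-host model of the class categorised above can be sketched as a forward-Euler iteration. The parameter values, the forward-Euler scheme and the constant-vector-population assumption are ours, chosen only for illustration; the multi-serotype models with cross-reactive immunity discussed in the review are considerably more involved.

```python
# Hedged Ross-Macdonald-style sketch: susceptible/infectious hosts (sh, ih)
# and vectors (sv, iv) as population fractions; dead infected vectors are
# replaced by susceptible ones so the vector population stays constant.

def step(sh, ih, sv, iv, beta_hv=0.3, beta_vh=0.3, gamma=0.1, mu_v=0.1, dt=0.1):
    new_ih = beta_hv * sh * iv * dt   # host infections caused by vectors
    new_iv = beta_vh * sv * ih * dt   # vector infections caused by hosts
    return (sh - new_ih,
            ih + new_ih - gamma * ih * dt,           # hosts recover at rate gamma
            sv - new_iv + mu_v * iv * dt,            # vector births replace deaths
            iv + new_iv - mu_v * iv * dt)

state = (0.99, 0.01, 1.0, 0.0)  # start with 1% of hosts infectious
for _ in range(100):
    state = step(*state)
```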

  2. Dynamic epidemiological models for dengue transmission: a systematic review of structural approaches.

    Directory of Open Access Journals (Sweden)

    Mathieu Andraud

    Full Text Available Dengue is a vector-borne disease recognized as the major arbovirose, with four immunologically distant dengue serotypes coexisting in many endemic areas. Several mathematical models have been developed to understand the transmission dynamics of dengue, including the role of cross-reactive antibodies for the four different dengue serotypes. We aimed to review deterministic models of dengue transmission, in order to summarize the evolution of insights for, and provided by, such models, and to identify important characteristics for future model development. We identified relevant publications using PubMed and ISI Web of Knowledge, focusing on mathematical deterministic models of dengue transmission. Model assumptions were systematically extracted from each reviewed model structure, and were linked with their underlying epidemiological concepts. After defining common terms in vector-borne disease modelling, we generally categorised forty-two published models of interest into single-serotype and multi-serotype models. The multi-serotype models assumed either vector-host or direct host-to-host transmission (ignoring the vector component). For each approach, we discussed the underlying structural and parameter assumptions, threshold behaviour and the projected impact of interventions. In view of the expected availability of dengue vaccines, modelling approaches will increasingly focus on the effectiveness and cost-effectiveness of vaccination options. For this purpose, the level of representation of the vector and host populations seems pivotal. Since vector-host transmission models would be required for projections of combined vaccination and vector control interventions, we advocate their use as most relevant to advise health policy in the future. The limited understanding of the factors which influence dengue transmission as well as limited data availability remain important concerns when applying dengue models to real-world decision problems.

  3. Surgeons underestimate their influence on medical students entering surgery.

    NARCIS (Netherlands)

    Quillin 3rd, R.C.; Pritts, T.A.; Davis, B.R.; Hanseman, D.; Collins, J.M.; Athota, K.P.; Edwards, M.J.R.; Tevar, A.D.

    2012-01-01

    BACKGROUND: Positive surgical role models influence medical students to pursue a career in surgery. However, the perception by role models of their own effectiveness has yet to be examined. In this study, we evaluated the influence of surgical role models on medical student career choice, and how these role models perceive their own influence.

  4. Modeling Systematic Change in Stopover Duration Does Not Improve Bias in Trends Estimated from Migration Counts.

    Directory of Open Access Journals (Sweden)

    Tara L Crewe

    Full Text Available The use of counts of unmarked migrating animals to monitor long-term population trends assumes independence of daily counts and a constant rate of detection. However, migratory stopovers often last days or weeks, violating the assumption of count independence. Further, a systematic change in stopover duration will result in a change in the probability of detecting individuals once, but also in the probability of detecting individuals on more than one sampling occasion. We tested how variation in stopover duration influenced the accuracy and precision of population trends by simulating migration count data with a known, constant rate of population change and by allowing the daily probability of survival (an index of stopover duration) to remain constant, or to vary randomly, cyclically, or increase linearly over time by various levels. Using simulated datasets with a systematic increase in stopover duration, we also tested whether any resulting bias in population trend could be reduced by modeling the underlying source of variation in detection, or by subsampling data to every three or five days to reduce the incidence of recounting. Mean bias in population trend did not differ significantly from zero when stopover duration remained constant or varied randomly over time, but bias and the detection of false trends increased significantly with a systematic increase in stopover duration. Importantly, an increase in stopover duration over time resulted in a compounding effect on counts due to the increased probability of detection and of recounting on subsequent sampling occasions. Under this scenario, bias in population trend could not be modeled using a covariate for stopover duration alone. Rather, to improve inference drawn about long-term population change using counts of unmarked migrants, analyses must include a covariate for stopover duration, as well as incorporate sampling modifications (e.g., subsampling) to reduce the probability that individuals will
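
The compounding effect described above, where each individual is recounted once per day of stopover, can be made concrete with a toy calculation. Everything below is illustrative and not taken from the study's simulations.

```python
# Toy illustration: with a constant population, expected season-total counts
# grow when stopover duration (days each bird remains countable) lengthens,
# mimicking a positive population trend where none exists.

def expected_total_count(n_individuals, stopover_days):
    return n_individuals * stopover_days  # one (re)count per stopover day

years = range(10)
constant = [expected_total_count(1000, 3.0) for _ in years]
lengthening = [expected_total_count(1000, 3.0 + 0.2 * y) for y in years]
# 'constant' stays flat at 3000; 'lengthening' rises by 60% over the decade
# despite a fixed population of 1000 individuals.
```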

  5. Satellite data for systematic validation of wave model results in the Black Sea

    Science.gov (United States)

    Behrens, Arno; Staneva, Joanna

    2017-04-01

    The Black Sea is, with regard to the availability of traditional in situ wave measurements recorded by the usual waverider buoys, a data-sparse semi-enclosed sea. The only possibility for systematic validation of wave model results in such a regional area is the use of satellite data. In the frame of the COPERNICUS Marine Evolution System for the Black Sea, which requires wave predictions, the third-generation spectral wave model WAM is used. The operational system is demonstrated based on four years of systematic comparisons with satellite data. The aim of this investigation was to answer two questions: is the wave model able to provide a reliable description of the wave conditions in the Black Sea, and are the satellite measurements suitable for validation purposes on such a regional scale? Detailed comparisons between measured data and computed model results for the Black Sea, including yearly statistics, have been made for about 300 satellite overflights per year. The results discussed the different verification schemes needed to review the forecasting skills of the operational system. The good agreement between measured and modeled data supports the expectation that the wave model provides reasonable results and that the satellite data are of good quality and offer an appropriate validation alternative to buoy measurements. This is the required step towards further use of these satellite data for assimilation into the wave fields to improve the wave predictions. Additional support for the good quality of the wave predictions is provided by comparisons between ADCP measurements, which are available for a short time period in February 2012, and the corresponding model results at a location near the Bulgarian coast in the western Black Sea. Sensitivity tests with different wave model options and different driving wind fields have been done, which identify the appropriate model configuration that provides the best wave predictions. In addition to the comparisons between measured

  6. Turbulent flow as a cause for underestimating coronary flow reserve measured by Doppler guide wire

    Directory of Open Access Journals (Sweden)

    Richartz Barbara M

    2006-03-01

    Full Text Available Abstract Background Doppler-tipped coronary guide wires (FW) are well-established tools in interventional cardiology for the quantitative analysis of coronary blood flow. Doppler wires are used to measure the coronary flow velocity reserve (CFVR). The CFVR remains reduced in some patients despite anatomically successful coronary angioplasty. The aim of our study was to test, in vitro, the influence of changes in flow profile on the validity of intra-coronary Doppler flow velocity measurements. It is still unclear whether turbulent flow in coronary arteries is of importance for physiologic studies in vivo. Methods We perfused glass pipes of defined inner diameters (1.5-5.5 mm) with heparinized blood in a pulsatile flow model. Laminar and turbulent flow profiles were achieved by varying the flow velocity. The average peak velocity (APV) was recorded using 0.014-inch FW. Flow velocity measurements were also performed in 75 patients during coronary angiography. Coronary hyperemia was induced by intra-coronary injection of adenosine. The maximum APV was taken for further analysis. The mean luminal diameter of the coronary artery at the region of flow velocity measurement was calculated by quantitative angiography in two orthogonal planes. Results In vitro, the measured APV multiplied by the luminal area showed a significant correlation with the given perfusion volumes for all diameters under laminar flow conditions (r² > 0.85). Above a critical Reynolds number of 500, indicating turbulent flow, the volume calculation derived from FW velocity measurement underestimated the actual rate of perfusion by up to 22.5% (13 ± 4.6%). In vivo, the hyperemic APV was measured irrespective of the inherent deviation towards lower velocities. In 15 of 75 patients (20%), the maximum APV exceeded the velocity of the critical Reynolds number determined by the in vitro experiments. Conclusion Doppler guide wires are a valid tool for exact measurement of coronary flow
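
The critical Reynolds number of 500 cited above can be put in context with a one-line calculation. The density and viscosity figures below are typical textbook values for blood, not measurements from the study.

```python
# Hedged sketch: Reynolds number Re = rho * v * D / mu for blood flow.
# rho ~ 1060 kg/m^3 and mu ~ 3.5e-3 Pa*s are generic literature values.

def reynolds(velocity_m_s, diameter_m, rho=1060.0, mu=3.5e-3):
    return rho * velocity_m_s * diameter_m / mu

# A 3 mm vessel at 0.5 m/s sits just below the critical value of 500:
print(round(reynolds(0.5, 0.003)))  # -> 454
```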

  7. A comprehensive model for executing knowledge management audits in organizations: a systematic review.

    Science.gov (United States)

    Shahmoradi, Leila; Ahmadi, Maryam; Sadoughi, Farahnaz; Piri, Zakieh; Gohari, Mahmood Reza

    2015-01-01

    A knowledge management audit (KMA) is the first phase in knowledge management implementation. Incomplete or incomprehensive execution of the KMA has caused many knowledge management programs to fail. A study was undertaken to investigate how KMAs are performed systematically in organizations and to present a comprehensive model for performing KMAs based on a systematic review. Studies were identified by searching electronic databases such as Emerald, LISA, and the Cochrane Library and e-journals such as the Oxford Journal, and by hand searching of printed journals, theses, and books in the Tehran University of Medical Sciences digital library. The sources used in this study consisted of studies available through the digital library of the Tehran University of Medical Sciences that were published between 2000 and 2013, including both Persian- and English-language sources, as well as articles explaining the steps involved in performing a KMA. A comprehensive model for KMAs is presented in this study. To successfully execute a KMA, it is necessary to perform the appropriate preliminary activities in relation to the knowledge management infrastructure, determine the knowledge management situation, and analyze and use the available data on this situation.

  8. Systematic Geometric Error Modeling for Workspace Volumetric Calibration of a 5-axis Turbine Blade Grinding Machine

    Institute of Scientific and Technical Information of China (English)

    Abdul Wahid Khan; Chen Wuyi

    2010-01-01

    A systematic geometric model has been presented for calibration of a newly designed 5-axis turbine blade grinding machine. This machine is designed to serve a specific purpose: to attain high-accuracy, high-efficiency grinding of turbine blades by eliminating the hand grinding process. Although its topology is RPPPR (P: prismatic; R: rotary), its design is quite distinct from competitive machine tools. As error quantification is the only way to investigate, maintain and improve its accuracy, calibration is recommended for its performance assessment and acceptance testing. A systematic geometric error modeling technique is implemented and 52 position-dependent and position-independent errors are identified while considering the machine as five rigid bodies, eliminating the set-up errors of workpiece and cutting tool. 39 of them are found to be influential errors and are accommodated for finding the resultant effect between the cutting tool and the workpiece in the workspace volume. Rigid body kinematics techniques and homogeneous transformation matrices are used for error synthesis.
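
    As a toy illustration of the homogeneous-transformation-matrix technique named above (not the paper's 52-error model), one can compose a nominal axis move with small translational and angular error terms and read off the resulting tool-point position. All error values here are invented.

```python
def matmul4(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """HTM for a pure translation."""
    return [[1.0, 0.0, 0.0, x],
            [0.0, 1.0, 0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

def small_rotation(ex, ey, ez):
    """First-order HTM for small angular errors (radians) about x, y, z."""
    return [[1.0, -ez, ey, 0.0],
            [ez, 1.0, -ex, 0.0],
            [-ey, ex, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical errors on a nominal 100 mm X move: a 0.01 mm straightness
# error in Y and a 1e-5 rad yaw error (values invented for illustration).
nominal = translation(100.0, 0.0, 0.0)
error = matmul4(translation(0.0, 0.01, 0.0), small_rotation(0.0, 0.0, 1e-5))
actual = matmul4(nominal, error)

# Position of a tool point at the axis origin under the actual transform.
tool = [0.0, 0.0, 0.0, 1.0]
pos = [sum(actual[i][j] * tool[j] for j in range(4)) for i in range(4)]
```

    A full error synthesis chains one such error HTM per joint, which is how the 39 influential errors are propagated to the tool-workpiece interface.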

  9. A systematic review of models to predict recruitment to multicentre clinical trials

    Directory of Open Access Journals (Sweden)

    Cook Andrew

    2010-07-01

    Full Text Available Abstract Background Less than one third of publicly funded trials managed to recruit according to their original plan, often resulting in requests for additional funding and/or time extensions. The aim was to identify models that might be useful to a major public funder of randomised controlled trials when estimating likely time requirements for recruiting trial participants. The requirements of a useful model were identified as: usability, being based on experience, the ability to reflect time trends, accounting for centre recruitment, and contribution to a commissioning decision. Methods A systematic review of English-language articles using MEDLINE and EMBASE. Search terms included: randomised controlled trial, patient, accrual, predict, enrol, models, statistical; Bayes Theorem; Decision Theory; Monte Carlo Method and Poisson. Only studies discussing prediction of recruitment to trials using a modelling approach were included. Information was extracted from articles by one author, and checked by a second, using a pre-defined form. Results Out of 326 identified abstracts, only 8 met all the inclusion criteria. Across these 8 studies, five major classes of model are discussed: the unconditional model, the conditional model, the Poisson model, Bayesian models and Monte Carlo simulation of Markov models. None of these meets all the pre-identified needs of the funder. Conclusions To meet the needs of a number of research programmes, a new model is required as a matter of importance. Any model chosen should be validated against both retrospective and prospective data, to ensure that the predictions it gives are superior to those currently used.
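
    A minimal sketch of the Poisson class of model identified in the review: each open centre recruits as a Poisson process with a common monthly rate, and Monte Carlo simulation estimates when the recruitment target is reached. The centre count, rate, and target are hypothetical.

```python
import math
import random

def simulate_recruitment(n_centres=20, rate_per_centre_per_month=2.5,
                         target=600, max_months=60, seed=None):
    """Return the first month in which the target sample size is reached,
    or None if it is not reached within max_months."""
    rng = random.Random(seed)
    threshold = math.exp(-rate_per_centre_per_month)
    total = 0
    for month in range(1, max_months + 1):
        for _ in range(n_centres):
            # Poisson draw via Knuth's multiplication method
            # (fine for small rates).
            k, p = 0, rng.random()
            while p > threshold:
                p *= rng.random()
                k += 1
            total += k
        if total >= month * 0:  # no early exit within the month loop
            pass
        if total >= target:
            return month
    return None

# Hypothetical trial: 20 centres at 2.5 patients/centre/month, target 600.
month_reached = simulate_recruitment(seed=42)
```

    Repeating the simulation many times yields a distribution of completion times, which is more useful to a funder than a single point estimate.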

  10. Immortalized endothelial cell lines for in vitro blood-brain barrier models: A systematic review.

    Science.gov (United States)

    Rahman, Nurul Adhwa; Rasil, Alifah Nur'ain Haji Mat; Meyding-Lamade, Uta; Craemer, Eva Maria; Diah, Suwarni; Tuah, Ani Afiqah; Muharram, Siti Hanna

    2016-07-01

    Endothelial cells play the most important role in the construction of the blood-brain barrier. Many studies have opted to use commercially available, easily transfected or immortalized endothelial cell lines as in vitro blood-brain barrier models. Numerous endothelial cell lines are available, but we do not currently have strong evidence for which cell lines are optimal for establishing such models. This review aimed to investigate the application of immortalized endothelial cell lines as in vitro blood-brain barrier models. The databases used for this review were PubMed, OVID MEDLINE, ProQuest, ScienceDirect, and SpringerLink. A narrative systematic review was conducted and identified 155 studies. As a result, 36 immortalized endothelial cell lines of human, mouse, rat, porcine and bovine origins were found for the establishment of in vitro blood-brain barrier and brain endothelium models. This review provides a summary of immortalized endothelial cell lines as a guideline for future studies and improvements in the establishment of in vitro blood-brain barrier models. It is important to establish a good and reproducible model with the potential for multiple applications, particularly for a compartment as complex as the blood-brain barrier.

  11. A critical comparison of systematic calibration protocols for activated sludge models: a SWOT analysis.

    Science.gov (United States)

    Sin, Gürkan; Van Hulle, Stijn W H; De Pauw, Dirk J W; van Griensven, Ann; Vanrolleghem, Peter A

    2005-07-01

    Modelling activated sludge systems has gained an increasing momentum after the introduction of activated sludge models (ASMs) in 1987. Application of dynamic models for full-scale systems requires essentially a calibration of the chosen ASM to the case under study. Numerous full-scale model applications have been performed so far which were mostly based on ad hoc approaches and expert knowledge. Further, each modelling study has followed a different calibration approach: e.g. different influent wastewater characterization methods, different kinetic parameter estimation methods, different selection of parameters to be calibrated, different priorities within the calibration steps, etc. In short, there was no standard approach in performing the calibration study, which makes it difficult, if not impossible, to (1) compare different calibrations of ASMs with each other and (2) perform internal quality checks for each calibration study. To address these concerns, systematic calibration protocols have recently been proposed to bring guidance to the modeling of activated sludge systems and in particular to the calibration of full-scale models. In this contribution four existing calibration approaches (BIOMATH, HSG, STOWA and WERF) will be critically discussed using a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis. It will also be assessed in what way these approaches can be further developed in view of further improving the quality of ASM calibration. In this respect, the potential of automating some steps of the calibration procedure by use of mathematical algorithms is highlighted.

  12. Health literacy and public health: A systematic review and integration of definitions and models

    LENUS (Irish Health Repository)

    Sorensen, Kristine

    2012-01-25

    Abstract Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.

  13. Health literacy and public health: A systematic review and integration of definitions and models

    Directory of Open Access Journals (Sweden)

    Sørensen Kristine

    2012-01-01

    Full Text Available Abstract Background Health literacy concerns the knowledge and competences of persons to meet the complex demands of health in modern society. Although its importance is increasingly recognised, there is no consensus about the definition of health literacy or about its conceptual dimensions, which limits the possibilities for measurement and comparison. The aim of the study is to review definitions and models on health literacy to develop an integrated definition and conceptual model capturing the most comprehensive evidence-based dimensions of health literacy. Methods A systematic literature review was performed to identify definitions and conceptual frameworks of health literacy. A content analysis of the definitions and conceptual frameworks was carried out to identify the central dimensions of health literacy and develop an integrated model. Results The review resulted in 17 definitions of health literacy and 12 conceptual models. Based on the content analysis, an integrative conceptual model was developed containing 12 dimensions referring to the knowledge, motivation and competencies of accessing, understanding, appraising and applying health-related information within the healthcare, disease prevention and health promotion setting, respectively. Conclusions Based upon this review, a model is proposed integrating medical and public health views of health literacy. The model can serve as a basis for developing health literacy enhancing interventions and provide a conceptual basis for the development and validation of measurement tools, capturing the different dimensions of health literacy within the healthcare, disease prevention and health promotion settings.

  14. Transition between process models (BPMN) and service models (WS-BPEL and other standards): A systematic review

    Directory of Open Access Journals (Sweden)

    Marko Jurišić

    2011-12-01

    Full Text Available BPMN and BPEL have become de facto standards for the modeling of business processes and the implementation of business processes via Web services. There is a quintessential problem of discrepancy between these two approaches, as they are applied in different phases of the lifecycle and their fundamental concepts are different: BPMN is a graph-based language while BPEL is basically a block-based programming language. This paper shows basic concepts and gives an overview of research and ideas that emerged during the last two years, presents the state of the art and possible future research directions. A systematic literature review was performed and a critical review was given regarding the potential of the proposed solutions.

  15. SDU: A Semidefinite Programming-Based Underestimation Method for Stochastic Global Optimization in Protein Docking.

    Science.gov (United States)

    Paschalidis, Ioannis Ch; Shen, Yang; Vakili, Pirooz; Vajda, Sandor

    2007-04-01

    This paper introduces a new stochastic global optimization method targeting protein-protein docking problems, an important class of problems in computational structural biology. The method is based on finding general convex quadratic underestimators to the funnel-like binding energy function. Finding the optimal underestimator requires solving a semidefinite programming problem, hence the name semidefinite programming-based underestimation (SDU). The underestimator is used to bias sampling in the search region. It is established that under appropriate conditions SDU locates the global energy minimum with probability approaching one as the sample size grows. A detailed comparison of SDU with the related convex global underestimator (CGU) method, and computational results for protein-protein docking problems, are provided.
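
    A one-dimensional sketch of the underestimation idea follows. The actual SDU method fits a general convex quadratic via semidefinite programming; here, for illustration only, a 1-D quadratic is least-squares fitted, its curvature clamped to be nonnegative, and the whole curve shifted down until it underestimates every sample.

```python
def quadratic_underestimator(xs, fs):
    """Fit q(x) = a*x^2 + b*x + c with a >= 0 and q(x_i) <= f_i for all i."""
    # Least-squares normal equations for a quadratic fit.
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(f * x ** k for x, f in zip(xs, fs)) for k in range(3)]
    M = [[S[4], S[3], S[2]], [S[3], S[2], S[1]], [S[2], S[1], S[0]]]
    rhs = [T[2], T[1], T[0]]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    D = det3(M)
    cols = [[[rhs[i] if k == j else M[i][k] for k in range(3)]
             for i in range(3)] for j in range(3)]
    a, b, c = (det3(cols[j]) / D for j in range(3))
    a = max(a, 0.0)  # clamp curvature: enforce convexity
    # Shift down so the quadratic never exceeds any sample.
    shift = max(a * x * x + b * x + c - f for x, f in zip(xs, fs))
    return a, b, c - shift

# Funnel-like samples of a rugged 1-D "energy" with its basin near x = 1.
xs = [-2, -1, 0, 1, 2, 3]
fs = [9.5, 4.2, 1.3, 0.0, 1.1, 4.3]
a, b, c = quadratic_underestimator(xs, fs)
x_min = -b / (2 * a)  # predicted basin location, used to bias sampling
```

    In SDU the minimizer of the underestimator plays exactly this role: it points at the basin of the funnel, and new samples are drawn preferentially around it.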

  16. Systematics of nuclear densities, deformations and excitation energies within the context of the generalized rotation-vibration model

    Energy Technology Data Exchange (ETDEWEB)

    Chamon, L.C., E-mail: luiz.chamon@dfn.if.usp.b [Departamento de Fisica Nuclear, Instituto de Fisica da Universidade de Sao Paulo, Caixa Postal 66318, 05315-970, Sao Paulo, SP (Brazil); Carlson, B.V. [Departamento de Fisica, Instituto Tecnologico de Aeronautica, Centro Tecnico Aeroespacial, Sao Jose dos Campos, SP (Brazil)

    2010-11-30

    We present a large-scale systematics of charge densities, excitation energies and deformation parameters for hundreds of heavy nuclei. The systematics is based on a generalized rotation-vibration model for the quadrupole and octupole modes and takes into account second-order contributions of the deformations as well as the effects of finite diffuseness values for the nuclear densities. We compare our results with the predictions of classical surface vibrations in the hydrodynamical approximation.

  17. Systematic study of 16O-induced fusions with the improved quantum molecular dynamics model

    CERN Document Server

    Wang, Ning; Li, Zhuxia

    2014-01-01

    The heavy-ion fusion reactions with 16O bombarding on 62Ni, 65Cu, 74Ge, 148Nd, 180Hf, 186W, 208Pb, 238U are systematically investigated with the improved quantum molecular dynamics (ImQMD) model. The fusion cross sections at energies near and above the Coulomb barriers can be reasonably well reproduced by using this semi-classical microscopic transport model with the parameter sets SkP* and IQ3a. The dynamical nucleus-nucleus potentials and the influence of Fermi constraint on the fusion process are also studied simultaneously. In addition to the mean field, the Fermi constraint also plays a key role for the reliable description of fusion process and for improving the stability of fragments in heavy-ion collisions.

  18. Systematic model researches on the stability limits of the DVL series of float designs

    Science.gov (United States)

    Sottorf, W.

    1949-01-01

    To determine the trim range in which a seaplane can take off without porpoising, stability tests were made of a Plexiglas model, composed of float, wing, and tailplane, which corresponded to a full-size research airplane. The model and full-size stability limits are in good agreement. After all structural parts pertaining to the air frame were removed gradually, the aerodynamic forces replaced by weight forces, and the moment of inertia and position of the center of gravity changed, no marked change of limits of the stable zone was noticeable. The latter, therefore, is for practical purposes affected only by hydrodynamic phenomena. The stability limits of the DVL family of floats were determined by a systematic investigation independent of any particular sea-plane design, thus a seaplane may be designed to give a run free from porpoising.

  19. Religion and Spirituality's Influences on HIV Syndemics Among MSM: A Systematic Review and Conceptual Model.

    Science.gov (United States)

    Lassiter, Jonathan M; Parsons, Jeffrey T

    2016-02-01

    This paper presents a systematic review of the quantitative HIV research that assessed the relationships between religion, spirituality, HIV syndemics, and individual HIV syndemics-related health conditions (e.g. depression, substance abuse, HIV risk) among men who have sex with men (MSM) in the United States. No quantitative studies were found that assessed the relationships between HIV syndemics, religion, and spirituality. Nine studies, with 13 statistical analyses, were found that examined the relationships between individual HIV syndemics-related health conditions, religion, and spirituality. Among the 13 analyses, religion and spirituality were found to have mixed relationships with HIV syndemics-related health conditions (6 nonsignificant associations; 5 negative associations; 2 positive associations). Given the overall lack of inclusion of religion and spirituality in HIV syndemics research, a conceptual model that hypothesizes the potential interactions of religion and spirituality with HIV syndemics-related health conditions is presented. The implications of the model for MSM's health are outlined.

  20. Hubble Frontier Fields: systematic errors in strong lensing models of galaxy clusters - implications for cosmography

    Science.gov (United States)

    Acebron, Ana; Jullo, Eric; Limousin, Marceau; Tilquin, André; Giocoli, Carlo; Jauzac, Mathilde; Mahler, Guillaume; Richard, Johan

    2017-09-01

    Strong gravitational lensing by galaxy clusters is a fundamental tool to study dark matter and constrain the geometry of the Universe. Recently, the Hubble Space Telescope Frontier Fields programme has allowed a significant improvement of mass and magnification measurements, but lensing models still have a residual root mean square between 0.2 arcsec and a few arcseconds, not yet completely understood. Systematic errors have to be better understood and treated in order to use strong lensing clusters as reliable cosmological probes. We have analysed two simulated Hubble-Frontier-Fields-like clusters from the Hubble Frontier Fields Comparison Challenge, Ares and Hera. We use several estimators (relative bias on magnification, density profiles, ellipticity and orientation) to quantify the goodness of our reconstructions by comparing our multiple models, optimized with the parametric software lenstool, with the input models. We have quantified the impact of systematic errors arising, first, from the choice of different density profiles and configurations and, secondly, from the availability of constraints (spectroscopic or photometric redshifts, redshift ranges of the background sources) in the parametric modelling of strong lensing galaxy clusters and therefore on the retrieval of cosmological parameters. We find that substructures in the outskirts have a significant impact on the position of the multiple images, yielding tighter cosmological contours. The need for wide-field imaging around massive clusters is thus reinforced. We show that competitive cosmological constraints can also be obtained with complex multimodal clusters and that photometric redshifts improve the constraints on cosmological parameters when considering a narrow range of (spectroscopic) redshifts for the sources.

  1. Organization Domain Modeling (ODM): Extending systematic D-AME beyond software domains

    Energy Technology Data Exchange (ETDEWEB)

    Simos, M.A. [Organon Motives, Inc., Belmont, MA (United States)

    1996-12-31

    The emerging discipline of domain analysis, modeling, and engineering, or D-AME, has received most attention from the field of systematic software reuse, where the term "domain" usually denotes a well-scoped area of functionality within a set or class of software systems. A central challenge in D-AME research has been in defining processes and representations sufficiently general to apply in the diverse organizational and technical environments in which D-AME can make a useful contribution. The systematic reuse community has established ambitious goals for what a D-AME process should address, such as the ability to support design for reuse for all products and processes of the software life cycle, and applicability beyond software domains: e.g., to domains such as business processes, product variability models, or more generally, domains of shared knowledge about particular technical areas of expertise. In practice, though, the search for generalized domain analysis processes and methods has been fraught with difficulties. Obstacles include: adoption of a too-narrow conception of the nature of "domains"; tight coupling of D-AME process and methods with software engineering representations; and a consequent lack of understanding of the unique aspects of D-AME as a qualitative process. This paper discusses the goals for the extensibility of D-AME, the primary barriers to achieving these goals, and specific features of the Organization Domain Modeling (ODM) methodology that address these issues. ODM is structured as a core life cycle process model which is broadly applicable to diverse domains and organizational contexts. The core process is augmented by a set of supporting methods which facilitate tailorability, for example, by encapsulating commitments to specific software design representations and processes.

  2. Behavioural change models for infectious disease transmission: a systematic review (2010–2015)

    Science.gov (United States)

    2016-01-01

    We review behavioural change models (BCMs) for infectious disease transmission in humans. Following the Cochrane collaboration guidelines and the PRISMA statement, our systematic search and selection yielded 178 papers covering the period 2010–2015. We observe an increasing trend in published BCMs, frequently coupled to (re)emergence events, and propose a categorization by distinguishing how information translates into preventive actions. Behaviour is usually captured by introducing information as a dynamic parameter (76/178) or by introducing an economic objective function, either with (26/178) or without (37/178) imitation. Approaches using information thresholds (29/178) and exogenous behaviour formation (16/178) are also popular. We further classify according to disease, prevention measure, transmission model (with 81/178 population, 6/178 metapopulation and 91/178 individual-level models) and the way prevention impacts transmission. We highlight the minority (15%) of studies that use any real-life data for parametrization or validation and note that BCMs increasingly use social media data and generally incorporate multiple sources of information (16/178), multiple types of information (17/178) or both (9/178). We conclude that individual-level models are increasingly used and useful to model behaviour changes. Despite recent advancements, we remain concerned that most models are purely theoretical and lack representative data and a validation process. PMID:28003528
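
    The most common construction counted above (information entering as a dynamic parameter) can be sketched with an SIR model whose transmission rate falls as prevalence-driven awareness rises. All parameter values below are hypothetical, chosen only to show the qualitative effect.

```python
def sir_with_awareness(beta=0.3, gamma=0.1, k=50.0, days=300, dt=0.1):
    """Euler-integrated SIR in which the effective transmission rate is
    beta / (1 + k*I): higher prevalence I means more cautious behaviour.
    Returns (peak prevalence, final recovered fraction)."""
    s, i, r = 0.999, 0.001, 0.0
    peak = i
    for _ in range(int(days / dt)):
        beta_eff = beta / (1.0 + k * i)  # awareness reduces contacts
        new_inf = beta_eff * s * i * dt
        new_rec = gamma * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return peak, r

peak_aware, final_aware = sir_with_awareness(k=50.0)  # with behaviour change
peak_naive, final_naive = sir_with_awareness(k=0.0)   # classical SIR
```

    Comparing the two runs reproduces the generic finding of this literature: behavioural feedback flattens the epidemic peak and reduces the final attack rate, at the cost of a longer epidemic.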

  3. A systematic review of predictive models for asthma development in children.

    Science.gov (United States)

    Luo, Gang; Nkoy, Flory L; Stone, Bryan L; Schmick, Darell; Johnson, Michael D

    2015-11-28

    Asthma is the most common pediatric chronic disease affecting 9.6 % of American children. Delay in asthma diagnosis is prevalent, resulting in suboptimal asthma management. To help avoid delay in asthma diagnosis and advance asthma prevention research, researchers have proposed various models to predict asthma development in children. This paper reviews these models. A systematic review was conducted through searching in PubMed, EMBASE, CINAHL, Scopus, the Cochrane Library, the ACM Digital Library, IEEE Xplore, and OpenGrey up to June 3, 2015. The literature on predictive models for asthma development in children was retrieved, with search results limited to human subjects and children (birth to 18 years). Two independent reviewers screened the literature, performed data extraction, and assessed article quality. The literature search returned 13,101 references in total. After manual review, 32 of these references were determined to be relevant and are discussed in the paper. We identify several limitations of existing predictive models for asthma development in children, and provide preliminary thoughts on how to address these limitations. Existing predictive models for asthma development in children have inadequate accuracy. Efforts to improve these models' performance are needed, but are limited by a lack of a gold standard for asthma development in children.

  4. A systematic narrative review of consumer-directed care for older people: implications for model development.

    Science.gov (United States)

    Ottmann, Goetz; Allen, Jacqui; Feldman, Peter

    2013-11-01

    Consumer-directed care is increasingly becoming a mainstream option in community-based aged care. However, a systematic review describing how the current evaluation research translates into practise has not been published to date. This review aimed to systematically establish an evidence base of user preferences for and satisfaction with services associated with consumer-directed care programmes for older people. Twelve databases were searched, including MedLine, BioMed Central, Cinahl, Expanded Academic ASAP, PsychInfo, ProQuest, Age Line, Science Direct, Social Citation Index, Sociological Abstracts, Web of Science and the Cochrane Library. Google Scholar and Google were also searched. Eligible studies were those reporting on choice, user preferences and service satisfaction outcomes regarding a programme or model of home-based care in the United States or United Kingdom. This systematic narrative review retrieved literature published from January 1992 to August 2011. A total of 277 references were identified. Of these, 17 met the selection criteria and were reviewed. Findings indicate that older people report varying preferences for consumer-directed care, with some demonstrating limited interest. Clients and carers reported good service satisfaction. However, research comparing user preferences across countries or investigating how ecological factors shape user preferences has received limited attention. Policy-makers and practitioners need to carefully consider the diverse contexts, needs and preferences of older adults in adopting consumer-directed care approaches in community aged care. The review calls for the development of consumer-directed care programmes offering a broad range of options that allow for personalisation and greater control over services without necessarily transferring administrative responsibilities to service users. Review findings suggest that consumer-directed care approaches have the potential to empower older people.

  5. Endogenous opioid antagonism in physiological experimental pain models: a systematic review.

    Science.gov (United States)

    Werner, Mads U; Pereira, Manuel P; Andersen, Lars Peter H; Dahl, Jørgen B

    2015-01-01

    Opioid antagonists are pharmacological tools applied as an indirect measure to detect activation of the endogenous opioid system (EOS) in experimental pain models. The objective of this systematic review was to examine the effect of mu-opioid-receptor (MOR) antagonists in placebo-controlled, double-blind studies using 'inhibitory' or 'sensitizing', physiological test paradigms in healthy human subjects. The databases PubMed and Embase were searched according to predefined criteria. Out of a total of 2,142 records, 63 studies (1,477 subjects [male/female ratio = 1.5]) were considered relevant. Twenty-five studies utilized 'inhibitory' test paradigms (ITP) and 38 studies utilized 'sensitizing' test paradigms (STP). The ITP-studies were characterized as conditioning modulation models (22 studies) and repetitive transcranial magnetic stimulation models (rTMS; 3 studies), and the STP-studies as secondary hyperalgesia models (6 studies), 'pain' models (25 studies), summation models (2 studies), nociceptive reflex models (3 studies) and miscellaneous models (2 studies). A consistent reversal of analgesia by a MOR-antagonist was demonstrated in 10 of the 25 ITP-studies, including stress-induced analgesia and rTMS. In the remaining 14 conditioning modulation studies, either an absence of effects or ambiguous effects of MOR-antagonists was observed. In the STP-studies, no effect of the opioid-blockade could be demonstrated in 5 out of 6 secondary hyperalgesia studies. The direction of MOR-antagonist dependent effects upon pain ratings, threshold assessments and somatosensory evoked potentials (SSEP) did not appear consistent in 28 out of 32 'pain' model studies. In conclusion, only in 2 experimental human pain models, i.e., stress-induced analgesia and rTMS, did administration of a MOR-antagonist demonstrate a consistent effect, presumably mediated by EOS-dependent mechanisms of analgesia and hyperalgesia.

  6. Heavy rainfall: An underestimated environmental risk for buildings?

    Directory of Open Access Journals (Sweden)

    Golz Sebastian

    2016-01-01

    Second, heavy rain may result in urban pluvial flooding due to sewer overflow that causes severe damage to buildings. A comprehensive study of the impacts and the consequences in Dresden (Germany), presented in the paper, revealed that the potential risks of flooding from sewers due to hydraulic overload can be estimated on the building scale using the model approach IVART (Integrated Spatial Vulnerability and Risk Assessment Tool). Modelling results provide the basis to quantify the effectiveness and efficiency of flood resilience technologies.

  7. Systematic review of risk adjustment models of hospital length of stay (LOS).

    Science.gov (United States)

    Lu, Mingshan; Sajobi, Tolulope; Lucyk, Kelsey; Lorenzetti, Diane; Quan, Hude

    2015-04-01

    Policy decisions in health care, such as hospital performance evaluation and performance-based budgeting, require an accurate prediction of hospital length of stay (LOS). This paper provides a systematic review of risk adjustment models for hospital LOS, and focuses primarily on studies that use administrative data. MEDLINE, EMBASE, Cochrane, PubMed, and EconLit were searched for studies that tested the performance of risk adjustment models in predicting hospital LOS. We included studies that tested models developed for the general inpatient population, and excluded those that analyzed risk factors only correlated with LOS, impact analyses, or those that used disease-specific scales and indexes to predict LOS. Our search yielded 3973 abstracts, of which 37 were included. These studies used various disease groupers and severity/morbidity indexes to predict LOS. Few models were developed specifically for explaining hospital LOS; most focused primarily on explaining resource spending and the costs associated with hospital LOS, and applied these models to hospital LOS. We found a large variation in predictive power across different LOS predictive models. The best model performance for most studies fell in the range of 0.30-0.60, approximately. The current risk adjustment methodologies for predicting LOS are still limited in terms of models, predictors, and predictive power. One possible approach to improving the performance of LOS risk adjustment models is to include more disease-specific variables, such as disease-specific or condition-specific measures, and functional measures. For this approach, however, more comprehensive and standardized data are urgently needed. In addition, statistical methods and evaluation tools more appropriate to LOS should be tested and adopted.
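
    The predictive-power figures quoted above are typically R² values. As a self-contained illustration (synthetic data and a single made-up severity predictor, nothing from the reviewed studies), fitting a one-variable linear model for LOS and computing R² looks like this:

```python
def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    ss_tot = sum((a - mean_y) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

# Synthetic data: LOS in days, loosely driven by an invented severity score.
severity = [1, 2, 2, 3, 4, 4, 5, 6, 7, 8]
los = [2, 3, 4, 4, 6, 5, 8, 9, 10, 13]

# Ordinary least squares for los ~ intercept + slope * severity.
n = len(severity)
sx, sy = sum(severity), sum(los)
sxx = sum(x * x for x in severity)
sxy = sum(x * y for x, y in zip(severity, los))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
pred = [intercept + slope * x for x in severity]
r2 = r_squared(los, pred)
```

    The deliberately clean synthetic data give an R² far above the 0.30-0.60 range reported for real administrative data, which underlines how much unexplained variation real LOS models face.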

  8. SModelS: A Tool for Making Systematic Use of Simplified Models Results

    Science.gov (United States)

    Waltenberger, Wolfgang; SModelS Group.

    2016-10-01

    We present an automated software tool "SModelS" to systematically confront theories Beyond the Standard Model (BSM) with experimental data. The tool consists of a general procedure to decompose such BSM theories into their Simplified Models Spectra (SMS). In addition, SModelS features a database containing the majority of the published SMS results of CMS and ATLAS. These results consist of the 95% confidence level upper limits on signal production cross sections. The two components together allow us to quickly confront any BSM model with LHC results. As a showcase example we briefly discuss an application of our procedure to a specific supersymmetric model. It is one of our ongoing efforts to extend the framework to also include efficiency maps produced either by the experimental collaborations, by efforts performed within the phenomenological groups, or possibly also by ourselves. While the current implementation can handle null results only, it is our ultimate goal to build the Next Standard Model in a bottom-up fashion from both negative and positive results of several experiments. The implementation is open source, written in Python, and available from http://smodels.hephy.at.

  9. A systematic approach for scale-down model development and characterization of commercial cell culture processes.

    Science.gov (United States)

    Li, Feng; Hashimura, Yasunori; Pendleton, Robert; Harms, Jean; Collins, Erin; Lee, Brian

    2006-01-01

    The objective of process characterization is to demonstrate robustness of manufacturing processes by understanding the relationship between key operating parameters and final performance. Technical information from the characterization study is important for subsequent process validation, and this has become a regulatory expectation in recent years. Since performing the study at the manufacturing scale is not practically feasible, development of scale-down models that represent the performance of the commercial process is essential to achieve reliable process characterization. In this study, we describe a systematic approach to develop a bioreactor scale-down model and to characterize a cell culture process for recombinant protein production in CHO cells. First, a scale-down model using 2-L bioreactors was developed on the basis of the 2000-L commercial scale process. Profiles of cell growth, productivity, product quality, culture environments (pH, DO, pCO2), and level of metabolites (glucose, glutamine, lactate, ammonia) were compared between the two scales to qualify the scale-down model. The key operating parameters were then characterized in single-parameter ranging studies and an interaction study using this scale-down model. Appropriate operation ranges and acceptance criteria for certain key parameters were determined to ensure the success of process validation and the process performance consistency. The process worst-case condition was also identified through the interaction study.

  10. A systematic composite service design modeling method using graph-based theory.

    Directory of Open Access Journals (Sweden)

    Arafat Abdulgader Mohammed Elhag

    Full Text Available The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides the mathematical representation of the components of service-oriented design using the graph-based theory to facilitate the design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate the complexity and reusability of ComSDM. This also guides the future research towards the design quality measurement such as using the ComSDM method to measure the quality of composite service design in a service-oriented software system.

  11. A systematic composite service design modeling method using graph-based theory.

    Science.gov (United States)

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides the mathematical representation of the components of service-oriented design using the graph-based theory to facilitate the design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate the complexity and reusability of ComSDM. This also guides the future research towards the design quality measurement such as using the ComSDM method to measure the quality of composite service design in a service-oriented software system.

  12. Anxiety in the context of cancer: A systematic review and development of an integrated model.

    Science.gov (United States)

    Curran, Leah; Sharpe, Louise; Butow, Phyllis

    2017-08-01

    Anxiety is common in the context of cancer, but there are few theoretical models that apply to people with cancer across the trajectory of their illness. The aims of this review are to identify existing theories and to propose an integrated model of cancer-related anxiety. Using a systematic literature search of Medline, Premedline and PsycINFO databases, we identified nine theoretical models of anxiety in the context of cancer. We reviewed these for psychological concepts that fell under five themes: pre-existing schema, the inherent nature of cancer, cognitive factors, coping responses and contextual factors. From these themes, we integrated concepts from different models to develop a theoretical framework to explain the development and maintenance of anxiety in the context of cancer. The resulting model suggests that pre-existing schema, past experiences of cancer, an intolerance of uncertainty and meta-cognitive beliefs about worry interact with the inherent nature of cancer to produce overwhelming distress. The distress activates cognitive processes characterized by vigilance, worry and rumination. Attempts to cope by re-establishing control, and a pattern of vigilance to cancer-related cues and/or avoidance reinforce anxiety, in the context of a range of systemic factors that can either buffer against or worsen the anxiety. Copyright © 2017. Published by Elsevier Ltd.

  13. On the risk of severe dengue during secondary infection: A systematic review coupled with mathematical modeling

    Directory of Open Access Journals (Sweden)

    Kenji Mizumoto

    2014-08-01

    Full Text Available Background & objectives: The present study aimed to systematically quantify the well-known risk of severe dengue during secondary infection in the literature and to understand how epidemiological mechanisms of enhancement during secondary infection influence the empirically estimated risk of severe dengue by means of mathematical modeling. Methods: Two conditional risks of severe dengue, i.e. symptomatic illness and dengue hemorrhagic fever (DHF) or dengue shock syndrome (DSS), given secondary infection were explored based on systematically searched prospective studies. A two-strain epidemiological model was employed to simulate the transmission dynamics of dengue and to identify the relevant data gaps in empirical observations. Results: Using variance-based weighting, the pooled relative risk (RR) of symptomatic illness during secondary infection was estimated at 9.4 [95% confidence interval (CI): 6.1-14.4], and similarly, the RR of DHF/DSS was estimated to be 23.7 (95% CI: 15.3-36.9). A variation in the RR of DHF/DSS was observed among prospective studies. Using the mathematical modeling technique, we identified the duration of cross-protective immunity as an important modulator of the time-dependent behaviour of the RR of severe dengue. Different epidemiological mechanisms of enhancement during secondary infection yielded different RRs of severe dengue. Interpretation & conclusion: The optimal design of a prospective cohort study for dengue should be considered, accounting for the time-dependence in the RR during the course of a dengue epidemic. It is critical to statistically infer the duration of cross-protective immunity and clarify how the enhancement influences the epidemiological dynamics during secondary infection.
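The pooled RRs above come from variance-based (inverse-variance) weighting on the log scale. As an illustration of that standard meta-analytic step, a minimal Python sketch; the two study inputs below are invented for demonstration, not data from the review:

```python
import math

def pooled_relative_risk(rrs, cis, z=1.96):
    """Pool relative risks via inverse-variance weighting on the log scale.

    rrs: list of study-level relative risks
    cis: list of (lower, upper) 95% confidence bounds per study
    """
    log_rrs = [math.log(rr) for rr in rrs]
    # Standard error recovered from the width of the 95% CI on the log scale
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for lo, hi in cis]
    weights = [1.0 / se**2 for se in ses]
    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - z * pooled_se),
            math.exp(pooled_log + z * pooled_se))

# Two hypothetical studies (illustrative values only)
rr, lo, hi = pooled_relative_risk([8.0, 12.0], [(4.0, 16.0), (6.0, 24.0)])
```

Each study's weight is the inverse of its variance on the log-RR scale, so precisely estimated studies dominate the pooled estimate.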

  14. Mesenchymal Stromal Cells in Animal Bleomycin Pulmonary Fibrosis Models: A Systematic Review.

    Science.gov (United States)

    Srour, Nadim; Thébaud, Bernard

    2015-12-01

    Idiopathic pulmonary fibrosis is an inexorably progressive lung disease with few available treatments. New therapeutic options are needed. Stem cells have generated much enthusiasm for the treatment of several conditions, including lung diseases. Human trials of mesenchymal stromal cell (MSC) therapy for pulmonary fibrosis are under way. To shed light on the potential usefulness of MSCs for human disease, we aimed to systematically review the preclinical literature to determine if MSCs are beneficial in animal bleomycin pulmonary fibrosis models. The MEDLINE and Embase databases were searched for original studies of stem cell therapy in animal bleomycin models of pulmonary fibrosis. Studies using embryonic stem cells or induced pluripotent stem cells were excluded. Seventeen studies were selected, all of which used MSCs in rodents. MSC therapy led to an improvement in bleomycin-induced lung collagen deposition in animal lungs and in the pulmonary fibrosis Ashcroft score in most studies. MSC therapy improved histopathology in almost all studies in which it was evaluated qualitatively. Furthermore, MSC therapy was found to improve 14-day survival in animals with bleomycin-induced pulmonary fibrosis. Bronchoalveolar lavage total and neutrophil counts, as well as transforming growth factor-β levels, were also reduced by MSCs. MSCs are beneficial in rodent bleomycin pulmonary fibrosis models. Since most studies examined the initial inflammatory phase rather than the chronic fibrotic phase, preclinical data offer better support for human trials of MSCs in acute exacerbations of pulmonary fibrosis rather than the chronic phase of the disease. There has been increased interest in mesenchymal stromal cell therapy for lung diseases. A few small clinical trials are under way in idiopathic pulmonary fibrosis. Preclinical evidence was assessed in a systematic review, as is often done for clinical studies. The existing studies offer better support for efficacy in the initial

  15. Periodontal ligament-derived cells for periodontal regeneration in animal models: a systematic review.

    Science.gov (United States)

    Bright, R; Hynes, K; Gronthos, S; Bartold, P M

    2015-04-01

    Implantation of periodontal ligament stem cells is emerging as a potential periodontal regenerative procedure. This systematic review considers the evidence from animal models investigating the use of periodontal ligament stem cells for successful periodontal regeneration. PubMed, Embase, MEDLINE and Google Scholar were searched to December 2013 for quantitative studies examining the outcome of implanting periodontal ligament stem cells into experimental periodontal defects in animals. Inclusion criteria were: implantation of periodontal ligament stem cells into surgically created periodontal defects for periodontal regeneration; animal models only; source of cells either human or animal; and published in English. We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. From the literature search, 43 studies met the inclusion criteria. A wide variety of surgical defects were created in four species of animal (dog, rat, pig and sheep). Owing to wide variability in defect type, cell source and cell scaffold, no meta-analysis was possible. Outcome measures included new bone, new cementum and new connective tissue formation. In 70.5% of the results, statistically significant improvements in these measures were recorded. These results are notable in that they indicate that irrespective of the defect type and animal model used, periodontal ligament stem cell implantation can be expected to result in a beneficial outcome for periodontal regeneration. It is recommended that there is sufficient evidence from preclinical animal studies to warrant moving to human studies to examine the efficacy, safety, feasibility (autologous vs. allogeneic transplantation) and delivery of periodontal ligament stem cells for periodontal regeneration. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. [Skilled communication as "intervention" : Models for systematic communication in the healthcare system].

    Science.gov (United States)

    Weinert, M; Mayer, H; Zojer, E

    2015-02-01

    Specific communication training is currently not integrated into anesthesiology curricula. At the same time, communication is an important key factor when working with colleagues, in the physician-patient relationship, during management of emergencies and in avoiding or reducing the legal consequences of adverse medical events. Therefore, focused attention should be brought to this area. In other high-risk industries, specific communication training has long been standard, and in medicine there is an approach to teach and train these soft skills by simulation. Systematic communication training, however, is rarely an established component of specialist training. It is impossible not to communicate: nonverbal signals, such as gestures, facial expression, posture and tone, play an important part. Miscommunication, however, is common and leads to unproductive behavior. The cause of this is not always obvious. This article provides an overview of the communication models of Shannon, Watzlawick et al. and Schulz von Thun et al. and describes their limitations. The "Process Communication Model®" (PCM) is also introduced. An overview is provided with examples of how this tool can be used to look at the communication process from a systematic point of view. People have different psychological needs. Not taking care of these needs will result in individual stress behavior, which can be graded into first, second and third degrees of severity (driver behavior, mask behavior and desperation). These behavior patterns become exposed in predictable sequences. Furthermore, on the basis of this model, successful communication can be established while unproductive behavior that occurs during stress can be dealt with appropriately. Because of the importance of communication in all areas of medical care, opportunities exist to focus research on the influence of targeted communication on patient outcome, complications and management of emergencies.

  17. Prediction models for risk of developing type 2 diabetes : systematic literature search and independent external validation study

    NARCIS (Netherlands)

    Abbasi, Ali; Peelen, Linda M.; Corpeleijn, Eva; van der Schouw, Yvonne T.; Stolk, Ronald P.; Spijkerman, Annemieke M. W.; van der A, Daphne L.; Moons, Karel G. M.; Navis, Gerjan; Bakker, Stephan J. L.; Beulens, Joline W. J.

    2012-01-01

    Objective To identify existing prediction models for the risk of development of type 2 diabetes and to externally validate them in a large independent cohort. Data sources Systematic search of English, German, and Dutch literature in PubMed until February 2011 to identify prediction models for diabe

  18. A systematic approach to obtain validated partial least square models for predicting lipoprotein subclasses from serum NMR spectra

    NARCIS (Netherlands)

    Mihaleva, V.V.; Schalkwijk, van D.B.; Graaf, de A.A.; Duynhoven, van J.P.M.; Dorsten, van F.A.; Vervoort, J.J.M.; Smilde, A.K.; Westerhuis, J.A.; Jacobs, D.M.

    2014-01-01

    A systematic approach is described for building validated PLS models that predict cholesterol and triglyceride concentrations in lipoprotein subclasses in fasting serum from a normolipidemic, healthy population. The PLS models were built on diffusion-edited (1)H NMR spectra and calibrated on HPLC-de

  19. A systematic approach to obtain validated partial least square models for predicting lipoprotein subclasses from serum nmr spectra

    NARCIS (Netherlands)

    Mihaleva, V.V.; Schalkwijk, D.B. van; Graaf, A.A. de; Duynhoven, J. van; Dorsten, F.A. van; Vervoort, J.; Smilde, A.; Westerhuis, J.A.; Jacobs, D.M.

    2014-01-01

    A systematic approach is described for building validated PLS models that predict cholesterol and triglyceride concentrations in lipoprotein subclasses in fasting serum from a normolipidemic, healthy population. The PLS models were built on diffusion-edited 1H NMR spectra and calibrated on HPLC-deri

  20. Systematizing Web Search through a Meta-Cognitive, Systems-Based, Information Structuring Model (McSIS)

    Science.gov (United States)

    Abuhamdieh, Ayman H.; Harder, Joseph T.

    2015-01-01

    This paper proposes a meta-cognitive, systems-based, information structuring model (McSIS) to systematize online information search behavior based on a literature review of information-seeking models. The General Systems Theory's (GST) propositions serve as its framework. Factors influencing information-seekers, such as the individual learning…

  1. Community-wide model validation studies for systematic assessment of ionosphere-thermosphere models

    Science.gov (United States)

    Shim, Ja Soon; Kuznetsova, Maria; Rastätter, Lutz

    2016-07-01

    As an unbiased agent, the Community Coordinated Modeling Center (CCMC) has been leading community-wide model validation efforts: the GEM, CEDAR and GEM-CEDAR Modeling Challenges, held since 2009. The CEDAR ETI (Electrodynamics Thermosphere Ionosphere) Challenge focused on the ability of ionosphere-thermosphere (IT) models to reproduce basic IT system parameters, such as electron and neutral densities, NmF2, hmF2, and Total Electron Content (TEC). Model-data time series comparisons were performed for a set of selected events with different levels of geomagnetic activity (quiet, moderate, storms). The follow-on CEDAR-GEM Challenge aims to quantify geomagnetic storm impacts on the IT system. On-going studies include quantifying the storm energy input, such as increase in auroral precipitation and Joule heating, and quantifying the storm-time variations of neutral density and TEC. In this paper, we will present lessons learned from the Modeling Challenges led by the CCMC.

  2. Modeling of novel diagnostic strategies for active tuberculosis - a systematic review: current practices and recommendations.

    Directory of Open Access Journals (Sweden)

    Alice Zwerling

    Full Text Available The field of diagnostics for active tuberculosis (TB is rapidly developing. TB diagnostic modeling can help to inform policy makers and support complicated decisions on diagnostic strategy, with important budgetary implications. Demand for TB diagnostic modeling is likely to increase, and an evaluation of current practice is important. We aimed to systematically review all studies employing mathematical modeling to evaluate cost-effectiveness or epidemiological impact of novel diagnostic strategies for active TB.Pubmed, personal libraries and reference lists were searched to identify eligible papers. We extracted data on a wide variety of model structure, parameter choices, sensitivity analyses and study conclusions, which were discussed during a meeting of content experts.From 5619 records a total of 36 papers were included in the analysis. Sixteen papers included population impact/transmission modeling, 5 were health systems models, and 24 included estimates of cost-effectiveness. Transmission and health systems models included specific structure to explore the importance of the diagnostic pathway (n = 4, key determinants of diagnostic delay (n = 5, operational context (n = 5, and the pre-diagnostic infectious period (n = 1. The majority of models implemented sensitivity analysis, although only 18 studies described multi-way sensitivity analysis of more than 2 parameters simultaneously. Among the models used to make cost-effectiveness estimates, most frequent diagnostic assays studied included Xpert MTB/RIF (n = 7, and alternative nucleic acid amplification tests (NAATs (n = 4. Most (n = 16 of the cost-effectiveness models compared new assays to an existing baseline and generated an incremental cost-effectiveness ratio (ICER.Although models have addressed a small number of important issues, many decisions regarding implementation of TB diagnostics are being made without the full benefits of insight from mathematical

  3. Associations between psychological variables and pain in experimental pain models. A systematic review

    DEFF Research Database (Denmark)

    Hansen, M S; Horjales-Araujo, E; Dahl, J B

    2015-01-01

    . Translational studies performed in healthy volunteers may provide knowledge concerning psychological factors in healthy individuals as well as basic pain physiology. The aim of this review was to investigate whether psychological vulnerability or specific psychological variables in healthy volunteers...... are predictive of the level of pain following experimental pain models. METHODS: A systematic search on the databases, PubMed, Embase, Cochrane Library, and ClinicalTrials.gov was performed during September 2014. All trials investigating the association between psychological variables and experimental pain...... a sufficiently homogenous group to perform meta-analysis. The collected results were diverse. A total of 16 trials suggested that psychological factors may predict the level of pain, seven studies found divergent results, and six studies found no significant association between psychological variables......

  4. Systematically Searching for New Resonances at the Energy Frontier using Topological Models

    CERN Document Server

    Abdullah, Mohammad; DiFranzo, Anthony; Frate, Meghan; Pitcher, Craig; Shimmin, Chase; Upadhyay, Suneet; Walker, James; Weatherly, Pierce; Fox, Patrick J; Whiteson, Daniel

    2014-01-01

    We propose a new strategy to systematically search for new physics processes in particle collisions at the energy frontier. An examination of all possible topologies which give identifiable resonant features in a specific final state leads to a tractable number of `topological models' per final state and gives specific guidance for their discovery. Using one specific final state, $\ell\ell jj$, as an example, we find that the number of possibilities is reasonable and reveals simple, but as-yet-unexplored, topologies which contain significant discovery potential. We propose analysis techniques and estimate the sensitivity for $pp$ collisions with $\sqrt{s}=14$ TeV and $\mathcal{L}=300$ fb$^{-1}$.

  5. Partial continuation model and its application in mitigating systematic errors of double-differenced GPS measurements

    Institute of Scientific and Technical Information of China (English)

    GUO Jianfeng; OU Jikun; REN Chao

    2005-01-01

    Based on the so-called partial continuation model with exact finite measurements, a new stochastic assessment procedure is introduced. For every satellite pair, the temporal correlation coefficient is estimated using the original double-differenced (DD) GPS measurements. The Durbin-Watson test is then applied to test a specific hypothesis on the temporal correlation coefficient. If the test is significant at the chosen significance level, a data transformation is required. These transformed measurements are free of time correlations. For purposes of illustration, two static GPS baseline data sets are analyzed in detail. The experimental results demonstrate that the proposed procedure can effectively mitigate the impact of systematic errors on DD GPS measurements.
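The record above relies on the Durbin-Watson test for lag-1 temporal correlation in the DD residual series. As a reminder of the statistic itself (not the paper's implementation), a minimal sketch; values near 2 indicate no first-order correlation, values toward 0 positive correlation, and values toward 4 negative correlation:

```python
def durbin_watson(residuals):
    """Durbin-Watson statistic for a residual time series."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

# Alternating residuals are strongly negatively correlated -> DW well above 2
dw_neg = durbin_watson([1.0, -1.0, 1.0, -1.0, 1.0, -1.0])
# A slowly drifting series is positively correlated -> DW near 0
dw_pos = durbin_watson([1.0, 1.1, 1.2, 1.3, 1.4, 1.5])
```

In practice the statistic is compared against tabulated critical bounds that depend on sample size and the number of regressors.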

  6. A Systematic Review of the Anxiolytic-Like Effects of Essential Oils in Animal Models

    Directory of Open Access Journals (Sweden)

    Damião Pergentino de Sousa

    2015-10-01

    Full Text Available The clinical efficacy of standardized essential oils (such as Lavandula officinalis) in treating anxiety disorders strongly suggests that these natural products are an important candidate source for new anxiolytic drugs. A systematic review of essential oils, their bioactive constituents, and anxiolytic-like activity is conducted. The essential oil with the best profile is Lavandula angustifolia, which has already been tested in controlled clinical trials with positive results. Citrus aurantium using different routes of administration also showed significant effects in several animal models, and was corroborated by different research groups. Other promising essential oils are Citrus sinensis and bergamot oil, which showed certain clinical anxiolytic actions; along with Achillea wilhemsii, Alpinia zerumbet, Citrus aurantium, and Spiranthera odoratissima, which, like Lavandula angustifolia, appear to exert anxiolytic-like effects without GABA/benzodiazepine activity, thus differing in their mechanisms of action from the benzodiazepines. The anxiolytic activity of 25 compounds commonly found in essential oils is also discussed.

  7. Exposure limits: the underestimation of absorbed cell phone radiation, especially in children.

    Science.gov (United States)

    Gandhi, Om P; Morgan, L Lloyd; de Salles, Alvaro Augusto; Han, Yueh-Ying; Herberman, Ronald B; Davis, Devra Lee

    2012-03-01

    The existing cell phone certification process uses a plastic model of the head called the Specific Anthropomorphic Mannequin (SAM), representing the top 10% of U.S. military recruits in 1989 and greatly underestimating the Specific Absorption Rate (SAR) for typical mobile phone users, especially children. A superior computer simulation certification process has been approved by the Federal Communications Commission (FCC) but is not employed to certify cell phones. In the United States, the FCC determines maximum allowed exposures. Many countries, especially European Union members, use the "guidelines" of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), a non-governmental agency. A head smaller than SAM exposed to radiofrequency (RF) radiation will absorb a relatively higher SAR. Also, SAM uses a fluid having the average electrical properties of the head that cannot indicate differential absorption of specific brain tissue, nor absorption in children or smaller adults. The SAR for a 10-year-old is up to 153% higher than the SAR for the SAM model. When electrical properties are considered, a child's head's absorption can be over two times greater, and absorption of the skull's bone marrow can be ten times greater than adults. Therefore, a new certification process is needed that incorporates different modes of use, head sizes, and tissue properties. Anatomically based models should be employed in revising safety standards for these ubiquitous modern devices and standards should be set by accountable, independent groups.
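For reference, point SAR follows the standard relation SAR = σ·E_rms²/ρ, for tissue conductivity σ, RMS electric field E_rms, and tissue mass density ρ. A minimal sketch with illustrative (not measured) tissue values:

```python
def sar(sigma, e_rms, rho):
    """Point SAR (W/kg) from tissue conductivity sigma (S/m),
    RMS electric field e_rms (V/m), and tissue density rho (kg/m^3)."""
    return sigma * e_rms**2 / rho

# Illustrative brain-tissue values only; real certification averages SAR
# over 1 g or 10 g of tissue, which this point formula does not capture.
w_per_kg = sar(sigma=0.8, e_rms=40.0, rho=1000.0)
```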

  8. Biomass density and filament length synergistically affect activated sludge settling: systematic quantification and modeling.

    Science.gov (United States)

    Jassby, D; Xiao, Y; Schuler, A J

    2014-01-01

    Settling of the biomass produced during biological treatment of wastewater is a critical and often problematic process. Filamentous bacteria content is the best-known factor affecting biomass settleability in activated sludge wastewater treatment systems, and varying biomass density has recently been shown to play an important role as well. The objective of this study was to systematically determine how filament content and biomass density combine to affect microbial biomass settling, with a focus on density variations over the range found in full-scale systems. A laboratory-scale bioreactor system was operated to produce biomass with a range of filamentous bacterium contents. Biomass density was systematically varied in samples from this system by addition of synthetic microspheres to allow separation of filament content and density effects on settleability. Fluorescent in-situ hybridization indicated that the culture was dominated by Sphaerotilus natans, a common contributor to poor settling in full-scale systems. A simple, image-based metric of filament content (filament length per floc area) was linearly correlated with the more commonly used filament length per dry biomass measurement. A non-linear, semi-empirical model of settleability as a function of filament content and density was developed and evaluated, providing a better understanding of how these two parameters combine to affect settleability. Filament content (length per dry biomass weight) was nearly linearly related to sludge volume index (SVI) values, with a slightly decreasing differential, and biomass density exhibited an asymptotic relationship with SVI. The filament content associated with bulking was shown to be a function of biomass density. The marginal effect of filament content on settleability increased with decreasing biomass density (low density biomass was more sensitive to changes in filament content than was high density biomass), indicating a synergistic relationship between these
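The abstract describes a non-linear, semi-empirical model of SVI in filament content and density but does not give its equation. Purely as an illustration of fitting such a two-variable model, a sketch with a hypothetical interaction form (SVI = a + b·F/ρ, so the marginal effect of F shrinks as density grows) and synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model form -- NOT the authors' equation.
def svi_model(X, a, b):
    F, rho = X
    return a + b * F / rho

rng = np.random.default_rng(1)
F = rng.uniform(0.0, 30.0, 60)       # filament length per dry biomass (arb.)
rho = rng.uniform(1.02, 1.08, 60)    # biomass density (g/mL)
# Synthetic SVI observations generated from the model plus noise
svi = svi_model((F, rho), 50.0, 4.0) + rng.normal(0.0, 2.0, 60)

params, _ = curve_fit(svi_model, (F, rho), svi, p0=(40.0, 1.0))
a_hat, b_hat = params
```

`curve_fit` accepts the two predictors packed as one array-like first argument, which is the standard SciPy pattern for multivariate semi-empirical fits.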

  9. Systematic Assessment Through Mathematical Model For Sustainability Reporting In Malaysia Context

    Science.gov (United States)

    Lanang, Wan Nurul Syahirah Wan; Turan, Faiz Mohd; Johan, Kartina

    2017-08-01

    Sustainability assessment has been studied and is increasingly recognized as a powerful and valuable tool to measure the sustainability performance of a company or industry. Many tools for sustainable development already exist, although most focus on the environmental, economic and social aspects. Using the Green Project Management (GPM) P5 concept, which suggests that firms need to engage not only in the 3P principle of responsible behaviour towards planet, profit and people, but also towards product and process, this study introduces a new mathematical model for assessing the level of sustainability practice in a company. Based on multiple case studies, involving in-depth interviews with senior directors, feedback from experts and previous engineering reports, a systematic approach is followed with the aim of developing the collected feedback into a new mathematical model. The methodology comprises several phases: analysis of the parameters and criteria selected according to the Malaysian industrial context, data analysis involving regression, and finally a normalisation process that determines the outcome of the research. This study is expected to provide a clear guideline for any company or organization to assimilate sustainability assessment into its development stage, and to support a better understanding of sustainability assessment so that the process approach can be integrated into a systematic assessment approach.
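The normalisation step mentioned above is not specified in the abstract; one common choice when combining heterogeneous criteria scores is min-max scaling onto [0, 1]. A minimal sketch with made-up scores for the five P5 dimensions:

```python
def min_max_normalize(scores):
    """Scale raw criterion scores onto [0, 1] so that heterogeneous
    criteria can be compared or combined into a single index."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) for s in scores]

# Hypothetical raw scores for the five P5 criteria
# (planet, people, profit, product, process) -- illustrative only
raw = [12.0, 30.0, 18.0, 24.0, 6.0]
norm = min_max_normalize(raw)
```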

  10. Training models in laparoscopy: a systematic review comparing their effectiveness in learning surgical skills.

    Science.gov (United States)

    Willaert, W; Van De Putte, D; Van Renterghem, K; Van Nieuwenhove, Y; Ceelen, W; Pattyn, P

    2013-01-01

Surgery has traditionally been learned on patients in the operating room, which is time-consuming, can affect patient outcomes, and is of variable effectiveness. As a result, surgical training models have been developed, and they are compared in this systematic review. We searched PubMed, CENTRAL, and Science Citation Index Expanded for randomised clinical trials and randomised cross-over studies comparing laparoscopic training models. Studies comparing one model with no training were also included. The reference lists of identified trials were searched for further relevant studies. Fifty-eight trials evaluating several training forms and involving 1591 participants were included (four studies with a low risk of bias). Training (virtual reality (VR) or video trainer (VT)) versus no training improves surgical skills in the majority of trials. VR and VT are equally effective in most studies. VR training is superior to traditional laparoscopic training in the operating room. Outcome results for VR robotic simulations versus robot training show no clear difference in effectiveness between the two models. Only one trial included human cadavers and observed better results versus VR for one out of four scores. Contrasting results are observed when robotic technology is compared with manual laparoscopy. VR training and VT training are valid teaching models; practicing on them similarly improves surgical skills. A combination of both methods is recommended in a surgical curriculum. VR training is superior to unstructured traditional training in the operating room. The reciprocal effectiveness of the other models for learning surgical skills has not yet been established.

  11. Clinical outcomes of HIV care delivery models in the US: a systematic review.

    Science.gov (United States)

    Kimmel, April D; Martin, Erika G; Galadima, Hadiza; Bono, Rose S; Tehrani, Ali Bonakdar; Cyrus, John W; Henderson, Margaret; Freedberg, Kenneth A; Krist, Alexander H

    2016-10-01

With over 1 million people living with HIV, the US faces national challenges in HIV care delivery due to an inadequate HIV specialist workforce and the increasing role of non-communicable chronic diseases in driving morbidity and mortality in HIV-infected patients. Alternative HIV care delivery models, which include substantial roles for advanced practitioners and/or coordination between specialty and primary care settings in managing HIV-infected patients, may address these needs. We aimed to systematically review the evidence on patient-level HIV-specific and primary care health outcomes for HIV-infected adults receiving outpatient care across HIV care delivery models. We identified randomized trials and observational studies from bibliographic and other databases through March 2016. Eligible studies met pre-specified eligibility criteria, including criteria on care delivery models and patient-level health outcomes. We considered all available evidence, including non-experimental studies, and evaluated studies for risk of bias. We identified 3605 studies, of which 13 met eligibility criteria. Of the 13 eligible studies, the majority (9 studies) evaluated specialty-based care. Across all studies and care delivery models, eligible studies primarily reported mortality and antiretroviral use, with specialty-based care associated with mortality reductions at the clinician and practice levels and with increased antiretroviral initiation or use at the clinician level but not the practice level. Limited and heterogeneous outcomes were reported for other patient-level HIV-specific outcomes (e.g., viral suppression) as well as for primary care health outcomes across all care delivery models. No studies addressed chronic care outcomes related to aging. Limited evidence was available across geographic settings and key populations. 
As re-design of care delivery in the US continues to evolve, better understanding of patient-level HIV-related and primary care health outcomes, especially

  12. Underestimation of mid-Holocene Arctic warming in PMIP simulations

    Science.gov (United States)

    Zhang, Qiong; Muschitiello, Francesco

    2016-04-01

Due to orbital forcing, the Arctic was warmer during the mid-Holocene (~ 6 kyr BP): in summer because the region received more insolation, and in winter because of strong feedbacks, leading to warmer annual mean temperatures. Existing proxy reconstructions show that the Arctic may have been two degrees warmer than in the pre-industrial period. However, not all climate models capture this warming, and the simulated amplitude is about 0.5 degrees less than that seen in proxy data. One possible reason is that these simulations did not account for the 'Green Sahara', when a large area of the Sahara was covered by vegetation instead of the desert seen today. Using the fully coupled climate model EC-Earth at about 100 km resolution, we have run a series of sensitivity experiments changing the surface type, along with the accompanying change in dust emission over the northern Sahara. The results show that a green Sahara not only produces a local climate response, such as the northward extension and strengthening of the African monsoon, but also affects the large-scale circulation and the corresponding meridional heat transport. The combination of a green Sahara and reduced dust entails a general strengthening of the mid-latitude Westerlies, a shift to more positive North Atlantic Oscillation-like conditions, and more heat transport from low to high latitudes in both the atmosphere and the ocean, eventually leading to warmer conditions over the North Atlantic and Arctic regions. This mechanism would explain the sign of rapid hydro-climatic perturbations recorded in several reconstructions from high northern latitudes after the termination of the African Humid Period around 5.5 - 5.0 kyr BP, suggesting that these regions are sensitive to changes in Saharan land cover during the present interglacial. 
This is central in the debate surrounding Arctic climate amplification and future projections for subtropical precipitation changes and related surface type

  13. Don't Underestimate the Benefits of Being Misunderstood.

    Science.gov (United States)

    Gibson, Edward; Tan, Caitlin; Futrell, Richard; Mahowald, Kyle; Konieczny, Lars; Hemforth, Barbara; Fedorenko, Evelina

    2017-06-01

    Being a nonnative speaker of a language poses challenges. Individuals often feel embarrassed by the errors they make when talking in their second language. However, here we report an advantage of being a nonnative speaker: Native speakers give foreign-accented speakers the benefit of the doubt when interpreting their utterances; as a result, apparently implausible utterances are more likely to be interpreted in a plausible way when delivered in a foreign than in a native accent. Across three replicated experiments, we demonstrated that native English speakers are more likely to interpret implausible utterances, such as "the mother gave the candle the daughter," as similar plausible utterances ("the mother gave the candle to the daughter") when the speaker has a foreign accent. This result follows from the general model of language interpretation in a noisy channel, under the hypothesis that listeners assume a higher error rate in foreign-accented than in nonaccented speech.
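The noisy-channel account sketched in the abstract can be made concrete: the listener scores each candidate intended sentence by its prior plausibility times the probability that it was corrupted into what was heard, and a higher assumed error rate for accented speech flips the winner. The probabilities and the per-edit error model below are illustrative assumptions, not the paper's fitted values.

```python
# Minimal noisy-channel sketch: posterior over intended sentences is
# proportional to prior plausibility times corruption probability.

def interpret(candidates, error_rate):
    """Pick the candidate intended sentence with the highest posterior.

    candidates: list of (sentence, prior, edits) where edits is the number
    of word-level changes separating it from the perceived string.
    error_rate: assumed per-edit probability of speaker/channel error.
    """
    scored = [(s, prior * error_rate ** edits)
              for s, prior, edits in candidates]
    return max(scored, key=lambda x: x[1])[0]

# Perceived string: "the mother gave the candle the daughter".
candidates = [
    ("the mother gave the candle the daughter",    0.001, 0),  # literal, implausible
    ("the mother gave the candle to the daughter", 0.9,   1),  # plausible, one edit away
]

native   = interpret(candidates, error_rate=0.0005)  # low assumed error rate
accented = interpret(candidates, error_rate=0.05)    # higher assumed error rate
```

With the low error rate the literal (implausible) reading wins; with the higher rate assumed for foreign-accented speech, the plausible one-edit alternative wins, mirroring the reported effect.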

  14. Towards systematic evaluation of crop model outputs for global land-use models

    Science.gov (United States)

    Leclere, David; Azevedo, Ligia B.; Skalský, Rastislav; Balkovič, Juraj; Havlík, Petr

    2016-04-01

Land provides vital socioeconomic resources to society, but at the cost of substantial environmental degradation. Global integrated models combining high-resolution global gridded crop models (GGCMs) and global economic models (GEMs) are increasingly used to inform sustainable solutions for agricultural land use. However, little effort has yet been made to evaluate and compare the accuracy of GGCM outputs. In addition, GGCM datasets require a large number of parameters whose values, and whose variability across space, are weakly constrained: increasing the accuracy of such datasets has a very high computing cost. Innovative evaluation methods are required both to lend credibility to the global integrated models and to allow efficient parameter specification of GGCMs. We propose an evaluation strategy for GGCM datasets from the perspective of use in GEMs, illustrated with preliminary results from a novel dataset (the Hypercube) generated by the EPIC GGCM and used in the GLOBIOM land-use GEM to inform on present-day crop yield and on water and nutrient input needs for 16 crops x 15 management intensities, at a spatial resolution of 5 arc-minutes. We adopt the following principle: evaluation should provide a transparent diagnosis of model adequacy for its intended use. We briefly describe how the Hypercube data are generated and how they articulate with GLOBIOM, in order to transparently identify the performances to be evaluated as well as the main assumptions and data processing involved. Expected performances include adequately representing the sub-national heterogeneity in crop yield and input needs: i) in space, ii) across crop species, and iii) across management intensities. We will present and discuss measures of these expected performances and weight the relative contributions of the crop model, input data and data processing steps. We will also compare the obtained yield gaps and main yield-limiting factors against the M3 dataset. Next steps include

  15. PENGARUH DPR, GRE, DAN SYSTEMATIC RISK TERHADAP PER: UJI KONSISTENSI MODEL

    Directory of Open Access Journals (Sweden)

    Fanny Rifqi El Fuad

    2012-09-01

Full Text Available The study aims to analyze the valuation of stock prices listed on the Jakarta Stock Exchange using a Price Earnings Ratio (PER) modelling approach, together with the factors assumed to explain changes in PER: the Dividend Payout Ratio (DPR), the Growth Rate of Earnings (GRE), and systematic risk. The results showed that among these three variables, DPR is the only factor that consistently influences the variation in PER across the three cross-section regression models built for each year from 2000 to 2002. A further analysis using a simple regression of PER (dependent variable) on DPR (independent variable) revealed significant results, with consistent coefficients and a high intercept. This research also tests the consistency of the cross-section regression models. It showed that the theoretical PER value (earnings multiplier) obtained from a cross-section regression can be used to determine the intrinsic value of a stock only if the regression model is built under a market situation similar to that of the valuation period. Without this assumption, an investor cannot compare the theoretical PER values of the various models built using the same sample and method.
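The cross-section regression described above can be sketched as an ordinary least squares fit of PER on the three candidate explanatory variables. The data below are simulated, and the coefficient values are illustrative assumptions, not the study's estimates; the simulation simply mirrors the finding that only DPR carries a strong, consistent effect.

```python
# Hypothetical sketch of a one-year cross-section regression:
# PER ~ intercept + DPR + GRE + beta, fit by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 50
dpr = rng.uniform(0.1, 0.8, n)    # dividend payout ratio
gre = rng.uniform(-0.1, 0.3, n)   # growth rate of earnings
beta = rng.uniform(0.5, 1.5, n)   # systematic risk

# Simulated PER in which only DPR has a real effect (illustrative).
per = 5.0 + 12.0 * dpr + rng.normal(0.0, 0.5, n)

X = np.column_stack([np.ones(n), dpr, gre, beta])
coef, *_ = np.linalg.lstsq(X, per, rcond=None)
intercept, b_dpr, b_gre, b_beta = coef
```

The fitted DPR coefficient recovers the simulated effect while the GRE and beta coefficients stay near zero, which is the pattern of results the abstract reports.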

  16. Prognostic Models in Adults Undergoing Physical Therapy for Rotator Cuff Disorders: Systematic Review.

    Science.gov (United States)

    Braun, Cordula; Hanchard, Nigel C; Batterham, Alan M; Handoll, Helen H; Betthäuser, Andreas

    2016-07-01

    Rotator cuff-related disorders represent the largest subgroup of shoulder complaints. Despite the availability of various conservative and surgical treatment options, the precise indications for these options remain unclear. The purpose of this systematic review was to synthesize the available research on prognostic models for predicting outcomes in adults undergoing physical therapy for painful rotator cuff disorders. The MEDLINE, EMBASE, CINAHL, Cochrane CENTRAL, and PEDro databases and the World Health Organization (WHO) International Clinical Trials Registry Platform (ICTRP) up to October 2015 were searched. The review included primary studies exploring prognostic models in adults undergoing physical therapy, with or without other conservative measures, for painful rotator cuff disorders. Primary outcomes were pain, disability, and adverse events. Inclusion was limited to prospective investigations of prognostic factors elicited at the baseline assessment. Study selection was independently performed by 2 reviewers. A pilot-tested form was used to extract data on key aspects of study design, characteristics, analyses, and results. Risk of bias and applicability were independently assessed by 2 reviewers using the Prediction Study Risk of Bias Assessment tool (PROBAST). Five studies were included in the review. These studies were extremely heterogeneous in many aspects of design, conduct, and analysis. The findings were analyzed narratively. All included studies were rated as at high risk of bias, and none of the resulting prognostic models was found to be usable in clinical practice. There are no prognostic models ready to inform clinical practice in the context of the review question, highlighting the need for further research on prognostic models for predicting outcomes in adults who undergo physical therapy for painful rotator cuff disorders. The design and conduct of future studies should be receptive to developing methods. 
© 2016 American Physical Therapy

  17. An integrative model of patient-centeredness - a systematic review and concept analysis.

    Directory of Open Access Journals (Sweden)

    Isabelle Scholl

Full Text Available Existing models of patient-centeredness reveal a lack of conceptual clarity. This results in a heterogeneous use of the term, unclear measurement dimensions, inconsistent results regarding the effectiveness of patient-centered interventions, and finally in difficulties in implementing patient-centered care. The aim of this systematic review was to identify the different dimensions of patient-centeredness described in the literature and to propose an integrative model of patient-centeredness based on these results. Protocol-driven search in five databases, combined with a comprehensive secondary search strategy. All articles that include a definition of patient-centeredness were eligible for inclusion in the review and subject to subsequent content analysis. Two researchers independently first screened titles and abstracts, then assessed full texts for eligibility. In each article the given definition of patient-centeredness was coded independently by two researchers. We discussed codes within the research team and condensed them into an integrative model of patient-centeredness. 4707 records were identified through primary and secondary search, of which 706 were retained after screening of titles and abstracts. 417 articles (59%) contained a definition of patient-centeredness and were coded. 15 dimensions of patient-centeredness were identified: essential characteristics of clinician, clinician-patient relationship, clinician-patient communication, patient as unique person, biopsychosocial perspective, patient information, patient involvement in care, involvement of family and friends, patient empowerment, physical support, emotional support, integration of medical and non-medical care, teamwork and teambuilding, access to care, coordination and continuity of care. 
In the resulting integrative model the dimensions were mapped onto different levels of care. The proposed integrative model of patient-centeredness allows different stakeholders to speak

  18. An Integrative Model of Patient-Centeredness – A Systematic Review and Concept Analysis

    Science.gov (United States)

    Scholl, Isabelle; Zill, Jördis M.; Härter, Martin; Dirmaier, Jörg

    2014-01-01

    Background Existing models of patient-centeredness reveal a lack of conceptual clarity. This results in a heterogeneous use of the term, unclear measurement dimensions, inconsistent results regarding the effectiveness of patient-centered interventions, and finally in difficulties in implementing patient-centered care. The aim of this systematic review was to identify the different dimensions of patient-centeredness described in the literature and to propose an integrative model of patient-centeredness based on these results. Methods Protocol driven search in five databases, combined with a comprehensive secondary search strategy. All articles that include a definition of patient-centeredness were eligible for inclusion in the review and subject to subsequent content analysis. Two researchers independently first screened titles and abstracts, then assessed full texts for eligibility. In each article the given definition of patient-centeredness was coded independently by two researchers. We discussed codes within the research team and condensed them into an integrative model of patient-centeredness. Results 4707 records were identified through primary and secondary search, of which 706 were retained after screening of titles and abstracts. 417 articles (59%) contained a definition of patient-centeredness and were coded. 15 dimensions of patient-centeredness were identified: essential characteristics of clinician, clinician-patient relationship, clinician-patient communication, patient as unique person, biopsychosocial perspective, patient information, patient involvement in care, involvement of family and friends, patient empowerment, physical support, emotional support, integration of medical and non-medical care, teamwork and teambuilding, access to care, coordination and continuity of care. In the resulting integrative model the dimensions were mapped onto different levels of care. 
Conclusions The proposed integrative model of patient-centeredness allows different

  19. Disassembly for remanufacturing: A systematic literature review, new model development and future research needs

    Directory of Open Access Journals (Sweden)

    Anjar Priyono

    2016-11-01

Full Text Available Purpose: Disassembly is an important process that distinguishes remanufacturing from conventional manufacturing. It is a unique process that has become a focus of investigation for many scholars. Yet most scholars investigate disassembly from a technical and operational standpoint that lacks a strategic perspective. This paper attempts to fill this gap by looking at disassembly from a strategic perspective, considering organisational characteristics, process choices and product attributes. To be more specific, this paper has three objectives: first, to gain an understanding of what has been done, and what needs to be done, in the field of disassembly in remanufacturing; second, to conduct a systematic literature review identifying the factors affecting disassembly for remanufacturing; third, to propose a new model of disassembly for remanufacturing and to provide avenues for future research. Design/methodology/approach: This study used a systematic literature review method. A series of steps were undertaken during the review, starting with determining the purpose of the study, selecting appropriate keywords, and narrowing the selected papers using a number of criteria. A deeper analysis was carried out on the final set of papers that met the criteria for this review. Findings: There are two main findings. First, a list of factors affecting disassembly in remanufacturing is identified; the factors can be categorised into three groups: organisational factors, process choices and product attributes. Second, using the factors that have been identified, a new model of the disassembly process for remanufacturing is developed. Current studies consider disassembly only as a physical activity to break products down into components. In the new model, disassembly is viewed as a process that converts inputs into outputs through a series of steps. 
Research limitations/implications: The opportunities for future research include: the need to

  20. A hybrid variational-ensemble data assimilation scheme with systematic error correction for limited-area ocean models

    Science.gov (United States)

    Oddo, Paolo; Storto, Andrea; Dobricic, Srdjan; Russo, Aniello; Lewis, Craig; Onken, Reiner; Coelho, Emanuel

    2016-10-01

A hybrid variational-ensemble data assimilation scheme for estimating the vertical and horizontal parts of the background-error covariance matrix in an ocean variational data assimilation system is presented and tested in a limited-area ocean model implemented in the western Mediterranean Sea. An extensive data set collected during the Recognized Environmental Picture Experiments, conducted in June 2014 by the Centre for Maritime Research and Experimentation, was used for assimilation and validation. The hybrid scheme is used both to correct the systematic error introduced into the system by the external forcing (initialisation, lateral and surface open boundary conditions) and model parameterisation, and to improve the representation of small-scale errors in the background-error covariance matrix. An ensemble system, generated through perturbation of assimilated observations, is run offline for use in the hybrid scheme. Results of four different experiments were compared. The reference experiment uses the classical stationary formulation of the background-error covariance matrix and no systematic error correction; the other three experiments include the systematic error correction, the hybrid background-error covariance matrix combining static and ensemble-derived errors of the day, or both. Results show that the hybrid scheme, when used in conjunction with the systematic error correction, reduces the mean absolute error of the temperature and salinity misfits by 55 and 42 % respectively, compared with statistics arising from standard climatological covariances without systematic error correction.
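The hybrid covariance idea described above is commonly written as a convex combination of a static (climatological) matrix and a sample covariance computed from ensemble anomalies. The sketch below is an illustration of that general construction under assumed dimensions, weights, and random ensemble states, not the paper's actual implementation.

```python
# Sketch of a hybrid background-error covariance:
# B_hybrid = alpha * B_static + (1 - alpha) * B_ensemble.
import numpy as np

def ensemble_covariance(members):
    """Sample covariance of the 'errors of the day'.

    members: array of shape (n_members, state_dim).
    """
    anomalies = members - members.mean(axis=0)
    return anomalies.T @ anomalies / (members.shape[0] - 1)

def hybrid_covariance(b_static, members, alpha=0.5):
    """Convex combination of static and ensemble-derived covariances."""
    return alpha * b_static + (1.0 - alpha) * ensemble_covariance(members)

rng = np.random.default_rng(1)
state_dim, n_members = 4, 20
b_static = np.eye(state_dim)                        # stationary formulation
members = rng.normal(size=(n_members, state_dim))   # offline ensemble states
b_hybrid = hybrid_covariance(b_static, members, alpha=0.5)
```

Keeping alpha strictly between 0 and 1 preserves the static matrix's full rank while letting the ensemble term inject flow-dependent structure.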

  1. Systematic or Signal? How dark matter misalignments can bias strong lensing models of galaxy clusters

    CERN Document Server

    Harvey, David; Jauzac, Mathilde

    2016-01-01

We explore how the assumption that mass traces light in strong gravitational lensing models can lead to systematic errors in the predicted positions of multiple images. Using a model based on the galaxy cluster MACSJ0416 (z = 0.397) from the Hubble Frontier Fields, we split each galactic halo into a baryonic and a dark matter component. We then shift the dark matter halo so that it no longer aligns with the baryonic halo and investigate how this affects the resulting positions of multiple images. We find that, for physically motivated misalignments in dark halo position, ellipticity, position angle and density profile, multiple images can move on average by more than 0.2" with individual images moving more than 1". We finally estimate the full error induced by assuming that light traces mass and find that this assumption leads to an expected RMS error of 0.5", almost the entire error budget observed in the Frontier Fields. Given the large potential contribution from the assumption that light traces mass to the erro...

  2. A Neurobiological Model of Borderline Personality Disorder: Systematic and Integrative Review.

    Science.gov (United States)

    Ruocco, Anthony C; Carcone, Dean

    2016-01-01

    Borderline personality disorder (BPD) is a severe mental disorder with a multifactorial etiology. The development and maintenance of BPD is sustained by diverse neurobiological factors that contribute to the disorder's complex clinical phenotype. These factors may be identified using a range of techniques to probe alterations in brain systems that underlie BPD. We systematically searched the scientific literature for empirical studies on the neurobiology of BPD, identifying 146 articles in three broad research areas: neuroendocrinology and biological specimens; structural neuroimaging; and functional neuroimaging. We consolidate the results of these studies and provide an integrative model that attempts to incorporate the heterogeneous findings. The model specifies interactions among endogenous stress hormones, neurometabolism, and brain structures and circuits involved in emotion and cognition. The role of the amygdala in BPD is expanded to consider its functions in coordinating the brain's dynamic evaluation of the relevance of emotional stimuli in the context of an individual's goals and motivations. Future directions for neurobiological research on BPD are discussed, including implications for the Research Domain Criteria framework, accelerating genetics research by incorporating endophenotypes and gene × environment interactions, and exploring novel applications of neuroscience findings to treatment research.

  3. Associations between psychological variables and pain in experimental pain models. A systematic review.

    Science.gov (United States)

    Hansen, M S; Horjales-Araujo, E; Dahl, J B

    2015-10-01

The association between pain and psychological characteristics has been widely debated. Thus, it remains unclear whether an individual's psychological profile influences a particular pain experience, or if previous pain experience contributes to a certain psychological profile. Translational studies performed in healthy volunteers may provide knowledge concerning psychological factors in healthy individuals as well as basic pain physiology. The aim of this review was to investigate whether psychological vulnerability or specific psychological variables in healthy volunteers are predictive of the level of pain following experimental pain models. A systematic search of the databases PubMed, Embase, the Cochrane Library, and ClinicalTrials.gov was performed during September 2014. All trials investigating the association between psychological variables and experimental pain in healthy volunteers were considered for inclusion. Twenty-nine trials met the inclusion criteria, with a total of 2637 healthy volunteers. The included trials investigated a total of 45 different psychological tests and 27 different types of pain models. The retrieved trials did not present a sufficiently homogeneous group to permit meta-analysis. The collected results were diverse. A total of 16 trials suggested that psychological factors may predict the level of pain, seven studies found divergent results, and six studies found no significant association between psychological variables and experimental pain. Psychological factors may have predictive value when investigating experimental pain. However, due to substantial heterogeneity and methodological shortcomings of the published literature, firm conclusions are not possible. © 2015 The Acta Anaesthesiologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  4. ScaleNet: a literature-based model of scale insect biology and systematics.

    Science.gov (United States)

    García Morales, Mayrolin; Denno, Barbara D; Miller, Douglass R; Miller, Gary L; Ben-Dov, Yair; Hardy, Nate B

    2016-01-01

    Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found on all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis and plant-insect interactions. ScaleNet was launched in 1995 to provide insect identifiers, pest managers, insect systematists, evolutionary biologists and ecologists efficient access to information about scale insect biological diversity. It provides comprehensive information on scale insects taken directly from the primary literature. Currently, it draws from 23,477 articles and describes the systematics and biology of 8194 valid species. For 20 years, ScaleNet ran on the same software platform. That platform is no longer viable. Here, we present a new, open-source implementation of ScaleNet. We have normalized the data model, begun the process of correcting invalid data, upgraded the user interface, and added online administrative tools. These improvements make ScaleNet easier to use and maintain and make the ScaleNet data more accurate and extendable. Database URL: http://scalenet.info.

  5. Morphology of Rain Water Channeling in Systematically Varied Model Sandy Soils

    Science.gov (United States)

    Wei, Yuli; Cejas, Cesare M.; Barrois, Rémi; Dreyfus, Rémi; Durian, Douglas J.

    2014-10-01

    We visualize the formation of fingered flow in dry model sandy soils under different rain conditions using a quasi-2D experimental setup and systematically determine the impact of the soil grain diameter and surface wetting properties on the water channeling phenomenon. The model sandy soils we use are random closely packed glass beads with varied diameters and surface treatments. For hydrophilic sandy soils, our experiments show that rain water infiltrates a shallow top layer of soil and creates a horizontal water wetting front that grows downward homogeneously until instabilities occur to form fingered flows. For hydrophobic sandy soils, in contrast, we observe that rain water ponds on the top of the soil surface until the hydraulic pressure is strong enough to overcome the capillary repellency of soil and create narrow water channels that penetrate the soil packing. Varying the raindrop impinging speed has little influence on water channel formation. However, varying the rain rate causes significant changes in the water infiltration depth, water channel width, and water channel separation. At a fixed rain condition, we combine the effects of the grain diameter and surface hydrophobicity into a single parameter and determine its influence on the water infiltration depth, water channel width, and water channel separation. We also demonstrate the efficiency of several soil water improvement methods that relate to the rain water channeling phenomenon, including prewetting sandy soils at different levels before rainfall, modifying soil surface flatness, and applying superabsorbent hydrogel particles as soil modifiers.
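The ponding behaviour on hydrophobic soils described above follows from the Young-Laplace relation: when the contact angle exceeds 90 degrees, the capillary entry pressure is positive and water must accumulate until its hydrostatic head overcomes it. The contact angle and pore radius below are illustrative assumptions, not the paper's measured values.

```python
# Back-of-the-envelope sketch: capillary entry pressure for a pore of
# effective radius r, p_entry = -2*gamma*cos(theta)/r, and the ponding
# depth whose hydrostatic head matches it.
import math

GAMMA = 0.072        # surface tension of water, N/m
RHO_G = 1000 * 9.81  # water density times gravity, N/m^3

def entry_pressure(contact_angle_deg, pore_radius_m):
    """Capillary entry pressure (Pa); positive means water is repelled."""
    theta = math.radians(contact_angle_deg)
    return -2 * GAMMA * math.cos(theta) / pore_radius_m

def ponding_depth(contact_angle_deg, pore_radius_m):
    """Water depth (m) needed for the hydrostatic head to match entry pressure."""
    return entry_pressure(contact_angle_deg, pore_radius_m) / RHO_G

# Hypothetical hydrophobic beads: contact angle 110 deg, ~50 um pores.
depth = ponding_depth(110, 50e-6)
```

For a hydrophilic packing (contact angle below 90 degrees) the same formula gives a negative entry pressure, i.e. spontaneous infiltration, consistent with the wetting-front behaviour reported for the hydrophilic soils.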

  6. Evaluating the effectiveness of health belief model interventions in improving adherence: a systematic review.

    Science.gov (United States)

    Jones, Christina Jane; Smith, Helen; Llewellyn, Carrie

    2014-01-01

Lack of adherence to health-promoting advice challenges the successful prevention and management of many conditions. The Health Belief Model (HBM) was developed in 1966 to predict health-promoting behaviour and has been used in patients with a wide variety of diseases. The HBM has also been used to inform the development of interventions to improve health behaviours. Several reviews have documented the HBM's performance in predicting behaviour, but no review has addressed its utility in the design of interventions or the efficacy of these interventions. A systematic review was conducted to identify interventional studies which use the HBM as the theoretical basis for intervention design. The HBM has been used continuously in the development of behaviour change interventions for 40 years. Of 18 eligible studies, 14 (78%) reported significant improvements in adherence, with 7 (39%) showing moderate to large effects. However, only six studies used the HBM in its entirety, and five different studies measured health beliefs as outcomes. Intervention success appeared to be unrelated to the HBM constructs addressed, challenging the utility of this model as a theoretical basis for adherence-enhancing interventions. Interventions need to be described in full to allow for the identification of effective components and replication of studies.
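The HBM's predictive use is typically operationalised as a regression-style combination of its constructs. The sketch below shows one common form, a logistic function of perceived susceptibility, severity, benefits, barriers, and cues to action; the weights and scores are hypothetical illustrations, not estimates from any study in the review.

```python
# Illustrative logistic combination of HBM constructs to predict the
# probability of an adherence behaviour (all constructs scored 0-1).
import math

def hbm_adherence_probability(susceptibility, severity, benefits,
                              barriers, cues, weights):
    """Logistic model: barriers enter negatively, the rest positively."""
    linear = (weights["susceptibility"] * susceptibility
              + weights["severity"] * severity
              + weights["benefits"] * benefits
              - weights["barriers"] * barriers
              + weights["cues"] * cues
              + weights["intercept"])
    return 1.0 / (1.0 + math.exp(-linear))

# Hypothetical weights; in practice these would be fit to adherence data.
weights = {"susceptibility": 1.0, "severity": 0.8, "benefits": 1.5,
           "barriers": 1.2, "cues": 0.6, "intercept": -1.0}

p = hbm_adherence_probability(0.7, 0.6, 0.8, 0.3, 0.5, weights)
```

Raising the perceived-barriers score lowers the predicted adherence probability, which is the construct the model treats as the main deterrent.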

  7. Beyond frontier molecular orbital theory: a systematic electron transfer model (ETM) for polar bimolecular organic reactions.

    Science.gov (United States)

    Cahill, Katharine J; Johnson, Richard P

    2013-03-01

    Polar bimolecular reactions often begin as charge-transfer complexes and may proceed with a high degree of electron transfer character. Frontier molecular orbital (FMO) theory is predicated in part on this concept. We have developed an electron transfer model (ETM) in which we systematically transfer one electron between reactants and then use density functional methods to model the resultant radical or radical ion intermediates. Sites of higher reactivity are revealed by a composite spin density map (SDM) of odd electron character on the electron density surface, assuming that a new two-electron bond would occur preferentially at these sites. ETM correctly predicts regio- and stereoselectivity for a broad array of reactions, including Diels-Alder, dipolar and ketene cycloadditions, Birch reduction, many types of nucleophilic additions, and electrophilic addition to aromatic rings and polyenes. Conformational analysis of radical ions is often necessary to predict reaction stereochemistry. The electronic and geometric changes due to one-electron oxidation or reduction parallel the reaction coordinate for electrophilic or nucleophilic addition, respectively. The effect is more dramatic for one-electron reduction.

  8. External Validity and Model Validity: A Conceptual Approach for Systematic Review Methodology

    Directory of Open Access Journals (Sweden)

    Raheleh Khorsan

    2014-01-01

    Full Text Available Background. Evidence rankings do not consider equally internal (IV), external (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative health care (CAM/IHC) research. This paper describes this model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV in research that is more sensitive to CAM/IHC research. Conclusion. Improved reporting on EV can help produce information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about “real-world” consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on an equal footing, will better guide clinical decision making.

  9. Identification of selectivity determinants in CYP monooxygenases by modelling and systematic analysis of sequence and structure.

    Science.gov (United States)

    Seifert, Alexander; Pleiss, Jurgen

    2012-02-01

    Cytochrome P450 monooxygenases (CYPs) form a large, ubiquitous enzyme family and are of great interest in red and white biotechnology. To investigate the effect of protein structure on selectivity, the binding of substrate molecules near to the active site was modelled by molecular dynamics simulations. From a comprehensive and systematic comparison of more than 6300 CYP sequences and 31 structures using the Cytochrome P450 Engineering Database (CYPED), residues were identified which are predicted to point close to the heme centre and thus restrict accessibility for substrates. As a result, sequence-structure-function relationships are described that can be used to predict selectivity-determining positions from CYP sequences and structures. Based on this analysis, a minimal library consisting of bacterial CYP102A1 (P450(BM3)) and 24 variants was constructed. All variants were functionally expressed in E. coli, and the library was screened with four terpene substrates. Only 3 variants showed no activity towards all 4 terpenes, while 11 variants demonstrated either a strong shift or improved regio- or stereoselectivity during oxidation of at least one substrate as compared to CYP102A1 wild type. The minimal library also contains variants that show interesting side products which are not generated by the wild type enzyme. By two additional rounds of molecular modelling, diversification, and screening, the selectivity of one of these variants for a new product was optimised with a minimal screening effort. We propose this as a generic approach for other CYP substrates.

  10. Parameters for the mathematical modelling of Clostridium difficile acquisition and transmission: a systematic review.

    Directory of Open Access Journals (Sweden)

    Eroboghene H Otete

    Full Text Available INTRODUCTION: Mathematical modelling of Clostridium difficile infection dynamics could contribute to the optimisation of strategies for its prevention and control. The objective of this systematic review was to summarise the available literature specifically identifying the quantitative parameters required for a compartmental mathematical model of Clostridium difficile transmission. METHODS: Six electronic healthcare databases were searched and all screening, data extraction and study quality assessments were undertaken in duplicate. Results were synthesised using a narrative approach. RESULTS: Fifty-four studies met the inclusion criteria. Reproduction numbers for hospital based epidemics were described in two studies, with a range from 0.55 to 7. Two studies provided consistent data on incubation periods. For 62% of cases, symptoms occurred in less than 4 weeks (3-28 days) after infection. Evidence on contact patterns was identified in four studies, but with limited data reported for populating a mathematical model. Two studies, including one without clinically apparent donor-recipient pairs, provided information on serial intervals for household or ward contacts, showing transmission intervals of <1 week in ward-based contacts compared to up to 2 months for household contacts. Eight studies reported recovery rates of between 75% and 100% for patients who had been treated with either metronidazole or vancomycin. Forty-nine studies gave recurrence rates of between 3% and 49% but were limited by varying definitions of recurrence. No study was found which specifically reported force of infection or net reproduction numbers. CONCLUSIONS: There is currently scant literature overtly citing estimates of the parameters required to inform the quantitative modelling of Clostridium difficile transmission. Further high quality studies to investigate transmission parameters are required, including through review of published epidemiological studies where these
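
A compartmental model of the kind this review aims to parameterize turns such estimates into dynamics through the basic reproduction number. A minimal sketch under generic assumptions (the rate names `beta` and `gamma` and the numeric values are illustrative, not taken from the review):

```python
def reproduction_number(beta, gamma):
    """Basic reproduction number of a simple compartmental (SIR-type)
    model: R0 = beta / gamma, i.e. transmission rate divided by
    recovery rate.  An outbreak can grow in a fully susceptible
    population only when R0 > 1."""
    return beta / gamma

# Illustrative values only: a transmission rate of 0.2 per day with a
# 10-day infectious period (recovery rate 0.1 per day) gives R0 = 2.0,
# inside the 0.55-7 range the review reports for hospital epidemics.
print(reproduction_number(0.2, 0.1))
```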

  11. Scale interactions on diurnal to seasonal timescales and their relevance to model systematic errors

    Directory of Open Access Journals (Sweden)

    G. Yang

    2003-06-01

    Full Text Available Examples of current research into systematic errors in climate models are used to demonstrate the importance of scale interactions on diurnal, intraseasonal and seasonal timescales for the mean and variability of the tropical climate system. This has enabled some conclusions to be drawn about possible processes that may need to be represented, and some recommendations to be made regarding model improvements. It has been shown that the Maritime Continent heat source is a major driver of the global circulation, yet it is poorly represented in GCMs. A new climatology of the diurnal cycle has been used to provide compelling evidence of important land-sea breeze and gravity wave effects, which may play a crucial role in the heat and moisture budget of this key region for the tropical and global circulation. The role of the diurnal cycle has also been emphasized for intraseasonal variability associated with the Madden-Julian Oscillation (MJO). It is suggested that the diurnal cycle in Sea Surface Temperature (SST) during the suppressed phase of the MJO leads to a triggering of cumulus congestus clouds, which serve to moisten the free troposphere and hence precondition the atmosphere for the next active phase. It has been further shown that coupling between the ocean and atmosphere on intraseasonal timescales leads to a more realistic simulation of the MJO. These results stress the need for models to be able to simulate, firstly, the observed tri-modal distribution of convection and, secondly, the coupling between the ocean and atmosphere on diurnal to intraseasonal timescales. It is argued, however, that the current representation of the ocean mixed layer in coupled models is not adequate to represent the complex structure of the observed mixed layer, in particular the formation of salinity barrier layers, which can potentially provide much stronger local coupling between the atmosphere and ocean on diurnal to intraseasonal timescales.

  12. Tissue Engineering in Animal Models for Urinary Diversion: A Systematic Review

    Science.gov (United States)

    Sloff, Marije; de Vries, Rob; Geutjes, Paul; IntHout, Joanna; Ritskes-Hoitinga, Merel

    2014-01-01

    Tissue engineering and regenerative medicine (TERM) approaches may provide alternatives for gastrointestinal tissue in urinary diversion. To progress to clinically translatable studies, TERM alternatives need to be evaluated in (large) controlled and standardized animal studies. Here, we investigated all evidence for the efficacy of tissue engineered constructs in animal models for urinary diversion. Studies investigating this subject were identified through a systematic search of three different databases (PubMed, Embase and Web of Science). From each study, animal characteristics, study characteristics and experimental outcomes for meta-analyses were tabulated. Furthermore, the reporting of items vital for study replication was assessed. The retrieved studies (8 in total) showed extreme heterogeneity in study design, including animal models, biomaterials and type of urinary diversion. All studies were feasibility studies, indicating the novelty of this field. None of the studies included appropriate control groups, i.e. a comparison with the classical treatment using GI tissue. The meta-analysis showed a trend towards successful experimentation in larger animals, although no specific animal species could be identified as the most suitable model. Larger animals appear to allow a better translation to the human situation, with respect to anatomy and surgical approaches. It was unclear whether the use of cells benefits the formation of a neo urinary conduit. The reporting of the methodology and data according to standardized guidelines was insufficient and should be improved to increase the value of such publications. In conclusion, animal models in the field of TERM for urinary diversion have probably been chosen for reasons other than their predictive value. Controlled and comparative long term animal studies, with adequate methodological reporting, are needed to proceed to clinically translatable studies. This will aid in good quality research with the reduction in

  13. Tissue engineering in animal models for urinary diversion: a systematic review.

    Directory of Open Access Journals (Sweden)

    Marije Sloff

    Full Text Available Tissue engineering and regenerative medicine (TERM) approaches may provide alternatives for gastrointestinal tissue in urinary diversion. To progress to clinically translatable studies, TERM alternatives need to be evaluated in (large) controlled and standardized animal studies. Here, we investigated all evidence for the efficacy of tissue engineered constructs in animal models for urinary diversion. Studies investigating this subject were identified through a systematic search of three different databases (PubMed, Embase and Web of Science). From each study, animal characteristics, study characteristics and experimental outcomes for meta-analyses were tabulated. Furthermore, the reporting of items vital for study replication was assessed. The retrieved studies (8 in total) showed extreme heterogeneity in study design, including animal models, biomaterials and type of urinary diversion. All studies were feasibility studies, indicating the novelty of this field. None of the studies included appropriate control groups, i.e. a comparison with the classical treatment using GI tissue. The meta-analysis showed a trend towards successful experimentation in larger animals, although no specific animal species could be identified as the most suitable model. Larger animals appear to allow a better translation to the human situation, with respect to anatomy and surgical approaches. It was unclear whether the use of cells benefits the formation of a neo urinary conduit. The reporting of the methodology and data according to standardized guidelines was insufficient and should be improved to increase the value of such publications. In conclusion, animal models in the field of TERM for urinary diversion have probably been chosen for reasons other than their predictive value. Controlled and comparative long term animal studies, with adequate methodological reporting, are needed to proceed to clinically translatable studies. This will aid in good quality research with

  14. Evidence-Based Systematic Review: Effects of Different Service Delivery Models on Communication Outcomes for Elementary School-Age Children

    Science.gov (United States)

    Cirrin, Frank M.; Schooling, Tracy L.; Nelson, Nickola W.; Diehl, Sylvia F.; Flynn, Perry F.; Staskowski, Maureen; Torrey, T. Zoann; Adamczyk, Deborah F.

    2010-01-01

    Purpose: The purpose of this investigation was to conduct an evidence-based systematic review (EBSR) of peer-reviewed articles from the last 30 years about the effect of different service delivery models on speech-language intervention outcomes for elementary school-age students. Method: A computer search of electronic databases was conducted to…

  15. Effect of health belief model and health promotion model on breast cancer early diagnosis behavior: a systematic review.

    Science.gov (United States)

    Ersin, Fatma; Bahar, Zuhal

    2011-01-01

    Breast cancer is an important public health problem on the grounds that it is frequently seen and can be fatal. The objective of this systematic review is to indicate the effects of interventions performed by nurses using the Health Belief Model (HBM) and Health Promotion Model (HPM) on breast cancer early diagnosis behaviors and on the components of these models. The review was conducted in line with the 2009 guide of the Centre for Reviews and Dissemination (CRD), University of York, using the PUBMED, OVID, EBSCO and COCHRANE databases. Six hundred seventy-eight studies (PUBMED: 236, OVID: 162, EBSCO: 175, COCHRANE: 105) were found in total. Abstracts and full texts of these studies were evaluated against the inclusion and exclusion criteria, and 9 studies were determined to meet them. Sample sizes of the studies varied between 94 and 1,655. The studies showed that education grounded in these theories was effective in promoting breast cancer early diagnosis behaviors. When the literature is examined, experimental studies conducted by nurses that measure the HBM and HPM concepts before and after the intervention and show the effect of these concepts are limited in number. Randomized controlled studies that compare HBM and HPM concepts before and after the intervention would be useful in evaluating the efficacy of such interventions.

  16. Disclosing bias in bisulfite assay: MethPrimers underestimate high DNA methylation.

    Directory of Open Access Journals (Sweden)

    Andrea Fuso

    Full Text Available Discordant results obtained in bisulfite assays using MethPrimers (PCR primers designed using the MethPrimer software, or assuming that non-CpG cytosines are not methylated) versus primers insensitive to cytosine methylation led us to hypothesize a technical bias. We therefore used the two kinds of primers to study different experimental models and methylation statuses. We demonstrated that MethPrimers negatively select hypermethylated DNA sequences in the PCR step of the bisulfite assay, resulting in CpG methylation underestimation and non-CpG methylation masking, and failing to evidence differential methylation statuses. We also describe the characteristics of "Methylation-Insensitive Primers" (MIPs), which have degenerate bases (G/A) to cope with the uncertain C/U conversion. As CpG and non-CpG DNA methylation patterns are largely variable depending on the species, developmental stage, tissue and cell type, a variable extent of the bias is expected. The more the methylome is methylated, the greater the extent of the bias, with a prevalent effect of non-CpG methylation. These findings suggest a revision of several DNA methylation patterns documented so far and also point out the necessity of applying unbiased analyses to the increasing number of epigenomic studies.

  17. Robustness to divergence time underestimation when inferring species trees from estimated gene trees.

    Science.gov (United States)

    DeGiorgio, Michael; Degnan, James H

    2014-01-01

    To infer species trees from gene trees estimated from phylogenomic data sets, tractable methods are needed that can handle dozens to hundreds of loci. We examine several computationally efficient approaches (MP-EST, STAR, STEAC, STELLS, and STEM) for inferring species trees from gene trees estimated using maximum likelihood (ML) and Bayesian approaches. Among the methods examined, we found that topology-based methods often performed better using ML gene trees and methods employing coalescent times typically performed better using Bayesian gene trees, with MP-EST, STAR, STEAC, and STELLS outperforming STEM under most conditions. We examine why the STEM tree (also called the GLASS or Maximum Tree) is less accurate on estimated gene trees by comparing estimated and true coalescence times, performing species tree inference using simulations, and analyzing a great ape data set while keeping track of false positive and false negative rates for inferred clades. We find that although true coalescence times are more ancient than speciation times under the multispecies coalescent model, estimated coalescence times are often more recent than speciation times. This underestimation can lead to increased bias and lack of resolution with increased sampling (either alleles or loci) when gene trees are estimated with ML. The problem appears to be less severe using Bayesian gene-tree estimates.

  18. Vertical transmission and fetal damage in animal models of congenital toxoplasmosis: A systematic review.

    Science.gov (United States)

    Vargas-Villavicencio, José Antonio; Besné-Mérida, Alejandro; Correa, Dolores

    2016-06-15

    In humans, the probability of congenital infection and fetal damage due to Toxoplasma gondii depends on the gestation period at which primary infection occurs. Many animal models have been used for vaccine or drug testing, or for studies on host or parasite factors that affect transmission or fetal pathology, but few works have directly tested fetal infection and damage rates along gestation. The purpose of this work, therefore, was to perform a systematic review of the literature to determine whether there is a model which reflects these changes as they occur in humans. We looked for papers appearing between 1970 and 2014 in major databases like Medline and Scopus, as well as in the gray literature. From almost 11,000 citations obtained, only 49 papers fulfilled the criteria of having data on all independent variables and at least one dependent datum for control (untreated) groups. Some interesting findings could be extracted. For example, pigs seem resistant and sheep susceptible to congenital infection. Also, oocysts cause more congenitally infected offspring than tissue cysts, bradyzoites or tachyzoites. In spite of these interesting findings, very few results on vertical transmission or fetal damage rates were similar to those described for humans, and only for one of the gestation thirds, not all. Moreover, in most designs tissue cysts (with an unknown number of bradyzoites) were used, so the actual dose could not be established. The meta-analysis could not be performed, mainly because of great heterogeneity in experimental conditions. Nevertheless, the results gathered suggest that a model could be designed to represent the increase in vertical transmission and decrease in fetal damage found in humans under natural conditions.

  19. Antimatter underestimated

    CERN Document Server

    Gsponer, A; Gsponer, Andre; Hurni, Jean-Pierre

    1987-01-01

    We warn of the potential nuclear proliferation consequences of military applications of nano- or microgram amounts of antimatter, such as the triggering of high-yield thermonuclear explosions, laser pumping, compact sources of energy, directed-energy beams, and portable sources of muons.

  20. Factors associated with adoption of health information technology: a conceptual model based on a systematic review.

    Science.gov (United States)

    Kruse, Clemens Scott; DeShazo, Jonathan; Kim, Forest; Fulton, Lawrence

    2014-05-23

    The Health Information Technology for Economic and Clinical Health Act (HITECH) allocated $19.2 billion to incentivize adoption of the electronic health record (EHR). Since 2009, Meaningful Use Criteria have dominated information technology (IT) strategy. Health care organizations have struggled to meet expectations and avoid penalties to reimbursements from the Center for Medicare and Medicaid Services (CMS). Organizational theories attempt to explain factors that influence organizational change, and many theories address changes in organizational strategy. However, due to the complexities of the health care industry, existing organizational theories fall short of demonstrating association with significant health care IT implementations. There is no organizational theory for health care that identifies, groups, and analyzes both internal and external factors of influence for large health care IT implementations like adoption of the EHR. The purpose of this systematic review is to identify a full spectrum of both internal organizational and external environmental factors associated with the adoption of health information technology (HIT), specifically the EHR. The result is a conceptual model that is commensurate with the complexity of the health care sector. We performed a systematic literature search in PubMed (restricted to English), EBSCO Host, and Google Scholar for both empirical studies and theory-based writing from 1993-2013 that demonstrated association between influential factors and three modes of HIT: EHR, electronic medical record (EMR), and computerized provider order entry (CPOE). We also looked at published books on organizational theories. We took notes and identified trends in adoption factors. These factors were grouped by their association with various versions of EHR adoption. The resulting conceptual model summarizes the diversity of independent variables (IVs) and dependent variables (DVs) used in articles, editorials, books, as

  1. Prediction models for the mortality risk in chronic dialysis patients: a systematic review and independent external validation study.

    Science.gov (United States)

    Ramspek, Chava L; Voskamp, Pauline Wm; van Ittersum, Frans J; Krediet, Raymond T; Dekker, Friedo W; van Diepen, Merel

    2017-01-01

    In medicine, many more prediction models have been developed than are implemented or used in clinical practice. These models cannot be recommended for clinical use before external validity is established. Though various models to predict mortality in dialysis patients have been published, very few have been validated and none are used in routine clinical practice. The aim of the current study was to identify existing models for predicting mortality in dialysis patients through a review and subsequently to externally validate these models in the same large independent patient cohort, in order to assess and compare their predictive capacities. A systematic review was performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. To account for missing data, multiple imputation was performed. The original prediction formulae were extracted from selected studies. The probability of death per model was calculated for each individual within the Netherlands Cooperative Study on the Adequacy of Dialysis (NECOSAD). The predictive performance of the models was assessed based on their discrimination and calibration. In total, 16 articles were included in the systematic review. External validation was performed in 1,943 dialysis patients from NECOSAD for a total of seven models. The models performed moderately to well in terms of discrimination, with C-statistics ranging from 0.710 (interquartile range 0.708-0.711) to 0.752 (interquartile range 0.750-0.753) for a time frame of 1 year. According to the calibration, most models overestimated the probability of death. Overall, the performance of the models was poorer in the external validation than in the original population, affirming the importance of external validation. Floege et al.'s models showed the highest predictive performance. The present study is a step forward in the use of a prediction model as a useful tool for nephrologists, using evidence-based medicine that
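
The discrimination reported above is summarized by the C-statistic. For a binary outcome it reduces to the probability that a randomly chosen event case received a higher predicted risk than a randomly chosen non-case. A minimal sketch (the function name and toy data are illustrative, not taken from the study):

```python
from itertools import combinations

def c_statistic(risks, outcomes):
    """Concordance (C) statistic for a binary outcome (0/1).

    Over all pairs with discordant outcomes, count a pair as concordant
    when the event case received the higher predicted risk; ties count
    as half.  Equivalent to the ROC area under the curve."""
    concordant = 0.0
    pairs = 0
    for (r_i, y_i), (r_j, y_j) in combinations(zip(risks, outcomes), 2):
        if y_i == y_j:
            continue  # only pairs with one event and one non-event are usable
        pairs += 1
        r_event = r_i if y_i == 1 else r_j
        r_nonevent = r_j if y_i == 1 else r_i
        if r_event > r_nonevent:
            concordant += 1.0
        elif r_event == r_nonevent:
            concordant += 0.5
    return concordant / pairs

# Perfectly ranked predictions give C = 1.0; C = 0.5 is no better than chance.
print(c_statistic([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0]))
```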

  2. The psychoneuroimmunological effects of music: a systematic review and a new model.

    Science.gov (United States)

    Fancourt, Daisy; Ockelford, Adam; Belai, Abi

    2014-02-01

    There has been a growing interest over the past decade into the health benefits of music, in particular examining its psychological and neurological effects. Yet this is the first attempt to systematically review publications on the psychoneuroimmunology of music. Of the selected sixty-three studies published over the past 22 years, a range of effects of music on neurotransmitters, hormones, cytokines, lymphocytes, vital signs and immunoglobulins as well as psychological assessments are cataloged. Research so far points to the pivotal role of stress pathways in linking music to an immune response. However, several challenges to this research are noted: (1) there is very little discussion on the possible mechanisms by which music is achieving its neurological and immunological impact; (2) the studies tend to examine biomarkers in isolation, without taking into consideration the interaction of the biomarkers in question with other physiological or metabolic activities of the body, leading to an unclear understanding of the impact that music may be having; (3) terms are not being defined clearly enough, such as distinctions not being made between different kinds of stress and 'music' being used to encompass a broad spectrum of activities without determining which aspects of musical engagement are responsible for alterations in biomarkers. In light of this, a new model is presented which provides a framework for developing a taxonomy of musical and stress-related variables in research design, and tracing the broad pathways that are involved in its influence on the body.

  3. Genetics of borderline personality disorder: systematic review and proposal of an integrative model.

    Science.gov (United States)

    Amad, Ali; Ramoz, Nicolas; Thomas, Pierre; Jardri, Renaud; Gorwood, Philip

    2014-03-01

    Borderline personality disorder (BPD) is one of the most common mental disorders and is characterized by a pervasive pattern of emotional lability, impulsivity, interpersonal difficulties, identity disturbances, and disturbed cognition. Here, we performed a systematic review of the literature concerning the genetics of BPD, including familial and twin studies, association studies, and gene-environment interaction studies. Moreover, meta-analyses were performed when at least two case-control studies testing the same polymorphism were available. For each gene variant, a pooled odds ratio (OR) was calculated using fixed or random effects models. Familial and twin studies largely support the potential role of a genetic vulnerability at the root of BPD, with an estimated heritability of approximately 40%. Moreover, there is evidence for both gene-environment interactions and correlations. However, association studies for BPD are sparse, making it difficult to draw clear conclusions. According to our meta-analysis, no significant associations were found for the serotonin transporter gene, the tryptophan hydroxylase 1 gene, or the serotonin 1B receptor gene. We hypothesize that such a discrepancy (negative association studies but high heritability of the disorder) could be understandable through a paradigm shift, in which "plasticity" genes (rather than "vulnerability" genes) would be involved. Such a framework postulates a balance between positive and negative events, which interact with plasticity genes in the genesis of BPD.
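
The pooled odds ratio described above is conventionally computed by inverse-variance weighting of the per-study log-ORs; the sketch below covers the fixed-effect case (a random-effects model would add a between-study variance term to each weight). The CI-based standard-error recovery is a standard convention, assumed here rather than taken from the paper:

```python
import math

def pooled_or_fixed(odds_ratios, ci_lowers, ci_uppers):
    """Inverse-variance fixed-effect pooling of odds ratios.

    Each study's log-OR is weighted by 1/variance, with the standard
    error recovered from the 95% CI on the log scale:
    SE = (ln(upper) - ln(lower)) / (2 * 1.96)."""
    weights, weighted_logs = [], []
    for or_, lo, hi in zip(odds_ratios, ci_lowers, ci_uppers):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        weights.append(w)
        weighted_logs.append(w * math.log(or_))
    return math.exp(sum(weighted_logs) / sum(weights))

# Two identical studies pool back to their common OR.
print(pooled_or_fixed([1.5, 1.5], [1.0, 1.0], [2.25, 2.25]))
```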

  4. Systematic Design and Modeling of a OTA-C Filter for Portable ECG Detection.

    Science.gov (United States)

    Shuenn-Yuh Lee; Chih-Jen Cheng

    2009-02-01

    This study presents a systematic design of the fully differential operational transconductance amplifier-C (OTA-C) filter for a heart activity detection apparatus. Since the linearity and noise of the filter are dependent on the building cell, a precise behavioral model of the real OTA circuit is created. To reduce the influence of coefficient sensitivity and maintain an undistorted biosignal, a fifth-order ladder-type lowpass Butterworth filter is employed. Based on this topology, a chip fabricated in a 0.18-μm CMOS process is simulated and measured to validate the system estimation. Since battery life and integration with the low-voltage digital processor are the most critical requirements for a portable diagnosis device, the OTA-based circuit is operated in the subthreshold region to save power under a supply voltage of 1 V. Measurement results show that this low-voltage, low-power filter achieves an HD3 of -48.9 dB, a dynamic range (DR) of 50 dB, and a power consumption of 453 nW. Therefore, the OTA-C filter can be adopted to eliminate out-of-band interference in the electrocardiogram (ECG), whose signal bandwidth is located within 250 Hz.
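
The fifth-order Butterworth response chosen above is maximally flat in the passband, and its ideal magnitude has a textbook closed form. A sketch using the 250 Hz ECG bandwidth from the abstract as the cutoff (this is the generic formula, not the paper's circuit):

```python
import math

def butterworth_gain(f, fc=250.0, order=5):
    """Ideal magnitude response of an n-th order Butterworth lowpass:
    |H(f)| = 1 / sqrt(1 + (f / fc)**(2 * order)).
    At f = fc the gain is 1/sqrt(2), i.e. -3 dB, for any order."""
    return 1.0 / math.sqrt(1.0 + (f / fc) ** (2 * order))

# Gain in dB at the 250 Hz cutoff (approximately -3.01 dB).
print(20 * math.log10(butterworth_gain(250.0)))
```

A higher order steepens the roll-off: at 10x the cutoff a fifth-order response is already attenuated by about 100 dB.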

  5. Systematic effects on the size-luminosity relation: dependence on model fitting and morphology

    CERN Document Server

    Bernardi, M; Vikram, V; Huertas-Company, M; Mei, S; Shankar, F; Sheth, R K

    2012-01-01

    We quantify the systematics in the size-luminosity relation of galaxies in the SDSS main sample which arise from fitting different 1- and 2-component model profiles to the images. In objects brighter than L*, fitting a single Sersic profile to what is really a two-component SerExp system leads to biases: the half-light radius is increasingly overestimated as n of the fitted single component increases; it is also overestimated at B/T ~ 0.6. However, the net effect on the R-L relation is small, except for the most luminous tail, where it curves upwards towards larger sizes. We also study how this relation depends on morphological type. Our analysis is one of the first to use Bayesian-classifier derived weights, rather than hard cuts, to define morphology. Crudely, there appear to be only two relations: one for early-types (Es, S0s and Sa's) and another for late-types (Sbs and Scds). However, closer inspection shows that within the early-type sample S0s tend to be 15% smaller than Es of the same luminosity, and,...
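
The single-component profile being fitted above has the form I(R) = Ie exp(-b_n[(R/Re)^(1/n) - 1]), where Re is the half-light radius whose bias is at issue and Ie the intensity at Re. A sketch using the common leading-order approximation b_n ≈ 2n - 1/3 (an assumption for illustration; precise fits solve for the exact b_n):

```python
import math

def sersic_intensity(R, Re, n, Ie):
    """Sersic surface-brightness profile
    I(R) = Ie * exp(-b_n * ((R / Re)**(1/n) - 1)),
    with the approximation b_n ~ 2n - 1/3 (reasonable for n >~ 1).
    By construction I(Re) = Ie exactly."""
    b_n = 2.0 * n - 1.0 / 3.0
    return Ie * math.exp(-b_n * ((R / Re) ** (1.0 / n) - 1.0))

# A de Vaucouleurs-like profile (n = 4): intensity at the half-light radius.
print(sersic_intensity(1.0, 1.0, 4.0, 10.0))
```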

  6. The systematic study of the stability of forecasts in the rate- and state-dependent model.

    Science.gov (United States)

    De Gaetano, D.; McCloskey, J.; Nalbant, S.

    2012-04-01

    Numerous observations have shown a general spatial correlation between positive Coulomb failure stress changes due to an earthquake and the locations of aftershocks. However, this correlation does not by itself indicate the aftershock rate, from which the magnitude distribution can be inferred using the Gutenberg-Richter law. Dieterich's rate- and state-dependent model can be used to forecast the observed aftershock rate for the space and time evolution of seismicity caused by stress changes applied to an infinite population of nucleating patches. The seismicity rate changes in this model depend on eight parameters: the stressing rate, the amplitude of the stress perturbation, the physical constitutive properties of faults, the spatial parameters (location and radii of the cells), the start and duration of each of the temporal windows, as well as the background seismicity rate. The background seismicity is obtained from the epidemic-type aftershock sequence (ETAS) model. We use the 1992 Landers earthquake as a case study, using the Southern California Earthquake Data Centre (SCEDC) catalogue, to examine whether Dieterich's rate- and state-dependent model can forecast the aftershock seismicity rate. A systematic study is performed over a range of values for all the parameters to test the forecasting ability of this model. The results suggest variable success in forecasting as the parameter values are varied, with the spatial and temporal parameters being the most sensitive. Dieterich's rate- and state-dependent model is compared with a well-studied null hypothesis, the Omori-Utsu law. This law describes the aftershock rate as a power law in the elapsed time since the main shock and depends on only three parameters: the aftershock productivity, the constant time shift, and the power-law decay exponent, all of which can be estimated in the early part of the aftershock sequence and then extrapolated to give a long-term rate forecast. All parameters are estimated using maximum
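    The Omori-Utsu null hypothesis mentioned above is compact enough to state directly. The parameter values below are hypothetical illustrations, not fits to the Landers sequence:

```python
def omori_utsu_rate(t, K, c, p):
    """Modified Omori-Utsu law: aftershock rate n(t) = K / (c + t)**p,
    with productivity K, time offset c, and decay exponent p,
    as a function of time t since the main shock (same units as c)."""
    return K / (c + t) ** p

# Hypothetical parameters (K in events/day, c in days, p dimensionless).
K, c, p = 100.0, 0.05, 1.1
rate_day1 = omori_utsu_rate(1.0, K, c, p)
print(round(rate_day1, 1))  # rate one day after the main shock
```

    Because the three parameters can be fitted early in the sequence, the power law extrapolates to a long-term forecast with far fewer degrees of freedom than the eight-parameter rate-and-state model.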

  7. Academic self-concept, learning motivation, and test anxiety of the underestimated student.

    Science.gov (United States)

    Urhahne, Detlef; Chao, Sheng-Han; Florineth, Maria Luise; Luttenberger, Silke; Paechter, Manuela

    2011-03-01

    BACKGROUND. Teachers' judgments of student performance on a standardized achievement test often result in an overestimation of students' abilities. In the majority of cases, a larger group of overestimated students and a smaller group of underestimated students are formed by these judgments. AIMS. In this research study, the consequences of the underestimation of students' mathematical performance potential were examined. SAMPLE. Two hundred and thirty-five fourth grade students and their fourteen mathematics teachers took part in the investigation. METHOD. Students worked on a standardized mathematics achievement test and completed a self-description questionnaire about motivation and affect. Teachers estimated each individual student's potential with regard to mathematics test performance as well as students' expectancy for success, level of aspiration, academic self-concept, learning motivation, and test anxiety. The differences between teachers' judgments on students' test performance and students' actual performance were used to build groups of underestimated and overestimated students. RESULTS. Underestimated students displayed equal levels of test performance, learning motivation, and level of aspiration in comparison with overestimated students, but had lower expectancy for success, lower academic self-concept, and experienced more test anxiety. Teachers expected that underestimated students would receive lower grades on the next mathematics test, believed that students were satisfied with lower grades, and assumed that the students have weaker learning motivation than their overestimated classmates. CONCLUSION. Teachers' judgment error was not confined to test performance but generalized to motivational and affective traits of the students.

  8. Risk Analysis of Underestimate Cost Offer to The Project Quality in Aceh Province

    Science.gov (United States)

    Rani, Hafnidar A.

    2016-11-01

    Errors in the process of determining an offer price can be substantial, and they can lead to a project being priced below cost (underestimate), which reduces profit during implementation. The Government Equipment/Service Procurement Policy Institution (LKPP) assesses that unrealistically low pricing in government equipment/service procurement is still widespread and can potentially decrease project quality. This study aimed to identify the most dominant factors in underestimate cost offer practice, to analyze the relationship of underestimate cost offer risk factors to road construction project quality in Aceh Province, and to analyze the most potential underestimate cost offer risk factors affecting road construction project quality in Aceh Province. The road construction projects observed were those implemented in Aceh Province from 2013 to 2015. The study was conducted by interviewing the Government Budget Authority (KPA) and distributing a questionnaire to road construction contractors with the qualifications K1, K2, K3, M1, M2, and B1. Based on 2016 data from the Construction Service Development Institution (LPJK) of Aceh Province, the population comprises 2,717 contractors. Using the Slovin equation, a sample of 97 contractors was obtained. The most dominant factor in the underestimate cost offer risk of road construction projects in Aceh Province is the contingency cost factor, with a mean of 4.374.
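    The sample size follows from Slovin's formula, n = N / (1 + N·e²). The 10% margin of error e is our assumption, chosen because it reproduces the reported sample of 97 from a population of 2,717:

```python
import math

def slovin_sample_size(N, e):
    """Slovin's formula: minimum sample size n = N / (1 + N * e**2),
    for population size N and margin of error e, rounded up."""
    return math.ceil(N / (1 + N * e ** 2))

# 2,717 contractors with an assumed 10% margin of error.
print(slovin_sample_size(2717, 0.10))  # 97
```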

  9. The systematics of strong lens modeling quantified: the effects of constraint selection and redshift information on magnification, mass, and multiple image predictability

    CERN Document Server

    Johnson, Traci L

    2016-01-01

    Until now, systematic errors in strong gravitational lens modeling have been acknowledged but never fully quantified. Here, we launch an investigation into the systematics induced by constraint selection. We model the simulated cluster Ares 362 times using random selections of image systems, with and without spectroscopic redshifts, and quantify the systematics using several diagnostics: image predictability, accuracy of model-predicted redshifts, enclosed mass, and magnification. We find that for models with $>15$ image systems, the image plane rms does not decrease significantly when more systems are added; however, the rms values quoted in the literature may be misleading as to the ability of a model to predict new multiple images. The mass is well constrained near the Einstein radius in all cases, and the systematic error drops for models with $>10$ image systems. Magnification errors are smallest along the straight portions of the critical curve, and the value of the magnification is systematically lower near curved por...
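    The image-plane rms diagnostic used above is conventionally the root-mean-square offset between model-predicted and observed image positions. A minimal sketch, with hypothetical positions (in arcseconds):

```python
import math

def image_plane_rms(predicted, observed):
    """RMS offset between model-predicted and observed image positions,
    each given as a list of (x, y) coordinates."""
    sq = [(px - ox) ** 2 + (py - oy) ** 2
          for (px, py), (ox, oy) in zip(predicted, observed)]
    return math.sqrt(sum(sq) / len(sq))

# Two images: one offset by (0.3, 0.4) arcsec, one perfectly reproduced.
pred = [(1.0, 2.0), (3.0, 4.0)]
obs = [(1.3, 2.4), (3.0, 4.0)]
print(round(image_plane_rms(pred, obs), 4))  # 0.3536
```

    A small rms over the constraints used in the fit does not guarantee a small offset for a *new* multiple image, which is the caution the abstract raises.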

  10. Models in the delivery of depression care: A systematic review of randomised and controlled intervention trials

    Directory of Open Access Journals (Sweden)

    Clack Dannielle

    2008-05-01

    Background: There is still debate as to which features, types or components of primary care interventions are associated with improved depression outcomes. Previous reviews have focused on components of collaborative care models in general practice settings. This paper aims to determine the effective components of depression care in primary care through a systematic examination of both general practice and community-based intervention trials. Methods: Fifty-five randomised and controlled research trials which focused on adults and contained depression outcome measures were identified through PubMed, PsycInfo and the Cochrane Central Register of Controlled Trials databases. Trials were classified according to the components involved in the delivery of treatment, the type of treatment, the primary focus or setting of the study, detailed features of delivery, and the discipline of the professional providing the treatment. The primary outcome measure was significant improvement on the key depression measure. Results: Components which were found to significantly predict improvement were the revision of professional roles, the provision of a case manager who provided direct feedback and delivered a psychological therapy, and an intervention that incorporated patient preferences into care. Nurse-, psychologist- and psychiatrist-delivered care were effective, but pharmacist delivery was not. Training directed to general practitioners was significantly less successful than interventions that did not have training as the most important intervention. Community interventions were effective. Conclusion: Case management is important in the provision of care in general practice. Certain community models of care (education programs) have potential while others are not successful in their current form (pharmacist monitoring).

  11. Is dream recall underestimated by retrospective measures and enhanced by keeping a logbook? A review.

    Science.gov (United States)

    Aspy, Denholm J; Delfabbro, Paul; Proeve, Michael

    2015-05-01

    There are two methods commonly used to measure dream recall in the home setting. The retrospective method involves asking participants to estimate their dream recall in response to a single question, and the logbook method involves keeping a daily record of one's dream recall. Until recently, the implicit assumption has been that these measures are largely equivalent. However, this is challenged by the tendency for retrospective measures to yield significantly lower dream recall rates than logbooks. A common explanation for this is that retrospective measures underestimate dream recall. Another is that keeping a logbook enhances it. If retrospective measures underestimate dream recall and logbooks enhance it, then both are unlikely to reflect typical dream recall rates and may be confounded with variables associated with the underestimation and enhancement effects. To date, this issue has received insufficient attention. The present review addresses this gap in the literature.

  12. Calorie Underestimation When Buying High-Calorie Beverages in Fast-Food Contexts.

    Science.gov (United States)

    Franckle, Rebecca L; Block, Jason P; Roberto, Christina A

    2016-07-01

    We asked 1877 adults and 1178 adolescents visiting 89 fast-food restaurants in New England in 2010 and 2011 to estimate calories purchased. Calorie underestimation was greater among those purchasing a high-calorie beverage than among those who did not (adults: 324 ± 698 vs 102 ± 591 calories; adolescents: 360 ± 602 vs 198 ± 509 calories). This difference remained significant for adults but not adolescents after adjusting for total calories purchased. Purchasing high-calorie beverages may uniquely contribute to calorie underestimation among adults.

  13. Evaluation of multidecadal variability in CMIP5 surface solar radiation and inferred underestimation of aerosol direct effects over Europe, China, Japan, and India

    Science.gov (United States)

    Allen, R. J.; Norris, J. R.; Wild, M.

    2013-06-01

    Observations from the Global Energy Balance Archive indicate regional decreases in all sky surface solar radiation from the ~1950s to the 1980s, followed by an increase during the 1990s. These periods are popularly called dimming and brightening, respectively. Removal of the radiative effects of cloud cover variability from all sky surface solar radiation results in a quantity called "clear sky proxy" radiation, in which multidecadal trends can be seen more distinctly, suggesting aerosol radiative forcing as a likely cause. Prior work has shown climate models from the Coupled Model Intercomparison Project 3 (CMIP3) generally underestimate the magnitude of these trends, particularly over China and India. Here we perform a similar analysis with 173 simulations from 42 climate models participating in the new CMIP5. Results show negligible improvement over CMIP3, as CMIP5 dimming trends over four regions—Europe, China, India, and Japan—are all underestimated. This bias is largest for both India and China, where the multimodel mean yields a decrease in clear sky proxy radiation of -1.3 ± 0.3 and -1.2 ± 0.2 W m-2 decade-1, respectively, compared to observed decreases of -6.5 ± 0.9 and -8.2 ± 1.3 W m-2 decade-1. Similar underestimation of the observed dimming over Japan exists, with the CMIP5 mean dimming ~20% as large as observed. Moreover, not a single simulation reproduces the magnitude of the observed dimming trend for these three regions. Relative to dimming, CMIP5 models better simulate the observed brightening, but significant underestimation exists for both China and Japan. Overall, no individual model performs particularly well for all four regions. Model biases do not appear to be related to the use of prescribed versus prognostic aerosols or to aerosol indirect effects. However, models exhibit significant correlations between clear sky proxy radiation and several aerosol-related fields, most notably aerosol optical depth (AOD) and absorption AOD. This suggests model

  14. The HIV Modes of Transmission model: a systematic review of its findings and adherence to guidelines

    Directory of Open Access Journals (Sweden)

    Zara Shubber

    2014-06-01

    Introduction: The HIV Modes of Transmission (MOT) model estimates the annual fraction of new HIV infections (FNI) acquired by different risk groups. It was designed to guide country-specific HIV prevention policies. To determine if the MOT produced context-specific recommendations, we analyzed MOT results by region and epidemic type, and explored the factors (e.g., data used to estimate parameter inputs, adherence to guidelines) influencing the differences. Methods: We systematically searched MEDLINE, EMBASE and UNAIDS reports, and contacted UNAIDS country directors for published MOT results from MOT inception (2003) to 25 September 2012. Results: We retrieved four journal articles and 20 UNAIDS reports covering 29 countries. In 13 countries, the largest FNI (range 26 to 63%) was acquired by the low-risk group and increased with low-risk population size. The FNI among female sex workers (FSWs) remained low (median 1.3%, range 0.04 to 14.4%), with little variability by region and epidemic type despite variability in sexual behaviour. In India and Thailand, where FSWs play an important role in transmission, the FNI among FSWs was 2 and 4%, respectively. In contrast, the FNI among men who have sex with men (MSM) varied across regions (range 0.1 to 89%) and increased with MSM population size. The FNI among people who inject drugs (PWID; range 0 to 82%) was largest in early-phase epidemics with low overall HIV prevalence. Most MOT studies were conducted and reported as per guidelines but data quality remains an issue. Conclusions: Although countries are generally performing the MOT as per guidelines, there is little variation in the FNI (except among MSM and PWID) by region and epidemic type. Homogeneity in MOT FNI for FSWs, clients and low-risk groups may limit the utility of MOT for guiding country-specific interventions in heterosexual HIV epidemics.

  15. Systematic review and meta-analysis of therapeutic hypothermia in animal models of spinal cord injury.

    Directory of Open Access Journals (Sweden)

    Peter E Batchelor

    Therapeutic hypothermia is a clinically useful neuroprotective therapy for cardiac arrest and neonatal hypoxic ischemic encephalopathy and may potentially be useful for the treatment of other neurological conditions, including traumatic spinal cord injury (SCI). The pre-clinical studies evaluating the effectiveness of hypothermia in acute SCI broadly utilise either systemic hypothermia or cooling regional to the site of injury. The literature has not been uniformly positive, with conflicting studies of varying quality, some performed decades previously. In this study, we systematically review and meta-analyse the literature to determine the efficacy of systemic and regional hypothermia in traumatic SCI, the experimental conditions influencing this efficacy, and the influence of study quality on outcome. Three databases were utilised: PubMed, ISI Web of Science and Embase. Our inclusion criteria were the (i) reporting of the efficacy of hypothermia on functional outcome, (ii) the number of animals, and (iii) the mean outcome and variance in each group. Systemic hypothermia improved behavioural outcomes by 24.5% (95% CI 10.2 to 38.8), and a similar magnitude of improvement was seen across a number of high-quality studies. The overall behavioural improvement with regional hypothermia was 26.2%, but the variance was wide (95% CI -3.77 to 56.2). This result may reflect a preponderance of positive low-quality data, although a preferential effect of hypothermia in ischaemic models of injury may explain some of the disparate data. Sufficient heterogeneity was present between studies of regional hypothermia to reveal a number of factors potentially influencing efficacy, including depth and duration of hypothermia, animal species, and neurobehavioural assessment. However, these factors could reflect the influence of earlier lower-quality literature. Systemic hypothermia appears to be a promising potential method of treating acute SCI on the basis of meta-analysis of the
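    Pooled effects with 95% confidence intervals of the kind quoted above come from standard inverse-variance meta-analysis. The sketch below is the fixed-effect version with hypothetical study data (reviews like this one typically use a random-effects model, which adds a between-study variance term to each weight):

```python
import math

def fixed_effect_pool(estimates, variances):
    """Inverse-variance (fixed-effect) pooled estimate with a 95% CI.
    Each study is weighted by the reciprocal of its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical per-study effect sizes (% improvement) and variances --
# illustrative only, not the data behind the review above.
pooled, lo, hi = fixed_effect_pool([20.0, 30.0, 25.0], [25.0, 50.0, 20.0])
print(round(pooled, 2), round(lo, 2), round(hi, 2))
```

    Precise studies dominate the pooled estimate, which is why a wide CI (as for regional hypothermia above) often signals a preponderance of small or heterogeneous studies.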

  16. Neural systems language: a formal modeling language for the systematic description, unambiguous communication, and automated digital curation of neural connectivity.

    Science.gov (United States)

    Brown, Ramsay A; Swanson, Larry W

    2013-09-01

    Systematic description and the unambiguous communication of findings and models remain among the unresolved fundamental challenges in systems neuroscience. No common descriptive frameworks exist to describe systematically the connective architecture of the nervous system, even at the grossest level of observation. Furthermore, the accelerating volume of novel data generated on neural connectivity outpaces the rate at which this data is curated into neuroinformatics databases to synthesize digitally systems-level insights from disjointed reports and observations. To help address these challenges, we propose the Neural Systems Language (NSyL). NSyL is a modeling language to be used by investigators to encode and communicate systematically reports of neural connectivity from neuroanatomy and brain imaging. NSyL engenders systematic description and communication of connectivity irrespective of the animal taxon described, experimental or observational technique implemented, or nomenclature referenced. As a language, NSyL is internally consistent, concise, and comprehensible to both humans and computers. NSyL is a promising development for systematizing the representation of neural architecture, effectively managing the increasing volume of data on neural connectivity and streamlining systems neuroscience research. Here we present similar precedent systems, how NSyL extends existing frameworks, and the reasoning behind NSyL's development. We explore NSyL's potential for balancing robustness and consistency in representation by encoding previously reported assertions of connectivity from the literature as examples. Finally, we propose and discuss the implications of a framework for how NSyL will be digitally implemented in the future to streamline curation of experimental results and bridge the gaps among anatomists, imagers, and neuroinformatics databases.

  17. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  18. A systematic review of repetitive functional task practice with modelling of resource use, costs and effectiveness.

    Science.gov (United States)

    French, B; Leathley, M; Sutton, C; McAdam, J; Thomas, L; Forster, A; Langhorne, P; Price, C; Walker, A; Watkins, C

    2008-07-01

    To determine whether repetitive functional task practice (RFTP) after stroke improves limb-specific or global function or activities of daily living and whether treatment effects are dependent on the amount of practice, or the type or timing of the intervention. Also to provide estimates of the cost-effectiveness of RFTP. The main electronic databases were searched from inception to week 4, September 2006. Searches were also carried out on non-English-language databases and for unpublished trials up to May 2006. Standard quantitative methods were used to conduct the systematic review. The measures of efficacy of RFTP from the data synthesis were used to inform an economic model. The model used a pre-existing data set and tested the potential impact of RFTP on cost. An incremental cost per quality-adjusted life-year (QALY) gained for RFTP was estimated from the model. Sensitivity analyses around the assumptions made for the model were used to test the robustness of the estimates. Thirty-one trials with 34 intervention-control pairs and 1078 participants were included. Overall, it was found that some forms of RFTP resulted in improvement in global function, and in both arm and lower limb function. Overall standardised mean difference in data suitable for pooling was 0.38 [95% confidence interval (CI) 0.09 to 0.68] for global motor function, 0.24 (95% CI 0.06 to 0.42) for arm function and 0.28 (95% CI 0.05 to 0.51) for functional ambulation. Results suggest that training may be sufficient to have an impact on activities of daily living. Retention effects of training persist for up to 6 months, but whether they persist beyond this is unclear. There was little or no evidence that treatment effects overall were modified by time since stroke or dosage of task practice, but results for upper limb function were modified by type of intervention. The economic modelling suggested that RFTP was cost-effective, given a threshold for cost-effectiveness of £20,000 per QALY
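    Cost-effectiveness against a £20,000-per-QALY threshold is judged via the incremental cost-effectiveness ratio (ICER). The cost and QALY figures below are hypothetical, not outputs of the review's economic model:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost (here GBP) per
    extra quality-adjusted life-year gained versus the comparator."""
    return delta_cost / delta_qaly

# Hypothetical figures: the intervention costs £600 more per patient and
# yields 0.05 extra QALYs, i.e. £12,000 per QALY -- under £20,000.
value = icer(600.0, 0.05)
print(round(value), value < 20000)  # 12000 True
```

    An intervention is deemed cost-effective when its ICER falls below the decision-maker's willingness-to-pay threshold; sensitivity analyses vary the cost and QALY inputs to test how robust that verdict is.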

  19. A minimal set of invariants as a systematic approach to higher order gravity models: Physical and Cosmological Constraints

    CERN Document Server

    Moldenhauer, Jacob

    2009-01-01

    We compare higher order gravity models to observational constraints from magnitude-redshift supernova data, distance to the last scattering surface of the CMB, and Baryon Acoustic Oscillations. We follow a recently proposed systematic approach to higher order gravity models based on minimal sets of curvature invariants, and select models that pass some physical acceptability conditions (free of ghost instabilities, real and positive propagation speeds, and free of separatrices). Models that satisfy these physical and observational constraints are found in this analysis and do provide fits to the data that are very close to those of the LCDM concordance model. However, we find that the limitation of the models considered here comes from the presence of superluminal mode propagations for the constrained parameter space of the models.

  20. Molecular structure based property modeling: Development/ improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan;

    2013-01-01

    The objective of this work is to develop a method for performing property-data-model analysis so that efficient use of knowledge of properties could be made in the development/improvement of property prediction models. The method includes: (i) analysis of property data and its consistency check; (ii) selection of the most appropriate form of the property model; (iii) selection of the data-set for performing parameter regression and uncertainty analysis; and (iv) analysis of model prediction errors to take necessary corrective steps to improve the accuracy and the reliability of property models. The method is applicable to a wide range of properties of pure compounds. In this work, however, the application of the method is illustrated for the property modeling of normal melting point, enthalpy of fusion, enthalpy of formation, and critical temperature. For all the properties listed above, it has been possible to achieve...

  1. Are the impacts of land use on warming underestimated in climate policy?

    Science.gov (United States)

    Mahowald, Natalie M.; Ward, Daniel S.; Doney, Scott C.; Hess, Peter G.; Randerson, James T.

    2017-09-01

    While carbon dioxide emissions from energy use must be the primary target of climate change mitigation efforts, land use and land cover change (LULCC) also represent an important source of climate forcing. In this study we compute time series of global surface temperature change separately for LULCC and non-LULCC sources (primarily fossil fuel burning), and show that because of the extra warming associated with the co-emission of methane and nitrous oxide with LULCC carbon dioxide emissions, and a co-emission of cooling aerosols with non-LULCC emissions of carbon dioxide, the linear relationship between cumulative carbon dioxide emissions and temperature has a two-fold higher slope for LULCC than for non-LULCC activities. Moreover, projections used in the Intergovernmental Panel on Climate Change (IPCC) for the rate of tropical land conversion in the future are relatively low compared to contemporary observations, suggesting that the future projections of land conversion used in the IPCC may underestimate potential impacts of LULCC. By including a ‘business as usual’ future LULCC scenario for tropical deforestation, we find that even if all non-LULCC emissions are switched off in 2015, it is likely that 1.5 °C of warming relative to the preindustrial era will occur by 2100. Thus, policies to reduce LULCC emissions must remain a high priority if we are to achieve the low to medium temperature change targets proposed as a part of the Paris Agreement. Future studies using integrated assessment models and other climate simulations should include more realistic deforestation rates and the integration of policy that would reduce LULCC emissions.

  2. Exploration of the Drosophila buzzatii transposable element content suggests underestimation of repeats in Drosophila genomes.

    Science.gov (United States)

    Rius, Nuria; Guillén, Yolanda; Delprat, Alejandra; Kapusta, Aurélie; Feschotte, Cédric; Ruiz, Alfredo

    2016-05-10

    Many new Drosophila genomes have been sequenced in recent years using new-generation sequencing platforms and assembly methods. Transposable elements (TEs), being repetitive sequences, are often misassembled, especially in genomes sequenced with short reads. Consequently, the mobile fraction of many of the new genomes has not been analyzed in detail or compared with that of other genomes sequenced with different methods, which could shed light on the understanding of genome and TE evolution. Here we compare the TE content of three genomes: D. buzzatii st-1, j-19, and D. mojavensis. We have sequenced a new D. buzzatii genome (j-19) that complements the D. buzzatii reference genome (st-1) already published, and compared their TE contents with that of D. mojavensis. We found an underestimation of TE sequences in NGS genomes of the genus Drosophila when compared to Sanger genomes. To be able to compare genomes sequenced with different technologies, we developed a coverage-based method and applied it to the D. buzzatii st-1 and j-19 genomes. Between 10.85 and 11.16% of the D. buzzatii st-1 genome is made up of TEs, and between 7 and 7.5% of the D. buzzatii j-19 genome, while TEs represent 15.35% of the D. mojavensis genome. Helitrons are the most abundant order in the three genomes. TEs in D. buzzatii are less abundant than in D. mojavensis, as expected from the positive correlation between genome size and TE content. However, TEs alone do not explain the genome size difference. TEs accumulate in the dot chromosomes and proximal regions of D. buzzatii and D. mojavensis chromosomes. We also report a significantly higher TE density in the D. buzzatii and D. mojavensis X chromosomes, which is not expected under current models. Our easy-to-use correction method allowed us to identify recently active families in D. buzzatii st-1 belonging to the LTR-retrotransposon superfamily Gypsy.

  3. [Nurse-Led Care Models in the Context of Community Elders With Chronic Disease Management: A Systematic Review].

    Science.gov (United States)

    Hsieh, Pei-Lun; Chen, Ching-Min

    2016-08-01

    Longer average life expectancies have caused rapid growth in the elderly as a percentage of Taiwan's population and, as a result, in the number of elders with chronic diseases and disability. Providing continuing-care services in community settings for the elderly with multiple chronic conditions has become an urgent need. To review the nurse-led care models that are currently practiced among elders with chronic disease in the community, and to further examine the effectiveness and essential components of these models using a systematic review method. Twelve original articles on chronic disease-care planning for the elderly or on nurse-led care management interventions that were published between 2000 and 2015 in any of five electronic databases: MEDLINE, PubMed, CINAHL (Cumulative Index to Nursing and Allied Health Literature) Plus with Full Text, Cochrane Library, and CEPS (Chinese Electronic Periodicals Service) were selected and analyzed systematically. Four types of nurse-led community care models, including primary healthcare, secondary prevention care, cross-boundary models, and case management, were identified. Chronic disease-care planning, case management, and disease self-management were found to be the essential components of the services provided. The care models used systematic processes to conduct assessment, planning, implementation, coordination, and follow-up activities, as well as to deliver services and evaluate disease status. The results revealed that providing continuing-care services through the nurse-led community chronic disease-care model and the cross-boundary model enhanced the ability of the elderly to self-manage their chronic diseases, improved healthcare referrals, provided holistic care, and maximized resource utilization efficacy. The present study cross-referenced all reviewed articles in terms of target clients, content, intervention, measurements, and outcome indicators. Study results may be referenced in future

  4. Underestimating the effects of spatial heterogeneity due to individual movement and spatial scale: infectious disease as an example

    Science.gov (United States)

    Cross, Paul C.; Caillaud, Damien; Heisey, Dennis M.

    2013-01-01

    Many ecological and epidemiological studies occur in systems with mobile individuals and heterogeneous landscapes. Using a simulation model, we show that the accuracy of inferring an underlying biological process from observational data depends on movement and spatial scale of the analysis. As an example, we focused on estimating the relationship between host density and pathogen transmission. Observational data can result in highly biased inference about the underlying process when individuals move among sampling areas. Even without sampling error, the effect of host density on disease transmission is underestimated by approximately 50 % when one in ten hosts move among sampling areas per lifetime. Aggregating data across larger regions causes minimal bias when host movement is low, and results in less biased inference when movement rates are high. However, increasing data aggregation reduces the observed spatial variation, which would lead to the misperception that a spatially targeted control effort may not be very effective. In addition, averaging over the local heterogeneity will result in underestimating the importance of spatial covariates. Minimizing the bias due to movement is not just about choosing the best spatial scale for analysis, but also about reducing the error associated with using the sampling location as a proxy for an individual’s spatial history. This error associated with the exposure covariate can be reduced by choosing sampling regions with less movement, including longitudinal information of individuals’ movements, or reducing the window of exposure by using repeated sampling or younger individuals.
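    The attenuation mechanism described above (using the sampling location as a proxy for an individual's spatial history) can be illustrated with a toy simulation. All parameters below are hypothetical and this is not the authors' model: infection risk is driven by the true home-area density, but with some probability the host is recorded in a different area, which dilutes the estimated density-transmission slope.

```python
import random

random.seed(1)

TRUE_SLOPE = 1.0   # true effect of host density on infection probability
N = 20_000         # simulated hosts

def simulate(move_prob):
    """Estimate the density-transmission slope when a fraction of hosts
    is sampled in an area other than the one driving their exposure."""
    xs, ys = [], []
    for _ in range(N):
        home = random.uniform(0, 1)              # density where the host actually lives
        y = 1 if random.random() < TRUE_SLOPE * home else 0
        # movers are recorded with an unrelated sampling-area density
        observed = random.uniform(0, 1) if random.random() < move_prob else home
        xs.append(observed)
        ys.append(y)
    # least-squares slope of infection status on the *observed* density
    mx, my = sum(xs) / N, sum(ys) / N
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

for p in (0.0, 0.1, 0.5):
    print(f"movement prob {p:.1f}: estimated slope {simulate(p):.2f}")
```

With this setup the estimated slope shrinks by roughly the movement fraction, mirroring the ~50% underestimate the abstract reports when substantial movement is ignored.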

  5. Molecular structure based property modeling: Development/ improvement of property models through a systematic property-data-model analysis

    DEFF Research Database (Denmark)

    Hukkerikar, Amol Shivajirao; Sarup, Bent; Sin, Gürkan

    2013-01-01

    models. To make the property-data-model analysis fast and efficient, an approach based on the "molecular structure similarity criteria" to identify molecules (mono-functional, bi-functional, etc.) containing a specified set of structural parameters (that is, groups) is employed. The method has been applied to a wide range of properties of pure compounds. In this work, however, the application of the method is illustrated for the property modeling of normal melting point, enthalpy of fusion, enthalpy of formation, and critical temperature. For all the properties listed above, it has been possible to achieve…

  6. A Gaussian process framework for modelling instrumental systematics: application to transmission spectroscopy

    CERN Document Server

    Gibson, N P; Roberts, S; Evans, T M; Osborne, M; Pont, F

    2011-01-01

    Transmission spectroscopy, which consists of measuring the wavelength-dependent absorption of starlight by a planet's atmosphere during a transit, is a powerful probe of atmospheric composition. However, the expected signal is typically orders of magnitude smaller than instrumental systematics, and the results are crucially dependent on the treatment of the latter. In this paper, we propose a new method to infer transit parameters in the presence of systematic noise using Gaussian processes, a technique widely used in the machine learning community for Bayesian regression and classification problems. Our method makes use of auxiliary information about the state of the instrument, but does so in a non-parametric manner, without imposing a specific dependence of the systematics on the instrumental parameters, and naturally allows for the correlated nature of the noise. We give an example application of the method to archival NICMOS transmission spectroscopy of the hot Jupiter HD 189733, which goes some way towa...
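    As a generic illustration of the Gaussian-process regression technique the paper applies (not the authors' NICMOS model), the sketch below conditions a GP with a squared-exponential kernel on noisy samples of a smooth trend, standing in for flux systematics as a function of instrument state, and recovers the underlying function via the standard predictive-mean formula. Kernel hyperparameters are assumed, not fitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def sq_exp(x1, x2, amp=1.0, scale=0.3):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return amp**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / scale**2)

# noisy observations of a smooth "systematic" trend
x_obs = np.linspace(0, 1, 40)
y_obs = np.sin(3 * x_obs) + 0.05 * rng.standard_normal(40)

x_new = np.linspace(0, 1, 100)
K = sq_exp(x_obs, x_obs) + 0.05**2 * np.eye(40)   # add observation noise on the diagonal
K_s = sq_exp(x_new, x_obs)

# GP predictive mean: K_* K^{-1} y (standard conditioning formula)
mean = K_s @ np.linalg.solve(K, y_obs)
print("max |error| of GP mean:", np.abs(mean - np.sin(3 * x_new)).max())
```

The point of the non-parametric approach is visible here: no functional form for the trend is imposed, only a smoothness assumption encoded in the kernel.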

  7. Energy intake of Swedish overweight and obese children is underestimated using a diet history interview.

    Science.gov (United States)

    Waling, Maria U; Larsson, Christel L

    2009-03-01

    Estimating the energy intake (EI) of a child using a diet history interview (DHI) method can be challenging because children have difficulty remembering what they have eaten and reporting portion sizes. The aim of this research was to validate reported EI from a DHI in children classified as overweight or obese by comparing the reported EI to total energy expenditure (TEE) measured by 2 objective methods. Eighty-five overweight and obese children aged 10.5 ± 1.1 y, with help from 1 or 2 parents, reported their EI 2 wk retrospectively in a DHI. Reported EI was compared with TEE, as measured by the SenseWear armband (n = 85) and the doubly labeled water (DLW) method (n = 21), during the same period as the DHI. Reported EI was underestimated by 14% when validated against both the armband and DLW method. Underestimation did not differ between boys and girls. However, the EI of obese children was underestimated by 22%, twice the rate for the overweight children (95% CI: 0.55, 3.08). Underestimated EI was negatively correlated with BMI (r = -0.38); underestimation was greater at higher BMI and higher age when using a DHI method. The findings show the importance of validating the dietary intake of children in general, and of overweight and obese children in particular.

  8. Preferred child body size and parental underestimation of child weight in Mexican-American families

    Science.gov (United States)

    Objective: To determine whether parents who prefer a heavier child would underestimate their child's weight more than those who prefer a leaner child. Methods: Participants were Mexican-American families (312 mothers, 173 fathers, and 312 children ages 8-10) who were interviewed and had height and w...

  9. Ethnic differences in maternal underestimation of offspring's weight: the ABCD study.

    Science.gov (United States)

    de Hoog, M L A; Stronks, K; van Eijsden, M; Gemke, R J B J; Vrijkotte, T G M

    2012-01-01

    To determine the ethnic variation in maternal underestimation of their child's weight status and the explanatory role of socio-economic status (SES), acculturation and parental body mass index (BMI). A multi-ethnic sample of 2769 normal or overweight/obese children (underweight children excluded) aged 5-7 years was examined (The Amsterdam Born Children and their Development study), comprising five ethnic subgroups: Dutch (n=1744), African descent (n=184), Turkish (n=86), Moroccan (n=161) and other non-Dutch (n=592). Data on mothers' perception of their child's weight status (5-point scale from 'too low' to 'too high'), SES, acculturation, parental BMI and the children's height and weight were collected. Underestimation was defined by comparing maternal perception with the actual weight status of her child (International Obesity Task Force guidelines). Ethnic differences in underestimation were calculated in the normal weight and overweight/obese categories. Underestimation ranged from 3.6 (Dutch) to 15.7% (Moroccan) in normal-weight children, and from 73.0 (Dutch) to 92.3% (Turkish) in overweight/obese children. After correction for ethnic differences in child's BMI, higher odds ratios (ORs) for underestimation were found in the Turkish (normal weight: OR 6.83; 95% confidence interval (CI) 2.33-20.05 and overweight: OR 2.80; 95% CI 1.12-6.98) and Moroccan (normal weight: OR 11.55; 95% CI 5.28-25.26) groups (reference is the Dutch group). Maternal educational level and immigrant generation largely explained the ethnic differences, with a minor contribution of maternal age. After correction, ORs remained higher in the Moroccan group (OR 4.37; 95% CI 1.79-10.62) among the normal-weight children. Mothers frequently underestimate the actual weight status of their child, especially mothers from Turkish or Moroccan origin. Having a lower SES, being a first-generation immigrant and a young mother are important determinants in explaining these differences. As weight perceptions
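    Odds ratios with 95% confidence intervals of the kind quoted above follow the standard 2x2-table calculation with a log-scale normal approximation. The counts below are invented for illustration and are not the ABCD study's data:

```python
import math

# Hypothetical 2x2 table: mothers underestimating (yes/no) in an ethnic
# group of interest vs. a reference group.
a, b = 12, 74      # group of interest: underestimate / accurate
c, d = 63, 1681    # reference group:   underestimate / accurate

odds_ratio = (a * d) / (b * c)
# Woolf's method: standard error of log(OR)
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

An interval excluding 1 (as here) corresponds to a statistically significant difference in underestimation between the groups.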

  10. Prediction Models and Their External Validation Studies for Mortality of Patients with Acute Kidney Injury: A Systematic Review

    Science.gov (United States)

    Ohnuma, Tetsu; Uchino, Shigehiko

    2017-01-01

    Objectives To systematically review AKI outcome prediction models and their external validation studies, to describe the discrepancy of reported accuracy between the results of internal and external validations, and to identify variables frequently included in the prediction models. Methods We searched the MEDLINE and Web of Science electronic databases (until January 2016). Studies were eligible if they derived a model to predict mortality of AKI patients or externally validated at least one of the prediction models, and presented area under the receiver-operator characteristic curves (AUROC) to assess model discrimination. Studies were excluded if they described only results of logistic regression without reporting a scoring system, or if a prediction model was generated from a specific cohort. Results A total of 2204 potentially relevant articles were found and screened, of which 12 articles reporting original prediction models for hospital mortality in AKI patients and nine articles assessing external validation were selected. Among the 21 studies for AKI prediction models and their external validation, 12 were single-center (57%), and only three included more than 1,000 patients (14%). The definition of AKI was not uniform and none used recently published consensus criteria for AKI. Although good performance was reported in their internal validation, most of the prediction models had poor discrimination with an AUROC below 0.7 in the external validation studies. There were 10 common non-renal variables that were reported in more than three prediction models: mechanical ventilation, age, gender, hypotension, liver failure, oliguria, sepsis/septic shock, low albumin, consciousness and low platelet count. Conclusions Information in this systematic review should be useful for future prediction model derivation by providing potential candidate predictors, and for future external validation by listing the published prediction models. PMID:28056039
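    The AUROC used to assess discrimination in these studies equals the probability that a randomly chosen non-survivor receives a higher risk score than a randomly chosen survivor (the Mann-Whitney formulation). A minimal sketch with hypothetical scores and outcomes:

```python
def auroc(scores, labels):
    """AUROC via pairwise comparison: fraction of (positive, negative)
    pairs where the positive case outranks the negative (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical severity scores; label 1 = died, 0 = survived
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   0,   1,   0]
print(f"AUROC = {auroc(scores, labels):.2f}")   # 0.75 for this toy data
```

A value of 0.5 is chance-level ranking; the review's threshold of 0.7 is a common cut-off for acceptable discrimination.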

  11. Learning with Geoinformation in German Schools: Systematic Integration with a GIS Competency Model

    Science.gov (United States)

    Schubert, Jan Christoph; Uphues, Rainer

    2009-01-01

    While the application of geoinformation (GI) in German schools is becoming increasingly important, a systematic integration in school curricula, arranged in a cumulative and competence-oriented manner, is still lacking. Here the authors discuss existing approaches to the problem and propose a learning strategy based on the stepwise gain of GI…

  12. Systematic screening for Chlamydia trachomatis : Estimating cost-effectiveness using dynamic modeling and Dutch data

    NARCIS (Netherlands)

    de Vries, R.; Van Bergen, J.E.A.M.; de Jong-van den Berg, Lolkje; Postma, Maarten

    2006-01-01

    To estimate the cost-effectiveness of a systematic one-off Chlamydia trachomatis (CT) screening program including partner treatment for Dutch young adults. Data on infection prevalence, participation rates, and sexual behavior were obtained from a large pilot study conducted in The Netherlands. Oppo

  13. Systematization of accuracy indices variance when modelling the forming external cylindrical turning process

    Science.gov (United States)

    Balabanov, I. P.; Simonova, L. A.; Balabanova, O. N.

    2015-06-01

    The article considers the problem of systematizing accuracy deviations in external cylindrical turning. A hierarchical approach to evaluating these deviations is proposed, together with an approach to analyzing the nesting of accuracy metrics and a general scheme for identifying deviations in accuracy metrics for batches of billets in external machining.

  14. Using Multimodal Learning Analytics to Model Student Behaviour: A Systematic Analysis of Behavioural Framing

    Science.gov (United States)

    Andrade, Alejandro; Delandshere, Ginette; Danish, Joshua A.

    2016-01-01

    One of the challenges many learning scientists face is the laborious task of coding large amounts of video data and consistently identifying social actions, which is time consuming and difficult to accomplish in a systematic and consistent manner. It is easier to catalog observable behaviours (e.g., body motions or gaze) without explicitly…

  15. More Use of Peritoneal Dialysis Gives Significant Savings: A Systematic Review and Health Economic Decision Model

    Science.gov (United States)

    Pike, Eva; Hamidi, Vida; Ringerike, Tove; Wisloff, Torbjorn; Klemp, Marianne

    2017-01-01

    Background Patients with end-stage renal disease (ESRD) are in need of renal replacement therapy as dialysis and/or transplantation. The prevalence of ESRD and, thus, the need for dialysis are constantly growing. The dialysis modalities are either peritoneal dialysis, performed at home, or hemodialysis (HD), performed in-center (hospital or satellite) or at home. We examined the effectiveness and cost-effectiveness of HD performed at different locations (hospital, satellite, and home) and peritoneal dialysis (PD) at home in the Norwegian setting. Methods We conducted a systematic review in several databases for patients above 18 years with end-stage renal failure requiring dialysis and performed several meta-analyses of the existing literature. Mortality and major complications were our main clinical outcomes. The quality of the evidence for each outcome was evaluated using GRADE. Cost-effectiveness was assessed by developing a probabilistic Markov model. The analysis was carried out from a societal perspective, and effects were expressed in quality-adjusted life-years. Uncertainties in the base-case parameter values were explored with a probabilistic sensitivity analysis. Scenario analyses were conducted by increasing the proportion of patients receiving PD, with a corresponding reduction in HD patients in-center, both for Norway and the European Union. We assumed an annual growth rate of 4% in the number of dialysis patients, and a relative distribution between PD and HD in-center of 30% and 70%, respectively. Results From a societal perspective and over a 5-year time horizon, PD was the most cost-effective dialysis alternative. We found no significant difference in mortality between peritoneal and HD modalities. Our scenario analyses showed that a shift toward more patients on PD (as a first choice) with a corresponding reduction in HD in-center gave a saving over a 5-year period of 32 and 10,623 million EURO for Norway and the European Union, respectively. Conclusions PD was
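    A Markov cohort model of the kind used here tracks the distribution of a patient cohort over health states year by year, accumulating costs and quality-adjusted life-years. The deterministic sketch below (no probabilistic sensitivity analysis) uses three states, PD, HD, and dead; all transition probabilities, costs, and utilities are invented placeholders, not the study's inputs:

```python
import numpy as np

# Annual transition matrix; each row must sum to 1 (dead is absorbing).
P = np.array([[0.85, 0.10, 0.05],   # from PD:  stay, switch to HD, die
              [0.05, 0.88, 0.07],   # from HD:  switch to PD, stay, die
              [0.00, 0.00, 1.00]])  # dead
cost = np.array([40_000.0, 70_000.0, 0.0])   # placeholder annual cost per state (EUR)
utility = np.array([0.70, 0.65, 0.0])        # placeholder QALY weight per state

state = np.array([1.0, 0.0, 0.0])            # cohort starts on PD
total_cost = total_qaly = 0.0
for year in range(5):                        # 5-year horizon, no discounting
    total_cost += state @ cost               # expected cost this cycle
    total_qaly += state @ utility            # expected QALYs this cycle
    state = state @ P                        # advance the cohort one year
print(f"5-y cost per QALY: {total_cost / total_qaly:,.0f} EUR")
```

A probabilistic version would draw the entries of `P`, `cost`, and `utility` from distributions and repeat this loop many times, which is what the study's sensitivity analysis does.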

  16. Systematics of the Exclusive Meson Production in the Proton-Proton System in Relativistic Quark-Models

    CERN Document Server

    Dillig, M

    2002-01-01

    We investigate the exclusive production of the pseudoscalar mesons $\\pi ^{0}, \\eta, \\eta^{\\prime}, K^{+}$ and of the vector mesons $\\omega, \\phi$ in a microscopic gluon-exchange or instanton model. We describe the baryons as covariant quark - scalar diquark systems with harmonic confinement, thus taking into account center-of-mass corrections and Lorentz contraction in different frames. The excitation of intermediate baryon resonances is accounted by colorless 2-gluon (soft Pomeron) exchange. We find that our model accounts for the systematics of the high precision data on exclusive meson production from various modern proton factories.

  17. Towards a Conceptual Framework of Sustainable Business Model Innovation in the Agri-Food Sector: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Henrik Barth

    2017-09-01

    This paper aims to increase our understanding of sustainable business model innovation in the agri-food sector in terms of its theoretical and practical approaches for sustainability and their degree of complexity and maturity. The paper is based on a systematic literature review of 570 journal articles on business models and business model innovation published between 1990 and 2014. Of these articles, only 21 have business model innovation as their main focus. The review shows that research interest in the agri-food sector has increased in these years. The paper proposes a conceptual framework for sustainable business model innovation in the agri-food sector that can be used to meet the challenges encountered in taking a sustainability perspective.

  18. Evaluation of Simulation Models that Estimate the Effect of Dietary Strategies on Nutritional Intake: A Systematic Review.

    Science.gov (United States)

    Grieger, Jessica A; Johnson, Brittany J; Wycherley, Thomas P; Golley, Rebecca K

    2017-05-01

    Background: Dietary simulation modeling can predict dietary strategies that may improve nutritional or health outcomes.Objectives: The study aims were to undertake a systematic review of simulation studies that model dietary strategies aiming to improve nutritional intake, body weight, and related chronic disease, and to assess the methodologic and reporting quality of these models.Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses guided the search strategy with studies located through electronic searches [Cochrane Library, Ovid (MEDLINE and Embase), EBSCOhost (CINAHL), and Scopus]. Study findings were described and dietary modeling methodology and reporting quality were critiqued by using a set of quality criteria adapted for dietary modeling from general modeling guidelines.Results: Forty-five studies were included and categorized as modeling moderation, substitution, reformulation, or promotion dietary strategies. Moderation and reformulation strategies targeted individual nutrients or foods to theoretically improve one particular nutrient or health outcome, estimating small to modest improvements. Substituting unhealthy foods with healthier choices was estimated to be effective across a range of nutrients, including an estimated reduction in intake of saturated fatty acids, sodium, and added sugar. Promotion of fruits and vegetables predicted marginal changes in intake. Overall, the quality of the studies was moderate to high, with certain features of the quality criteria consistently reported.Conclusions: Based on the results of reviewed simulation dietary modeling studies, targeting a variety of foods rather than individual foods or nutrients theoretically appears most effective in estimating improvements in nutritional intake, particularly reducing intake of nutrients commonly consumed in excess. A combination of strategies could theoretically be used to deliver the best improvement in outcomes. Study quality was moderate to

  19. Patient neglect in healthcare institutions: a systematic review and conceptual model

    OpenAIRE

    Reader, Tom W; Gillespie, Alex

    2013-01-01

    Background\\ud Patient neglect is an issue of increasing public concern in Europe and North America, yet remains poorly understood. This is the first systematic review on the nature, frequency and causes of patient neglect as distinct from patient safety topics such as medical error.\\ud \\ud Method\\ud The Pubmed, Science Direct, and Medline databases were searched in order to identify research studies investigating patient neglect. Ten articles and four government reports met the inclusion crit...

  20. Systematic U(1)_{B-L} Extensions of Loop-Induced Neutrino Mass Models with Dark Matter

    CERN Document Server

    Ho, Shu-Yu; Tsumura, Koji

    2016-01-01

    We study the gauged U(1)_{B-L} extensions of the models for neutrino masses and dark matter. In this class of models, tiny neutrino masses are radiatively induced through the loop diagrams, while the origin of the dark matter stability is guaranteed by the remnant of the gauge symmetry. Depending on how the lepton number is violated in the neutrino mass diagrams, these models are systematically classified. We present a complete list for the one-loop Z_2 and the two-loop Z_3 neutrino mass models as examples of the classification. These underlying gauge symmetries and their breaking patterns can be probed at future high energy colliders by looking at the width of the new gauge boson.

  1. Adults in all body mass index categories underestimate daily energy requirements.

    Science.gov (United States)

    Headrick, Lauren B; Rowe, Cassie C; Kendall, Ashley R; Zitt, Michelle A; Bolton, Dawn L; Langkamp-Henken, Bobbi

    2013-01-01

    To compare the difference between self-reported and calculated daily energy requirements of adults within different body mass index (BMI) categories. Adults (n = 978) self-reported daily energy requirements, demographic information, and height, weight, age, and physical activity level (PAL) to calculate total energy expenditure. The main effects of BMI, gender, PAL, and dieting status on the difference between self-reported and calculated energy requirements for weight maintenance were significant. Participants in all BMI categories underestimated their daily energy requirements, but obese individuals underestimated to the greatest degree. Males, current dieters, and those who reported a low-active or active PAL underestimated to the greatest extent in each category. There is a lack of basic nutrition knowledge about personal energy needs in individuals across all BMI categories regardless of age, race/ethnicity, level of education, or work/training in a health-related field. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  2. Attaining Performance with Building Information Modelling: A systematic literature review of product and process modelling in AEC

    NARCIS (Netherlands)

    Papadonikolaki, E.; Koutamanis, A.; Wamelink, J.W.F.

    2013-01-01

    The paper presents the findings of a systematic literature review of approximately 200 scientific sources. It is designed with the aim to identify the current benefits and factors of high performance in Architecture, Engineering, Construction (AEC) since the introduction of Building Information Modelling.

  3. Latent and sensible heat fluxes overestimated and net heat flux underestimated in Lake Victoria

    CERN Document Server

    Verburg, Piet

    2014-01-01

    Cozar et al. (2012) used remotely-sensed data to link phytoplankton growth to the net heat flux in both the northern and southern parts of Lake Victoria. However, the latent and sensible heat fluxes were overestimated by ~26% by assuming a constant air density of 1.3 kg m-3. As a result, the net heat flux was underestimated, bringing into question conclusions regarding the convective circulation.
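    Because bulk latent and sensible heat fluxes scale linearly with air density, a too-high constant density inflates both fluxes by the same factor. The ~26% figure follows from the ideal gas law at Lake Victoria's altitude (roughly 1,100 m); the sketch below uses assumed round values (dry air, 25 °C, surface pressure ~88 kPa at the lake), so treat the inputs as illustrative:

```python
R_DRY = 287.05          # J kg^-1 K^-1, specific gas constant of dry air
p = 88_000.0            # Pa, approximate surface pressure at ~1,100 m elevation
T = 273.15 + 25.0       # K, warm tropical air

rho = p / (R_DRY * T)   # dry-air ideal-gas approximation (moist air is slightly less dense)
print(f"air density at the lake: {rho:.3f} kg m-3")
print(f"flux overestimate from assuming 1.3 kg m-3: {100 * (1.3 / rho - 1):.0f}%")
```

The constant 1.3 kg m-3 is close to sea-level density at ~0 °C, far from conditions over a warm equatorial lake at altitude, which is why the resulting fluxes were inflated by about a quarter.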

  4. Underestimated Rate of Status Epilepticus according to the Traditional Definition of Status Epilepticus

    Directory of Open Access Journals (Sweden)

    Cheung-Ter Ong

    2015-01-01

    Purpose. Status epilepticus (SE) is an important neurological emergency. Early diagnosis could improve outcomes. Traditionally, SE is defined as seizures lasting at least 30 min or repeated seizures over 30 min without recovery of consciousness. Some specialists have argued that the duration of seizures qualifying as SE should be shorter, and an operational definition of SE has been suggested. It is unclear whether physicians follow the operational definition. The objective of this study was to investigate whether the incidence of SE is underestimated and to estimate the underestimation rate. Methods. This retrospective study evaluates the difference in the diagnosis of SE between the operational and traditional definitions of status epilepticus. Between July 1, 2012, and June 30, 2014, patients discharged with ICD-9 codes for epilepsy (345.X) from Chia-Yi Christian Hospital were included in the study. A seizure lasting at least 30 min, or repeated seizures over 30 min without recovery of consciousness, was considered SE according to the traditional definition of SE (TDSE). A seizure lasting between 5 and 30 min was considered SE according to the operational definition of SE (ODSE); this was defined as underestimated status epilepticus (UESE). Results. During the 2-year period, there were 256 episodes of seizures requiring hospital admission. Among these, 99 episodes lasted longer than 5 min, of which 61 (61.6%) persisted over 30 min (TDSE) and 38 (38.4%) lasted between 5 and 30 min (UESE). Of the 38 episodes of seizure lasting 5 to 30 minutes, only one had previously been discharged as SE (ICD-9-CM 345.3). Conclusion. SE was underestimated in 37.4% of episodes. Continuing education regarding the diagnosis and treatment of epilepsy is important for physicians.
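    The two definitions compared in this study amount to a simple duration threshold, which can be made explicit in a short sketch (function name and labels are illustrative, not from the study):

```python
def classify_seizure(duration_min):
    """Classify a seizure episode under the traditional (>= 30 min) and
    operational (>= 5 min) definitions of status epilepticus (SE)."""
    if duration_min >= 30:
        return "SE (traditional and operational)"
    if duration_min >= 5:
        return "SE (operational only)"   # the episodes missed under the traditional definition
    return "not SE"

for d in (2, 12, 45):
    print(f"{d:>2} min -> {classify_seizure(d)}")
```

Episodes falling in the 5-30 min band are exactly the UESE group that the traditional definition fails to count.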

  5. Auditory verbal hallucinations and continuum models of psychosis: A systematic review of the healthy voice-hearer literature.

    Science.gov (United States)

    Baumeister, David; Sedgwick, Ottilie; Howes, Oliver; Peters, Emmanuelle

    2017-02-01

    Recent decades have seen a surge of research interest in the phenomenon of healthy individuals who experience auditory verbal hallucinations, yet do not exhibit distress or need for care. The aims of the present systematic review are to provide a comprehensive overview of this research and examine how healthy voice-hearers may best be conceptualised in relation to the diagnostic versus 'quasi-' and 'fully-dimensional' continuum models of psychosis. A systematic literature search was conducted, resulting in a total of 398 article titles and abstracts that were scrutinised for appropriateness to the present objective. Seventy articles were identified for full-text analysis, of which 36 met criteria for inclusion. Subjective perceptual experience of voices, such as loudness or location (i.e., inside/outside head), is similar in clinical and non-clinical groups, although clinical voice-hearers have more frequent voices, more negative voice content, and an older age of onset. Groups differ significantly in beliefs about voices, control over voices, voice-related distress, and affective difficulties. Cognitive biases, reduced global functioning, and psychiatric symptoms such as delusions appear more prevalent in healthy voice-hearers than in healthy controls, yet less than in clinical samples. Transition to mental health difficulties is increased in healthy voice-hearers, yet occurs only in a minority and is predicted by previous mood problems and voice distress. Whilst healthy voice-hearers show similar brain activity during hallucinatory experiences to clinical voice-hearers, other neuroimaging measures, such as mismatch negativity, have been inconclusive. Risk factors such as familial and childhood trauma appear similar between clinical and non-clinical voice-hearers. Overall the results of the present systematic review support a continuum view rather than a diagnostic model, but cannot distinguish between 'quasi' and 'fully' dimensional models. Healthy voice-hearers may be a key

  6. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews.

    Science.gov (United States)

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to 'think' conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured only in a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, which were used almost exclusively as pictorial depictions of how the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Logic models have the potential to be an integral aid throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles in the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions.

  7. Immunosuppressive therapy for kidney transplantation in adults: a systematic review and economic model.

    Science.gov (United States)

    Jones-Hughes, Tracey; Snowsill, Tristan; Haasova, Marcela; Coelho, Helen; Crathorne, Louise; Cooper, Chris; Mujica-Mota, Ruben; Peters, Jaime; Varley-Campbell, Jo; Huxley, Nicola; Moore, Jason; Allwood, Matt; Lowe, Jenny; Hyde, Chris; Hoyle, Martin; Bond, Mary; Anderson, Rob

    2016-08-01

    End-stage renal disease is a long-term irreversible decline in kidney function requiring renal replacement therapy: kidney transplantation, haemodialysis or peritoneal dialysis. The preferred option is kidney transplantation, followed by immunosuppressive therapy (induction and maintenance therapy) to reduce the risk of kidney rejection and prolong graft survival. To review and update the evidence for the clinical effectiveness and cost-effectiveness of basiliximab (BAS) (Simulect(®), Novartis Pharmaceuticals UK Ltd) and rabbit anti-human thymocyte immunoglobulin (rATG) (Thymoglobulin(®), Sanofi) as induction therapy, and immediate-release tacrolimus (TAC) (Adoport(®), Sandoz; Capexion(®), Mylan; Modigraf(®), Astellas Pharma; Perixis(®), Accord Healthcare; Prograf(®), Astellas Pharma; Tacni(®), Teva; Vivadex(®), Dexcel Pharma), prolonged-release tacrolimus (Advagraf(®) Astellas Pharma), belatacept (BEL) (Nulojix(®), Bristol-Myers Squibb), mycophenolate mofetil (MMF) (Arzip(®), Zentiva; CellCept(®), Roche Products; Myfenax(®), Teva), mycophenolate sodium (MPS) (Myfortic(®), Novartis Pharmaceuticals UK Ltd), sirolimus (SRL) (Rapamune(®), Pfizer) and everolimus (EVL) (Certican(®), Novartis) as maintenance therapy in adult renal transplantation. Clinical effectiveness searches were conducted until 18 November 2014 in MEDLINE (via Ovid), EMBASE (via Ovid), Cochrane Central Register of Controlled Trials (via Wiley Online Library) and Web of Science (via ISI), Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects and Health Technology Assessment (The Cochrane Library via Wiley Online Library) and Health Management Information Consortium (via Ovid). Cost-effectiveness searches were conducted until 18 November 2014 using a costs or economic literature search filter in MEDLINE (via Ovid), EMBASE (via Ovid), NHS Economic Evaluation Database (via Wiley Online Library), Web of Science (via ISI), Health Economic Evaluations

  8. Immunosuppressive therapy for kidney transplantation in adults: a systematic review and economic model.

    Science.gov (United States)

    Jones-Hughes, Tracey; Snowsill, Tristan; Haasova, Marcela; Coelho, Helen; Crathorne, Louise; Cooper, Chris; Mujica-Mota, Ruben; Peters, Jaime; Varley-Campbell, Jo; Huxley, Nicola; Moore, Jason; Allwood, Matt; Lowe, Jenny; Hyde, Chris; Hoyle, Martin; Bond, Mary; Anderson, Rob

    2016-01-01

    BACKGROUND End-stage renal disease is a long-term irreversible decline in kidney function requiring renal replacement therapy: kidney transplantation, haemodialysis or peritoneal dialysis. The preferred option is kidney transplantation, followed by immunosuppressive therapy (induction and maintenance therapy) to reduce the risk of kidney rejection and prolong graft survival. OBJECTIVES To review and update the evidence for the clinical effectiveness and cost-effectiveness of basiliximab (BAS) (Simulect(®), Novartis Pharmaceuticals UK Ltd) and rabbit anti-human thymocyte immunoglobulin (rATG) (Thymoglobulin(®), Sanofi) as induction therapy, and immediate-release tacrolimus (TAC) (Adoport(®), Sandoz; Capexion(®), Mylan; Modigraf(®), Astellas Pharma; Perixis(®), Accord Healthcare; Prograf(®), Astellas Pharma; Tacni(®), Teva; Vivadex(®), Dexcel Pharma), prolonged-release tacrolimus (Advagraf(®) Astellas Pharma), belatacept (BEL) (Nulojix(®), Bristol-Myers Squibb), mycophenolate mofetil (MMF) (Arzip(®), Zentiva; CellCept(®), Roche Products; Myfenax(®), Teva), mycophenolate sodium (MPS) (Myfortic(®), Novartis Pharmaceuticals UK Ltd), sirolimus (SRL) (Rapamune(®), Pfizer) and everolimus (EVL) (Certican(®), Novartis) as maintenance therapy in adult renal transplantation. METHODS Clinical effectiveness searches were conducted until 18 November 2014 in MEDLINE (via Ovid), EMBASE (via Ovid), Cochrane Central Register of Controlled Trials (via Wiley Online Library) and Web of Science (via ISI), Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects and Health Technology Assessment (The Cochrane Library via Wiley Online Library) and Health Management Information Consortium (via Ovid). Cost-effectiveness searches were conducted until 18 November 2014 using a costs or economic literature search filter in MEDLINE (via Ovid), EMBASE (via Ovid), NHS Economic Evaluation Database (via Wiley Online Library), Web of Science (via ISI

  9. Application of analytic hierarchy process-grey target theory systematic model in comprehensive evaluation of water environmental quality.

    Science.gov (United States)

    Wu, Jun; Tian, Xiaogang; Tang, Ya; Zhao, Yujie; Hu, Yandi; Fang, Zili

    2010-07-01

    Comprehensive evaluation of the water environment for effective water quality management is complicated by a considerable number of factors and uncertainties. It is difficult to combine micro-evaluation with the macro-evaluation process. To effectively eliminate the subjective errors of the traditional analytic hierarchy process (AHP), a new modeling approach--the analytic hierarchy process and grey target theory (AHP-GTT) systematic model--is presented in this study to evaluate water quality in a watershed. A case study applying the AHP-GTT systematic model to the evaluation and analysis of the water environment was conducted in the Yibin section of the Yangtze River, China. The micro-evaluation is based on defining the weights of the indices of water quality (IWQ) of each water cross-section, while the macro-evaluation is based on calculating the comprehensive indices of water environmental quality and analyzing the tendency of the water environment of each cross-section. The results indicated that the Baixi and Shuidongmen sections are seriously polluted areas, with a tendency to worsen. The key IWQs of these two cross-sections are 5-day biochemical oxygen demand and permanganate chemical oxygen demand, respectively.
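The AHP step in the record above derives index weights from pairwise comparisons. A minimal sketch of that standard computation (the comparison matrix below is invented for illustration, not taken from the paper): the weight vector is the principal eigenvector of the pairwise comparison matrix, and Saaty's consistency ratio checks whether the judgments are acceptably coherent.

```python
import numpy as np

# Toy 3x3 pairwise comparison matrix for three water-quality indices
# (illustrative values; a reciprocal matrix as required by AHP).
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
])

# AHP weights: normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = 0.58                      # Saaty's random index for n = 3
CR = CI / RI                   # CR < 0.1 is conventionally acceptable
print(w, CR)
```

The same eigenvector step scales to larger index hierarchies; only the random index RI changes with matrix size.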

  10. Sap flow is Underestimated by Thermal Dissipation Sensors due to Alterations of Wood Anatomy

    Science.gov (United States)

    Marañón-Jiménez, S.; Wiedemann, A.; van den Bulcke, J.; Cuntz, M.; Rebmann, C.; Steppe, K.

    2014-12-01

    The thermal dissipation technique (TD) is one of the most commonly adopted methods for sap flow measurements. However, underestimations of up to 60% of tree transpiration have been reported with this technique, and the causes are not known with certainty. The insertion of TD sensors into the stem damages the wood tissue and triggers healing reactions that change wood anatomy and likely the sap flow path. However, the anatomical changes in response to the insertion of sap flow sensors, and their effects on the measured flow, have not yet been assessed. In this study, we investigate the alteration of vessel anatomy in the wound tissue formed around TD sensors. Our main objectives were to elucidate the anatomical causes of sap flow underestimation in ring-porous and diffuse-porous species and to relate these anatomical changes to the underestimation of sap flow. Successive sets of TD probes were installed early, mid and late in the growing season in Fagus sylvatica (diffuse-porous) and Quercus petraea (ring-porous) trees. The trees were logged after the growing season, and additional sets of sensors were installed in the logged stems, in which presumably no healing reaction occurs. The wood tissue surrounding each sensor was then excised and analysed by X-ray computed microtomography (X-ray micro CT). This technique allowed the quantification of vessel anatomical characteristics and the reconstruction of the 3-D internal microstructure of the xylem vessels, so that the extension and shape of the altered area could be determined. Gels and tyloses clogged the conductive vessels around the sensors in both beech and oak. The affected area was larger in beech, although the anatomical changes led to similar sap flow underestimations in both species. The larger vessel size in oak may explain this result and, therefore, the larger sap flow underestimation per area of affected conductive tissue. The wound healing reaction likely occurred within the first weeks after sensor installation, which
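TD sensors convert a measured temperature difference into sap flux density. The abstract does not state the formula, so the sketch below assumes the widely used Granier calibration: K = (dT_max - dT) / dT and Fd = 119e-6 * K^1.231 (m3 m-2 s-1). It illustrates the bias mechanism described above: wound tissue that pushes the measured dT toward dT_max drives the estimated flux toward zero.

```python
# Granier-style calibration for thermal dissipation probes (an assumed,
# commonly cited form; not taken from this abstract).
def sap_flux_density(dT, dT_max):
    """dT: measured probe temperature difference; dT_max: value at zero flow."""
    K = (dT_max - dT) / dT
    return 119e-6 * K ** 1.231   # m3 m-2 s-1

# Gels and tyloses around the probe reduce the thermal contrast the sensor
# sees, so dT creeps toward dT_max and the estimated flux is biased low.
print(sap_flux_density(8.0, 10.0))    # some flow
print(sap_flux_density(10.0, 10.0))   # zero flow -> 0.0
```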

  11. Core-crust transition properties of neutron stars within systematically varied extended relativistic mean-field model

    CERN Document Server

    Sulaksono, A; Agrawal, B K

    2014-01-01

    The model dependence and the symmetry energy dependence of the core-crust transition properties of neutron stars are studied using three different families of systematically varied extended relativistic mean-field models. Several forces within each family are considered, chosen to yield wide variations in the values of the nuclear symmetry energy $a_{\\rm sym}$ and its slope parameter $L$ at the saturation density. The core-crust transition density is calculated using a method based on the random-phase approximation. The core-crust transition density is strongly correlated, in a model-independent manner, with the symmetry energy slope parameter evaluated at the saturation density. The pressure at the transition point does not show any meaningful correlation with the symmetry energy parameters at the saturation density. At best, the pressure at the transition point is correlated with the symmetry energy parameters and their linear combinations evaluated at some sub-saturation density. Yet, such corre...
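The "strong correlation across models" claim is, operationally, a correlation coefficient computed over one (L, transition density) pair per parametrization. A minimal sketch with invented values that mimic the anticorrelation such studies report:

```python
import numpy as np

# Illustrative (invented) values for a family of parametrizations:
# symmetry energy slope L (MeV) vs core-crust transition density rho_t (fm^-3).
L_slope = np.array([40.0, 50.0, 60.0, 70.0, 80.0, 90.0, 100.0, 110.0])
rho_t = np.array([0.090, 0.086, 0.081, 0.077, 0.072, 0.069, 0.064, 0.060])

# Pearson correlation over the model family.
r = np.corrcoef(L_slope, rho_t)[0, 1]
print(round(r, 3))   # close to -1: strong anticorrelation
```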

  12. redGEM: Systematic reduction and analysis of genome-scale metabolic reconstructions for development of consistent core metabolic models.

    Science.gov (United States)

    Ataman, Meric; Hernandez Gardiol, Daniel F; Fengos, Georgios; Hatzimanikatis, Vassily

    2017-07-01

    Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks as they encapsulate all known metabolic capabilities of the organisms from genes to proteins to their functions. However, the complexity of these large metabolic networks often hinders their utility in various practical applications. Although reduced models are commonly used for modeling and in integrating experimental data, they are often inconsistent across different studies and laboratories due to differing criteria and levels of detail, which can compromise transferability of the findings and also integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to be able to capture key properties of the genome-scale models and preserve consistency in terms of biomass and by-product yields, flux and concentration variability and gene essentiality. The development of these "consistently-reduced" models will help to clarify and facilitate integration of different experimental data to draw new understanding that is directly extendable to genome-scale models.
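redGEM combines graph-based search with optimization; the sketch below illustrates only the graph-search half on a toy metabolite graph (names, topology and the distance bound are invented): keep a chosen core subsystem plus everything within a bounded number of hops, which is the kind of "consistent, criteria-driven" reduction the abstract contrasts with ad hoc manual reductions.

```python
from collections import deque

# Toy directed metabolite graph (adjacency list); an invented miniature
# stand-in for a genome-scale network.
graph = {
    "glc": ["g6p"], "g6p": ["f6p", "6pg"], "f6p": ["fbp"],
    "fbp": ["dhap"], "6pg": ["ru5p"], "ru5p": [], "dhap": ["pyr"], "pyr": [],
}

def reduce_to_core(graph, core, max_dist):
    """Breadth-first expansion of the core subsystem up to max_dist hops."""
    keep = set(core)
    frontier = deque((node, 0) for node in core)
    while frontier:
        node, d = frontier.popleft()
        if d == max_dist:
            continue
        for nbr in graph.get(node, []):
            if nbr not in keep:
                keep.add(nbr)
                frontier.append((nbr, d + 1))
    return keep

core_model = reduce_to_core(graph, core={"g6p"}, max_dist=2)
print(sorted(core_model))
```

In the actual method, an optimization step (not shown) then trims the expanded set while preserving biomass yields and flux ranges of the full model.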

  13. redGEM: Systematic reduction and analysis of genome-scale metabolic reconstructions for development of consistent core metabolic models.

    Directory of Open Access Journals (Sweden)

    Meric Ataman

    2017-07-01

    Full Text Available Genome-scale metabolic reconstructions have proven to be valuable resources in enhancing our understanding of metabolic networks as they encapsulate all known metabolic capabilities of the organisms from genes to proteins to their functions. However, the complexity of these large metabolic networks often hinders their utility in various practical applications. Although reduced models are commonly used for modeling and in integrating experimental data, they are often inconsistent across different studies and laboratories due to differing criteria and levels of detail, which can compromise transferability of the findings and also integration of experimental data from different groups. In this study, we have developed a systematic semi-automatic approach to reduce genome-scale models into core models in a consistent and logical manner focusing on the central metabolism or subsystems of interest. The method minimizes the loss of information using an approach that combines graph-based search and optimization methods. The resulting core models are shown to be able to capture key properties of the genome-scale models and preserve consistency in terms of biomass and by-product yields, flux and concentration variability and gene essentiality. The development of these "consistently-reduced" models will help to clarify and facilitate integration of different experimental data to draw new understanding that is directly extendable to genome-scale models.

  14. Impact of moisture divergence on systematic errors in precipitation around the Tibetan Plateau in a general circulation model

    Science.gov (United States)

    Zhang, Yi; Li, Jian

    2016-11-01

    Current state-of-the-art atmospheric general circulation models tend to strongly overestimate the amount of precipitation around steep mountains, a stubborn systematic error that causes climate drift and degrades model performance. In this study, two contrasting model tests are performed to investigate the sensitivity of precipitation around steep slopes. The first model solves a true moisture advection equation, whereas the second solves an artificial advection equation with an additional moisture divergence term. It is shown that orographic precipitation can be strongly affected by this term: excessive (insufficient) precipitation amounts at the high (low) parts of the steep slopes decrease (increase) when the moisture divergence term is added. The precipitation changes between the two models are primarily attributed to large-scale precipitation, which is directly associated with water vapor saturation and condensation. Numerical weather prediction experiments using these two models suggest that precipitation differences between the models emerge shortly after the model startup. The implications of the results are also discussed.
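The contrast between the two model variants can be mimicked in one dimension: the advective moisture tendency is -u dq/dx, while adding the divergence term -q du/dx turns it into the flux form -d(uq)/dx. The grid, wind and moisture fields below are invented for illustration; the point is that the two tendencies differ exactly where the wind field converges or diverges, as over steep slopes.

```python
import numpy as np

# One explicit Euler time step on a 1-D grid (all numbers invented).
nx, dx, dt = 50, 1.0, 0.1
x = np.arange(nx) * dx
u = 1.0 + 0.5 * np.sin(2 * np.pi * x / (nx * dx))   # varying wind: du/dx != 0
q = np.exp(-((x - 25.0) / 5.0) ** 2)                 # moisture blob

dqdx = np.gradient(q, dx)
dudx = np.gradient(u, dx)

q_adv = q + dt * (-u * dqdx)              # true advection only
q_div = q + dt * (-u * dqdx - q * dudx)   # with the moisture divergence term

# The two solutions differ wherever the flow converges/diverges.
print(np.max(np.abs(q_adv - q_div)))
```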

  15. NATIONAL SURGICAL QUALITY IMPROVEMENT PROGRAM UNDERESTIMATES THE RISK ASSOCIATED WITH MILD AND MODERATE POSTOPERATIVE ACUTE KIDNEY INJURY

    Science.gov (United States)

    Bihorac, Azra; Brennan, Meghan; Baslanti, Tezcan Ozrazgat; Bozorgmehri, Shahab; Efron, Philip A.; Moore, Frederick A.; Segal, Mark S; Hobson, Charles E

    2013-01-01

    Objective In a single-center cohort of surgical patients we assessed the association between postoperative change in serum creatinine (sCr) and adverse outcomes and compared the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP)’s definition for acute kidney injury (NSQIP-AKI) with consensus RIFLE (Risk, Injury, Failure, Loss, and End-stage Kidney) and KDIGO (Kidney Disease: Improving Global Outcomes) definitions. Design Retrospective single center cohort. Setting Academic tertiary medical center. Patients 27,841 adult patients with no previous history of chronic kidney disease undergoing major surgery. Intervention RIFLE defines AKI as change in sCr greater than or equal to 50% while KDIGO uses 0.3 mg/dl change from the reference sCr. Since NSQIP defines AKI as sCr change > 2mg/dl, it may underestimate the risk associated with less severe AKI. Measurements The optimal discrimination limits (ODL) for both percent and absolute sCr changes were calculated by maximizing sensitivity and specificity along the receiver operating characteristic (ROC) curves for postoperative complications and mortality. Main Results Although prevalence of RIFLE-AKI was 37%, only 7% of RIFLE-AKI patients would be diagnosed with AKI using the NSQIP definition. In multivariable logistic models patients with RIFLE or KDIGO-AKI had a 10 times higher odds of dying compared to patients without AKI. The ODLs for change in sCr associated with adverse postoperative outcomes were as low as 0.2 mg/dl while the NSQIP discrimination limit of 2.0 mg/dl had low sensitivity (0.05 – 0.28). Conclusion Current ACS NSQIP definition underestimates the risk associated with mild and moderate AKI otherwise captured by the consensus RIFLE and KDIGO criteria. PMID:23928835
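The "optimal discrimination limit" in the record above is the cutoff that maximizes sensitivity plus specificity along the ROC curve, i.e. Youden's J. A minimal sketch with invented sCr-change data (not study values):

```python
# ODL: the serum-creatinine-change cutoff maximizing sensitivity +
# specificity (Youden's J) over all candidate cutoffs.
def odl(deltas, outcomes):
    """deltas: postoperative sCr changes (mg/dl); outcomes: 1 = complication."""
    pos = sum(outcomes)
    neg = len(outcomes) - pos
    best_cut, best_j = None, -1.0
    for cut in sorted(set(deltas)):
        tp = sum(1 for d, y in zip(deltas, outcomes) if d >= cut and y)
        tn = sum(1 for d, y in zip(deltas, outcomes) if d < cut and not y)
        j = tp / pos + tn / neg - 1.0   # sensitivity + specificity - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Invented illustration: small sCr changes already separate outcomes,
# so the optimal cutoff lands far below a 2.0 mg/dl threshold.
deltas   = [0.0, 0.1, 0.2, 0.3, 0.2, 0.1, 0.5, 0.0, 0.4, 0.2]
outcomes = [0,   0,   1,   1,   1,   0,   1,   0,   1,   0]
print(odl(deltas, outcomes))
```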

  16. Integrated methodological frameworks for modelling agent-based advanced supply chain planning systems: A systematic literature review

    Directory of Open Access Journals (Sweden)

    Luis Antonio Santa-Eulalia

    2011-12-01

    Full Text Available Purpose: The objective of this paper is to provide a systematic literature review of recent developments in methodological frameworks for the modelling and simulation of agent-based advanced supply chain planning systems. Design/methodology/approach: A systematic literature review is provided to identify, select and make an analysis and a critical summary of all suitable studies in the area. It is organized into two blocks: the first one covers agent-based supply chain planning systems in general terms, while the second one specializes the previous search to identify those works explicitly containing methodological aspects. Findings: Among sixty suitable manuscripts identified in the primary literature search, only seven explicitly considered the methodological aspects. In addition, we noted that, in general, the notion of advanced supply chain planning is not considered unambiguously, that the social and individual aspects of the agent society are not taken into account in a clear manner in several studies and that a significant part of the works are of a theoretical nature, with few real-scale industrial applications. An integrated framework covering all phases of the modelling and simulation process is still lacking in the literature visited. Research limitations/implications: The main research limitations are related to the period covered (last four years), the selected scientific databases, the selected language (i.e. English) and the use of only one assessment framework for the descriptive evaluation part. Practical implications: The identification of recent works in the domain and discussion concerning their limitations can help pave the way for new and innovative researches towards a complete methodological framework for agent-based advanced supply chain planning systems. Originality/value: As there are no recent state-of-the-art reviews in the domain of methodological frameworks for agent-based supply chain planning, this paper contributes to

  17. A Systematic Review and Meta-Analysis of Buyang Huanwu Decoction in Animal Model of Focal Cerebral Ischemia

    Directory of Open Access Journals (Sweden)

    Rui-li Wei

    2013-01-01

    Full Text Available Buyang Huanwu Decoction (BHD) is a well-known Chinese herbal prescription for ischemic stroke. The objective of this systematic review and meta-analysis is to provide the current evidence for neuroprotective effects of BHD and its possible mechanisms in animal models of focal ischemia. A systematic literature search, through October 2012, was performed using six databases. The outcome measures assessed were infarct size and/or neurological score. Fifty-six studies with 1270 animals that met the inclusion criteria were identified. The median score for methodological quality was 3 with a range of 2 to 6. Compared with vehicle or no treatment controls, BHD gave a 37% improvement in outcome for all doses ranging from 1.0 g/kg to 60 g/kg at each time point that BHD was administered (P<0.01). Efficacy was higher in mouse models that utilized suture occlusion and temporary ischemia. The neuroprotective effects of BHD involve multiple mechanisms and act upon multiple cell types. In conclusion, BHD possesses substantial neuroprotective effects in experimental stroke, probably as a result of the multitarget therapy strategy typically utilized in traditional Chinese medicine. Future research should examine the presence of possible experimental bias and include an in-depth study of herbal compound preparations.

  18. Compromised motor control in children with DCD: a deficit in the internal model?—A systematic review.

    Science.gov (United States)

    Adams, Imke L J; Lust, Jessica M; Wilson, Peter H; Steenbergen, Bert

    2014-11-01

    A viable hypothesis to explain the compromised motor ability of children with Developmental Coordination Disorder (DCD) suggests a fundamental deficit in their ability to utilize internal models for motor control. Dysfunction in this mode of control is thought to compromise their motor learning capabilities. The aim of this systematic review is to examine the available evidence for the internal modeling deficit (IMD) hypothesis. A systematic review using five databases identified 48 relevant articles. These studies were categorized according to the effector system involved in the evaluation of motor control and were evaluated for methodological quality. In most papers, DSM-IV-TR criteria for the classification of DCD were not completely fulfilled, and possible attentional problems were not accounted for. Results showed compromised control of overt and covert eye movements, dynamic postural control, manual control for tasks that vary in complexity, and motor imagery of manual and whole-body postures. Importantly, this review shows support for the general hypothesis that deficits of predictive control manifest in DCD across effector systems.

  19. Screening to prevent spontaneous preterm birth: systematic reviews of accuracy and effectiveness literature with economic modelling.

    Science.gov (United States)

    Honest, H; Forbes, C A; Durée, K H; Norman, G; Duffy, S B; Tsourapas, A; Roberts, T E; Barton, P M; Jowett, S M; Hyde, C J; Khan, K S

    2009-09-01

    To identify combinations of tests and treatments to predict and prevent spontaneous preterm birth. Searches were run on the following databases up to September 2005 inclusive: MEDLINE, EMBASE, DARE, the Cochrane Library (CENTRAL and Cochrane Pregnancy and Childbirth Group trials register) and MEDION. We also contacted experts including the Cochrane Pregnancy and Childbirth Group and checked reference lists of review articles and papers that were eligible for inclusion. Two series of systematic reviews were performed: (1) accuracy of tests for the prediction of spontaneous preterm birth in asymptomatic women in early pregnancy and in women symptomatic with threatened preterm labour in later pregnancy; (2) effectiveness of interventions with potential to reduce cases of spontaneous preterm birth in asymptomatic women in early pregnancy and to reduce spontaneous preterm birth or improve neonatal outcome in women with a viable pregnancy symptomatic of threatened preterm labour. For the health economic evaluation, a model-based analysis incorporated the combined effect of tests and treatments and their cost-effectiveness. Of the 22 tests reviewed for accuracy, the quality of studies and accuracy of tests was generally poor. Only a few tests had LR+ > 5. In asymptomatic women these were ultrasonographic cervical length measurement and cervicovaginal prolactin and fetal fibronectin screening for predicting spontaneous preterm birth before 34 weeks. In this group, tests with LR- preterm labour, tests with LR+ > 5 were absence of fetal breathing movements, cervical length and funnelling, amniotic fluid interleukin-6 (IL-6), serum CRP for predicting birth within 2-7 days of testing, and matrix metalloprotease-9, amniotic fluid IL-6, cervicovaginal fetal fibronectin and cervicovaginal human chorionic gonadotrophin (hCG) for predicting birth before 34 or 37 weeks. In this group, tests with LR- preterm birth. Smoking cessation programmes, progesterone, periodontal therapy and

  20. A systematic approach to obtain validated partial least square models for predicting lipoprotein subclasses from serum NMR spectra.

    Science.gov (United States)

    Mihaleva, Velitchka V; van Schalkwijk, Daniël B; de Graaf, Albert A; van Duynhoven, John; van Dorsten, Ferdinand A; Vervoort, Jacques; Smilde, Age; Westerhuis, Johan A; Jacobs, Doris M

    2014-01-07

    A systematic approach is described for building validated PLS models that predict cholesterol and triglyceride concentrations in lipoprotein subclasses in fasting serum from a normolipidemic, healthy population. The PLS models were built on diffusion-edited (1)H NMR spectra and calibrated on HPLC-derived lipoprotein subclasses. The PLS models were validated using an independent test set. In addition to total VLDL, LDL, and HDL lipoproteins, statistically significant PLS models were obtained for 13 subclasses, including 5 VLDLs (particle size 64-31.3 nm), 4 LDLs (particle size 28.6-20.7 nm) and 4 HDLs (particle size 13.5-9.8 nm). The best models were obtained for triglycerides in VLDL (0.82 < Q(2) < 0.92) and HDL (0.69 < Q(2) < 0.79) subclasses and for cholesterol in HDL subclasses (0.68 < Q(2) < 0.96). Larger variations in the model performance were observed for triglycerides in LDL subclasses and cholesterol in VLDL and LDL subclasses. The potential of the NMR-PLS model was assessed by comparing the LPD of 52 subjects before and after a 4-week treatment with dietary supplements that were hypothesized to change blood lipids. The supplements induced significant (p < 0.001) changes on multiple subclasses, all of which clearly exceeded the prediction errors.
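The build-calibrate-validate workflow above can be sketched end-to-end: fit a one-component PLS1 model (weight vector proportional to the X-y covariance) on a training set, then compute Q(2), the predictive R(2), on a held-out test set. The data here are synthetic stand-ins for spectra and reference concentrations; everything numeric is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: X plays the role of binned NMR spectra, y an
# HPLC-derived lipoprotein concentration (all numbers invented).
X = rng.normal(size=(200, 20))
beta = np.zeros(20)
beta[:3] = [2.0, -1.0, 0.5]            # only a few informative variables
y = X @ beta + 0.1 * rng.normal(size=200)

X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]
x_mean, y_mean = X_tr.mean(axis=0), y_tr.mean()

# One-component PLS1: weight vector proportional to cov(X, y).
w = (X_tr - x_mean).T @ (y_tr - y_mean)
w /= np.linalg.norm(w)
t = (X_tr - x_mean) @ w                 # training scores
q = (t @ (y_tr - y_mean)) / (t @ t)     # regression of y on the scores

# Predict the independent test set and compute Q^2 (predictive R^2).
y_pred = ((X_te - x_mean) @ w) * q + y_mean
press = np.sum((y_te - y_pred) ** 2)
Q2 = 1.0 - press / np.sum((y_te - y_te.mean()) ** 2)
print(round(Q2, 3))
```

Real PLS implementations deflate X and add further components; one component suffices to show where a Q(2) value like those quoted above comes from.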

  1. Prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy: a systematic review and external validation study.

    Science.gov (United States)

    Hilkens, N A; Algra, A; Greving, J P

    2016-01-01

    ESSENTIALS: Prediction models may help to identify patients at high risk of bleeding on antiplatelet therapy. We identified existing prediction models for bleeding and validated them in patients with cerebral ischemia. Five prediction models were identified, all of which had some methodological shortcomings. Performance in patients with cerebral ischemia was poor. Background Antiplatelet therapy is widely used in secondary prevention after a transient ischemic attack (TIA) or ischemic stroke. Bleeding is the main adverse effect of antiplatelet therapy and is potentially life threatening. Identification of patients at increased risk of bleeding may help target antiplatelet therapy. This study sought to identify existing prediction models for intracranial hemorrhage or major bleeding in patients on antiplatelet therapy and evaluate their performance in patients with cerebral ischemia. We systematically searched PubMed and Embase for existing prediction models up to December 2014. The methodological quality of the included studies was assessed with the CHARMS checklist. Prediction models were externally validated in the European Stroke Prevention Study 2, comprising 6602 patients with a TIA or ischemic stroke. We assessed discrimination and calibration of included prediction models. Five prediction models were identified, of which two were developed in patients with previous cerebral ischemia. Three studies assessed major bleeding, one studied intracerebral hemorrhage and one gastrointestinal bleeding. None of the studies met all criteria of good quality. External validation showed poor discriminative performance, with c-statistics ranging from 0.53 to 0.64 and poor calibration. A limited number of prediction models is available that predict intracranial hemorrhage or major bleeding in patients on antiplatelet therapy. The methodological quality of the models varied, but was generally low. Predictive performance in patients with cerebral ischemia was poor. In order to

  2. Underestimating the Alcohol Content of a Glass of Wine: The Implications for Estimates of Mortality Risk

    Science.gov (United States)

    Britton, Annie; O’Neill, Darragh; Bell, Steven

    2016-01-01

    Aims Increases in glass sizes and wine strength over the last 25 years in the UK are likely to have led to an underestimation of alcohol intake in population studies. We explore whether this probable misclassification affects the association between average alcohol intake and risk of mortality from all causes, cardiovascular disease and cancer. Methods Self-reported alcohol consumption in 1997–1999 among 7010 men and women in the Whitehall II cohort of British civil servants was linked to the risk of mortality until mid-2015. A conversion factor of 8 g of alcohol per wine glass (1 unit) was compared with a conversion of 16 g per wine glass (2 units). Results When applying the higher alcohol content conversion for wine consumption, the proportion of heavy/very heavy drinkers increased from 28% to 41% for men and from 15% to 28% for women. There was a significantly increased risk of death from all causes and from cancer for very heavy drinkers compared with moderate drinkers, both before and after the change in wine conversion; however, the hazard ratios were reduced when the higher wine conversion was used. Conclusions In this population-based study, assuming higher alcohol content in wine glasses changed the estimates of mortality risk. We propose that investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. Prospectively, researchers need to collect more detailed information on alcohol, including serving sizes and strength. Short summary The alcohol content in a wine glass is likely to be underestimated in population surveys as wine strength and serving size have increased in recent years. We demonstrate that in a large cohort study, this underestimation affects estimates of mortality risk. Investigator-led cohorts need to revisit conversion factors based on more accurate estimates of alcohol content in wine glasses. PMID:27261472
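The reclassification effect is pure unit conversion: the same self-reported glasses per week cross a heavy-drinking threshold once a glass counts as 16 g of alcohol instead of 8 g. The cutoff and drinker below are illustrative, not the Whitehall II values.

```python
# Re-grading weekly intake under two per-glass conversions (8 g vs 16 g).
def weekly_grams(glasses_per_week, grams_per_glass):
    return glasses_per_week * grams_per_glass

def category(grams, heavy_cutoff=168.0):
    """Illustrative cutoff: 21 units/week at 8 g/unit = 168 g."""
    return "heavy" if grams > heavy_cutoff else "moderate"

glasses = 14   # e.g. two glasses a day
print(category(weekly_grams(glasses, 8.0)),    # survey default conversion
      category(weekly_grams(glasses, 16.0)))   # revised conversion
```

The same person is "moderate" under the old conversion and "heavy" under the revised one, which is how the heavy-drinker proportions in the abstract jump without any change in reported consumption.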

  3. Connectedness underlies the underestimation of the horizontal vertical illusion in L-shaped configurations.

    Science.gov (United States)

    Cai, Yongchun; Wang, Ci; Song, Chao; Li, Zhi

    2017-05-01

    The L-shaped configuration is a commonly used stimulus configuration in studying the horizontal vertical illusion (HVI). Here, we report that the horizontal vertical illusion is substantially underestimated when the L-shaped configuration is used for evaluating the illusion. Experiment 1 found that, in a length perception task, the perceived length of a vertical bar was about 10% longer than that of a horizontal bar with the same physical size. A similar amount of HVI was found in a length comparison task, in which the length of a horizontal bar was compared to that of a vertical bar and the two bars were presented separately in space or in time. In contrast, when the length comparison task was conducted with the two bars arranged in a connected L-shape, the illusion was halved in strength. Experiments 2 and 3 studied what might be the cause of this L-shape-induced HVI underestimation. Two factors were investigated: the connectedness of the two lines, and the 45° absolute orientation or the 45° inner angle information embedded in the upright isosceles L-shape. The results showed that the HVI strength was not much affected when the 45° absolute orientation and the 45° angle information were made useless for the length comparison task. In contrast, the illusion was significantly reduced in strength whenever the two lines were separated as compared to when they were connected. These results suggest that the connectedness of the two lines underlies the underestimation of the horizontal vertical illusion in L-shaped configurations.

  4. A systematic multiscale modeling and experimental approach to protect grain boundaries in magnesium alloys from corrosion

    Energy Technology Data Exchange (ETDEWEB)

    Horstemeyer, Mark R. [Mississippi State Univ., Mississippi State, MS (United States); Chaudhuri, Santanu [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-09-30

    A multiscale modeling Internal State Variable (ISV) constitutive model was developed that captures the fundamental structure-property relationships. The macroscale ISV model used lower-length-scale simulations (Butler-Volmer and electronic structure results) to inform the ISVs at the macroscale. The chemomechanical ISV model was calibrated and validated against experiments in which magnesium (Mg) alloys were investigated under corrosive environments, coupled with experimental electrochemical studies. Because the ISV chemomechanical model is physically based, it can be used for other material systems to predict corrosion behavior. As such, others can use the chemomechanical model for analyzing corrosion effects on their designs.

  5. A systematic review of care delivery models and economic analyses in lymphedema: health policy impact (2004-2011).

    Science.gov (United States)

    Stout, N L; Weiss, R; Feldman, J L; Stewart, B R; Armer, J M; Cormier, J N; Shih, Y-C T

    2013-03-01

    A project of the American Lymphedema Framework Project (ALFP), this review seeks to examine the policy and economic impact of caring for patients with lymphedema, a common side effect of cancer treatment. This review is the first of its kind undertaken to investigate, coordinate, and streamline lymphedema policy initiatives in the United States with potential applicability worldwide. As part of a large scale literature review aiming to systematically evaluate the level of evidence of contemporary peer-reviewed lymphedema literature (2004 to 2011), publications on care delivery models, health policy, and economic impact were retrieved, summarized, and evaluated by a team of investigators and clinical experts. The review substantiates lymphedema education models and clinical models implemented at the community, health care provider, and individual level that improve delivery of care. The review exposes the lack of economic analysis related to lymphedema. Despite a dearth of evidence, efforts towards policy initiatives at the federal and state level are underway. These initiatives and the evidence to support them are examined and recommendations for translating these findings into clinical practice are made. Medical and community-based disease management interventions, taking on a public approach, are effective delivery models for lymphedema care and demonstrate great potential to improve cancer survivorship care. Efforts to create policy at the federal, state, and local level should target implementation of these models. More research is needed to identify costs associated with the treatment of lymphedema and to model the cost outlays and potential cost savings associated with comprehensive management of chronic lymphedema.

  6. Whole-word response scoring underestimates functional spelling ability for some individuals with global agraphia

    Directory of Open Access Journals (Sweden)

    Andrew Tesla Demarco

    2015-05-01

These data suggest that conventional whole-word scoring may significantly underestimate functional spelling performance. Because by-letter scoring boosted pre-treatment scores to the same extent as post-treatment scores, the magnitude of treatment gains was no greater than estimates from conventional whole-word scoring. Nonetheless, the surprisingly large disparity between conventional whole-word scoring and by-letter scoring suggests that by-letter scoring methods warrant further investigation. Because by-letter analyses may be of interest to others, we plan to make the software tool used in this study available online to researchers and clinicians at large.
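
The contrast between the two scoring schemes can be illustrated with a short sketch (a hypothetical implementation; the study's actual tool may align letters differently). Whole-word scoring credits only an exact match, while by-letter scoring credits the proportion of target letters recovered, here via a sequence alignment:

```python
from difflib import SequenceMatcher

def whole_word_score(target: str, response: str) -> float:
    """Conventional scoring: 1 if the whole word is spelled correctly, else 0."""
    return 1.0 if response == target else 0.0

def by_letter_score(target: str, response: str) -> float:
    """Fraction of target letters recovered, via sequence alignment."""
    if not target:
        return 0.0
    matched = sum(block.size for block in
                  SequenceMatcher(None, target, response).get_matching_blocks())
    return matched / len(target)

# A nearly correct spelling earns no credit under whole-word scoring
# but substantial partial credit under by-letter scoring.
print(whole_word_score("because", "becuase"))   # 0.0
print(by_letter_score("because", "becuase"))    # most letters credited
```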

  7. Chronic rhinosinusitis in Europe - an underestimated disease. A GA(2) LEN study

    DEFF Research Database (Denmark)

    Hastan, D; Fokkens, W J; Bachert, C

    2011-01-01

… Zuberbier T, Jarvis D, Burney P. Chronic rhinosinusitis in Europe - an underestimated disease. A GA(2) LEN study. Allergy 2011; 66: 1216-1223. ABSTRACT: Background: Chronic rhinosinusitis (CRS) is a common health problem, with significant medical costs and impact on general health. Even so, prevalence figures for Europe are unavailable. In this study, conducted by the GA(2) LEN network of excellence, the European Position Paper on Rhinosinusitis and Nasal Polyps (EP(3) OS) diagnostic criteria are applied to estimate variation in the prevalence of chronic rhinosinusitis (CRS) for Europe. Method …

  8. Case management for dementia in primary health care: a systematic mixed studies review based on the diffusion of innovation model

    Directory of Open Access Journals (Sweden)

    Khanassov V

    2014-06-01

Full Text Available Vladimir Khanassov, Isabelle Vedel, Pierre Pluye. Department of Family Medicine, McGill University, Montreal, QC, Canada. Background: The purpose of this study was to examine factors associated with the implementation of case management (CM) interventions in primary health care (PHC) and to develop strategies to enhance its adoption by PHC practices. Methods: This study was designed as a systematic mixed studies review (including quantitative and qualitative studies) with synthesis based on the diffusion of innovation model. A literature search was performed using MEDLINE, PsycInfo, EMBASE, and the Cochrane Database (1995 to August 2012) to identify quantitative (randomized controlled and nonrandomized) and qualitative studies describing the conditions limiting and facilitating successful CM implementation in PHC. The methodological quality of each included study was assessed using the validated Mixed Methods Appraisal Tool. Results: Twenty-three studies (eleven quantitative and 12 qualitative) were included. The characteristics of CM that negatively influence implementation are low CM intensity (eg, infrequent follow-up), large caseload (more than 60 patients per full-time case manager), and approach, ie, reactive rather than proactive. Case managers need specific skills to perform their role (eg, good communication skills) and their responsibilities in PHC need to be clearly delineated. Conclusion: Our systematic review supports a better understanding of factors that can explain inconsistent evidence with regard to the outcomes of dementia CM in PHC. Lastly, strategies are proposed to enhance implementation of dementia CM in PHC. Keywords: systematic mixed studies review, dementia, case management, primary health care, implementation, diffusion of innovation

  9. A systematic review of Markov models evaluating multicomponent disease management programs in diabetes.

    Science.gov (United States)

    Kirsch, Florian

    2015-01-01

Diabetes is the most expensive chronic disease; therefore, disease management programs (DMPs) were introduced. The aim of this review is to determine whether Markov models are adequate to evaluate the cost-effectiveness of complex interventions such as DMPs. Additionally, the quality of the models was evaluated using the Philips and Caro quality appraisals. The five reviewed models incorporated the DMP into the model differently: two models integrated effectiveness rates derived from one clinical trial/meta-analysis, and three models combined interventions from different sources into a DMP. The results range from cost savings and a QALY gain to costs of US$85,087 per QALY. Spearman's rank coefficient showed no correlation between the two quality appraisals. With restrictions to the data selection process, Markov models are adequate to determine the cost-effectiveness of DMPs; however, to allow prioritization of medical services, more flexibility in the models is necessary to enable the evaluation of single additional interventions.
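
As an illustration of the kind of model being appraised here, a minimal Markov cohort sketch follows (a generic three-state example, not any of the five reviewed models; all transition probabilities, costs, and utilities are invented). A unit cohort is cycled annually through well / complication / dead states, accumulating discounted costs and QALYs, and an incremental cost-effectiveness ratio (ICER) is computed between a DMP arm and usual care:

```python
# States: 0 = well, 1 = complication, 2 = dead
def run_cohort(transition, costs, utilities, cycles=40, discount=0.03):
    """Cycle a unit cohort through a Markov model; return (cost, QALYs)."""
    dist = [1.0, 0.0, 0.0]            # everyone starts in the well state
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + discount) ** t                    # discount factor
        total_cost += df * sum(p * c for p, c in zip(dist, costs))
        total_qaly += df * sum(p * u for p, u in zip(dist, utilities))
        dist = [sum(dist[i] * transition[i][j] for i in range(3))
                for j in range(3)]                          # one annual cycle
    return total_cost, total_qaly

# Invented annual transition matrices: the DMP arm has fewer complications.
usual_care = [[0.85, 0.10, 0.05], [0.0, 0.85, 0.15], [0.0, 0.0, 1.0]]
dmp        = [[0.90, 0.06, 0.04], [0.0, 0.88, 0.12], [0.0, 0.0, 1.0]]

c_usual, q_usual = run_cohort(usual_care, costs=[1000, 8000, 0],
                              utilities=[0.85, 0.6, 0.0])
c_dmp, q_dmp = run_cohort(dmp, costs=[1500, 8000, 0],   # DMP adds program cost
                          utilities=[0.85, 0.6, 0.0])
icer = (c_dmp - c_usual) / (q_dmp - q_usual)   # incremental cost per QALY gained
print(f"ICER: ${icer:,.0f} per QALY")
```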

  10. Effects of waveform model systematics on the interpretation of GW150914

    OpenAIRE

    2016-01-01

    Parameter estimates of GW150914 were obtained using Bayesian inference, based on three semi-analytic waveform models for binary black hole coalescences. These waveform models differ from each other in their treatment of black hole spins, and all three models make some simplifying assumptions, notably to neglect sub-dominant waveform harmonic modes and orbital eccentricity. Furthermore, while the models are calibrated to agree with waveforms obtained by full numerical solutions of Einstein's e...

  11. Semi-empirical white dwarf initial-final mass relationships: a thorough analysis of systematic uncertainties due to stellar evolution models

    CERN Document Server

    Salaris, Maurizio; Weiss, Achim; Bertolami, Marcelo Miller

    2008-01-01

    Using the most recent results about white dwarfs in 10 open clusters, we revisit semi-empirical estimates of the initial-final mass relation in star clusters, with emphasis on the use of stellar evolution models. We discuss the influence of these models on each step of the derivation. One intention of our work is to use consistent sets of calculations both for the isochrones and the white dwarf cooling tracks. The second one is to derive the range of systematic errors arising from stellar evolution theory. This is achieved by using different sources for the stellar models and by varying physical assumptions and input data. We find that systematic errors, including the determination of the cluster age, are dominating the initial mass values, while observational uncertainties influence the final mass primarily. After having determined the systematic errors, the initial-final mass relation allows us finally to draw conclusions about the physics of the stellar models, in particular about convective overshooting.

  12. Case management for dementia in primary health care: a systematic mixed studies review based on the diffusion of innovation model.

    Science.gov (United States)

    Khanassov, Vladimir; Vedel, Isabelle; Pluye, Pierre

    2014-01-01

    The purpose of this study was to examine factors associated with the implementation of case management (CM) interventions in primary health care (PHC) and to develop strategies to enhance its adoption by PHC practices. This study was designed as a systematic mixed studies review (including quantitative and qualitative studies) with synthesis based on the diffusion of innovation model. A literature search was performed using MEDLINE, PsycInfo, EMBASE, and the Cochrane Database (1995 to August 2012) to identify quantitative (randomized controlled and nonrandomized) and qualitative studies describing the conditions limiting and facilitating successful CM implementation in PHC. The methodological quality of each included study was assessed using the validated Mixed Methods Appraisal Tool. Twenty-three studies (eleven quantitative and 12 qualitative) were included. The characteristics of CM that negatively influence implementation are low CM intensity (eg, infrequent follow-up), large caseload (more than 60 patients per full-time case manager), and approach, ie, reactive rather than proactive. Case managers need specific skills to perform their role (eg, good communication skills) and their responsibilities in PHC need to be clearly delineated. Our systematic review supports a better understanding of factors that can explain inconsistent evidence with regard to the outcomes of dementia CM in PHC. Lastly, strategies are proposed to enhance implementation of dementia CM in PHC.

  13. Systematic Selection of Key Logistic Regression Variables for Risk Prediction Analyses: A Five-Factor Maximum Model.

    Science.gov (United States)

    Hewett, Timothy E; Webster, Kate E; Hurd, Wendy J

    2017-08-16

The evolution of clinical practice and medical technology has yielded an increasing number of clinical measures and tests to assess a patient's progression and return-to-sport readiness after injury. The plethora of available tests may be burdensome to clinicians in the absence of evidence that demonstrates the utility of a given measurement. Thus, there is a critical need to identify a discrete number of metrics to capture during clinical assessment to effectively and concisely guide patient care. The data sources included PubMed and PubMed Central (PMC) articles on the topic. We present a systematic approach to injury risk analyses and how this concept may be used in algorithms for risk analyses for primary anterior cruciate ligament (ACL) injury in healthy athletes and patients after ACL reconstruction. In this article, we present the five-factor maximum model, which states that in any predictive model, a maximum of five variables will contribute in a meaningful manner to any risk factor analysis. We demonstrate how this model already exists for prevention of primary ACL injury, how this model may guide development of the second ACL injury risk analysis, and how the five-factor maximum model may be applied across the injury spectrum for development of the injury risk analysis.

  14. Orbital stress analysis, part V: systematic approach to validate a finite element model of a human orbit.

    Science.gov (United States)

    Al-sukhun, Jehad; Penttilä, Heikki; Ashammakhi, Nureddin

    2012-05-01

The progress in computer technology and the increased use of finite element analysis in the medical field by nonengineers and medical researchers lead us to believe that there is a need to develop a systematic approach to validate a finite element model (FEM) of a human orbit that simulates part of the maxillofacial skeleton, and to investigate the effects and the clinical significance of changing the geometry, boundary conditions (that is, muscle forces), and orthotropic material properties on the predictive outcome of an FEM of a human orbit. Forty-seven variables affecting the material properties, boundary conditions, and geometry of an FEM of a human orbit including the globe were systematically changed, creating a number of FEMs of the orbit. The effects of the variations were quantified as differences between the principal strain magnitudes modeled by the original FEM (criterion standard), before the sensitivity analyses, and those generated by the changed FEMs. The material properties that had the biggest impact on the predicted principal strains were the shear moduli (up to 21%) and the absence of fatty tissue (up to 75%). The boundary condition properties that had the biggest impact on the predicted principal strains were the superior rectus muscle and canthal ligaments (up to 18% and 23%, respectively). Alterations to the geometry of the orbit, such as an increase in its volume, had the greatest effect on principal strain magnitudes (up to 52%). Changes in geometry, boundary conditions, and orthotropic material properties can induce significant changes in strain patterns. These values must therefore be chosen with care when using finite element modeling techniques. This study also highlights the importance of restoring the orbital fat and volume when reconstructing the orbital floor following a blunt injury. The possibility that the unrestored increase in the orbital volume and the resulting stresses may be a source of globe injuries, causing diplopia

  15. Systematic Analysis of Challenge-Driven Improvements in Molecular Prognostic Models for Breast Cancer

    Science.gov (United States)

    Margolin, Adam A.; Bilal, Erhan; Huang, Erich; Norman, Thea C.; Ottestad, Lars; Mecham, Brigham H.; Sauerwine, Ben; Kellen, Michael R.; Mangravite, Lara M.; Furia, Matthew D.; Vollan, Hans Kristian Moen; Rueda, Oscar M.; Guinney, Justin; Deflaux, Nicole A.; Hoff, Bruce; Schildwachter, Xavier; Russnes, Hege G.; Park, Daehoon; Vang, Veronica O.; Pirtle, Tyler; Youseff, Lamia; Citro, Craig; Curtis, Christina; Kristensen, Vessela N.; Hellerstein, Joseph; Friend, Stephen H.; Stolovitzky, Gustavo; Aparicio, Samuel; Caldas, Carlos; Børresen-Dale, Anne-Lise

    2013-01-01

    Although molecular prognostics in breast cancer are among the most successful examples of translating genomic analysis to clinical applications, optimal approaches to breast cancer clinical risk prediction remain controversial. The Sage Bionetworks–DREAM Breast Cancer Prognosis Challenge (BCC) is a crowdsourced research study for breast cancer prognostic modeling using genome-scale data. The BCC provided a community of data analysts with a common platform for data access and blinded evaluation of model accuracy in predicting breast cancer survival on the basis of gene expression data, copy number data, and clinical covariates. This approach offered the opportunity to assess whether a crowdsourced community Challenge would generate models of breast cancer prognosis commensurate with or exceeding current best-in-class approaches. The BCC comprised multiple rounds of blinded evaluations on held-out portions of data on 1981 patients, resulting in more than 1400 models submitted as open source code. Participants then retrained their models on the full data set of 1981 samples and submitted up to five models for validation in a newly generated data set of 184 breast cancer patients. Analysis of the BCC results suggests that the best-performing modeling strategy outperformed previously reported methods in blinded evaluations; model performance was consistent across several independent evaluations; and aggregating community-developed models achieved performance on par with the best-performing individual models. PMID:23596205

  16. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    KAUST Repository

    Miki, Kenji

    2015-10-01

The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.

  17. Systematic validation of non-equilibrium thermochemical models using Bayesian inference

    Energy Technology Data Exchange (ETDEWEB)

    Miki, Kenji [NASA Glenn Research Center, OAI, 22800 Cedar Point Rd, Cleveland, OH 44142 (United States); Panesi, Marco, E-mail: mpanesi@illinois.edu [Department of Aerospace Engineering, University of Illinois at Urbana-Champaign, 306 Talbot Lab, 104 S. Wright St., Urbana, IL 61801 (United States); Prudhomme, Serge [Département de mathématiques et de génie industriel, Ecole Polytechnique de Montréal, C.P. 6079, succ. Centre-ville, Montréal, QC, H3C 3A7 (Canada)

    2015-10-01

    The validation process proposed by Babuška et al. [1] is applied to thermochemical models describing post-shock flow conditions. In this validation approach, experimental data is involved only in the calibration of the models, and the decision process is based on quantities of interest (QoIs) predicted on scenarios that are not necessarily amenable experimentally. Moreover, uncertainties present in the experimental data, as well as those resulting from an incomplete physical model description, are propagated to the QoIs. We investigate four commonly used thermochemical models: a one-temperature model (which assumes thermal equilibrium among all inner modes), and two-temperature models developed by Macheret et al. [2], Marrone and Treanor [3], and Park [4]. Up to 16 uncertain parameters are estimated using Bayesian updating based on the latest absolute volumetric radiance data collected at the Electric Arc Shock Tube (EAST) installed inside the NASA Ames Research Center. Following the solution of the inverse problems, the forward problems are solved in order to predict the radiative heat flux, QoI, and examine the validity of these models. Our results show that all four models are invalid, but for different reasons: the one-temperature model simply fails to reproduce the data while the two-temperature models exhibit unacceptably large uncertainties in the QoI predictions.
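
The Bayesian calibration step described above can be illustrated generically (a one-parameter grid sketch with synthetic data, not the authors' 16-parameter EAST calibration; the model `q(theta) = theta**2`, the noise level, and the observations are all invented). A prior over a model parameter is updated with Gaussian-noise measurements, and the posterior is then propagated forward to a predicted quantity of interest:

```python
import math

# Synthetic setup (all numbers invented for illustration):
# the model predicts a radiance-like QoI as q(theta) = theta**2,
# and we observe y_i = theta_true**2 + Gaussian noise.
sigma = 0.3
observations = [2.1, 2.4, 1.9, 2.6]           # synthetic data near 1.5**2 = 2.25

grid = [i / 100 for i in range(1, 301)]       # candidate theta values in (0, 3]
prior = [1.0 / len(grid)] * len(grid)         # flat prior

def likelihood(theta):
    """Gaussian likelihood of the observations given theta."""
    return math.prod(
        math.exp(-0.5 * ((y - theta ** 2) / sigma) ** 2) for y in observations)

post = [p * likelihood(th) for th, p in zip(grid, prior)]
z = sum(post)
post = [p / z for p in post]                  # normalised posterior

theta_mean = sum(th * p for th, p in zip(grid, post))
qoi_mean = sum(th ** 2 * p for th, p in zip(grid, post))  # posterior-propagated QoI
print(f"posterior mean theta = {theta_mean:.2f}, predicted QoI = {qoi_mean:.2f}")
```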

  18. Efficacy and safety of regenerative cell therapy for pulmonary arterial hypertension in animal models: a preclinical systematic review protocol.

    Science.gov (United States)

    Suen, Colin M; Zhai, Alex; Lalu, Manoj M; Welsh, Christopher; Levac, Brendan M; Fergusson, Dean; McIntyre, Lauralyn; Stewart, Duncan J

    2016-05-25

    Pulmonary arterial hypertension (PAH) is a rare disease (15 cases per million) that is characterized by widespread loss of the pulmonary microcirculation and elevated pulmonary vascular resistance leading to pathological right ventricular remodeling and ultimately right heart failure. Regenerative cell therapies (i.e., therapies involving cells with stem or progenitor-like properties) could potentially restore the effective lung microcirculation and provide a curative therapy for PAH. Preclinical evidence suggests that regenerative cell therapy using endothelial progenitor cells or mesenchymal stem cells may be beneficial in the treatment of PAH. These findings have led to the completion of a small number of human clinical trials, albeit with modest effect compared to animal studies. The objective of this systematic review is to compare the efficacy and safety of regenerative cell therapies in preclinical models of PAH as well as assess study quality to inform future clinical studies. We will include preclinical studies of PAH in which a regenerative cell type was administered and outcomes compared to a disease control. The primary outcome will be pulmonary hemodynamics as assessed by measurement of right ventricular systolic pressure and/or mean pulmonary arterial pressure. Secondary outcomes will include mortality, survival, right ventricular remodeling, pulmonary vascular resistance, cardiac output, cardiac index, pulmonary acceleration time, tricuspid annular systolic excursion, and right ventricular wall thickness. Electronic searches of MEDLINE and EMBASE databases will be constructed and reviewed by the Peer Review of Electronic Search Strategies (PRESS) process. Search results will be screened independently in duplicate. Data from eligible studies will be extracted, pooled, and analyzed using random effects models. Risk of bias will be assessed using the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool, and

  19. Underestimated risks of recurrent long-range ash dispersal from northern Pacific Arc volcanoes

    Science.gov (United States)

    Bourne, A. J.; Abbott, P. M.; Albert, P. G.; Cook, E.; Pearce, N. J. G.; Ponomareva, V.; Svensson, A.; Davies, S. M.

    2016-07-01

    Widespread ash dispersal poses a significant natural hazard to society, particularly in relation to disruption to aviation. Assessing the extent of the threat of far-travelled ash clouds on flight paths is substantially hindered by an incomplete volcanic history and an underestimation of the potential reach of distant eruptive centres. The risk of extensive ash clouds to aviation is thus poorly quantified. New evidence is presented of explosive Late Pleistocene eruptions in the Pacific Arc, currently undocumented in the proximal geological record, which dispersed ash up to 8000 km from source. Twelve microscopic ash deposits or cryptotephra, invisible to the naked eye, discovered within Greenland ice-cores, and ranging in age between 11.1 and 83.7 ka b2k, are compositionally matched to northern Pacific Arc sources including Japan, Kamchatka, Cascades and Alaska. Only two cryptotephra deposits are correlated to known high-magnitude eruptions (Towada-H, Japan, ca 15 ka BP and Mount St Helens Set M, ca 28 ka BP). For the remaining 10 deposits, there is no evidence of age- and compositionally-equivalent eruptive events in regional volcanic stratigraphies. This highlights the inherent problem of under-reporting eruptions and the dangers of underestimating the long-term risk of widespread ash dispersal for trans-Pacific and trans-Atlantic flight routes.

  20. Underestimated risks of recurrent long-range ash dispersal from northern Pacific Arc volcanoes.

    Science.gov (United States)

    Bourne, A J; Abbott, P M; Albert, P G; Cook, E; Pearce, N J G; Ponomareva, V; Svensson, A; Davies, S M

    2016-01-01

    Widespread ash dispersal poses a significant natural hazard to society, particularly in relation to disruption to aviation. Assessing the extent of the threat of far-travelled ash clouds on flight paths is substantially hindered by an incomplete volcanic history and an underestimation of the potential reach of distant eruptive centres. The risk of extensive ash clouds to aviation is thus poorly quantified. New evidence is presented of explosive Late Pleistocene eruptions in the Pacific Arc, currently undocumented in the proximal geological record, which dispersed ash up to 8000 km from source. Twelve microscopic ash deposits or cryptotephra, invisible to the naked eye, discovered within Greenland ice-cores, and ranging in age between 11.1 and 83.7 ka b2k, are compositionally matched to northern Pacific Arc sources including Japan, Kamchatka, Cascades and Alaska. Only two cryptotephra deposits are correlated to known high-magnitude eruptions (Towada-H, Japan, ca 15 ka BP and Mount St Helens Set M, ca 28 ka BP). For the remaining 10 deposits, there is no evidence of age- and compositionally-equivalent eruptive events in regional volcanic stratigraphies. This highlights the inherent problem of under-reporting eruptions and the dangers of underestimating the long-term risk of widespread ash dispersal for trans-Pacific and trans-Atlantic flight routes.

  1. A statistical study of underestimates of wind speeds by VHF radar

    Directory of Open Access Journals (Sweden)

    L. Thomas

    Full Text Available Comparisons are made between horizontal wind measurements carried out using a VHF-radar system at Aberystwyth (52.4°N, 4.1°W and radiosondes launched from Aberporth, some 50 km to the south-west. The radar wind results are derived from Doppler wind measurements at zenith angles of 6° in two orthogonal planes and in the vertical direction. Measurements on a total of 398 days over a 2-year period are considered, but the major part of the study involves a statistical analysis of data collected during 75 radiosonde flights selected to minimise the spatial separation of the two sets of measurements. Whereas good agreement is found between the two sets of wind direction, radar-derived wind speeds show underestimates of 4–6% compared with radiosonde values over the height range 4–14 km. Studies of the characteristics of this discrepancy in wind speeds have concentrated on its directional dependence, the effects of the spatial separation of the two sets of measurements, and the influence of any uncertainty in the radar measurements of vertical velocities. The aspect sensitivity of radar echoes has previously been suggested as a cause of underestimates of wind speeds by VHF radar. The present statistical treatment and case-studies show that an appropriate correction can be applied using estimates of the effective radar beam angle derived from a comparison of echo powers at zenith angles of 4.2° and 8.5°.
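
The geometry behind the correction can be sketched as follows (a hedged illustration; the study derives the effective angle from measured echo-power ratios, which are not reproduced here). If aspect-sensitive scattering shifts the effective beam direction from the nominal zenith angle θ to a smaller θ_eff, a horizontal wind U contributes U·sin(θ_eff) to the measured radial velocity, so dividing by sin(θ) underestimates U by the factor sin(θ_eff)/sin(θ):

```python
import math

def apparent_wind_factor(nominal_deg: float, effective_deg: float) -> float:
    """Ratio of radar-derived to true horizontal wind speed when the
    effective beam zenith angle is smaller than the nominal one."""
    return (math.sin(math.radians(effective_deg))
            / math.sin(math.radians(nominal_deg)))

# Hypothetical numbers: a 6 degree beam whose effective pointing angle
# is reduced to 5.7 degrees by aspect-sensitive scattering.
factor = apparent_wind_factor(6.0, 5.7)
print(f"radar/true wind ratio: {factor:.3f} "
      f"(about {100 * (1 - factor):.0f}% underestimate)")
```

With these invented numbers the underestimate is a few percent, of the same order as the 4-6% discrepancy reported in the abstract.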

  2. Quantifying the underestimation of relative risks from genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Chris Spencer

    2011-03-01

Full Text Available Genome-wide association studies (GWAS) have identified hundreds of associated loci across many common diseases. Most risk variants identified by GWAS will merely be tags for as-yet-unknown causal variants. It is therefore possible that identification of the causal variant, by fine mapping, will identify alleles with larger effects on genetic risk than those currently estimated from GWAS replication studies. We show that under plausible assumptions, whilst the majority of the per-allele relative risks (RR) estimated from GWAS data will be close to the true risk at the causal variant, some could be considerable underestimates. For example, for an estimated RR in the range 1.2-1.3, there is approximately a 38% chance that it exceeds 1.4 and a 10% chance that it is over 2. We show how these probabilities can vary depending on the true effects associated with low-frequency variants and on the minor allele frequency (MAF) of the most associated SNP. We investigate the consequences of the underestimation of effect sizes for predictions of an individual's disease risk and interpret our results for the design of fine mapping experiments. Although these effects mean that the amount of heritability explained by known GWAS loci is expected to be larger than current projections, this increase is likely to explain a relatively small amount of the so-called "missing" heritability.
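
The mechanism can be sketched with a back-of-envelope attenuation calculation (a linear small-effect approximation with invented numbers, not the paper's full model). When a common SNP imperfectly tags a low-frequency causal variant, the effect estimated at the tag is shrunk roughly by r·sqrt(var_causal/var_tag), where var = 2p(1-p) is the per-genotype variance:

```python
import math

def tag_effect(beta_causal: float, r: float,
               maf_causal: float, maf_tag: float) -> float:
    """Approximate per-allele effect observed at a tag SNP for a causal
    variant in LD (correlation r) with it, under a linear small-effect
    approximation: beta_tag = beta_causal * r * sqrt(var_causal / var_tag)."""
    var = lambda p: 2 * p * (1 - p)
    return beta_causal * r * math.sqrt(var(maf_causal) / var(maf_tag))

# Hypothetical scenario: a low-frequency causal variant (MAF 5%) with
# true RR 2.0, tagged imperfectly (r = 0.6) by a common SNP (MAF 30%).
log_rr_tag = tag_effect(math.log(2.0), r=0.6, maf_causal=0.05, maf_tag=0.30)
print(f"RR estimated at the tag SNP: {math.exp(log_rr_tag):.2f} (true RR 2.0)")
```

Under these invented inputs the tag-SNP RR lands in roughly the 1.2-1.3 range quoted in the abstract while the causal RR is 2.0, illustrating how a modest estimated RR can mask a much larger true effect.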

  3. Drastic underestimation of amphipod biodiversity in the endangered Irano-Anatolian and Caucasus biodiversity hotspots.

    Science.gov (United States)

    Katouzian, Ahmad-Reza; Sari, Alireza; Macher, Jan N; Weiss, Martina; Saboori, Alireza; Leese, Florian; Weigand, Alexander M

    2016-03-01

    Biodiversity hotspots are centers of biological diversity and particularly threatened by anthropogenic activities. Their true magnitude of species diversity and endemism, however, is still largely unknown as species diversity is traditionally assessed using morphological descriptions only, thereby ignoring cryptic species. This directly limits evidence-based monitoring and management strategies. Here we used molecular species delimitation methods to quantify cryptic diversity of the montane amphipods in the Irano-Anatolian and Caucasus biodiversity hotspots. Amphipods are ecosystem engineers in rivers and lakes. Species diversity was assessed by analysing two genetic markers (mitochondrial COI and nuclear 28S rDNA), compared with morphological assignments. Our results unambiguously demonstrate that species diversity and endemism is dramatically underestimated, with 42 genetically identified freshwater species in only five reported morphospecies. Over 90% of the newly recovered species cluster inside Gammarus komareki and G. lacustris; 69% of the recovered species comprise narrow range endemics. Amphipod biodiversity is drastically underestimated for the studied regions. Thus, the risk of biodiversity loss is significantly greater than currently inferred as most endangered species remain unrecognized and/or are only found locally. Integrative application of genetic assessments in monitoring programs will help to understand the true magnitude of biodiversity and accurately evaluate its threat status.

  4. Underestimation rate of lobular intraepithelial neoplasia in vacuum-assisted breast biopsy.

    Science.gov (United States)

Meroni, Stefano; Bozzini, Anna Carla; Pruneri, Giancarlo; Moscovici, Oana Codrina; Maisonneuve, Patrick; Menna, Simona; Penco, Silvia; Meneghetti, Lorenza; Renne, Giuseppe; Cassano, Enrico

    2014-07-01

To evaluate the underestimation rate and clinical relevance of lobular neoplasia (LN) in vacuum-assisted breast biopsy (VABB). A total of 161 cases of LN were retrieved from 6,435 VABB. The histological diagnosis was ALH (atypical lobular hyperplasia) in 80 patients, LCIS (lobular carcinoma in situ) in 69 patients and PLCIS (pleomorphic lobular carcinoma in situ) in 12 patients. Seventy-six patients were operated on within 2 years after VABB and 85 were clinically and radiologically monitored. The mean follow-up was 5.2 years, and the prevalence of malignancy was evaluated in the group of 85 patients. The clinico-pathological characteristics significantly favouring surgery were larger lesions, occurrence of a residual lesion following VABB and histological LCIS and PLCIS subtypes. The VABB underestimation rate as compared to surgery was 7.1% for ALH, 12% for LCIS and 50% for PLCIS. Overall, 11 of the 148 patients included in this survival analysis developed an ipsilateral tumour. Although obtained retrospectively in a relatively small series of patients, our data suggest that only patients with a diagnosis of PLCIS in VABB should be treated with surgery, whereas patients with ALH and LCIS could be monitored by clinical and radiological examinations. • The treatment of ALH and LCIS in VABB is still debated • Some authors favour radical treatment and others a more conservative approach • Only patients with PLCIS in VABB should be treated by surgery.

  5. Population-level impact, herd immunity, and elimination after human papillomavirus vaccination: a systematic review and meta-analysis of predictions from transmission-dynamic models

    DEFF Research Database (Denmark)

    Brisson, Marc; Bénard, Élodie; Drolet, Mélanie;

    2016-01-01

    BackgroundModelling studies have been widely used to inform human papillomavirus (HPV) vaccination policy decisions; however, many models exist and it is not known whether they produce consistent predictions of population-level effectiveness and herd effects. We did a systematic review and meta-a...

  6. Climate Change Projection with Reduced Model Systematic Error over Tropic Pacific

    Science.gov (United States)

    Keenlyside, Noel; Shen, Mao-Lin; Selten, Frank; Wiegerinck, Wim; Duane, Gregory

    2014-05-01

The tropical Pacific is considered a major driver of the global climate system. However, realistic representation of the equatorial Pacific remains a challenge for state-of-the-art global circulation models (GCMs). For example, the multi-model ensemble mean of the CMIP5 historical simulations exhibits large sea surface temperature biases. Here we construct an interactive model ensemble (SUMO) by coupling two atmospheric GCMs (AGCMs) with one ocean GCM (OGCM). Through optimal coupling weights, synchronization of the atmospheric models over the tropical Pacific is enhanced, and the dynamic and thermodynamic feedbacks over the Pacific become more realistic. A set of climate change projections is performed with SUMO, and the results are contrasted with conventional multi-model scenario simulations and a standard flux-corrected model version to identify the main differences.
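
The interactive-ensemble idea can be sketched with a toy supermodel (two imperfect Lorenz-63 systems with invented parameter errors, standing in for SUMO's far more complex AGCM-OGCM coupling). Each step integrates a weighted combination of the two members' tendencies, so opposite parameter errors can cancel in the combined trajectory:

```python
def lorenz(state, sigma, rho, beta=8.0 / 3.0):
    """Lorenz-63 tendencies for a given parameter set."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def step(state, tendencies, dt=0.01):
    """One forward-Euler step."""
    return tuple(s + dt * t for s, t in zip(state, tendencies))

# Two imperfect models (parameters perturbed in opposite directions)
# combined with weights; equal weights average out the parameter errors,
# giving an effective sigma = 10, rho = 28.
w1, w2 = 0.5, 0.5
state = (1.0, 1.0, 1.0)
for _ in range(1000):
    t1 = lorenz(state, sigma=9.0, rho=26.0)    # model 1: parameters too low
    t2 = lorenz(state, sigma=11.0, rho=30.0)   # model 2: parameters too high
    combined = tuple(w1 * a + w2 * b for a, b in zip(t1, t2))
    state = step(state, combined)
print("supermodel state after 1000 steps:", state)
```

In an actual supermodel the weights are trained against observations rather than fixed; equal weights are used here purely for illustration.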

  7. Systematic construction of kinetic models from genome-scale metabolic networks.

    Directory of Open Access Journals (Sweden)

    Natalie J Stanford

    The quantitative effects of environmental and genetic perturbations on metabolism can be studied in silico using kinetic models. We present a strategy for large-scale model construction based on a logical layering of data such as reaction fluxes, metabolite concentrations, and kinetic constants. The resulting models contain realistic standard rate laws and plausible parameters, adhere to the laws of thermodynamics, and reproduce a predefined steady state. These features have not been simultaneously achieved by previous workflows. We demonstrate the advantages and limitations of the workflow by translating the yeast consensus metabolic network into a kinetic model. Despite crudely selected data, the model shows realistic control behaviour, a stable dynamic, and realistic response to perturbations in extracellular glucose concentrations. The paper concludes by outlining how new data can continuously be fed into the workflow and how iterative model building can assist in directing experiments.
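A minimal sketch of the kind of model such a workflow produces: one metabolite governed by a standard rate law (here irreversible Michaelis-Menten), integrated until it reproduces its steady state. All parameter values are illustrative, not taken from the yeast consensus model.

```python
# Toy kinetic model: dS/dt = influx - v(S), with a standard Michaelis-Menten
# rate law. Parameters are illustrative placeholders.

def michaelis_menten(v_max, k_m, s):
    """Standard irreversible Michaelis-Menten rate law v(S) = v_max*S/(k_m+S)."""
    return v_max * s / (k_m + s)

def simulate_to_steady_state(influx=1.0, v_max=10.0, k_m=0.5,
                             s0=0.0, dt=1e-3, steps=100_000):
    """Euler integration of dS/dt = influx - v(S); returns S after `steps` steps."""
    s = s0
    for _ in range(steps):
        s += dt * (influx - michaelis_menten(v_max, k_m, s))
    return s

# At steady state influx == v(S*), so analytically S* = k_m*influx/(v_max - influx).
s_star = simulate_to_steady_state()
s_analytic = 0.5 * 1.0 / (10.0 - 1.0)
```

The simulated concentration converges to the analytic steady state, which is the kind of consistency check ("reproduce a predefined steady state") the workflow enforces.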

  8. Numerical study identifying the factors causing the significant underestimation of the specific discharge estimated using the modified integral pumping test method in a laboratory experiment.

    Science.gov (United States)

    Sun, Kerang

    2015-09-01

    A three-dimensional finite element model is constructed to simulate the experimental conditions presented in a paper published in this journal [Goltz et al., 2009. Validation of two innovative methods to measure contaminant mass flux in groundwater. Journal of Contaminant Hydrology 106 (2009) 51-61] where the modified integral pumping test (MIPT) method was found to significantly underestimate the specific discharge in an artificial aquifer. The numerical model closely replicates the experimental configuration with explicit representation of the pumping well column and skin, allowing for the model to simulate the wellbore flow in the pumping well as an integral part of the porous media flow in the aquifer using the equivalent hydraulic conductivity approach. The equivalent hydraulic conductivity is used to account for head losses due to friction within the wellbore of the pumping well. Applying the MIPT method on the model simulated piezometric heads resulted in a specific discharge that underestimates the true specific discharge in the experimental aquifer by 18.8%, compared with the 57% underestimation of mass flux by the experiment reported by Goltz et al. (2009). Alternative simulation shows that the numerical model is capable of approximately replicating the experiment results when the equivalent hydraulic conductivity is reduced by an order of magnitude, suggesting that the accuracy of the MIPT estimation could be improved by expanding the physical meaning of the equivalent hydraulic conductivity to account for other factors such as orifice losses in addition to frictional losses within the wellbore. Numerical experiments also show that when applying the MIPT method to estimate hydraulic parameters, use of depth-integrated piezometric head instead of the head near the pump intake can reduce the estimation error resulting from well losses, but not the error associated with the well not being fully screened.

  9. Coordinating the Provision of Health Services in Humanitarian Crises: a Systematic Review of Suggested Models

    OpenAIRE

    Lotfi, Tamara; Bou-Karroum, Lama; Darzi, Andrea; Hajjar, Rayan; El Rahyel, Ahmed; El Eid, Jamale; Itani, Mira; Brax, Hneine; Akik, Chaza; Osman, Mona; Hassan, Ghayda; El-Jardali, Fadi; Akl, Elie

    2016-01-01

    Background: Our objective was to identify published models of coordination between entities funding or delivering health services in humanitarian crises, whether the coordination took place during or after the crises. Methods: We included reports describing models of coordination in sufficient detail to allow reproducibility. We also included reports describing implementation of identified models, as case studies. We searched Medline, PubMed, EMBASE, Cochrane Central Register of Controlled Tr...

  10. Sleep Deprivation and Oxidative Stress in Animal Models: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Gabriel Villafuerte

    2015-01-01

    Because the function and mechanisms of sleep are only partially understood, we applied a meta-analysis to address whether sleep function includes antioxidative properties in mice and rats. Given the expansion of knowledge in the sleep field, it would be ambitious to describe all mammals, or other animals, in which sleep shows an antioxidant function. Instead, in this paper we review the current understanding from basic studies in two species to support the hypothesis that sleep is a dynamic resting state with antioxidative properties. We performed a systematic review of articles cited in Medline, Scopus, and Web of Science until March 2015 using the following search terms: sleep or sleep deprivation, combined with oxidative stress, lipid peroxidation, glutathione, nitric oxide, catalase or superoxide dismutase. We found a total of 266 studies. After applying the inclusion and exclusion criteria, 44 articles were included, which are presented and discussed in this study. The complex relationship between sleep duration and oxidative stress is discussed. Further studies should consider molecular and genetic approaches to determine whether disrupted sleep promotes oxidative stress.

  11. A generic and systematic procedure to derive a simplified model from the Anaerobic Digestion Model No. 1 (ADM1)

    OpenAIRE

    Ficara, E.; Leva, A.; Harmand, Jérôme

    2015-01-01

    The Anaerobic Digestion Model No.1 (ADM1) developed by the IWA Task Group for mathematical modelling of anaerobic digestion processes (Batstone et al. (2001) [1]) is a structural model which describes the main biochemical and physicochemical processes. For such purposes, other models have been proposed to describe anaerobic processes with a reduced set of parameters, state variables and processes. Among them, the Anaerobic Model No. 2 (AM2) proposed by Bernard et al. (2001) [2] which describe...

  12. Animal models of prenatal immune challenge and their contribution to the study of schizophrenia: a systematic review

    Directory of Open Access Journals (Sweden)

    D.S. Macêdo

    2012-03-01

    Prenatal immune challenge (PIC) in pregnant rodents produces offspring with abnormalities in behavior, histology, and gene expression that are reminiscent of schizophrenia and autism. Based on this, the goal of this article was to review the main contributions of PIC models, especially the one using the viral-mimetic particle polyriboinosinic-polyribocytidylic acid (poly-I:C), to the understanding of the etiology, biological basis and treatment of schizophrenia. This systematic review consisted of a search of available web databases (PubMed, SciELO, LILACS, PsycINFO, and ISI Web of Knowledge) for original studies published in the last 10 years (May 2001 to October 2011) concerning animal models of PIC, focusing on those using poly-I:C. The results showed that the PIC model with poly-I:C is able to mimic the prodrome and both the positive and negative/cognitive dimensions of schizophrenia, depending on the specific gestational time window of the immune challenge. The model resembles the neurobiology and etiology of schizophrenia and has good predictive value. In conclusion, this model is a robust tool for the identification of novel molecular targets during prenatal life, adolescence and adulthood that might contribute to the development of preventive and/or treatment strategies (targeting specific symptoms, i.e., positive or negative/cognitive) for this devastating mental disorder, also presenting better biosafety compared to viral infection models. One limitation of this model is its incapacity to model the full spectrum of immune responses normally induced by viral exposure.

  13. Systematic evaluation of autoregressive error models as post-processors for a probabilistic streamflow forecast system

    Science.gov (United States)

    Morawietz, Martin; Xu, Chong-Yu; Gottschalk, Lars; Tallaksen, Lena

    2010-05-01

    A post-processor that accounts for hydrologic uncertainty is a necessary component of a probabilistic streamflow forecast system, as it captures the uncertainty introduced by the hydrological model. In this study, different variants of an autoregressive error model that can be used as a post-processor for short- to medium-range streamflow forecasts are evaluated. The deterministic HBV model forms the basis for the streamflow forecast. The general structure of the error models used as post-processors is a first-order autoregressive model of the form d_t = α·d_(t-1) + σ·ε_t, where d_t is the model error (observed minus simulated streamflow) at time t, α and σ are the parameters of the error model, and ε_t is the residual error described through a probability distribution. The following aspects are investigated: (1) use of constant parameters α and σ versus state-dependent parameters, where the state-dependent parameters vary with the states of temperature, precipitation, snow water equivalent and simulated streamflow; (2) use of a standard normal distribution for ε_t versus an empirical distribution function constituted through the normalized residuals of the error model in the calibration period; (3) comparison of two different transformations, logarithmic versus square root, applied to the streamflow data before the error model is applied; the purpose of the transformation is to make the residuals of the error model homoscedastic over the range of streamflow values of different magnitudes. Combining these three characteristics yields eight variants of the autoregressive post-processor. These are calibrated and validated in 55 catchments throughout Norway. The discrete ranked probability score with 99 flow percentiles as standardized thresholds is used for evaluation. In addition, a non-parametric bootstrap is used to construct confidence intervals and evaluate the significance of the results. The main
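The AR(1) post-processor can be sketched in a few lines. The parameter values (α = 0.8, σ = 0.1), the simulated flow, and the recent error below are hypothetical placeholders, not the calibrated values from the study; the logarithmic transformation variant is shown.

```python
import math
import random

# Sketch of the first-order autoregressive error model
#   d_t = alpha * d_(t-1) + sigma * eps_t,  eps_t ~ N(0, 1)
# applied in log-transformed flow space so residuals are roughly homoscedastic.

def ar1_error_step(d_prev, alpha, sigma, rng):
    """One random draw of the AR(1) error model."""
    return alpha * d_prev + sigma * rng.gauss(0.0, 1.0)

def probabilistic_forecast(sim_flow, d_prev, alpha=0.8, sigma=0.1,
                           n_ensemble=1000, seed=42):
    """Ensemble of corrected forecasts: propagate the error model in log space."""
    rng = random.Random(seed)
    log_sim = math.log(sim_flow)
    return [math.exp(log_sim + ar1_error_step(d_prev, alpha, sigma, rng))
            for _ in range(n_ensemble)]

# Hypothetical deterministic forecast of 50 m3/s with a recent log-space error of 0.2:
members = probabilistic_forecast(sim_flow=50.0, d_prev=0.2)
mean_flow = sum(members) / len(members)
```

Because the last error was positive (the model under-predicted), the corrected ensemble shifts above the raw simulated flow, with spread controlled by σ.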

  14. Overview of data-synthesis in systematic reviews of studies on outcome prediction models

    NARCIS (Netherlands)

    T. van den Berg (Tobias); M.W. Heymans (Martijn); O. Leone; D. Vergouw (David); J. Hayden (Jill); A.P. Verhagen (Arianne); H.C. de Vet (Henrica C)

    2013-01-01

    textabstractBackground: Many prognostic models have been developed. Different types of models, i.e. prognostic factor and outcome prediction studies, serve different purposes, which should be reflected in how the results are summarized in reviews. Therefore we set out to investigate how authors of

  15. Overview of data-synthesis in systematic reviews of studies on outcome prediction models.

    NARCIS (Netherlands)

    Berg, T. van den; Heymans, M.W.; Leone, S.S.; Vergouw, D.; Hayden, J.A.; Verhagen, A.P.; Vet, H.C.W. de

    2013-01-01

    Background: Many prognostic models have been developed. Different types of models, i.e. prognostic factor and outcome prediction studies, serve different purposes, which should be reflected in how the results are summarized in reviews. Therefore we set out to investigate how authors of reviews

  16. Comparison of first principles model of beer microfiltration to experiments via systematic parameter identification

    NARCIS (Netherlands)

    Sman, van der R.G.M.; Willigenburg, van G.; Vollebregt, H.M.; Eisner, V.; Mepschen, A.

    2015-01-01

    A first principles microfiltration model based on shear-induced diffusion is compared to experiments performed on the clarification of beer. After performing an identifiability and sensitivity analysis, the model parameters are estimated using global minimization of the sum of least squares. The

  17. Systematic comparison of the behaviors produced by computational models of epileptic neocortex.

    Energy Technology Data Exchange (ETDEWEB)

    Warlaumont, A. S.; Lee, H. C.; Benayoun, M.; Stevens, R. L.; Hereld, M. (CLS-CI); ( MCS); (Univ. of Chicago); (Univ. of Memphis)

    2010-12-01

    Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified) are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.

  18. A systematic comparison of jet quenching in different fluid-dynamical models

    CERN Document Server

    Renk, Thorsten; Heinz, Ulrich; Shen, Chun

    2010-01-01

    Comparing four different (ideal and viscous) hydrodynamic models for the evolution of the medium created in 200 AGeV Au-Au collisions, combined with two different models for the path length dependence of parton energy loss, we study the effects of jet quenching on the emission-angle dependence of the nuclear suppression factor R_AA(phi) and the away-side per trigger yield I_AA(phi). Each hydrodynamic model was tuned to provide a reasonable description of the single-particle transverse momentum spectra for all collision centralities, and the energy loss models were adjusted to yield the same pion nuclear suppression factor in central Au-Au collisions. We find that the experimentally measured in-plane vs. out-of-plane spread in R_AA(phi) is better reproduced by models that shift the weight of the parton energy loss to later times along its path. Among the models studied here, this is best achieved by energy loss models that suppress energy loss at early times, combined with hydrodynamic models that delay the di...

  19. A systematic study of Lyman-Alpha transfer through outflowing shells: Model parameter estimation

    CERN Document Server

    Gronke, Max; Dijkstra, Mark

    2015-01-01

    Outflows promote the escape of Lyman-α (Lyα) photons from dusty interstellar media. The process of radiative transfer through interstellar outflows is often modelled by a spherically symmetric, geometrically thin shell of gas that scatters photons emitted by a central Lyα source. Despite its simplified geometry, this 'shell model' has been surprisingly successful at reproducing observed Lyα line shapes. In this paper we perform automated line fitting on a set of noisy simulated shell model spectra, in order to determine whether degeneracies exist between the different shell model parameters. While there are some significant degeneracies, we find that most parameters are accurately recovered, especially the HI column density (N_HI) and the outflow velocity (v_exp). This work represents an important first step in determining how the shell model parameters relate to the actual physical properties of Lyα sources. To aid further exploration of the parameter space, we ...
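The recovery test described above can be illustrated with a toy version of automated line fitting: generate a noisy synthetic line profile from a simple model, then recover the parameters by least-squares grid search. A Gaussian profile stands in for a real shell-model spectrum, and all values (velocities, noise level) are illustrative.

```python
import math
import random

def line_profile(x, v_exp, width):
    """Toy emission line: a Gaussian centred on the 'outflow velocity' v_exp."""
    return math.exp(-0.5 * ((x - v_exp) / width) ** 2)

def fit_by_grid(xs, ys, v_grid, w_grid):
    """Return the (v_exp, width) pair minimising the sum of squared residuals."""
    best = None
    for v in v_grid:
        for w in w_grid:
            sse = sum((y - line_profile(x, v, w)) ** 2 for x, y in zip(xs, ys))
            if best is None or sse < best[0]:
                best = (sse, v, w)
    return best[1], best[2]

rng = random.Random(0)
xs = [i * 10.0 for i in range(-30, 31)]      # velocity bins, km/s (illustrative)
true_v, true_w = 200.0, 80.0
ys = [line_profile(x, true_v, true_w) + rng.gauss(0.0, 0.02) for x in xs]

v_grid = [v * 10.0 for v in range(0, 41)]    # 0 .. 400 km/s
w_grid = [w * 10.0 for w in range(2, 21)]    # 20 .. 200 km/s
v_fit, w_fit = fit_by_grid(xs, ys, v_grid, w_grid)
```

With low noise the fit lands on (or next to) the true grid point, mirroring the paper's finding that v_exp is accurately recovered; adding degenerate parameters (e.g. two widths that trade off against each other) would flatten the χ² surface instead.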

  20. Contact tracing of tuberculosis: a systematic review of transmission modelling studies.

    Directory of Open Access Journals (Sweden)

    Matt Begun

    The WHO-recommended intervention of Directly Observed Treatment, Short-course (DOTS) appears to have been less successful than expected in reducing the burden of TB in some high-prevalence settings. One strategy for enhancing DOTS is incorporating active case-finding through screening contacts of TB patients, as widely used in low-prevalence settings. Predictive models that incorporate population-level effects on transmission provide one means of predicting the impacts of such interventions. We aim to identify all TB transmission modelling studies addressing contact tracing and to describe and critically assess their modelling assumptions, parameter choices and relevance to policy. We searched MEDLINE, SCOPUS, COMPENDEX, Google Scholar and Web of Science databases for relevant English-language publications up to February 2012. Of the 1285 studies identified, only 5 met our inclusion criteria of models of TB transmission dynamics in human populations designed to incorporate contact tracing as an intervention. Detailed implementation of contact processes was present in only two studies, while only one study presented a model for a high-prevalence, developing-world setting. Some use of relevant data for parameter estimation was made in each study; however, validation of the predicted impact of interventions was not attempted in any of the studies. Despite a large body of literature on TB transmission modelling, few published studies incorporate contact tracing. There is considerable scope for future analyses to make better use of data and to apply individual-based models to facilitate more realistic patterns of infectious contact. Combined with a focus on high-burden settings, this would greatly increase the potential for models to inform the use of contact tracing as a TB control policy.
Our findings highlight the potential for collaborative work between clinicians, epidemiologists and modellers to gather data required to enhance model development

  1. Contact tracing of tuberculosis: a systematic review of transmission modelling studies.

    Science.gov (United States)

    Begun, Matt; Newall, Anthony T; Marks, Guy B; Wood, James G

    2013-01-01

    The WHO recommended intervention of Directly Observed Treatment, Short-course (DOTS) appears to have been less successful than expected in reducing the burden of TB in some high prevalence settings. One strategy for enhancing DOTS is incorporating active case-finding through screening contacts of TB patients as widely used in low-prevalence settings. Predictive models that incorporate population-level effects on transmission provide one means of predicting impacts of such interventions. We aim to identify all TB transmission modelling studies addressing contact tracing and to describe and critically assess their modelling assumptions, parameter choices and relevance to policy. We searched MEDLINE, SCOPUS, COMPENDEX, Google Scholar and Web of Science databases for relevant English language publications up to February 2012. Of the 1285 studies identified, only 5 studies met our inclusion criteria of models of TB transmission dynamics in human populations designed to incorporate contact tracing as an intervention. Detailed implementation of contact processes was only present in two studies, while only one study presented a model for a high prevalence, developing world setting. Some use of relevant data for parameter estimation was made in each study; however, validation of the predicted impact of interventions was not attempted in any of the studies. Despite a large body of literature on TB transmission modelling, few published studies incorporate contact tracing. There is considerable scope for future analyses to make better use of data and to apply individual based models to facilitate more realistic patterns of infectious contact. Combined with a focus on high burden settings this would greatly increase the potential for models to inform the use of contact tracing as a TB control policy. 
Our findings highlight the potential for collaborative work between clinicians, epidemiologists and modellers to gather data required to enhance model development and validation and

  2. A Systematic Review of the Effect of Therapists' Internalized Models of Relationships on the Quality of the Therapeutic Relationship.

    Science.gov (United States)

    Steel, Catherine; Macdonald, James; Schroder, Thomas

    2017-05-15

    Previous reviews have found equivocal evidence of an association between therapists' internalized relational models and the therapeutic relationship and have neglected empirical literature based on Sullivan's notion of introject. This review expanded upon previous reviews to examine the effect of therapist internalized relational models on a broader conceptualization of the therapeutic relationship. Systematic search processes identified 22 papers measuring therapist attachment and/or introject and therapeutic relationship: 19 on therapist attachment, 5 on introject with 2 overlapping. Overall, despite heterogeneity in design and variable methodological quality, evidence suggests that therapist attachment affects therapeutic relationship quality, observed in client-rated evaluation, therapist negative countertransference, empathy, and problems in therapy. Interaction effects between client and therapist attachment style were also found. Evidence suggesting that therapist introject also affects therapeutic relationship quality, including therapists' manner and feelings toward their clients, was stronger. Evidence clearly shows that therapists' internalized relational models affect the therapeutic relationship. More research is necessary to clarify exactly how therapist and client internalized relational models interact and translate these findings into clinical practice. © 2017 Wiley Periodicals, Inc.

  3. Development of prognostic models for patients with traumatic brain injury: a systematic review.

    Science.gov (United States)

    Gao, Jinxi; Zheng, Zhaocong

    2015-01-01

    Outcome prediction following traumatic brain injury (TBI) is a widely investigated field of research. Several outcome prediction models have been developed for prognosis after TBI. There are two main prognostic models: International Mission for Prognosis and Clinical Trials in Traumatic Brain Injury (IMPACT) prognosis calculator and the Corticosteroid Randomization after Significant Head Injury (CRASH) prognosis calculator. The prognosis model has three or four levels: (1) model A included age, motor GCS, and pupil reactivity; (2) model B included predictors from model A with CT characteristics; and (3) model C included predictors from model B with laboratory parameters. In consideration of the fact that interventions after admission, such as ICP management also have prognostic value for outcome predictions and may improve the models' performance, Yuan F et al developed another prediction model (model D) which includes ICP. With the development of molecular biology, a handful of brain injury biomarkers were reported that may improve the predictive power of prognostic models, including neuron-specific enolase (NSE), glial fibrillary acid protein (GFAP), S-100β protein, tumour necrosis factor-alpha (TNF-α), interleukin-6 (IL-6), myelin basic protein (MBP), cleaved tau protein (C-tau), spectrin breakdown products (SBDPs), and ubiquitin C-terminal hydrolase-L1 (UCH-L1), and sex hormones. A total of 40 manuscripts reporting 11 biomarkers were identified in the literature. Many substances have been implicated as potential biomarkers for TBI; however, no single biomarker has shown the necessary sensitivity and specificity for predicting outcome. The limited number of publications in this field underscores the need for further investigation. Through fluid biomarker analysis, the advent of multi-analyte profiling technology has enabled substantial advances in the diagnosis and treatment of a variety of conditions. Application of this technology to create a bio

  4. Clinical Interdisciplinary Collaboration Models and Frameworks From Similarities to Differences: A Systematic Review

    Science.gov (United States)

    Mahdizadeh, Mousa; Heydari, Abbas; Moonaghi, Hossien Karimi

    2015-01-01

    Introduction: Various models of interdisciplinary collaboration in clinical nursing have been presented so far; however, a comprehensive model is not yet available. The purpose of this study is to review the evidence that presented, with a qualitative approach, a model or framework of interdisciplinary collaboration in clinical nursing. Methods: All articles and theses published from 1990 to 10 June 2014, in either English or Persian, that presented a model or framework of clinical collaboration were searched using the databases ProQuest, Scopus, PubMed, Science Direct, and the Iranian databases SID, Magiran, and Iranmedex. In this review, keywords according with MeSH, such as nurse-physician relations, care team, collaboration, and interdisciplinary relations, and their Persian equivalents, were used for published articles and theses. Results: In this study, contexts, processes and outcomes of interdisciplinary collaboration were extracted as findings. One of the major components affecting collaboration that most of the models emphasized was the background of collaboration. Most studies suggested that the outcomes of collaboration were improved care, doctors' and nurses' satisfaction, cost control, reduced clinical errors and improved patient safety. Conclusion: Models and frameworks had different structures, backgrounds, and conditions, but the outcomes were similar. Organizational structure, culture and social factors are important aspects of clinical collaboration, so to improve the quality and effectiveness of clinical collaboration these factors need to be considered. PMID:26153158

  5. Systematic review of the synergist muscle ablation model for compensatory hypertrophy.

    Science.gov (United States)

    Terena, Stella Maris Lins; Fernandes, Kristianne Porta Santos; Bussadori, Sandra Kalill; Deana, Alessandro Melo; Mesquita-Ferrari, Raquel Agnelli

    2017-02-01

    The aim was to evaluate the effectiveness of the experimental synergist muscle ablation model to promote muscle hypertrophy, determine the period of greatest hypertrophy and its influence on muscle fiber types, and determine differences between bilateral and unilateral removal in order to reduce the number of animals used in this model. Following the application of the eligibility criteria for mechanical overload of the plantaris muscle in rats, nineteen papers were included in the review. The results reveal the greatest hypertrophy occurring between days 12 and 15, and based on the findings, synergist muscle ablation is an efficient model for achieving rapid hypertrophy, and the contralateral limb can be used as a control since there was no difference between unilateral and bilateral surgery, which reduces the number of animals used in this model. This model differs from other overload models (exercise and training) in the characteristics involved in the hypertrophy process (acute) and results in a chronic muscle adaptation with selective regulation and modification of fast-twitch fibers in skeletal muscle. This is an efficient and rapid model for compensatory hypertrophy.

  6. Systematic review of the synergist muscle ablation model for compensatory hypertrophy

    Directory of Open Access Journals (Sweden)

    Stella Maris Lins Terena

    Summary Objective: The aim was to evaluate the effectiveness of the experimental synergist muscle ablation model to promote muscle hypertrophy, determine the period of greatest hypertrophy and its influence on muscle fiber types, and determine differences between bilateral and unilateral removal in order to reduce the number of animals used in this model. Method: Following the application of the eligibility criteria for mechanical overload of the plantaris muscle in rats, nineteen papers were included in the review. Results: The results reveal the greatest hypertrophy occurring between days 12 and 15, and based on the findings, synergist muscle ablation is an efficient model for achieving rapid hypertrophy, and the contralateral limb can be used as a control since there was no difference between unilateral and bilateral surgery, which reduces the number of animals used in this model. Conclusion: This model differs from other overload models (exercise and training) in the characteristics involved in the hypertrophy process (acute) and results in a chronic muscle adaptation with selective regulation and modification of fast-twitch fibers in skeletal muscle. This is an efficient and rapid model for compensatory hypertrophy.

  7. Systematic Review and Meta-Analysis of Bone Marrow-Derived Mononuclear Cells in Animal Models of Ischemic Stroke.

    Science.gov (United States)

    Vahidy, Farhaan S; Rahbar, Mohammad H; Zhu, Hongjian; Rowan, Paul J; Bambhroliya, Arvind B; Savitz, Sean I

    2016-06-01

    Bone marrow-derived mononuclear cells (BMMNCs) offer the promise of augmenting poststroke recovery. There is mounting evidence of safety and efficacy of BMMNCs from preclinical studies of ischemic stroke; however, their pooled effects have not been described. Using Preferred Reporting Items for Systematic Review and Meta-Analysis guidelines, we conducted a systematic review of preclinical literature for intravenous use of BMMNCs followed by meta-analyses of histological and behavioral outcomes. Studies were selected based on predefined criteria. Data were abstracted by 2 independent investigators. After quality assessment, the pooled effects were generated using mixed-effect models. Impact of possible biases on estimated effect size was evaluated. Standardized mean difference and 95% confidence interval for reduction in lesion volume was significantly beneficial for BMMNC treatment (standardized mean difference: -3.3; 95% confidence interval, -4.3 to -2.3). n=113 each for BMMNC and controls. BMMNC-treated animals (n=161) also had improved function measured by cylinder test (standardized mean difference: -2.4; 95% confidence interval, -3.1 to -1.6), as compared with controls (n=205). A trend for benefit was observed for adhesive removal test and neurological deficit score. Study quality score (median: 6; Q1-Q3: 5-7) was correlated with year of publication. There was funnel plot asymmetry; however, the pooled effects were robust to the correction of this bias and remained significant in favor of BMMNC treatment. BMMNCs demonstrate beneficial effects across histological and behavioral outcomes in animal ischemic stroke models. Although study quality has improved over time, considerable degree of heterogeneity calls for standardization in the conduct and reporting of experimentation. © 2016 American Heart Association, Inc.
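The standardized mean differences pooled above can be reproduced in miniature. The sketch below computes Cohen's d with a pooled standard deviation and combines studies with simple inverse-variance (fixed-effect) weighting; the review itself used mixed-effect models, and the study data here are made up for illustration (negative SMD = smaller lesion volume in treated animals).

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """SMD with pooled standard deviation; negative favours treatment here."""
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / sp

def smd_variance(d, n_t, n_c):
    """Approximate sampling variance of an SMD."""
    return (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))

def pool_fixed_effect(effects_and_vars):
    """Inverse-variance weighted pooled effect with a 95% confidence interval."""
    weights = [1.0 / v for _, v in effects_and_vars]
    pooled = sum(w * d for (d, _), w in zip(effects_and_vars, weights)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical studies: (treated mean, control mean, sd_t, sd_c, n_t, n_c)
studies = [(40.0, 90.0, 20.0, 25.0, 10, 10),
           (55.0, 100.0, 18.0, 22.0, 12, 12)]
evs = []
for mt, mc, st, sc, nt, nc in studies:
    d = cohens_d(mt, mc, st, sc, nt, nc)
    evs.append((d, smd_variance(d, nt, nc)))
pooled, ci = pool_fixed_effect(evs)
```

A pooled SMD whose confidence interval excludes zero, as in the lesion-volume result quoted above, is what "significantly beneficial" means in this context.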

  8. A Systematic Review to Assess Sugar-Sweetened Beverage Interventions for Children and Adolescents across the Socioecological Model.

    Science.gov (United States)

    Lane, Hannah; Porter, Kathleen; Estabrooks, Paul; Zoellner, Jamie

    2016-08-01

    Sugar-sweetened beverage (SSB) consumption among children and adolescents is a determinant of childhood obesity. Many programs to reduce consumption across the socioecological model report significant positive results; however, the generalizability of the results, including whether reporting differences exist among socioecological strategy levels, is unknown. This systematic review aimed to examine the extent to which studies reported internal and external validity indicators defined by the reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) model and assess reporting differences by socioecological level: intrapersonal/interpersonal (Level 1), environmental/policy (Level 2), and multilevel (Combined Level). A systematic literature review was conducted in six major databases (PubMed, Web of Science, CINAHL, CAB Abstracts, Education Resources Information Center, and AGRICOLA) to identify studies from 2004-2015 meeting inclusion criteria (children aged 3 to 12 years, adolescents aged 13 to 17 years, and young adults aged 18 years; experimental or quasi-experimental design; and a substantial SSB component). Interventions were categorized by socioecological level, and data were extracted using a validated RE-AIM protocol. One-way analysis of variance assessed differences between levels. There were 55 eligible studies accepted, including 21 Level 1, 18 Level 2, and 16 Combined Level studies. Thirty-six studies (65%) were conducted in the United States, 19 studies (35%) were conducted internationally, and 39 studies (71%) were implemented in schools. Across levels, reporting averages were low for all RE-AIM dimensions (reach = 29%, efficacy or effectiveness = 45%, adoption = 26%, implementation = 27%, and maintenance = 14%). Level 2 studies had significantly lower reporting on reach and effectiveness (10% and 26%, respectively) compared with Level 1 (44% and 57%, respectively) or Combined Level studies (31% and 52%, respectively) (P…) …consumption in children and adolescents

  9. A systematic study of multiple minerals precipitation modelling in wastewater treatment

    DEFF Research Database (Denmark)

    Kazadi Mbamba, Christian; Tait, Stephan; Flores-Alsina, Xavier

    2015-01-01

    Mineral solids precipitation is important in wastewater treatment. However, approaches to minerals precipitation modelling are varied, often empirical, and mostly focused on single precipitate classes. A common approach, applicable to multi-species precipitates, is needed to integrate into existin...

  10. Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis

    Science.gov (United States)

    2016-07-27

    Conference paper. Dates covered: 10 June 2016 - 27 July 2016. Title: Hydrocarbon Fuel Thermal Performance Modeling based on Systematic Measurement and Comprehensive Chromatographic Analysis. Performing organizations: The Johns Hopkins University Energetics Research Group (JHU/ERG), Columbia, MD and University of Washington, Seattle, WA. Abstract: Ensuring fuel ...is a common requirement for aircraft, rockets, and hypersonic vehicles. The Aerospace Fuels Quality Test and Model Development (AFQTMoDev) project

  11. Surgeons often underestimate the amount of blood loss in replacement surgeries

    Institute of Scientific and Technical Information of China (English)

    Ganesan Ganesan Ram; Perumal Suresh; Phagal Varthi Vijayaraghavan

    2014-01-01

    Objective: To assess the accuracy of the clinically estimated blood loss (EBL) when compared with the actual blood loss (ABL) in replacement surgeries. Methods: This prospective study was done in Sri Ramachandra Medical Centre from April 2011 to April 2013. Altogether 140 patients undergoing total hip replacement or total knee replacement were included, with the inclusion criteria being patients with haemoglobin higher than 100 g/L and a coagulation profile within normal limits. Exclusion criteria were intake of antiplatelet drugs or anticoagulants, bleeding disorders, thrombotic episodes, and haematological disorders. There were 65 men and 75 women. In this study, the consultants were free to use any clinical method to estimate the blood loss, including counting the blood-soaked mops and gauze pieces (estimating the volume of blood carried in all the mops and gauzes), measuring blood lost to suction bottles and blood in and around the operative field. The ABL was calculated based on a modification of Gross's formula using haematocrit values. Results: In 42 of the 140 cases, the EBL exceeded the ABL. These cases had a negative difference in blood loss (DIFF-BL<0) and were included in the overestimation group, which accounted for 30% of the study population. Of the remaining 98 cases (70%), the ABL exceeded the EBL. Therefore they were put into the underestimation group, which had a positive difference in blood loss (DIFF-BL>0). We found that when the average blood loss was small, the accuracy of estimation was high. But when the average blood loss exceeded 500 ml, the accuracy rate decreased significantly. This suggests that clinical estimation becomes inaccurate as blood loss increases. Conclusion: This study has shown that using clinical estimation alone to guide blood transfusion is inadequate. In this study, 70% of patients had their blood loss underestimated, showing that surgeons often underestimate blood loss in replacement surgeries.
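
    The actual blood loss above was derived from haematocrit values via a modification of Gross's formula. A minimal sketch of the unmodified Gross formula (the study's exact modification is not given here; the 70 ml/kg blood-volume factor and the worked values are illustrative assumptions):

```python
def estimated_blood_volume(weight_kg, ml_per_kg=70.0):
    """Estimated blood volume (EBV); 70 ml/kg is a common adult average."""
    return weight_kg * ml_per_kg

def gross_blood_loss(weight_kg, hct_initial, hct_final, ml_per_kg=70.0):
    """Gross's formula: ABL = EBV * (Hct_i - Hct_f) / Hct_mean.
    Haematocrits are given as fractions (e.g. 0.40)."""
    ebv = estimated_blood_volume(weight_kg, ml_per_kg)
    hct_mean = (hct_initial + hct_final) / 2.0
    return ebv * (hct_initial - hct_final) / hct_mean

# Example: a 70 kg patient whose haematocrit falls from 0.40 to 0.32
loss_ml = gross_blood_loss(70, 0.40, 0.32)  # about 1089 ml
```

    Because the denominator uses the mean haematocrit, the formula grows steeper as the haematocrit drop widens, which is consistent with clinical estimation degrading at larger losses.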

  12. Surgeons often underestimate the amount of blood loss in replacement surgeries

    Directory of Open Access Journals (Sweden)

    Ram Ganesan Ganesan

    2014-07-01

    Full Text Available Objective: To assess the accuracy of the clinically estimated blood loss (EBL) when compared with the actual blood loss (ABL) in replacement surgeries. Methods: This prospective study was done in Sri Ramachandra Medical Centre from April 2011 to April 2013. Altogether 140 patients undergoing total hip replacement or total knee replacement were included, with the inclusion criteria being patients with haemoglobin higher than 100 g/L and a coagulation profile within normal limits. Exclusion criteria were intake of antiplatelet drugs or anticoagulants, bleeding disorders, thrombotic episodes, and haematological disorders. There were 65 men and 75 women. In this study, the consultants were free to use any clinical method to estimate the blood loss, including counting the blood-soaked mops and gauze pieces (estimating the volume of blood carried in all the mops and gauzes), measuring blood lost to suction bottles and blood in and around the operative field. The ABL was calculated based on a modification of Gross's formula using haematocrit values. Results: In 42 of the 140 cases, the EBL exceeded the ABL. These cases had a negative difference in blood loss (DIFF-BL<0) and were included in the overestimation group, which accounted for 30% of the study population. Of the remaining 98 cases (70%), the ABL exceeded the EBL. Therefore they were put into the underestimation group, which had a positive difference in blood loss (DIFF-BL>0). We found that when the average blood loss was small, the accuracy of estimation was high. But when the average blood loss exceeded 500 ml, the accuracy rate decreased significantly. This suggests that clinical estimation becomes inaccurate as blood loss increases. Conclusion: This study has shown that using clinical estimation alone to guide blood transfusion is inadequate. In this study, 70% of patients had their blood loss underestimated, showing that surgeons often underestimate blood loss in replacement surgeries.

  13. Developing and Optimising the Use of Logic Models in Systematic Reviews: Exploring Practice and Good Practice in the Use of Programme Theory in Reviews

    Science.gov (United States)

    Kneale, Dylan; Thomas, James; Harris, Katherine

    2015-01-01

    Background Logic models are becoming an increasingly common feature of systematic reviews, as is the use of programme theory more generally in systematic reviewing. Logic models offer a framework to help reviewers to ‘think’ conceptually at various points during the review, and can be a useful tool in defining study inclusion and exclusion criteria, guiding the search strategy, identifying relevant outcomes, identifying mediating and moderating factors, and communicating review findings. Methods and Findings In this paper we critique the use of logic models in systematic reviews and protocols drawn from two databases representing reviews of health interventions and international development interventions. Programme theory featured only in a minority of the reviews and protocols included. Despite drawing from different disciplinary traditions, reviews and protocols from both sources shared several limitations in their use of logic models and theories of change, and these were used almost exclusively to depict pictorially the way in which the intervention worked. Logic models and theories of change were consequently rarely used to communicate the findings of the review. Conclusions Logic models have the potential to be an integral aid throughout the systematic reviewing process. The absence of good practice around their use and development may be one reason for the apparent limited utility of logic models in many existing systematic reviews. These concerns are addressed in the second half of this paper, where we offer a set of principles in the use of logic models and an example of how we constructed a logic model for a review of school-based asthma interventions. PMID:26575182

  14. A systematic review of health economic models of opioid agonist therapies in maintenance treatment of non-prescription opioid dependence.

    Science.gov (United States)

    Chetty, Mersha; Kenworthy, James J; Langham, Sue; Walker, Andrew; Dunlop, William C N

    2017-02-24

    Opioid dependence is a chronic condition with substantial health, economic and social costs. The study objective was to conduct a systematic review of published health-economic models of opioid agonist therapy for non-prescription opioid dependence, to review the different modelling approaches identified, and to inform future modelling studies. Literature searches were conducted in March 2015 in eight electronic databases, supplemented by hand-searching reference lists and searches on six National Health Technology Assessment Agency websites. Studies were included if they: investigated populations that were dependent on non-prescription opioids and were receiving opioid agonist or maintenance therapy; compared any pharmacological maintenance intervention with any other maintenance regimen (including placebo or no treatment); and were health-economic models of any type. A total of 18 unique models were included. These used a range of modelling approaches, including Markov models (n = 4), decision tree with Monte Carlo simulations (n = 3), decision analysis (n = 3), dynamic transmission models (n = 3), decision tree (n = 1), cohort simulation (n = 1), Bayesian (n = 1), and Monte Carlo simulations (n = 2). Time horizons ranged from 6 months to lifetime. The most common evaluation was cost-utility analysis reporting cost per quality-adjusted life-year (n = 11), followed by cost-effectiveness analysis (n = 4), budget-impact analysis/cost comparison (n = 2) and cost-benefit analysis (n = 1). Most studies took the healthcare provider's perspective. Only a few models included some wider societal costs, such as productivity loss or costs of drug-related crime, disorder and antisocial behaviour. Costs to individuals and impacts on family and social networks were not included in any model. A relatively small number of studies of varying quality were found. Strengths and weaknesses relating to model structure, inputs and approach were identified across

  15. Finite element modelling of the foot for clinical application: A systematic review.

    Science.gov (United States)

    Behforootan, Sara; Chatzistergos, Panagiotis; Naemi, Roozbeh; Chockalingam, Nachiappan

    2017-01-01

    Over the last two decades finite element modelling has been widely used to give new insight on foot and footwear biomechanics. However its actual contribution for the improvement of the therapeutic outcome of different pathological conditions of the foot, such as the diabetic foot, remains relatively limited. This is mainly because finite element modelling has only been used within the research domain. Clinically applicable finite element modelling can open the way for novel diagnostic techniques and novel methods for treatment planning/optimisation which would significantly enhance clinical practice. In this context this review aims to provide an overview of modelling techniques in the field of foot and footwear biomechanics and to investigate their applicability in a clinical setting. Even though no integrated modelling system exists that could be directly used in the clinic and considerable progress is still required, current literature includes a comprehensive toolbox for future work towards clinically applicable finite element modelling. The key challenges include collecting the information that is needed for geometry design, the assignment of material properties and loading on a patient-specific basis and in a cost-effective and non-invasive way. The ultimate challenge for the implementation of any computational system into clinical practice is to ensure that it can produce reliable results for any person that belongs in the population for which it was developed. Consequently this highlights the need for thorough and extensive validation of each individual step of the modelling process as well as for the overall validation of the final integrated system. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  16. The impact of systematically incomplete and positionally inaccurate landslide inventories on statistical landslide susceptibility models

    Science.gov (United States)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Glade, Thomas

    2016-04-01

    Several publications emphasize that the quality of statistical landslide susceptibility maps is highly dependent on the completeness and positional accuracy of the landslide inventory used as a response variable to produce the underlying models. We assume that erroneous landslide inventories distort relationships between a landslide inventory and its predictors while we hypothesize that the predictive performance of the underlying models is not necessarily worse in comparison to models generated with an accurate and unbiased landslide inventory. The objective of this study was to investigate the effect of incomplete and positionally inaccurate landslide inventories on the results of statistical landslide susceptibility models. An additional aim was to explore the potential of applying multilevel models to tackle the problem of confounded model coefficients as a result of inventory-based biases. The study was conducted for a landslide-prone study area (100 km²) located in the western part of Lower Austria. An accurate earth-slide point inventory (n = 591) was available for that region. The methodological approach consisted of an artificial introduction of biases and positional inaccuracies into the present landslide inventory and a subsequent quantitative (odds ratios, variable importance, non-spatial and spatial cross validation) and qualitative (geomorphic plausibility) evaluation of the modelling results. Two mapping biases were introduced separately by gradually thinning landslide data (0%, 20%, 80%) within (i) forested areas and (ii) selected municipalities. Positional inaccuracies were simulated by gradually changing the original landslide position (0, 5, 10, 20, 50 and 120 m). The resulting inventories were introduced into a logistic regression model while we considered the effects of including or excluding predictors directly related to the respective incompleteness. All incomplete inventories were additionally introduced into a two-level generalized
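
    The artificial biases described above amount to thinning landslide points inside selected zones and shifting the surviving positions by fixed distances. A minimal sketch of such a perturbation step (the zone predicate, coordinates, and seed are illustrative placeholders; the study used actual GIS layers of forests and municipalities):

```python
import math
import random

def perturb_inventory(points, in_zone, thin_fraction, shift_m, seed=42):
    """Thin a fraction of points falling inside a bias zone (e.g. forest),
    then shift every surviving point by shift_m metres in a random direction.
    points: list of (x, y) in metres; in_zone: predicate (x, y) -> bool."""
    rng = random.Random(seed)
    kept = [p for p in points
            if not (in_zone(*p) and rng.random() < thin_fraction)]
    shifted = []
    for x, y in kept:
        theta = rng.uniform(0.0, 2.0 * math.pi)
        shifted.append((x + shift_m * math.cos(theta),
                        y + shift_m * math.sin(theta)))
    return shifted

# Illustrative: thin 80% of points with x < 500 ("forest"), shift all by 20 m
pts = [(float(i * 10), 0.0) for i in range(100)]
biased = perturb_inventory(pts, lambda x, y: x < 500, 0.8, 20.0)
```

    The perturbed inventory would then replace the original response variable in the logistic regression, letting the effect of each bias level be evaluated in isolation.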

  17. Accuracy and reproducibility of dental measurements on tomographic digital models: a systematic review and meta-analysis.

    Science.gov (United States)

    Ferreira, Jamille B; Christovam, Ilana O; Alencar, David S; da Motta, Andréa F J; Mattos, Claudia T; Cury-Saramago, Adriana

    2017-04-26

    The aim of this systematic review with meta-analysis was to assess the accuracy and reproducibility of dental measurements obtained from digital study models generated from CBCT compared with those acquired from plaster models. The electronic databases Cochrane Library, Medline (via PubMed), Scopus, VHL, Web of Science, and System for Information on Grey Literature in Europe were screened to identify articles from 1998 until February 2016. The inclusion criteria were: prospective and retrospective clinical trials in humans; validation and/or comparison articles of dental study models obtained from CBCT and plaster models; and articles that used dental linear measurements as an assessment tool. The methodological quality of the studies was assessed with the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool. A meta-analysis was performed to validate all comparative measurements. The database search identified a total of 3160 items and 554 duplicates were excluded. After reading titles and abstracts, 12 articles were selected. Five articles were included after reading in full. The methodological quality obtained through QUADAS-2 was poor to moderate. In the meta-analysis, there were statistical differences between the mesiodistal widths of mandibular incisors, maxillary canines and premolars, and overall Bolton analysis. Therefore, the measurements considered accurate were maxillary and mandibular crowding, intermolar width, the mesiodistal width of maxillary incisors, mandibular canines and premolars, and that of molars in both arches. Digital models obtained from CBCT were not accurate for all measures assessed. The differences were clinically acceptable for all dental linear measurements, except for maxillary arch perimeter. Digital models are reproducible for all measurements when intraexaminer assessment is considered and need improvement in interexaminer evaluation.

  18. Systematic Site Characterization at Seismic Stations combined with Empirical Spectral Modeling: critical data for local hazard analysis

    Science.gov (United States)

    Michel, Clotaire; Hobiger, Manuel; Edwards, Benjamin; Poggi, Valerio; Burjanek, Jan; Cauzzi, Carlo; Kästli, Philipp; Fäh, Donat

    2016-04-01

    The Swiss Seismological Service operates one of the densest national seismic networks in the world, still rapidly expanding (see http://www.seismo.ethz.ch/monitor/index_EN). Since 2009, every newly instrumented site is characterized following an established procedure to derive realistic 1D VS velocity profiles. In addition, empirical Fourier spectral modeling is performed on the whole network for each recorded event with sufficient signal-to-noise ratio. Besides the source characteristics of the earthquakes, statistical real time analyses of the residuals of the spectral modeling provide a seamlessly updated amplification function with respect to Swiss rock conditions at every station. Our site characterization procedure is mainly based on the analysis of surface waves from passive experiments and includes cross-checks of the derived amplification functions with those obtained through spectral modeling. The systematic use of three component surface-wave analysis, allowing the derivation of both Rayleigh and Love waves dispersion curves, also contributes to the improved quality of the retrieved profiles. The results of site characterisation activities at recently installed strong-motion stations depict the large variety of possible effects of surface geology on ground motion in the Alpine context. Such effects range from de-amplification at hard-rock sites to amplification up to a factor of 15 in lacustrine sediments with respect to the Swiss reference rock velocity model. The derived velocity profiles are shown to reproduce observed amplification functions from empirical spectral modeling. Although many sites are found to exhibit 1D behavior, our procedure allows the detection and qualification of 2D and 3D effects. All data collected during the site characterization procedures in the last 20 years are gathered in a database, implementing a data model proposed for community use at the European scale through NERA and EPOS (www.epos-eu.org). A web stationbook derived from it

  19. Systematic analysis of a xenograft mice model for KSHV+ primary effusion lymphoma (PEL).

    Directory of Open Access Journals (Sweden)

    Lu Dai

    Full Text Available Kaposi's sarcoma-associated herpesvirus is the causative agent of primary effusion lymphoma (PEL), which arises preferentially in the setting of infection with human immunodeficiency virus (HIV). Even with standard cytotoxic chemotherapy, PEL continues to cause high mortality rates, requiring the development of novel therapeutic strategies. PEL xenograft models employing immunodeficient mice have been used to study the in vivo effects of a variety of therapeutic approaches. However, it remains unclear whether these xenograft models entirely reflect clinical presentations of KSHV(+) PEL, especially given the recent description of extracavitary solid tumor variants arising in patients. In addition, effusion and solid tumor cells propagated in vivo exhibit unique biology, differing from one another or from their parental cell lines propagated through in vitro culture. Therefore, we used a KSHV(+) PEL/BCBL-1 xenograft model involving non-obese diabetic/severe-combined immunodeficient (NOD/SCID) mice, and compared characteristics of effusion and solid tumors with their parent cell culture-derived counterparts. Our results indicate that although this xenograft model can be used for the study of effusion and solid lymphoma observed in patients, tumor cells in vivo display features distinct from those passaged in vitro, including the viral lytic gene expression profile, the rate of solid tumor development, host proteins and the complexity of the tumor microenvironment. These items should be carefully considered when the xenograft model is used for testing novel therapeutic strategies against KSHV-related lymphoma.

  20. A systematic review of the main factors that determine agility in sport using structural equation modeling.

    Science.gov (United States)

    Hojka, Vladimir; Stastny, Petr; Rehak, Tomas; Gołas, Artur; Mostowik, Aleksandra; Zawart, Marek; Musálek, Martin

    2016-09-01

    While tests of basic motor abilities such as speed, maximum strength or endurance are well recognized, testing of complex motor functions such as agility remains unresolved in current literature. Therefore, the aim of this review was to evaluate which main factor or factor structures quantitatively determine agility. In methodological detail, this review focused on research that explained or described the relationships between latent variables in a factorial model of agility using approaches such as principal component analysis, factor analysis and structural equation modeling. Four research studies met the defined inclusion criteria. No quantitative empirical research was found that tried to verify the quality of the whole suggested model of the main factors determining agility through the use of a structural equation modeling (SEM) approach or a confirmatory factor analysis. From the whole structure of agility, only change of direction speed (CODS) and some of its subtests were appropriately analyzed. The combination of common CODS tests is reliable and useful to estimate performance in sub-elite athletes; however, for elite athletes, CODS tests must be specific to the needs of a particular sport discipline. Sprinting and jumping tests are stronger factors for CODS than explosive strength and maximum strength tests. The authors suggest the need to verify the agility factorial model by a second generation data analysis technique such as SEM.

  1. Fusion of range camera and photogrammetry: a systematic procedure for improving 3-D models metric accuracy.

    Science.gov (United States)

    Guidi, G; Beraldin, J A; Ciofi, S; Atzeni, C

    2003-01-01

    The generation of three-dimensional (3-D) digital models produced by optical technologies in some cases involves metric errors. This happens when small high-resolution 3-D images are assembled together in order to model a large object. In some applications, as for example 3-D modeling of Cultural Heritage, the problem of metric accuracy is a major issue and no methods are currently available for enhancing it. The authors present a procedure by which the metric reliability of the 3-D model, obtained through iterative alignments of many range maps, can be guaranteed to a known acceptable level. The goal is the integration of the 3-D range camera system with a close range digital photogrammetry technique. The basic idea is to generate a global coordinate system determined by the digital photogrammetric procedure, measuring the spatial coordinates of optical targets placed around the object to be modeled. Such coordinates, set as reference points, allow the proper rigid motion of few key range maps, including a portion of the targets, in the global reference system defined by photogrammetry. The other 3-D images are normally aligned around these locked images with usual iterative algorithms. Experimental results on an anthropomorphic test object, comparing the conventional and the proposed alignment method, are finally reported.
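
    Locking key range maps to the photogrammetric targets, as described above, is in essence a least-squares rigid-motion fit between corresponding point sets. A minimal 2-D sketch of such a fit (the paper works in 3-D with range maps; the closed-form 2-D rotation below is a simplification for illustration, with made-up coordinates):

```python
import math

def rigid_fit_2d(src, dst):
    """Least-squares rigid motion (rotation + translation) mapping 2-D
    source points onto destination points (closed-form 2-D Procrustes)."""
    n = len(src)
    cxs = sum(x for x, _ in src) / n
    cys = sum(y for _, y in src) / n
    cxd = sum(x for x, _ in dst) / n
    cyd = sum(y for _, y in dst) / n
    s = [(x - cxs, y - cys) for x, y in src]
    d = [(x - cxd, y - cyd) for x, y in dst]
    # Optimal rotation angle from the cross- and dot-product sums
    num = sum(sx * dy - sy * dx for (sx, sy), (dx, dy) in zip(s, d))
    den = sum(sx * dx + sy * dy for (sx, sy), (dx, dy) in zip(s, d))
    theta = math.atan2(num, den)
    c, si = math.cos(theta), math.sin(theta)
    tx = cxd - (c * cxs - si * cys)
    ty = cyd - (si * cxs + c * cys)
    return theta, (tx, ty)

def transform(theta, t, p):
    """Apply the fitted rotation and translation to one point."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])

# Illustrative check: recover a known rotation (0.3 rad) and shift (5, -2)
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)]
dst = [transform(0.3, (5.0, -2.0), p) for p in src]
theta, t = rigid_fit_2d(src, dst)
```

    In the paper's setting, `src` would be target positions seen in a key range map and `dst` the photogrammetric coordinates of the same targets; the fitted motion places that range map in the global reference frame before the remaining maps are aligned iteratively around it.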

  2. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    Science.gov (United States)

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model components stoichiometry and formation enthalpies, the proposed modelling methodology has integrated successfully the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in the biochemical reactors. The methodology has been implemented in a plant-wide modelling methodology in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, as illustrative examples of the capability of the methodology, two case studies have been described. In the first one, a predenitrification-nitrification dynamic process has been analysed, with the aim of demonstrating the easy integration of the methodology in any system. In the second case study, the simulation of a thermal model for an ATAD has shown the potential of the proposed methodology for analysing the effect of ventilation and influent characterization.
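
    Hess's law, as used above, reduces the enthalpy change of each transformation to a stoichiometry-weighted sum of formation enthalpies. A minimal sketch of that calculation (the component names and the water-formation example are illustrative, not taken from the paper):

```python
def reaction_enthalpy(stoichiometry, formation_enthalpy):
    """Hess's law: dH_rxn = sum(nu_i * dHf_i), with nu_i < 0 for
    consumed components. stoichiometry maps component -> nu_i;
    formation_enthalpy maps component -> dHf_i in kJ/mol."""
    return sum(nu * formation_enthalpy[c] for c, nu in stoichiometry.items())

# Illustrative: H2 + 0.5 O2 -> H2O(l); dHf(H2) = dHf(O2) = 0,
# dHf(H2O, liquid) = -285.8 kJ/mol
dHf = {"H2": 0.0, "O2": 0.0, "H2O": -285.8}
nu = {"H2": -1.0, "O2": -0.5, "H2O": 1.0}
dH = reaction_enthalpy(nu, dHf)  # negative: the reaction is exothermic
```

    In a multi-phase matrix structure like the one the paper describes, each row of the stoichiometric matrix would supply one such `stoichiometry` mapping, so the enthalpy balance rides along with the conventional mass balances.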

  3. The Efficacy of Trastuzumab in Animal Models of Breast Cancer: A Systematic Review and Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Jiarong Chen

    Full Text Available Breast cancer is one of the most frequent cancers and the second leading cause of cancer death among women. Trastuzumab is an effective treatment, the first monoclonal antibody directed against the human epidermal growth factor receptor 2 (HER2). To inform the development of other effective treatments, we report summary estimates of the efficacy of trastuzumab on survival and tumour volume in animal models of breast cancer. We searched PubMed and EMBASE systematically to identify publications testing trastuzumab in animal models of breast cancer. Data describing tumour volume, median survival and animal features were extracted and we assessed quality using a 12-item checklist. We analysed the impact of study design and quality and evidence for publication bias. We included data from 83 studies reporting 169 experiments using 2076 mice. Trastuzumab treatment caused a substantial reduction in tumour growth, with tumours in treated animals growing to 32.6% of the volume of tumours in control animals (95% CI 27.8%-38.2%). Median survival was prolonged by a factor of 1.45 (1.30-1.62). Many study design and quality features accounted for between-study heterogeneity and we found evidence suggesting publication bias. We have found trastuzumab to be effective in animal breast cancer models across a range of experimental circumstances. However, the presence of publication bias and a low prevalence of measures to reduce bias provide a focus for future improvements in preclinical breast cancer research.

  4. A systematic review of collaborative models for health and education professionals working in school settings and implications for training.

    Science.gov (United States)

    Hillier, S L; Civetta, L; Pridham, L

    2010-11-01

    Collaborative engagement between education and health agencies has become requisite since the establishment of school inclusion policies in many developed countries. For the child with healthcare needs in an educational setting, such collaboration is assumed to be necessary to ensure a coordinated and holistic approach. However, it is less clear how this is best achieved. This secondary research aimed to answer the questions: what are the reported models of best practice to support the collaboration between education and health staff and what are the implications for training strategies at an undergraduate and postgraduate level to affect these models? Systematic review of current literature, with narrative summary. Models of interaction and teamwork are well-described, but not necessarily well-evaluated, in the intersection between schools and health agencies. They include a spectrum from consultative to collaborative and interactive teaming. It is suggested that professionals may not be adequately skilled in, or knowledgeable about, teamwork processes or the unique roles each group can play in collaborations around the health needs of school children. There is a need for robust primary research into the questions identified in this paper, as well as a need for educators and health professionals to receive training in interprofessional teamwork and collaboration beyond their traditional domains. It is suggested such training needs to occur at both the undergraduate and postgraduate levels.

  5. Beta-2 receptor antagonists for traumatic brain injury: a systematic review of controlled trials in animal models.

    Science.gov (United States)

    Ker, K; Perel, P; Blackhall, K

    2009-01-01

    A systematic review and meta-analysis of controlled trials was undertaken to assess the effects of beta-2 receptor antagonists in animal models of traumatic brain injury (TBI). Database and reference list searches were performed to identify eligible studies. Outcome data were extracted on functional status, as measured by the grip test or neurological severity score (NSS), and cerebral edema, as measured by brain water content (BWC). Data were pooled using the random-effects model. Seventeen controlled trials involving 817 animals were identified. Overall methodological quality was poor. Results from the grip test suggest that the treatment group maintained grip for a longer period than the control group; pooled weighted mean difference (WMD) = 8.28 (95% CI 5.78-10.78). The treatment group was found to have a lower NSS (i.e., better neurological function); pooled WMD = -3.28 (95% CI -4.72 to -1.85). Analysis of the cerebral edema data showed that the treatment group had a lower BWC than the control; pooled WMD = -0.42 (95% CI -0.59 to -0.26). There was evidence of statistical heterogeneity between comparisons for all outcomes. Evidence for small study effects was found for the grip test and BWC outcomes. The evidence from animal models of TBI suggests that beta-2 receptor antagonists can improve functional outcome and lessen cerebral edema. However, the poor methodological quality of the included studies and presence of small study effects may have influenced these findings.
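
    The pooled weighted mean differences above come from a random-effects model. A minimal sketch of the DerSimonian-Laird estimator commonly used for such pooling (the input data below are illustrative, not the review's study-level data, and the review's actual software and weighting are not specified here):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) with 95% CI.
    effects: per-study mean differences; variances: their sampling variances."""
    k = len(effects)
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * d for wi, d in zip(w, effects)) / sw
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0  # between-study var
    w_star = [1.0 / (v + tau2) for v in variances]        # RE weights
    sws = sum(w_star)
    pooled = sum(wi * d for wi, d in zip(w_star, effects)) / sws
    se = 1.0 / math.sqrt(sws)
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Illustrative: three studies' mean differences and sampling variances
pooled, ci = dersimonian_laird([7.0, 9.5, 8.0], [1.0, 2.0, 1.5])
```

    When Cochran's Q exceeds its degrees of freedom, the between-study variance tau² widens the confidence interval relative to a fixed-effect analysis, which matches the heterogeneity noted in the abstract.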

  6. Systematic Evaluation of Methods for Integration of Transcriptomic Data into Constraint-Based Models of Metabolism

    DEFF Research Database (Denmark)

    Machado, Daniel; Herrgard, Markus

    2014-01-01

    Constraint-based models of metabolism are a widely used framework for predicting flux distributions in genome-scale biochemical networks. The number of published methods for integration of transcriptomic data into constraint-based models has been rapidly increasing. So far the predictive capability...... of these methods has not been critically evaluated and compared. This work presents a survey of recently published methods that use transcript levels to try to improve metabolic flux predictions either by generating flux distributions or by creating context-specific models. A subset of these methods...... of the results to method-specific parameters is also evaluated, as well as their robustness to noise in the data. The results show that none of the methods outperforms the others for all cases. Also, it is observed that for many conditions, the predictions obtained by simple flux balance analysis using growth...

  7. Systematic review of the use of computer simulation modeling of patient flow in surgical care.

    Science.gov (United States)

    Sobolev, Boris G; Sanchez, Victor; Vasilakis, Christos

    2011-02-01

    Computer simulation has been employed to evaluate proposed changes in the delivery of health care. However, little is known about the utility of simulation approaches for analysis of changes in the delivery of surgical care. We searched eight bibliographic databases for this comprehensive review of the literature published over the past five decades, and found 34 publications that reported on simulation models for the flow of surgical patients. The majority of these publications presented a description of the simulation approach: 91% outlined the underlying assumptions for modeling, 88% presented the system requirements, and 91% described the input and output data. However, only half of the publications reported that models were constructed to address the needs of policy-makers, and only 26% reported some involvement of health system managers and policy-makers in the simulation study. In addition, we found a wide variation in the presentation of assumptions, system requirements, input and output data, and results of simulation-based policy analysis.

  8. Topological Bias in Distance-Based Phylogenetic Methods: Problems with Over- and Underestimated Genetic Distances

    Directory of Open Access Journals (Sweden)

    Xuhua Xia

    2006-01-01

    I show several types of topological biases in distance-based methods that use the least-squares method to evaluate branch lengths and the minimum evolution (ME) or the Fitch-Margoliash (FM) criterion to choose the best tree. For a 6-species tree, there are two tree shapes, one with three cherries (a cherry is a pair of adjacent leaves descending from their most recent common ancestor), and the other with two. When genetic distances are underestimated, the 3-cherry tree shape is favored with either the ME or FM criterion. When the genetic distances are overestimated, the ME criterion favors the 2-cherry tree, but the direction of bias with the FM criterion depends on whether negative branches are allowed, i.e. allowing negative branches favors the 3-cherry tree shape but disallowing negative branches favors the 2-cherry tree shape. The extent of the bias is explored by computer simulation of sequence evolution.
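
The least-squares step described here is a small linear system. For the unrooted four-taxon topology ((A,B),(C,D)) with five branches, each pairwise distance is the sum of the branch lengths along the connecting path, so branch lengths follow from ordinary least squares, and the minimum evolution (ME) score is the sum of the fitted lengths. A sketch with made-up, perfectly additive distances:

```python
import numpy as np

# Branches of ((A,B),(C,D)): external branches a, b, c, d and internal e.
# Each row maps one pairwise distance to the branches on its path.
X = np.array([
    [1, 1, 0, 0, 0],  # d(A,B) = a + b
    [1, 0, 1, 0, 1],  # d(A,C) = a + c + e
    [1, 0, 0, 1, 1],  # d(A,D) = a + d + e
    [0, 1, 1, 0, 1],  # d(B,C) = b + c + e
    [0, 1, 0, 1, 1],  # d(B,D) = b + d + e
    [0, 0, 1, 1, 0],  # d(C,D) = c + d
], dtype=float)

true_lengths = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
dists = X @ true_lengths              # perfectly additive toy distances

fitted, *_ = np.linalg.lstsq(X, dists, rcond=None)
me_score = fitted.sum()               # minimum-evolution criterion
print(fitted, me_score)               # recovers the true lengths; ME = 15
```

Systematic under- or overestimation of the input distances perturbs the fitted lengths, and hence which topology minimizes the ME or FM score, which is exactly the bias the paper analyzes.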

  9. Efficient trawl avoidance by mesopelagic fishes causes large underestimation of their biomass

    KAUST Repository

    Kaartvedt, Stein

    2012-06-07

    Mesopelagic fishes occur in all the world’s oceans, but their abundance and consequently their ecological significance remains uncertain. The current global estimate based on net sampling prior to 1980 suggests a global abundance of one gigatonne (10⁹ t) wet weight. Here we report novel evidence of efficient avoidance of such sampling by the most common myctophid fish in the Northern Atlantic, i.e. Benthosema glaciale. We reason that similar avoidance of nets may explain consistently higher acoustic abundance estimates of mesopelagic fish from different parts of the world’s oceans. It appears that mesopelagic fish abundance may be underestimated by one order of magnitude, suggesting that the role of mesopelagic fish in the oceans might need to be revised.

  10. Does verbatim sentence recall underestimate the language competence of near-native speakers?

    Science.gov (United States)

    Schweppe, Judith; Barth, Sandra; Ketzer-Nöltge, Almut; Rummer, Ralf

    2015-01-01

    Verbatim sentence recall is widely used to test the language competence of native and non-native speakers since it involves comprehension and production of connected speech. However, we assume that, to maintain surface information, sentence recall relies particularly on attentional resources, which differentially affects native and non-native speakers. Since language processing is less automatized even in near-natives than in native speakers, processing a sentence in a foreign language while also retaining its surface form may result in cognitive overload. We contrasted the sentence recall performance of German native speakers with that of highly proficient non-natives. Non-natives recalled the sentences significantly more poorly than the natives, but performed equally well on a cloze test. This implies that sentence recall underestimates the language competence of good non-native speakers in mixed groups with native speakers. The findings also suggest that theories of sentence recall need to consider both its linguistic and its attentional aspects.

  11. Hydrolysis of soy isoflavone conjugates using enzyme may underestimate isoflavone concentrations in tissue

    Institute of Scientific and Technical Information of China (English)

    Hebron C. Chang; Myriam Laly; Melody Harrison; Thomas M. Badger

    2005-01-01

    Objective: To investigate the differences between enzymatic hydrolysis and acid hydrolysis for the identification and quantification of isoflavone aglycones from biomatrices. Methods: β-glucuronidase/sulfatase isolated from Helix pomatia (for routine enzymatic hydrolysis) or 6N HCl was used to release glucuronide and sulfate conjugates in serum, urine and tissue samples. Profiles of soy isoflavones after enzymatic or acid hydrolysis in several tissues of rats fed diets containing soy protein isolate were also compared using LC/MS and HPLC-ECD. Results: Acid hydrolysis released more aglycone than enzymatic digestion (P < 0.05) in liver tissue. Total genistein, daidzein and other metabolites were 20% to 60% lower in samples from enzymatic hydrolysis than from acid hydrolysis. Conclusion: These results indicate that unknown factors in tissues reduce the efficiency of enzymatic hydrolysis in releasing isoflavone aglycones, even under optimized conditions. This would underestimate isoflavone tissue concentrations by up to 60%.

  12. Quantification of Underestimation of Physical Activity During Cycling to School When Using Accelerometry

    DEFF Research Database (Denmark)

    Tarp, Jakob; Andersen, Lars B; Østergaard, Lars

    2015-01-01

    …to school. Results: Mean (95% CI) minutes of moderate-to-vigorous physical activity (MVPA) during round-trip commutes was 10.8 (7.1-16.6). Each kilometre of cycling corresponded to an underestimation of 9314 (95% CI: 7719-11238) counts and 2.7 (95% CI: 2.1-3.5) minutes of MVPA. Adjusting for cycling to school increased estimates of MVPA/day by 6.0 (95% CI: 3.8-9.6) minutes. Conclusions: Cycling to and from school contributes substantially to levels of MVPA and to mean counts/min in children, but this activity was not captured by accelerometers. Using distance to school in conjunction with self-reported cycling to school may…
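
Taking the abstract's point estimates at face value (2.7 min of MVPA and 9314 accelerometer counts missed per kilometre cycled), the per-child correction is simple arithmetic. The function below is a hypothetical illustration of that adjustment, not a published algorithm, and ignores the confidence intervals:

```python
# Point estimates from the abstract (95% CIs omitted for brevity)
MVPA_MIN_PER_KM = 2.7   # minutes of MVPA missed per km cycled
COUNTS_PER_KM = 9314    # accelerometer counts missed per km cycled

def adjust_for_cycling(mvpa_minutes: float, km_cycled: float) -> float:
    """Add back the MVPA the accelerometer missed during cycling."""
    return mvpa_minutes + MVPA_MIN_PER_KM * km_cycled

# A child with 30 accelerometer-measured MVPA minutes who cycled 2 km:
print(adjust_for_cycling(30.0, 2.0))  # 35.4
```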

  13. Is the contribution of bacteria to terrestrial carbon budget greatly underestimated?

    Science.gov (United States)

    Braissant, Olivier; Verrecchia, Eric; Aragno, Michel

    2002-07-01

    Some commonly found species of soil bacteria use low molecular weight organic acids as their sole source of carbon and energy. This study shows that acids such as citrate and oxalate (produced in large amounts by fungi and plants) can rapidly be consumed by these bacteria. Two strains, Ralstonia eutropha and Xanthobacter autotrophicus, were cultured on acetate- and citrate-rich media. The resulting CO2 and/or HCO3- reacted with calcium ions to precipitate two polymorphs of calcium carbonate (CaCO3), calcite and vaterite, depending on the quantity of slime produced by the strains. This production of primary calcium carbonate crystals by oxalate- and citrate-degrading bacteria from soil organic carbon sources highlights the existence of an important and underestimated potential carbon sink.

  14. The Effect of ISO 9001 and the EFQM Model on Improving Hospital Performance: A Systematic Review.

    Science.gov (United States)

    Yousefinezhadi, Taraneh; Mohamadi, Efat; Safari Palangi, Hossein; Akbari Sari, Ali

    2015-12-01

    This study aimed to explore the effect of the International Organization for Standardization (ISO) ISO 9001 standard and the European foundation for quality management (EFQM) model on improving hospital performance. PubMed, Embase and the Cochrane Library databases were searched. In addition, Elsevier and Springer were searched as main publishers in the field of health sciences. We included empirical studies with any design that had used ISO 9001 or the EFQM model to improve the quality of healthcare. Data were collected and tabulated into a data extraction sheet that was specifically designed for this study. The collected data included authors' names, country, year of publication, intervention, improvement aims, setting, length of program, study design, and outcomes. Seven out of the 121 studies that were retrieved met the inclusion criteria. Three studies assessed the EFQM model and four studies assessed the ISO 9001 standard. Use of the EFQM model increased the degree of patient satisfaction and the number of hospital admissions and reduced the average length of stay, the delay on the surgical waiting list, and the number of emergency re-admissions. ISO 9001 also increased the degree of patient satisfaction and patient safety, increased cost-effectiveness, improved the hospital admissions process, and reduced the percentage of unscheduled returns to the hospital. Generally, there is a lack of robust and high quality empirical evidence regarding the effects of ISO 9001 and the EFQM model on the quality care provided by and the performance of hospitals. However, the limited evidence shows that ISO 9001 and the EFQM model might improve hospital performance.

  15. A more realistic estimate of the variances and systematic errors in spherical harmonic geomagnetic field models

    DEFF Research Database (Denmark)

    Lowes, F.J.; Olsen, Nils

    2004-01-01

    Most modern spherical harmonic geomagnetic models based on satellite data include estimates of the variances of the spherical harmonic coefficients of the model; these estimates are based on the geometry of the data and the fitting functions, and on the magnitude of the residuals. However, … this led to quite inaccurate variance estimates. We estimate correction factors which range from 1/4 to 20, with the largest increases being for the zonal, m = 0, and sectorial, m = n, terms. With no correction, the OSVM variances give a mean-square vector field error of prediction over the Earth's surface…

  16. Health information regarding diabetes mellitus reduces misconceptions and underestimation of consequences in the general population.

    Science.gov (United States)

    Dorner, Thomas E; Lackinger, Christian; Schindler, Karin; Stein, K Viktoria; Rieder, Anita; Ludvik, Bernhard

    2013-11-01

    To evaluate self-assessed knowledge about diabetes mellitus, to assess determinants of health knowledge and to evaluate consequences of health knowledge on appraisal about consequences of the disease. Population-based computer-assisted web interview survey, supplemented with a paper-and-pencil survey via post. Representative sample of the general Austrian population aged 15 years and older. Men (n = 1935) and women (n = 2065) with and without diabetes mellitus. Some 20.5% of men and 17.7% of women with diabetes, and 46.2% of men and 36.7% of women without diabetes, rated their knowledge about diabetes mellitus to be ‘very bad’ or ‘rather bad’. Individuals with diabetes and individuals with a family member with diabetes rated their information level more often as ‘very good’ or ‘rather good’, with adjusted OR (95% CI) of 1.7 (1.1, 2.8) and 2.1 (1.6, 2.7), respectively, in men and 2.7 (1.5, 4.8) and 2.7 (2.1, 3.5), respectively, in women. Additional significant influencing factors on diabetes knowledge were age and educational level in both sexes, and city size in men. Independent of personal diabetes status, diabetes knowledge was associated with a lower perception of restrictions on the daily life of diabetes patients and with a lower probability of underestimating the health consequences of diabetes. Health knowledge is associated with fewer misconceptions and less underestimation of health consequences in individuals both with and without diabetes mellitus. Thus health information about diabetes is important on the individual level towards disease management as well as on the public health level towards disease prevention.
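
The odds ratios reported above are the standard epidemiological measure; from a 2×2 exposure-outcome table, the OR and its 95% confidence interval follow from the standard error of the log-odds. A generic sketch with made-up counts (these are not the survey's data, and the study's ORs were additionally adjusted for covariates):

```python
import math

def odds_ratio_ci(a, b, c, d):
    """OR and 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - 1.96 * se_log)
    upper = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lower, upper

# Hypothetical counts: 40/60 of respondents with a diabetic family member
# rate their knowledge as good, vs 25/75 of those without.
print(odds_ratio_ci(40, 60, 25, 75))  # OR = 2.0, 95% CI ~ (1.09, 3.66)
```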

  17. How and why DNA barcodes underestimate the diversity of microbial eukaryotes.

    Directory of Open Access Journals (Sweden)

    Gwenael Piganeau

    BACKGROUND: Because many picoplanktonic eukaryotic species cannot currently be maintained in culture, direct sequencing of PCR-amplified 18S ribosomal gene DNA fragments from filtered sea-water has been successfully used to investigate the astounding diversity of these organisms. The recognition of many novel planktonic organisms is thus based solely on their 18S rDNA sequence. However, a species delimited by its 18S rDNA sequence might contain many cryptic species, which are highly differentiated in their protein coding sequences. PRINCIPAL FINDINGS: Here, we investigate the issue of species identification from one gene to the whole genome sequence. Using 52 whole genome DNA sequences, we estimated the global genetic divergence in protein coding genes between organisms from different lineages and compared this to their ribosomal gene sequence divergences. We show that this relationship between proteome divergence and 18S divergence is lineage dependent. Unicellular lineages have especially low 18S divergences relative to their protein sequence divergences, suggesting that 18S ribosomal genes are too conservative to assess planktonic eukaryotic diversity. We provide an explanation for this lineage dependency, which suggests that most species with large effective population sizes will show far less divergence in 18S than protein coding sequences. CONCLUSIONS: There is therefore a trade-off between using genes that are easy to amplify in all species, but which by their nature are highly conserved and underestimate the true number of species, and using genes that give a better description of the number of species, but which are more difficult to amplify. We have shown that this trade-off differs between unicellular and multicellular organisms as a likely consequence of differences in effective population sizes. We anticipate that biodiversity of microbial eukaryotic species is underestimated and that numerous "cryptic species" will become…
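
The divergences discussed here are computed from pairwise sequence comparisons; the standard building blocks are the observed proportion of differing sites (p-distance) and corrections such as Jukes-Cantor, which inflate p to account for multiple substitutions at the same site. A minimal sketch with toy sequences (standard formulas, not the paper's pipeline):

```python
import math

def p_distance(seq1: str, seq2: str) -> float:
    """Observed proportion of differing sites (equal-length sequences)."""
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    return diffs / len(seq1)

def jukes_cantor(p: float) -> float:
    """Jukes-Cantor corrected distance: d = -3/4 * ln(1 - 4p/3)."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

p = p_distance("ACGTACGTAC", "ACGTTCGTAA")   # 2 of 10 sites differ
print(p, jukes_cantor(p))                    # 0.2 -> ~0.233
```

For a highly conserved marker like 18S, p stays small even between deeply diverged unicellular lineages, which is the underestimation mechanism the paper describes.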

  18. Ductal carcinoma in situ diagnosed using an ultrasound-guided 14-gauge core needle biopsy of breast masses: can underestimation be predicted preoperatively?

    Directory of Open Access Journals (Sweden)

    Sung Hee Park

    2014-04-01

    Conclusion: We found a 30.4% rate of DCIS underestimation in breast masses based on a US-14G-CNB. The presence of abnormal lymph nodes as detected on axillary ultrasound may be useful to preoperatively predict underestimation.

  19. Systematic model development for partial nitrification of landfill leachate in a SBR

    DEFF Research Database (Denmark)

    Ganigue, R.; Volcke, E.I.P.; Puig, S.

    2010-01-01

    This study deals with partial nitrification in a sequencing batch reactor (PN-SBR) treating raw urban landfill leachate. In order to enhance process insight (e.g. quantify interactions between aeration, CO2 stripping, alkalinity, pH, nitrification kinetics), a mathematical model has been set up...

  20. Compromised motor control in children with DCD: A deficit in the internal model - a systematic review

    NARCIS (Netherlands)

    Adams, I.L.J.; Lust, J.M.; Wilson, P.H.; Steenbergen, B.

    2014-01-01

    A viable hypothesis to explain the compromised motor ability of children with Developmental Coordination Disorder (DCD) suggests a fundamental deficit in their ability to utilize internal models for motor control. Dysfunction in this mode of control is thought to compromise their motor learning

  1. ScaleNet: A literature-based model of scale insect biology and systematics

    Science.gov (United States)

    Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found in all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis, and plant-insect i...

  2. Systematic construction of a conceptual minimal model of plasma cholesterol levels based on knockout mouse phenotypes

    NARCIS (Netherlands)

    Pas, N.C.A. van de; Soffers, A.E.M.F.; Freidig, A.P.; Ommen, B. van; Woutersen, R.A.; Rietjens, I.M.C.M.; Graaf, A.A. de

    2010-01-01

    Elevated plasma cholesterol, a well-known risk factor for cardiovascular diseases, is the result of the activity of many genes and their encoded proteins in a complex physiological network. We aim to develop a minimal kinetic computational model for predicting plasma cholesterol levels. To define th

  4. A Systematic and Numerically Efficient Procedure for Stable Dynamic Model Inversion of LTI Systems

    NARCIS (Netherlands)

    George, K.; Verhaegen, M.; Scherpen, J.M.A.

    1999-01-01

    Output tracking via the novel Stable Dynamic model Inversion (SDI) technique, applicable to non-minimum phase systems, and which naturally takes into account the presence of noise in target time histories, is considered here. We are motivated by the typical need to replicate time signals in the auto

  5. Efficient systematic scheme to construct second-principles lattice dynamical models

    Science.gov (United States)

    Escorihuela-Sayalero, Carlos; Wojdeł, Jacek C.; Íñiguez, Jorge

    2017-03-01

    We start from the polynomial interatomic potentials introduced by Wojdeł et al. [J. Phys.: Condens. Matter 25, 305401 (2013), 10.1088/0953-8984/25/30/305401] and take advantage of one of their key features—namely, the linear dependence of the energy on the potential's adjustable parameters—to devise a scheme for the construction of first-principles-based (second-principles) models for large-scale lattice-dynamical simulations. Our method presents the following convenient features. The parameters of the model are computed in a very fast and efficient way, as it is possible to recast the fit to a training set of first-principles data into a simple matrix diagonalization problem. Our method selects automatically the interactions that are most relevant to reproduce the training-set data, by choosing from a pool that includes virtually all possible coupling terms, and produces a family of models of increasing complexity and accuracy. We work with practical and convenient cross-validation criteria linked to the physical properties that will be relevant in future simulations based on the new model, and which greatly facilitate the task of identifying a potential that is simultaneously simple (thus computationally light), very accurate, and predictive. We also discuss practical ways to guarantee that our energy models are bounded from below, with a minimal impact on their accuracy. Finally, we demonstrate our scheme with an application to ferroelastic perovskite SrTiO3, which features many nontrivial lattice-dynamical features (e.g., a phase transition driven by soft phonons, competing structural instabilities, highly anharmonic dynamics) and provides a very demanding test.
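
Because the energy in these potentials is linear in the adjustable parameters, fitting to a first-principles training set reduces to ordinary linear least squares: evaluate a pool of candidate coupling terms at the training configurations and solve for the coefficients. A schematic illustration with a made-up one-dimensional anharmonic potential (not the SrTiO3 model):

```python
import numpy as np

rng = np.random.default_rng(0)

# "Training set": displacements u and energies from a hypothetical
# reference potential E(u) = 0.5*k*u^2 + g*u^4 (harmonic + quartic).
k_true, g_true = 2.0, 0.3
u = rng.uniform(-1.0, 1.0, size=200)
energies = 0.5 * k_true * u**2 + g_true * u**4

# Pool of candidate coupling terms: 0.5*u^2, u^3, u^4.
features = np.column_stack([0.5 * u**2, u**3, u**4])

coeffs, *_ = np.linalg.lstsq(features, energies, rcond=None)
print(coeffs)  # ~[2.0, 0.0, 0.3]: the odd term is correctly ~zero
```

The automatic term selection and cross-validation criteria described in the abstract then amount to comparing such fits over nested subsets of the feature pool.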

  6. Are the true impacts of adverse events considered in economic models of antineoplastic drugs? A systematic review.

    Science.gov (United States)

    Pearce, Alison; Haas, Marion; Viney, Rosalie

    2013-12-01

    Antineoplastic drugs for cancer are often associated with adverse events, which influence patients' physical health, quality of life and survival. However, the modelling of adverse events in cost-effectiveness analyses of antineoplastic drugs has not been examined. This article reviews published economic evaluations that include a calculated cost for adverse events of antineoplastic drugs. The aim is to identify how existing models manage four issues specific to antineoplastic drug adverse events: the selection of adverse events for inclusion in models, the influence of dose modifications on drug quantity and survival outcomes, the influence of adverse events on quality of life and the consideration of multiple simultaneous or recurring adverse events. A systematic literature search was conducted using MESH headings and key words in multiple electronic databases, covering the years 1999-2009. Inclusion criteria for eligibility were papers covering a population of adults with solid tumour cancers, the inclusion of at least one adverse event and the resource use and/or costs of adverse event treatment. From 4,985 citations, 26 eligible articles were identified. Studies were generally of moderate quality and addressed a range of cancers and treatment types. While the four issues specific to antineoplastic drug adverse events were addressed by some studies, no study addressed all of the issues in the same model. This review indicates that current modelling assumptions may restrict our understanding of the true impact of adverse events on cost effectiveness of antineoplastic drugs. This understanding could be improved through consideration of the selection of adverse events, dose modifications, multiple events and quality of life in cost-effectiveness studies.

  7. How Do Type Ia Supernova Nebular Spectra Depend on Explosion Properties? Insights from Systematic Non-LTE Modeling

    Science.gov (United States)

    Botyánszki, János; Kasen, Daniel

    2017-08-01

    We present a radiative transfer code to model the nebular phase spectra of supernovae (SNe) in non-LTE (NLTE). We apply it to a systematic study of SNe Ia using parameterized 1D models and show how nebular spectral features depend on key physical parameters, such as the time since explosion, total ejecta mass, kinetic energy, radial density profile, and the masses of ⁵⁶Ni, intermediate-mass elements, and stable iron-group elements. We also quantify the impact of uncertainties in atomic data inputs. We find the following. (1) The main features of SN Ia nebular spectra are relatively insensitive to most physical parameters. Degeneracy among parameters precludes a unique determination of the ejecta properties from spectral fitting. In particular, features can be equally well fit with generic Chandrasekhar-mass (M_Ch), sub-M_Ch, and super-M_Ch models. (2) A sizable (≳0.1 M_⊙) central region of stable iron-group elements, often claimed as evidence for M_Ch models, is not essential to fit the optical spectra and may produce an unusual flat-top [Co III] profile. (3) The strength of [S III] emission near 9500 Å can provide a useful diagnostic of explosion nucleosynthesis. (4) Substantial amounts (≳0.1 M_⊙) of unburned C/O mixed throughout the ejecta produce [O III] emission not seen in observations. (5) Shifts in the wavelength of line peaks can arise from line-blending effects. (6) The steepness of the ejecta density profile affects the line shapes, offering a constraint on explosion models. (7) Uncertainties in atomic data limit the ability to infer physical parameters.

  8. Metabolic Power Method: Underestimation of Energy Expenditure in Field-Sport Movements Using a Global Positioning System Tracking System.

    Science.gov (United States)

    Brown, Darcy M; Dwyer, Dan B; Robertson, Samuel J; Gastin, Paul B

    2016-11-01

    The purpose of this study was to assess the validity of a global positioning system (GPS) tracking system to estimate energy expenditure (EE) during exercise and field-sport locomotor movements. Twenty-seven participants each completed a 90-min exercise session on an outdoor synthetic futsal pitch. During the exercise session, they wore a 5-Hz GPS unit interpolated to 15 Hz and a portable gas analyzer that acted as the criterion measure of EE. The exercise session was composed of alternating 5-minute exercise bouts of randomized walking, jogging, running, or a field-sport circuit (×3) followed by 10 min of recovery. One-way analysis of variance showed significant (P < .01) differences between the GPS metabolic power EE and the oxygen consumption (VO2)-derived EE for all field-sport circuits (% difference ≈ -44%). No differences in EE were observed for the jog (7.8%) and run (4.8%), whereas very large overestimations were found for the walk (43.0%). The GPS metabolic power EE over the entire 90-min session was significantly lower (P < .01) than the VO2 EE, resulting in a moderate underestimation overall (-19%). The results of this study suggest that a GPS tracking system using the metabolic power model of EE does not accurately estimate EE in field-sport movements or over an exercise session consisting of mixed locomotor activities interspersed with recovery periods; however, it is able to provide a reasonably accurate estimation of EE during continuous jogging and running.
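
For context, the metabolic power approach evaluated here derives an instantaneous energy cost from GPS speed and acceleration via the "equivalent slope" idea of di Prampero and colleagues. The sketch below uses one commonly cited polynomial formulation (after Osgnach et al.), omitting the terrain constant; treat the exact coefficients as an assumption rather than the model this study implemented:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def metabolic_power(speed: float, accel: float) -> float:
    """Metabolic power (W/kg) from speed (m/s) and acceleration (m/s^2),
    via the equivalent-slope polynomial (Osgnach et al. formulation)."""
    es = accel / G                    # equivalent slope
    em = math.sqrt(es**2 + 1.0)      # equivalent body-mass factor
    # Energy cost of running (J/kg/m) as a polynomial in ES:
    ec = (155.4 * es**5 - 30.4 * es**4 - 43.3 * es**3
          + 46.3 * es**2 + 19.5 * es + 3.6) * em
    return ec * speed

# Constant-speed running (accel = 0) costs ~3.6 J/kg/m:
print(metabolic_power(4.0, 0.0))   # 14.4 W/kg
```

Session EE is obtained by summing such instantaneous power over the time samples, which is where the underestimation during intermittent field-sport movements accumulates.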

  9. Intra-arch dimensional measurement validity of laser-scanned digital dental models compared with the original plaster models: a systematic review.

    Science.gov (United States)

    De Luca Canto, G; Pachêco-Pereira, C; Lagravere, M O; Flores-Mir, C; Major, P W

    2015-05-01

    A systematic review was undertaken to evaluate the validity of intra-arch dimensional measurements made from laser-scanned digital dental models in comparison with measurements directly obtained from the original plaster casts (gold standard). Articles were included only if they reported studies that compared measurements from digital models produced by laser scanning against their plaster models. Measurements from the original plaster models had to have been made using a manual or digital caliper (gold standard). Articles that used scans from impressions or digital photographs were discarded. Detailed individual search strategies for Cochrane, EMBASE, MEDLINE, PubMed, and LILACS were developed. The references cited in the selected articles were also checked for any references that could have been missed in the electronic database searches. A partial gray literature search was undertaken using Google Scholar. The methodology of selected studies was evaluated using the 14-item quality assessment tool for diagnostic accuracy studies (QUADAS). Only 16 studies were finally included for the qualitative/quantitative synthesis. The selected studies consistently agree that the validity of measurements obtained from laser-scanned plaster models is similar to that of direct measurements. Any stated differences would be unlikely to be clinically relevant. There is consistent scientific evidence to support the validity of measurements from digital dental models in comparison with intra-arch dimensional measurements directly obtained from them. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  10. [Nursing care systematization in rehabilitation unit, in accordance to Horta's conceptual model].

    Science.gov (United States)

    Neves, Rinaldo de Souza

    2006-01-01

    The use of a conceptual model in the systematization of nursing care allows activities to be grounded in theoretical references that can guide the implantation and implementation of nursing procedures in hospitals. In this article we examine the choice to implement Horta's conceptual model in the construction of a nursing care system in the Rehabilitation Unit of a public hospital located in the Federal District of Brazil. Through these theoretical references it was possible to devise a data collection tool based on basic human needs. The identification of these needs made possible the construction of a hierarchically arranged pyramid of the modified basic needs of neurological patients. With this framework we intend to elaborate the prescription and nursing evolution based on the concepts and standards of Horta's nursing process, enabling the inter-relationship of all phases of this care methodology.

  11. A fast and systematic procedure to develop dynamic models of bioprocesses: application to microalgae cultures

    Directory of Open Access Journals (Sweden)

    J. Mailier

    2010-09-01

    The purpose of this paper is to report on the development of a procedure for inferring black-box, yet biologically interpretable, dynamic models of bioprocesses based on sets of measurements of a few external components (biomass, substrates, and products of interest). The procedure has three main steps: (a) the determination of the number of macroscopic biological reactions linking the measured components; (b) the estimation of a first reaction scheme, which has interesting mathematical properties but might lack a biological interpretation; and (c) the "projection" (or transformation) of this reaction scheme onto a biologically consistent scheme. The advantage of the method is that it allows the fast prototyping of models for the culture of microorganisms that are not well documented. The good performance of the third step of the method is demonstrated by application to an example of microalgal culture.
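
Step (a), determining the number of macroscopic reactions, can be approached as a rank-estimation problem in the style of principal component analysis: stack the measured concentrations into a matrix and count the significant singular values of its variation around the initial state. A synthetic sketch with a hypothetical two-reaction, four-component system (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical yield matrix K (4 components x 2 reactions) and reaction
# extents xi(t) over 50 sampling times.
K = np.array([[-1.0,  0.0],
              [ 2.0, -1.0],
              [ 0.0,  1.5],
              [ 0.5,  0.5]])
xi = np.abs(rng.normal(size=(50, 2)).cumsum(axis=0))

C0 = np.array([10.0, 0.0, 0.0, 1.0])   # initial concentrations
C = C0 + xi @ K.T                      # noise-free "measurements"

# Singular values of the centered data reveal the number of reactions:
s = np.linalg.svd(C - C0, compute_uv=False)
n_reactions = int(np.sum(s > 1e-8 * s[0]))
print(s.round(3), n_reactions)  # only 2 significant singular values
```

With noisy real data the cutoff between "significant" and "noise" singular values becomes a judgment call, which is part of what a systematic procedure like the one above has to formalize.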

  12. ScaleNet: a literature-based model of scale insect biology and systematics

    OpenAIRE

    García Morales, Mayrolin; Denno, Barbara D.; Miller, Douglass R.; Miller, Gary L.; Ben-Dov, Yair; Hardy, Nate B.

    2016-01-01

    Scale insects (Hemiptera: Coccoidea) are small herbivorous insects found on all continents except Antarctica. They are extremely invasive, and many species are serious agricultural pests. They are also emerging models for studies of the evolution of genetic systems, endosymbiosis and plant-insect interactions. ScaleNet was launched in 1995 to provide insect identifiers, pest managers, insect systematists, evolutionary biologists and ecologists efficient access to information about scale insec...

  13. The value of using feasibility models in systematic conservation planning to predict landholder management uptake.

    Science.gov (United States)

    Tulloch, Ayesha I T; Tulloch, Vivitskaia J D; Evans, Megan C; Mills, Morena

    2014-12-01

    Understanding the social dimensions of conservation opportunity is crucial for conservation planning in multiple-use landscapes. However, factors that influence the feasibility of implementing conservation actions, such as the history of landscape management, and landholders' willingness to engage are often difficult or time consuming to quantify and rarely incorporated into planning. We examined how conservation agencies could reduce costs of acquiring such data by developing predictive models of management feasibility parameterized with social and biophysical factors likely to influence landholders' decisions to engage in management. To test the utility of our best-supported model, we developed 4 alternative investment scenarios based on different input data for conservation planning: social data only; biological data only; potential conservation opportunity derived from modeled feasibility that incurs no social data collection costs; and existing conservation opportunity derived from feasibility data that incurred collection costs. Using spatially explicit information on biodiversity values, feasibility, and management costs, we prioritized locations in southwest Australia to control an invasive predator that is detrimental to both agriculture and natural ecosystems: the red fox (Vulpes vulpes). When social data collection costs were moderate to high, the most cost-effective investment scenario resulted from a predictive model of feasibility. Combining empirical feasibility data with biological data was more cost-effective for prioritizing management when social data collection costs were low (<4% of the total budget). Calls for more data to inform conservation planning should take into account the costs and benefits of collecting and using social data to ensure that limited funding for conservation is spent in the most cost-efficient and effective manner. © 2014 Society for Conservation Biology.

  14. Using the Soil and Water Assessment Tool (SWAT) to model ecosystem services: A systematic review

    Science.gov (United States)

    Francesconi, Wendy; Srinivasan, Raghavan; Pérez-Miñana, Elena; Willcock, Simon P.; Quintero, Marcela

    2016-04-01

    SWAT, a watershed modeling tool, has been proposed to help quantify ecosystem services. The concept of ecosystem services incorporates the collective benefits natural systems provide primarily to human beings. It is becoming increasingly important to track the impact that human activities have on the environment in order to determine its resilience and sustainability. The objectives of this paper are to provide an overview of efforts using SWAT to quantify ecosystem services, to determine the model's capability examining various types of services, and to describe the approach used by various researchers. A literature review was conducted to identify studies in which SWAT was explicitly used for quantifying ecosystem services in terms of provisioning, regulating, supporting, and cultural aspects. A total of 44 peer reviewed publications were identified. Most of these used SWAT to quantify provisioning services (34%), regulating services (27%), or a combination of both (25%). While studies using SWAT for evaluating ecosystem services are limited (approximately 1% of SWAT's peer-reviewed publications), and usage (vs. potential) of services by beneficiaries is a current model limitation, the available literature sets the stage for the continuous development and potential of SWAT as a methodological framework for quantifying ecosystem services to assist in decision-making.

  15. An evaluation of a model for the systematic documentation of hospital based health promotion activities: results from a multicentre study

    DEFF Research Database (Denmark)

    Tønnesen, Hanne; Christensen, Mette E; Groene, Oliver;

    2007-01-01

    The first step of handling health promotion (HP) in Diagnosis Related Groups (DRGs) is a systematic documentation and registration of the activities in the medical records. So far the possibility and tradition for systematic registration of clinical HP activities in the medical records and in pat...

  16. Anti-tumor effects of metformin in animal models of hepatocellular carcinoma: a systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Juan Li

    Full Text Available Several studies have reported that metformin can reduce the risk of hepatocellular carcinoma (HCC) in diabetes patients. However, the direct anti-HCC effects of metformin have hardly been studied in patients, but have been extensively investigated in animal models of HCC. We therefore performed a systematic review and meta-analysis of animal studies evaluating the effects of metformin on HCC. We collected the relevant studies by searching EMBASE, Medline (OvidSP), Web of Science, Scopus, PubMed Publisher, and Google Scholar. Studies were included according to the following inclusion criteria: HCC, animal study, and metformin intervention. Study quality was assessed using SYRCLE's risk of bias tool. A meta-analysis was performed for the outcome measures: tumor growth (tumor volume, weight, and size), tumor number, and incidence. The search resulted in 573 references, of which 13 could be included in the review and 12 in the meta-analysis. The study characteristics of the included studies varied considerably. Two studies used rats, while the others used mice. Only one study used female animals, nine used male animals, and three studies did not report the sex of the animals used in their experiments. The quality of the included studies was low to moderate based on the assessment of their risk of bias. The meta-analysis showed that metformin significantly inhibited the growth of HCC tumors (SMD -2.20 [-2.96, -1.43]; n = 16), but no significant effect on the number of tumors (SMD -1.05 [-2.13, 0.03]; n = 5) or the incidence of HCC was observed (RR 0.62 [0.33, 1.16]; n = 6). To investigate the potential sources of the significant heterogeneity found in the tumor-growth outcome (I2 = 81%), subgroup analyses of the scales of growth measures and of the types of animal models used were performed. Metformin appears to have a direct anti-HCC effect in animal models. Despite the intrinsic limitations of animal studies, this systematic review could provide an important reference for future
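
    Pooled effects like the SMD reported above are built from per-study standardized mean differences. The following is a minimal sketch of how one such effect size (Cohen's d with the Hedges small-sample correction) and its 95% CI are computed from two group summaries; the group means, SDs, and sample sizes below are hypothetical, not taken from any reviewed study.

```python
import math

def smd_hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) with a 95% CI.
    Group 1 = treatment, group 2 = control."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction
    g = j * d                                # Hedges' g
    # Approximate variance of g and a normal-theory 95% CI
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    se = math.sqrt(var_g)
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical tumor-volume summaries (mm^3): metformin vs. control
g, ci = smd_hedges_g(120.0, 40.0, 8, 210.0, 45.0, 8)
```

    A negative g here indicates smaller tumors in the treated group, matching the direction of the pooled SMD in the abstract.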

  17. Efficacy of deferoxamine in animal models of intracerebral hemorrhage: a systematic review and stratified meta-analysis.

    Directory of Open Access Journals (Sweden)

    Han-Jin Cui

    Full Text Available Intracerebral hemorrhage (ICH) is a subtype of stroke associated with high morbidity and mortality rates. No proven treatments are available for this condition. Iron-mediated free radical injury is associated with secondary damage following ICH. Deferoxamine (DFX), a ferric-iron chelator, is a candidate drug for the treatment of ICH. We performed a systematic review of studies involving the administration of DFX following ICH. In total, 20 studies were identified that described the efficacy of DFX in animal models of ICH and assessed changes in the brain water content, neurobehavioral score, or both. DFX reduced the brain water content by 85.7% in animal models of ICH (-0.86, 95% CI: -1.48 to -0.23; P < 0.01; 23 comparisons), and improved the neurobehavioral score by -1.08 (95% CI: -1.23 to -0.92; P < 0.01; 62 comparisons). DFX was most efficacious when administered 2-4 h after ICH at a dose of 10-50 mg/kg, depending on species, and this beneficial effect remained for up to 24 h post-injury. The efficacy was higher with phenobarbital anesthesia, intramuscular injection, and lysed-erythrocyte infusion, and in Fischer 344 rats or aged animals. Overall, although DFX was found to be effective in experimental ICH, additional confirmation is needed before clinical trials are conducted, given possible publication bias, poor study quality, and the limited number of studies.

  18. Preclinical evaluation of cell-based strategies to prevent or treat bronchopulmonary dysplasia in animal models: a systematic review.

    Science.gov (United States)

    Lesage, Flore; Jimenez, Julio; Toelen, Jaan; Deprest, Jan

    2017-03-21

    Bronchopulmonary dysplasia (BPD) remains the most common complication of extreme prematurity, as no effective treatment is available to date. This calls for the exploration of new therapeutic options like cell therapy, which is already effective for various human (lung) disorders. We systematically searched the MEDLINE, Embase, and Web of Science databases from the earliest date until January 2017 and included original studies on the perinatal use of cell-based therapies (i.e. cells and/or cell derivatives) to treat BPD in animal models. Forty publications describing 47 interventions were retrieved. Newborn mice/rats raised in a hyperoxic environment were studied in most interventions. Different cell types - either intact cells or their conditioned medium - were administered, but bone marrow- and umbilical cord blood-derived mesenchymal stem cells were most prevalent. All studies reported positive effects on outcome parameters including alveolar and vascular morphometry, lung function, and inflammation. Cell homing to the lungs was demonstrated in some studies, but the therapeutic effects seemed to be mostly mediated via paracrine modulation of inflammation, fibrosis, and angiogenesis. Multiple rat/mouse studies show promise for cell therapy for BPD. Yet careful study of action mechanisms and side effects in large animal models is imperative before clinical translation can be achieved.

  19. Blunt force impact to the head using a teeball bat: systematic comparison of physical and finite element modeling.

    Science.gov (United States)

    Kettner, Mattias; Ramsthaler, Frank; Potente, Stefan; Bockenheimer, Alexander; Schmidt, Peter H; Schrodt, Michael

    2014-12-01

    Blunt head trauma secondary to violent actions with various weapons is frequently a cause of injury in forensic casework; differing striking tools have varying degrees of injury capacity. The systematic approach used to examine a 19-year-old student who was beaten with a wooden teeball bat will be described. The assailant stopped beating the student when the teeball bat broke into two pieces. The surviving victim sustained bruises and a forehead laceration. The State's Attorney assigned a forensic expert to examine whether the forces exerted on the victim's head (leading to the fracture of the bat) were potentially life threatening (e.g. causing cranial bone fractures). Physical modeling was conducted using a pigskin-covered polyethylene end cap cushioned by cellulose that was connected to a piezoelectric force gauge. Experiments with teeball bats weighing 295-485 g demonstrated that 12-20 kN forces were necessary to cause a comparable bat fracture. In addition to physical testing, a computer-aided simulation was conducted, utilizing a finite-element (FE) method. In the FE approach, after selecting for wood properties, a virtual bat was swung against a hemisphere comprising two layers that represented bone and soft tissue. Employing this model, a 17.6 kN force was calculated, with the highest fracture probability points resembling the fracture patterns of the physically tested bats.

  20. Religion and Spirituality’s Influences on HIV Syndemics Among MSM: A Systematic Review and Conceptual Model

    Science.gov (United States)

    Parsons, Jeffrey T.

    2015-01-01

    This paper presents a systematic review of the quantitative HIV research that assessed the relationships between religion, spirituality, HIV syndemics, and individual HIV syndemics-related health conditions (e.g. depression, substance abuse, HIV risk) among men who have sex with men (MSM) in the United States. No quantitative studies were found that assessed the relationships between HIV syndemics, religion, and spirituality. Nine studies, with 13 statistical analyses, were found that examined the relationships between individual HIV syndemics-related health conditions, religion, and spirituality. Among the 13 analyses, religion and spirituality were found to have mixed relationships with HIV syndemics-related health conditions (6 nonsignificant associations; 5 negative associations; 2 positive associations). Given the overall lack of inclusion of religion and spirituality in HIV syndemics research, a conceptual model that hypothesizes the potential interactions of religion and spirituality with HIV syndemics-related health conditions is presented. The implications of the model for MSM’s health are outlined. PMID:26319130

  1. Flow variability and its physical causes in infusion technology: a systematic review of in vitro measurement and modeling studies.

    Science.gov (United States)

    Snijder, Roland A; Konings, Maurits K; Lucas, Peter; Egberts, Toine C; Timmerman, Annemoon D

    2015-08-01

    Infusion therapy is medically and technically challenging and frequently associated with medical errors. When administering pharmaceuticals by means of infusion, dosing errors can occur due to flow rate variability. These dosing errors may lead to adverse effects. We aimed to systematically review the available biomedical literature for in vitro measurement and modeling studies that investigated the physical causes of flow rate variability. Special focus was given to syringe pump setups, which are typically used if very accurate drug delivery is required. We aimed to extract from literature the component with the highest mechanical compliance in syringe pump setups. We included 53 studies, six of which were theoretical models, two articles were earlier reviews of infusion literature, and 45 were in vitro measurement studies. Mechanical compliance, flow resistance, and dead volume of infusion systems were stated as the most important and frequently identified physical causes of flow rate variability. The syringe was indicated as the most important source of mechanical compliance in syringe pump setups (9.0×10⁻⁹ to 2.1×10⁻⁸ L/Pa). Mechanical compliance caused longer flow rate start-up times (from several minutes up to approximately 70 min) and delayed occlusion alarm times (up to 117 min).
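
    In a lumped-parameter view, a compliant syringe feeding a resistive line behaves like an RC circuit: after a set-point change, flow settles with time constant τ = R·C, which is one way the minutes-long start-up times above can arise. The sketch below uses the upper compliance value from the review; the flow-resistance value is an assumption chosen for illustration only.

```python
def startup_time_constant(compliance_l_per_pa, resistance_pa_s_per_l):
    """Lumped-element time constant tau = R * C for a syringe pump line.
    Flow reaches roughly 95% of its set point after about 3 * tau."""
    return resistance_pa_s_per_l * compliance_l_per_pa

C = 2.1e-8      # L/Pa, upper end of the syringe compliance range cited above
R = 1.0e10      # Pa*s/L, assumed resistance of a narrow-bore line (hypothetical)
tau = startup_time_constant(C, R)        # seconds
minutes_to_95pct = 3 * tau / 60          # ~10 min for these assumed values
```

    With these assumed numbers the settling time lands in the "several minutes" range the review reports; a stiffer (lower-compliance) syringe or a lower-resistance line shortens it proportionally.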

  2. Systematic analysis of neutron yields from thick targets bombarded by heavy ions and protons with moving source model

    CERN Document Server

    Kato, T; Nakamura, T

    2002-01-01

    A simple phenomenological analysis using the moving source model has been performed on the neutron energy spectra produced by bombarding thick targets with high-energy heavy ions, which have been systematically measured at the Heavy-Ion Medical Accelerator (HIMAC) facility (located in Chiba, Japan) of the National Institute of Radiological Sciences (NIRS). For the bombardment of both heavy ions and protons in the energy region of 100-500 MeV per nucleon, the moving source model incorporating the knock-on process was generally successful in reproducing the measured neutron spectra to within a factor of two. This phenomenological analytical equation has several parameters expressed as functions of the atomic number Z_p, mass number A_p, and energy per nucleon E_p of the projectile, and the atomic number Z_T and mass number A_T of the target. By inputting these basic data for projectile and target into this equation we can easily estimate the secondary neutron energy spectra at an emi...
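
    The abstract does not give the model's functional form, but a common nonrelativistic moving-source parameterization sums Maxwellian sources that are isotropic in a frame moving along the beam axis; in the lab frame a neutron of energy E at angle θ sees a Galilean-shifted energy E' = E − 2√(E·Es)·cosθ + Es. The sketch below is a generic illustration of that form; the amplitudes, temperatures, and source energies are hypothetical and the detailed HIMAC parameterization may differ.

```python
import math

def moving_source_spectrum(E, theta_deg, sources):
    """Lab-frame yield (arbitrary units) for a sum of Maxwellian sources
    moving along the beam axis. Each source is (A, T, Es): amplitude,
    temperature (MeV), and source kinetic energy per nucleon (MeV)."""
    cos_t = math.cos(math.radians(theta_deg))
    total = 0.0
    for A, T, Es in sources:
        # Neutron energy in the moving-source frame (Galilean shift)
        E_src = E - 2.0 * math.sqrt(E * Es) * cos_t + Es
        total += A * math.exp(-E_src / T)
    return total

# Two hypothetical sources: a slow target-like one and a faster one
sources = [(1.0, 4.0, 0.5), (0.3, 12.0, 20.0)]
y_fwd = moving_source_spectrum(50.0, 0.0, sources)    # forward emission
y_back = moving_source_spectrum(50.0, 90.0, sources)  # sideways emission
```

    Because the faster source boosts neutrons forward, the yield at 0° exceeds that at 90° for the same lab energy, reproducing the forward-peaked spectra typical of thick-target heavy-ion data.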

  3. How to assess the external validity and model validity of therapeutic trials: a conceptual approach to systematic review methodology.

    Science.gov (United States)

    Khorsan, Raheleh; Crawford, Cindy

    2014-01-01

    Background. Evidence rankings do not equally consider internal validity (IV), external validity (EV), and model validity (MV) for clinical studies, including complementary and alternative medicine/integrative medicine (CAM/IM) research. This paper describes such a model and offers an EV assessment tool (EVAT©) for weighing studies according to EV and MV in addition to IV. Methods. An abbreviated systematic review methodology was employed to search, assemble, and evaluate the literature that has been published on EV/MV criteria. Standard databases were searched for keywords relating to EV, MV, and bias-scoring from inception to Jan 2013. Tools identified and concepts described were pooled to assemble a robust tool for evaluating these quality criteria. Results. This study assembled a streamlined, objective tool for evaluating the quality of EV/MV in research that is more sensitive to CAM/IM research. Conclusion. Improved reporting on EV can help produce and provide information that will guide policy makers, public health researchers, and other scientists in the selection, development, and improvement of research-tested interventions. Overall, clinical studies with high EV have the potential to provide the most useful information about "real-world" consequences of health interventions. It is hoped that this novel tool, which considers IV, EV, and MV on equal footing, will better guide clinical decision making.

  4. A graphical model approach to systematically missing data in meta-analysis of observational studies.

    Science.gov (United States)

    Kovačić, Jelena; Varnai, Veda Marija

    2016-10-30

    When studies in meta-analysis include different sets of confounders, simple analyses can cause a bias (omitting confounders that are missing in certain studies) or precision loss (omitting studies with incomplete confounders, i.e. a complete-case meta-analysis). To overcome these types of issues, a previous study proposed modelling the high correlation between partially and fully adjusted regression coefficient estimates in a bivariate meta-analysis. When multiple differently adjusted regression coefficient estimates are available, we propose exploiting such correlations in a graphical model. Compared with a previously suggested bivariate meta-analysis method, such a graphical model approach is likely to reduce the number of parameters in complex missing data settings by omitting the direct relationships between some of the estimates. We propose a structure-learning rule whose justification relies on the missingness pattern being monotone. This rule was tested using epidemiological data from a multi-centre survey. In the analysis of risk factors for early retirement, the method showed a smaller difference from a complete data odds ratio and greater precision than a commonly used complete-case meta-analysis. Three real-world applications with monotone missing patterns are provided, namely, the association between (1) the fibrinogen level and coronary heart disease, (2) the intima media thickness and vascular risk and (3) allergic asthma and depressive episodes. The proposed method allows for the inclusion of published summary data, which makes it particularly suitable for applications involving both microdata and summary data. Copyright © 2016 John Wiley & Sons, Ltd.

  5. [Systematization and hygienic standardization of environmental factors on the basis of common graphic models].

    Science.gov (United States)

    Galkin, A A

    2012-01-01

    On the basis of graphic models of the human response to environmental factors, two main types of complex quantitative influence were revealed, as well as the interrelation between deterministic effects at the level of the individual and stochastic effects at the level of the population. Two main kinds of factors are suggested to be distinguished: essential factors, which are intrinsic to the environment, and accidental factors, which are foreign to it. The two kinds call for different approaches to hygienic standardization: accidental factors need a point-value approach, whereas a two-level range approach is suitable for the essential factors.

  6. Regularities in hadron systematics, Regge trajectories and a string quark model

    Energy Technology Data Exchange (ETDEWEB)

    Chekanov, S.V. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Levchenko, B.B. [Moscow State Univ. (Russian Federation). Skobeltsyn Inst. of Nuclear Physics

    2006-08-15

    An empirical principle for the construction of a linear relationship between the total angular momentum and squared-mass of baryons is proposed. In order to examine linearity of the trajectories, a rigorous least-squares regression analysis was performed. Unlike the standard Regge-Chew-Frautschi approach, the constructed trajectories do not have non-linear behaviour. A similar regularity may exist for lowest-mass mesons. The linear baryonic trajectories are well described by a semi-classical picture based on a spinning relativistic string with tension. The obtained numerical solution of this model was used to extract the (di)quark masses. (orig.)

  7. Forecasting Optimal Solar Energy Supply in Jiangsu Province (China): A Systematic Approach Using Hybrid of Weather and Energy Forecast Models

    Directory of Open Access Journals (Sweden)

    Xiuli Zhao

    2014-01-01

    Full Text Available The idea of aggregating information is clearly recognizable in the daily lives of all entities whether as individuals or as a group, since time immemorial corporate organizations, governments, and individuals as economic agents aggregate information to formulate decisions. Energy planning represents an investment-decision problem where information needs to be aggregated from credible sources to predict both demand and supply of energy. To do this there are varying methods ranging from the use of portfolio theory to managing risk and maximizing portfolio performance under a variety of unpredictable economic outcomes. The future demand for energy and need to use solar energy in order to avoid future energy crisis in Jiangsu province in China require energy planners in the province to abandon their reliance on traditional, “least-cost,” and stand-alone technology cost estimates and instead evaluate conventional and renewable energy supply on the basis of a hybrid of optimization models in order to ensure effective and reliable supply. Our task in this research is to propose measures towards addressing optimal solar energy forecasting by employing a systematic optimization approach based on a hybrid of weather and energy forecast models. After giving an overview of the sustainable energy issues in China, we have reviewed and classified the various models that existing studies have used to predict the influence of weather on the output of solar energy production units. Further, we evaluate the performance of an exemplary ensemble model which combines the forecast output of two popular statistical prediction methods using a dynamic weighting factor.
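
    The abstract's ensemble combines two forecast streams with a dynamic weighting factor but does not specify the update rule. A minimal sketch of one common choice, inverse-error weighting, where each model's weight at step t is proportional to the inverse of its absolute error at step t−1, is shown below; the irradiance series and model outputs are hypothetical.

```python
def dynamic_ensemble(forecasts_a, forecasts_b, actuals, eps=1e-9):
    """Blend two forecast series with a weight on model A that is
    updated each step from the models' previous absolute errors.
    Starts from equal weights."""
    w_a = 0.5
    combined = []
    for t, (fa, fb) in enumerate(zip(forecasts_a, forecasts_b)):
        combined.append(w_a * fa + (1 - w_a) * fb)
        # Re-weight using the errors observed at this step
        err_a = abs(fa - actuals[t]) + eps
        err_b = abs(fb - actuals[t]) + eps
        w_a = (1 / err_a) / (1 / err_a + 1 / err_b)
    return combined

# Hypothetical daily solar yields (kWh/m^2): model A tracks the actuals
# better than model B, so the blend should drift toward A over time.
actual  = [5.0, 5.2, 4.8, 5.1]
model_a = [4.9, 5.1, 4.9, 5.0]
model_b = [4.0, 6.0, 4.0, 6.0]
blend = dynamic_ensemble(model_a, model_b, actual)
```

    After one step the weight on model A rises from 0.5 to about 0.91, so subsequent blended values stay close to the more accurate model while never fully discarding the other.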

  8. Forecasting Optimal Solar Energy Supply in Jiangsu Province (China): A Systematic Approach Using Hybrid of Weather and Energy Forecast Models

    Science.gov (United States)

    Zhao, Xiuli; Yiranbon, Ethel

    2014-01-01

    The idea of aggregating information is clearly recognizable in the daily lives of all entities whether as individuals or as a group, since time immemorial corporate organizations, governments, and individuals as economic agents aggregate information to formulate decisions. Energy planning represents an investment-decision problem where information needs to be aggregated from credible sources to predict both demand and supply of energy. To do this there are varying methods ranging from the use of portfolio theory to managing risk and maximizing portfolio performance under a variety of unpredictable economic outcomes. The future demand for energy and need to use solar energy in order to avoid future energy crisis in Jiangsu province in China require energy planners in the province to abandon their reliance on traditional, “least-cost,” and stand-alone technology cost estimates and instead evaluate conventional and renewable energy supply on the basis of a hybrid of optimization models in order to ensure effective and reliable supply. Our task in this research is to propose measures towards addressing optimal solar energy forecasting by employing a systematic optimization approach based on a hybrid of weather and energy forecast models. After giving an overview of the sustainable energy issues in China, we have reviewed and classified the various models that existing studies have used to predict the influence of weather on the output of solar energy production units. Further, we evaluate the performance of an exemplary ensemble model which combines the forecast output of two popular statistical prediction methods using a dynamic weighting factor. PMID:24511292

  9. Factors and models associated with the amount of hospital care services as demanded by hospitalized patients: a systematic review.

    Directory of Open Access Journals (Sweden)

    Catharina J van Oostveen

    Full Text Available BACKGROUND: Hospitals are constantly being challenged to provide high-quality care despite ageing populations, diminishing resources, and budgetary restraints. While the costs of care depend on the patients' needs, it is not clear which patient characteristics are associated with the demand for care and the inherent costs. The aim of this study was to ascertain which patient-related characteristics or models can predict the need for medical and nursing care in general hospital settings. METHODS: We systematically searched MEDLINE, Embase, Business Source Premier and CINAHL. Pre-defined eligibility criteria were used to detect studies that explored patient characteristics and health status parameters associated with the use of hospital care services by hospitalized patients. Two reviewers independently assessed study relevance and quality with the STROBE instrument, and performed data analysis. RESULTS: From 2,168 potentially relevant articles, 17 met our eligibility criteria. These showed a large variety of factors associated with the use of hospital care services; models were found in only three studies. Age, gender, medical and nursing diagnoses, severity of illness, patient acuity, comorbidity, and complications were the characteristics found most often. Patient acuity and medical and nursing diagnoses were the most influential characteristics. Models including medical or nursing diagnoses and patient acuity explain at least 56.2% of the variance in the use of hospital care services, and up to 78.7% when organizational factors were added. CONCLUSIONS: A large variety of factors were found to be associated with the use of hospital care services. Models that explain the extent to which hospital care services are used should contain patient characteristics, including patient acuity, medical or nursing diagnoses, and organizational and staffing characteristics, e.g., hospital size, organization of care, and the size and skill mix of staff. This would

  10. Forecasting optimal solar energy supply in Jiangsu Province (China): a systematic approach using hybrid of weather and energy forecast models.

    Science.gov (United States)

    Zhao, Xiuli; Asante Antwi, Henry; Yiranbon, Ethel

    2014-01-01

    The idea of aggregating information is clearly recognizable in the daily lives of all entities whether as individuals or as a group, since time immemorial corporate organizations, governments, and individuals as economic agents aggregate information to formulate decisions. Energy planning represents an investment-decision problem where information needs to be aggregated from credible sources to predict both demand and supply of energy. To do this there are varying methods ranging from the use of portfolio theory to managing risk and maximizing portfolio performance under a variety of unpredictable economic outcomes. The future demand for energy and need to use solar energy in order to avoid future energy crisis in Jiangsu province in China require energy planners in the province to abandon their reliance on traditional, "least-cost," and stand-alone technology cost estimates and instead evaluate conventional and renewable energy supply on the basis of a hybrid of optimization models in order to ensure effective and reliable supply. Our task in this research is to propose measures towards addressing optimal solar energy forecasting by employing a systematic optimization approach based on a hybrid of weather and energy forecast models. After giving an overview of the sustainable energy issues in China, we have reviewed and classified the various models that existing studies have used to predict the influence of weather on the output of solar energy production units. Further, we evaluate the performance of an exemplary ensemble model which combines the forecast output of two popular statistical prediction methods using a dynamic weighting factor.

  11. Telehealth in Schools Using a Systematic Educational Model Based on Fiction Screenplays, Interactive Documentaries, and Three-Dimensional Computer Graphics.

    Science.gov (United States)

    Miranda, Diogo Julien; Wen, Chao Lung

    2017-07-18

    Preliminary studies suggest the need for a global vision in academic reform, leading to the re-invention of education. This would include problem-based education using transversal topics and the development of thinking skills, social interaction, and information-processing skills. We aimed to develop a new educational model in health with modular components to be broadcast and applied as a tele-education course. We developed a systematic model based on a "Skills and Goals Matrix" to adapt scientific contents to fictional screenplays, three-dimensional (3D) computer graphics of the human body, and interactive documentaries. We selected 13 topics based on youth vulnerabilities in Brazil to be disseminated through a television show with 15 episodes. We developed scientific content for each theme, inserting it naturally into screenplays, together with 3D sequences and interactive documentaries. The modular structure was then adapted to a distance-learning course. The television show was broadcast on national television for two consecutive years to an estimated audience of 30 million homes, and ever since on an Internet Protocol Television (IPTV) channel. It was also reorganized as a tele-education course for 2 years, reaching 1,180 subscriptions from all 27 Brazilian states and resulting in 240 graduates. Positive results indicate the feasibility, acceptability, and effectiveness of a model of modular entertainment audio-visual productions using integrated health and education concepts. This structure also allowed the model to be interconnected with other sources and applied as a tele-education course, educating, informing, and stimulating behavior change. Future work should reinforce this joint structure of telehealth, communication, and education.

  12. Efficacy of Mesenchymal Stromal Cell Therapy for Acute Lung Injury in Preclinical Animal Models: A Systematic Review.

    Directory of Open Access Journals (Sweden)

    Lauralyn A McIntyre

    Full Text Available The Acute Respiratory Distress Syndrome (ARDS) is a devastating clinical condition that is associated with a 30-40% risk of death and significant long-term morbidity for those who survive. Mesenchymal stromal cells (MSCs) have emerged as a potential novel treatment, as in pre-clinical models they have been shown to modulate inflammation (a major pathophysiological hallmark of ARDS) while enhancing bacterial clearance and reducing organ injury and death. A systematic search of MEDLINE, EMBASE, BIOSIS and Web of Science was performed to identify pre-clinical studies that examined the efficacy of MSCs, as compared to diseased controls, for the treatment of Acute Lung Injury (ALI), the pre-clinical correlate of human ARDS, on mortality, a clinically relevant outcome. We assessed study quality and pooled results using random-effects meta-analysis. A total of 54 publications met our inclusion criteria, of which 17 (21 experiments) reported mortality and were included in the meta-analysis. Treatment with MSCs, as compared to controls, significantly decreased the overall odds of death in animals with ALI (Odds Ratio 0.24, 95% Confidence Interval 0.18-0.34, I² = 8%). Efficacy was maintained across different types of animal models and means of ALI induction; MSC origin, source, route of administration and preparation; and the clinical relevance of the model (timing of MSC administration, administration of fluids and/or antibiotics). Reporting of standard MSC characterization for experiments that used human MSCs, and of risks of bias, was generally poor, and although not statistically significant, a funnel plot analysis for overall mortality suggested the presence of publication bias. The results from our meta-analysis support that MSCs substantially reduce the odds of death in animal models of ALI, but important reporting elements were suboptimal and limit the strength of our conclusions.
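
    A pooled odds ratio like the 0.24 above typically comes from random-effects pooling of per-study log odds ratios. The sketch below implements the standard DerSimonian-Laird estimator; the five study estimates and variances are hypothetical, chosen only to land in the same neighborhood as the reported result.

```python
import math

def pool_random_effects(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of log odds ratios.
    Returns the pooled OR and a 95% CI on the OR scale."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, log_ors)) / sum(w)
    # Cochran's Q and the between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_ors)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# Hypothetical log-OR estimates and variances from five experiments
or_hat, ci = pool_random_effects(
    [-1.6, -1.2, -1.9, -1.4, -1.1],
    [0.25, 0.30, 0.40, 0.20, 0.35])
```

    When the studies are homogeneous (Q below its degrees of freedom, as with these inputs), tau² is truncated to zero and the result coincides with the fixed-effect estimate.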

  13. Model-based economic evaluations in smoking cessation and their transferability to new contexts: a systematic review.

    Science.gov (United States)

    Berg, Marrit L; Cheung, Kei Long; Hiligsmann, Mickaël; Evers, Silvia; de Kinderen, Reina J A; Kulchaitanaroaj, Puttarin; Pokhrel, Subhash

    2017-06-01

    To identify different types of models used in economic evaluations of smoking cessation, analyse the quality of the included models by examining their attributes, and ascertain their transferability to a new context. A systematic review of the literature on the economic evaluation of smoking cessation interventions published between 1996 and April 2015, identified via Medline, EMBASE, National Health Service (NHS) Economic Evaluation Database (NHS EED), and Health Technology Assessment (HTA). The checklist-based quality and transferability scores of the included studies were based on the European Network of Health Economic Evaluation Databases (EURONHEED) criteria. Studies were excluded if they were not in smoking cessation, not original research, not a model-based economic evaluation, did not consider an adult population, or were not from a high-income country. Among the 64 economic evaluations included in the review, the state-transition Markov model was the most frequently used method (n = 30/64), with quality-adjusted life years (QALYs) being the most frequently used outcome measure over a lifetime horizon. A small number of the included studies (13 of 64) were eligible for the EURONHEED transferability checklist. The overall transferability scores ranged from 0.50 to 0.97, with an average score of 0.75. The average score per section was 0.69 (range = 0.35-0.92). The relative transferability of the studies could not be established due to a limitation present in the EURONHEED method. All existing economic evaluations in smoking cessation lack one or more key study attributes necessary to be fully transferable to a new context. © 2017 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
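
    To make the dominant method concrete, here is a minimal hedged sketch of a state-transition Markov cohort model with three states (current smoker, former smoker, dead) accumulating discounted QALYs over a lifetime horizon. All transition probabilities, utility values, and the discount rate are hypothetical illustrations, not figures from any reviewed evaluation.

```python
def markov_qalys(p, utilities, cycles=50, discount=0.035):
    """Run a cohort through a 3-state Markov model (smoker, former
    smoker, dead) and return discounted QALYs per person.
    'p' is a row-stochastic transition matrix; one cycle = one year."""
    state = [1.0, 0.0, 0.0]          # whole cohort starts as smokers
    total = 0.0
    for t in range(cycles):
        # Expected utility this cycle, discounted back to t = 0
        total += sum(s * u for s, u in zip(state, utilities)) / (1 + discount) ** t
        # Advance the cohort one cycle: state_j = sum_i state_i * p[i][j]
        state = [sum(state[i] * p[i][j] for i in range(3)) for j in range(3)]
    return total

# Hypothetical annual transitions: 5%/yr quit, 2%/yr relapse,
# higher mortality while smoking; death is absorbing.
p = [[0.93, 0.05, 0.02],   # smoker  -> smoker / former / dead
     [0.02, 0.97, 0.01],   # former smoker
     [0.00, 0.00, 1.00]]   # dead
utilities = [0.85, 0.95, 0.0]
qalys = markov_qalys(p, utilities)
```

    A cessation intervention would be modeled by raising the quit probability and comparing the discounted QALYs (and costs) of the two arms, which is the incremental comparison such evaluations report.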

  14. Methodological quality and reporting of generalized linear mixed models in clinical medicine (2000-2012): a systematic review.

    Directory of Open Access Journals (Sweden)

    Martí Casals

    BACKGROUND: Modeling of count and binary data collected in hierarchical designs has increased the use of Generalized Linear Mixed Models (GLMMs) in medicine. This article presents a systematic review of the application and of the quality of results and information reported from GLMMs in the field of clinical medicine. METHODS: A search using the Web of Science database was performed for original articles published in medical journals from 2000 to 2012. The search strategy included the topics "generalized linear mixed models", "hierarchical generalized linear models" and "multilevel generalized linear model", refined to the science and technology research domain. Papers reporting methodological considerations without application, and those that were not in clinical medicine or not written in English, were excluded. RESULTS: A total of 443 articles were detected, with an increase over time in the number of articles. In total, 108 articles fit the inclusion criteria. Of these, 54.6% were declared to be longitudinal studies, whereas 58.3% and 26.9% were defined as repeated measurements and multilevel designs, respectively. Twenty-two articles belonged to environmental and occupational public health, 10 articles to clinical neurology, 8 to oncology, and 7 to infectious diseases and pediatrics. The distribution of the response variable was reported in 88% of the articles, predominantly Binomial (n = 64) or Poisson (n = 22). Much of the information needed to evaluate the GLMMs was not reported in most cases. Variance estimates of random effects were described in only 8 articles (9.2%). The model validation, the method of covariate selection and the method of goodness of fit were reported in only 8.0%, 36.8% and 14.9% of the articles, respectively. CONCLUSIONS: During recent years, the use of GLMMs in the medical literature has increased to take into account the correlation of data when modeling qualitative data or counts. According to the current recommendations, the
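The GLMM structure the review above discusses can be made concrete with a small simulation. The sketch below generates count data from a hypothetical Poisson GLMM with random cluster intercepts; all design sizes and coefficient values are illustrative assumptions, not taken from the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hierarchical design: 20 clusters, 30 observations each.
n_clusters, n_per = 20, 30
cluster = np.repeat(np.arange(n_clusters), n_per)

# Assumed fixed effects and random-intercept standard deviation.
beta0, beta1 = 0.5, 0.3
sigma_u = 0.4
u = rng.normal(0.0, sigma_u, n_clusters)   # random cluster intercepts
x = rng.normal(size=n_clusters * n_per)    # one covariate

# Poisson GLMM with log link: log(mu) = beta0 + beta1*x + u[cluster]
log_mu = beta0 + beta1 * x + u[cluster]
y = rng.poisson(np.exp(log_mu))

print(y.shape, y.mean())
```

Fitting such a model (and reporting the variance estimate `sigma_u`, model validation, and covariate selection) is exactly the reporting the review found lacking.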

  15. The selection, optimization, and compensation model in the work context : A systematic review and meta-analysis of two decades of research

    NARCIS (Netherlands)

    Moghimi, Darya; Zacher, Hannes; Scheibe, Susanne; Van Yperen, Nico W.

    2016-01-01

    Over the past two decades, the selection, optimization, and compensation (SOC) model has been applied in the work context to investigate antecedents and outcomes of employees’ use of action regulation strategies. We systematically review, meta-analyze, and critically discuss the literature on SOC st

  16. Series: Clinical Epidemiology in South Africa. Paper 3: Logic models help make sense of complexity in systematic reviews and health technology assessments.

    Science.gov (United States)

    Rohwer, Anke; Pfadenhauer, Lisa; Burns, Jacob; Brereton, Louise; Gerhardus, Ansgar; Booth, Andrew; Oortwijn, Wija; Rehfuess, Eva

    2017-03-01

    To describe the development and application of logic model templates for systematic reviews and health technology assessments (HTAs) of complex interventions. This study demonstrates the development of a method to conceptualize complexity and make underlying assumptions transparent. Examples from systematic reviews with specific relevance to Sub-Saharan Africa (SSA) and other low- and middle-income countries (LMICs) illustrate its usefulness. Two distinct templates are presented: the system-based logic model, describing the system in which the interaction between participants, intervention, and context takes place; and the process-orientated logic model, which displays the processes and causal pathways that lead from the intervention to multiple outcomes. Logic models can help authors of systematic reviews and HTAs to explicitly address and make sense of complexity, adding value by achieving a better understanding of the interactions between the intervention, its implementation, and its multiple outcomes among a given population and context. They thus have the potential to help build systematic review capacity in SSA and other LMICs: at an individual level, by equipping authors with a tool that facilitates the review process; and at a system level, by improving communication between producers and potential users of research evidence. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. A systematic experimental investigation of significant parameters affecting model tire hydroplaning

    Science.gov (United States)

    Wray, G. A.; Ehrlich, I. R.

    1973-01-01

    The results of a comprehensive parametric study of model and small pneumatic tires operating on a wet surface are presented. Hydroplaning inception (spin down) and rolling restoration (spin up) are discussed. Conclusions indicate that hydroplaning inception occurs at a speed significantly higher than the rolling restoration speed. Hydroplaning speed increases considerably with tread depth, surface roughness and tire inflation pressure or footprint pressure, and only moderately with increased load. Water film thickness affects spin down speed only slightly. Spin down speed varies inversely as approximately the one-sixth power of film thickness. Empirical equations relating tire inflation pressure, normal load, tire diameter and water film thickness have been generated for various tire tread and surface configurations.
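The one-sixth-power scaling reported above implies that film thickness has only a weak effect on spin-down speed, as a quick calculation shows. The proportionality constant and thickness values below are illustrative assumptions, not the study's empirical coefficients.

```python
# Empirical scaling from the abstract: spin-down speed varies inversely
# as roughly the one-sixth power of water-film thickness, v ~ k * t**(-1/6).
# k and the thickness values are hypothetical placeholders.
k = 100.0                      # assumed proportionality constant
for t in (2.0, 4.0, 8.0):      # film thickness (arbitrary units)
    v = k * t ** (-1.0 / 6.0)
    print(f"t={t:.0f}: v={v:.1f}")
```

Doubling the film thickness reduces the speed by only a factor of 2**(1/6), about 11%, consistent with the "only slightly" characterization in the abstract.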

  18. Systematic Model-in-the-Loop Test of Embedded Control Systems

    Science.gov (United States)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The task of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodological support for formalizing natural-language requirements, no standardized and accepted means exists for formal property definition as a target for verification planning. This article addresses several of these shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.

  19. Results from systematic modeling of neutron damage in inertial fusion energy reactors

    Energy Technology Data Exchange (ETDEWEB)

    Perlado, J.M. E-mail: mperlado@denim.upm.es; Dominguez, E.; Malerba, L.; Marian, J.; Lodi, D.; Salvador, M.; Alonso, E.; Caturla, Ma.J.; Diaz de la Rubia, T

    2002-01-01

    Radiation damage is an important issue for the lifetime of the structural materials in an Inertial Fusion Energy (IFE) reactor. The effect depends strongly on the type of chamber protection adopted in the IFE reactor design. This paper presents results from DENIM, in collaboration with LLNL, on the magnitudes necessary for the final evaluation of neutron damage. The determination of the neutron intensities and energy spectra emerging from the target, the energy spectra of the primary knock-on atoms (PKA) resulting from the neutron interactions, and the microscopic-scale modeling of pulsed irradiation in metals are reported, in addition to reference to the work on the time dependence of neutron flux in an IFE protected chamber. Results are also presented on damage accumulation in SiC, relevant both for magnetic (MFE) and inertial fusion.

  20. A trans-Amazonian screening of mtDNA reveals deep intraspecific divergence in forest birds and suggests a vast underestimation of species diversity.

    Directory of Open Access Journals (Sweden)

    Borja Milá

    The Amazonian avifauna remains severely understudied relative to that of the temperate zone, and its species richness is thought to be underestimated by current taxonomy. Recent molecular systematic studies using mtDNA sequence reveal that traditionally accepted species-level taxa often conceal genetically divergent subspecific lineages found to represent new species upon close taxonomic scrutiny, suggesting that intraspecific mtDNA variation could be useful in species discovery. Surveys of mtDNA variation in Holarctic species have revealed patterns of variation that are largely congruent with species boundaries. However, little information exists on intraspecific divergence in most Amazonian species. Here we screen intraspecific mtDNA genetic variation in 41 Amazonian forest understory species belonging to 36 genera and 17 families in 6 orders, using 758 individual samples from Ecuador and French Guiana. For 13 of these species, we also analyzed trans-Andean populations from the Ecuadorian Chocó. A consistent pattern of deep intraspecific divergence among trans-Amazonian haplogroups was found for 33 of the 41 taxa, and genetic differentiation and genetic diversity among them was highly variable, suggesting a complex range of evolutionary histories. Mean sequence divergence within families was the same as that found in North American birds (13%), yet mean intraspecific divergence in Neotropical species was an order of magnitude larger (2.13% vs. 0.23%), with mean distance between intraspecific lineages reaching 3.56%. We found no clear relationship between genetic distances and differentiation in plumage color. Our results identify numerous genetically and phenotypically divergent lineages which may result in new species-level designations upon closer taxonomic scrutiny and thorough sampling, although lineages in the tropical region could be older than those in the temperate zone without necessarily representing separate species. In

  1. A trans-Amazonian screening of mtDNA reveals deep intraspecific divergence in forest birds and suggests a vast underestimation of species diversity.

    Science.gov (United States)

    Milá, Borja; Tavares, Erika S; Muñoz Saldaña, Alberto; Karubian, Jordan; Smith, Thomas B; Baker, Allan J

    2012-01-01

    The Amazonian avifauna remains severely understudied relative to that of the temperate zone, and its species richness is thought to be underestimated by current taxonomy. Recent molecular systematic studies using mtDNA sequence reveal that traditionally accepted species-level taxa often conceal genetically divergent subspecific lineages found to represent new species upon close taxonomic scrutiny, suggesting that intraspecific mtDNA variation could be useful in species discovery. Surveys of mtDNA variation in Holarctic species have revealed patterns of variation that are largely congruent with species boundaries. However, little information exists on intraspecific divergence in most Amazonian species. Here we screen intraspecific mtDNA genetic variation in 41 Amazonian forest understory species belonging to 36 genera and 17 families in 6 orders, using 758 individual samples from Ecuador and French Guiana. For 13 of these species, we also analyzed trans-Andean populations from the Ecuadorian Chocó. A consistent pattern of deep intraspecific divergence among trans-Amazonian haplogroups was found for 33 of the 41 taxa, and genetic differentiation and genetic diversity among them was highly variable, suggesting a complex range of evolutionary histories. Mean sequence divergence within families was the same as that found in North American birds (13%), yet mean intraspecific divergence in Neotropical species was an order of magnitude larger (2.13% vs. 0.23%), with mean distance between intraspecific lineages reaching 3.56%. We found no clear relationship between genetic distances and differentiation in plumage color. Our results identify numerous genetically and phenotypically divergent lineages which may result in new species-level designations upon closer taxonomic scrutiny and thorough sampling, although lineages in the tropical region could be older than those in the temperate zone without necessarily representing separate species. In-depth phylogeographic surveys

  2. Satellites may underestimate rice residue and associated burning emissions in Vietnam

    Science.gov (United States)

    Lasko, Kristofer; Vadrevu, Krishna P.; Tran, Vinh T.; Ellicott, Evan; Nguyen, Thanh T. N.; Bui, Hung Q.; Justice, Christopher

    2017-08-01

    In this study, we estimate rice residue and associated burning emissions, and compare the results with existing emissions inventories employing a bottom-up approach. We first estimated field-level post-harvest rice residues, including separate fuel-loading factors for rice straw and rice stubble. Results suggested fuel-loading factors of 0.27 kg m-2 (±0.033), 0.61 kg m-2 (±0.076), and 0.88 kg m-2 (±0.083) for rice straw, stubble, and total post-harvest biomass, respectively. Using these factors, we quantified potential emissions from rice residue burning and compared our estimates with other studies. Our results suggest total rice residue burning emissions of 2.24 Gg PM2.5, 36.54 Gg CO and 567.79 Gg CO2 for Hanoi Province, which are significantly higher than earlier studies. We attribute our higher emission estimates to improved fuel-loading factors; moreover, we infer that some earlier studies relying on residue-to-product ratios could be underestimating rice residue emissions by more than a factor of 2.3 for Hanoi, Vietnam. Using the rice planted area data from the Vietnamese government, and combining our fuel-loading factors, we also estimated rice residue PM2.5 emissions for the entirety of Vietnam and compared these estimates with an existing all-sources emissions inventory, and the Global Fire Emissions Database (GFED). Results suggest 75.98 Gg of PM2.5 released from rice residue burning, accounting for 12.8% of total emissions for Vietnam. The GFED database suggests 42.56 Gg PM2.5 from biomass burning, with 5.62 Gg attributed to agricultural waste burning, indicating that satellite-based methods may be significantly underestimating emissions. Our results not only provide improved residue and emission estimates, but also highlight the need for emissions mitigation from rice residue burning.
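A bottom-up burning-emissions estimate of the kind described above is, at its core, a chain of multiplications: area × fuel loading × combustion completeness × emission factor. The sketch below uses the study's rice-straw fuel-loading factor (0.27 kg m-2); the area, combustion fraction, and PM2.5 emission factor are illustrative placeholders, not the paper's values.

```python
# Bottom-up emissions sketch: E = A * FL * CC * EF.
area_m2 = 1.0e8          # hypothetical planted area (10,000 ha)
fuel_load = 0.27         # kg m^-2, rice straw (from the abstract)
combustion = 0.8         # assumed fraction of residue actually burned
ef_pm25 = 8.3e-3         # assumed emission factor, kg PM2.5 per kg burned

emissions_kg = area_m2 * fuel_load * combustion * ef_pm25
print(f"PM2.5: {emissions_kg / 1e6:.3f} Gg")
```

Because emissions scale linearly with the fuel-loading factor, an underestimated factor propagates directly into an underestimated inventory, which is the paper's central point.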

  3. Sustainable business models: systematic approach toward successful ambulatory care pharmacy practice.

    Science.gov (United States)

    Sachdev, Gloria

    2014-08-15

    This article discusses considerations for making ambulatory care pharmacist services at least cost neutral and, ideally, generate a margin that allows for service expansion. The four pillars of business sustainability are leadership, staffing, information technology, and compensation. A key facet of leadership in ambulatory care pharmacy practice is creating and expressing a clear vision for pharmacists' services. Staffing considerations include establishing training needs, maximizing efficiencies, and minimizing costs. Information technology is essential for efficiency in patient care delivery and outcomes assessment. The three domains of compensation are cost savings, pay for performance, and revenue generation. The following eight steps for designing and implementing an ambulatory care pharmacist service are discussed: (1) prepare a needs assessment, (2) analyze existing strengths, weaknesses, opportunities, and threats, (3) analyze service gaps and feasibility, (4) consider financial opportunities, (5) consider stakeholders' interests, (6) develop a business plan, (7) implement the service, and (8) measure outcomes. Potential future changes in national healthcare policy (such as pharmacist provider status and expanded pay for performance) could enhance the opportunities for sustainable ambulatory care pharmacy practice. The key challenges facing ambulatory care pharmacists are developing sustainable business models, determining which services yield a positive return on investment, and demanding payment for value-added services. Copyright © 2014 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  4. Modelling systematics of ground-based transit photometry I. Implications on transit timing variations

    CERN Document Server

    von Essen, C; Mallonn, M; Tingley, B; Marcussen, M

    2016-01-01

    The transit timing variation (TTV) technique has been widely used to detect and characterize multiple planetary systems. Due to the observational biases imposed mainly by photometric conditions and instrumentation, and the high signal-to-noise ratio required to produce primary transit observations, ground-based data acquired with small telescopes limit the technique to the follow-up of hot Jupiters. However, space-based missions such as Kepler and CoRoT have already revealed that hot Jupiters are mainly found in single systems. Thus, it is natural to ask whether the observing time at hand is being used properly in carrying out such follow-ups, or whether the use of medium-to-low-quality transit light curves, combined with current standard techniques of data analysis, could be working against exoplanet searches via TTVs. The purpose of this work is to investigate to what extent ground-based observations treated with current modelling techniques are reliable for detecting and characterizing additional pla...

  5. Climate change and dengue: a critical and systematic review of quantitative modelling approaches.

    Science.gov (United States)

    Naish, Suchithra; Dale, Pat; Mackenzie, John S; McBride, John; Mengersen, Kerrie; Tong, Shilu

    2014-03-26

    Many studies have found associations between climatic conditions and dengue transmission. However, there is a debate about the future impacts of climate change on dengue transmission. This paper reviewed epidemiological evidence on the relationship between climate and dengue with a focus on quantitative methods for assessing the potential impacts of climate change on global dengue transmission. A literature search was conducted in October 2012, using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search focused on peer-reviewed journal articles published in English from January 1991 through October 2012. Sixteen studies met the inclusion criteria and most studies showed that the transmission of dengue is highly sensitive to climatic conditions, especially temperature, rainfall and relative humidity. Studies on the potential impacts of climate change on dengue indicate increased climatic suitability for transmission and an expansion of the geographic regions at risk during this century. A variety of quantitative modelling approaches were used in the studies. Several key methodological issues and current knowledge gaps were identified through this review. It is important to assemble spatio-temporal patterns of dengue transmission compatible with long-term data on climate and other socio-ecological changes and this would advance projections of dengue risks associated with climate change.

  6. Brazilian healthcare model for people with mental disorders: a systematic literature review

    Directory of Open Access Journals (Sweden)

    Marcelo Theophilo Lima

    2013-03-01

    Objective: To evaluate the process of implementing the Brazilian psychiatric reform, especially regarding its impact on families' management of healthcare issues. Methods: Interpretative research performed between August 2011 and January 2012, with symbolic interactionism as the theoretical framework and Grounded Theory as the methodological framework. Initially, 49 articles on the subject were selected using the descriptors mental health, psychiatric reform and psychosocial care in the SciELO, LILACS and university library databases. Of these, 17 articles were excluded for being published prior to 2008 and 18 for having approaches outside the scope of the study. Results: The power relationships in the treatment method were identified as causal conditions of the de-hospitalization process, which occurs in a context of deficiency in the network intended to replace psychiatric hospitals, therefore requiring the participation of patients' families in their reintegration at home and treatment. This strategy to deconstruct the psychiatric hospital-based model results in an excessive burden on the families. Conclusion: If, on the one hand, the shift from hospitalization to in-home care, with embracement of the disease and of patients' suffering within their social relationships, was meant to restore patients' civil and human rights and keep them in society, on the other hand, it creates another series of problems, such as the emotional and logistical burden imposed on patients' families.

  7. A Systematic Review of Exercise Training To Promote Locomotor Recovery in Animal Models of Spinal Cord Injury

    Science.gov (United States)

    Callister, Robert J.; Callister, Robin; Galea, Mary P.

    2012-01-01

    Abstract In the early 1980s, experiments on spinalized cats showed that exercise training on the treadmill could enhance locomotor recovery after spinal cord injury (SCI). In this review, we summarize the evidence for the effectiveness of exercise training aimed at promoting locomotor recovery in animal models of SCI. We performed a systematic search of the literature using Medline, Web of Science, and Embase. Of the 362 studies screened, 41 were included. The adult female rat was the most widely used animal model. The majority of studies (73%) reported that exercise training had a positive effect on some aspect of locomotor recovery. Studies employing a complete SCI were less likely to have positive outcomes. For incomplete SCI models, contusion was the most frequently employed method of lesion induction, and the degree of recovery depended on injury severity. Positive outcomes were associated with training regimens that involved partial weight-bearing activity, commenced within a critical period of 1–2 weeks after SCI, and were maintained for at least 8 weeks. Considerable heterogeneity in training paradigms and in the methods used to assess or quantify recovery was observed. A 13-item checklist was developed and employed to assess the quality of reporting and study design; only 15% of the studies had high methodological quality. We recommend that future studies include control groups, randomize animals to groups, conduct blinded assessments, report the extent of the SCI lesion, and report sample size calculations. A small battery of objective assessment methods, including assessment of over-ground stepping, should also be developed and routinely employed. This would allow future meta-analyses of the effectiveness of exercise interventions on locomotor recovery. PMID:22401139

  8. A systematic review of the impact of including both waist and hip circumference in risk models for cardiovascular diseases, diabetes and mortality.

    Science.gov (United States)

    Cameron, A J; Magliano, D J; Söderberg, S

    2013-01-01

    Both a larger waist and narrow hips are associated with heightened risk of diabetes, cardiovascular diseases and premature mortality. We review the risk of these outcomes for levels of waist and hip circumferences when terms for both anthropometric measures were included in regression models. MEDLINE and EMBASE were searched (last updated July 2012) for studies reporting the association with the outcomes mentioned earlier for both waist and hip circumferences (unadjusted and with both terms included in the model). Ten studies reported the association between hip circumference and death and/or disease outcomes both unadjusted and adjusted for waist circumference. Five studies reported the risk associated with waist circumference both unadjusted and adjusted for hip circumference. With the exception of one study of venous thromboembolism, the full strength of the association between either waist circumference or hip circumference with morbidity and/or mortality was only apparent when terms for both anthropometric measures were included in regression models. Without accounting for the protective effect of hip circumference, the effect of obesity on risk of death and disease may be seriously underestimated. Considered together (but not as a ratio measure), waist and hip circumference may improve risk prediction models for cardiovascular disease and other outcomes.
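The suppression effect described above, where the protective association of hip circumference only emerges once waist circumference is in the model, can be reproduced in a small simulation. Everything below (sample size, correlation structure, coefficients) is an illustrative assumption, not data from the reviewed studies.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20000

# Waist and hip are strongly correlated anthropometric measures.
hip = rng.normal(100, 8, n)
waist = 0.9 * hip + rng.normal(0, 5, n) - 5

# Assumed true model: waist raises risk; hip, given waist, lowers it.
logit = -2.0 + 0.08 * (waist - waist.mean()) - 0.06 * (hip - hip.mean())
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Hip alone: its protective effect is masked by correlation with waist.
m_hip = LogisticRegression().fit(hip.reshape(-1, 1), y)
# Both terms together: the opposing adjusted associations emerge.
m_both = LogisticRegression().fit(np.column_stack([waist, hip]), y)
print("hip alone:", m_hip.coef_[0][0])
print("waist, hip adjusted:", m_both.coef_[0])
```

With these assumed parameters the hip-only coefficient is near zero or positive, while the model with both terms recovers a positive waist and a negative hip coefficient, mirroring the review's conclusion.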

  9. Systematic review

    DEFF Research Database (Denmark)

    Enggaard, Helle

    2016-01-01

    Title: Systematic review: a method to promote nursing students' skills in Evidence Based Practice. Background: The Department of Nursing educates students to practice Evidence Based Practice (EBP), where clinical decisions are based on the best available evidence, patient preferences, clinical experience... with systematic review is used to develop didactic practice and evidence-based teaching in different parts of the education. Findings: The poster will present how teachers' training and experience with systematic review contribute to the nursing education in relation to didactics, research methodology and patient... sources of evidence influence EBP. Furthermore, teachers' skills in systematic review will be used to develop systematic reviews on topics in the education where there are none, in order to promote Evidence Based Teaching.

  10. Information processing of genetically modified food messages under different motives: an adaptation of the multiple-motive heuristic-systematic model.

    Science.gov (United States)

    Kim, Jooyoung; Paek, Hye-Jin

    2009-12-01

    Recent risk management research has noted the importance of understanding how the lay public processes and reacts to risk-related information. Guided by the multiple-motive heuristic-systematic model, this study examines (1) how individuals process messages in the context of genetically modified foods to change their attitudes and (2) how the persuasion process varies across types of motives. In the three treatment conditions of accuracy, defense, and impression motives, the respondents changed their attitudes through either the heuristic or the systematic mode, depending on their motives. The accuracy-motive group appeared to use the systematic processing mode, while the impression-motive group seemed to employ the heuristic processing mode. The empirical findings highlight the importance of incorporating motives to improve our understanding of the process of attitude change in risk management and communication contexts.

  11. Biofeedback as complementary treatment in patients with epilepsy – an underestimated therapeutic option? Review, results, discussion

    Directory of Open Access Journals (Sweden)

    Uhlmann Carmen

    2016-12-01

    Background: Biofeedback methods represent side-effect-free complementary options in the treatment of epilepsy. In this paper we review the current status of these methods in terms of clinical study results and their evaluation in systematic review papers. Possible mechanisms of action of biofeedback methods are discussed.

  12. Systematic trends in total-mass profiles from dynamical models of early-type galaxies

    Science.gov (United States)

    Poci, Adriano; Cappellari, Michele; McDermid, Richard M.

    2017-01-01

    We study trends in the slope of the total mass profiles and dark matter fractions within the central half-light radius of 258 early-type galaxies, using data from the volume-limited ATLAS3D survey. We use three distinct sets of dynamical models, which vary in their assumptions and also allow for spatial variations in the stellar mass-to-light ratio, to test the robustness of our results. We confirm that the slopes of the total mass profiles are approximately isothermal, and investigate how the total-mass slope depends on various galactic properties. The most statistically significant correlations we find are with either the surface density, Σe, or the velocity dispersion, σe. However, there is evidence for a break in the latter relation, with a nearly universal logarithmic slope above log10[σe/(km s⁻¹)] ≈ 2.1 and a steeper trend below this value. For the 142 galaxies above that critical σe value, the total mass-density logarithmic slopes have a mean value ⟨γ′⟩ = −2.193 ± 0.016 (1σ error) with an observed rms scatter of only σγ′ = 0.168 ± 0.015. Considering the observational errors, we estimate an intrinsic scatter of σγ′(intr) ≈ 0.15. These values are broadly consistent with those found by strong-lensing studies at similar radii and agree, within the tight errors, with values recently found at much larger radii via stellar dynamics or HI rotation curves (using significantly smaller samples than this work).
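The intrinsic scatter quoted above follows from the usual quadrature decomposition of the observed scatter, sigma_obs² = sigma_intr² + sigma_err². The sketch below takes the abstract's observed rms (0.168) and intrinsic scatter (~0.15) and derives the implied mean measurement error, purely for illustration.

```python
import math

# Quadrature decomposition of the observed scatter in the slopes:
# sigma_obs^2 = sigma_intr^2 + sigma_err^2
sigma_obs = 0.168    # observed rms scatter (from the abstract)
sigma_intr = 0.15    # estimated intrinsic scatter (from the abstract)
sigma_err = math.sqrt(sigma_obs**2 - sigma_intr**2)
print(f"implied mean measurement error: {sigma_err:.3f}")
```

The implied per-galaxy error (~0.076) is roughly half the observed scatter, showing why the intrinsic scatter is only slightly smaller than the observed rms.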

  13. A Systematic Review of Perennial Staple Crops Literature Using Topic Modeling and Bibliometric Analysis.

    Science.gov (United States)

    Kane, Daniel A; Rogé, Paul; Snapp, Sieglinde S

    2016-01-01

    Research on perennial staple crops has increased in the past ten years due to their potential to improve ecosystem services in agricultural systems. However, multiple past breeding efforts as well as research on traditional ratoon systems mean there is already a broad body of literature on perennial crops. In this review, we compare the development of research on perennial staple crops, including wheat, rice, rye, sorghum, and pigeon pea. We utilized the advanced search capabilities of Web of Science, Scopus, ScienceDirect, and Agricola to gather a library of 914 articles published from 1930 to the present. We analyzed the metadata in the entire library and in collections of literature on each crop to understand trends in research and publishing. In addition, we applied topic modeling to the article abstracts, a type of text analysis that identifies frequently co-occurring terms and latent topics. We found: 1.) Research on perennials is increasing overall, but individual crops have each seen periods of heightened interest and research activity; 2.) Specialist journals play an important role in supporting early research efforts. Research often begins within communities of specialists or breeders for the individual crop before transitioning to a more general scientific audience; 3.) Existing perennial agricultural systems and their domesticated crop material, such as ratoon rice systems, can provide a useful foundation for breeding efforts, accelerating the development of truly perennial crops and farming systems; 4.) Primary research is lacking for crops that are produced on a smaller scale globally, such as pigeon pea and sorghum, and on the ecosystem service benefits of perennial agricultural systems.

  14. A Systematic Review of Perennial Staple Crops Literature Using Topic Modeling and Bibliometric Analysis

    Science.gov (United States)

    2016-01-01

    Research on perennial staple crops has increased in the past ten years due to their potential to improve ecosystem services in agricultural systems. However, multiple past breeding efforts as well as research on traditional ratoon systems mean there is already a broad body of literature on perennial crops. In this review, we compare the development of research on perennial staple crops, including wheat, rice, rye, sorghum, and pigeon pea. We utilized the advanced search capabilities of Web of Science, Scopus, ScienceDirect, and Agricola to gather a library of 914 articles published from 1930 to the present. We analyzed the metadata in the entire library and in collections of literature on each crop to understand trends in research and publishing. In addition, we applied topic modeling to the article abstracts, a type of text analysis that identifies frequently co-occurring terms and latent topics. We found: 1.) Research on perennials is increasing overall, but individual crops have each seen periods of heightened interest and research activity; 2.) Specialist journals play an important role in supporting early research efforts. Research often begins within communities of specialists or breeders for the individual crop before transitioning to a more general scientific audience; 3.) Existing perennial agricultural systems and their domesticated crop material, such as ratoon rice systems, can provide a useful foundation for breeding efforts, accelerating the development of truly perennial crops and farming systems; 4.) Primary research is lacking for crops that are produced on a smaller scale globally, such as pigeon pea and sorghum, and on the ecosystem service benefits of perennial agricultural systems. PMID:27213283
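Topic modeling of abstracts, as used in the review above, can be sketched with latent Dirichlet allocation over a bag-of-words matrix. The toy corpus below stands in for the 914 collected articles; the documents, topic count, and vocabulary are all illustrative assumptions.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-ins for article abstracts; the real input would be the
# 914 abstracts gathered from Web of Science, Scopus, etc.
docs = [
    "perennial wheat breeding yield trial",
    "perennial rye breeding germplasm trial",
    "ratoon rice cropping system yield",
    "ratoon rice sorghum cropping system",
    "pigeon pea intercropping smallholder farm",
    "pigeon pea smallholder agroforestry farm",
]

# Bag-of-words term counts, then LDA to recover latent topics as
# distributions over frequently co-occurring terms.
vec = CountVectorizer()
X = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(X)
print("topic-term matrix:", lda.components_.shape)
```

Each row of `lda.components_` is an (unnormalized) term distribution for one latent topic; inspecting the highest-weight terms per row is how topics such as "ratoon rice systems" would be identified.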

  15. Determination of counterfeit medicines by Raman spectroscopy: Systematic study based on a large set of model tablets.

    Science.gov (United States)

    Neuberger, Sabine; Neusüß, Christian

    2015-08-10

    In the last decade, counterfeit pharmaceutical products have become a widespread public health issue. Raman spectroscopy, which is easy to apply, non-destructive, and information-rich, is particularly suitable as a screening method for the fast characterization of chemicals and pharmaceuticals. Combined with chemometric techniques, it provides a powerful tool for the analysis and identification of counterfeit medicines. Here, for the first time, a systematic study of the benefits and limitations of Raman spectroscopy for the analysis of pharmaceutical samples was performed on a large set of model tablets varying in chemical and physical properties. To discriminate between the different mixtures, dispersive Raman spectroscopy in backscattering mode was combined with principal component analysis. Discrimination between samples with different coatings, varying amounts of active pharmaceutical ingredient, and a diversity of excipients was possible. However, it was not possible to distinguish variations in pressing power, mixing quality, or granulation. As a showcase, the change in the Raman signals of commercial acetylsalicylic acid effervescent tablets under five different storage conditions was monitored. Small chemical changes caused by inappropriate storage were detected at an early stage. These results demonstrate that Raman spectroscopy combined with multivariate data analysis provides a powerful methodology for the fast and easy characterization of genuine and counterfeit medicines.
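
    The spectra-plus-PCA workflow described above can be sketched with synthetic data. The Gaussian "spectra", peak positions, and noise level below are invented stand-ins for baseline-corrected Raman measurements, not the study's data; PCA is computed via SVD on mean-centered intensities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for Raman spectra (rows = tablets, columns =
# wavenumber channels); two "formulations" differ in one peak height.
base = np.exp(-0.5 * ((np.arange(200) - 80) / 5.0) ** 2)
peak = np.exp(-0.5 * ((np.arange(200) - 140) / 5.0) ** 2)
group_a = base + 0.2 * peak + 0.01 * rng.standard_normal((10, 200))
group_b = base + 0.8 * peak + 0.01 * rng.standard_normal((10, 200))
X = np.vstack([group_a, group_b])

# PCA via SVD on mean-centered data; scores on the first principal
# component separate the two formulations.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[0]   # projection onto the first principal component
```

    Because the dominant variance here is the between-group peak difference, the two tablet groups fall into disjoint score ranges on the first component; real tablet spectra need baseline correction and normalization first.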

  16. Evaluation of internally consistent parameters for the triple-layer model by the systematic analysis of oxide surface titration data

    Energy Technology Data Exchange (ETDEWEB)

    Sahai, N.; Sverjensky, D.A. [Johns Hopkins Univ., Baltimore, MD (United States)

    1997-07-01

    Systematic analysis of surface titration data from the literature has been performed for ten oxides (anatase, hematite, goethite, rutile, amorphous silica, quartz, magnetite, δ-MnO₂, corundum, and γ-alumina) in ten electrolytes (LiNO₃, NaNO₃, KNO₃, CsNO₃, LiCl, NaCl, KCl, CsCl, NaI, and NaClO₄) over a wide range of ionic strengths (0.001 M to 2.9 M) to establish adsorption equilibrium constants and capacitances consistent with the triple-layer model of surface complexation. Experimental data for the same mineral in different electrolytes and data for a given mineral/electrolyte system from various investigators have been compared. In this analysis, the surface protonation constants (K_s,1 and K_s,2) were calculated by combining predicted values of ΔpK (log K_s,2 − log K_s,1) with experimental points of zero charge; site densities were obtained from tritium-exchange experiments reported in the literature, and the outer-layer capacitance (C₂) was set at 0.2 F·m⁻². 98 refs., 8 figs., 27 tabs.
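
    The arithmetic of combining a predicted ΔpK with a measured point of zero charge follows from the common two-pK convention, pH_pzc = (pK₁ + pK₂)/2 and ΔpK = pK₂ − pK₁. The sketch below shows only that arithmetic; the study's exact sign conventions and fitted values may differ, and the numbers used are illustrative, not constants from the paper:

```python
# Two-pK convention (an assumption here, not quoted from the study):
#   pH_pzc = (pK1 + pK2) / 2,   Delta_pK = pK2 - pK1
def protonation_constants(ph_pzc, delta_pk):
    pk1 = ph_pzc - delta_pk / 2.0  # log K for the first protonation step
    pk2 = ph_pzc + delta_pk / 2.0  # log K for the second protonation step
    return pk1, pk2

# Illustrative rutile-like values (invented for the example).
pk1, pk2 = protonation_constants(ph_pzc=5.9, delta_pk=4.0)
```

    The point is that once ΔpK is predicted independently, a single titration-derived pH_pzc fixes both protonation constants.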

  17. Systematic analysis of the mechanisms of virus-triggered type I IFN signaling pathways through mathematical modeling.

    Science.gov (United States)

    Zhang, Wei; Zou, Xiufen

    2013-01-01

    Based on biological experimental data, we developed a mathematical model of the virus-triggered signaling pathways that lead to induction of type I IFNs and systematically analyzed the mechanisms of the cellular antiviral innate immune responses, including the negative feedback regulation of ISG56 and the positive feedback regulation of IFNs. We found that the time between 5 and 48 hours after viral infection is vital for the control and/or elimination of the virus from the host cells and demonstrated that the ISG56-induced inhibition of MITA activation is stronger than the ISG56-induced inhibition of TBK1 activation. The global parameter sensitivity analysis suggests that the positive feedback regulation of IFNs is very important in the innate antiviral system. Furthermore, the robustness of the innate immune signaling network was demonstrated using a new robustness index. These results can help us understand the mechanisms of the virus-induced innate immune response at a system level and provide instruction for further biological experiments.
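
    Models of this kind are systems of ordinary differential equations integrated over the post-infection time window. A drastically simplified two-variable caricature, with a viral signal V inducing interferon F, positive feedback of F on its own production, and F-driven viral clearance, can be integrated with a plain Euler loop. All species and rate constants below are invented for illustration; the published model is far larger (ISG56, MITA, TBK1, and other components):

```python
# Toy caricature of virus-triggered IFN induction with positive feedback.
# V = viral signal, F = type I IFN level; all parameters are invented.
def simulate(hours=72.0, dt=0.01, k_ind=0.5, k_fb=1.0, k_clear=0.8, d_f=0.3):
    n = int(hours / dt)
    V, F = 1.0, 0.0
    traj = []
    for i in range(n):
        dV = -k_clear * F * V                    # IFN-driven clearance
        dF = k_ind * V + k_fb * F * V - d_f * F  # induction + positive feedback
        V += dt * dV
        F += dt * dF
        traj.append((i * dt, V, F))
    return traj

traj = simulate()
```

    Even this caricature reproduces the qualitative behavior the study analyzes: IFN rises on a timescale of hours, drives the viral signal toward zero, and then relaxes back once the stimulus is gone; sensitivity of the outcome to k_fb is the toy analogue of the positive-feedback sensitivity reported above.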

  18. Unique and Overlapping Symptoms in Schizophrenia Spectrum and Dissociative Disorders in Relation to Models of Psychopathology: A Systematic Review.

    Science.gov (United States)

    Renard, Selwyn B; Huntjens, Rafaele J C; Lysaker, Paul H; Moskowitz, Andrew; Aleman, André; Pijnenborg, Gerdina H M

    2017-01-01

    Schizophrenia spectrum disorders (SSDs) and dissociative disorders (DDs) are described in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) and tenth edition of the International Statistical Classification of Diseases and Related Health Problems (ICD-10) as 2 categorically distinct diagnostic categories. However, several studies indicate high levels of co-occurrence between these diagnostic groups, which might be explained by overlapping symptoms. The aim of this systematic review is to provide a comprehensive overview of the research concerning overlap and differences in symptoms between schizophrenia spectrum and DDs. For this purpose the PubMed, PsycINFO, and Web of Science databases were searched for relevant literature. The literature contained a large body of evidence showing the presence of symptoms of dissociation in SSDs. Although there are quantitative differences between diagnoses, overlapping symptoms are not limited to certain domains of dissociation, nor to nonpathological forms of dissociation. In addition, dissociation seems to be related to a history of trauma in SSDs, as is also seen in DDs. There is also evidence showing that positive and negative symptoms typically associated with schizophrenia may be present in DDs. Implications of these results are discussed with regard to different models of psychopathology and clinical practice.

  19. Bioavailability of lysine for kittens in overheated casein is underestimated by the rat growth assay method.

    Science.gov (United States)

    Larsen, J A; Fascetti, A J; Calvert, C C; Rogers, Q R

    2010-10-01

    Growth assays were performed to determine lysine bioavailability for kittens and rats in untreated and heated casein; these values were compared with estimates obtained with an in vitro method. Body weight, food intake, nitrogen and dry matter digestibility, and plasma lysine were determined during an 80-day growth trial using kittens (n = 16). Body weight and food intake were determined during a 21-day growth trial using weanling rats (n = 80). The growth data showed bioavailable lysine to be 102.4% and 100.2% (for untreated casein) and 66.1% and 51.7% (for heated casein) for kittens and rats, respectively. There was no relationship between plasma lysine and dietary lysine concentrations for kittens. There were no significant differences in nitrogen or dry matter digestibility among diets for kittens. The chemically reactive lysine content of untreated casein was 99.6%, and of heated casein was 67.1%. Heat treatment of casein resulted in significantly decreased lysine bioavailability as estimated by all methods. For untreated casein, both growth assays showed good agreement with the in vitro method for available lysine. For heated casein, the rat growth assay significantly underestimated bioavailable lysine as determined in kittens while the in vitro method closely approximated this value for the cat.
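
    The growth-assay arithmetic behind such bioavailability estimates is typically a slope-ratio calculation: regress weight gain on supplemental lysine intake for the test protein and for a reference lysine source, then take the ratio of the two slopes. The sketch below uses invented intakes and gains, not the study's data:

```python
# Slope-ratio bioavailability sketch; all numbers are invented.
def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

lysine_intake   = [0.2, 0.4, 0.6, 0.8]      # g/day, supplemental
gain_reference  = [5.0, 10.0, 15.0, 20.0]   # g/day gain on crystalline lysine
gain_heated     = [2.6, 5.2, 7.8, 10.4]     # g/day gain on heated casein

bioavailability = slope(lysine_intake, gain_heated) / \
                  slope(lysine_intake, gain_reference)
# ratio of slopes = 13/25 = 0.52, i.e. ~52% available lysine
```

    The species-specific underestimation reported above amounts to the rat and kitten assays yielding different slope ratios for the same heated casein.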

  20. Concomitant spuriously elevated white blood cell count, a previously underestimated phenomenon in EDTA-dependent pseudothrombocytopenia.

    Science.gov (United States)

    Xiao, Yufei; Xu, Yang

    2015-01-01

    The proportion and potential risk of concomitant spuriously elevated white blood cell count (SEWC) are underestimated in ethylenediaminetetraacetic acid (EDTA)-dependent pseudothrombocytopenia (PTCP). The proportion, kinetics and prevention of SEWC remain poorly understood. A total of 25 patients with EDTA-dependent PTCP were enrolled in this study. With the hematology analyzer Coulter LH 750, we determined the time courses of WBC count, WBC differential and platelet count in EDTA- and sodium citrate-anticoagulated blood, respectively. Blood smears were prepared to insp