#### Sample records for survival model applied

1. A Log Logistic Survival Model Applied to Hypobaric Decompression Sickness

Science.gov (United States)

Conkin, Johnny

2001-01-01

Decompression sickness (DCS) is a complex, multivariable problem. A mathematical description or model of the likelihood of DCS requires a large amount of quality research data, ideas on how to define a decompression dose using physical and physiological variables, and an appropriate analytical approach. It also requires a high-performance computer with specialized software. I have used published DCS data to develop my decompression doses, which are variants of equilibrium expressions for evolved gas plus other explanatory variables. My analytical approach is survival analysis, where the time of DCS occurrence is modeled. My conclusions can be applied to simple hypobaric decompressions - ascents lasting from 5 to 30 minutes - and, after minutes to hours, to denitrogenation (prebreathing). They are also applicable to long or short exposures, and can be used whether the sufferer of DCS is at rest or exercising at altitude. Ultimately I would like my models to be applied to astronauts to reduce the risk of DCS during spacewalks, as well as to future spaceflight crews on the Moon and Mars.
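
The log-logistic distribution named in this record's title has the standard closed-form survival function S(t) = 1 / (1 + (t/alpha)^beta). The sketch below illustrates that distribution in Python; it is not Conkin's fitted model (which adds decompression-dose covariates), and the parameter values are hypothetical:

```python
def loglogistic_survival(t, alpha, beta):
    """Probability of remaining DCS-free at time t
    (alpha = scale/median time, beta = shape)."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

def loglogistic_hazard(t, alpha, beta):
    """Instantaneous failure rate; rises then falls (unimodal) when beta > 1,
    one reason the log-logistic suits time-to-DCS data."""
    num = (beta / alpha) * (t / alpha) ** (beta - 1.0)
    return num / (1.0 + (t / alpha) ** beta)
```

By construction, survival is exactly one half at t = alpha, so alpha is the median time to DCS onset under the model.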

2. Applied survival analysis using R

CERN Document Server

Moore, Dirk F

2016-01-01

Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...
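
The book works in R; as a library-free illustration of the core technique it opens with, here is a minimal Kaplan-Meier (product-limit) estimator in Python that properly accounts for right-censoring, the central complication the record mentions:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    events[i] = 1 for an observed event, 0 for right-censored.
    Returns [(event_time, S(t) just after it), ...]."""
    data = sorted(zip(times, events))
    n = len(data)
    s, at_risk, i = 1.0, n, 0
    curve = []
    while i < n:
        t = data[i][0]
        deaths = sum(e for (tt, e) in data if tt == t)
        removed = sum(1 for (tt, _) in data if tt == t)
        if deaths:
            s *= 1.0 - deaths / at_risk   # conditional survival at this time
            curve.append((t, s))
        at_risk -= removed                # censored subjects leave the risk set
        i += removed                      # sorted, so ties are contiguous
    return curve
```

Censored observations reduce the risk set without triggering a drop in the curve, which is exactly the "proper accounting for censoring" the text teaches.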

3. Modelling survival

DEFF Research Database (Denmark)

Ashauer, Roman; Albert, Carlo; Augustine, Starrlight

2016-01-01

...how well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates...

4. A frailty model for (interval) censored family survival data, applied to the age at onset of non-physical problems

NARCIS (Netherlands)

Jonker, M.A.; Boomsma, D.I.

2010-01-01

Family survival data can be used to estimate the degree of genetic and environmental contributions to the age at onset of a disease or of a specific event in life. The data can be modeled with a correlated frailty model in which the frailty variable accounts for the degree of kinship within the...

5. Applying the additive hazard model to predict the survival time of patients with diffuse large B-cell lymphoma and determine the effective genes, using microarray data

Directory of Open Access Journals (Sweden)

2015-09-01

Full Text Available Background: Recent studies have shown that genes affecting the survival time of cancer patients play an important role as risk or preventive factors. The present study was designed to determine genes affecting survival time in patients with diffuse large B-cell lymphoma and to predict survival time using these selected genes. Materials & Methods: This cohort study was conducted on 40 patients with diffuse large B-cell lymphoma, for whom the expression of 2,042 genes was measured. To predict survival time, a semi-parametric additive survival model was combined with two gene-selection methods, elastic net and lasso. The two methods were evaluated by plotting the area under the ROC curve over time and calculating the integral of this curve. Results: The elastic net method identified 10 genes, and the Lasso-Cox method identified 7. GENE3325X increased survival time (P=0.006), whereas GENE3980X and GENE377X reduced it (P=0.004). These three genes were selected as important in both methods. Conclusion: This study showed that the elastic net method outperformed the common lasso method in predictive power. Moreover, applying the additive model instead of Cox regression to microarray data is a practical way to predict the survival time of patients.
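
The record above combines an additive hazard model with lasso/elastic-net gene selection. As a simplified sketch of the selection step only, here is plain coordinate-descent lasso for squared-error loss (not the additive-hazard loss used in the study); columns are assumed standardized (mean 0, variance 1), and the penalty value is hypothetical:

```python
def soft_threshold(z, lam):
    """Soft-thresholding operator, the core of the lasso update."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1.
    Assumes each column of X has mean 0 and variance 1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # residual leaving coordinate j out
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            b[j] = soft_threshold(rho, lam)
    return b
```

Coefficients shrunk exactly to zero correspond to genes dropped from the model; the elastic net adds a ridge (L2) term to this penalty, which tends to keep groups of correlated genes together.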

6. Modelling survival and connectivity of Mnemiopsis leidyi

NARCIS (Netherlands)

van der Molen, J.; van Beek, J.; Augustine, S.; Vansteenbrugge, L.; van Walraven, L.; van Langenberg, V.; van der Veer, H.W.; Hostens, K.; Pitois, S.; Robbens, J.

2015-01-01

Three different models were applied to study the reproduction, survival and dispersal of Mnemiopsis leidyi in the Scheldt estuaries and the southern North Sea: a high-resolution particle tracking model with passive particles, a low-resolution particle tracking model with a reproduction model...

7. Flexible survival regression modelling

DEFF Research Database (Denmark)

Cortese, Giuliana; Scheike, Thomas H; Martinussen, Torben

2009-01-01

Regression analysis of survival data, and more generally event history data, is typically based on Cox's regression model. We here review some recent methodology, focusing on the limitations of Cox's regression model. The key limitation is that the model is not well suited to represent time-varying...

8. Survival analysis models and applications

CERN Document Server

Liu, Xian

2012-01-01

Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enabling...

9. Applied impulsive mathematical models

CERN Document Server

Stamova, Ivanka

2016-01-01

Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.

10. Artillery Survivability Model

Science.gov (United States)

2016-06-01

experiment mode also enables users to set their own design of experiments by manipulating an editable CSV file. The second is a real-time mode that ... renders a 3D virtual environment of a restricted battlefield where the survivability movements of an artillery company are visualized. This mode ... provides detailed visualization of the simulation and enables future experimental uses of the simulation as a training tool.

11. Frailty Models in Survival Analysis

CERN Document Server

Wienke, Andreas

2010-01-01

The concept of frailty offers a convenient way to introduce unobserved heterogeneity and associations into models for survival data. In its simplest form, frailty is an unobserved random proportionality factor that modifies the hazard function of an individual or a group of related individuals. "Frailty Models in Survival Analysis" presents a comprehensive overview of the fundamental approaches in the area of frailty models. The book extensively explores how univariate frailty models can represent unobserved heterogeneity. It also emphasizes correlated frailty models as extensions of...
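
In the simplest gamma frailty model mentioned above (a frailty Z with mean 1 and variance theta multiplying the baseline hazard), the population-level survival function has a closed form via the Laplace transform of the gamma distribution: S(t) = (1 + theta * H0(t))^(-1/theta). A minimal sketch, assuming that standard parameterization:

```python
import math

def marginal_survival(H, theta):
    """Population survival when the individual hazard is Z*h0(t),
    with Z ~ Gamma(mean 1, variance theta) unobserved.
    H is the cumulative baseline hazard H0(t)."""
    if theta == 0.0:
        # no heterogeneity: ordinary survival function
        return math.exp(-H)
    return (1.0 + theta * H) ** (-1.0 / theta)
```

As theta shrinks to zero this recovers exp(-H0(t)); for theta > 0 the population curve lies above the homogeneous one, because frail individuals fail early and the survivors are increasingly robust.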

12. Modelling population-based cancer survival trends using join point models for grouped survival data.

Science.gov (United States)

Yu, Binbing; Huang, Lan; Tiwari, Ram C; Feuer, Eric J; Johnson, Karen A

2009-04-01

In the United States, cancer as a whole is the second leading cause of death and a major burden to health care; medical progress against cancer is thus a major public health goal. Many individual studies suggest that cancer treatment breakthroughs and early diagnosis have significantly improved the prognosis of cancer patients. To better understand the relationship between medical improvements and the survival experience of the patient population at large, it is useful to evaluate cancer survival trends on the population level, e.g., to find out when and by how much cancer survival rates changed. In this paper, we analyze population-based grouped cancer survival data by incorporating joinpoints into the survival models. A joinpoint survival model facilitates the identification of trends with significant change points in cancer survival, when related to cancer treatments or interventions. The Bayesian Information Criterion is used to select the number of joinpoints. The performance of the joinpoint survival models is evaluated with respect to cancer prognosis, joinpoint locations, annual percent changes in death rates by year of diagnosis, and sample sizes through intensive simulation studies. The model is then applied to grouped relative survival data for several major cancer sites from the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The change points in the survival trends for several major cancer sites are identified and the potential driving forces behind such change points are discussed.
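
A joinpoint survival model of the kind described can be sketched as a piecewise-constant hazard with change points, with BIC used to choose how many joinpoints to keep. The knots and hazard values below are hypothetical, purely for illustration:

```python
import math

def pw_exp_survival(t, knots, hazards):
    """Survival under a piecewise-constant hazard with joinpoints.
    knots = [0, c1, c2, ...]; hazards[k] applies on [knots[k], knots[k+1])."""
    H = 0.0
    for k, lam in enumerate(hazards):
        lo = knots[k]
        hi = knots[k + 1] if k + 1 < len(knots) else float("inf")
        if t <= lo:
            break
        H += lam * (min(t, hi) - lo)  # hazard accumulated on this segment
    return math.exp(-H)

def bic(loglik, n_params, n_obs):
    """Bayesian Information Criterion, used to select the number of joinpoints."""
    return -2.0 * loglik + n_params * math.log(n_obs)
```

Candidate models with 0, 1, 2, ... joinpoints would each be fitted by maximum likelihood and the one with the smallest BIC retained.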

13. Applied stochastic modelling

CERN Document Server

Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P

2008-01-01

Introduction and Examples Introduction Examples of data sets Basic Model Fitting Introduction Maximum-likelihood estimation for a geometric model Maximum-likelihood for the beta-geometric model Modelling polyspermy Which model? What is a model for? Mechanistic models Function Optimisation Introduction MATLAB: graphs and finite differences Deterministic search methods Stochastic search methods Accuracy and a hybrid approach Basic Likelihood Tools Introduction Estimating standard errors and correlations Looking at surfaces: profile log-likelihoods Confidence regions from profiles Hypothesis testing in model selection Score and Wald tests Classical goodness of fit Model selection bias General Principles Introduction Parameterisation Parameter redundancy Boundary estimates Regression and influence The EM algorithm Alternative methods of model fitting Non-regular problems Simulation Techniques Introduction Simulating random variables Integral estimation Verification Monte Carlo inference Estimating sampling distributi...

14. Combining parametric, semi-parametric, and non-parametric survival models with stacked survival models.

Science.gov (United States)

Wey, Andrew; Connett, John; Rudser, Kyle

2015-07-01

15. Applied Bayesian modelling

CERN Document Server

Congdon, Peter

2014-01-01

This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS...

16. Modeling survival data extending the cox model

CERN Document Server

Therneau, Terry M

2000-01-01

Extending the Cox Model is aimed at researchers, practitioners, and graduate students who have some exposure to traditional methods of survival analysis. The emphasis is on semiparametric methods based on the proportional hazards model. The inclusion of examples with SAS and S-PLUS code will make the book accessible to most working statisticians.

17. Shared Frailty Model for Left-Truncated Multivariate Survival Data

DEFF Research Database (Denmark)

Jensen, Henrik; Brookmeyer, Ron; Aaby, Peter

multivariate survival data, left truncation, multiplicative hazard model, shared gamma frailty, conditional model, piecewise exponential model, childhood survival

18. Locally Applied Valproate Enhances Survival in Rats after Neocortical Treatment with Tetanus Toxin and Cobalt Chloride

Directory of Open Access Journals (Sweden)

Dirk-Matthias Altenmüller

2013-01-01

Full Text Available Purpose. In neocortical epilepsies not satisfactorily responsive to systemic antiepileptic drug therapy, local application of antiepileptic agents onto the epileptic focus may enhance treatment efficacy and tolerability. We describe the effects of focally applied valproate (VPA) in a newly emerging rat model of neocortical epilepsy induced by tetanus toxin (TeT) plus cobalt chloride (CoCl2). Methods. In rats, VPA- or sodium chloride (NaCl)-containing polycaprolactone (PCL) implants were applied onto the right motor cortex, treated before with a triple injection of 75 ng TeT plus 15 mg CoCl2. Video-EEG monitoring was performed with intracortical depth electrodes. Results. All rats randomized to the NaCl group died within one week after surgery. In contrast, the rats treated with local VPA survived significantly longer. In both groups, witnessed deaths occurred in the context of seizures. At least some of the rats surviving the first postoperative day developed neocortical epilepsy with recurrent spontaneous seizures. Conclusions. The novel TeT/CoCl2 approach targets a new model of neocortical epilepsy in rats and allows the investigation of local epilepsy therapy strategies. In this vehicle-controlled study, local application of VPA significantly enhanced survival in rats, possibly via focal antiepileptic or antiepileptogenic mechanisms.

19. Evaluating survival model performance: a graphical approach.

Science.gov (United States)

Mandel, M; Galai, N; Simchen, E

2005-06-30

In the last decade, many statistics have been suggested to evaluate the performance of survival models. These statistics evaluate the overall performance of a model ignoring possible variability in performance over time. Using an extension of measures used in binary regression, we propose a graphical method to depict the performance of a survival model over time. The method provides estimates of performance at specific time points and can be used as an informal test for detecting time varying effects of covariates in the Cox model framework. The method is illustrated on real and simulated data using Cox proportional hazard model and rank statistics. Copyright 2005 John Wiley & Sons, Ltd.

20. Model selection criterion in survival analysis

Science.gov (United States)

Karabey, Uǧur; Tutkun, Nihal Ata

2017-07-01

Survival analysis deals with time until the occurrence of an event of interest such as death, recurrence of an illness, the failure of a piece of equipment, or divorce. There are various survival models with semi-parametric or parametric approaches used in the medical, natural or social sciences. The decision on the most appropriate model for the data is an important point of the analysis. In the literature, the Akaike information criterion or the Bayesian information criterion is used to select among nested models. In this study, the behavior of these information criteria is discussed for a real data set.
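
As a concrete example of the criteria discussed, the sketch below fits a censored exponential model, whose MLE is closed form (rate = events / total exposure), and computes AIC and BIC. The data are hypothetical; the study itself compares richer semi-parametric and parametric models:

```python
import math

def exp_mle_loglik(times, events):
    """Censored exponential fit. events[i] = 1 for observed failure, 0 censored.
    Returns (lambda_hat, maximized log-likelihood)."""
    d = sum(events)          # number of observed events
    T = sum(times)           # total time at risk (exposure)
    lam = d / T
    return lam, d * math.log(lam) - lam * T

def aic(loglik, k):
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    return k * math.log(n) - 2 * loglik
```

A competing model (say, Weibull with two parameters) would be fitted the same way, and the model with the smaller criterion value preferred; BIC penalizes extra parameters more heavily as n grows.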

1. A stochastic evolutionary model for survival dynamics

Science.gov (United States)

Fenner, Trevor; Levene, Mark; Loizou, George

2014-09-01

The recent interest in human dynamics has led researchers to investigate the stochastic processes that explain human behaviour in different contexts. Here we propose a generative model to capture the essential dynamics of survival analysis, traditionally employed in clinical trials and reliability analysis in engineering. In our model, the only implicit assumption made is that the longer an actor has been in the system, the more likely it is to have failed. We derive a power-law distribution for the process and provide preliminary empirical evidence for the validity of the model from two well-known survival analysis data sets.
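
A generative survival mechanism with the property the record describes (the longer an actor has been in the system, the more likely it is to have failed) can be simulated directly. This sketch uses a simple discrete-time hazard and an empirical survival curve; it is a generic illustration, not the authors' specific power-law model:

```python
import random

def simulate_lifetime(hazard, rng, max_steps=10_000):
    """Draw one discrete lifetime: at age a the actor fails with
    probability hazard(a); hazard is any function of age."""
    for a in range(1, max_steps + 1):
        if rng.random() < hazard(a):
            return a
    return max_steps

def empirical_survival(samples, t):
    """Fraction of simulated actors still in the system after time t."""
    return sum(1 for s in samples if s > t) / len(samples)
```

For any positive hazard, the cumulative failure probability rises with age, so a large sample's empirical survival curve decreases monotonically; heavy-tailed (power-law) survival emerges for particular hazard choices.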

2. A hierarchical nest survival model integrating incomplete temporally varying covariates

Science.gov (United States)

Converse, Sarah J.; Royle, J. Andrew; Adler, Peter H.; Urbanek, Richard P.; Barzan, Jeb A.

2013-01-01

Nest success is a critical determinant of the dynamics of avian populations, and nest survival modeling has played a key role in advancing avian ecology and management. Beginning with the development of daily nest survival models, and proceeding through subsequent extensions, the capacity for modeling the effects of hypothesized factors on nest survival has expanded greatly. We extend nest survival models further by introducing an approach to deal with incompletely observed, temporally varying covariates using a hierarchical model. Hierarchical modeling offers a way to separate process and observational components of demographic models to obtain estimates of the parameters of primary interest, and to evaluate structural effects of ecological and management interest. We built a hierarchical model for daily nest survival to analyze nest data from reintroduced whooping cranes (Grus americana) in the Eastern Migratory Population. This reintroduction effort has been beset by poor reproduction, apparently due primarily to nest abandonment by breeding birds. We used the model to assess support for the hypothesis that nest abandonment is caused by harassment from biting insects. We obtained indices of blood-feeding insect populations based on the spatially interpolated counts of insects captured in carbon dioxide traps. However, insect trapping was not conducted daily, and so we had incomplete information on a temporally variable covariate of interest. We therefore supplemented our nest survival model with a parallel model for estimating the values of the missing insect covariates. We used Bayesian model selection to identify the best predictors of daily nest survival. Our results suggest that the black fly Simulium annulus may be negatively affecting nest survival of reintroduced whooping cranes, with decreasing nest survival as abundance of S. annulus increases. The modeling framework we have developed will be applied in the future to a larger data set to evaluate the...

3. A generalized additive regression model for survival times

DEFF Research Database (Denmark)

Scheike, Thomas H.

2001-01-01

Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models

4. Comparison of Cox and Gray's survival models in severe sepsis

DEFF Research Database (Denmark)

Kasal, Jan; Andersen, Zorana Jovanovic; Clermont, Gilles

2004-01-01

Although survival is traditionally modeled using Cox proportional hazards modeling, this approach may be inappropriate in sepsis, in which the proportional hazards assumption does not hold. Newer, more flexible models, such as Gray's model, may be more appropriate.

5. Geostatistical methods applied to field model residuals

DEFF Research Database (Denmark)

Maule, Fox; Mosegaard, K.; Olsen, Nils

The geomagnetic field varies on a variety of time- and length scales, which are only rudimentarily considered in most present field models. The part of the observed field that cannot be explained by a given model, the model residuals, is often considered an estimate of the data uncertainty (which consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based... Once these are known, they can be used in the modelling process to improve the model, estimate its uncertainty and eventually determine the optimal sampling rate in time and space. The uncertainty estimate on the field model can be used to estimate the uncertainty on quantities deduced from the field...

6. Prognostic survival model for people diagnosed with invasive cutaneous melanoma.

Science.gov (United States)

Baade, Peter D; Royston, Patrick; Youl, Philipa H; Weinstock, Martin A; Geller, Alan; Aitken, Joanne F

2015-01-31

The ability of medical practitioners to communicate risk estimates effectively to patients diagnosed with melanoma relies on accurate information about prognostic factors and their impact on survival. This study reports the development of one of the few melanoma prognostic models, called the Melanoma Severity Index (MSI), based on population-based cancer registry data. Data came from the Queensland Cancer Registry for people (aged 20-89 years) diagnosed with a single invasive melanoma between 1995 and 2008 (n = 28,654; 1,700 melanoma deaths). Additional clinical information about metastasis, ulceration and positive lymph nodes was manually extracted from pathology forms. Flexible parametric survival models were combined with multivariable fractional polynomials for selecting variables and transformations of continuous variables. Multiple imputation was used for missing covariate values. The MSI contained the variables thickness (transformed; explained 40.6% of the variation in survival), body site (an additional 1.9%), metastasis (1.8%), positive nodes (0.7%), ulceration (1.3%) and age (1.1%). Royston and Sauerbrei's D statistic (a measure of discrimination) was 1.50 (95% CI = 1.44, 1.56) and the corresponding RD2 (a measure of explained variation) was 0.47 (0.45, 0.49), demonstrating strong explanatory performance. The Harrell C statistic was 0.88 (0.88, 0.89). Lacking an external validation dataset, we applied internal-external cross-validation to demonstrate the consistency of the prognostic information across geographically defined subsets of the cohort. The MSI provides good ability to predict survival for melanoma patients. Beyond the immediate clinical use, the MSI may have important public health and research applications for evaluations of public health interventions aimed at reducing deaths from melanoma.
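
The Harrell C statistic reported in this record measures concordance: among comparable pairs of patients, the fraction where the higher-risk patient fails earlier. A minimal O(n^2) sketch (a pair is comparable when the shorter time is an observed event; ties in predicted risk count one half):

```python
def harrell_c(times, events, risk):
    """Harrell's concordance index. events[i] = 1 if subject i's
    failure was observed, 0 if censored; risk is the predicted score
    (higher = worse prognosis). Assumes at least one comparable pair."""
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # comparable: i failed strictly before j's (event or censoring) time
            if events[i] == 1 and times[i] < times[j]:
                den += 1
                if risk[i] > risk[j]:
                    num += 1
                elif risk[i] == risk[j]:
                    num += 0.5
    return num / den
```

A value of 0.5 is chance-level ordering and 1.0 is perfect discrimination, so the 0.88 reported for the MSI indicates strong separation of prognoses.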

7. Applying Data Mining Techniques to Extract Hidden Patterns about Breast Cancer Survival in an Iranian Cohort Study.

Science.gov (United States)

2016-01-01

Breast cancer survival has been analyzed by many standard data mining algorithms, a group of which belong to the decision tree category. The ability of decision tree algorithms to visualize and formulate hidden patterns among study variables was the main reason to apply an algorithm from this category that had not been studied previously. Classification and regression trees (CART) were applied to a breast cancer database containing information on 569 patients in 2007-2010. Gini impurity, the splitting measure used for categorical target variables, was utilized. The classification error, which is a function of tree size, was measured by 10-fold cross-validation experiments. The performance of the created model was evaluated by accuracy, sensitivity and specificity. The CART model produced a decision tree with 17 nodes, 9 of which were associated with a set of rules. The rules were clinically meaningful; they showed, in if-then format, that Stage was the most important variable for predicting breast cancer survival. The scores of accuracy, sensitivity and specificity were 80.3%, 93.5% and 53%, respectively. The current model, the first created by CART for this purpose, was able to extract useful hidden rules from a relatively small dataset.
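
The CART splitting criterion named in this record, Gini impurity, is 1 - sum(p_k^2) over class proportions. A small sketch of the impurity measure and a best-threshold search on a single feature (the data here are toy values, not the study's breast cancer variables):

```python
def gini(labels):
    """Gini impurity of a set of class labels (0 = pure)."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return 1.0 - sum((k / n) ** 2 for k in counts.values())

def best_split(values, labels):
    """Threshold on one numeric feature minimizing the size-weighted
    impurity of the two children; returns (threshold, weighted_impurity)."""
    n = len(labels)
    best = (None, gini(labels))
    for thr in sorted(set(values))[:-1]:
        left = [labels[i] for i in range(n) if values[i] <= thr]
        right = [labels[i] for i in range(n) if values[i] > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best[1]:
            best = (thr, score)
    return best
```

CART applies this search over every candidate feature at every node and splits on the feature/threshold pair with the largest impurity reduction, which is how a variable like Stage surfaces at the root.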

8. Applying incentive sensitization models to behavioral addiction

DEFF Research Database (Denmark)

Rømer Thomsen, Kristine; Fjorback, Lone; Møller, Arne

2014-01-01

The incentive sensitization theory is a promising model for understanding the mechanisms underlying drug addiction, and has received support in animal and human studies. So far the theory has not been applied to the case of behavioral addictions like Gambling Disorder, despite sharing clinical symptoms and underlying neurobiology...

9. Applied probability models with optimization applications

CERN Document Server

Ross, Sheldon M

1992-01-01

Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

10. Applied Integer Programming Modeling and Solution

CERN Document Server

Chen, Der-San; Dang, Yu

2011-01-01

An accessible treatment of the modeling and solution of integer programming problems, featuring modern applications and software In order to fully comprehend the algorithms associated with integer programming, it is important to understand not only how algorithms work, but also why they work. Applied Integer Programming features a unique emphasis on this point, focusing on problem modeling and solution using commercial software. Taking an application-oriented approach, this book addresses the art and science of mathematical modeling related to the mixed integer programming (MIP) framework and

11. Applied research in uncertainty modeling and analysis

CERN Document Server

Ayyub, Bilal

2005-01-01

Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

12. Applied Mathematics, Modelling and Computational Science

CERN Document Server

Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

2015-01-01

The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

13. ANTEROCOD: actuarial survival curves applied to medical coding support for chronic diseases.

Science.gov (United States)

Lecornu, L; Le Guillou, C; Le Saux, F; Hubert, M; Puentes, J; Cauvin, J M

2010-01-01

For the practitioner, choosing diagnosis codes is a non-intuitive operation. Mistakes are frequent, with severe consequences for healthcare performance evaluation and funding. French physicians have to assign a code to all their activities and are frequently prone to these errors. Given that, most of the time and particularly for chronic diseases, indexed information is already available, we propose a tool named AnterOcod to support the medical coding task. It suggests a list of the most relevant plausible codes, predicted from the patient's earlier hospital stays, according to the set of previously utilized diagnosis codes. Our method estimates code reappearance rates, based on an approach equivalent to actuarial survival curves. Around 33% of the expected correct diagnosis codes were retrieved in this manner after evaluating 998 discharge abstracts, significantly improving the coding task.

14. Efficient estimation of semiparametric copula models for bivariate survival data

KAUST Repository

Cheng, Guang

2014-01-01

A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.
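
One concrete copula of the kind this framework covers is the Clayton (Archimedean) survival copula, C(u, v) = (u^(-theta) + v^(-theta) - 1)^(-1/theta) for theta > 0; the paper treats general semiparametric copula models, so this choice is purely illustrative:

```python
def clayton_survival_copula(u, v, theta):
    """Joint survival probability from two marginal survival
    probabilities u, v under a Clayton copula (theta > 0 gives
    positive dependence; theta -> 0 recovers independence u*v)."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)
```

In the semiparametric setup described above, u and v would come from nonparametric estimates of the two marginal survival functions, while theta is the parametric dependence parameter to be estimated.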

15. Time series modeling of system self-assessment of survival

Energy Technology Data Exchange (ETDEWEB)

Lu, H.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

1999-06-01

Self-assessment of survival for a system, subsystem or component is implemented by assessing conditional performance reliability in real-time, which includes modeling and analysis of physical performance data. This paper proposes a time series analysis approach to system self-assessment (prediction) of survival. In the approach, physical performance data are modeled in a time series. The performance forecast is based on the model developed and is converted to the reliability of system survival. In contrast to a standard regression model, a time series model, using on-line data, is suitable for the real-time performance prediction. This paper illustrates an example of time series modeling and survival assessment, regarding an excessive tool edge wear failure mode for a twist drill operation.

16. Applying incentive sensitization models to behavioral addiction.

Science.gov (United States)

Rømer Thomsen, Kristine; Fjorback, Lone O; Møller, Arne; Lou, Hans C

2014-09-01

The incentive sensitization theory is a promising model for understanding the mechanisms underlying drug addiction, and has received support in animal and human studies. So far the theory has not been applied to the case of behavioral addictions like Gambling Disorder, despite sharing clinical symptoms and underlying neurobiology. We examine the relevance of this theory for Gambling Disorder and point to predictions for future studies. The theory promises a significant contribution to the understanding of behavioral addiction and opens new avenues for treatment. Copyright © 2014 Elsevier Ltd. All rights reserved.

17. Applied Regression Modeling A Business Approach

CERN Document Server

Pardoe, Iain

2012-01-01

An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculusRegression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

18. SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL

Directory of Open Access Journals (Sweden)

Jenq-Daw Lee

2008-07-01

Full Text Available A trivariate survival function based on the semi-competing risks concept is proposed, in which a terminal event can only occur after other events. The Stanford Heart Transplant data are reanalyzed using a trivariate Weibull distribution model with the proposed survival function.

19. A life-cycle model with ambiguous survival beliefs

NARCIS (Netherlands)

Groneck, Max; Ludwig, Alexander; Zimper, Alexander

Based on a cognitive notion of neo-additive capacities reflecting likelihood insensitivity with respect to survival chances, we construct a Choquet Bayesian learning model over the life-cycle that generates a motivational notion of neo-additive survival beliefs expressing ambiguity attitudes. We

20. Prediction of survival with alternative modeling techniques using pseudo values

NARCIS (Netherlands)

T. van der Ploeg (Tjeerd); F.R. Datema (Frank); R.J. Baatenburg de Jong (Robert Jan); E.W. Steyerberg (Ewout)

2014-01-01

Background: The use of alternative modeling techniques for predicting patient survival is complicated by the fact that some alternative techniques cannot readily deal with censoring, which is essential for analyzing survival data. In the current study, we aimed to demonstrate that pseudo

1. Commercial Consolidation Model Applied to Transport Infrastructure

Energy Technology Data Exchange (ETDEWEB)

Guilherme de Aragão, J.J.; Santos Fontes Pereira, L. dos; Yamashita, Y.

2016-07-01

Since the 1990s, transport concessions, including public-private partnerships (PPPs), have been increasingly adopted by governments as an alternative for financing and operations in public investments, especially in transport infrastructure. The advantage pointed out by proponents of these models lies in merging the expertise and capital of the private sector with the public interest. Several arrangements are possible and have been employed in different cases. After the first PPP contracts in transportation ran their course, many authors analyzed the success and failure factors of partnerships. The occurrence of failures in some stages of the process can greatly encumber the public administration, incurring losses to the fiscal responsibility of the competent bodies. This article aims to propose a new commercial consolidation model applied to transport infrastructure to ensure fiscal sustainability and overcome the weaknesses of current models. Initially, a systematic review of the literature covering studies on transport concessions between 1990 and 2015 is offered, where the different approaches between various countries are compared and the critical success factors indicated in the studies are identified. In the subsequent part of the paper, an approach for the commercial consolidation of infrastructure concessions is presented, where the concessionaire is paid following a finalistic performance model, which includes the overall fiscal balance of regional growth. Finally, the paper analyses the usefulness of the model in coping with the critical success factors explained before. (Author)

2. Application of Survival Analysis and Multistate Modeling to Understand Animal Behavior: Examples from Guide Dogs

OpenAIRE

Asher, Lucy; Harvey, Naomi D.; Green, Martin; England, Gary C.W.

2017-01-01

Epidemiology is the study of patterns of health-related states or events in populations. Statistical models developed for epidemiology could be usefully applied to behavioral states or events. The aim of this study is to present the application of epidemiological statistics to understand animal behavior where discrete outcomes are of interest, using data from guide dogs to illustrate. Specifically, survival analysis and multistate modeling are applied to data on guide dogs comparing dogs that...

3. Survival

Data.gov (United States)

U.S. Geological Survey, Department of the Interior — These data provide information on the survival of California red-legged frogs in a unique ecosystem to better conserve this threatened species while restoring...

4. Terahertz spectroscopy applied to food model systems

DEFF Research Database (Denmark)

Møller, Uffe

Water plays a crucial role in the quality of food. Apart from the natural water content of a food product, the state of that water is very important. Water can be found integrated into the biological material or it can be added during production of the product. Currently it is difficult to differentiate between these types of water in subsequent quality controls. This thesis describes terahertz time-domain spectroscopy applied on aqueous food model systems, with particular focus on ethanol-water mixtures and confined water pools in inverse micelles.

5. Conformal Field Theory Applied to Loop Models

Science.gov (United States)

Jacobsen, Jesper Lykke

The application of methods of quantum field theory to problems of statistical mechanics can in some sense be traced back to Onsager's 1944 solution [1] of the two-dimensional Ising model. It does however appear fair to state that the 1970's witnessed a real gain of momentum for this approach, when Wilson's ideas on scale invariance [2] were applied to study critical phenomena, in the form of the celebrated renormalisation group [3]. In particular, the so-called ɛ expansion permitted the systematic calculation of critical exponents [4], as formal power series in the space dimensionality d, below the upper critical dimension d c . An important lesson of these efforts was that critical exponents often do not depend on the precise details of the microscopic interactions, leading to the notion of a restricted number of distinct universality classes.

6. A Cooperation Model Applied in a Kindergarten

Directory of Open Access Journals (Sweden)

Jose I. Rodriguez

2011-10-01

Full Text Available The need for collaboration in a global world has become a key success factor for many organizations and individuals. However, in several regions and organizations of the world it has not yet taken hold. One of the settings where major obstacles to collaboration occur is the business arena, mainly because of the competitive belief that cooperation could hurt profitability. We have found such behavior in a wide variety of countries, in advanced and developing economies alike. These cultural behaviors or traits characterize entrepreneurs who work in isolation, forgoing the possibility of building clusters to promote regional development. The need to improve the essential abilities that make up cooperation is evident. It is also very difficult to change such conduct in adults, so we decided to work with children, preparing future generations to live in a cooperative world so badly hit by greed and individualism nowadays. We have validated that working with children at an early age improves such behavior. This paper develops a model to enhance the essential abilities needed to improve cooperation. The model has been validated by applying it at a kindergarten school.

7. Mediation analysis for survival data using semiparametric probit models.

Science.gov (United States)

Huang, Yen-Tsung; Cai, Tianxi

2016-06-01

Causal mediation modeling has become a popular approach for studying the effect of an exposure on an outcome through mediators. Currently, the literature on mediation analyses with survival outcomes has largely focused on settings with a single mediator and quantified the mediation effects on the hazard, log hazard and log survival time (Lange and Hansen 2011; VanderWeele 2011). In this article, we propose a multi-mediator model for survival data by employing a flexible semiparametric probit model. We characterize path-specific effects (PSEs) of the exposure on the outcome mediated through specific mediators. We derive closed-form expressions for PSEs on a transformed survival time and the survival probabilities. Statistical inference on the PSEs is developed using a nonparametric maximum likelihood estimator under the semiparametric probit model and the functional delta method. Results from simulation studies suggest that our proposed methods perform well in finite samples. We illustrate the utility of our method in a genomic study of glioblastoma multiforme survival. © 2015, The International Biometric Society.

8. Infinite mixture-of-experts model for sparse survival regression with application to breast cancer

Directory of Open Access Journals (Sweden)

Dahl Edgar

2010-10-01

Full Text Available Abstract Background We present an infinite mixture-of-experts model to find an unknown number of sub-groups within a given patient cohort based on survival analysis. The effect of patient features on survival is modeled using the Cox proportional hazards model, which yields a non-standard regression component. The model is able to find key explanatory factors (chosen from main effects and higher-order interactions) for each sub-group by enforcing sparsity on the regression coefficients via the Bayesian Group-Lasso. Results Simulated examples justify the need for such an elaborate framework for identifying sub-groups along with their key characteristics versus other simpler models. When applied to a breast-cancer dataset consisting of survival times and protein expression levels of patients, it identifies two distinct sub-groups with different survival patterns (low-risk and high-risk) along with the respective sets of compound markers. Conclusions The unified framework presented here, combining elements of cluster and feature detection for survival analysis, is clearly a powerful tool for analyzing survival patterns within a patient group. The model also demonstrates the feasibility of analyzing complex interactions which can contribute to the definition of novel prognostic compound markers.

9. Infinite mixture-of-experts model for sparse survival regression with application to breast cancer

Science.gov (United States)

2010-01-01

Background We present an infinite mixture-of-experts model to find an unknown number of sub-groups within a given patient cohort based on survival analysis. The effect of patient features on survival is modeled using the Cox proportional hazards model, which yields a non-standard regression component. The model is able to find key explanatory factors (chosen from main effects and higher-order interactions) for each sub-group by enforcing sparsity on the regression coefficients via the Bayesian Group-Lasso. Results Simulated examples justify the need for such an elaborate framework for identifying sub-groups along with their key characteristics versus other simpler models. When applied to a breast-cancer dataset consisting of survival times and protein expression levels of patients, it identifies two distinct sub-groups with different survival patterns (low-risk and high-risk) along with the respective sets of compound markers. Conclusions The unified framework presented here, combining elements of cluster and feature detection for survival analysis, is clearly a powerful tool for analyzing survival patterns within a patient group. The model also demonstrates the feasibility of analyzing complex interactions which can contribute to the definition of novel prognostic compound markers. PMID:21034433

10. Modeling survival: application of the Andersen-Gill model to Yellowstone grizzly bears

Science.gov (United States)

Johnson, Christopher J.; Boyce, Mark S.; Schwartz, Charles C.; Haroldson, Mark A.

2004-01-01

Wildlife ecologists often use the Kaplan-Meier procedure or Cox proportional hazards model to estimate survival rates, distributions, and magnitude of risk factors. The Andersen-Gill formulation (A-G) of the Cox proportional hazards model has seen limited application to mark-resight data but has a number of advantages, including the ability to accommodate left-censored data, time-varying covariates, multiple events, and discontinuous intervals of risks. We introduce the A-G model including structure of data, interpretation of results, and assessment of assumptions. We then apply the model to 22 years of radiotelemetry data for grizzly bears (Ursus arctos) of the Greater Yellowstone Grizzly Bear Recovery Zone in Montana, Idaho, and Wyoming, USA. We used Akaike's Information Criterion (AICc) and multi-model inference to assess a number of potentially useful predictive models relative to explanatory covariates for demography, human disturbance, and habitat. Using the most parsimonious models, we generated risk ratios, hypothetical survival curves, and a map of the spatial distribution of high-risk areas across the recovery zone. Our results were in agreement with past studies of mortality factors for Yellowstone grizzly bears. Holding other covariates constant, mortality was highest for bears that were subjected to repeated management actions and inhabited areas with high road densities outside Yellowstone National Park. Hazard models developed with covariates descriptive of foraging habitats were not the most parsimonious, but they suggested that high-elevation areas offered lower risks of mortality when compared to agricultural areas.
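
The counting-process data structure that the Andersen-Gill formulation relies on can be sketched directly. Splitting one animal's follow-up into (start, stop] intervals at each covariate change is what accommodates time-varying covariates and discontinuous risk intervals; the record below is hypothetical, not drawn from the Yellowstone data.

```python
def counting_process_rows(follow_up_end, event, covariate_changes):
    """Split one animal's follow-up into (start, stop, covariate, event)
    intervals for an Andersen-Gill analysis. covariate_changes is a list
    of (time, value) pairs, the first at time 0."""
    rows = []
    for i, (t, value) in enumerate(covariate_changes):
        stop = (covariate_changes[i + 1][0]
                if i + 1 < len(covariate_changes) else follow_up_end)
        # the event indicator is 1 only on the interval where follow-up ends
        rows.append((t, stop, value, int(event and stop == follow_up_end)))
    return rows

# Hypothetical bear monitored for 36 months; road density around its
# range changes at months 12 and 24; mortality at month 36
rows = counting_process_rows(36, event=True,
                             covariate_changes=[(0, 0.4), (12, 0.9), (24, 1.3)])
for r in rows:
    print(r)
```

Standard Cox software accepts such (start, stop] rows directly, with each row contributing to the partial likelihood only over its own interval.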

11. Prediction of survival for older hospitalized patients: the HELP survival model. Hospitalized Elderly Longitudinal Project.

Science.gov (United States)

Teno, J M; Harrell, F E; Knaus, W; Phillips, R S; Wu, A W; Connors, A; Wenger, N S; Wagner, D; Galanos, A; Desbiens, N A; Lynn, J

2000-05-01

To develop and validate a model estimating the survival time of hospitalized persons aged 80 years and older. A prospective cohort study with mortality follow-up using the National Death Index. Four teaching hospitals in the US. Hospitalized patients enrolled between January 1993 and November 1994 in the Hospitalized Elderly Longitudinal Project (HELP). Patients were excluded if their length of hospital stay was 48 hours or less or if admitted electively for planned surgery. A log-normal model of survival time up to 711 days was developed with the following variables: patient demographics, disease category, nursing home residence, severity of physiologic imbalance, chart documentation of weight loss, current quality of life, exercise capacity, and functional status. We assessed whether model accuracy could be improved by including symptoms of depression or history of recent fall, serum albumin, physician's subjective estimate of prognosis, and physician and patient preferences for general approach to care. A total of 1266 patients were enrolled over a 10-month period (median age 84.9 years, 61% female, 68% with one or more dependencies), and 505 (40%) died during an average follow-up of more than 2 years. Important prognostic factors included the Acute Physiology Score of APACHE III collected on the third hospital day, modified Glasgow coma score, major diagnosis (ICU categories together, congestive heart failure, cancer, orthopedic, and all other), age, activities of daily living, exercise capacity, chart documentation of weight loss, and global quality of life. The Somers' Dxy for a model including these factors was 0.48 (equivalent to a receiver operating characteristic (ROC) curve area of 0.74, suggesting good discrimination). Bootstrap estimation indicated good model validation (corrected Dxy of 0.46, ROC of 0.73). A nomogram based on this log-normal model is presented to facilitate calculation of median survival time and 10th and 90th percentile of survival time. A count of
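
The nomogram calculation the authors describe follows directly from the log-normal model: if log survival time is normal with mean equal to the linear predictor, the median and the 10th/90th percentiles come from the corresponding normal quantiles. The sketch below uses invented coefficients and covariates, not the fitted HELP model.

```python
import math

# Hypothetical log-normal accelerated failure time model:
# log(T) = mu + sigma * Z, Z ~ N(0, 1); mu is the linear predictor.
Z10, Z90 = -1.2816, 1.2816  # standard normal 10th/90th percentiles

def survival_percentiles(mu, sigma):
    """10th percentile, median, and 90th percentile of survival time."""
    median = math.exp(mu)
    return math.exp(mu + Z10 * sigma), median, math.exp(mu + Z90 * sigma)

# Illustrative linear predictor for one patient (coefficients assumed,
# not taken from the HELP model itself): age 85, one ADL dependency
mu = 5.0 - 0.03 * (85 - 80) - 0.5 * 1
lo, med, hi = survival_percentiles(mu, sigma=1.1)
print(round(lo), round(med), round(hi))
```

A nomogram is essentially a graphical reading of this calculation for each covariate combination, which is why it can report median and 10th/90th percentile survival times together.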

12. Applying survival analysis to a large-scale forest inventory for assessment of tree mortality in Minnesota

Science.gov (United States)

C.W. Woodall; P.L. Grambsch; W. Thomas

2005-01-01

Tree mortality has traditionally been assessed in forest inventories through summaries of mortality by location, species, and causal agents. Although these methods have historically constituted the majority of tree mortality summarizations, they have had limited use in assessing mortality trends and dynamics. This study proposed a novel method of applying survival...

13. Introduction of a prediction model to assigning periodontal prognosis based on survival rates.

Science.gov (United States)

Martinez-Canut, Pedro; Alcaraz, Jaime; Alcaraz, Jaime; Alvarez-Novoa, Pablo; Alvarez-Novoa, Carmen; Marcos, Ana; Noguerol, Blas; Noguerol, Fernando; Zabalegui, Ion

2017-09-04

14. Statistical models and methods for reliability and survival analysis

CERN Document Server

Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

2013-01-01

Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

15. Cure fraction estimation from the mixture cure models for grouped survival data.

Science.gov (United States)

Yu, Binbing; Tiwari, Ram C; Cronin, Kathleen A; Feuer, Eric J

2004-06-15

Mixture cure models are usually used to model failure time data with long-term survivors, and have been applied to grouped survival data. The models provide simultaneous estimates of the proportion of patients cured of disease and the distribution of survival times for uncured patients (the latency distribution). However, a crucial issue with mixture cure models is the identifiability of the cure fraction and the parameters of the latency distribution. Cure fraction estimates can be quite sensitive to the choice of latency distribution and the length of follow-up time. In this paper, the sensitivity of parameter estimates under a semi-parametric model and several of the most commonly used parametric models, namely the lognormal, loglogistic, Weibull and generalized Gamma distributions, is explored. The cure fraction estimates from the model with the generalized Gamma distribution are found to be quite robust. A simulation study was carried out to examine the effect of follow-up time and latency distribution specification on cure fraction estimation. The cure models with generalized Gamma latency distribution are applied to population-based survival data for several cancer sites from the Surveillance, Epidemiology and End Results (SEER) Program. Several cautions on the general use of cure models are advised. Copyright 2004 John Wiley & Sons, Ltd.
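
The mixture cure structure the abstract describes combines a cured fraction with a latency distribution for the uncured; a minimal sketch with an assumed Weibull latency (one of the candidate distributions the paper compares, with invented parameters rather than SEER estimates):

```python
import math

def mixture_cure_survival(t, cure_fraction, shape, scale):
    """Population survival S(t) = pi + (1 - pi) * S_u(t) under a mixture
    cure model with a Weibull latency distribution for uncured patients."""
    uncured_survival = math.exp(-((t / scale) ** shape))
    return cure_fraction + (1.0 - cure_fraction) * uncured_survival

# Illustrative parameters: 30% cured, Weibull(shape=1.2, scale=3 years)
# latency for the uncured
for years in (1, 5, 20):
    print(years, round(mixture_cure_survival(years, 0.30, 1.2, 3.0), 3))
# As t grows, survival plateaus at the cure fraction instead of falling to 0
```

The plateau is also where the identifiability problem lives: with short follow-up, the data may not reach the flat part of the curve, so the cure fraction and the latency tail can trade off against each other.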

16. Applying Modeling Tools to Ground System Procedures

Science.gov (United States)

Di Pasquale, Peter

2012-01-01

As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

17. Application of Cox and Parametric Survival Models to Assess Social Determinants of Health Affecting Three-Year Survival of Breast Cancer Patients.

Science.gov (United States)

Mohseny, Maryam; Amanpour, Farzaneh; Mosavi-Jarrahi, Alireza; Jafari, Hossein; Moradi-Joo, Mohammad; Davoudi Monfared, Esmat

2016-01-01

Breast cancer is one of the most common causes of cancer mortality in Iran. Social determinants of health are among the key factors affecting the pathogenesis of diseases. This cross-sectional study aimed to determine the social determinants of breast cancer survival time with parametric and semi-parametric regression models. It was conducted on male and female patients diagnosed with breast cancer presenting to the Cancer Research Center of Shohada-E-Tajrish Hospital from 2006 to 2010. The Cox proportional hazard model and parametric models including the Weibull, log normal and log-logistic models were applied to determine the social determinants of survival time of breast cancer patients. The Akaike information criterion (AIC) was used to assess the best fit. Statistical analysis was performed with STATA (version 11) software. This study was performed on 797 breast cancer patients, aged 25-93 years with a mean age of 54.7 (±11.9) years. In both semi-parametric and parametric models, the three-year survival was related to level of education and municipal district of residence (P<0.05). The AIC suggested that log normal distribution was the best fit for the three-year survival time of breast cancer patients. Social determinants of health such as level of education and municipal district of residence affect the survival of breast cancer cases. Future studies must focus on the effect of childhood social class on the survival times of cancers, which have hitherto only been paid limited attention.
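
The AIC comparison used in the study reduces to a one-line computation once each candidate distribution has been fitted. The sketch below uses invented maximized log-likelihoods to show the mechanics of selecting the best-fitting survival distribution (the study's actual fit favored the log normal).

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical maximized log-likelihoods for candidate survival
# distributions fitted to the same data (values invented for illustration)
fits = {
    "Weibull":      {"loglik": -1204.6, "k": 5},
    "log-normal":   {"loglik": -1198.2, "k": 5},
    "log-logistic": {"loglik": -1201.9, "k": 5},
}
aics = {name: aic(f["loglik"], f["k"]) for name, f in fits.items()}
best = min(aics, key=aics.get)
print(best, round(aics[best], 1))
```

With equal parameter counts the comparison collapses to the log-likelihoods themselves; the penalty term only matters when candidate models differ in complexity.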

18. Applying mechanistic models in bioprocess development

DEFF Research Database (Denmark)

Lencastre Fernandes, Rita; Bodla, Vijaya Krishna; Carlquist, Magnus

2013-01-01

The available knowledge on the mechanisms of a bioprocess system is central to process analytical technology. In this respect, mechanistic modeling has gained renewed attention, since a mechanistic model can provide an excellent summary of available process knowledge. Such a model therefore incorporates process-relevant input (critical process variables)-output (product concentration and product quality attributes) relations. The model therefore has great value in planning experiments, or in determining which critical process variables need to be monitored and controlled tightly. Mechanistic models should be combined with proper model analysis tools, such as uncertainty and sensitivity analysis. When assuming distributed inputs, the resulting uncertainty in the model outputs can be decomposed using sensitivity analysis to determine which input parameters are responsible for the major part...
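
The uncertainty-and-sensitivity step described above can be illustrated with a toy additive model. The first-order sensitivity index Var(E[y|x1])/Var(y) is computed here using the fact that the conditional mean is known in closed form for this invented model; a real mechanistic model would need a sampling-based estimator.

```python
import random
import statistics

random.seed(1)

def model(x1, x2):
    # toy "mechanistic" response driven by two uncertain inputs
    return 3.0 * x1 + 1.0 * x2

# Distributed inputs: both standard normal, so analytically the
# first-order sensitivity index of x1 is 9 / (9 + 1) = 0.9
n = 20000
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
y = [model(a, b) for a, b in zip(x1, x2)]
var_y = statistics.pvariance(y)

# First-order index of x1: Var(E[y | x1]) / Var(y); for this additive
# model E[y | x1] = 3 * x1, so the conditional-mean variance is 9 * Var(x1)
s1 = 9.0 * statistics.pvariance(x1) / var_y
print(round(s1, 2))
```

An index near 0.9 says x1 is responsible for the major part of the output variance, which is exactly the kind of conclusion the authors use to decide which process variables to monitor and control tightly.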

19. Applying the Sport Education Model to Tennis

Science.gov (United States)

Ayvazo, Shiri

2009-01-01

The physical education field abounds with theoretically sound curricular approaches such as fitness education, the skill theme approach, the tactical approach, and sport education. In an era that emphasizes authentic sport experiences, the Sport Education Model includes unique features that set it apart from other curricular models and can be a valuable…

20. Application of Survival Analysis and Multistate Modeling to Understand Animal Behavior: Examples from Guide Dogs

Science.gov (United States)

Asher, Lucy; Harvey, Naomi D.; Green, Martin; England, Gary C. W.

2017-01-01

Epidemiology is the study of patterns of health-related states or events in populations. Statistical models developed for epidemiology could be usefully applied to behavioral states or events. The aim of this study is to present the application of epidemiological statistics to understand animal behavior where discrete outcomes are of interest, using data from guide dogs to illustrate. Specifically, survival analysis and multistate modeling are applied to data on guide dogs, comparing dogs that completed training and qualified as guide dogs to those that were withdrawn from the training program. Survival analysis models the time to (or between) binary events and the probability of an event occurring at or beyond a specified time point. Survival analysis, using a Cox proportional hazards model, was used to examine the time taken to withdraw a dog from training. Sex, breed, and other factors affected time to withdrawal. Bitches were withdrawn faster than dogs; Labradors were withdrawn faster, and Labrador × Golden Retrievers slower, than Golden Retriever × Labradors; and dogs not bred by Guide Dogs were withdrawn faster than those bred by Guide Dogs. Multistate modeling (MSM) can be used as an extension of survival analysis to incorporate more than two discrete events or states. Multistate models were used to investigate transitions from training to qualification as a guide dog or behavioral withdrawal, and from qualification as a guide dog to behavioral withdrawal. Sex, breed (with purebred Labradors and Golden Retrievers differing from F1 crosses), and whether or not a dog was bred by Guide Dogs affected movements between states. We postulate that survival analysis and MSM could be applied to a wide range of behavioral data, and key examples are provided. PMID:28804710

1. Application of Survival Analysis and Multistate Modeling to Understand Animal Behavior: Examples from Guide Dogs.

Science.gov (United States)

Asher, Lucy; Harvey, Naomi D; Green, Martin; England, Gary C W

2017-01-01

Epidemiology is the study of patterns of health-related states or events in populations. Statistical models developed for epidemiology could be usefully applied to behavioral states or events. The aim of this study is to present the application of epidemiological statistics to understand animal behavior where discrete outcomes are of interest, using data from guide dogs to illustrate. Specifically, survival analysis and multistate modeling are applied to data on guide dogs, comparing dogs that completed training and qualified as guide dogs to those that were withdrawn from the training program. Survival analysis models the time to (or between) binary events and the probability of an event occurring at or beyond a specified time point. Survival analysis, using a Cox proportional hazards model, was used to examine the time taken to withdraw a dog from training. Sex, breed, and other factors affected time to withdrawal. Bitches were withdrawn faster than dogs; Labradors were withdrawn faster, and Labrador × Golden Retrievers slower, than Golden Retriever × Labradors; and dogs not bred by Guide Dogs were withdrawn faster than those bred by Guide Dogs. Multistate modeling (MSM) can be used as an extension of survival analysis to incorporate more than two discrete events or states. Multistate models were used to investigate transitions from training to qualification as a guide dog or behavioral withdrawal, and from qualification as a guide dog to behavioral withdrawal. Sex, breed (with purebred Labradors and Golden Retrievers differing from F1 crosses), and whether or not a dog was bred by Guide Dogs affected movements between states. We postulate that survival analysis and MSM could be applied to a wide range of behavioral data, and key examples are provided.
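
The Cox model used in this study builds on the same risk-set logic as the Kaplan-Meier estimator. A minimal product-limit sketch on invented weeks-to-withdrawal data, with dogs that completed training treated as censored:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from (time, event) data, where
    event=1 marks the event (e.g. withdrawal) and 0 marks censoring
    (e.g. a dog that completed training)."""
    order = sorted(zip(times, events))
    n_at_risk = len(order)
    survival, curve = 1.0, []
    i = 0
    while i < len(order):
        t = order[i][0]
        deaths = at_t = 0
        while i < len(order) and order[i][0] == t:
            deaths += order[i][1]
            at_t += 1
            i += 1
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= at_t
    return curve

# Hypothetical weeks-to-withdrawal data; 0 = completed training (censored)
weeks = [5, 8, 8, 12, 16, 20, 20, 24]
event = [1, 1, 0, 1, 0, 1, 0, 0]
for t, s in kaplan_meier(weeks, event):
    print(t, round(s, 3))
```

Censored dogs leave the risk set without dropping the curve, which is the proper accounting for censoring that the authors emphasize.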

2. Application of Survival Analysis and Multistate Modeling to Understand Animal Behavior: Examples from Guide Dogs

Directory of Open Access Journals (Sweden)

Lucy Asher

2017-07-01

Full Text Available Epidemiology is the study of patterns of health-related states or events in populations. Statistical models developed for epidemiology could be usefully applied to behavioral states or events. The aim of this study is to present the application of epidemiological statistics to understand animal behavior where discrete outcomes are of interest, using data from guide dogs to illustrate. Specifically, survival analysis and multistate modeling are applied to data on guide dogs, comparing dogs that completed training and qualified as guide dogs to those that were withdrawn from the training program. Survival analysis models the time to (or between) binary events and the probability of an event occurring at or beyond a specified time point. Survival analysis, using a Cox proportional hazards model, was used to examine the time taken to withdraw a dog from training. Sex, breed, and other factors affected time to withdrawal. Bitches were withdrawn faster than dogs; Labradors were withdrawn faster, and Labrador × Golden Retrievers slower, than Golden Retriever × Labradors; and dogs not bred by Guide Dogs were withdrawn faster than those bred by Guide Dogs. Multistate modeling (MSM) can be used as an extension of survival analysis to incorporate more than two discrete events or states. Multistate models were used to investigate transitions from training to qualification as a guide dog or behavioral withdrawal, and from qualification as a guide dog to behavioral withdrawal. Sex, breed (with purebred Labradors and Golden Retrievers differing from F1 crosses), and whether or not a dog was bred by Guide Dogs affected movements between states. We postulate that survival analysis and MSM could be applied to a wide range of behavioral data, and key examples are provided.
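
The multistate structure can be sketched as a discrete-time Markov chain over the three states named in the abstract. The monthly transition probabilities below are invented for illustration, not estimated from the guide-dog data.

```python
# states: 0 = in training, 1 = qualified guide dog, 2 = withdrawn
P = [
    [0.90, 0.06, 0.04],  # monthly transition probabilities from "training"
    [0.00, 0.99, 0.01],  # qualified dogs can still be withdrawn later
    [0.00, 0.00, 1.00],  # withdrawal is an absorbing state
]

def step(dist, P):
    """One month of evolution of the state-occupancy distribution."""
    return [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

dist = [1.0, 0.0, 0.0]  # every dog starts in training
for _ in range(24):     # two years of monthly steps
    dist = step(dist, P)
print([round(p, 3) for p in dist])
```

Continuous-time multistate models replace the matrix powers with transition intensities, but the interpretation is the same: covariates such as sex and breed act on the rates of moving between states.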

3. REVEAL risk scores applied to riociguat-treated patients in PATENT-2: Impact of changes in risk score on survival.

Science.gov (United States)

Benza, Raymond L; Farber, Harrison W; Frost, Adaani; Ghofrani, Hossein-Ardeschir; Gómez-Sánchez, Miguel A; Langleben, David; Rosenkranz, Stephan; Busse, Dennis; Meier, Christian; Nikkho, Sylvia; Hoeper, Marius M

2017-11-11

The Registry to Evaluate Early and Long-term PAH Disease Management (REVEAL) risk score (RRS) calculator was developed using data derived from the REVEAL registry, and predicts survival in patients with pulmonary arterial hypertension (PAH) based on multiple patient characteristics. Herein we applied the RRS to a pivotal PAH trial database, the 12-week PATENT-1 and open-label PATENT-2 extension studies of riociguat. We examined the effect of riociguat vs placebo on RRS in PATENT-1, and investigated the prognostic implications of change in RRS during PATENT-1 on long-term outcomes in PATENT-2. RRS was calculated post hoc for baseline and Week 12 of PATENT-1, and Week 12 of PATENT-2. Patients were grouped into risk strata by RRS. Kaplan-Meier estimates were made for survival and clinical worsening-free survival in PATENT-2 to evaluate the relationship between RRS in PATENT-1 and long-term outcomes in PATENT-2. A total of 396 patients completed PATENT-1 and participated in PATENT-2. In PATENT-1, riociguat significantly improved RRS (p = 0.031) and risk stratum (p = 0.018) between baseline and Week 12 compared with placebo. RRS at baseline, and at PATENT-1 Week 12, and change in RRS during PATENT-1 were significantly associated with survival (hazard ratios for a 1-point reduction in RRS: 0.675, 0.705 and 0.804, respectively) and clinical worsening-free survival (hazard ratios of 0.736, 0.716 and 0.753, respectively) over 2 years in PATENT-2. RRS at baseline and Week 12, and change in RRS, were significant predictors of both survival and clinical worsening-free survival. These data support the long-term predictive value of the RRS in a controlled study population. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

4. Applied modelling and computing in social science

CERN Document Server

Povh, Janez

2015-01-01

In social science outstanding results are yielded by advanced simulation methods, based on state of the art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extractions from big data, better understanding and predicting of social behaviour and modelling health and environment changes.

5. Applying waste logistics modeling to regional planning

Energy Technology Data Exchange (ETDEWEB)

Holter, G.M.; Khawaja, A.; Shaver, S.R.; Peterson, K.L.

1995-05-01

Waste logistics modeling is a powerful analytical technique that can be used for effective planning of future solid waste storage, treatment, and disposal activities. Proper waste management is essential for preventing unacceptable environmental degradation from ongoing operations, and is also a critical part of any environmental remediation activity. Logistics modeling allows for analysis of alternate scenarios for future waste flowrates and routings, facility schedules, and processing or handling capacities. Such analyses provide an increased understanding of the critical needs for waste storage, treatment, transport, and disposal while there is still adequate lead time to plan accordingly. They also provide a basis for determining the sensitivity of these critical needs to the various system parameters. This paper discusses the application of waste logistics modeling concepts to regional planning. In addition to ongoing efforts to aid in planning for a large industrial complex, the Pacific Northwest Laboratory (PNL) is currently involved in implementing waste logistics modeling as part of the planning process for material recovery and recycling within a multi-city region in the western US.

6. Applying the WEAP Model to Water Resource

DEFF Research Database (Denmark)

Gao, Jingjing; Christensen, Per; Li, Wei

in assessing the effects on water resources using a case study on a Coal Industry Development Plan in an arid region in North Western China. In the case study the WEAP model (Water Evaluation And Planning System) was used to simulate various scenarios using a diversity of technological instruments like irrigation...... thus providing a good basis for an SEA that can support the choice among different alternative scenarios and contribute to adjusting and optimizing the original plan....

7. Joint modelling of longitudinal CEA tumour marker progression and survival data on breast cancer

Science.gov (United States)

Borges, Ana; Sousa, Inês; Castro, Luis

2017-06-01

This work proposes the use of Biostatistics methods to study breast cancer in patients of Braga's Hospital Senology Unit, located in Portugal. The primary motivation is to contribute to the understanding of the progression of breast cancer, within the Portuguese population, using more complex statistical model assumptions than the traditional analysis, taking into account the possible existence of a serial correlation structure within a same subject's observations. We aim to infer which risk factors affect the survival of Braga's Hospital patients diagnosed with breast tumour, whilst also analysing risk factors that affect a tumour marker used in the surveillance of disease progression, the carcinoembryonic antigen (CEA). As the survival and longitudinal processes may be associated, it is important to model these two processes together. Hence, a joint modelling of these two processes was conducted to infer on their association. A data set of 540 patients, along with 50 variables, was collected from medical records of the Hospital. A joint model approach was used to analyse these data. Two different joint models were applied to the same data set, with different parameterizations which give different interpretations to model parameters. These were used by convenience as the ones implemented in R software. Results from the two models were compared. Results from the joint models showed that the longitudinal CEA values were significantly associated with the survival probability of these patients. A comparison between parameter estimates obtained in this analysis and previous independent survival[4] and longitudinal analyses[5][6] led us to conclude that independent analysis produces biased parameter estimates. Hence, an assumption of association between the two processes in a joint model of breast cancer data is necessary. Results indicate that the longitudinal progression of CEA is significantly associated with the probability of survival of these patients.

8. Private healthcare quality: applying a SERVQUAL model.

Science.gov (United States)

Butt, Mohsin Muhammad; de Run, Ernest Cyril

2010-01-01

This paper seeks to develop and test the SERVQUAL model scale for measuring Malaysian private health service quality. The study consists of 340 randomly selected participants visiting a private healthcare facility during a three-month data collection period. Data were analyzed using means, correlations, principal component and confirmatory factor analysis to establish the modified SERVQUAL scale's reliability, underlying dimensionality, and convergent and discriminant validity. Results indicate a moderate negative quality gap for overall Malaysian private healthcare service quality. Results also indicate a moderate negative quality gap on each service quality scale dimension. However, scale development analysis yielded excellent results, which can be used in wider healthcare policy and practice. Respondents were skewed towards a younger population, causing concern that the results might not represent all Malaysian age groups. The study's major contribution is that it offers a way to assess private healthcare service quality. Second, it successfully develops a scale that can be used to measure health service quality in Malaysian contexts.

9. Extensions and applications of the Cox-Aalen survival model.

Science.gov (United States)

Scheike, Thomas H; Zhang, Mei-Jie

2003-12-01

Cox's regression model is the standard regression tool for survival analysis in most applications. Often, however, the model only provides a rough summary of the effect of some covariates. Therefore, if the aim is to give a detailed description of covariate effects and to consequently calculate predicted probabilities, more flexible models are needed. In an earlier article (Scheike and Zhang, 2002, Scandinavian Journal of Statistics 29, 75-88), we suggested a flexible extension of Cox's regression model, which aimed at extending the Cox model only for those covariates where additional flexibility is needed. One important advantage of the suggested approach is that even though covariates are allowed a nonparametric effect, the hassle and difficulty of finding smoothing parameters are avoided. We show how the extended model also leads to simple formulae for predicted probabilities and their standard errors, for example, in the competing risks framework.
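For readers less familiar with the baseline machinery being extended here: Cox's model is fit by maximizing the partial likelihood over risk sets. A minimal single-covariate sketch with invented toy data (Breslow-style risk sets; this is plain Cox, not the Cox-Aalen extension of the paper):

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_partial_likelihood(beta, times, events, x):
    """Negative log partial likelihood for a single covariate."""
    beta = float(np.asarray(beta).ravel()[0])
    eta = beta * x
    ll = 0.0
    for i in np.where(events == 1)[0]:
        risk_set = times >= times[i]        # subjects still at risk at t_i
        ll += eta[i] - np.log(np.sum(np.exp(eta[risk_set])))
    return -ll

# toy data: subjects with x = 1 tend to fail earlier
times = np.array([2., 3., 5., 7., 9., 11.])
events = np.array([1, 1, 1, 1, 0, 1])
x = np.array([1., 1., 1., 0., 0., 0.])

fit = minimize(neg_log_partial_likelihood, x0=[0.0],
               args=(times, events, x), method="BFGS")
beta_hat = fit.x[0]   # positive: x raises the hazard in this toy sample
```

The estimate depends only on the ordering of failures within risk sets, which is exactly the property the gamma frailty record below (record 1 of the second block) identifies as a source of small-sample bias.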

10. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

DEFF Research Database (Denmark)

Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

2014-01-01

A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics although the discussion presented....... The discrete time models used are multivariate variants of the discrete relative risk models. These models allow for regular parametric likelihood-based inference by exploring a coincidence of their likelihood functions and the likelihood functions of suitably defined multivariate generalized linear mixed...... models. The models include a dispersion parameter, which is essential for obtaining a decomposition of the variance of the trait of interest as a sum of parcels representing the additive genetic effects, environmental effects and unspecified sources of variability; as required in quantitative genetic...

12. Survival model construction guided by fit and predictive strength.

Science.gov (United States)

Chauvel, Cécile; O'Quigley, John

2017-06-01

Survival model construction can be guided by goodness-of-fit techniques as well as measures of predictive strength. Here, we aim to bring together these distinct techniques within the context of a single framework. The goal is to best characterize and code the effects of the variables, in particular time dependencies, when taken either singly or in combination with other related covariates. Simple graphical techniques can provide an immediate visual indication as to goodness-of-fit but, in cases of departure from model assumptions, will point in the direction of a more involved and richer alternative model. These techniques appear to be intuitive. This intuition is backed up by formal theorems that underlie the process of building richer models from simpler ones. Measures of predictive strength are used in conjunction with these goodness-of-fit techniques and, again, formal theorems show that these measures can be used to help identify models closest to the unknown non-proportional hazards mechanism that we can suppose generates the observations. Illustrations from studies in breast cancer show how these tools can be of help in guiding the practical problem of efficient model construction for survival data. © 2016, The International Biometric Society.

13. Hidden Markov model for dependent mark loss and survival estimation

Science.gov (United States)

Laake, Jeffrey L.; Johnson, Devin S.; Diefenbach, Duane R.; Ternent, Mark A.

2014-01-01

Mark-recapture estimators assume no loss of marks to provide unbiased estimates of population parameters. We describe a hidden Markov model (HMM) framework that integrates a mark loss model with a Cormack–Jolly–Seber model for survival estimation. Mark loss can be estimated with single-marked animals as long as a sub-sample of animals has a permanent mark. Double-marking provides an estimate of mark loss assuming independence, but dependence can be modeled with a permanently marked sub-sample. We use a log-linear approach to include covariates for mark loss and dependence, which is more flexible than existing published methods for integrated models. The HMM approach is demonstrated with a dataset of black bears (Ursus americanus) with two ear tags, a subset of which were permanently marked with tattoos. The data were analyzed with and without the tattoo. Dropping the tattoos resulted in estimates of survival that were reduced by 0.005–0.035 due to tag loss dependence that could not be modeled. We also analyzed the data with and without the tattoo using a single tag.
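At the core of any HMM analysis is the forward algorithm, which accumulates the likelihood of an observation history over hidden states. A minimal sketch with a hypothetical tag-retention chain (the state structure and all probability values below are invented for illustration, not the black-bear estimates):

```python
import numpy as np

def hmm_forward(init, trans, emit, obs):
    """Forward algorithm: P(observation sequence) under an HMM.

    init  : initial state distribution, shape (S,)
    trans : state transition matrix, shape (S, S), rows sum to 1
    emit  : emission matrix, shape (S, O), rows sum to 1
    obs   : sequence of observed symbol indices
    """
    alpha = init * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]   # propagate, then weight by emission
    return alpha.sum()

# hypothetical 2-state chain: state 0 = mark retained, state 1 = mark lost
init = np.array([1.0, 0.0])
trans = np.array([[0.9, 0.1],     # per-occasion retention / loss
                  [0.0, 1.0]])    # loss is permanent
emit = np.array([[0.7, 0.3],      # P(resighted with mark | state)
                 [0.1, 0.9]])
p = hmm_forward(init, trans, emit, [0, 0, 1])
```

Because each row of `trans` and `emit` is a proper distribution, the probabilities of all possible observation sequences of a given length sum to one, which is a convenient sanity check for any implementation.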

14. Survival of Escherichia coli and Salmonella Typhimurium in slurry applied to clay soil on a Danish swine farm

DEFF Research Database (Denmark)

Boes, J.; Alban, L.; Bagger, J.

2005-01-01

A pilot study was carried out on a Danish swine farm infected with multi-resistant Salmonella Typhimurium DT104 (MRDT104). We aimed to (1) investigate to which degree the decline of Escherichia coli and Salmonella in swine slurry applied to farmland depended on the application method; (2) estimate the survival times of E. coli and Salmonella in the soil surface following deposition of naturally contaminated pig slurry; and (3) simulate survival of Salmonella in different infection levels using E. coli data as input estimates. Slurry was deposited by four different methods: (1) hose applicator on black...... and Salmonella could not be detected at all in soil following treatment 1. Following the other treatments, E. coli was not detected in soil samples after day 21 and Salmonella was no longer detected after day 7. Simulation results showed that clinical (4 log CFU g⁻¹) and sub-clinical Salmonella levels (2500...

15. Modeling the survival kinetics of Salmonella in tree nuts for use in risk assessment.

Science.gov (United States)

Santillana Farakos, Sofia M; Pouillot, Régis; Anderson, Nathan; Johnson, Rhoma; Son, Insook; Van Doren, Jane

2016-06-16

Salmonella has been shown to survive in tree nuts over long periods of time. This survival capacity and its variability are key elements for risk assessment of Salmonella in tree nuts. The aim of this study was to develop a mathematical model to predict survival of Salmonella in tree nuts at ambient storage temperatures that considers variability and uncertainty separately and can easily be incorporated into a risk assessment model. Data on Salmonella survival on raw almonds, pecans, pistachios and walnuts were collected from the peer reviewed literature. The Weibull model was chosen as the baseline model and various fixed effect and mixed effect models were fit to the data. The best model identified through statistical analysis testing was then used to develop a hierarchical Bayesian model. Salmonella in tree nuts showed slow declines at temperatures ranging from 21°C to 24°C. A high degree of variability in survival was observed across tree nut studies reported in the literature. Statistical analysis results indicated that the best applicable model was a mixed effect model that included a fixed and random variation of δ per tree nut (which is the time it takes for the first log10 reduction) and a fixed variation of ρ per tree nut (parameter which defines the shape of the curve). Higher estimated survival rates (δ) were obtained for Salmonella on pistachios, followed in decreasing order by pecans, almonds and walnuts. The posterior distributions obtained from Bayesian inference were used to estimate the variability in the log10 decrease levels in survival for each tree nut, and the uncertainty of these estimates. These modeled uncertainty and variability distributions of the estimates can be used to obtain a complete exposure assessment of Salmonella in tree nuts when including time-temperature parameters for storage and consumption data. The statistical approach presented in this study may be applied to any studies that aim to develop predictive models to be
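The Weibull survival model used in this kind of study expresses the log10 reduction at time t as (t/δ)^ρ, with δ the time to the first log10 reduction and ρ the shape parameter. A minimal sketch of estimating δ and ρ with SciPy (synthetic, noiseless points chosen for illustration, not the tree-nut data):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_log_reduction(t, delta, rho):
    """log10 reduction at time t: (t / delta) ** rho.
    delta = time to first log10 drop; rho < 1 gives tailing, rho > 1 shouldering."""
    return (t / delta) ** rho

# synthetic storage data (e.g. weeks vs log10 reduction), illustration only
t_obs = np.array([1., 5., 10., 20., 40.])
y_obs = weibull_log_reduction(t_obs, delta=8.0, rho=0.7)

popt, _ = curve_fit(weibull_log_reduction, t_obs, y_obs,
                    p0=[5.0, 1.0], bounds=(1e-6, np.inf))
delta_hat, rho_hat = popt
```

In the paper's mixed/hierarchical extension, δ and ρ would additionally carry fixed and random effects per tree nut; the curve above is only the baseline primary model.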

16. Machine learning models in breast cancer survival prediction.

Science.gov (United States)

2016-01-01

Breast cancer is one of the most common cancers, with a high mortality rate among women. With early diagnosis of breast cancer, survival increases from 56% to more than 86%. Therefore, an accurate and reliable system is necessary for the early diagnosis of this cancer. The proposed model is a combination of rules and different machine learning techniques. Machine learning models can help physicians to reduce the number of false decisions. They try to exploit patterns and relationships among a large number of cases and predict the outcome of a disease using historical cases stored in datasets. The objective of this study is to propose a rule-based classification method with machine learning techniques for the prediction of different types of breast cancer survival. We use a dataset with eight attributes that includes the records of 900 patients, of whom 876 (97.3%) were female and 24 (2.7%) were male. Naive Bayes (NB), Trees Random Forest (TRF), 1-Nearest Neighbor (1NN), AdaBoost (AD), Support Vector Machine (SVM), RBF Network (RBFN), and Multilayer Perceptron (MLP) machine learning techniques with 10-fold cross-validation were used with the proposed model for the prediction of breast cancer survival. The performance of the machine learning techniques was evaluated with accuracy, precision, sensitivity, specificity, and area under the ROC curve. Out of 900 patients, 803 were alive and 97 were dead. In this study, the Trees Random Forest (TRF) technique showed better results in comparison to the other techniques (NB, 1NN, AD, SVM, RBFN, MLP). The accuracy, sensitivity and area under the ROC curve of TRF are 96%, 96%, and 93%, respectively. However, the 1NN machine learning technique provided poor performance (accuracy 91%, sensitivity 91% and area under ROC curve 78%). This study demonstrates that the Trees Random Forest model (TRF), a rule-based classification model, was the best model with the highest level of
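The evaluation metrics quoted here (accuracy, sensitivity, specificity) all derive from the confusion matrix. A minimal sketch with invented labels and predictions (not the study's data):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity (recall on positives) and specificity
    from paired 0/1 labels and predictions."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return accuracy, sensitivity, specificity

# invented example: 8 cases with one false negative and one false positive
acc, sens, spec = binary_metrics([1, 1, 1, 1, 0, 0, 0, 0],
                                 [1, 1, 1, 0, 1, 0, 0, 0])
```

With such a skewed outcome (803 alive vs 97 dead), sensitivity and specificity are far more informative than raw accuracy, which is why the study reports all of them.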

17. Computer based prognosis model with dimensionality reduction and validation of attributes for prolonged survival prediction

Directory of Open Access Journals (Sweden)

C.G. Raji

2017-01-01

Full Text Available Medical databases contain large volumes of data about patients and their clinical information. For extracting features and their relationships from a huge database, various data mining techniques need to be employed. As liver transplantation is the curative surgical procedure for patients suffering from end-stage liver disease, predicting the survival rate after liver transplantation has a big impact. Appropriate selection of attributes and methods is necessary for survival prediction. Liver transplantation data with 256 attributes were collected from the 389 attributes of the United Network for Organ Sharing registry for the survival prediction. Initially 59 attributes were filtered manually, and then Principal Component Analysis (PCA) was applied to reduce the dimensionality of the data. After performing PCA, 197 attributes were obtained and they were ranked into 27 strong/relevant attributes. Using association rule mining techniques, the association between the selected attributes was identified and verified. A comparison of rules generated by various association rule mining algorithms before and after PCA was also carried out to affirm the results. The rule mining algorithms used were the Apriori, Treap mining and Tertius algorithms. Among these algorithms, the Treap mining algorithm generated the rules with the highest accuracy. A Multilayer Perceptron model was built for predicting the long-term survival of patients after liver transplantation, which produced highly accurate prediction results. The model performance was compared with a Radial Basis Function model to confirm the accuracy of the liver patients' survival prediction. The top-ranked attributes obtained from rule mining were fed to the models for effective training. This ensures that Treap mining generated associations of high-impact attributes, which in turn made the survival prediction flawless.

18. Estimating true instead of apparent survival using spatial Cormack-Jolly-Seber models

Science.gov (United States)

Schaub, Michael; Royle, J. Andrew

2014-01-01

Survival is often estimated from capture–recapture data using Cormack–Jolly–Seber (CJS) models, where mortality and emigration cannot be distinguished, and the estimated apparent survival probability is the product of the probabilities of true survival and of study area fidelity. Consequently, apparent survival is lower than true survival unless study area fidelity equals one. Underestimation of true survival from capture–recapture data is a main limitation of the method.
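The confounding described above reduces to a one-line identity: apparent survival is the product of true survival and study-area fidelity. A toy numeric illustration (both values invented):

```python
# CJS apparent survival confounds mortality with permanent emigration:
#   phi_apparent = S_true * F
S_true = 0.85          # hypothetical true annual survival
F = 0.90               # hypothetical probability of remaining in the study area
phi_apparent = S_true * F
# phi_apparent underestimates S_true by S_true * (1 - F) whenever F < 1
bias = S_true - phi_apparent
```

The spatial CJS models of the record recover F (and hence S_true) by modelling movement explicitly rather than folding it into mortality.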

19. Learning to Apply Models of Materials While Explaining Their Properties

Science.gov (United States)

Karpin, Tiia; Juuti, Kalle; Lavonen, Jari

2014-01-01

Background: Applying structural models is important to chemistry education at the upper secondary level, but it is considered one of the most difficult topics to learn. Purpose: This study analyses to what extent in designed lessons students learned to apply structural models in explaining the properties and behaviours of various materials.…

20. A flexible alternative to the Cox proportional hazards model for assessing the prognostic accuracy of hospice patient survival.

Directory of Open Access Journals (Sweden)

Full Text Available Prognostic models are often used to estimate the length of patient survival. The Cox proportional hazards model has traditionally been applied to assess the accuracy of prognostic models. However, it may be suboptimal due to its inflexibility in modelling the baseline survival function and when the proportional hazards assumption is violated. The aim of this study was to use internal validation to compare the predictive power of the flexible Royston-Parmar family of survival functions with the Cox proportional hazards model. We applied the Palliative Performance Scale on a dataset of 590 hospice patients at the time of hospice admission. The retrospective data were obtained from the Lifepath Hospice and Palliative Care center in Hillsborough County, Florida, USA. The criteria used to evaluate and compare the models' predictive performance were the explained variation statistic R², the scaled Brier score, and the discrimination slope. The explained variation statistic demonstrated that overall the Royston-Parmar family of survival functions provided a better fit (R² = 0.298; 95% CI: 0.236-0.358) than the Cox model (R² = 0.156; 95% CI: 0.111-0.203). The scaled Brier scores and discrimination slopes were consistently higher under the Royston-Parmar model. Researchers involved in prognosticating patient survival are encouraged to consider the Royston-Parmar model as an alternative to Cox.
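One of the comparison criteria above has a compact definition: the scaled Brier score rescales the mean squared prediction error against a reference model that predicts the overall event rate for everyone, so 0 means "no better than the event rate" and higher is better. A minimal sketch (labels and predicted probabilities invented):

```python
import numpy as np

def scaled_brier(y, p):
    """Scaled Brier score: 1 - BS / BS_ref, where the reference model
    predicts the overall event rate for every subject. Higher is better."""
    y = np.asarray(y, dtype=float)
    p = np.asarray(p, dtype=float)
    bs = np.mean((p - y) ** 2)               # Brier score of the model
    bs_ref = np.mean((y.mean() - y) ** 2)    # Brier score of the null model
    return 1.0 - bs / bs_ref

y = np.array([0, 0, 1, 1, 1, 0])
p_good = np.array([0.1, 0.2, 0.8, 0.9, 0.7, 0.3])   # well-calibrated model
p_null = np.full(6, y.mean())                        # event-rate-only model
```

In the survival setting of the paper, the score is additionally evaluated at a time horizon with censoring weights; the binary version above shows only the core rescaling idea.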

1. Small sample bias in the gamma frailty model for univariate survival.

Science.gov (United States)

Barker, Peter; Henderson, Robin

2005-06-01

The gamma frailty model is a natural extension of the Cox proportional hazards model in survival analysis. Because the frailties are unobserved, an E-M approach is often used for estimation. Such an approach is shown to lead to finite sample underestimation of the frailty variance, with the corresponding regression parameters also being underestimated as a result. For the univariate case, we investigate the source of the bias with simulation studies and a complete enumeration. The rank-based E-M approach, we note, only identifies frailty through the order in which failures occur; additional frailty which is evident in the survival times is ignored, and as a result the frailty variance is underestimated. An adaptation of the standard E-M approach is suggested, whereby the non-parametric Breslow estimate is replaced by a local likelihood formulation for the baseline hazard which allows the survival times themselves to enter the model. Simulations demonstrate that this approach substantially reduces the bias, even at small sample sizes. The method developed is applied to survival data from the North West Regional Leukaemia Register.

2. Tobit regression for modeling mean survival time using data subject to multiple sources of censoring.

Science.gov (United States)

Gong, Qi; Schaubel, Douglas E

2018-01-22

Mean survival time is often of inherent interest in medical and epidemiologic studies. In the presence of censoring and when covariate effects are of interest, Cox regression is the strong default, but mostly due to convenience and familiarity. When survival times are uncensored, covariate effects can be estimated as differences in mean survival through linear regression. Tobit regression can validly be performed through maximum likelihood when the censoring times are fixed (ie, known for each subject, even in cases where the outcome is observed). However, Tobit regression is generally inapplicable when the response is subject to random right censoring. We propose Tobit regression methods based on weighted maximum likelihood which are applicable to survival times subject to both fixed and random censoring times. Under the proposed approach, known right censoring is handled naturally through the Tobit model, with inverse probability of censoring weighting used to overcome random censoring. Essentially, the re-weighted data are intended to represent the data that would have been observed in the absence of random censoring. We develop methods for estimating the Tobit regression parameter, then the population mean survival time. A closed-form large-sample variance estimator is proposed for the regression parameter estimator, with a semiparametric bootstrap standard error estimator derived for the population mean. The proposed methods are easily implementable using standard software. Finite-sample properties are assessed through simulation. The methods are applied to a large cohort of patients wait-listed for kidney transplantation. Copyright © 2018 John Wiley & Sons, Ltd.
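The fixed-censoring Tobit likelihood mentioned here combines the normal density for observed responses with the normal survival function for censored ones. A minimal sketch under fixed censoring only, with simulated data (the paper's IPCW extension for random censoring is not shown):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, y, x, censored):
    """Right-censored Tobit: y* = b0 + b1*x + e, e ~ N(0, s^2);
    censored[i] = 1 means y[i] is only a lower bound on y*[i]."""
    b0, b1, log_s = params
    s = np.exp(log_s)                 # parameterize sigma on the log scale
    mu = b0 + b1 * x
    obs = censored == 0
    ll = norm.logpdf(y[obs], mu[obs], s).sum()        # observed outcomes
    ll += norm.logsf(y[~obs], mu[~obs], s).sum()      # P(y* > censoring time)
    return -ll

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y_star = 1.0 + 2.0 * x + rng.normal(0, 0.5, 200)
c = 2.5                                    # fixed, known right-censoring time
censored = (y_star > c).astype(int)
y = np.minimum(y_star, c)

fit = minimize(tobit_negloglik, x0=[0.0, 0.0, 0.0],
               args=(y, x, censored), method="Nelder-Mead",
               options={"maxiter": 2000})
b0_hat, b1_hat = fit.x[0], fit.x[1]
```

Under random censoring this likelihood alone is biased; the paper's contribution is to re-weight each subject's contribution by the inverse probability of remaining uncensored.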

3. Modelling human myoblasts survival upon xenotransplantation into immunodeficient mouse muscle.

Science.gov (United States)

Praud, Christophe; Vauchez, Karine; Zongo, Pascal; Vilquin, Jean-Thomas

2018-03-15

Cell transplantation has been challenged in several clinical indications of genetic or acquired muscular diseases, but therapeutic success were mitigated. To understand and improve the yields of tissue regeneration, we aimed at modelling the fate of CD56-positive human myoblasts after transplantation. Using immunodeficient severe combined immunodeficiency (SCID) mice as recipients, we assessed the survival, integration and satellite cell niche occupancy of human myoblasts by a triple immunohistochemical labelling of laminin, dystrophin and human lamin A/C. The counts were integrated into a classical mathematical decline equation. After injection, human cells were essentially located in the endomysium, then they disappeared progressively from D0 to D28. The final number of integrated human nuclei was grossly determined at D2 after injection, suggesting that no more efficient fusion between donor myoblasts and host fibers occurs after the resolution of the local damages created by needle insertion. Almost 1% of implanted human cells occupied a satellite-like cell niche. Our mathematical model validated by histological counting provided a reliable quantitative estimate of human myoblast survival and/or incorporation into SCID muscle fibers. Informations brought by histological labelling and this mathematical model are complementary. Copyright © 2018 Elsevier Inc. All rights reserved.
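The "classical mathematical decline equation" referred to is typically an exponential decay, n(t) = n0·exp(-k·t). A minimal curve-fitting sketch (synthetic counts invented for illustration, not the myoblast data):

```python
import numpy as np
from scipy.optimize import curve_fit

def decline(t, n0, k):
    """Classical exponential decline: n(t) = n0 * exp(-k * t)."""
    return n0 * np.exp(-k * t)

# synthetic cell counts at the histology time points (noiseless illustration)
days = np.array([0., 2., 7., 14., 28.])
counts = decline(days, 1000.0, 0.15)

(n0_hat, k_hat), _ = curve_fit(decline, days, counts, p0=[500.0, 0.05])
```

Fitting k to counts at successive time points is what lets the authors conclude that the surviving fraction is essentially fixed by day 2: after that, the fitted decline is shallow.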

4. Analyzing sickness absence with statistical models for survival data

DEFF Research Database (Denmark)

Christensen, Karl Bang; Andersen, Per Kragh; Smith-Hansen, Lars

2007-01-01

OBJECTIVES: Sickness absence is the outcome in many epidemiologic studies and is often based on summary measures such as the number of sickness absences per year. In this study the use of modern statistical methods was examined by making better use of the available information. Since sickness absence data deal with events occurring over time, the use of statistical models for survival data has been reviewed, and the use of frailty models has been proposed for the analysis of such data. METHODS: Three methods for analyzing data on sickness absences were compared using a simulation study...... between the psychosocial work environment and sickness absence were used to illustrate the results. RESULTS: Standard methods were found to underestimate true effect sizes by approximately one-tenth [method i] and one-third [method ii] and to have lower statistical power than frailty models. CONCLUSIONS...

5. Nonlinear Eddy Viscosity Models applied to Wind Turbine Wakes

DEFF Research Database (Denmark)

Laan, van der, Paul Maarten; Sørensen, Niels N.; Réthoré, Pierre-Elouan

2013-01-01

The linear k−ε eddy viscosity model and modified versions of two existing nonlinear eddy viscosity models are applied to single wind turbine wake simulations using a Reynolds Averaged Navier-Stokes code. Results are compared with field wake measurements. The nonlinear models give better results...

6. Apply Functional Modelling to Consequence Analysis in Supervision Systems

DEFF Research Database (Denmark)

Zhang, Xinxin; Lind, Morten; Gola, Giulio

2013-01-01

This paper will first present the purpose and goals of applying functional modelling approach to consequence analysis by adopting Multilevel Flow Modelling (MFM). MFM Models describe a complex system in multiple abstraction levels in both means-end dimension and whole-part dimension. It contains...

7. Teaching students to apply multiple physical modeling methods

NARCIS (Netherlands)

Wiegers, T.; Verlinden, J.C.; Vergeest, J.S.M.

2014-01-01

Design students should be able to explore a variety of shapes before elaborating one particular shape. Current modelling courses don’t address this issue. We developed the course Rapid Modelling, which teaches students to explore multiple shape models in a short time, applying different methods and

8. SU-E-T-131: Artificial Neural Networks Applied to Overall Survival Prediction for Patients with Periampullary Carcinoma

Energy Technology Data Exchange (ETDEWEB)

Gong, Y; Yu, J; Yeung, V; Palmer, J; Yu, Y; Lu, B; Babinsky, L; Burkhart, R; Leiby, B; Siow, V; Lavu, H; Rosato, E; Winter, J; Lewis, N; Sama, A; Mitchell, E; Anne, P; Hurwitz, M; Yeo, C; Bar-Ad, V [Thomas Jefferson University Hospital, Philadelphia, PA (United States); and others

2015-06-15

Purpose: Artificial neural networks (ANN) can be used to discover complex relations within datasets to help with medical decision making. This study aimed to develop an ANN method to predict two-year overall survival of patients with peri-ampullary cancer (PAC) following resection. Methods: Data were collected from 334 patients with PAC following resection, treated in our institutional pancreatic tumor registry between 2006 and 2012. The dataset contains 14 variables including age, gender, T-stage, tumor differentiation, positive-lymph-node ratio, positive resection margins, chemotherapy, radiation therapy, and tumor histology. After censoring for two-year survival analysis, 309 patients were left, of which 44 patients (∼15%) were randomly selected to form the testing set. The remaining 265 cases were randomly divided into a training set (211 cases, ∼80% of 265) and a validation set (54 cases, ∼20% of 265) 20 times to build 20 ANN models. Each ANN has one hidden layer with 5 units. The 20 ANN models were ranked according to their concordance index (c-index) of prediction on the validation sets. To further improve prediction, the top 10% of ANN models were selected, and their outputs averaged for prediction on the testing set. Results: By random division, the 44 cases in the testing set and the remaining 265 cases have approximately equal two-year survival rates, 36.4% and 35.5% respectively. The 20 ANN models, which were trained and validated on the 265 cases, yielded mean c-indexes of 0.59 and 0.63 on the validation sets and the testing set, respectively. The c-index was 0.72 when the two best ANN models (top 10%) were used in prediction on the testing set. The c-index of Cox regression analysis was 0.63. Conclusion: ANN improved survival prediction for patients with PAC. More patient data and further analysis of additional factors may be needed for a more robust model, which will help guide physicians in providing optimal post-operative care. This project was supported by PA CURE Grant.
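The concordance index used above to rank the ANN models measures how often, across usable pairs, the model assigns the higher risk to the subject who fails first. A minimal O(n²) sketch with toy data (not the PAC registry):

```python
def c_index(times, events, scores):
    """Harrell's concordance index: the fraction of usable pairs in which
    the higher risk score belongs to the subject failing earlier.
    A pair (i, j) is usable when subject i has an observed event before t_j."""
    num, den = 0.0, 0.0
    n = len(times)
    for i in range(n):
        if events[i] != 1:
            continue                       # censored subjects cannot anchor a pair
        for j in range(n):
            if times[i] < times[j]:
                den += 1
                if scores[i] > scores[j]:
                    num += 1
                elif scores[i] == scores[j]:
                    num += 0.5             # tied scores count as half
    return num / den

# toy data: risk scores perfectly inverse to survival time
cidx = c_index([2, 4, 6, 8], [1, 1, 1, 0], [4, 3, 2, 1])
```

A c-index of 0.5 corresponds to random ordering and 1.0 to perfect discrimination, which puts the reported values (0.59-0.72) in context.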

9. Comparison of two multiaxial fatigue models applied to dental implants

Directory of Open Access Journals (Sweden)

JM. Ayllon

2015-07-01

Full Text Available This paper presents two multiaxial fatigue life prediction models applied to a commercial dental implant. One model is called Variable Initiation Length Model and takes into account both the crack initiation and propagation phases. The second model combines the Theory of Critical Distance with a critical plane damage model to characterise the initiation and initial propagation of micro/meso cracks in the material. This paper discusses which material properties are necessary for the implementation of these models and how to obtain them in the laboratory from simple test specimens. It also describes the FE models developed for the stress/strain and stress intensity factor characterisation in the implant. The results of applying both life prediction models are compared with experimental results arising from the application of ISO-14801 standard to a commercial dental implant.

10. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

Energy Technology Data Exchange (ETDEWEB)

Taktak, Azzam F G [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Fisher, Anthony C [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Damato, Bertil E [Department of Ophthalmology, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom)

2004-01-07

This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes' theorem was then applied to the ANN output to construct survival probability curves. A validation set of 34 patients unseen by both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100, the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The rms error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2 years, respectively. We concluded that the AI system can match, if not surpass, the clinical expert's predictions. There were significant differences with the CR and KM analyses when the number of records was small, but it was not known which model was more accurate.

11. The HPT Model Applied to a Kayak Company's Registration Process

Science.gov (United States)

Martin, Florence; Hall, Herman A., IV; Blakely, Amanda; Gayford, Matthew C.; Gunter, Erin

2009-01-01

This case study describes the step-by-step application of the traditional human performance technology (HPT) model at a premier kayak company located on the coast of North Carolina. The HPT model was applied to address lost revenues related to three specific business issues: misinformed customers, dissatisfied customers, and guides not showing up…

12. Modeling the long-term kinetics of Salmonella survival on dry pet food.

Science.gov (United States)

Lambertini, Elisabetta; Mishra, Abhinav; Guo, Miao; Cao, Huilin; Buchanan, Robert L; Pradhan, Abani K

2016-09-01

Due to multiple outbreaks and large-scale product recalls, Salmonella has emerged as a priority pathogen in dry pet food and treats. However, little data are available to quantify the risks posed by these classes of products to both pets and their owners. In particular, the kinetics of Salmonella survival on complex pet food matrices are not available. This study measured the long-term kinetics of Salmonella survival on a dry pet food under storage conditions commonly encountered during production, retail, and in households. A Salmonella enterica cocktail of 12 strains isolated from dry pet foods and treats was used to inoculate commercial dry dog food. Salmonella was enumerated on non-selective (BHI) and selective (XLD and BS) media. Results at 570 days indicated an initial, relatively rapid decline (up to 54 days), followed by a much slower extended decline phase. The Weibull model provided a satisfactory fit to the time series of log-transformed Salmonella counts from all three media (δ: mean 4.65 day/log (CFU/g); p: mean 0.364 on BHI). This study provides a survival model that can be applied in quantitative risk assessment models.
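The Weibull decline reported here has the form log10 N(t) = log10 N(0) − (t/δ)^p; with p < 1 the curve drops quickly at first and then flattens, matching the biphasic decline described. A sketch using the abstract's mean BHI parameters and an assumed (hypothetical) initial load:

```python
# Weibull survival model for log-scale microbial decline:
#   log10 N(t) = log10 N0 - (t / delta) ** p
# delta = 4.65 and p = 0.364 are the abstract's reported means on BHI;
# the initial load log10_n0 is an assumption for illustration.

def weibull_log_count(t_days, log10_n0, delta=4.65, p=0.364):
    return log10_n0 - (t_days / delta) ** p

log10_n0 = 8.0  # assumed inoculation level, log10 CFU/g
for t in (0, 1, 54, 570):
    print(t, round(weibull_log_count(t, log10_n0), 2))
```

Because p < 1, the per-day log reduction over the first 54 days is several times larger than over the following 516 days, which is exactly the fast-then-slow pattern the study observed.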

13. Factors relating to poor survival rates of aged cervical cancer patients: a population-based study with the relative survival model in Osaka, Japan.

Science.gov (United States)

Ioka, Akiko; Ito, Yuri; Tsukuma, Hideaki

2009-01-01

Poor survival of older cervical cancer patients has been reported; however, related factors, such as the extent of disease and the competing risk from aging, have not been well evaluated. We applied the relative survival model developed by Dickman et al. to address this issue. Study subjects were cervical cancer patients retrieved from the Osaka Cancer Registry, limited to the 10,048 reported cases diagnosed from 1975 to 1999, based on the quality of data collection on vital status. Age at diagnosis was categorized into 30-54, 55-64, and ≥65 years. The impact of prognostic factors on 5-year survival was evaluated with the relative survival model, incorporating patients' expected survival in multivariate analysis. The age-specific relative excess risk (RER) of death was significantly higher for the older groups than for women aged 30-54 years (RER, 1.58 at 55-64 and 2.51 at ≥65 years). The RER was decreased by 64.8% among the 55-64 year olds after adjustment for cancer stage at diagnosis, and by 43.4% among those 65 years old and over. After adding adjustment for treatment modalities, the RER was no longer significantly higher among the 55-64 year olds; however, it was still higher among those 65 years old and over. Advanced stage at diagnosis was the main determinant of poor survival among the older cervical cancer patients, although other factors, such as limitations on the combination of treatments, were also suggested to have an influence in those aged 65 years and over.

14. LEARNING SEMANTICS-ENHANCED LANGUAGE MODELS APPLIED TO UNSUPERVISED WSD

Energy Technology Data Exchange (ETDEWEB)

VERSPOOR, KARIN [Los Alamos National Laboratory; LIN, SHOU-DE [Los Alamos National Laboratory

2007-01-29

An N-gram language model aims at capturing statistical syntactic word order information from corpora. Although the concept of language models has been applied extensively to handle a variety of NLP problems with reasonable success, the standard model does not incorporate semantic information, and consequently limits its applicability to semantic problems such as word sense disambiguation. We propose a framework that integrates semantic information into the language model schema, allowing a system to exploit both syntactic and semantic information to address NLP problems. Furthermore, acknowledging the limited availability of semantically annotated data, we discuss how the proposed model can be learned without annotated training examples. Finally, we report on a case study showing how the semantics-enhanced language model can be applied to unsupervised word sense disambiguation with promising results.
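For readers unfamiliar with the baseline, a standard N-gram model reduces to smoothed conditional word-order counts. The sketch below is a Laplace-smoothed bigram model on a toy corpus; the paper's semantics-enhanced model additionally folds in sense information, which is not reproduced here:

```python
from collections import Counter

# Minimal bigram language model with add-one (Laplace) smoothing.
corpus = "the bank approved the loan the river bank flooded".split()
vocab = set(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))   # counts of adjacent word pairs
contexts = Counter(corpus[:-1])              # counts of preceding words

def p_bigram(prev, word):
    # Smoothed conditional probability P(word | prev).
    return (bigrams[(prev, word)] + 1) / (contexts[prev] + len(vocab))

print(p_bigram("the", "bank"))      # seen continuation
print(p_bigram("the", "flooded"))   # unseen continuation, nonzero via smoothing
```

The model assigns higher probability to attested word orders while smoothing keeps unseen pairs from having zero probability; it captures syntax-like ordering only, which is the limitation the paper's semantic extension targets.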

15. Modeling in applied sciences a kinetic theory approach

CERN Document Server

Pulvirenti, Mario

2000-01-01

Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous media, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develops the foundations of kinetic models and discusses the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology of the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...

16. Re-evaluating neonatal-age models for ungulates: does model choice affect survival estimates?

Directory of Open Access Journals (Sweden)

Troy W Grovenburg

Full Text Available New-hoof growth is regarded as the most reliable metric for predicting the age of newborn ungulates, but variation in estimated age among the hoof-growth equations that have been developed may affect estimates of survival in staggered-entry models. We used known-age newborns to evaluate variation in age estimates among existing hoof-growth equations and to determine the consequences of that variation for survival estimates. During 2001-2009, we captured and radiocollared 174 newborn (≤24-hrs old) ungulates: 76 white-tailed deer (Odocoileus virginianus) in Minnesota and South Dakota, 61 mule deer (O. hemionus) in California, and 37 pronghorn (Antilocapra americana) in South Dakota. Estimated age of known-age newborns differed among hoof-growth models and varied by >15 days for white-tailed deer, >20 days for mule deer, and >10 days for pronghorn. Accuracy (i.e., the proportion of neonates assigned to the correct age) in aging newborns using published equations ranged from 0.0% to 39.4% in white-tailed deer and 0.0% to 3.3% in mule deer, and was 0.0% for pronghorns. Results of survival modeling indicated that variability in estimates of age-at-capture affected short-term estimates of survival (i.e., 30 days) for white-tailed deer and mule deer, and survival estimates over a longer time frame (i.e., 120 days) for mule deer. Conversely, survival estimates for pronghorn were not affected by estimates of age. Our analyses indicate that modeling survival in daily intervals is too fine a temporal scale when age-at-capture is unknown, given the potential inaccuracies among equations used to estimate the age of neonates. Instead, weekly survival intervals are more appropriate because most models accurately predicted ages within 1 week of the known age. Variation among results of neonatal-age models on short- and long-term estimates of survival for known-age young emphasizes the importance of selecting an appropriate hoof-growth equation and appropriately defining intervals (i.e., weekly).

17. Learning to apply models of materials while explaining their properties

Science.gov (United States)

Karpin, Tiia; Juuti, Kalle; Lavonen, Jari

2014-09-01

Background: Applying structural models is important to chemistry education at the upper secondary level, but it is considered one of the most difficult topics to learn. Purpose: This study analyses to what extent, in the designed lessons, students learned to apply structural models in explaining the properties and behaviours of various materials. Sample: The experimental group comprised 27 Finnish upper secondary school students; the control group included 18 students from the same school. Design and methods: In a quasi-experimental setting, students were guided through predict-observe-explain activities in four practical work situations. It was intended that the structural models would encourage students to learn how to identify and apply appropriate models when predicting and explaining situations. The lessons, organised over a one-week period, began with a teacher's demonstration and continued with student experiments in which they described the properties and behaviours of six household products representing three different materials. Results: Most students in the experimental group learned to apply the models correctly, as demonstrated by post-test scores that were significantly higher than pre-test scores. The control group showed no significant difference between pre- and post-test scores. Conclusions: The findings indicate that the intervention, in which students engage in predict-observe-explain activities while several materials and models are confronted at the same time, had a positive effect on learning outcomes.

18. Applying a realistic evaluation model to occupational safety interventions

DEFF Research Database (Denmark)

Pedersen, Louise Møller

2018-01-01

Background: Recent literature characterizes occupational safety interventions as complex social activities, applied in complex and dynamic social systems. Hence, the actual outcomes of an intervention will vary, depending on the intervention, the implementation process, context, personal...... and qualitative methods. This revised model has, however, not been applied in a real life context. Method: The model is applied in a controlled, four-component, integrated behaviour-based and safety culture-based safety intervention study (2008-2010) in a medium-sized wood manufacturing company. The interventions...... involve the company’s safety committee, safety manager, safety groups and 130 workers. Results: The model provides a framework for more valid evidence of what works within injury prevention. Affective commitment and role behaviour among key actors are identified as crucial for the implementation...

19. Forecast model applied to quality control with autocorrelational data

Directory of Open Access Journals (Sweden)

2013-11-01

Full Text Available This research addresses prediction models applied to industrial processes, in order to check the stability of the process by means of control charts applied to the residuals from linear modeling. The data used for the analysis refer to the moisture content, permeability and green compression strength (RCV) in the green sand molding casting process at Company A, which operates in casting and machining, for which a dynamic multivariate regression model was fitted. As the observations were autocorrelated, it was necessary to seek a mathematical model that produces independent and identically distributed residuals. The models found make it possible to understand the behavior of the variables, assisting in the generation of forecasts and in the monitoring of the referred process. Thus, it can be stated that the moisture content is very unstable compared to the other variables.
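The general recipe described — fit a time-series model to the autocorrelated readings, then chart the residuals, which should be approximately independent and identically distributed — can be sketched with a simple lag-1 (AR(1)) least-squares fit. The readings below are hypothetical, not the paper's moisture data:

```python
import statistics

# Hypothetical autocorrelated moisture-content readings.
x = [3.1, 3.3, 3.0, 3.4, 3.2, 3.6, 3.3, 3.5, 3.4, 3.7, 3.5, 3.8]

# Fit x[t] = intercept + phi * x[t-1] by least squares on lagged pairs.
prev, curr = x[:-1], x[1:]
mp, mc = statistics.mean(prev), statistics.mean(curr)
phi = (sum((p - mp) * (c - mc) for p, c in zip(prev, curr))
       / sum((p - mp) ** 2 for p in prev))
intercept = mc - phi * mp

# Residuals of the fitted model should be approximately i.i.d.,
# so an individuals chart with 3-sigma limits applies to them.
residuals = [c - (intercept + phi * p) for p, c in zip(prev, curr)]
sigma = statistics.stdev(residuals)
out_of_control = [r for r in residuals if abs(r) > 3 * sigma]
print(round(phi, 3), out_of_control)
```

Charting raw autocorrelated data would inflate false alarms; charting the model's residuals restores the independence assumption that Shewhart-style control limits rely on.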

20. Methods for model selection in applied science and engineering.

Energy Technology Data Exchange (ETDEWEB)

Field, Richard V., Jr.

2004-10-01

Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be
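As a concrete instance of likelihood-based selection among candidate models (one of the classical methods the report contrasts with the decision-theoretic approach), the sketch below scores two hypothetical candidate models of the same small dataset with AIC, which trades goodness of fit against model complexity:

```python
import math

# Hypothetical data, roughly y = x**2, and two fixed candidate models.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 1.1, 3.9, 9.2, 15.8]

models = {
    "linear (k=2)":    (lambda x: 4.0 * x - 2.0, 2),
    "quadratic (k=3)": (lambda x: x ** 2, 3),
}

def aic(predict, k):
    # For Gaussian errors, up to an additive constant:
    #   AIC = n * ln(RSS / n) + 2k
    n = len(xs)
    rss = sum((y - predict(x)) ** 2 for x, y in zip(xs, ys))
    return n * math.log(rss / n) + 2 * k

scores = {name: aic(f, k) for name, (f, k) in models.items()}
best = min(scores, key=scores.get)
print(scores, best)
```

The extra parameter of the quadratic model is accepted because its fit improvement outweighs the 2k penalty; with scarce data that penalty can be unreliable, which is the limitation the report raises about classical methods.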

1. Enhancing the Lasso Approach for Developing a Survival Prediction Model Based on Gene Expression Data

Directory of Open Access Journals (Sweden)

Shuhei Kaneko

2015-01-01

Full Text Available In the past decade, researchers in oncology have sought to develop survival prediction models using gene expression data. The least absolute shrinkage and selection operator (lasso) has been widely used to select genes that are truly correlated with a patient's survival. The lasso selects genes for prediction by shrinking a large number of coefficients of the candidate genes towards zero, based on a tuning parameter that is often determined by cross-validation (CV). However, this method can pass over (fail to identify) true positive genes, i.e., it produces false negatives, in certain instances, because the lasso tends to favor the development of a simple prediction model. Here, we attempt to monitor the identification of false negatives by developing a method for estimating the number of true positive (TP) genes for a series of values of the tuning parameter, assuming a mixture distribution for the lasso estimates. Using our developed method, we performed a simulation study to examine its precision in estimating the number of TP genes. Additionally, we applied our method to a real gene expression dataset and found that it was able to identify genes correlated with survival that a CV method was unable to detect.
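The false-negative behaviour described can be seen directly in the lasso's closed form for an orthonormal design, where each least-squares coefficient is soft-thresholded by the tuning parameter: raising the parameter zeroes out progressively more coefficients, including weak true positives. The coefficients below are hypothetical:

```python
# Soft-thresholding: the lasso solution per coefficient under an
# orthonormal design. lam is the tuning parameter.

def soft_threshold(beta, lam):
    if beta > lam:
        return beta - lam
    if beta < -lam:
        return beta + lam
    return 0.0

ols_betas = [2.5, -1.2, 0.8, 0.3, -0.1]   # assume signal in the first three

for lam in (0.2, 1.0):
    shrunk = [soft_threshold(b, lam) for b in ols_betas]
    selected = sum(b != 0.0 for b in shrunk)
    print(lam, shrunk, selected)
```

At the larger tuning value the third coefficient (a "true" signal of 0.8 in this toy setup) is set to zero — a false negative of exactly the kind the paper's TP-estimation method is designed to flag.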

2. Lessons Learned from Applying Safety Culture Maturity Model in Thailand

OpenAIRE

Bordin Vongvitayapirom; Punnamee Sachakamol; Hanna Kropsu-Vehkapera; Pekka Kess

2013-01-01

Purpose – The purpose of this paper is to provide practitioners and researchers with lessons learned from applying a safety culture maturity model in the oil and gas industry in Thailand. It proposes a roadmap to improve safety culture maturity in an organization. Design/methodology/approach – A five-level safety culture maturity model (Hudson's model) was chosen to be applied in an oil and gas company, and a questionnaire survey was conducted with 2,251 employees, or 74% of the target group across the compa...

3. External validation of a prognostic model for predicting survival of cirrhotic patients with refractory ascites.

Science.gov (United States)

Guardiola, Jordi; Baliellas, Carme; Xiol, Xavier; Fernandez Esparrach, Glòria; Ginès, Pere; Ventura, Pere; Vazquez, Santiago

2002-09-01

Cirrhotic patients with refractory ascites (RA) have a poor prognosis, although individual survival varies greatly. A model that could predict survival for patients with RA would be helpful in planning treatment. Moreover, in cases of potential liver transplantation, a model with these characteristics would provide a basis for establishing priorities in organ allocation and for selecting patients for a living donor graft. Recently, we developed a model to predict the survival of patients with RA. The aim of this study was to establish its generalizability for predicting the survival of patients with RA. The model was validated by assessing its performance in an external cohort of patients with RA included in a multicenter, randomized, controlled trial that compared large-volume paracentesis and peritoneovenous shunt. The actual and model-predicted survival of three risk groups of patients, established according to the model, were compared graphically and by means of the one-sample log-rank test. The model provided a very good fit to the survival data of the three risk groups in the validation cohort. We also found good agreement between the survival predicted by the model and the observed survival when patients treated with peritoneovenous shunt and with paracentesis were considered separately. Our survival model can be used to predict the survival of patients with RA and may be a useful tool in clinical decision making, especially in deciding priority for liver transplantation.

4. Survival, transport, and sources of fecal bacteria in streams and survival in land-applied poultry litter in the upper Shoal Creek basin, southwestern Missouri, 2001-2002

Science.gov (United States)

Schumacher, John G.

2003-01-01

five sampling sites along the 5.7-mi study reach of Shoal Creek, but the trends at successive downstream sites were out of phase and could not be explained by simple advection and dispersion. At base-flow conditions, the travel time of bacteria in Shoal Creek along the 5.7-mi reach between State Highway W (site 2) and the MDNR sampling site (site 3) was about 26 hours. Substantial dispersion and dilution occur along the upper 4.1 mi of this reach because of inflows from a number of springs and tributaries and the presence of several long pools and channel meanders. Minimal dispersion and dilution occur along the 1.6-mi reach immediately upstream from the MDNR sampling site. Measurements of fecal bacteria decay in Shoal Creek during July 2001 indicated that about 8 percent of fecal coliform and E. coli bacteria decay each hour, with an average first-order decay constant of 0.084 h-1 (per hour). Results from field test plots indicated that substantial numbers of fecal bacteria present in poultry litter can survive in fields for as much as 8 weeks after application of the litter to the land surface. Median densities of fecal coliform and E. coli in slurry-water samples collected from fields increased from less than 60 col/100 mL before the application of turkey and broiler litter to as much as 420,000 and 290,000 col/100 mL, respectively, after the application of litter. Bacteria densities in the test plots generally decreased in an exponential manner over time, with decay rates ranging from 0.085 to 0.185 d-1 (per day) for fecal coliform and from 0.100 to 0.250 d-1 for E. coli. The apparent survival of significant numbers of fecal bacteria on fields where poultry litter has been applied indicates that runoff from these fields is a potential source of fecal bacteria to nearby streams for many weeks following litter application.
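The reported rates are first-order decay constants, N(t) = N(0)·e^(−kt), so the figures can be cross-checked directly — for example, k = 0.084 h⁻¹ corresponds to the stated loss of about 8 percent per hour:

```python
import math

# In-stream decay: fraction lost per hour for k = 0.084 per hour.
k_stream = 0.084
hourly_loss = 1 - math.exp(-k_stream)
print(round(hourly_loss, 3))   # close to the ~8 percent per hour reported

# Field plots: fecal coliform decay ranged 0.085-0.185 per day. Time for a
# 90% (1-log) reduction at the slow end of that range:
k_field = 0.085
t90 = math.log(10) / k_field
print(round(t90, 1))           # days for a 1-log reduction
```

At the slowest field rate a single log reduction takes roughly four weeks, which is consistent with the abstract's observation that substantial numbers of bacteria persist on fields for as much as 8 weeks.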

5. A generalized mixture model applied to diabetes incidence data.

Science.gov (United States)

Zuanetti, Daiane Aparecida; Milan, Luis Aparecido

2017-07-01

We present a generalization of the usual (independent) mixture model that accommodates a Markovian first-order mixing distribution. We propose the data-driven reversible jump, a Markov chain Monte Carlo (MCMC) procedure, for estimating the a posteriori probability of each model in a model selection procedure and estimating the corresponding parameters. Simulated datasets show excellent performance of the proposed method in convergence, model selection, and precision of parameter estimates. Finally, we apply the proposed method to analyze USA diabetes incidence datasets.

6. Applied data analysis and modeling for energy engineers and scientists

CERN Document Server

Reddy, T Agami

2011-01-01

""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

7. Survival prediction from clinico-genomic models--a comparative study.

Science.gov (United States)

Bøvelstad, Hege M; Nygård, Ståle; Borgan, Ørnulf

2009-12-13

Survival prediction from high-dimensional genomic data is an active field in today's medical research. Most of the proposed prediction methods make use of genomic data alone without considering established clinical covariates that often are available and known to have predictive value. Recent studies suggest that combining clinical and genomic information may improve predictions, but there is a lack of systematic studies on the topic. Also, for the widely used Cox regression model, it is not obvious how to handle such combined models. We propose a way to combine classical clinical covariates with genomic data in a clinico-genomic prediction model based on the Cox regression model. The prediction model is obtained by a simultaneous use of both types of covariates, but applying dimension reduction only to the high-dimensional genomic variables. We describe how this can be done for seven well-known prediction methods: variable selection, unsupervised and supervised principal components regression and partial least squares regression, ridge regression, and the lasso. We further perform a systematic comparison of the performance of prediction models using clinical covariates only, genomic data only, or a combination of the two. The comparison is done using three survival data sets containing both clinical information and microarray gene expression data. Matlab code for the clinico-genomic prediction methods is available at http://www.med.uio.no/imb/stat/bmms/software/clinico-genomic/. Based on our three data sets, the comparison shows that established clinical covariates will often lead to better predictions than what can be obtained from genomic data alone. In the cases where the genomic models are better than the clinical, ridge regression is used for dimension reduction. We also find that the clinico-genomic models tend to outperform the models based on only genomic data. Further, clinico-genomic models and the use of ridge regression gives for all three data sets
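The core idea — penalize only the high-dimensional genomic covariates while leaving established clinical covariates unpenalized — can be sketched in a toy linear regression; the paper does this inside the Cox model, which is not reproduced here. Data, penalty weight, and learning rate are all hypothetical:

```python
# Toy "clinico-genomic" fit: one clinical covariate (unpenalized) and one
# noisy genomic covariate (ridge-penalized), solved by gradient descent.

# y depends strongly on the clinical covariate, weakly on the genomic one.
X_clin = [1.0, 2.0, 3.0, 4.0, 5.0]
X_gen  = [0.2, -0.1, 0.4, 0.0, 0.3]
y      = [2.1, 4.0, 6.2, 7.9, 10.1]

b_clin = b_gen = 0.0
lam, lr = 5.0, 0.01              # ridge penalty applies to b_gen only
for _ in range(5000):
    grad_c = grad_g = 0.0
    for xc, xg, yi in zip(X_clin, X_gen, y):
        err = b_clin * xc + b_gen * xg - yi
        grad_c += err * xc
        grad_g += err * xg
    b_clin -= lr * (grad_c / len(y))
    b_gen  -= lr * (grad_g / len(y) + lam * b_gen)   # penalized update

print(round(b_clin, 2), round(b_gen, 3))
```

The clinical coefficient is estimated essentially unshrunk while the genomic coefficient is pulled toward zero, mirroring the paper's strategy of applying dimension reduction only to the genomic block.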

8. Analyzing the Survival of Colorectal Cancer Patients of Tehran Taleghani Hospital using Non-Mixture Cure Model

Directory of Open Access Journals (Sweden)

Zahra Abdolalian

2016-12-01

Full Text Available Abstract Background: Cure models are survival models for data that include long-term survivors. They are a special type of survival model in which it is assumed that a proportion of subjects will never experience the event, so the survival curve eventually reaches a plateau. Standard survival models are therefore not appropriate, because they do not account for the possibility of cure. The aim of the present research is to apply a non-mixture cure model to analyze the survival of patients with colorectal cancer. Materials and Methods: We studied 232 patients with colorectal cancer who were visited and treated at the Taleghani Hospital Research Center for Gastroenterology and Liver Disease in Tehran. These patients were diagnosed from 1987 to 2012 and followed up until 2013. The effects of age, gender, family history, body mass index and site of infection were studied. Kaplan-Meier estimation and the non-mixture cure model were used for analyzing the data. Results: The ten-year survival rate after diagnosis in the studied patients was 64%. A total of 60 deaths (25.8%) due to colorectal cancer were observed. The mean age at the time of diagnosis was 51.6 years. Based on the non-mixture cure model, age 45-65 years and BMI were significant. Conclusion: When the population is divided into two groups (susceptible and non-susceptible individuals), using the Cox semi-parametric model is not appropriate. Therefore, cure models should be used.
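In the non-mixture (promotion-time) formulation, survival is S(t) = π^F(t) for a cured fraction π and a proper distribution function F, so S(t) plateaus at π as F(t) → 1 — the long-term-survivor behaviour the abstract describes. A sketch with an assumed Weibull F; π, shape, and scale are chosen purely for illustration:

```python
import math

# Non-mixture (promotion-time) cure model: S(t) = pi ** F(t), where pi is the
# assumed cured fraction and F is a Weibull cdf with hypothetical parameters.

def survival(t, pi=0.64, shape=1.2, scale=4.0):
    F = 1.0 - math.exp(-((t / scale) ** shape))   # proper Weibull cdf
    return pi ** F

for t in (1, 5, 10, 50):
    print(t, round(survival(t), 3))
```

Unlike a standard survival model, whose curve tends to zero, this curve levels off at the cure fraction π, which is why ordinary Cox regression is inappropriate when a non-susceptible subgroup exists.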

9. Modeling Organizational Design - Applying A Formalism Model From Theoretical Physics

Directory of Open Access Journals (Sweden)

Robert Fabac

2008-06-01

Full Text Available Modern organizations are exposed to diverse external environmental influences. Currently accepted concepts of organizational design take into account structure, its interaction with strategy, processes, people, etc. Organization design and planning aims to align these key organizational design variables. At the higher conceptual level, however, a completely satisfactory formulation for this alignment does not exist. We develop an approach originating from the application of concepts of theoretical physics to social systems. Under this approach, the allocation of organizational resources is analyzed in terms of social entropy, social free energy and social temperature. This allows us to formalize the dynamic relationship between organizational design variables. In this paper we relate this model to Galbraith's Star Model and also suggest improvements in the procedure of the complex analytical method in organizational design.

10. Remarks on orthotropic elastic models applied to wood

Directory of Open Access Journals (Sweden)

2006-09-01

Full Text Available Wood is generally considered an anisotropic material. In terms of engineering elastic models, wood is usually treated as an orthotropic material. This paper presents an analysis of the two principal anisotropic elastic models that are usually applied to wood. The first, the linear orthotropic model, in which the material axes L (longitudinal), R (radial) and T (tangential) coincide with the Cartesian axes (x, y, z), is the more widely accepted elastic model for wood. The other, the cylindrical orthotropic model, better reflects the growth characteristics of wood but is mathematically more complex to adopt in practical terms. Because of its importance for wood elastic parameters, this paper deals with the influence of fiber orientation in these models through an appropriate transformation of coordinates. As a final result, some examples of the linear model are presented, showing the variation of the elastic moduli, i.e., Young's modulus and shear modulus, with fiber orientation.

11. Applying the knowledge creation model to the management of ...

African Journals Online (AJOL)

user

The contribution of evaluation to socialization and externalization of tacit knowledge: The case of the World Bank. Evaluation, 10(3), 263-283. Mclean, E. R. 2004. Measuring e-commerce success: Applying the DeLone & McLean information systems success model. International Journal of Electronic Commerce, 9(1), 31-47.

12. The limitations of applying rational decision-making models to ...

African Journals Online (AJOL)

The aim of this paper is to show the limitations of rational decision-making models as applied to child spacing and more specifically to the use of modern methods of contraception. In the light of factors known to influence low uptake of child spacing services in other African countries, suggestions are made to explain the ...

13. An applied general equilibrium model for Dutch agribusiness policy analysis

NARCIS (Netherlands)

Peerlings, J.

1993-01-01

The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of

14. Applying the Flipped Classroom Model to English Language Arts Education

Science.gov (United States)

Young, Carl A., Ed.; Moran, Clarice M., Ed.

2017-01-01

The flipped classroom method, particularly when used with digital video, has recently attracted many supporters within the education field. Now more than ever, language arts educators can benefit tremendously from incorporating flipped classroom techniques into their curriculum. "Applying the Flipped Classroom Model to English Language Arts…

15. Knowledge Growth: Applied Models of General and Individual Knowledge Evolution

Science.gov (United States)

Silkina, Galina Iu.; Bakanova, Svetlana A.

2016-01-01

The article considers the mathematical models of the growth and accumulation of scientific and applied knowledge since it is seen as the main potential and key competence of modern companies. The problem is examined on two levels--the growth and evolution of objective knowledge and knowledge evolution of a particular individual. Both processes are…

16. Modeling post-fledging survival of lark buntings in response to ecological and biological factors

Science.gov (United States)

Yackel Adams, A.A.; Skagen, S.K.; Savidge, J.A.

2006-01-01

We evaluated the influences of several ecological, biological, and methodological factors on post-fledging survival of a shortgrass prairie bird, the Lark Bunting (Calamospiza melanocorys). We estimated daily post-fledging survival (n = 206, 82 broods) using radiotelemetry and color bands to track fledglings. Daily survival probabilities were best explained by drought intensity, time in season (quadratic trend), ages ≤3 d post-fledging, and rank given drought intensity. Drought intensity had a strong negative effect on survival. Rank was an important predictor of fledgling survival only during the severe drought of 2002 when the smallest fledglings had lower survival. Recently fledged young (ages ≤3 d post-fledging) undergoing the transition from nest to surrounding habitat experienced markedly lower survival, demonstrating the vulnerable nature of this time period. Survival was greater in mid and late season than early season, corresponding to our assumptions of food availability. Neither mark type nor sex of attending parent influenced survival. The model-averaged product of the 22-d survival calculated using mean rank and median value of time in season was 0.360 ± 0.08 in 2001 and 0.276 ± 0.08 in 2002. Survival estimates that account for age, condition of young, ecological conditions, and other factors are important for parameterization of realistic population models. Biologists using population growth models to elucidate mechanisms of population declines should attempt to obtain species-specific estimates of post-fledging survival rather than use generalized estimates.

17. A mixture model for the joint analysis of latent developmental trajectories and survival

NARCIS (Netherlands)

Klein Entink, R.H.; Fox, J.P.; Hout, A. van den

2011-01-01

A general joint modeling framework is proposed that includes a parametric stratified survival component for continuous time survival data, and a mixture multilevel item response component to model latent developmental trajectories given mixed discrete response data. The joint model is illustrated in

18. Applying Model Based Systems Engineering to NASA's Space Communications Networks

Science.gov (United States)

Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

2013-01-01

System engineering practices for complex systems and networks now require that requirements, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, the networks are geographically distributed, as are their subject matter experts, so the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to an accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based systems engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

19. Agrochemical fate models applied in agricultural areas from Colombia

Science.gov (United States)

Garcia-Santos, Glenda; Yang, Jing; Andreoli, Romano; Binder, Claudia

2010-05-01

The misuse of pesticides in predominantly agricultural catchments can lead to severe problems for humans and the environment. Models are needed for decision making and hot-spot identification, especially in developing countries, where overuse of agrochemicals is common and water quality monitoring at local and regional levels is incipient or lacking. However, the complexity of the water cycle contrasts strongly with the scarce data availability, limiting the analyses, techniques, and models available to researchers. There is therefore a strong need for model simplification that keeps model complexity appropriate while still representing the processes. We have developed a new model, called Westpa-Pest, to improve water quality management of an agricultural catchment located in the highlands of Colombia. Westpa-Pest is based on the fully distributed hydrologic model Wetspa and a pesticide fate module. We applied a multi-criteria analysis for model selection under the conditions and data availability found in the region and compared the candidates with the newly developed Westpa-Pest model. Furthermore, both models were empirically calibrated and validated. The following questions were addressed: i) what are the strengths and weaknesses of the models?, ii) which are the most sensitive parameters of each model?, iii) what happens with uncertainties in soil parameters?, and iv) how sensitive are the transfer coefficients?

20. A general diagnostic model applied to language testing data.

Science.gov (United States)

von Davier, Matthias

2008-11-01

Probabilistic models with one or more latent variables are designed to report on a corresponding number of skills or cognitive attributes. Multidimensional skill profiles offer additional information beyond what a single test score can provide, if the reported skills can be identified and distinguished reliably. Many recent approaches to skill profile models are limited to dichotomous data and have made use of computationally intensive estimation methods such as Markov chain Monte Carlo, since standard maximum likelihood (ML) estimation techniques were deemed infeasible. This paper presents a general diagnostic model (GDM) that can be estimated with standard ML techniques and applies to polytomous response variables as well as to skills with two or more proficiency levels. The paper uses one member of a larger class of diagnostic models, a compensatory diagnostic model for dichotomous and partial credit data. Many well-known models, such as univariate and multivariate versions of the Rasch model and the two-parameter logistic item response theory model, the generalized partial credit model, as well as a variety of skill profile models, are special cases of this GDM. In addition to an introduction to this model, the paper presents a parameter recovery study using simulated data and an application to real data from the field test for TOEFL Internet-based testing.
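As a minimal illustration of the special cases mentioned, the two-parameter logistic IRT model (with the Rasch model as its unit-discrimination case) can be written in a few lines; the GDM itself generalizes these to polytomous responses and multidimensional skill profiles:

```python
import math

def two_pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of a correct response
    given ability theta, item discrimination a, and item difficulty b.
    Setting a = 1 for all items recovers the Rasch model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee whose ability equals the item difficulty responds correctly
# with probability exactly one half.
print(two_pl(0.0, 1.0, 0.0))   # → 0.5
```

Both functions are special cases of the compensatory GDM's response function; the paper's point is that such models can be fit with standard ML rather than MCMC.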

1. Applied systems ecology: models, data, and statistical methods

Energy Technology Data Exchange (ETDEWEB)

Eberhardt, L L

1976-01-01

In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

2. Evaluation of red blood cell labelling methods based on a statistical model for red blood cell survival.

Science.gov (United States)

Korell, Julia; Coulter, Carolyn V; Duffull, Stephen B

2011-12-21

The aim of this work is to compare different labelling methods that are commonly used to estimate the lifespan of red blood cells (RBCs), e.g. in anaemia of renal failure, where the effect of treatment with erythropoietin depends on the lifespan of RBCs. A previously developed model for the survival time of RBCs that accounts for plausible physiological processes of RBC destruction was used to simulate ideal random and cohort labelling methods for RBCs, as well as the flaws associated with these methods (e.g. reuse of label and loss of the label from the surviving RBCs). Random labelling with radioactive chromium and cohort labelling using heavy nitrogen were considered. Blood sampling times were determined for RBC survival studies using both labelling methods by applying the theory of optimal design. It was assessed whether the underlying parameter values of the model are estimable from these studies, and the precision of the parameter estimates was calculated. In theory, parameter estimation would be possible for both types of ideal labelling methods without flaws. However, the flaws associated with random labelling are significant, and not all parameters controlling RBC survival in the model can be estimated with good precision. In contrast, cohort labelling shows good precision in the parameter estimates even in the presence of reuse and prolonged incorporation of the label. A model-based analysis of RBC survival studies is recommended in future to account for limitations in methodology as well as likely causes of RBC destruction. Copyright © 2011 Elsevier Ltd. All rights reserved.

3. Fractional Calculus Model of Electrical Impedance Applied to Human Skin

Science.gov (United States)

Vosika, Zoran B.; Lazovic, Goran M.; Misevic, Gradimir N.; Simic-Krstic, Jovana B.

2013-01-01

Fractional calculus is a mathematical approach dealing with derivatives and integrals of arbitrary and complex orders. Therefore, it adds a new dimension to understanding and describing the basic nature and behavior of complex systems in an improved way. Here we use fractional calculus for modeling the electrical properties of biological systems. We derived a new class of generalized models for electrical impedance and applied them to human skin by fitting experimental data. The primary model introduces new generalizations of: 1) the Weyl fractional derivative operator, 2) the Cole equation, and 3) the Constant Phase Element (CPE). These generalizations are described by a novel equation with a parameter related to remnant memory and four corrected essential parameters. We further generalized the single generalized element by introducing a specific partial sum of a Maclaurin series determined by these parameters. We defined the individual primary model elements and their serial combination models by the appropriate equations and electrical schemes. The Cole equation is a special case of our generalized class of models. Previous bioimpedance data analyses of living systems using the basic Cole and serial Cole models show significant imprecision. Our new class of models considerably improves the quality of fitting, evaluated by mean square errors, for bioimpedance data obtained from human skin. Our models, with new parameters presented in a specific partial sum of a Maclaurin series, also extend the representation, understanding and description of the electrical properties of complex systems in terms of remnant memory effects. PMID:23577065
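The basic Cole equation that the paper's models generalize is compact enough to sketch directly; the parameter values below are illustrative, not fitted skin data:

```python
import math

def cole_impedance(freq_hz, r0, rinf, tau, alpha):
    """Basic Cole impedance model:
        Z(w) = Rinf + (R0 - Rinf) / (1 + (j*w*tau)**alpha),
    the special case of the paper's fractional-calculus model class.
    alpha = 1 reduces it to a single-dispersion Debye relaxation."""
    w = 2.0 * math.pi * freq_hz
    return rinf + (r0 - rinf) / (1.0 + (1j * w * tau) ** alpha)

# Hypothetical parameters: the impedance magnitude relaxes from R0 at
# low frequency toward Rinf at high frequency.
r0, rinf, tau, alpha = 1000.0, 100.0, 1e-4, 0.8
z_low = cole_impedance(1e-3, r0, rinf, tau, alpha)
z_high = cole_impedance(1e9, r0, rinf, tau, alpha)
print(round(abs(z_low)), round(abs(z_high)))
```

The fractional order alpha (the CPE exponent) is what a classical integer-order circuit cannot represent, which is why the paper works in fractional calculus.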

4. Online traffic flow model applying dynamic flow-density relation

CERN Document Server

Kim, Y

2002-01-01

This dissertation describes a new approach of the online traffic flow modelling based on the hydrodynamic traffic flow model and an online process to adapt the flow-density relation dynamically. The new modelling approach was tested based on the real traffic situations in various homogeneous motorway sections and a motorway section with ramps and gave encouraging simulation results. This work is composed of two parts: first the analysis of traffic flow characteristics and second the development of a new online traffic flow model applying these characteristics. For homogeneous motorway sections traffic flow is classified into six different traffic states with different characteristics. Delimitation criteria were developed to separate these states. The hysteresis phenomena were analysed during the transitions between these traffic states. The traffic states and the transitions are represented on a states diagram with the flow axis and the density axis. For motorway sections with ramps the complicated traffic fl...
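A static flow-density relation of the kind the dissertation adapts online can be illustrated with the classical Greenshields relation; the free-flow speed and jam density below are illustrative stand-ins, since the thesis's contribution is precisely to replace such a fixed relation with a dynamically adapted one:

```python
def greenshields_flow(density, v_free=120.0, k_jam=150.0):
    """Static Greenshields flow-density relation
        q(k) = v_free * k * (1 - k / k_jam),
    with density k in veh/km, v_free in km/h; returns flow in veh/h.
    Hypothetical parameter values, not calibrated motorway data."""
    return v_free * density * (1.0 - density / k_jam)

# Flow vanishes at zero density and at jam density, peaking in between
# (capacity at k = k_jam / 2 for this parabola).
capacity = max(greenshields_flow(k) for k in range(0, 151))
print(capacity)  # → 4500.0
```

The six traffic states and their hysteresis loops in the dissertation correspond to regions of (and deviations from) a curve of this general shape.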

5. ESTIMATION OF SURVIVAL FUNCTION BASED ON MODELING OF CENSORING PATTERN

OpenAIRE

Akio, Suzukawa; Nobuhiro, Taneichi; Department of Animal Production and Agricultural Economics, Obihiro University

2000-01-01

The Kaplan-Meier estimator (KM-estimator) is an important tool in the analysis of right censored data. It is a non-parametric estimator of an unknown survival function of a lifetime random variable. The purpose of this paper is to obtain a semi-parametric estimator of the survival function. In many practical data sets there are several patterns of censoring; for example, censoring is apt to occur for larger observable times. Such a pattern can be expressed by a function defined by conditional pr...
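The KM-estimator that the paper takes as its starting point reduces to a product over distinct event times; a minimal pure-Python sketch, run on made-up right-censored data, is:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function S(t).

    times  : observed times (event or censoring time for each subject)
    events : 1 if the event was observed at that time, 0 if censored
    Returns a list of (time, survival) steps at the distinct event times,
    applying S(t) = prod over event times t_i <= t of (1 - d_i / n_i).
    """
    data = sorted(zip(times, events))
    n = len(data)
    surv, steps, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        at_risk = n - i          # subjects still under observation at t
        deaths = 0
        while i < n and data[i][0] == t:
            deaths += data[i][1]
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            steps.append((t, surv))
    return steps

# Toy data: events at t = 1, 2, 4; censored observations at t = 3, 5.
print(kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0]))
```

The paper's semi-parametric estimator modifies this construction by modelling the censoring pattern instead of treating it non-parametrically.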

6. Modeling the effect of temperature on survival rate of Listeria monocytogenes in yogurt.

Science.gov (United States)

Szczawiński, J; Szczawińska, M E; Łobacz, A; Jackowska-Tracz, A

2016-01-01

The aim of the study was to (i) evaluate the behavior of Listeria monocytogenes in a commercially produced yogurt, (ii) determine the survival/inactivation rates of L. monocytogenes during cold storage of yogurt and (iii) generate primary and secondary mathematical models to predict the behavior of these bacteria during storage at different temperatures. The samples of yogurt were inoculated with a mixture of three L. monocytogenes strains and stored at 3, 6, 9, 12 and 15°C for 16 days. The number of listeriae was determined after 0, 1, 2, 3, 5, 7, 9, 12, 14 and 16 days of storage. From each sample a series of decimal dilutions was prepared and plated onto ALOA agar (agar for Listeria according to Ottaviani and Agosti). It was found that the applied temperature and storage time significantly influenced the survival rate of listeriae (p < 0.05). For yogurt stored in the temperature range from 3 to 15°C, however, the polynomial model gave a better fit to the experimental data.
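A primary model of the kind fitted in such studies is, in its simplest log-linear form, log10 N(t) = log10 N0 − t/D, where D is the time for a one-log10 reduction. The least-squares sketch below uses hypothetical counts, not the study's data:

```python
def fit_linear(x, y):
    """Ordinary least squares fit y = a + b*x, used here as a primary
    log-linear inactivation model: log10 N(t) = log10 N0 - t/D."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical counts (log10 CFU/g) declining during storage at one
# fixed temperature; a secondary model would then relate D (or b) to
# the storage temperature.
days  = [0, 2, 4, 6, 8, 10]
log_n = [6.0, 5.6, 5.2, 4.8, 4.4, 4.0]
a, b = fit_linear(days, log_n)
D = -1.0 / b   # days per 1-log10 reduction
print(round(a, 2), round(D, 1))
```

Refitting D at each storage temperature and regressing it on temperature is the usual primary/secondary split the abstract refers to; the study found a polynomial secondary model fit better than a linear one.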

7. In-season retail sales forecasting using survival models | Hattingh ...

African Journals Online (AJOL)

In order to identify products that should be marked down, the Retailer forecasts future sales of new products. With the aim of improving on the Retailer's current sales forecasting method, this study investigates statistical techniques, viz. classical time series analysis (Holt's smoothing method) and survival analysis. Forecasts ...

8. Modeling growth performances, survival, and feed efficiency of four ...

African Journals Online (AJOL)

Survival, feed efficiency and growth performances of four local breeds of chickens in West Cameroon (normally feathered NF, feathered tarsus FT, crested C and naked neck NN,) have been compared from hatch to 16 weeks, to determine which one could be improved by selection. Gompertz equation was used to fit growth ...

9. Applying a Dynamic Resource Supply Model in a Smart Grid

Directory of Open Access Journals (Sweden)

Kaiyu Wan

2014-09-01

Full Text Available Dynamic resource supply is a complex issue to resolve in a cyber-physical system (CPS). In our previous work, a resource model called the dynamic resource supply model (DRSM) was proposed to handle resource specification, management and allocation in CPS. In this paper, we integrate the DRSM with a service-oriented architecture and apply it to a smart grid (SG), one of the most complex CPS examples. We give the detailed design of the SG for electricity charging requests and electricity allocation between plug-in hybrid electric vehicles (PHEV) and the DRSM through the Android system. In the design, we explain a mechanism for electricity consumption with data collection and re-allocation through a ZigBee network, and we verify the correctness of this resource model for the expected electricity allocation.

10. Dynamic Decision Making for Graphical Models Applied to Oil Exploration

CERN Document Server

Martinelli, Gabriele; Hauge, Ragnar

2012-01-01

We present a framework for sequential decision making in problems described by graphical models. The setting is given by dependent discrete random variables with associated costs or revenues. In our examples, the dependent variables are the potential outcomes (oil, gas or dry) when drilling a petroleum well. The goal is to develop an optimal selection strategy that incorporates a chosen utility function within an approximated dynamic programming scheme. We propose and compare different approximations, from simple heuristics to more complex iterative schemes, and we discuss their computational properties. We apply our strategies to oil exploration over multiple prospects modeled by a directed acyclic graph, and to a reservoir drilling decision problem modeled by a Markov random field. The results show that the suggested strategies clearly improve the simpler intuitive constructions, and this is useful when selecting exploration policies.

11. Climate Change and Market Collapse: A Model Applied to Darfur

Directory of Open Access Journals (Sweden)

Ola Olsson

2016-03-01

Full Text Available A recurring argument in the global debate is that climate deterioration is likely to make social conflicts over diminishing natural resources more common in the future. The exact mechanism behind such a development has so far not been successfully characterized in the literature. In this paper, we present a general model of a community populated by farmers and herders who can either divide up land in a market economy or in autarky. The key insight from our model is that decreasing resources can make trade between the two groups collapse, which in turn makes each group’s welfare independent of that of the other. Predictions from the model are then applied to the conflict in Darfur. Our analysis suggests that three decades of drought in the area can at least partially explain the observed disintegration of markets and the subsequent rise of social tensions.

12. Remote sensing applied to numerical modelling. [water resources pollution

Science.gov (United States)

Sengupta, S.; Lee, S. S.; Veziroglu, T. N.; Bland, R.

1975-01-01

Progress and remaining difficulties in the construction of predictive mathematical models of large bodies of water as ecosystems are reviewed. Surface temperature is at present the only variable that can be measured accurately and reliably by remote sensing techniques, but satellite infrared data are of sufficient resolution for macro-scale modeling of oceans and large lakes, and airborne radiometers are useful in meso-scale analysis (of lakes, bays, and thermal plumes). Finite-element and finite-difference techniques applied to the solution of relevant coupled time-dependent nonlinear partial differential equations are compared, and the specific problem of the Biscayne Bay and environs ecosystem is tackled in a finite-differences treatment using the rigid-lid model and a rigid-line grid system.

13. A special case of reduced rank models for identification and modelling of time varying effects in survival analysis.

Science.gov (United States)

Perperoglou, Aris

2016-12-10

Flexible survival models are in need when modelling data from long term follow-up studies. In many cases, the assumption of proportionality imposed by a Cox model will not be valid. Instead, a model that can identify time varying effects of fixed covariates can be used. Although there are several approaches that deal with this problem, it is not always straightforward how to choose which covariates should be modelled having time varying effects and which not. At the same time, it is up to the researcher to define appropriate time functions that describe the dynamic pattern of the effects. In this work, we suggest a model that can deal with both fixed and time varying effects and uses simple hypotheses tests to distinguish which covariates do have dynamic effects. The model is an extension of the parsimonious reduced rank model of rank 1. As such, the number of parameters is kept low, and thus, a flexible set of time functions, such as b-splines, can be used. The basic theory is illustrated along with an efficient fitting algorithm. The proposed method is applied to a dataset of breast cancer patients and compared with a multivariate fractional polynomials approach for modelling time-varying effects. Copyright © 2016 John Wiley & Sons, Ltd.

14. Applying Mechanistic Dam Breach Models to Historic Levee Breaches

Directory of Open Access Journals (Sweden)

Risher Paul

2016-01-01

Full Text Available Hurricane Katrina elevated levee risk in the US national consciousness, motivating agencies to assess and improve their levee risk assessment methodology. Accurate computation of the flood flow magnitude and timing associated with a levee breach remains one of the most difficult and uncertain components of levee risk analysis. Contemporary methods are largely empirical and approximate, introducing substantial uncertainty into the damage and life loss models. Levee breach progressions are often extrapolated to the final width and breach formation time based on limited experience with past breaches or using regression equations developed from a limited database of dam failures. Physically based embankment erosion models could improve levee breach modeling. However, while several mechanistic embankment breach models are available, they were developed for dams, and several aspects of the levee breach problem are distinct, departing from dam breach assumptions. This study applies three embankment models developed for dam breach analysis (DL Breach, HR BREACH, and WinDAM C) to historic levee breaches with observed (or inferred) breach rates, assessing the limitations and applicability of each model to the levee breach problem.

15. Survival of Hendra virus in the environment: modelling the effect of temperature.

Science.gov (United States)

Scanlan, J C; Kung, N Y; Selleck, P W; Field, H E

2015-03-01

Hendra virus (HeV), a highly pathogenic zoonotic paramyxovirus recently emerged from bats, is a major concern to the horse industry in Australia. Previous research has shown that higher temperatures led to lower virus survival rates in the laboratory. We develop a model of survival of HeV in the environment as influenced by temperature. We used 20 years of daily temperature at six locations spanning the geographic range of reported HeV incidents to simulate the temporal and spatial impacts of temperature on HeV survival. At any location, simulated virus survival was greater in winter than in summer, and in any month of the year, survival was higher in higher latitudes. At any location, year-to-year variation in virus survival 24 h post-excretion was substantial and was as large as the difference between locations. Survival was higher in microhabitats with lower than ambient temperature, and when environmental exposure was shorter. The within-year pattern of virus survival mirrored the cumulative within-year occurrence of reported HeV cases, although there were no overall differences in survival in HeV case years and non-case years. The model examines the effect of temperature in isolation; actual virus survivability will reflect the effect of additional environmental factors.
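A temperature-driven survival model of this general shape can be sketched as first-order decay whose rate grows with temperature; the rate constants below are hypothetical illustrations, not the parameters fitted by the study:

```python
import math

def surviving_fraction(hours, temp_c, k_ref=0.01, q=1.1, t_ref=20.0):
    """Illustrative first-order inactivation model: the decay rate k
    increases with temperature by a Q-style multiplier per degree,
        k(T) = k_ref * q**(T - t_ref),
    and the surviving fraction after t hours is exp(-k * t).
    All parameter values are hypothetical, not from Scanlan et al."""
    k = k_ref * q ** (temp_c - t_ref)
    return math.exp(-k * hours)

# Survival 24 h post-excretion is higher in a cool winter microhabitat
# (15 °C) than on a hot summer surface (35 °C), matching the abstract's
# qualitative finding.
print(round(surviving_fraction(24, 15.0), 3),
      round(surviving_fraction(24, 35.0), 3))
```

Driving such a function with 20 years of daily temperatures at each location reproduces the kind of seasonal and latitudinal survival patterns the paper reports.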

16. Estimation of direct effects for survival data by using the Aalen additive hazards model

DEFF Research Database (Denmark)

Martinussen, T.; Vansteelandt, S.; Gerster, M.

2011-01-01

We extend the definition of the controlled direct effect of a point exposure on a survival outcome, other than through some given, time-fixed intermediate variable, to the additive hazard scale. We propose two-stage estimators for this effect when the exposure is dichotomous and randomly assigned...... Aalen's additive regression for the event time, given exposure, intermediate variable and confounders. The second stage involves applying Aalen's additive model, given the exposure alone, to a modified stochastic process (i.e. a modification of the observed counting process based on the first......-stage estimates). We give the large sample properties of the estimator proposed and investigate its small sample properties by Monte Carlo simulation. A real data example is provided for illustration....

17. Inverse geothermal modelling applied to Danish sedimentary basins

Science.gov (United States)

Poulsen, Søren E.; Balling, Niels; Bording, Thue S.; Mathiesen, Anders; Nielsen, Søren B.

2017-10-01

This paper presents a numerical procedure for predicting subsurface temperatures and heat-flow distribution in 3-D using inverse calibration methodology. The procedure is based on a modified version of the groundwater code MODFLOW by taking advantage of the mathematical similarity between confined groundwater flow (Darcy's law) and heat conduction (Fourier's law). Thermal conductivity, heat production and exponential porosity-depth relations are specified separately for the individual geological units of the model domain. The steady-state temperature model includes a model-based transient correction for the long-term palaeoclimatic thermal disturbance of the subsurface temperature regime. Variable model parameters are estimated by inversion of measured borehole temperatures with uncertainties reflecting their quality. The procedure facilitates uncertainty estimation for temperature predictions. The modelling procedure is applied to Danish onshore areas containing deep sedimentary basins. A 3-D voxel-based model, with 14 lithological units from surface to 5000 m depth, was built from digital geological maps derived from combined analyses of reflection seismic lines and borehole information. Matrix thermal conductivity of model lithologies was estimated by inversion of all available deep borehole temperature data and applied together with prescribed background heat flow to derive the 3-D subsurface temperature distribution. Modelled temperatures are found to agree very well with observations. The numerical model was utilized for predicting and contouring temperatures at 2000 and 3000 m depths and for two main geothermal reservoir units, the Gassum (Lower Jurassic-Upper Triassic) and Bunter/Skagerrak (Triassic) reservoirs, both currently utilized for geothermal energy production. Temperature gradients to depths of 2000-3000 m are generally around 25-30 °C km-1, locally up to about 35 °C km-1. Large regions have geothermal reservoirs with characteristic temperatures
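The Darcy/Fourier analogy the procedure exploits can be illustrated in one dimension: for steady conduction with a prescribed basal heat flow, the temperature increment across each cell is dT = q·dz/k, just as head increments follow Darcy's law. The column and parameter values below are hypothetical, not the Danish model:

```python
def steady_temperatures(dz, k_layers, q_basal, t_surface):
    """1-D steady-state conductive geotherm without heat production:
    Fourier's law q = -k * dT/dz with constant heat flow gives a
    temperature step q*dz/k across each cell of conductivity k."""
    temps = [t_surface]
    for k in k_layers:
        temps.append(temps[-1] + q_basal * dz / k)
    return temps

# Hypothetical column: five 1000 m cells, low-conductivity sediments
# (1.5-2.5 W/m/K) over basement (3.0 W/m/K); q = 65 mW/m^2, 8 °C surface.
temps = steady_temperatures(1000.0, [1.5, 2.0, 2.5, 2.5, 3.0], 0.065, 8.0)
print([round(t, 1) for t in temps])
```

The resulting gradients of roughly 22-43 °C per km bracket the 25-30 °C km-1 the paper reports; the 3-D procedure additionally inverts borehole temperatures for the conductivities and adds the palaeoclimatic correction.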

18. Applied economic model development algorithm for electronics company

Directory of Open Access Journals (Sweden)

Mikhailov I.

2017-01-01

Full Text Available The purpose of this paper is to report experience gained in creating practical methods and algorithms that simplify the development of applied decision support systems. It describes an algorithm that is the result of two years of research and more than a year of practical verification. In the testing of electronic components, the moment the contract is concluded is the point at which the greatest managerial mistakes can be made: at this stage it is difficult to achieve a realistic assessment of the time limit and the wage fund for the future work. Creating an estimating model is one way to solve this problem. The article presents an algorithm for creating such models, based on the example of developing an analytical model for estimating the amount of work. The paper lists the algorithm's stages and explains their meaning in terms of the participants' goals. Implementing the algorithm has made it possible to halve the development time of these models and to fulfil management's requirements, and the resulting models have had a significant economic effect. A new set of tasks was identified for further theoretical study.

19. Nature preservation acceptance model applied to tanker oil spill simulations

DEFF Research Database (Denmark)

Friis-Hansen, Peter; Ditlevsen, Ove Dalager

2003-01-01

is exemplified by a study of oil spills due to simulated tanker collisions in the Danish straits. It is found that the distribution of the oil spill volume per spill is well represented by an exponential distribution both in Oeresund and in Great Belt. When applied in the Poisson model, a risk profile reasonably...... close to the standard lognormal profile is obtained. Moreover, based on data pairs (volume, cost) for world wide oil spills it is inferred that the conditional distribution of the costs given the spill volume is well modeled by a lognormal distribution. By unconditioning by the exponential distribution...... of the single oil spill, a risk profile for the costs is obtained that is indistinguishable from the standard lognormal risk profile.Finally the question of formulating a public risk acceptance criterion is addressed following Ditlevsen, and it is argued that a Nature Preservation Willingness Index can...

19. Enhanced PID vs model predictive control applied to BLDC motor

Science.gov (United States)

Gaya, M. S.; Muhammad, Auwal; Aliyu Abdulkadir, Rabiu; Salim, S. N. S.; Madugu, I. S.; Tijjani, Aminu; Aminu Yusuf, Lukman; Dauda Umar, Ibrahim; Khairi, M. T. M.

2018-01-01

Brushless Direct Current (BLDC) motors are multivariable and highly complex nonlinear systems. Variation of internal parameter values with the environment or the reference signal increases the difficulty of controlling the BLDC effectively. Advanced control strategies (like model predictive control) often have to be integrated to satisfy the control requirements; alternatively, enhancing or properly tuning a conventional algorithm can achieve the desired performance. This paper presents a performance comparison of an enhanced PID and Model Predictive Control (MPC) applied to a brushless direct current motor. The simulation results demonstrated that the PSO-PID is slightly better than the PID and MPC in tracking the trajectory of the reference signal. The proposed scheme could be a useful algorithm for the system.
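The conventional PID baseline in such comparisons is short enough to sketch. The gains and the crude first-order "motor" plant below are illustrative, not the paper's tuned PSO-PID values or its BLDC model:

```python
class PID:
    """Textbook discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt.
    Gains are hypothetical, not the tuned values from the paper."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a crude first-order speed plant toward 100 rad/s over 5 s.
pid, speed, dt = PID(0.8, 2.0, 0.01, 0.01), 0.0, 0.01
for _ in range(500):
    u = pid.step(100.0, speed)
    speed += (u - speed) * dt / 0.1   # plant: tau*dv/dt = u - v, tau = 0.1 s
print(round(speed, 1))
```

An enhanced (e.g. PSO-tuned) PID replaces the hand-picked gains with optimized ones; MPC instead solves a constrained prediction problem at every step, which is why the comparison is of interest.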

1. Applying the Canonical Text Services Model to the Coptic SCRIPTORIUM

Directory of Open Access Journals (Sweden)

Bridget Almas

2016-11-01

Full Text Available Coptic SCRIPTORIUM is a platform for interdisciplinary and computational research in Coptic texts and linguistics. The purpose of this project was to research and implement a system of stable identification for the texts and linguistic data objects in Coptic SCRIPTORIUM to facilitate their citation and reuse. We began the project with a preferred solution, the Canonical Text Services URN model, which we validated for suitability for the corpus and compared it to other approaches, including HTTP URLs and Handles. The process of applying the CTS model to Coptic SCRIPTORIUM required an in-depth analysis that took into account the domain-specific scholarly research and citation practices, the structure of the textual data, and the data management workflow.

2. Estimation of Survival Probabilities for Use in Cost-effectiveness Analyses: A Comparison of a Multi-state Modeling Survival Analysis Approach with Partitioned Survival and Markov Decision-Analytic Modeling.

Science.gov (United States)

Williams, Claire; Lewsey, James D; Mackay, Daniel F; Briggs, Andrew H

2017-05-01

Modeling of clinical-effectiveness in a cost-effectiveness analysis typically involves some form of partitioned survival or Markov decision-analytic modeling. The health states progression-free, progression and death and the transitions between them are frequently of interest. With partitioned survival, progression is not modeled directly as a state; instead, time in that state is derived from the difference in area between the overall survival and the progression-free survival curves. With Markov decision-analytic modeling, a priori assumptions are often made with regard to the transitions rather than using the individual patient data directly to model them. This article compares a multi-state modeling survival regression approach to these two common methods. As a case study, we use a trial comparing rituximab in combination with fludarabine and cyclophosphamide v. fludarabine and cyclophosphamide alone for the first-line treatment of chronic lymphocytic leukemia. We calculated mean Life Years and QALYs that involved extrapolation of survival outcomes in the trial. We adapted an existing multi-state modeling approach to incorporate parametric distributions for transition hazards, to allow extrapolation. The comparison showed that, due to the different assumptions used in the different approaches, a discrepancy in results was evident. The partitioned survival and Markov decision-analytic modeling deemed the treatment cost-effective with ICERs of just over £16,000 and £13,000, respectively. However, the results with the multi-state modeling were less conclusive, with an ICER of just over £29,000. This work has illustrated that it is imperative to check whether assumptions are realistic, as different model choices can influence clinical and cost-effectiveness results.
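The partitioned-survival derivation described above (time in the progression state as the area between the overall-survival and progression-free-survival curves) can be sketched numerically. The exponential curve shapes and rates below are illustrative assumptions, not the trial's data:

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal area under a sampled curve (version-agnostic helper)."""
    return float(np.sum((y[:-1] + y[1:]) * np.diff(x) / 2.0))

# Illustrative exponential survival curves (rates are assumptions, not trial data)
t = np.linspace(0.0, 10.0, 1001)      # years
os_curve = np.exp(-0.10 * t)          # overall survival S_OS(t)
pfs_curve = np.exp(-0.25 * t)         # progression-free survival S_PFS(t)

# Partitioned survival: expected years in each state come from areas under curves
years_progression_free = trapezoid(pfs_curve, t)
years_alive = trapezoid(os_curve, t)
years_in_progression = years_alive - years_progression_free
```

Note that nothing here models the progression state directly, which is exactly the assumption the multi-state approach relaxes.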

3. Jointly modeling longitudinal proportional data and survival times with an application to the quality of life data in a breast cancer trial.

Science.gov (United States)

Song, Hui; Peng, Yingwei; Tu, Dongsheng

2017-04-01

Motivated by the joint analysis of longitudinal quality of life data and recurrence free survival times from a cancer clinical trial, we present in this paper two approaches to jointly model the longitudinal proportional measurements, which are confined in a finite interval, and survival data. Both approaches assume a proportional hazards model for the survival times. For the longitudinal component, the first approach applies the classical linear mixed model to logit transformed responses, while the second approach directly models the responses using a simplex distribution. A semiparametric method based on a penalized joint likelihood generated by the Laplace approximation is derived to fit the joint model defined by the second approach. The proposed procedures are evaluated in a simulation study and applied to the analysis of the breast cancer data that motivated this research.

4. Prognostic Model for Survival in Patients With Early Stage Cervical Cancer

NARCIS (Netherlands)

Biewenga, Petra; van der Velden, Jacobus; Mol, Ben Willem J.; Stalpers, Lukas J. A.; Schilthuis, Marten S.; van der Steeg, Jan Willem; Burger, Matthé P. M.; Buist, Marrije R.

2011-01-01

BACKGROUND: In the management of early stage cervical cancer, knowledge about the prognosis is critical. Although many factors have an impact on survival, their relative importance remains controversial. This study aims to develop a prognostic model for survival in early stage cervical cancer.

5. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

Science.gov (United States)

Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

2016-01-01

One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web-based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user-friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.

6. Risk matrix model applied to the outsourcing of logistics' activities

Directory of Open Access Journals (Sweden)

2015-09-01

Full Text Available Purpose: This paper proposes the application of the risk matrix model in the field of logistics outsourcing. Such an application can serve as the basis for decision making regarding the conduct of risk management in the logistics outsourcing process and allow its prevention. Design/methodology/approach: This study is based on the risk management of logistics outsourcing in the retail sector in Morocco. The authors identify all possible risks and then classify and prioritize them using the Risk Matrix Model. Finally, we have come to four possible decisions for the identified risks. The analysis was made possible through interviews and discussions with the heads of departments and agents who are directly involved in each outsourced activity. Findings: It is possible to improve the risk matrix model by proposing more personalized prevention measures according to each company that operates in mass-market retailing. Originality/value: This study is the only one made in the process of logistics outsourcing in the retail sector in Morocco, through Label’vie as a case study. First, we identified as thoroughly as we could all possible risks; then we applied the Risk Matrix Model to sort them in ascending order of importance and criticality. As a result, we could hand the decision-makers a mapping for effective control of risks and better guidance of the risk management process.

7. Model output statistics applied to wind power prediction

Energy Technology Data Exchange (ETDEWEB)

Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

1999-03-01

Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil fuelled power plants and a better position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data is available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time-variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: Extended Kalman Filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
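Of the estimation techniques compared above, recursive least squares is the simplest to sketch. Below, an RLS filter with a forgetting factor tracks the coefficients of a hypothetical MOS correction; the regression structure, coefficients, and noise level are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical MOS correction: observed power error regressed on [1, NWP wind speed]
n = 500
theta_true = np.array([1.5, 0.8])                   # illustrative coefficients
X = np.column_stack([np.ones(n), rng.uniform(0.0, 20.0, n)])
y = X @ theta_true + rng.normal(0.0, 0.5, n)        # noisy observations

# Recursive least squares with forgetting factor lam, which discounts old data
# so the estimate can adapt to slow drift (e.g. occasional NWP model changes).
lam = 0.99
theta = np.zeros(2)
P = np.eye(2) * 1000.0
for x_t, y_t in zip(X, y):
    k = P @ x_t / (lam + x_t @ P @ x_t)             # gain vector
    theta = theta + k * (y_t - x_t @ theta)         # innovation update
    P = (P - np.outer(k, x_t @ P)) / lam            # covariance update
# theta now tracks theta_true closely
```

The forgetting factor trades estimation variance against tracking speed; lam = 1 recovers ordinary recursive least squares with infinite memory.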

8. Applying the model of excellence in dental healthcare

Directory of Open Access Journals (Sweden)

Tekić Jasmina

2015-01-01

Full Text Available Introduction. Models of excellence are considered a practical tool in the field of management that should help a variety of organizations, including dental, to carry out the measurement of the quality of provided services, and so define their position in relation to excellence. The quality of healthcare implies the degree within which the system of healthcare and health services increases the likelihood of positive treatment outcome. Objective. The aim of the present study was to define a model of excellence in the field of dental healthcare (DHC) in the Republic of Serbia and suggest the model of DHC whose services will have the characteristics of outstanding service in the dental practice. Methods. In this study a specially designed questionnaire was used for the assessment of the maturity level of applied management regarding quality in healthcare organizations of the Republic of Serbia. The questionnaire consists of 13 units and a total of 240 questions. Results. The results of the study were discussed involving four areas: (1) defining the main criteria and sub-criteria, (2) the elements of excellence of DHC in the Republic of Serbia, (3) the quality of DHC in the Republic of Serbia, and (4) defining the framework of the model of excellence for the DHC in the Republic of Serbia. The main criteria which defined the framework and implementation model of excellence in the field of DHC in Serbia were: leadership, management, human resources, policy and strategy, other resources, processes, patients’ satisfaction, employees’ satisfaction, impact on society and business results. The model had two main parts: the possibilities for the first five criteria and options for the other four criteria. Conclusion. Excellence in DHC business as well as the excellence of provided dental services are increasingly becoming the norm and good practice, and progressively less the exception.

9. Individual patient data meta-analysis of survival data using Poisson regression models

Directory of Open Access Journals (Sweden)

Crowther Michael J

2012-03-01

Full Text Available Abstract Background An Individual Patient Data (IPD) meta-analysis is often considered the gold-standard for synthesising survival data from clinical trials. An IPD meta-analysis can be achieved by either a two-stage or a one-stage approach, depending on whether the trials are analysed separately or simultaneously. A range of one-stage hierarchical Cox models have been previously proposed, but these are known to be computationally intensive and are not currently available in all standard statistical software. We describe an alternative approach using Poisson based Generalised Linear Models (GLMs). Methods We illustrate, through application and simulation, the Poisson approach both classically and in a Bayesian framework, in two-stage and one-stage approaches. We outline the benefits of our one-stage approach through extension to modelling treatment-covariate interactions and non-proportional hazards. Ten trials of hypertension treatment, with all-cause death the outcome of interest, are used to apply and assess the approach. Results We show that the Poisson approach obtains almost identical estimates to the Cox model, is additionally computationally efficient and directly estimates the baseline hazard. Some downward bias is observed in classical estimates of the heterogeneity in the treatment effect, with improved performance from the Bayesian approach. Conclusion Our approach provides a highly flexible and computationally efficient framework, available in all standard statistical software, to the investigation of not only heterogeneity, but the presence of non-proportional hazards and treatment effect modifiers.

10. Individual patient data meta-analysis of survival data using Poisson regression models.

Science.gov (United States)

Crowther, Michael J; Riley, Richard D; Staessen, Jan A; Wang, Jiguang; Gueyffier, Francois; Lambert, Paul C

2012-03-23

An Individual Patient Data (IPD) meta-analysis is often considered the gold-standard for synthesising survival data from clinical trials. An IPD meta-analysis can be achieved by either a two-stage or a one-stage approach, depending on whether the trials are analysed separately or simultaneously. A range of one-stage hierarchical Cox models have been previously proposed, but these are known to be computationally intensive and are not currently available in all standard statistical software. We describe an alternative approach using Poisson based Generalised Linear Models (GLMs). We illustrate, through application and simulation, the Poisson approach both classically and in a Bayesian framework, in two-stage and one-stage approaches. We outline the benefits of our one-stage approach through extension to modelling treatment-covariate interactions and non-proportional hazards. Ten trials of hypertension treatment, with all-cause death the outcome of interest, are used to apply and assess the approach. We show that the Poisson approach obtains almost identical estimates to the Cox model, is additionally computationally efficient and directly estimates the baseline hazard. Some downward bias is observed in classical estimates of the heterogeneity in the treatment effect, with improved performance from the Bayesian approach. Our approach provides a highly flexible and computationally efficient framework, available in all standard statistical software, to the investigation of not only heterogeneity, but the presence of non-proportional hazards and treatment effect modifiers.
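The intuition behind the Poisson formulation can be conveyed in the simplest case of a single time interval with a constant hazard, where a Poisson GLM with a log person-time offset has closed-form estimates. The two-arm trial below is simulated with illustrative hazards (the actual method splits follow-up into multiple intervals to approximate a flexible baseline hazard):

```python
import math, random

random.seed(1)

def simulate_arm(n, hazard, censor_time=5.0):
    """Events and person-time for one arm with exponential event times."""
    events, person_time = 0, 0.0
    for _ in range(n):
        t = random.expovariate(hazard)
        if t < censor_time:
            events += 1
            person_time += t
        else:
            person_time += censor_time      # administratively censored
    return events, person_time

d0, t0 = simulate_arm(2000, 0.30)           # control arm (hazard 0.30/yr)
d1, t1 = simulate_arm(2000, 0.15)           # treatment arm (hazard 0.15/yr)

# For a single interval with constant hazard, the Poisson GLM with
# offset log(person-time) has closed-form maximum likelihood estimates:
log_hr = math.log(d1 / t1) - math.log(d0 / t0)  # log hazard ratio
hazard_ratio = math.exp(log_hr)                 # close to the true ratio 0.5
```

Splitting follow-up into many intervals, each with its own rate, is what lets the Poisson GLM reproduce Cox-like estimates while also returning the baseline hazard directly.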

11. Applying Atmospheric Measurements to Constrain Parameters of Terrestrial Source Models

Science.gov (United States)

Hyer, E. J.; Kasischke, E. S.; Allen, D. J.

2004-12-01

Quantitative inversions of atmospheric measurements have been widely applied to constrain atmospheric budgets of a range of trace gases. Experiments of this type have revealed persistent discrepancies between 'bottom-up' and 'top-down' estimates of source magnitudes. The most common atmospheric inversion uses the absolute magnitude as the sole parameter for each source, and returns the optimal value of that parameter. In order for atmospheric measurements to be useful for improving 'bottom-up' models of terrestrial sources, information about other properties of the sources must be extracted. As the density and quality of atmospheric trace gas measurements improve, examination of higher-order properties of trace gas sources should become possible. Our model of boreal forest fire emissions is parameterized to permit flexible examination of the key uncertainties in this source. Using output from this model together with the UM CTM, we examined the sensitivity of CO concentration measurements made by the MOPITT instrument to various uncertainties in the boreal source: geographic distribution of burned area, fire type (crown fires vs. surface fires), and fuel consumption in above-ground and ground-layer fuels. Our results indicate that carefully designed inversion experiments have the potential to help constrain not only the absolute magnitudes of terrestrial sources, but also the key uncertainties associated with 'bottom-up' estimates of those sources.

12. Linear model applied to the evaluation of pharmaceutical stability data

Directory of Open Access Journals (Sweden)

Renato Cesar Souza

2013-09-01

Full Text Available The expiry date on the packaging of a product gives the consumer the confidence that the product will retain its identity, content, quality and purity throughout the period of validity of the drug. The definition of this term in the pharmaceutical industry is based on stability data obtained during the product registration. By the above, this work aims to apply the linear regression according to the guideline ICH Q1E, 2003, to evaluate some aspects of a product undergoing in a registration phase in Brazil. With this propose, the evaluation was realized with the development center of a multinational company in Brazil, with samples of three different batches composed by two active principal ingredients in two different packages. Based on the preliminary results obtained, it was possible to observe the difference of degradation tendency of the product in two different packages and the relationship between the variables studied, added knowledge so new models of linear equations can be applied and developed for other products.

13. Applying threshold models to donations to a green electricity fund

Energy Technology Data Exchange (ETDEWEB)

Ito, Nobuyuki; Takeuchi, Kenji [Graduate School of Economics, Kobe University (Japan); Tsuge, Takahiro [Faculty of Economics, Konan University (Japan); Kishimoto, Atsuo [National Institute of Advanced Industrial Science and Technology (Japan)

2010-04-15

This study applies a previously proposed threshold model to analyze the diffusion process of donating behavior for renewable energy. We first use a stated preference survey to estimate the determinants of a decision to support the donation scheme under various predicted participation rates. Using the estimated coefficients, we simulate how herd behavior spreads and the participation rate reaches equilibrium. The participation rate at equilibrium is estimated as 37.88% when the suggested donation is 500 yen, while it is 17.76% when the suggested amount is 1000 yen. The influence of environmentalism and altruism is also examined, and we find that these motivations increase the participation rate by 31.51% on average. (author)
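The fixed-point logic of a threshold model can be sketched as follows; the threshold distribution here is entirely hypothetical (the study instead estimates individual decisions from stated-preference data):

```python
import random

random.seed(0)

# Hypothetical threshold mixture: 10% donate unconditionally, 40% never donate,
# and the rest donate once the expected participation rate reaches a personal
# threshold drawn uniformly from [0, 1].
def draw_threshold():
    u = random.random()
    if u < 0.10:
        return 0.0      # unconditional donor
    if u < 0.50:
        return 2.0      # never donates (threshold can never be reached)
    return random.random()

thresholds = [draw_threshold() for _ in range(10_000)]

def participation(expected_rate):
    """Share of the population whose threshold is met at a given expected rate."""
    return sum(th <= expected_rate for th in thresholds) / len(thresholds)

# Iterate the expected rate until it reproduces itself (the equilibrium).
rate = 0.5
for _ in range(200):
    rate = participation(rate)
# For this mixture the equilibrium sits near 0.20.
```

The equilibrium is the point where the participation rate generated by individual thresholds equals the rate people expected, which is the herd-behavior mechanism the abstract describes.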

14. The development of simple survival prediction models for blunt trauma victims treated at Asian emergency centers.

Science.gov (United States)

Kimura, Akio; Nakahara, Shinji; Chadbunchachai, Witaya

2012-02-02

For real-time assessment of the probability of survival (Ps) of blunt trauma victims at emergency centers, this study aimed to establish regression models for estimating Ps using simplified coefficients. The data of 10,210 blunt trauma patients not missing both the binary outcome data about survival and the data necessary for Ps calculation by The Trauma and Injury Severity Score (TRISS) method were extracted from the Japan Trauma Data Bank (2004-2007) and analyzed. Half (5,113) of the data was allocated to a derivation data set, with the other half (5,097) allocated to a validation data set. The data of 6,407 blunt trauma victims from the trauma registry of Khon Kaen Regional Hospital in Thailand were analyzed for validation. The logistic regression models included age, the Injury Severity Score (ISS), the Glasgow Coma Scale score (GCS), systolic blood pressure (SBP), respiratory rate (RR), and their coded values (cAGE, 0-1; cISS, 0-4; cSBP, 0-4; cGCS, 0-4; cRR, 0-4) as predictor variables. The coefficients were simplified by rounding off after the decimal point or choosing 0.5 if the coefficients varied across 0.5. The area under the receiver-operating characteristic curve (AUROCC) was calculated for each model to measure discriminant ability. A group of formulas (log(Ps/(1-Ps)) = logit(Ps) = -9 + cISS - cAGE + cSBP + cGCS + cRR/2, where -9 becomes -7 if the predictor variable cRR or cISS is missing) was developed. Using these formulas, the AUROCCs were between 0.950 and 0.964. When these models were applied to the Khon Kaen data, their AUROCCs were greater than 0.91. These equations allow physicians to perform real-time assessments of survival by easy mental calculations at Asian emergency centers, which are overcrowded with blunt injury victims of traffic accidents. © 2012 Kimura et al; licensee BioMed Central Ltd.
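The simplified formula from this study is designed for mental arithmetic; as a sanity check, here is a direct transcription in code (the example coded inputs are illustrative, not patient data):

```python
import math

def survival_probability(c_iss, c_age, c_sbp, c_gcs, c_rr, rr_or_iss_missing=False):
    """Ps from the simplified coded predictors, per
    logit(Ps) = -9 + cISS - cAGE + cSBP + cGCS + cRR/2,
    with the intercept -9 replaced by -7 when cRR or cISS is missing."""
    intercept = -7.0 if rr_or_iss_missing else -9.0
    logit = intercept + c_iss - c_age + c_sbp + c_gcs + c_rr / 2.0
    return 1.0 / (1.0 + math.exp(-logit))

# Illustrative best- and worst-coded patients
ps_best = survival_probability(c_iss=4, c_age=0, c_sbp=4, c_gcs=4, c_rr=4)
ps_worst = survival_probability(c_iss=0, c_age=1, c_sbp=0, c_gcs=0, c_rr=0)
```

With the best coded values the logit is 5 (Ps above 99%), and with the worst it is -10, so the simplified integer coefficients still span nearly the full probability range.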

15. The development of simple survival prediction models for blunt trauma victims treated at Asian emergency centers

Directory of Open Access Journals (Sweden)

Kimura Akio

2012-02-01

Full Text Available Abstract Background For real-time assessment of the probability of survival (Ps) of blunt trauma victims at emergency centers, this study aimed to establish regression models for estimating Ps using simplified coefficients. Methods The data of 10,210 blunt trauma patients not missing both the binary outcome data about survival and the data necessary for Ps calculation by The Trauma and Injury Severity Score (TRISS) method were extracted from the Japan Trauma Data Bank (2004-2007) and analyzed. Half (5,113) of the data was allocated to a derivation data set, with the other half (5,097) allocated to a validation data set. The data of 6,407 blunt trauma victims from the trauma registry of Khon Kaen Regional Hospital in Thailand were analyzed for validation. The logistic regression models included age, the Injury Severity Score (ISS), the Glasgow Coma Scale score (GCS), systolic blood pressure (SBP), respiratory rate (RR), and their coded values (cAGE, 0-1; cISS, 0-4; cSBP, 0-4; cGCS, 0-4; cRR, 0-4) as predictor variables. The coefficients were simplified by rounding off after the decimal point or choosing 0.5 if the coefficients varied across 0.5. The area under the receiver-operating characteristic curve (AUROCC) was calculated for each model to measure discriminant ability. Results A group of formulas (log(Ps/(1-Ps)) = logit(Ps) = -9 + cISS - cAGE + cSBP + cGCS + cRR/2, where -9 becomes -7 if the predictor variable cRR or cISS is missing) was developed. Using these formulas, the AUROCCs were between 0.950 and 0.964. When these models were applied to the Khon Kaen data, their AUROCCs were greater than 0.91. Conclusion: These equations allow physicians to perform real-time assessments of survival by easy mental calculations at Asian emergency centers, which are overcrowded with blunt injury victims of traffic accidents.

16. A prognostic model for lung adenocarcinoma patient survival with a focus on four miRNAs.

Science.gov (United States)

Li, Xianqiu; An, Zhaoling; Li, Peihui; Liu, Haihua

2017-09-01

There is currently no effective biomarker for determining the survival of patients with lung adenocarcinoma. The purpose of the present study was to construct a prognostic survival model using microRNA (miRNA) expression data from patients with lung adenocarcinoma. miRNA data were obtained from The Cancer Genome Atlas, and patients with lung adenocarcinoma were divided into either the training or the validation set based on the random allocation principle. The prognostic model focusing on miRNA was constructed, and patients were divided into high-risk or low-risk groups as per the scores, to assess their survival time. The 5-year survival rate of the subgroups within the high- and low-risk groups was assessed. P-values of the prognostic model in the total population, the training set and the validation set were 0.0017, 0.01986 and 0.02773, respectively, indicating that the survival time of the lung adenocarcinoma high-risk group was shorter than that of the low-risk group. The model also performed well for the untreated group (P=0.00088) and the Caucasian patient group (P=0.00043). In addition, the model had the best prediction effect for the 5-year survival rate of the Caucasian patient group (AUC=0.629). In conclusion, the prognostic model developed in the present study can be used as an independent prognostic model for patients with lung adenocarcinoma.

17. Modeling age and nest-specific survival using a hierarchical Bayesian approach.

Science.gov (United States)

Cao, Jing; He, Chong Z; Suedkamp Wells, Kimberly M; Millspaugh, Joshua J; Ryan, Mark R

2009-12-01

Recent studies have shown that grassland birds are declining more rapidly than any other group of terrestrial birds. Current methods of estimating avian age-specific nest survival rates require knowing the ages of nests, assuming homogeneous nests in terms of nest survival rates, or treating the hazard function as a piecewise step function. In this article, we propose a Bayesian hierarchical model with nest-specific covariates to estimate age-specific daily survival probabilities without the above requirements. The model provides a smooth estimate of the nest survival curve and identifies the factors that are related to the nest survival. The model can handle irregular visiting schedules and it has the least restrictive assumptions compared to existing methods. Without assuming proportional hazards, we use a multinomial semiparametric logit model to specify a direct relation between age-specific nest failure probability and nest-specific covariates. An intrinsic autoregressive prior is employed for the nest age effect. This nonparametric prior provides a more flexible alternative to the parametric assumptions. The Bayesian computation is efficient because the full conditional posterior distributions either have closed forms or are log concave. We use the method to analyze a Missouri dickcissel dataset and find that (1) nest survival is not homogeneous during the nesting period, and it reaches its lowest at the transition from incubation to nestling; and (2) nest survival is related to grass cover and vegetation height in the study area.

18. Model of white oak flower survival and maturation

Science.gov (United States)

David R. Larsen; Robert A. Cecich

1997-01-01

A stochastic model of oak flower dynamics is presented that integrates a number of factors which appear to affect the oak pistillate flower development process. The factors are modeled such that the predicted flower populations could have come from the same distribution as the observed flower populations. Factors included in the model are: the range...

19. Parametric and semiparametric models with applications to reliability, survival analysis, and quality of life

CERN Document Server

Nikulin, M; Mesbah, M; Limnios, N

2004-01-01

Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.

20. The log-Burr XII regression model for grouped survival data.

Science.gov (United States)

Hashimoto, Elizabeth M; Ortega, Edwin M M; Cordeiro, Gauss M; Barreto, Mauricio L

2012-01-01

The log-Burr XII regression model for grouped survival data is evaluated in the presence of many ties. The methodology for grouped survival data is based on life tables, where the times are grouped in k intervals, and we fit discrete lifetime regression models to the data. The model parameters are estimated by maximum likelihood and jackknife methods. To detect influential observations in the proposed model, diagnostic measures based on case deletion, so-called global influence, and influence measures based on small perturbations in the data or in the model, referred to as local influence, are used. In addition to these measures, the total local influence and influential estimates are also used. We conduct Monte Carlo simulation studies to assess the finite sample behavior of the maximum likelihood estimators of the proposed model for grouped survival. A real data set is analyzed using a regression model for grouped data.

1. Hierarchic stochastic modelling applied to intracellular Ca(2+) signals.

Directory of Open Access Journals (Sweden)

Gregor Moenke

Full Text Available Important biological processes like cell signalling and gene expression have noisy components and are very complex at the same time. Mathematical analysis of such systems has often been limited to the study of isolated subsystems, or approximations are used that are difficult to justify. Here we extend a recently published method (Thurley and Falcke, PNAS 2011), which is formulated in observable system configurations instead of molecular transitions. This reduces the number of system states by several orders of magnitude and avoids fitting of kinetic parameters. The method is applied to Ca(2+) signalling. Ca(2+) is a ubiquitous second messenger transmitting information by stochastic sequences of concentration spikes, which arise by coupling of subcellular Ca(2+) release events (puffs). We derive analytical expressions for a mechanistic Ca(2+) model, based on recent data from live cell imaging, and calculate Ca(2+) spike statistics in dependence on cellular parameters like stimulus strength or number of Ca(2+) channels. The new approach substantiates a generic Ca(2+) model, which is a very convenient way to simulate Ca(2+) spike sequences with correct spiking statistics.

2. Growth and survival of larval and early juvenile lesser sandeel in patchy prey field in the North Sea: An examination using individual-based modelling

DEFF Research Database (Denmark)

Gürkan, Zeren; Christensen, Asbjørn; Deurs, Mikael van

2012-01-01

concentrations is regarded important for survival. Intense aggregations of zooplankton in near-surface waters provide these conditions for larval fish. Simulation studies by individual-based modeling can help understanding of the mechanisms for survival during early life-stages. In this study, we examined how growth and survival of larvae and early juveniles of Lesser Sandeel (Ammodytes marinus) in the North Sea are influenced by availability and patchiness of the planktonic prey by adapting and applying a generic bioenergetic individual-based model for larval fish. Input food conditions were generated by modeling copepod size spectra dynamics and patchiness based on particle count transects and Continuous Plankton Recorder time series data. The study analyzes the effects of larval hatching time, presence of zooplankton patchiness and within patch abundance on growth and survival of sandeel early life...

3. Up-to-date and precise estimates of cancer patient survival: model-based period analysis.

Science.gov (United States)

Brenner, Hermann; Hakulinen, Timo

2006-10-01

Monitoring of progress in cancer patient survival by cancer registries should be as up-to-date as possible. Period analysis has been shown to provide more up-to-date survival estimates than do traditional methods of survival analysis. However, there is a trade-off between up-to-dateness and the precision of period estimates, in that increasing the up-to-dateness of survival estimates by restricting the analysis to a relatively short, recent time period, such as the most recent calendar year for which cancer registry data are available, goes along with a loss of precision. The authors propose a model-based approach to maximize the up-to-dateness of period estimates at minimal loss of precision. The approach is illustrated for monitoring of 5-year relative survival of patients diagnosed with one of 20 common forms of cancer in Finland between 1953 and 2002 by use of data from the nationwide Finnish Cancer Registry. It is shown that the model-based approach provides survival estimates that are as up-to-date as the most up-to-date conventional period estimates and at the same time much more precise than the latter. The modeling approach may further enhance the use of period analysis for deriving up-to-date cancer survival rates.

4. Modeling the airborne survival of influenza virus in a residential setting: the impacts of home humidification

Directory of Open Access Journals (Sweden)

Myatt Theodore A

2010-09-01

Abstract Background Laboratory research studies indicate that aerosolized influenza viruses survive for longer periods at low relative humidity (RH) conditions. Further analysis has shown that absolute humidity (AH) may be an improved predictor of virus survival in the environment. Maintaining airborne moisture levels that reduce survival of the virus in the air and on surfaces could be another tool for managing public health risks of influenza. Methods A multi-zone indoor air quality model was used to evaluate the ability of portable humidifiers to control moisture content of the air and the potential related benefit of decreasing survival of influenza viruses in single-family residences. We modeled indoor AH and influenza virus concentrations during winter months (Northeast US) using the CONTAM multi-zone indoor air quality model. A two-story residential template was used under two different ventilation conditions - forced hot air and radiant heating. Humidity was evaluated on a room-specific and whole-house basis. Estimates of emission rates for influenza virus were particle-size specific and derived from published studies, and included emissions during both tidal breathing and coughing events. The survival of the influenza virus was determined based on the established relationship between AH and virus survival. Results The presence of a portable humidifier with an output of 0.16 kg of water per hour in the bedroom resulted in an increase in median sleeping-hours AH/RH levels of 11 to 19% compared to periods without a humidifier present. The associated percent decrease in influenza virus survival was 17.5-31.6%. Distribution of water vapor through a residence was estimated to yield 3 to 12% increases in AH/RH and 7.8-13.9% reductions in influenza virus survival. Conclusion This modeling analysis demonstrates the potential benefit of portable residential humidifiers in reducing the survival of aerosolized influenza virus by controlling humidity.

5. Connecting single-stock assessment models through correlated survival

DEFF Research Database (Denmark)

Albertsen, Christoffer Moesgaard; Nielsen, Anders; Thygesen, Uffe Høgsbro

2017-01-01

Fisheries management is mainly conducted via single-stock assessment models assuming that fish stocks do not interact, except through assumed natural mortalities. Currently, the main alternative is complex ecosystem models, which require extensive data, are difficult to calibrate, and have long run times...

6. Predicting clinical outcomes from large scale cancer genomic profiles with deep survival models.

Science.gov (United States)

Yousefi, Safoora; Amrollahi, Fatemeh; Amgad, Mohamed; Dong, Chengliang; Lewis, Joshua E; Song, Congzheng; Gutman, David A; Halani, Sameer H; Velazquez Vega, Jose Enrique; Brat, Daniel J; Cooper, Lee A D

2017-09-15

Translating the vast data generated by genomic platforms into accurate predictions of clinical outcomes is a fundamental challenge in genomic medicine. Many prediction methods face limitations in learning from the high-dimensional profiles generated by these platforms, and rely on experts to hand-select a small number of features for training prediction models. In this paper, we demonstrate how deep learning and Bayesian optimization methods that have been remarkably successful in general high-dimensional prediction tasks can be adapted to the problem of predicting cancer outcomes. We perform an extensive comparison of Bayesian optimized deep survival models and other state of the art machine learning methods for survival analysis, and describe a framework for interpreting deep survival models using a risk backpropagation technique. Finally, we illustrate that deep survival models can successfully transfer information across diseases to improve prognostic accuracy. We provide an open-source software implementation of this framework called SurvivalNet that enables automatic training, evaluation and interpretation of deep survival models.
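Deep survival models of this kind are typically trained by minimizing the negative Cox partial log-likelihood computed on the network's risk scores. A minimal sketch of that loss (Breslow form, ignoring tied event times; the toy data are ours, not the SurvivalNet implementation):

```python
import numpy as np

def neg_cox_partial_loglik(risk_scores, times, events):
    """Negative Cox partial log-likelihood (Breslow form, no tie handling).

    risk_scores: model outputs (log hazard ratios), one per patient.
    times/events: observed time and event indicator (1 = event observed).
    """
    order = np.argsort(-times)                 # sort patients by descending time
    scores = risk_scores[order]
    ev = events[order]
    # running logsumexp over the risk set: patients with time >= current time
    log_risk = np.logaddexp.accumulate(scores)
    return -np.sum((scores - log_risk) * ev)

# Toy cohort of four patients (illustrative values).
times = np.array([5.0, 8.0, 3.0, 12.0])
events = np.array([1, 0, 1, 1])
scores = np.array([0.4, -0.2, 1.1, -0.9])
loss = neg_cox_partial_loglik(scores, times, events)
print(round(loss, 4))
```

In a deep survival model this quantity is differentiated with respect to the network weights; the "risk backpropagation" interpretation described in the abstract propagates the same gradients back to the input features.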

7. Modeling of pathogen survival during simulated gastric digestion.

Science.gov (United States)

Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru

2011-02-01

The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided a highly accurate fitting of the static pH conditions for every pathogen. However, while the residual plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residual plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens.
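As a simplified sketch of this approach, a square-root secondary model for the pH-dependent inactivation rate can be fed into a dynamic primary model under changing gastric pH. The first-order primary model, the parameter values, and the pH profile below are illustrative assumptions, not the fitted models from the study:

```python
import numpy as np

B_SQRT, PH_MAX = 0.1, 4.0   # assumed secondary-model parameters (illustrative)

def inactivation_rate(ph):
    """Square-root secondary model: sqrt(k) = B_SQRT * (PH_MAX - pH), k in min^-1."""
    return (B_SQRT * max(PH_MAX - ph, 0.0)) ** 2

def gastric_ph(t_min):
    """Assumed gastric pH profile: acidifies from pH 5 toward pH 1.5 after a meal."""
    return 1.5 + 3.5 * np.exp(-t_min / 30.0)

def simulate(log10_n0=6.0, t_end=120.0, dt=0.1):
    """Euler integration of d(log10 N)/dt = -k(pH(t)) / ln(10)."""
    n, curve = log10_n0, [log10_n0]
    for i in range(int(t_end / dt)):
        n -= inactivation_rate(gastric_ph(i * dt)) / np.log(10) * dt
        curve.append(n)
    return np.array(curve)

curve = simulate()
print(f"log10 reduction over 2 h of digestion: {curve[0] - curve[-1]:.2f}")
```

The point of the dynamic formulation is that the rate constant is re-evaluated at each time step as the stomach acidifies, which is what distinguishes it from fitting static-pH curves.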

9. Modelling Tradescantia fluminensis to assess long term survival

Directory of Open Access Journals (Sweden)

Alex James

2015-06-01

We present a simple Poisson process model for the growth of Tradescantia fluminensis, an invasive plant species that inhibits the regeneration of native forest remnants in New Zealand. The model was parameterised with data derived from field experiments in New Zealand and then verified with independent data. The model gave good predictions, which showed that its underlying assumptions are sound. However, this simple model had less predictive power for outputs based on variance, suggesting that some assumptions were lacking. Therefore, we extended the model to include higher variability between plants, thereby improving its predictions. This high-variance model suggests that control measures that promote node death at the base of the plant or restrict the main stem growth rate will be more effective than those that reduce the number of branching events. The extended model forms a good basis for assessing the efficacy of various forms of control of this weed, including the recently-released leaf-feeding tradescantia leaf beetle (Neolema ogloblini).

10. Survival Old Model Tamping on Bugis House in Kampong of Bunne Regency of Soppeng South Sulawesi Indonesia

Science.gov (United States)

Abidah, Andi

2017-10-01

The tamping is a circulation space from the terrace to the interior of the house, and also serves as a sitting space for community members of low social rank. The tamping is positioned on one side of the main house, and its floor is slightly lower than the main house floor. This model is seldom found today, as the community now prefers a new tamping model in which the floor is at the same level as the main house floor; some new Bugis houses are even built without a tamping. The old model consists of four modules: three modules forming the main house and one module for the tamping. In the past, the old model placed the tamping floor at a different level from the watangpola floor, whereas in surviving old Bugis houses the tamping floor has become level with the watangpola. The newer model, called the eppa-eppa house, has no tamping at all. The community in Kampung Bunne still retains the old tamping model in their houses, although several houses have changed their tamping to the form now commonly applied; the old model is still found in around 45 houses in the kampung. This study explores the application of the old tamping model in Bugis houses in Kampong Bunne, Regency of Soppeng, South Sulawesi. A qualitative approach was used, and the study was developed on the basis of sketches, photographs, and interviews.

11. Defining Requirements and Applying Information Modeling for Protecting Enterprise Assets

Science.gov (United States)

Fortier, Stephen C.; Volk, Jennifer H.

The advent of terrorist threats has heightened local, regional, and national governments' interest in emergency response and disaster preparedness. The threat of natural disasters also challenges emergency responders to act swiftly and in a coordinated fashion. When a disaster occurs, an ad hoc coalition of pre-planned groups usually forms to respond to the incident. History has shown that these “systems of systems” do not interoperate very well. Communications between fire, police and rescue components either do not work or are inefficient. Government agencies, non-governmental organizations (NGOs), and private industry use a wide array of software platforms for managing data about emergency conditions, resources and response activities. Most of these are stand-alone systems with very limited capability for data sharing with other agencies or other levels of government. Information technology advances have facilitated the movement towards an integrated and coordinated approach to emergency management. Other communication mechanisms, such as video teleconferencing, digital television and radio broadcasting, are being utilized to combat the challenges of emergency information exchange. Recent disasters, such as Hurricane Katrina and the tsunami in Indonesia, have illuminated the weaknesses in emergency response. This paper will discuss the need for defining requirements for components of ad hoc coalitions which are formed to respond to disasters. A goal of our effort was to develop a proof of concept that applying information modeling to the business processes used to protect and mitigate potential loss of an enterprise was feasible. These activities would be modeled both pre- and post-incident.

12. Electrodynamic modeling applied to micro-strip gas chambers

Energy Technology Data Exchange (ETDEWEB)

Fang, R

1998-12-31

Gas gain variations as functions of time, counting rate and substrate resistivity have been observed with Micro-Strip Gas Chambers (MSGC). Such a chamber is here treated as a system of 2 dielectrics, gas and substrate, with finite resistivities. Electric charging at their interface results in variations of the electric field and the gas gain. The electrodynamic equations (including time dependence) for such a system are proposed. A Rule of Charge Accumulation (RCA) is then derived which allows one to determine the quantity and sign of charges accumulated on the surface at equilibrium. In order to apply the equations and the rule to MSGCs, a model of gas conductance induced by ionizing radiation is proposed, and a differential equation and some formulae are derived to calculate the rms dispersion and the spatial distribution of electrons (ions) in inhomogeneous electric fields. RCA coupled with a precise simulation of the electric fields gives the first quantitative explanation of gas gain variations of MSGCs. Finally, an electrodynamic simulation program was developed to reproduce the dynamic process of gain variation due to surface charging with an uncertainty of at most 15% relative to experimental data. As a consequence, methods for stabilizing the operation of MSGCs are proposed. (author) 18 refs.

13. Applying Quality Function Deployment Model in Burn Unit Service Improvement.

Science.gov (United States)

Keshtkaran, Ali; Hashemi, Neda; Kharazmi, Erfan; Abbasi, Mehdi

2016-01-01

Quality function deployment (QFD) is one of the most effective quality design tools. This study applies the QFD technique to improve the quality of burn unit services in Ghotbedin Hospital in Shiraz, Iran. First, the patients' expectations of burn unit services and their priorities were determined through the Delphi method. Thereafter, burn unit service specifications were determined through the Delphi method. Further, the relationships between the patients' expectations and service specifications, and also the relationships between service specifications, were determined through an expert group's opinion. Last, the final importance scores of service specifications were calculated through the simple additive weighting method. The findings show that burn unit patients have 40 expectations in six different areas. These expectations are in 16 priority levels. Burn units also have 45 service specifications in six different areas. There are four-level relationships between the patients' expectations and service specifications, and four-level relationships between service specifications. The most important burn unit service specifications have been identified in this study. The QFD model developed in the study can be a general guideline for QFD planners and executives.

14. Applied genre analysis: a multi-perspective model

Directory of Open Access Journals (Sweden)

Vijay K Bhatia

2002-04-01

Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

15. A simple prognostic model for overall survival in metastatic renal cell carcinoma

Science.gov (United States)

Assi, Hazem I.; Patenaude, Francois; Toumishey, Ethan; Ross, Laura; Abdelsalam, Mahmoud; Reiman, Tony

2016-01-01

Introduction: The primary purpose of this study was to develop a simpler prognostic model to predict overall survival for patients treated for metastatic renal cell carcinoma (mRCC) by examining variables shown in the literature to be associated with survival. Methods: We conducted a retrospective analysis of patients treated for mRCC at two Canadian centres. All patients who started first-line treatment were included in the analysis. A multivariate Cox proportional hazards regression model was constructed using a stepwise procedure. Patients were assigned to risk groups depending on how many of the three risk factors from the final multivariate model they had. Results: There were three risk factors in the final multivariate model: hemoglobin, prior nephrectomy, and time from diagnosis to treatment. Patients in the high-risk group (two or three risk factors) had a median survival of 5.9 months, while those in the intermediate-risk group (one risk factor) had a median survival of 16.2 months, and those in the low-risk group (no risk factors) had a median survival of 50.6 months. Conclusions: In multivariate analysis, shorter survival times were associated with hemoglobin below the lower limit of normal, absence of prior nephrectomy, and initiation of treatment within one year of diagnosis. PMID:27217858
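The published grouping rule is simple enough to express directly. The function name and boolean encoding below are ours; the three risk factors and the median survival times come from the abstract:

```python
# Risk factors from the final multivariate model (each True = adverse):
# hemoglobin below the lower limit of normal, no prior nephrectomy,
# and treatment started within one year of diagnosis.
def risk_group(hemoglobin_low, no_prior_nephrectomy, treated_within_1yr):
    n_factors = sum([hemoglobin_low, no_prior_nephrectomy, treated_within_1yr])
    if n_factors == 0:
        return "low"
    if n_factors == 1:
        return "intermediate"
    return "high"          # two or three risk factors

# Median overall survival (months) reported for each group in the abstract.
MEDIAN_OS_MONTHS = {"low": 50.6, "intermediate": 16.2, "high": 5.9}

group = risk_group(True, False, True)
print(group, MEDIAN_OS_MONTHS[group])   # prints: high 5.9
```

Counting adverse factors in this way mirrors the structure of the widely used MSKCC/IMDC-style prognostic schemes, which is what makes the proposed model easy to apply at the bedside.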

16. Two Artificial Neural Networks for Modeling Discrete Survival Time of Censored Data

Directory of Open Access Journals (Sweden)

Taysseer Sharaf

2015-01-01

Artificial neural network (ANN) theory is emerging as an alternative to conventional statistical methods in modeling nonlinear functions. The popular Cox proportional hazards model falls short in modeling survival data with nonlinear behaviors. ANN is a good alternative to the Cox PH model, as the proportional hazards assumption is not required. In addition, ANN possesses a powerful capability of handling complex nonlinear relations within the risk factors associated with survival time. In this study, we present a comprehensive comparison of two different approaches of utilizing ANN in modeling the smooth conditional hazard probability function. We use real melanoma cancer data to illustrate the usefulness of the proposed ANN methods. We report some significant results in comparing the survival time of male and female melanoma patients.
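A common way to set up discrete-time survival modeling, whether the conditional hazard is fitted by logistic regression or by an ANN with a sigmoid output, is person-period expansion. The encoding below is the standard construction, not the specific architectures compared in the paper:

```python
import numpy as np

def person_period(time, event, max_t):
    """Expand one subject's (discrete time, event) pair into person-period rows.

    Each row is (interval index, binary outcome). A logistic model or ANN
    fitted to these rows estimates the conditional hazard
    h(t) = P(T = t | T >= t).
    """
    rows = []
    for t in range(1, min(time, max_t) + 1):
        died_now = 1 if (event == 1 and t == time) else 0
        rows.append((t, died_now))
    return rows

def survival_from_hazard(hazard):
    """S(t) = prod_{u <= t} (1 - h(u)) for a discrete-time hazard sequence."""
    return np.cumprod(1.0 - np.asarray(hazard))

print(person_period(3, 1, 5))           # prints: [(1, 0), (2, 0), (3, 1)]
print(survival_from_hazard([0.1, 0.2, 0.3]))
```

Censored subjects contribute only zero-outcome rows, which is how this setup accounts for censoring without the proportional hazards assumption.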

17. International Conference on Applied Mathematics, Modeling and Computational Science & Annual meeting of the Canadian Applied and Industrial Mathematics

CERN Document Server

Bélair, Jacques; Kunze, Herb; Makarov, Roman; Melnik, Roderick; Spiteri, Raymond J

2016-01-01

Focusing on five main groups of interdisciplinary problems, this book covers a wide range of topics in mathematical modeling, computational science and applied mathematics. It presents a wealth of new results in the development of modeling theories and methods, advancing diverse areas of applications and promoting interdisciplinary interactions between mathematicians, scientists, engineers and representatives from other disciplines. The book offers a valuable source of methods, ideas, and tools developed for a variety of disciplines, including the natural and social sciences, medicine, engineering, and technology. Original results are presented on both the fundamental and applied level, accompanied by an ample number of real-world problems and examples emphasizing the interdisciplinary nature and universality of mathematical modeling, and providing an excellent outline of today’s challenges. Mathematical modeling, with applied and computational methods and tools, plays a fundamental role in modern science a...

18. Gacyclidine improves the survival and reduces motor deficits in a mouse model of amyotrophic lateral sclerosis

Directory of Open Access Journals (Sweden)

Yannick Nicolas Gerber

2013-12-01

Amyotrophic lateral sclerosis (ALS) is a fatal neurodegenerative disorder typified by a massive loss of motor neurons, with few therapeutic options. The exact cause of neuronal degeneration is unknown, but it is now accepted that ALS is a multifactorial disease with several mechanisms involved, including glutamate excitotoxicity. More specifically, N-methyl-D-aspartate (NMDA)-mediated cell death and impairment of glutamate transport have been suggested to play a key role in ALS pathophysiology. Thus, evaluating NMDAR antagonists is of high therapeutic interest. Gacyclidine, also named GK11, is a high-affinity non-competitive NMDAR antagonist that may protect against motor neuron death in an ALS context. Moreover, GK11 presents low intrinsic neurotoxicity and has already been used in two clinical trials for CNS lesions. In the present study, we investigated the influence of chronic administration of two doses of GK11 (0.1 and 1 mg/kg) on the survival and functional motor activity of hSOD1G93A mice, an animal model of ALS. Treatment started at early symptomatic age (60 days) and was applied bi-weekly until the end stage of the disease. We first confirmed that functional alteration of locomotor activity was evident in the hSOD1G93A transgenic female mice by 60 days of age. A low dose of GK11 improved the survival of the mice by 4.3% and partially preserved body weight. Improved life span was associated with a delay in locomotor function impairment. Conversely, the high-dose treatment worsened motor functions. These findings suggest that chronic administration of GK11 beginning at the early symptomatic stage may be beneficial for patients with ALS.

19. ANP AFFECTS CARDIAC REMODELING, FUNCTION, HEART FAILURE AND SURVIVAL IN A MOUSE MODEL OF DILATED CARDIOMYOPATHY

OpenAIRE

Wang, Dong; Gladysheva, Inna P.; Fan, Tai-Hwang M.; Sullivan, Ryan; Houng, Aiilyan K.; Reed, Guy L.

2013-01-01

Dilated cardiomyopathy is a frequent cause of heart failure and death. Atrial natriuretic peptide (ANP) is a biomarker of dilated cardiomyopathy, but there is controversy whether ANP modulates the development of heart failure. Therefore we examined whether ANP affects heart failure, cardiac remodeling, function and survival in a well-characterized, transgenic model of dilated cardiomyopathy. Mice with dilated cardiomyopathy with normal ANP levels survived longer than mice with partial ANP (p

20. Modeling survival of Listeria monocytogenes in the traditional Greek soft cheese Katiki.

Science.gov (United States)

Mataragas, Marios; Stergiou, Virginia; Nychas, George-John E

2008-09-01

In the present work, survival of Listeria monocytogenes in the traditional Greek soft, spreadable cheese Katiki was studied throughout the shelf life of the product. Samples of finished cheese were inoculated with a cocktail of five L. monocytogenes strains (ca. 6 log CFU g(-1)) and stored at 5, 10, 15, and 20 degrees C. Acid-stress adaptation or cross-protection to the same stress was also investigated by inoculation of acid-adapted cells in the product. The results showed that pathogen survival was biphasic. Various mathematical equations (Geeraerd, Cerf, Albert-Mafart, Whiting, Zwietering, and Baranyi models) were fitted to the experimental data. A thorough statistical analysis was performed to choose the best model. The Geeraerd model was finally selected, and the results revealed no acid tolerance acquisition (no significant differences, P > 0.05, in the survival rates of the non-acid-adapted and acid-adapted cells). Secondary modeling (second-order polynomial with a(0) = 0.8453, a(1) = -0.0743, and a(2) = 0.0059) of the survival rate (of sensitive population), and other parameters that were similar at all temperatures (fraction of initial population in the major population = 99.98%, survival rate of resistant population = 0.10 day(-1), and initial population = 6.29 log CFU g(-1)), showed that survival of the pathogen was temperature dependent with bacterial cells surviving for a longer period of time at lower temperatures. Finally, the developed predictive model was successfully validated at two independent temperatures (12 and 17 degrees C). This study underlines the usefulness of predictive modeling as a tool for realistic estimation and control of L. monocytogenes risk in food products. Such data are also useful when conducting risk assessment studies.
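The biphasic behavior described above can be sketched with a two-population (Cerf-type) survival curve using the parameter values quoted in the abstract. The exact functional form and rate units are our assumptions, so treat the output as illustrative rather than as a reproduction of the fitted Geeraerd model:

```python
import numpy as np

# Values quoted in the abstract; units (log10 CFU/day) are assumed here.
A0, A1, A2 = 0.8453, -0.0743, 0.0059   # secondary polynomial for k_sensitive(T)
F_SENS = 0.9998                        # sensitive fraction of initial population
K_RES = 0.10                           # resistant-population rate, day^-1
LOG10_N0 = 6.29                        # initial population, log10 CFU/g

def k_sensitive(temp_c):
    """Second-order polynomial secondary model for the sensitive-population rate."""
    return A0 + A1 * temp_c + A2 * temp_c ** 2

def biphasic_log10(t_days, temp_c):
    """log10 N(t) for a two-population (biphasic) survival curve."""
    n0 = 10.0 ** LOG10_N0
    n = n0 * (F_SENS * 10 ** (-k_sensitive(temp_c) * t_days)
              + (1 - F_SENS) * 10 ** (-K_RES * t_days))
    return np.log10(n)

for day in (0, 5, 20):
    print(day, round(biphasic_log10(day, 5.0), 2))
```

The small resistant subpopulation produces the characteristic tail: once the sensitive majority is gone, the curve flattens to the slow resistant-population rate, and the temperature dependence enters only through the sensitive-population rate.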

1. Surviving the present: Modeling tools for organizational change

Energy Technology Data Exchange (ETDEWEB)

Pangaro, P. (Pangaro Inc., Washington, DC (United States))

1992-01-01

The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them.

2. A unified framework for benchmark dose estimation applied to mixed models and model averaging

DEFF Research Database (Denmark)

Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

2013-01-01

This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both...

3. Survival at refrigeration and freezing temperatures of Campylobacter coli and Campylobacter jejuni on chicken skin applied as axenic and mixed inoculums.

Science.gov (United States)

El-Shibiny, Ayman; Connerton, Phillippa; Connerton, Ian

2009-05-31

Campylobacter is considered to be the most common cause of bacterial diarrhoeal illness in the developed world. Many cases are thought to be acquired from consumption of undercooked poultry. The aim of this study was to compare the effect of the rate of cooling on the survival, at 4 degrees C and -20 degrees C, of Campylobacter coli and Campylobacter jejuni strains, inoculated on chicken skin from axenic culture or as mixed inoculums. Strains chilled in a domestic refrigerator varied in their tolerance to storage at 4 degrees C. Statistically significant differences between strains applied as axenic or mixed inoculums were observed for specific strain combinations using two-way ANOVA, including the enhanced survival of antibiotic resistant C. coli 99/367 at 4 degrees C. The use of rapid cooling (at -20 degrees C/min) enhanced the survival of all the Campylobacter strains chilled to 4 degrees C compared to standard refrigeration. Freezing to -20 degrees C reduced viable counts by 2.2-2.6 log10 CFU/cm(2) in 24 h. Rapid cooling to -20 degrees C (at -30 degrees C/min) enhanced the survival of C. coli 99/367 compared to freezing in a domestic freezer. Statistically significant interaction terms between specific strains were observed in mixed inoculums chilled to -20 degrees C by freezing in a domestic freezer and by rapid chilling to -20 degrees C. Rapid chilling of poultry, particularly for 4 degrees C storage may enhance survival of Campylobacter and although this is an issue that affects meat quality, it should be considered by poultry processors.

4. Regression models for interval censored survival data: Application to HIV infection in Danish homosexual men

DEFF Research Database (Denmark)

Carstensen, Bendix

1996-01-01

This paper shows how to fit excess and relative risk regression models to interval censored survival data, and how to implement the models in standard statistical software. The methods developed are used for the analysis of HIV infection rates in a cohort of Danish homosexual men....

5. Iterative Bayesian Model Averaging: a method for the application of survival analysis to high-dimensional microarray data

Directory of Open Access Journals (Sweden)

2009-02-01

Abstract Background Microarray technology is increasingly used to identify potential biomarkers for cancer prognostics and diagnostics. Previously, we have developed the iterative Bayesian Model Averaging (BMA) algorithm for use in classification. Here, we extend the iterative BMA algorithm for application to survival analysis on high-dimensional microarray data. The main goal in applying survival analysis to microarray data is to determine a highly predictive model of patients' time to event (such as death, relapse, or metastasis) using a small number of selected genes. Our multivariate procedure combines the effectiveness of multiple contending models by calculating the weighted average of their posterior probability distributions. Our results demonstrate that our iterative BMA algorithm for survival analysis achieves high prediction accuracy while consistently selecting a small and cost-effective number of predictor genes. Results We applied the iterative BMA algorithm to two cancer datasets: breast cancer and diffuse large B-cell lymphoma (DLBCL) data. On the breast cancer data, the algorithm selected a total of 15 predictor genes across 84 contending models from the training data. The maximum likelihood estimates of the selected genes and the posterior probabilities of the selected models from the training data were used to divide patients in the test (or validation) dataset into high- and low-risk categories. Using the genes and models determined from the training data, we assigned patients from the test data into highly distinct risk groups (as indicated by a p-value of 7.26e-05 from the log-rank test). Moreover, we achieved comparable results using only the 5 top selected genes with 100% posterior probabilities. On the DLBCL data, our iterative BMA procedure selected a total of 25 genes across 3 contending models from the training data. Once again, we assigned the patients in the validation set to significantly distinct risk groups (p

6. Predictability of survival models for waiting list and transplant patients: calculating LYFT.

Science.gov (United States)

Wolfe, R A; McCullough, K P; Leichtman, A B

2009-07-01

'Life years from transplant' (LYFT) is the extra years of life that a candidate can expect to achieve with a kidney transplant as compared to never receiving a kidney transplant at all. The LYFT component survival models (patient lifetimes with and without transplant, and graft lifetime) are comparable to or better predictors of long-term survival than are other predictive equations currently in use for organ allocation. Furthermore, these models are progressively more successful at predicting which of two patients will live longer as their medical characteristics (and thus predicted lifetimes) diverge. The C-statistics and the correlations for the three LYFT component equations have been validated using independent, nonoverlapping split-half random samples. Allocation policies based on these survival models could lead to substantial increases in the number of life years gained from the current donor pool.

7. Model predictive control based on reduced order models applied to belt conveyor system.

Science.gov (United States)

Chen, Wei; Li, Xin

2016-11-01

In the paper, a model predictive controller based on a reduced order model is proposed to control a belt conveyor system, which is an electro-mechanical complex system with a long visco-elastic body. Firstly, in order to design a low-order controller, the balanced truncation method is used for belt conveyor model reduction. Secondly, an MPC algorithm based on the reduced order model for the belt conveyor system is presented. Because of the error bound between the full-order model and the reduced order model, two Kalman state estimators are applied in the control scheme to achieve better system performance. Finally, the simulation experiments show that the balanced truncation method can significantly reduce the model order with high accuracy and that model predictive control based on the reduced model performs well in controlling the belt conveyor system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

8. Multilevel modelling of clustered grouped survival data using Cox regression model: an application to ART dental restorations.

Science.gov (United States)

Wong, May C M; Lam, K F; Lo, Edward C M

2006-02-15

In some controlled clinical trials in dental research, multiple failure time data from the same patient are frequently observed, resulting in clustered multiple failure times. Moreover, the treatments are often delivered by more than one operator, and thus the multiple failure times are clustered according to a multilevel structure when the operator effects are assumed to be random. In practice, it is often too expensive or even impossible to monitor the study subjects continuously, but they are examined periodically at some regular pre-scheduled visits. Hence, discrete or grouped clustered failure time data are collected. The aim of this paper is to illustrate the use of the Markov chain Monte Carlo (MCMC) approach and non-informative priors in a Bayesian framework to mimic maximum likelihood (ML) estimation in a frequentist approach to multilevel modelling of clustered grouped survival data. A three-level model with an additive variance-components structure for the random effects is considered in this paper. Both the grouped proportional hazards model and the dynamic logistic regression model are used. The approximate intra-cluster correlation of the log failure times can be estimated when the grouped proportional hazards model is used. The statistical package WinBUGS is adopted to estimate the parameters of interest based on the MCMC method. The models and method are applied to a data set obtained from a prospective clinical study on a cohort of Chinese school children in which atraumatic restorative treatment (ART) restorations were placed on permanent teeth with carious lesions. Altogether 284 ART restorations were placed by five dentists and the clinical status of the ART restorations was evaluated annually for 6 years after placement, thus clustered grouped failure times of the restorations were recorded. Results based on the grouped proportional hazards model revealed that the clustering effect among the log failure times of the different restorations from the same child was
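The grouped proportional hazards model mentioned above links interval-level failure probabilities to covariates through a complementary log-log transform. A minimal sketch, with made-up baseline parameters (`gammas`) and covariate effect (`xbeta`):

```python
import math

# Grouped proportional hazards: with annual examinations, the probability
# of failing in interval j, given survival to its start, is
#   1 - exp(-exp(gamma_j + x'beta))   (complementary log-log link),
# where gamma_j encodes the baseline hazard of interval j.

def interval_failure_prob(gamma_j, xbeta):
    return 1.0 - math.exp(-math.exp(gamma_j + xbeta))

def survival_through(gammas, xbeta):
    """Probability of surviving all listed yearly intervals."""
    s = 1.0
    for g in gammas:
        s *= 1.0 - interval_failure_prob(g, xbeta)
    return s

# Hypothetical 3-year baseline, and a covariate that raises the hazard
s_baseline = survival_through([-3.0, -2.5, -2.2], 0.0)
s_exposed = survival_through([-3.0, -2.5, -2.2], 0.5)
```

The cloglog link is what preserves the proportional hazards interpretation of `beta` even though failures are only observed in yearly groups.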

9. Robust Decision-making Applied to Model Selection

Energy Technology Data Exchange (ETDEWEB)

Hemez, Francois M. [Los Alamos National Laboratory

2012-08-06

The scientific and engineering communities are relying more and more on numerical models to simulate increasingly complex phenomena. Selecting a model from among a family of models that meet the simulation requirements presents a challenge to modern-day analysts. To address this concern, a framework anchored in info-gap decision theory is adopted. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.

10. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

Directory of Open Access Journals (Sweden)

Matthew B Biggs

Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

11. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

Science.gov (United States)

Biggs, Matthew B; Papin, Jason A

2013-01-01

Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

12. Empirical modeling and data analysis for engineers and applied scientists

CERN Document Server

Pardo, Scott A

2016-01-01

This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

13. Interleukin-7 ameliorates immune dysfunction and improves survival in a 2-hit model of fungal sepsis.

Science.gov (United States)

Unsinger, Jacqueline; Burnham, Carey-Ann D; McDonough, Jacquelyn; Morre, Michel; Prakash, Priya S; Caldwell, Charles C; Dunne, W Michael; Hotchkiss, Richard S

2012-08-15

Secondary hospital-acquired fungal infections are common in critically ill patients and mortality remains high despite antimicrobial therapy. Interleukin-7 (IL-7) is a potent immunotherapeutic agent that improves host immunity and has shown efficacy in bacterial and viral models of infection. This study examined the ability of IL-7, which is currently in multiple clinical trials (including for hepatitis and human immunodeficiency virus), to improve survival in a clinically relevant 2-hit model of fungal sepsis. Mice underwent cecal ligation and puncture to induce peritonitis. Four days later, surviving mice received an intravenous injection of Candida albicans. Following Candida infection, mice were treated with IL-7 or saline control. The effect of IL-7 on host immunity and survival was recorded. IL-7 ameliorated the loss of immune effector cells and increased lymphocyte functions, including activation, proliferation, expression of adhesion molecules, and interferon-γ production. These beneficial effects of IL-7 were associated with an increase in global immunity, as reflected by an enhanced delayed-type hypersensitivity response and a 1.7-fold improvement in survival. The present findings, showing that IL-7 improves survival in fungal sepsis, together with its previously reported efficacy in bacterial and viral infection models, further support its use as a novel immunotherapeutic in sepsis.

14. Adequateness of applying the Zmijewski model on Serbian companies

Directory of Open Access Journals (Sweden)

2012-12-01

Full Text Available The aim of the paper is to determine the accuracy of the prediction of the Zmijewski model in Serbia on an eligible sample. At the same time, the paper identifies the model's strengths, weaknesses and the limitations of its possible application. Bearing in mind that the economic environment in Serbia is not similar to that of the United States at the time the model was developed, the Zmijewski model is surprisingly accurate in the case of Serbian companies. The accuracy was slightly weaker than the model's original results in the U.S., but much better than the results the model gave in the U.S. in the periods 1988-1991 and 1992-1999. The model also gave better results in Serbia than in Croatia, even though the model had been adjusted for Croatia.

15. Short Term Survival after Admission for Heart Failure in Sweden: Applying Multilevel Analyses of Discriminatory Accuracy to Evaluate Institutional Performance.

Directory of Open Access Journals (Sweden)

Nermin Ghith

Full Text Available Hospital performance is frequently evaluated by analyzing differences between hospital averages in some quality indicators. The results are often expressed as quality charts of hospital variance (e.g., league tables, funnel plots). However, those analyses seldom consider patient heterogeneity around averages, which is of fundamental relevance for a correct evaluation. Therefore, we apply an innovative methodology based on measures of components of variance and discriminatory accuracy to analyze 30-day mortality after hospital discharge with a diagnosis of Heart Failure (HF) in Sweden. We analyzed 36,943 patients aged 45-80 treated in 565 wards at 71 hospitals during 2007-2009. We applied single and multilevel logistic regression analyses to calculate the odds ratios and the area under the receiver-operating characteristic curve (AUC). We evaluated general hospital and ward effects by quantifying the intra-class correlation coefficient (ICC) and the increment in the AUC obtained by adding random effects in a multilevel regression analysis (MLRA). Finally, the Odds Ratios (ORs) for specific ward and hospital characteristics were interpreted jointly with the proportional change in variance (PCV) and the proportion of ORs in the opposite direction (POOR). Overall, the average 30-day mortality was 9%. Using only patient information on age and previous hospitalizations for different diseases, we obtained an AUC = 0.727. This value was almost unchanged when adding sex, country of birth, as well as the hospital and ward levels. Average mortality was higher in small wards and municipal hospitals, but the POOR values were 15% and 16%, respectively. Swedish wards and hospitals in general performed homogeneously well, resulting in a low 30-day mortality rate after HF. In our study, knowledge of a patient's previous hospitalizations was the best predictor of 30-day mortality, and this prediction was not improved by knowing the sex and country of birth of the patient or where the
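The AUC used above has a simple probabilistic reading: the chance that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor (ties counting one half). A minimal sketch with invented risk scores (the paper's AUC of 0.727 comes from its real patient data, not this toy example):

```python
# Area under the ROC curve computed directly from its probabilistic
# definition: P(risk of a random case > risk of a random non-case),
# with ties contributing 1/2.

def auc(risks_dead, risks_alive):
    wins = 0.0
    for d in risks_dead:
        for a in risks_alive:
            if d > a:
                wins += 1.0
            elif d == a:
                wins += 0.5
    return wins / (len(risks_dead) * len(risks_alive))

# Toy predicted 30-day mortality risks
dead = [0.30, 0.25, 0.12]
alive = [0.10, 0.15, 0.05, 0.28]
auc_value = auc(dead, alive)
```

The multilevel step in the paper then asks how much this value rises when hospital- and ward-level random effects are added to the patient-level predictors.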

16. The NEAT Predictive Model for Survival in Patients with Advanced Cancer.

Science.gov (United States)

Zucker, Amanda; Tsai, Chiaojung Jillian; Loscalzo, John; Calves, Pedro; Kao, Johnny

2018-01-24

We previously developed a model to more accurately predict life expectancy for stage IV cancer patients referred to radiation oncology. The goals of this study are to validate this model and to compare competing published models. From May 2012 to March 2015, 280 consecutive patients with stage IV cancer were prospectively evaluated by a single radiation oncologist. Patients were separated into training, validation and combined sets. The NEAT model evaluated number of active tumors ("N"), Eastern Cooperative Oncology Group (ECOG) performance status ("E"), albumin ("A") and primary tumor site ("T"). The Odette Cancer Center model validated performance status, bone only metastases and primary tumor site. The Harvard TEACHH model investigated primary tumor type, performance status, age, prior chemotherapy courses, liver metastases, and hospitalization within 3 months. Cox multivariable analyses and logistic regression were utilized to compare model performance. Number of active tumors, performance status, albumin, primary tumor site, prior hospitalization within the last 3 months and liver metastases predicted overall survival on univariate and multivariable analysis. The NEAT model separated patients into 4 prognostic groups with median survivals of 24.9, 14.8, 4.0, and 1.2 months, respectively. The NEAT model had a C-index of 0.76 with a Nagelkerke's R2 of 0.54, suggesting good discrimination, calibration and total performance. The NEAT model warrants further investigation as a clinically useful approach to predict survival in patients with stage IV cancer.

17. Apply Model Checking to Security Analysis in Trust Management

Science.gov (United States)

2007-02-01

... checking tool called SMV. Model checking is an automated technique that checks whether desired properties hold in the model. Our experience, reported here, ... efforts to build practical tools that answer such questions in many cases nevertheless by using a lightweight approach that leverages a mature model ...

18. Applying a mesoscale atmospheric model to Svalbard glaciers

NARCIS (Netherlands)

Claremar, B.; Obleitner, F.; Reijmer, C.H.|info:eu-repo/dai/nl/229345956; Pohjola, V.; Waxegard, A.; Karner, F.; Rutgersson, A.

2012-01-01

The mesoscale atmospheric model WRF is used over three Svalbard glaciers. The simulations are done with a setup of the model corresponding to the state-of-the-art model for polar conditions, Polar WRF, and it was validated using surface observations. The ERA-Interim reanalysis was used for boundary

19. A biophysical model applied to the Benguela upwelling system ...

African Journals Online (AJOL)

A three-dimensional biophysical model for the Benguela upwelling system is described. The model (NORWECOM) has been used in previous works to study model circulation, primary production and dispersion of particles (fish larvae and pollution) in the North Sea. The primary task of this work has been to validate its ...

20. Applying the Job Characteristics Model to the College Education Experience

Science.gov (United States)

Kass, Steven J.; Vodanovich, Stephen J.; Khosravi, Jasmine Y.

2011-01-01

Boredom is one of the most common complaints among university students, with studies suggesting its link to poor grades, drop out, and behavioral problems. Principles borrowed from industrial-organizational psychology may help prevent boredom and enrich the classroom experience. In the current study, we applied the core dimensions of the job…

1. Evaluation of parametric models by the prediction error in colorectal cancer survival analysis.

Science.gov (United States)

2015-01-01

The aim of this study is to determine the factors influencing predicted survival time for patients with colorectal cancer (CRC) using parametric models and to select the best model by the prediction-error technique. Survival models are statistical techniques to estimate or predict the overall time up to specific events. Prediction is important in medical science, and the accuracy of prediction is determined by a measurement, generally based on loss functions, called prediction error. A total of 600 colorectal cancer patients admitted to the Cancer Registry Center of the Gastroenterology and Liver Disease Research Center, Taleghani Hospital, Tehran, were followed for at least 5 years and had complete information on the variables selected for this study. Body mass index (BMI), sex, family history of CRC, tumor site, stage of disease and histology of tumor were included in the analysis. The survival time was compared by the log-rank test, and multivariate analysis was carried out using parametric models including log-normal, Weibull and log-logistic regression. For selecting the best model, the prediction error by apparent loss was used. The log-rank test showed better survival for females, BMI more than 25, patients with early stage at diagnosis and patients with a colon tumor site. Prediction error by apparent loss was estimated and indicated that the Weibull model was the best one for multivariate analysis. BMI and stage were independent prognostic factors according to the Weibull model. In this study, according to prediction error, Weibull regression showed a better fit. Prediction error can serve as a criterion to select the best model for making predictions of prognostic factors in survival analysis.
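The parametric ingredient of such an analysis is a censored likelihood. A hedged sketch of the Weibull case: events contribute the log density, right-censored observations the log survival function, and the maximum is found here by a deliberately crude grid search (the study itself would use proper maximum likelihood software; all the data below are invented).

```python
import math

# Log-likelihood of a Weibull survival model with right censoring.
# For an event at time t:    log f(t) = log(k/s) + (k-1)*log(t/s) - (t/s)^k
# For a censored time t:     log S(t) = -(t/s)^k

def weibull_loglik(times, events, shape, scale):
    """times: observed/censoring times; events: 1 = death, 0 = censored."""
    ll = 0.0
    for t, d in zip(times, events):
        z = (t / scale) ** shape
        if d:   # event: log density
            ll += math.log(shape / scale) + (shape - 1) * math.log(t / scale) - z
        else:   # censored: log survival
            ll += -z
    return ll

# Crude grid search for the maximum-likelihood shape/scale (toy data)
times = [2.0, 3.5, 5.0, 1.2, 4.8, 6.0]
events = [1, 1, 0, 1, 1, 0]
best = max(((weibull_loglik(times, events, k, lam), k, lam)
            for k in [0.5, 1.0, 1.5, 2.0]
            for lam in [2.0, 4.0, 6.0, 8.0]),
           key=lambda x: x[0])
best_ll, best_shape, best_scale = best
```

Competing parametric families (log-normal, log-logistic) swap in their own densities, and the prediction-error criterion then arbitrates between the fitted models.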

2. Semiparametric maximum likelihood estimation in normal transformation models for bivariate survival data

OpenAIRE

Yi Li; Ross L. Prentice; Xihong Lin

2008-01-01

We consider a class of semiparametric normal transformation models for right-censored bivariate failure times. Nonparametric hazard rate models are transformed to a standard normal model and a joint normal distribution is assumed for the bivariate vector of transformed variates. A semiparametric maximum likelihood estimation procedure is developed for estimating the marginal survival distribution and the pairwise correlation parameters. This produces an efficient estimator of the correlation ...

3. Integrated population modeling reveals the impact of climate on the survival of juvenile emperor penguins.

Science.gov (United States)

Abadi, Fitsum; Barbraud, Christophe; Gimenez, Olivier

2017-03-01

Early-life demographic traits are poorly known, impeding our understanding of population processes and sensitivity to climate change. Survival of immature individuals is a critical component of population dynamics and of recruitment in particular. However, obtaining reliable estimates of juvenile survival (i.e., from independence to first year) remains challenging, as immatures are often difficult to observe and to monitor individually in the field. This is particularly acute for seabirds, in which juveniles stay at sea and remain undetectable for several years. In this work, we developed a Bayesian integrated population model to estimate the juvenile survival of emperor penguins (Aptenodytes forsteri), and other demographic parameters including adult survival and fecundity of the species. Using this statistical method, we simultaneously analyzed capture-recapture data of adults, the annual number of breeding females, and the number of fledglings of emperor penguins collected at Dumont d'Urville, Antarctica, for the period 1971-1998. We also assessed how climate covariates known to affect the species' foraging habitats and prey [southern annular mode (SAM), sea ice concentration (SIC)] affect juvenile survival. Our analyses revealed strong evidence for a positive effect of SAM during the rearing period (SAMR) on juvenile survival. Our findings suggest that this large-scale climate index affects juvenile emperor penguins' body condition and survival through its influence on wind patterns, fast ice extent, and distance to open water. Estimating the influence of environmental covariates on juvenile survival is of major importance to understand the impacts of climate variability and change on the population dynamics of emperor penguins and seabirds in general and to make robust predictions on the impact of climate change on marine predators. © 2016 John Wiley & Sons Ltd.

4. External validation of a 5-year survival prediction model after elective abdominal aortic aneurysm repair.

Science.gov (United States)

DeMartino, Randall R; Huang, Ying; Mandrekar, Jay; Goodney, Philip P; Oderich, Gustavo S; Kalra, Manju; Bower, Thomas C; Cronenwett, Jack L; Gloviczki, Peter

2017-08-11

The benefit of prophylactic repair of abdominal aortic aneurysms (AAAs) is based on the risk of rupture exceeding the risk of death from other comorbidities. The purpose of this study was to validate a 5-year survival prediction model for patients undergoing elective repair of asymptomatic AAA ... (P > .05 indicating goodness of fit). Across different populations of patients, assessment of age and level of cardiac, pulmonary, and renal disease can accurately predict 5-year survival in patients with AAA <6.5 cm undergoing repair. This risk prediction model is a valid method to assess mortality risk in determining potential overall survival benefit from elective AAA repair. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

5. Modeling longitudinal data with nonparametric multiplicative random effects jointly with survival data.

Science.gov (United States)

Ding, Jimin; Wang, Jane-Ling

2008-06-01

In clinical studies, longitudinal biomarkers are often used to monitor disease progression and failure time. Joint modeling of longitudinal and survival data has certain advantages and has emerged as an effective way to mutually enhance information. Typically, a parametric longitudinal model is assumed to facilitate the likelihood approach. However, the choice of a proper parametric model turns out to be more elusive than models for standard longitudinal studies in which no survival endpoint occurs. In this article, we propose a nonparametric multiplicative random effects model for the longitudinal process, which has many applications and leads to a flexible yet parsimonious nonparametric random effects model. A proportional hazards model is then used to link the biomarkers and event time. We use B-splines to represent the nonparametric longitudinal process, and select the number of knots and degrees based on a version of the Akaike information criterion (AIC). Unknown model parameters are estimated through maximizing the observed joint likelihood, which is iteratively maximized by the Monte Carlo Expectation Maximization (MCEM) algorithm. Due to the simplicity of the model structure, the proposed approach has good numerical stability and compares well with the competing parametric longitudinal approaches. The new approach is illustrated with primary biliary cirrhosis (PBC) data, aiming to capture nonlinear patterns of serum bilirubin time courses and their relationship with survival time of PBC patients.
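The multiplicative random-effects idea can be made concrete with a toy version: each subject's biomarker curve is a subject-specific scale factor times a shared mean curve, Y_i(t) = a_i · μ(t) + noise. The sketch below estimates a_i by least squares against a fixed mean curve; the paper instead represents μ nonparametrically with B-splines and fits everything jointly with the survival model by MCEM, so this is only the simplest possible illustration with invented numbers.

```python
# Toy multiplicative random-effects fit: given a shared mean curve mu
# evaluated on a time grid, each subject's scale a_i has the closed-form
# least-squares solution a_i = sum(mu*y) / sum(mu*mu).

def fit_scale(mu_vals, y_vals):
    num = sum(m * y for m, y in zip(mu_vals, y_vals))
    den = sum(m * m for m in mu_vals)
    return num / den

mu = [1.0, 2.0, 3.0, 4.0]        # shared mean trajectory (assumed known here)
subject = [2.1, 3.9, 6.2, 7.8]   # one subject's biomarker measurements
a_i = fit_scale(mu, subject)     # subject-specific multiplicative effect
```

In the joint model, a_i (or a transformation of it) then enters the proportional hazards model as the link between the biomarker trajectory and the event time.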

6. Polarimetric SAR interferometry applied to land ice: modeling

DEFF Research Database (Denmark)

Dall, Jørgen; Papathanassiou, Konstantinos; Skriver, Henning

2004-01-01

This paper introduces a few simple scattering models intended for the application of polarimetric SAR interferometry to land ice. The principal aim is to eliminate the penetration bias hampering ice sheet elevation maps generated with single-channel SAR interferometry. The polarimetric coherent scattering models are similar to the oriented-volume model and the random-volume-over-ground model used in vegetation studies, but the ice models are adapted to the different geometry of land ice. Also, due to compaction, land ice is not uniform; a fact that must be taken into account for large penetration depths. The validity of the scattering models is examined using L-band polarimetric interferometric SAR data acquired with the EMISAR system over an ice cap located in the percolation zone of the Greenland ice sheet. Radar reflectors were deployed on the ice surface prior to the data acquisition in order...

7. Applying Model Checking to Industrial-Sized PLC Programs

CERN Document Server

AUTHOR|(CDS)2079190; Darvas, Daniel; Blanco Vinuela, Enrique; Tournier, Jean-Charles; Bliudze, Simon; Blech, Jan Olaf; Gonzalez Suarez, Victor M

2015-01-01

Programmable logic controllers (PLCs) are embedded computers widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software but is still underused in industry due to the complexity of building and managing formal models of real applications. In this paper, we propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g., CTL, LTL) on PLC programs. This methodology is based on an intermediate model (IM), meant to transform PLC programs written in various standard languages (ST, SFC, etc.) to different modeling languages of verification tools. We present the syntax and semantics of the IM and the transformation rules of the ST and SFC languages to the nuXmv model checker passing through the intermediate model. Finally, two real case studies of CERN PLC programs, written mainly in th...

8. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

Science.gov (United States)

Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
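The projection from a global collaborative model to a per-enterprise interface model can be illustrated with a toy transformation; this is not the authors' MDA tool chain, and the triple-based process representation below is a simplifying assumption.

```python
# Toy model-to-model transformation: derive an enterprise's interface
# process by projecting a collaborative process model onto the role that
# enterprise plays. Each interaction is a (sender, receiver, message)
# triple; the interface keeps only the interactions the enterprise
# participates in, tagged as send or receive.

def interface_process(collaborative, role):
    interface = []
    for sender, receiver, message in collaborative:
        if sender == role:
            interface.append(("send", message))
        elif receiver == role:
            interface.append(("receive", message))
    return interface

collab = [
    ("Buyer", "Seller", "PurchaseOrder"),
    ("Seller", "Buyer", "OrderConfirmation"),
    ("Seller", "Carrier", "ShippingRequest"),
]
buyer_view = interface_process(collab, "Buyer")
```

Because every enterprise's interface is derived from the same global model, the resulting interface processes are consistent with one another by construction, which is the interoperability guarantee the method aims at.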

9. A study of V79 cell survival after for proton and carbon ion beams as represented by the parameters of Katz' track structure model

DEFF Research Database (Denmark)

Grzanka, Leszek; Waligórski, M. P. R.; Bassler, Niels

Katz's theory of cellular track structure (1) is an amorphous analytical model which applies a set of four cellular parameters representing survival of a given cell line after ion irradiation. Usually the values of these parameters are best fitted to a full set of experimentally measured survival curves available for a variety of ions. Once fitted, using these parameter values and the analytical formulae of the model calculations, cellular survival curves and RBE may be predicted for that cell line after irradiation by any ion, including mixed ion fields. While it is known that the Katz model... carbon irradiation. 1. Katz, R., Track structure in radiobiology and in radiation detection. Nuclear Track Detection 2: 1-28 (1978). 2. Furusawa Y. et al. Inactivation of aerobic and hypoxic cells from three different cell lines by accelerated 3He-, 12C- and 20Ne beams. Radiat Res. 2012 Jan; 177...

10. Modeling Educational Choices. A Binomial Logit Model Applied to the Demand for Higher Education.

Science.gov (United States)

Jimenez, Juan de Dios; Salas-Velasco, Manual

2000-01-01

Presents a microeconomic analysis of the choice of university degree course (3-year or 4-year course) that Spanish students make on finishing their secondary studies and applies the developed binomial logit model to survey data from 388 high school graduates. Findings show the importance of various factors in determining the likelihood of choosing the…

11. Applying Transtheoretical Model to Promote Physical Activities Among Women

National Research Council Canada - National Science Library

Pirzadeh, Asiyeh; Mostafavi, Firoozeh; Ghofranipour, Fazllolah; Feizi, Awat

2015-01-01

.... Inadequate regular physical activities among women, the importance of education in promoting the physical activities, and lack of studies on the women using transtheoretical model, persuaded us...

12. Surface-bounded growth modeling applied to human mandibles

DEFF Research Database (Denmark)

Andresen, Per Rønsholt

1999-01-01

... to yield a spatially dense field. Different methods for constructing the sparse field are compared. Adaptive Gaussian smoothing is the preferred method since it is parameter free and yields good results in practice. A new method, geometry-constrained diffusion, is used to simplify ... The most successful growth model is linear and based on results from shape analysis and principal component analysis. The growth model is tested in a cross-validation study with good results. The worst case mean modeling error in the cross-validation study is 3.7 mm. It occurs when modeling the shape and size of a 12 years...

13. Community Mobilization Model Applied to Support Grandparents Raising Grandchildren

Science.gov (United States)

Miller, Jacque; Bruce, Ann; Bundy-Fazioli, Kimberly; Fruhauf, Christine A.

2010-01-01

This article discusses the application of a community mobilization model through a case study of one community's response to address the needs of grandparents raising grandchildren. The community mobilization model presented is one that is replicable in addressing diverse community identified issues. Discussed is the building of the partnerships,…

14. Applying a Model-Based Approach for Embedded System Development

NARCIS (Netherlands)

Bunse, C.; Gross, H.G.; Peper, C.

2007-01-01

Model-based and component-oriented software development approaches are slowly superseding traditional ways of developing embedded systems. For investigating to which extent model-based development is feasible for embedded system development, we conducted a case study in which a small embedded system

15. Surface-bounded growth modeling applied to human mandibles

DEFF Research Database (Denmark)

Andresen, Per Rønsholt; Brookstein, F. L.; Conradsen, Knut

2000-01-01

... automatically using shape features and a new algorithm called geometry-constrained diffusion. The semilandmarks are mapped into Procrustes space. Principal component analysis extracts a one-dimensional subspace, which is used to construct a linear growth model. The worst case mean modeling error in a cross...

16. Applying reliability models to the maintenance of Space Shuttle software

Science.gov (United States)

Schneidewind, Norman F.

1992-01-01

Software reliability models provide the software manager with a powerful tool for predicting, controlling, and assessing the reliability of software during maintenance. We show how a reliability model can be effectively employed for reliability prediction and the development of maintenance strategies using the Space Shuttle Primary Avionics Software Subsystem as an example.

17. Hydrologic and water quality terminology as applied to modeling

Science.gov (United States)

A survey of literature and examination in particular of terminology use in a previous special collection of modeling calibration and validation papers has been conducted to arrive at a list of consistent terminology recommended for writing about hydrologic and water quality model calibration and val...

18. Studying, Teaching and Applying Sustainability Visions Using Systems Modeling

Directory of Open Access Journals (Sweden)

David M. Iwaniec

2014-07-01

Full Text Available The objective of articulating sustainability visions through modeling is to enhance the outcomes and process of visioning in order to successfully move the system toward a desired state. Models emphasize approaches to develop visions that are viable and resilient and are crafted to adhere to sustainability principles. This approach is largely assembled from visioning processes (resulting in descriptions of desirable future states generated from stakeholder values and preferences) and participatory modeling processes (resulting in systems-based representations of future states co-produced by experts and stakeholders). Vision modeling is distinct from normative scenarios and backcasting processes in that the structure and function of the future desirable state is explicitly articulated as a systems model. Crafting, representing and evaluating the future desirable state as a systems model in participatory settings is intended to support compliance with sustainability visioning quality criteria (visionary, sustainable, systemic, coherent, plausible, tangible, relevant, nuanced, motivational and shared) in order to develop rigorous and operationalizable visions. We provide two empirical examples to demonstrate the incorporation of vision modeling in research practice and education settings. In both settings, vision modeling was used to develop, represent, simulate and evaluate future desirable states. This allowed participants to better identify, explore and scrutinize sustainability solutions.

19. An electricity billing model | Adetona | Journal of Applied Science ...

African Journals Online (AJOL)

Linear regression analysis has been employed to develop a model for accurately predicting the electricity billing for commercial consumers in Ogun State (Nigeria) at a faster rate. The electricity billing model was implemented, executed and tested using embedded MATLAB function blocks. The correlations between the ...

20. Applied model for the growth of the daytime mixed layer

DEFF Research Database (Denmark)

Batchvarova, E.; Gryning, Sven-Erik

1991-01-01

A slab model is proposed for the development of the height of the mixed layer capped by stable air aloft. The model equations are closed by relating the consumption of energy (potential and kinetic) at the top of the mixed layer to the production of convective and mechanical turbulent kinetic energy with...
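
A zero-order special case of such slab models is pure encroachment, in which all surface heat flux goes into warming a layer capped by a constant lapse rate; this sketch deliberately neglects the entrainment and mechanical-production terms that the full model's energy closure includes, and the variable names are illustrative:

```python
import math

def encroachment_height(h0, heat_flux, gamma, dt):
    """Mixed-layer height by pure encroachment: dh/dt = F(t)/(gamma*h),
    i.e. the kinematic surface heat flux F (K m/s) warms a layer capped
    by a potential-temperature lapse rate gamma (K/m).  heat_flux is a
    sequence of flux values, one per time step of length dt (s)."""
    h = h0
    for f in heat_flux:
        # h^2 grows by 2*F*dt/gamma each step; exact for constant F
        h = math.sqrt(h * h + 2.0 * f * dt / gamma)
    return h
```

For a constant flux this recurrence reproduces the analytic solution h(t) = sqrt(2 F t / gamma) exactly, which makes it a convenient baseline against which entrainment effects can be judged.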

1. Geographically Weighted Logistic Regression Applied to Credit Scoring Models

Directory of Open Access Journals (Sweden)

Pedro Henrique Melo Albuquerque

Full Text Available Abstract This study used real data from a Brazilian financial institution on transactions involving Consumer Direct Credit (CDC), granted to clients residing in the Distrito Federal (DF), to construct credit scoring models via Logistic Regression and Geographically Weighted Logistic Regression (GWLR) techniques. The aims were: to verify whether the factors that influence credit risk differ according to the borrower’s geographic location; to compare the set of models estimated via GWLR with the global model estimated via Logistic Regression, in terms of predictive power and financial losses for the institution; and to verify the viability of using the GWLR technique to develop credit scoring models. The metrics used to compare the models developed via the two techniques were the AICc informational criterion, the accuracy of the models, the percentage of false positives, the sum of the value of false positive debt, and the expected monetary value of portfolio default compared with the monetary value of defaults observed. The models estimated for each region in the DF were distinct in their variables and coefficients (parameters), with it being concluded that credit risk was influenced differently in each region in the study. The Logistic Regression and GWLR methodologies presented very close results, in terms of predictive power and financial losses for the institution, and the study demonstrated viability in using the GWLR technique to develop credit scoring models for the target population in the study.

2. [A new perspective of survival data on clinical epidemiology: introduction of competitive risk model].

Science.gov (United States)

Nie, Z Q; Ou, Y Q; Qu, Y J; Yuan, H Y; Liu, X Q

2017-08-10

Competing risks occur frequently in the analysis of survival data and should be handled with competing risk models. A competing risk is an event whose occurrence precludes the occurrence of the primary event of interest. The commonly used Kaplan-Meier method tends to overestimate the cumulative survival functions, while the traditional Cox proportional hazards model falsely evaluates the effects of covariates on the hazard related to the occurrence of the event. There are few domestic reports mentioning the concept, application and methodology of competing risk models, or the implementation procedures for setting model conditions and parameters. The current work aims to explain the core concept and methodology of the competing risk model and to illustrate the analysis of the cumulative incidence rate, using both the cause-specific hazard function model and the sub-distribution hazard function model. Software macro code in SAS 9.4 is also provided to assist clinical researchers to further understand the application of the model so as to properly analyze the survival data.
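
The cumulative incidence function central to this methodology can be estimated nonparametrically. Below is a minimal pure-Python sketch of an Aalen-Johansen-type estimator without covariates; it is illustrative only, not the SAS macro the paper provides, and the event-coding convention (0 = censored, otherwise a cause code) is an assumption:

```python
def cumulative_incidence(times, events, cause=1):
    """Nonparametric cumulative incidence function for competing risks
    (Aalen-Johansen estimator without covariates).
    events: 0 = censored, otherwise the cause code of the event."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0      # overall event-free survival just before t
    cif = 0.0
    at_risk = n
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        d_cause = d_any = c = 0
        while i < n and data[i][0] == t:   # group tied times
            if data[i][1] == 0:
                c += 1
            else:
                d_any += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / at_risk    # increment for cause of interest
        surv *= 1.0 - d_any / at_risk      # Kaplan-Meier step for any event
        at_risk -= d_any + c
        out.append((t, cif))
    return out
```

Note that when a competing event occurs, the subject is removed from the risk set but never contributes to the cause-of-interest increment, which is exactly why 1 minus the naive Kaplan-Meier curve overstates cumulative incidence.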

3. Nonstandard Finite Difference Method Applied to a Linear Pharmacokinetics Model

Directory of Open Access Journals (Sweden)

Oluwaseun Egbelowo

2017-05-01

Full Text Available We extend the nonstandard finite difference method of solution to the study of pharmacokinetic–pharmacodynamic models. Pharmacokinetic (PK) models are commonly used to predict drug concentrations that drive controlled intravenous (I.V.) transfers (or infusion and oral transfers), while pharmacokinetic and pharmacodynamic (PD) interaction models are used to provide predictions of drug concentrations affecting the response of these clinical drugs. We structure a nonstandard finite difference (NSFD) scheme for the relevant system of equations which models this pharmacokinetic process. We compare the results obtained to standard methods. The scheme is dynamically consistent and reliable in replicating complex dynamic properties of the relevant continuous models for varying step sizes. This study provides assistance in understanding the long-term behavior of the drug in the system, and validation of the efficiency of the nonstandard finite difference scheme as the method of choice.

4. Nonstandard Finite Difference Method Applied to a Linear Pharmacokinetics Model.

Science.gov (United States)

Egbelowo, Oluwaseun; Harley, Charis; Jacobs, Byron

2017-05-04

We extend the nonstandard finite difference method of solution to the study of pharmacokinetic-pharmacodynamic models. Pharmacokinetic (PK) models are commonly used to predict drug concentrations that drive controlled intravenous (I.V.) transfers (or infusion and oral transfers) while pharmacokinetic and pharmacodynamic (PD) interaction models are used to provide predictions of drug concentrations affecting the response of these clinical drugs. We structure a nonstandard finite difference (NSFD) scheme for the relevant system of equations which models this pharmacokinetic process. We compare the results obtained to standard methods. The scheme is dynamically consistent and reliable in replicating complex dynamic properties of the relevant continuous models for varying step sizes. This study provides assistance in understanding the long-term behavior of the drug in the system, and validation of the efficiency of the nonstandard finite difference scheme as the method of choice.
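
The NSFD idea can be illustrated on the simplest linear elimination model, dC/dt = -kC. A Mickens-type scheme replaces the step size h by a denominator function phi(h) = (1 - e^(-kh))/k, which makes the discrete solution match the exact decay for any step size, whereas forward Euler loses positivity for kh > 1. This is a generic sketch of the technique, not the authors' scheme for the full PK-PD system:

```python
import math

def euler_decay(c0, k, h, steps):
    """Standard forward Euler for dC/dt = -k*C."""
    c = c0
    for _ in range(steps):
        c += -k * c * h
    return c

def nsfd_decay(c0, k, h, steps):
    """Mickens-type nonstandard scheme: the step size h is replaced by
    the denominator function phi(h) = (1 - exp(-k*h))/k, so each step
    multiplies C by exp(-k*h), matching the exact solution."""
    phi = (1.0 - math.exp(-k * h)) / k
    c = c0
    for _ in range(steps):
        c += -k * c * phi
    return c
```

For a large step (kh = 3) the Euler iterate oscillates with growing magnitude and turns negative, while the NSFD iterate stays positive and decays, which is the dynamical consistency the abstract refers to.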

5. Effects of temperature on development, survival and reproduction of insects: Experimental design, data analysis and modeling

Science.gov (United States)

Jacques Regniere; James Powell; Barbara Bentz; Vincent Nealis

2012-01-01

The developmental response of insects to temperature is important in understanding the ecology of insect life histories. Temperature-dependent phenology models permit examination of the impacts of temperature on the geographical distributions, population dynamics and management of insects. The measurement of insect developmental, survival and reproductive responses to...

6. Modeling the Impact of Breast-Feeding by HIV-Infected Women on Child Survival.

Science.gov (United States)

Heymann, Sally Jody

1990-01-01

Models the survival outcomes of children in developing countries born to women infected with human immunodeficiency virus (HIV) who are breast-fed, bottle-fed, and wet-nursed. Uses decision analysis to assess the relative risk of child mortality from HIV transmission and non-HIV causes associated with different methods of feeding. (FMW)

7. Modelling circulating tumour cells for personalised survival prediction in metastatic breast cancer.

Directory of Open Access Journals (Sweden)

Gianluca Ascolani

2015-05-01

Full Text Available Ductal carcinoma is one of the most common cancers among women, and the main cause of death is the formation of metastases. The development of metastases is caused by cancer cells that migrate from the primary tumour site (the mammary duct) through the blood vessels and, by extravasating, initiate metastasis. Here, we propose a multi-compartment model which mimics the dynamics of tumoural cells in the mammary duct, in the circulatory system and in the bone. Through a branching process model, we describe the relation between the survival times and the four markers mainly involved in metastatic breast cancer (EPCAM, CD47, CD44 and MET). In particular, the model takes into account the gene expression profile of circulating tumour cells to predict personalised survival probability. We also include the administration of drugs such as bisphosphonates, which reduce the formation of circulating tumour cells and their survival in the blood vessels, in order to analyse the dynamic changes induced by the therapy. We analyse the effects of circulating tumour cells on the progression of the disease, providing a quantitative measure of the cell driver mutations needed for invading the bone tissue. Our model allows the design of intervention scenarios that alter the patient-specific survival probability by modifying the populations of circulating tumour cells, and it could be extended to other cancer metastasis dynamics.

8. Modeling receptor kinetics in the analysis of survival data for organophosphorus pesticides.

NARCIS (Netherlands)

Jager, D.T.; Kooijman, S.A.L.M.

2005-01-01

Acute ecotoxicological tests usually focus on survival at a standardized exposure time. However, LC50's decrease in time in a manner that depends both on the chemical and on the organism. DEBtox is an existing approach to analyze toxicity data in time, based on hazard modeling (the internal

9. A comparative study of machine learning methods for time-to-event survival data for radiomics risk modelling.

Science.gov (United States)

Leger, Stefan; Zwanenburg, Alex; Pilz, Karoline; Lohaus, Fabian; Linge, Annett; Zöphel, Klaus; Kotzerke, Jörg; Schreiber, Andreas; Tinhofer, Inge; Budach, Volker; Sak, Ali; Stuschke, Martin; Balermpas, Panagiotis; Rödel, Claus; Ganswindt, Ute; Belka, Claus; Pigorsch, Steffi; Combs, Stephanie E; Mönnich, David; Zips, Daniel; Krause, Mechthild; Baumann, Michael; Troost, Esther G C; Löck, Steffen; Richter, Christian

2017-10-16

Radiomics applies machine learning algorithms to quantitative imaging data to characterise the tumour phenotype and predict clinical outcome. For the development of radiomics risk models, a variety of different algorithms is available and it is not clear which one gives optimal results. Therefore, we assessed the performance of 11 machine learning algorithms combined with 12 feature selection methods by the concordance index (C-Index), to predict loco-regional tumour control (LRC) and overall survival for patients with head and neck squamous cell carcinoma. The considered algorithms are able to deal with continuous time-to-event survival data. Feature selection and model building were performed on a multicentre cohort (213 patients) and validated using an independent cohort (80 patients). We found several combinations of machine learning algorithms and feature selection methods which achieve similar results, e.g. C-Index = 0.71 and BT-COX: C-Index = 0.70 in combination with Spearman feature selection. Using the best performing models, patients were stratified into groups of low and high risk of recurrence. Significant differences in LRC were obtained between both groups on the validation cohort. Based on the presented analysis, we identified a subset of algorithms which should be considered in future radiomics studies to develop stable and clinically relevant predictive models for time-to-event endpoints.
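
The concordance index (C-Index) used above to rank models can be computed directly from pairwise comparisons of predicted risk and observed time. A minimal sketch of Harrell's C-index for right-censored data follows; it is illustrative, since the study's own implementation is not described in the abstract:

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-index: among usable pairs, the fraction where the
    subject with the shorter observed event time also has the higher
    predicted risk.  events: 1 = event observed, 0 = censored.
    A pair (i, j) is usable only if i's event is observed and occurs
    strictly before j's observed time."""
    concordant = 0.0
    usable = 0
    n = len(times)
    for i in range(n):
        if events[i] != 1:
            continue                       # censored subjects cannot anchor a pair
        for j in range(n):
            if times[i] < times[j]:
                usable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5      # ties in risk count half
    return concordant / usable
```

A value of 0.5 corresponds to random ranking and 1.0 to perfect ranking, which puts the reported C-Index values around 0.70-0.71 in context.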

10. Alexandrium minutum growth controlled by phosphorus. An applied model

Science.gov (United States)

Chapelle, A.; Labry, C.; Sourisseau, M.; Lebreton, C.; Youenou, A.; Crassous, M. P.

2010-11-01

Toxic algae are a worldwide problem threatening aquaculture, public health and tourism. Alexandrium, a toxic dinoflagellate proliferates in Northwest France estuaries (i.e. the Penzé estuary) causing Paralytic Shellfish Poisoning events. Vegetative growth, and in particular the role of nutrient uptake and growth rate, are crucial parameters to understand toxic blooms. With the goal of modelling in situ Alexandrium blooms related to environmental parameters, we first try to calibrate a zero-dimensional box model of Alexandrium growth. This work focuses on phosphorus nutrition. Our objective is to calibrate Alexandrium minutum as well as Heterocapsa triquetra (a non-toxic dinoflagellate) growth under different rates of phosphorus supply, other factors being optimal and constant. Laboratory experiments are used to calibrate two growth models and three uptake models for each species. Models are then used to simulate monospecific batch and semi-continuous experiments as well as competition between the two algae (mixed cultures). Results show that the Droop growth model together with linear uptake versus quota can represent most of our observations, although a power law uptake function can more accurately simulate our phosphorus uptake data. We note that such models have limitations in non steady-state situations and cell quotas can depend on a variety of factors, so care must be taken in extrapolating these results beyond the specific conditions studied.
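
In the Droop model the authors calibrate, growth depends on the internal cell quota rather than on the external nutrient directly: mu(Q) = mu_max (1 - Q_min/Q). The following batch-culture sketch couples that growth law to Michaelis-Menten phosphorus uptake; the parameter values and function names are illustrative, not the calibrated values from the paper:

```python
def droop_batch(s0, x0, q0, t_end, dt=0.01,
                mu_max=1.2, q_min=0.004, v_max=0.02, k_s=0.1):
    """Euler simulation of batch growth under the Droop quota model.
    s: dissolved phosphorus, x: biomass, q: internal cell quota.
    Total phosphorus s + q*x is conserved by the continuous model."""
    s, x, q = s0, x0, q0
    t = 0.0
    while t < t_end:
        uptake = v_max * s / (k_s + s)            # Michaelis-Menten uptake
        mu = mu_max * max(0.0, 1.0 - q_min / q)   # Droop growth rate
        ds = -uptake * x
        dq = uptake - mu * q                      # quota diluted by growth
        dx = mu * x
        s = max(0.0, s + ds * dt)
        q += dq * dt
        x += dx * dt
        t += dt
    return s, x, q
```

The characteristic Droop behaviour appears in the output: once dissolved phosphorus is exhausted, growth continues on stored quota until Q approaches Q_min, so final biomass is set by total phosphorus divided by the minimum quota.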

11. Robust model identification applied to type 1 diabetes

DEFF Research Database (Denmark)

Finan, Daniel Aaron; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

2010-01-01

In many realistic applications, process noise is known to be neither white nor normally distributed. When identifying models in these cases, it may be more effective to minimize a different penalty function than the standard sum of squared errors (as in a least-squares identification method). Thi...... accurate estimates of parameters than the standard least-squares solution, and more accurate model predictions for test data. The identification techniques are demonstrated on a simple toy problem as well as a physiological model of type 1 diabetes....

12. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

DEFF Research Database (Denmark)

Sørensen, Kresten Kjær; Stoustrup, Jakob

2008-01-01

This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system...... is divided into components where the inputs and outputs are described by a set of XML files that can be combined into a composite system model that may be loaded into MATLABtrade. A set of tools that allows the user to easily load the model and run a simulation are provided. The results show a simulation...

13. Trailing edge noise model applied to wind turbine airfoils

Energy Technology Data Exchange (ETDEWEB)

Bertagnolio, F.

2008-01-15

The aim of this work is firstly to provide a quick introduction to the theories of noise generation that are relevant to wind turbine technology, with focus on trailing edge noise. Secondly, the so-called TNO trailing edge noise model developed by Parchen [1] is described in more detail. The model is tested and validated by comparison with other results from the literature. Finally, this model is used in the optimization process of two reference airfoils in order to reduce their noise signature: the RISOE-B1-18 and the S809 airfoils. (au)

14. Mitochondrial-Based Treatments that Prevent Post-Traumatic Osteoarthritis in a Translational Large Animal Intraarticular Fracture Survival Model

Science.gov (United States)

2016-09-01

PRINCIPAL INVESTIGATOR: James A. Martin, PhD. CONTRACTING ORGANIZATION: University of Iowa, Iowa City, IA. GRANT NUMBER: W81XWH-11-1-0583. Keywords: post-traumatic osteoarthritis, large animal model, oxidative stress, mitochondria, mechanotransduction, amobarbital, n-acetyl cysteine.

15. Agent-Based Modelling applied to 5D model of the HIV infection

Directory of Open Access Journals (Sweden)

Toufik Laroum

2016-12-01

The simplest model was the 3D mathematical model, but the complexity of this phenomenon and the diversity of cells and actors which affect its evolution require the use of new approaches, such as the multi-agent approach that we have applied in this paper. The results of our simulator on the 5D model are promising because they are consistent with biological knowledge. Therefore, the proposed approach is well suited to the study of population dynamics in general and could help to understand and predict the dynamics of HIV infection.

16. Hydrodynamics and water quality models applied to Sepetiba Bay

Science.gov (United States)

Cunha, Cynara de L. da N.; Rosman, Paulo C. C.; Ferreira, Aldo Pacheco; Carlos do Nascimento Monteiro, Teófilo

2006-10-01

A coupled hydrodynamic and water quality model is used to simulate the pollution in Sepetiba Bay due to sewage effluent. Sepetiba Bay has a complicated geometry and bottom topography, and is located on the Brazilian coast near Rio de Janeiro. In the simulation, the dissolved oxygen (DO) concentration and biochemical oxygen demand (BOD) are used as indicators for the presence of organic matter in the body of water, and as parameters for evaluating the environmental pollution of the eastern part of Sepetiba Bay. Effluent sources in the model are taken from DO and BOD field measurements. The simulation results are consistent with field observations and demonstrate that the model has been correctly calibrated. The model is suitable for evaluating the environmental impact of sewage effluent on Sepetiba Bay from river inflows, assessing the feasibility of different treatment schemes, and developing specific monitoring activities. This approach has general applicability for environmental assessment of complicated coastal bays.
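
The DO/BOD interplay used above as a pollution indicator is classically captured, in one idealized dimension, by the Streeter-Phelps oxygen-sag solution. The sketch below shows that textbook idealization only; it is not the paper's coupled hydrodynamic and water quality model, and the rate constants are illustrative:

```python
import math

def streeter_phelps(L0, D0, kd, kr, t):
    """Dissolved-oxygen deficit D(t) downstream of an organic load:
    L0 initial BOD, D0 initial deficit, kd deoxygenation rate,
    kr reaeration rate (kd != kr), t travel time."""
    return (kd * L0 / (kr - kd)) * (math.exp(-kd * t) - math.exp(-kr * t)) \
           + D0 * math.exp(-kr * t)

def critical_time(L0, D0, kd, kr):
    """Travel time at which the deficit is maximal (the sag point),
    obtained by setting dD/dt = 0."""
    return (1.0 / (kr - kd)) * math.log(
        (kr / kd) * (1.0 - D0 * (kr - kd) / (kd * L0)))
```

The sag point is where the DO concentration is lowest, which is typically the location of interest when judging whether an effluent discharge is acceptable.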

17. A Model-based Prognostics Approach Applied to Pneumatic Valves

Data.gov (United States)

National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

18. Lithospheric structure models applied for locating the Romanian seismic events

Directory of Open Access Journals (Sweden)

V. Oancea

1994-06-01

Full Text Available The paper presents our attempts to improve the locations obtained for local seismic events, using refined lithospheric structure models. The location program (based on the Geiger method) assumes a known model. The program is run for some seismic sequences which occurred in different regions on the Romanian territory, using for each of the sequences three velocity models: (1) 7 layers of constant seismic wave velocity, as an average structure of the lithosphere for the whole territory; (2) a site-dependent structure (below each station), based on geophysical and geological information on the crust; (3) curves describing the dependence of propagation velocities on depth in the lithosphere, characterizing the 7 structural units delineated on the Romanian territory. The results obtained using the different velocity models are compared. Station corrections are computed for each data set. Finally, the locations determined for some quarry blasts are compared with the real ones.

19. A Model-Based Prognostics Approach Applied to Pneumatic Valves

Data.gov (United States)

National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

20. Applying Functional Modeling for Accident Management of Nuclear Power Plant

DEFF Research Database (Denmark)

Lind, Morten; Zhang, Xinxin

2014-01-01

The paper investigates applications of functional modeling for accident management in complex industrial plant with special reference to nuclear power production. Main applications for information sharing among decision makers and decision support are identified. An overview of Multilevel Flow...

1. Applying Functional Modeling for Accident Management of Nuclear Power Plant

DEFF Research Database (Denmark)

Lind, Morten; Zhang, Xinxin

2014-01-01

The paper investigates applications of functional modeling for accident management in complex industrial plants with special reference to nuclear power production. Main applications for information sharing among decision makers and decision support are identified. An overview of Multilevel Flow...

2. Joint regression analysis and AMMI model applied to oat improvement

Science.gov (United States)

Oliveira, A.; Oliveira, T. A.; Mejza, S.

2012-09-01

In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of the regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model to study and to estimate phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R software for the AMMI model analysis.
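
Joint Regression Analysis in the Finlay-Wilkinson sense regresses each genotype's yield on an environment index, usually the mean yield of all genotypes in that environment; a slope near 1 indicates average stability. The sketch below uses plain per-genotype least squares rather than the Zigzag algorithm the paper employs, and the data layout is an assumption:

```python
def joint_regression(yields):
    """Finlay-Wilkinson joint regression.  yields[g][e] is the yield of
    genotype g in environment e.  Returns a list of (intercept, slope)
    pairs, one per genotype, from regressing yield on the environment
    index (the per-environment mean over all genotypes)."""
    n_g = len(yields)
    n_e = len(yields[0])
    env_index = [sum(yields[g][e] for g in range(n_g)) / n_g
                 for e in range(n_e)]
    mean_idx = sum(env_index) / n_e
    sxx = sum((x - mean_idx) ** 2 for x in env_index)
    coefs = []
    for g in range(n_g):
        mean_y = sum(yields[g]) / n_e
        sxy = sum((env_index[e] - mean_idx) * (yields[g][e] - mean_y)
                  for e in range(n_e))
        b = sxy / sxx          # slope: sensitivity to environment quality
        a = mean_y - b * mean_idx
        coefs.append((a, b))
    return coefs
```

Because the environment index is itself the genotype mean, the slopes always average to 1 across genotypes, which is a handy internal check.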

3. Tensegrity applied to modelling the motion of viruses

Science.gov (United States)

Simona-Mariana, Cretu; Gabriela-Catalina, Brinzan

2011-02-01

A considerable number of virus structures have been discovered and more are expected to be identified. Different virus symmetries can be observed at the nanoscale level. Mechanical models of some viruses realised by scientists are described in this paper; none of them has taken into consideration the internal deformation of subsystems. The authors' models for some virus elements are introduced, with rigid and flexible links, which reproduce the movements of viruses including internal deformations of the subunits.

4. Validation of models with constant bias: an applied approach

Directory of Open Access Journals (Sweden)

2014-06-01

Full Text Available Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese for when a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions were the hypothesis tests and maximum anticipated error for the alternative approach, and the confidence interval for a quantile of the distribution of errors. Results. The model evaluated showed CB; once the CB is removed, and with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. Therefore, the validated model can be used to predict the daily weight gain of cattle, although it will require an adjustment in its structure, based on the presence of CB, to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the distribution of errors after correcting the constant bias allows determining the upper limit for the magnitude of the prediction error and using it to evaluate the evolution of the model in forecasting the system. The confidence interval approach to validate a model is more informative than the hypothesis tests for the same purpose.
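
The core idea of removing a constant bias and then bounding the remaining error by a quantile of the error distribution can be sketched empirically as follows. This is a simplified, purely empirical version for illustration; the paper develops formal hypothesis tests and a confidence interval for the quantile, which this sketch does not reproduce:

```python
def constant_bias_validation(observed, predicted, q=0.95):
    """Estimate the constant bias (CB) of a model as the mean raw error,
    remove it, and report the empirical q-quantile of the absolute
    bias-corrected errors as a bound on the prediction error magnitude."""
    n = len(observed)
    errors = [p - o for p, o in zip(predicted, observed)]
    cb = sum(errors) / n                       # constant bias estimate
    corrected = sorted(abs(e - cb) for e in errors)
    # nearest-rank empirical quantile
    k = min(n - 1, max(0, int(round(q * n)) - 1))
    return cb, corrected[k]
```

If the model's only defect is the constant offset, the corrected errors collapse to zero and the reported bound vanishes, mirroring the paper's conclusion that the model is usable after adjusting its structure for the CB.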

5. Role of adipose-derived stromal cells in pedicle skin flap survival in experimental animal models

Science.gov (United States)

Foroglou, Pericles; Karathanasis, Vasileios; Demiri, Efterpi; Koliakos, George; Papadakis, Marios

2016-01-01

The use of skin flaps in reconstructive surgery is the first-line surgical treatment for the reconstruction of skin defects and is essentially considered the starting point of plastic surgery. Despite their excellent usability, their application includes general surgical risks or possible complications, the primary and most common is necrosis of the flap. To improve flap survival, researchers have used different methods, including the use of adipose-derived stem cells, with significant positive results. In our research we will report the use of adipose-derived stem cells in pedicle skin flap survival based on current literature on various experimental models in animals. PMID:27022440

6. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

Science.gov (United States)

Yan, Ying; Yi, Grace Y

2016-07-01

Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights of measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.

7. Semiparametric Maximum Likelihood Estimation in Normal Transformation Models for Bivariate Survival Data

Science.gov (United States)

Li, Yi; Prentice, Ross L.; Lin, Xihong

2008-01-01

We consider a class of semiparametric normal transformation models for right censored bivariate failure times. Nonparametric hazard rate models are transformed to a standard normal model and a joint normal distribution is assumed for the bivariate vector of transformed variates. A semiparametric maximum likelihood estimation procedure is developed for estimating the marginal survival distribution and the pairwise correlation parameters. This produces an efficient estimator of the correlation parameter of the semiparametric normal transformation model, which characterizes the bivariate dependence of bivariate survival outcomes. In addition, a simple positive-mass-redistribution algorithm can be used to implement the estimation procedures. Since the likelihood function involves infinite-dimensional parameters, the empirical process theory is utilized to study the asymptotic properties of the proposed estimators, which are shown to be consistent, asymptotically normal and semiparametric efficient. A simple estimator for the variance of the estimates is also derived. The finite sample performance is evaluated via extensive simulations. PMID:19079778

8. Semiparametric Maximum Likelihood Estimation in Normal Transformation Models for Bivariate Survival Data.

Science.gov (United States)

Li, Yi; Prentice, Ross L; Lin, Xihong

2008-12-01

We consider a class of semiparametric normal transformation models for right censored bivariate failure times. Nonparametric hazard rate models are transformed to a standard normal model and a joint normal distribution is assumed for the bivariate vector of transformed variates. A semiparametric maximum likelihood estimation procedure is developed for estimating the marginal survival distribution and the pairwise correlation parameters. This produces an efficient estimator of the correlation parameter of the semiparametric normal transformation model, which characterizes the bivariate dependence of bivariate survival outcomes. In addition, a simple positive-mass-redistribution algorithm can be used to implement the estimation procedures. Since the likelihood function involves infinite-dimensional parameters, the empirical process theory is utilized to study the asymptotic properties of the proposed estimators, which are shown to be consistent, asymptotically normal and semiparametric efficient. A simple estimator for the variance of the estimates is also derived. The finite sample performance is evaluated via extensive simulations.

9. Applying artificial vision models to human scene understanding.

Science.gov (United States)

Aminoff, Elissa M; Toneva, Mariya; Shrivastava, Abhinav; Chen, Xinlei; Misra, Ishan; Gupta, Abhinav; Tarr, Michael J

2015-01-01

How do we understand the complex patterns of neural responses that underlie scene understanding? Studies of the network of brain regions held to be scene-selective-the parahippocampal/lingual region (PPA), the retrosplenial complex (RSC), and the occipital place area (TOS)-have typically focused on single visual dimensions (e.g., size), rather than the high-dimensional feature space in which scenes are likely to be neurally represented. Here we leverage well-specified artificial vision systems to explicate a more complex understanding of how scenes are encoded in this functional network. We correlated similarity matrices within three different scene-spaces arising from: (1) BOLD activity in scene-selective brain regions; (2) behaviorally measured judgments of visually-perceived scene similarity; and (3) several different computer vision models. These correlations revealed: (1) models that relied on mid- and high-level scene attributes showed the highest correlations with the patterns of neural activity within the scene-selective network; (2) NEIL and SUN-the models that best accounted for the patterns obtained from PPA and TOS-were different from the GIST model that best accounted for the pattern obtained from RSC; (3) the best performing models outperformed behaviorally-measured judgments of scene similarity in accounting for neural data. One computer vision method-NEIL ("Never-Ending-Image-Learner"), which incorporates visual features learned as statistical regularities across web-scale numbers of scenes-showed significant correlations with neural activity in all three scene-selective regions and was one of the two models best able to account for variance in the PPA and TOS. We suggest that these results are a promising first step in explicating more fine-grained models of neural scene understanding, including developing a clearer picture of the division of labor among the components of the functional scene-selective brain network.

10. Applying artificial vision models to human scene understanding

Directory of Open Access Journals (Sweden)

Elissa Michele Aminoff

2015-02-01

Full Text Available How do we understand the complex patterns of neural responses that underlie scene understanding? Studies of the network of brain regions held to be scene-selective – the parahippocampal/lingual region (PPA), the retrosplenial complex (RSC), and the occipital place area (TOS) – have typically focused on single visual dimensions (e.g., size), rather than the high-dimensional feature space in which scenes are likely to be neurally represented. Here we leverage well-specified artificial vision systems to explicate a more complex understanding of how scenes are encoded in this functional network. We correlated similarity matrices within three different scene-spaces arising from: (1) BOLD activity in scene-selective brain regions; (2) behaviorally measured judgments of visually-perceived scene similarity; and (3) several different computer vision models. These correlations revealed: (1) models that relied on mid- and high-level scene attributes showed the highest correlations with the patterns of neural activity within the scene-selective network; (2) NEIL and SUN – the models that best accounted for the patterns obtained from PPA and TOS – were different from the GIST model that best accounted for the pattern obtained from RSC; (3) the best-performing models outperformed behaviorally measured judgments of scene similarity in accounting for neural data. One computer vision method – NEIL ("Never-Ending-Image-Learner"), which incorporates visual features learned as statistical regularities across web-scale numbers of scenes – showed significant correlations with neural activity in all three scene-selective regions and was one of the two models best able to account for variance in the PPA and TOS. We suggest that these results are a promising first step in explicating more fine-grained models of neural scene understanding, including developing a clearer picture of the division of labor among the components of the functional scene-selective brain network.

11. A Model-based Prognostics Approach Applied to Pneumatic Valves

Directory of Open Access Journals (Sweden)

Matthew J. Daigle

2011-01-01

Full Text Available Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction; therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
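The prognostics loop described above can be sketched with a bootstrap particle filter. This is a minimal illustration, not the authors' valve physics: the hidden state is a toy wear level growing at an uncertain rate, noisy measurements are assimilated, and remaining useful life (RUL) is predicted by propagating each particle to a hypothetical failure threshold (all parameters are assumed for the example).

```python
import math
import random
import statistics

random.seed(42)

N = 500                        # number of particles
RATE, Q, R = 0.1, 0.02, 0.05   # assumed wear rate, process noise, sensor noise
THRESHOLD = 5.0                # wear level regarded as failure (assumed)

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# initial particle cloud around a healthy (zero-wear) state
particles = [random.gauss(0.0, 0.1) for _ in range(N)]

def step(particles, y):
    """One predict-update-resample cycle of the bootstrap filter."""
    # predict: propagate each particle through the degradation model
    pred = [p + RATE + random.gauss(0.0, Q) for p in particles]
    # update: weight each particle by the measurement likelihood
    w = [gauss_pdf(y, p, R) for p in pred]
    s = sum(w) or 1e-300
    w = [x / s for x in w]
    # resample (multinomial) to refocus particles on likely states
    return random.choices(pred, weights=w, k=len(pred))

# simulate and assimilate ten noisy wear measurements
truth = 0.0
for t in range(10):
    truth += RATE
    particles = step(particles, truth + random.gauss(0.0, R))

estimate = statistics.mean(particles)

def rul(p):
    """Steps until a particle's predicted wear crosses the failure threshold."""
    k = 0
    while p < THRESHOLD:
        p += RATE + random.gauss(0.0, Q)
        k += 1
    return k

# propagating every particle yields a distribution over RUL, not a point value
rul_samples = [rul(p) for p in particles]
print(round(estimate, 2), round(statistics.mean(rul_samples)))
```

Propagating the whole particle cloud is what gives the "uncertainty management" the abstract emphasizes: the spread of `rul_samples` is a direct uncertainty estimate on the failure-time prediction.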

12. Effects of temperature on development, survival and reproduction of insects: experimental design, data analysis and modeling.

Science.gov (United States)

Régnière, Jacques; Powell, James; Bentz, Barbara; Nealis, Vincent

2012-05-01

The developmental response of insects to temperature is important in understanding the ecology of insect life histories. Temperature-dependent phenology models permit examination of the impacts of temperature on the geographical distributions, population dynamics and management of insects. The measurement of insect developmental, survival and reproductive responses to temperature poses practical challenges because of their modality, variability among individuals and high mortality near the lower and upper threshold temperatures. We address this challenge with an integrated approach to the design of experiments and analysis of data based on maximum likelihood. This approach expands, simplifies and unifies the analysis of laboratory data parameterizing the thermal responses of insects in particular and poikilotherms in general. This approach allows the use of censored observations (records of surviving individuals that have not completed development after a certain time) and accommodates observations from temperature transfer treatments in which individuals pass only a portion of their development at an extreme (near-threshold) temperature and are then placed in optimal conditions to complete their development with a higher rate of survival. Results obtained from this approach are directly applicable to individual-based modeling of insect development, survival and reproduction with respect to temperature. This approach makes possible the development of process-based phenology models that are based on optimal use of available information, and will aid in the development of powerful tools for analyzing eruptive insect population behavior and response to changing climatic conditions. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
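The key idea above, a likelihood that mixes completed and censored observations, can be sketched for one temperature treatment. This is an illustrative simplification (exponentially distributed development times, made-up data), not the authors' full framework: completed individuals contribute the density, censored ones contribute the survivor function.

```python
import math

# made-up laboratory data at a single temperature
completed = [12.0, 15.0, 9.0, 11.0, 14.0]   # observed development times (days)
censored = [20.0, 20.0]                     # still developing when trial ended at day 20

def log_lik(rate):
    # density f(t) = rate * exp(-rate*t) for completed individuals,
    # survivor S(t) = exp(-rate*t) for right-censored individuals
    ll = sum(math.log(rate) - rate * t for t in completed)
    ll += sum(-rate * t for t in censored)
    return ll

# closed-form MLE for the exponential model: events / total exposure time
mle = len(completed) / (sum(completed) + sum(censored))

# sanity check: the closed form maximizes the log-likelihood on a grid
grid = [i / 1000 for i in range(1, 300)]
best = max(grid, key=log_lik)
print(round(mle, 4), round(best, 4))
```

Dropping the censored terms would bias the rate upward, which is exactly why the integrated likelihood treatment matters near the threshold temperatures where mortality and censoring are heavy.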

13. An approach to the drone fleet survivability assessment based on a stochastic continuous-time model

Science.gov (United States)

Kharchenko, Vyacheslav; Fesenko, Herman; Doukas, Nikos

2017-09-01

An approach and an algorithm for drone fleet survivability assessment based on a stochastic continuous-time model are proposed. The input data are the number of drones, the drone fleet redundancy coefficient, the drone stability and restoration rates, the limit deviation from the norms of drone fleet recovery, the drone fleet operational availability coefficient, the probability of failure-free drone operation, and the time needed for the drone fleet to perform the required tasks. Ways of improving the survivability of a recoverable drone fleet, taking into account the damaging factors of system accidents, are suggested. Dependencies of the drone fleet survivability rate on both the drone stability and the number of drones are analysed.
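A minimal sketch in the spirit of a recoverable-fleet model (not the paper's actual formulation, and all rates are assumed): each drone alternates between operation and restoration with failure rate `lam` and restoration rate `mu`, giving steady-state availability A = mu / (lam + mu); the fleet is taken to survive if at least k of its n drones are up, which yields a binomial survivability estimate.

```python
import math

def availability(lam, mu):
    """Steady-state availability of one repairable drone."""
    return mu / (lam + mu)

def fleet_survivability(n, k, lam, mu):
    """Probability that at least k of n independent drones are operational."""
    a = availability(lam, mu)
    return sum(math.comb(n, j) * a**j * (1 - a)**(n - j) for j in range(k, n + 1))

a = availability(0.01, 0.09)                 # per-drone availability
s = fleet_survivability(10, 8, 0.01, 0.09)   # mission needs >= 8 of 10 drones
print(round(a, 2), round(s, 3))
```

The redundancy dependence the abstract mentions falls out directly: increasing n for a fixed task requirement k raises the binomial tail probability.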

14. Statistical modelling of survival data with random effects h-likelihood approach

CERN Document Server

Ha, Il Do; Lee, Youngjo

2017-01-01

This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

15. Radiative transfer theory applied to ocean bottom modeling.

Science.gov (United States)

Quijano, Jorge E; Zurk, Lisa M

2009-10-01

Research on the propagation of acoustic waves in the ocean bottom sediment is of interest for active sonar applications such as target detection and remote sensing. The interaction of acoustic energy with the sea floor sublayers is usually modeled with techniques based on the full solution of the wave equation, which sometimes leads to mathematically intractable problems. An alternative way to model wave propagation in layered media containing random scatterers is the radiative transfer (RT) formulation, which is a well established technique in the electromagnetics community and is based on the principle of conservation of energy. In this paper, the RT equation is used to model the backscattering of acoustic energy from a layered elastic bottom sediment containing distributions of independent scatterers due to a constant single frequency excitation in the water column. It is shown that the RT formulation provides insight into the physical phenomena of scattering and conversion of energy between waves of different polarizations.

16. Applied Bounded Model Checking for Interlocking System Designs

DEFF Research Database (Denmark)

Haxthausen, Anne Elisabeth; Peleska, Jan; Pinger, Ralf

2013-01-01

In this article the verification and validation of interlocking systems is investigated. Reviewing both geographical and route-related interlocking, the verification objectives can be structured from a perspective of computer science into (1) verification of static semantics, and (2) verification of behavioural (operational) semantics. The former checks that the plant model – that is, the software components reflecting the physical components of the interlocking system – has been set up in an adequate way. The latter investigates trains moving through the network, with the objective to uncover potential safety violations. From a formal methods perspective, these verification objectives can be approached by theorem proving, global, or bounded model checking. This article explains the techniques for application of bounded model checking techniques, and discusses their advantages in comparison…

17. The OTP-model applied to the Aklim site database

Science.gov (United States)

Mraini, Kamilia; Jabiri, Abdelhadi; Benkhaldoun, Zouhair; Bounhir, Aziza; Hach, Youssef; Sabil, Mohammed; Habib, Abdelfettah

2014-08-01

Within the framework of site prospection for the future European Extremely Large Telescope (E-ELT), a wide site characterization campaign was carried out. Aklim site, located at an altitude of 2350 m at the geographical coordinates lat. = 30°07'38" N, long. = 8°18'31" W in the Moroccan Middle Atlas Mountains, was one of the candidate sites chosen by the Framework Programme VI (FP6) of the European Union. To complement the completed studies ([19]; [21]), we have used the ModelOTP (model of optical turbulence profiles) established by [15] and improved by [6]. This model makes it possible to obtain profiles of the optical turbulence under various conditions. In this paper, we present an overview of the Aklim database results, in the boundary layer and in the free atmosphere separately, and we make a comparison with the Cerro Pachon results [15].

18. Modeling the microstructure of surface by applying BRDF function

Science.gov (United States)

Plachta, Kamil

2017-06-01

The paper presents the modeling of surface microstructure using a bidirectional reflectance distribution function (BRDF). This function contains full information about the reflectance properties of flat surfaces: it makes it possible to determine the shares of the specular, directional and diffuse components in the reflected luminous stream. The software is based on an original algorithm that uses selected elements of models of this function, which allows the share of each component to be determined. Based on the obtained data, the surface microstructure of each material can be modeled, which makes it possible to determine the properties of these materials. The concentrator directs the reflected solar radiation onto the photovoltaic surface, increasing, at the same time, the value of the incident luminous stream. The paper presents an analysis of selected materials that can be used to construct the solar concentrator system. The use of the concentrator increases the power output of the photovoltaic system by up to 17% as compared to the standard solution.
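The split into diffuse and specular shares can be illustrated with a simple Phong-style BRDF. This is a hedged, generic illustration (not the author's algorithm; all coefficients are assumed): `kd` and `ks` weight the Lambertian diffuse term and the specular lobe, and the exponent `n` controls how tightly the lobe hugs the mirror direction.

```python
import math

def brdf(theta_in, theta_out, kd, ks, n):
    """Diffuse and specular contributions for in/out angles (radians) in the incidence plane."""
    diffuse = kd / math.pi                       # Lambertian term
    delta = abs(theta_out - theta_in)            # deviation from the mirror direction
    specular = ks * (n + 2) / (2 * math.pi) * max(math.cos(delta), 0.0) ** n
    return diffuse, specular

# matte surface: diffuse dominates even when viewed at the mirror angle
d, s = brdf(math.radians(30), math.radians(30), kd=0.8, ks=0.05, n=5)
matte_specular_share = s / (d + s)

# polished surface: the specular lobe dominates at the mirror angle
d2, s2 = brdf(math.radians(30), math.radians(30), kd=0.1, ks=0.8, n=200)
polished_specular_share = s2 / (d2 + s2)
print(round(matte_specular_share, 2), round(polished_specular_share, 2))
```

For concentrator design the polished case is the interesting one: a large specular share means most of the reflected stream can actually be steered onto the photovoltaic surface.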

19. Continuous Molecular Fields Approach Applied to Structure-Activity Modeling

CERN Document Server

2013-01-01

The Method of Continuous Molecular Fields is a universal approach to predict various properties of chemical compounds, in which molecules are represented by means of continuous fields (such as electrostatic, steric, and electron density functions). The essence of the proposed approach consists in performing statistical analysis of functional molecular data by means of joint application of kernel machine learning methods and special kernels which compare molecules by computing overlap integrals of their molecular fields. This approach is an alternative to traditional methods of building 3D structure-activity and structure-property models based on the use of fixed sets of molecular descriptors. The methodology of the approach is described in this chapter, followed by its application to building regression 3D-QSAR models and conducting virtual screening based on one-class classification models. The main directions of the further development of this approach are outlined at the end of the chapter.

20. Hidden multidimensional social structure modeling applied to biased social perception

Science.gov (United States)

Maletić, Slobodan; Zhao, Yi

2018-02-01

Intricacies of the structure of social relations are captured by representing a collection of overlapping opinions as a simplicial complex, thus building latent multidimensional structures through which agents virtually move as they exchange opinions. The influence of the structure of opinion space on the distribution of opinions is demonstrated by modeling consensus phenomena when the opinion exchange between individuals may be affected by the false consensus effect. The results indicate that in the cases both with and without bias, the road toward consensus is influenced by the structure of the multidimensional space of opinions, and that in the biased case, complete consensus is achieved. Applications of the proposed modeling framework can easily be generalized, as they transcend opinion formation modeling.

1. A new inverse regression model applied to radiation biodosimetry

Science.gov (United States)

Higueras, Manuel; Puig, Pedro; Ainsbury, Elizabeth A.; Rothkamm, Kai

2015-01-01

Biological dosimetry based on chromosome aberration scoring in peripheral blood lymphocytes enables timely assessment of the ionizing radiation dose absorbed by an individual. Here, new Bayesian-type count data inverse regression methods are introduced for situations where responses are Poisson or two-parameter compound Poisson distributed. Our Poisson models are calculated in a closed form, by means of Hermite and negative binomial (NB) distributions. For compound Poisson responses, complete and simplified models are provided. The simplified models are also expressible in a closed form and involve the use of compound Hermite and compound NB distributions. Three examples of applications are given that demonstrate the usefulness of these methodologies in cytogenetic radiation biodosimetry and in radiotherapy. We provide R and SAS codes which reproduce these examples. PMID:25663804
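The inverse-regression idea can be sketched in its simplest Poisson form (the paper's Hermite and compound-Poisson machinery is more general, and the calibration coefficients below are assumed, not taken from the paper): dicentric yield per cell follows the usual linear-quadratic curve y(D) = c + aD + bD², counts are Poisson, and the absorbed dose is recovered from an observed count by a grid-based Bayesian inversion with a flat prior.

```python
import math

c, a, b = 0.001, 0.02, 0.06    # assumed calibration coefficients (per Gy, per Gy^2)
cells = 1000                   # number of scored cells
observed = 90                  # dicentrics observed in those cells

def log_post(dose):
    mu = cells * (c + a * dose + b * dose ** 2)   # expected total count
    return observed * math.log(mu) - mu           # Poisson log-likelihood, flat prior

grid = [i / 100 for i in range(1, 501)]           # candidate doses 0.01..5 Gy
lps = [log_post(d) for d in grid]
m = max(lps)                                      # subtract the max for numerical stability
weights = [math.exp(lp - m) for lp in lps]
total = sum(weights)
posterior_mean = sum(d * w for d, w in zip(grid, weights)) / total
print(round(posterior_mean, 2))
```

The same grid also yields credible intervals by accumulating the normalized weights, which is the practical payoff of the Bayesian formulation for dose assessment.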

2. Gordon's model applied to nursing care of people with depression.

Science.gov (United States)

Temel, M; Kutlu, F Y

2015-12-01

Psychiatric nurses should consider the patient's biological, psychological and social aspects. Marjory Gordon's Functional Health Pattern Model ensures a holistic approach to the patient. To examine the effectiveness of Gordon's Functional Health Pattern Model in reducing depressive symptoms, increasing self-efficacy in coping with depression and increasing hope in people with depression. A quasi-experimental two-group pre-test and post-test design was adopted. Data were collected from April 2013 to May 2014 from people with depression at the psychiatry clinic of a state hospital in Turkey; they were assigned to the intervention (n = 34) or control group (n = 34). The intervention group received nursing care according to Gordon's Functional Health Pattern Model plus routine care, while the control group received routine care only. The Beck Depression Inventory, Beck Hopelessness Scale and Depression Coping Self-Efficacy Scale were used. The intervention group had significantly lower scores on the Beck Depression Inventory and Beck Hopelessness Scale at the post-test and 3-month follow-up, and higher scores on the Depression Coping Self-Efficacy Scale at the 3-month follow-up, when compared with the control group. The study was conducted at only one psychiatry clinic. The intervention and control group patients were at the clinic at the same time and influenced each other. Moreover, because clinical routines were in progress during the study, the results cannot be attributed solely to the nursing interventions. Nursing models offer guidance for the care provided. Practices based on such models yield more efficient and systematic care with fewer health problems. Gordon's Functional Health Pattern Model was effective in improving the health of people with depression and could be introduced as routine care with ongoing evaluation in psychiatric clinics. More research is needed to evaluate the effect of Gordon's nursing model on people with depression…

3. A Twin Protection Effect? Explaining Twin Survival Advantages with a Two-Process Mortality Model.

Directory of Open Access Journals (Sweden)

David J Sharrow

Full Text Available Twin studies that focus on the correlation in age-at-death between twin pairs have yielded important insights into the heritability and role of genetic factors in determining lifespan, but less attention is paid to the biological and social role of zygosity itself in determining survival across the entire life course. Using data from the Danish Twin Registry and the Human Mortality Database, we show that monozygotic twins have greater cumulative survival proportions at nearly every age compared to dizygotic twins and the Danish general population. We examine this survival advantage by fitting these data with a two-process mortality model that partitions survivorship patterns into extrinsic and intrinsic mortality processes roughly corresponding to acute, environmental and chronic, biological origins. We find intrinsic processes confer a survival advantage at older ages for males, while at younger ages, all monozygotic twins show a health protection effect against extrinsic death akin to a marriage protection effect. While existing research suggests an increasingly important role for genetic factors at very advanced ages, we conclude that the social closeness of monozygotic twins is a plausible driver of the survival advantage at ages <65.

4. A Twin Protection Effect? Explaining Twin Survival Advantages with a Two-Process Mortality Model.

Science.gov (United States)

Sharrow, David J; Anderson, James J

2016-01-01

Twin studies that focus on the correlation in age-at-death between twin pairs have yielded important insights into the heritability and role of genetic factors in determining lifespan, but less attention is paid to the biological and social role of zygosity itself in determining survival across the entire life course. Using data from the Danish Twin Registry and the Human Mortality Database, we show that monozygotic twins have greater cumulative survival proportions at nearly every age compared to dizygotic twins and the Danish general population. We examine this survival advantage by fitting these data with a two-process mortality model that partitions survivorship patterns into extrinsic and intrinsic mortality processes roughly corresponding to acute, environmental and chronic, biological origins. We find intrinsic processes confer a survival advantage at older ages for males, while at younger ages, all monozygotic twins show a health protection effect against extrinsic death akin to a marriage protection effect. While existing research suggests an increasingly important role for genetic factors at very advanced ages, we conclude that the social closeness of monozygotic twins is a plausible driver of the survival advantage at ages <65.

5. Decompression Sickness After Air Break in Prebreathe Described with a Survival Model

Science.gov (United States)

Conkin, J.; Pilmanis, A. A.

2010-01-01

Data from Brooks City-Base show the decompression sickness (DCS) and venous gas emboli (VGE) consequences of air breaks during a resting 100% O2 prebreathe (PB) prior to a hypobaric exposure. METHODS: DCS and VGE survival times from 95 controls for a 60 min PB prior to 2-hr or 4-hr exposures to 4.37 psia are statistically compared to 3 break-in-PB conditions: a 10 min (n=40), 20 min (n=40), or 60 min break (n=32) 30 min into the PB, followed by 30 min of PB. Ascent rate was 1,524 meters/min, and all exposures included light exercise and 4 min of VGE monitoring of the heart chambers at 16 min intervals. DCS survival times for combined control and air-break conditions were described with an accelerated log logistic model in which exponential N2 washin during the air break was described with a 10 min half-time and washout during PB with a 60 min half-time. RESULTS: There was no difference in VGE or DCS survival times among the 3 different air breaks, or when air breaks were compared to control VGE times. However, the 10, 20, and 60 min air breaks had significantly earlier survival times compared to control DCS times, particularly early in the exposures. CONCLUSION: Air breaks of 10, 20, and 60 min after 30 min of a 60 min PB reduced DCS survival time. The survival model combined discrete comparisons into a global description mechanistically linked to asymmetrical N2 washin and washout kinetics based on inspired pN2. Our unvalidated regression is used to compute the additional PB time needed to compensate for an air break in PB within the range of tested conditions.
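The asymmetric N2 kinetics described above can be sketched directly: tissue pN2 relaxes exponentially toward the inspired pN2, with a 10 min half-time for washin during an air break and a 60 min half-time for washout during 100% O2 prebreathe. The inspired pN2 value below is an assumed sea-level approximation, and the scenario is illustrative, not the study's regression.

```python
import math

def update_pn2(p_tissue, p_inspired, minutes, half_time):
    """Exponential relaxation of tissue pN2 toward the inspired pN2."""
    k = math.log(2) / half_time
    return p_inspired + (p_tissue - p_inspired) * math.exp(-k * minutes)

P_AIR = 11.6   # assumed inspired pN2 breathing air at sea level (psia, approximate)

# 60 min PB with a 10 min air break inserted at the 30 min mark
p = P_AIR
p = update_pn2(p, 0.0, 30, 60)      # first 30 min of O2 prebreathe (slow washout)
after_pb = p
p = update_pn2(p, P_AIR, 10, 10)    # 10 min air break (fast washin)
p = update_pn2(p, 0.0, 30, 60)      # final 30 min of prebreathe

# uninterrupted 60 min PB for comparison
no_break = update_pn2(P_AIR, 0.0, 60, 60)
print(round(after_pb, 2), round(p, 2), round(no_break, 2))
```

Because washin is six times faster than washout, the broken prebreathe ends with a distinctly higher tissue pN2 than the uninterrupted one, which is the mechanistic reason an air break costs extra PB time.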

6. Prognostic factors for survival in adult patients with recurrent glioblastoma: a decision-tree-based model.

Science.gov (United States)

Audureau, Etienne; Chivet, Anaïs; Ursu, Renata; Corns, Robert; Metellus, Philippe; Noel, Georges; Zouaoui, Sonia; Guyotat, Jacques; Le Reste, Pierre-Jean; Faillot, Thierry; Litre, Fabien; Desse, Nicolas; Petit, Antoine; Emery, Evelyne; Lechapt-Zalcman, Emmanuelle; Peltier, Johann; Duntze, Julien; Dezamis, Edouard; Voirin, Jimmy; Menei, Philippe; Caire, François; Dam Hieu, Phong; Barat, Jean-Luc; Langlois, Olivier; Vignes, Jean-Rodolphe; Fabbro-Peray, Pascale; Riondel, Adeline; Sorbets, Elodie; Zanello, Marc; Roux, Alexandre; Carpentier, Antoine; Bauchet, Luc; Pallud, Johan

2017-11-20

We assessed prognostic factors in relation to overall survival from progression in recurrent glioblastomas. Retrospective multicentric study enrolling 407 (training set) and 370 (external validation set) adult patients with a recurrent supratentorial glioblastoma treated by surgical resection and standard combined chemoradiotherapy as first-line treatment. Four complementary multivariate prognostic models were evaluated: Cox proportional hazards regression modeling, single-tree recursive partitioning, random survival forest, conditional random forest. Median overall survival from progression was 7.6 months (mean, 10.1; range, 0-86) and 8.0 months (mean, 8.5; range, 0-56) in the training and validation sets, respectively (p = 0.900). Using the Cox model in the training set, independent predictors of poorer overall survival from progression included increasing age at histopathological diagnosis (aHR, 1.47; 95% CI [1.03-2.08]; p = 0.032), RTOG-RPA V-VI classes (aHR, 1.38; 95% CI [1.11-1.73]; p = 0.004), decreasing KPS at progression (aHR, 3.46; 95% CI [2.10-5.72]; p < 0.001), while independent predictors of longer overall survival from progression included surgical resection (aHR, 0.57; 95% CI [0.44-0.73]; p < 0.001) and chemotherapy (aHR, 0.41; 95% CI [0.31-0.55]; p < 0.001). Single-tree recursive partitioning identified KPS at progression, surgical resection at progression, chemotherapy at progression, and RTOG-RPA class at histopathological diagnosis, as main survival predictors in the training set, yielding four risk categories highly predictive of overall survival from progression both in training (p < 0.0001) and validation (p < 0.0001) sets. Both random forest approaches identified KPS at progression as the most important survival predictor. Age, KPS at progression, RTOG-RPA classes, surgical resection at progression and chemotherapy at progression are prognostic for survival in recurrent glioblastomas and should inform treatment decisions.
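Overall-survival-from-progression curves of the kind compared across these models are usually summarized with the Kaplan-Meier product-limit estimator. The sketch below is generic, with made-up follow-up times rather than the study's data: event times contribute a factor (1 - d/n) to the survival curve, while censored times only shrink the risk set.

```python
# (time in months, 1 = death observed, 0 = censored) -- made-up follow-up data
follow_up = [(2, 1), (4, 0), (5, 1), (7, 1), (7, 0), (9, 1), (12, 0), (14, 1)]

def kaplan_meier(data):
    """Product-limit survival curve as a list of (time, survival) points."""
    s, curve = 1.0, []
    for t in sorted({tt for tt, _ in data}):
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        at_risk = sum(1 for tt, _ in data if tt >= t)   # at risk just before t
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
    return curve

curve = kaplan_meier(follow_up)
print(curve)
```

Stratifying such curves by a predictor (e.g., KPS at progression above or below a cutoff) and comparing them is the non-parametric counterpart of the Cox and tree-based analyses described in the abstract.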

7. Applied Bounded Model Checking for Interlocking System Designs

DEFF Research Database (Denmark)

Haxthausen, Anne Elisabeth; Peleska, Jan; Pinger, Ralf

2014-01-01

of behavioural (operational) semantics. The former checks that the plant model – that is, the software components reflecting the physical components of the interlocking system – has been set up in an adequate way. The latter investigates trains moving through the network, with the objective to uncover potential...

8. Applying the knowledge creation model to the management of ...

African Journals Online (AJOL)

In present-day society, the need to manage indigenous knowledge is widely recognised. However, there is a debate in progress on whether or not indigenous knowledge can be easily managed. The purpose of this paper is to examine the possibility of using knowledge management models like knowledge creation theory ...

Science.gov (United States)

Warren, Jane; Klepper, Konja K.; Lambert, Serena; Nunez, Johnna; Williams, Susan

2011-01-01

Creating and retaining empathic connections with the most disenfranchised among us can take a toll on the wellness of counselor advocates. The Advocacy-Serving Model is introduced as a creative approach to strengthening the ability of advocates to serve through enhancing awareness, focusing actions, and connecting to community. The model…

10. Applying an Employee-Motivation Model to Prevent Student Plagiarism.

Science.gov (United States)

Malouff, John M.; Sims, Randi L.

1996-01-01

A model based on Vroom's expectancy theory of employee motivation posits that instructors can prevent plagiarism by ensuring that students understand the rules of ethical writing, expect assignments to be manageable and have personal benefits, and expect plagiarism to be difficult and have important personal costs. (SK)

11. Dynamics Model Applied to Pricing Options with Uncertain Volatility

Directory of Open Access Journals (Sweden)

Lorella Fatone

2012-01-01

A dynamics model for pricing options with uncertain volatility is proposed. The data used to test the calibration problem included observations of asset prices over a finite set of (known, equispaced) discrete time values. Statistical tests were used to estimate the statistical significance of the two parameters of the Black-Scholes model: the volatility and the drift. The effects of these estimates on the option pricing problem were investigated. In particular, the pricing of an option with uncertain volatility in the Black-Scholes framework was revisited, and a statistical significance was associated with the price intervals determined using the Black-Scholes-Barenblatt equations. Numerical experiments involving synthetic and real data were presented. The real data considered were the daily closing values of the S&P500 index and the associated European call and put option prices in the year 2005. The method proposed here for calibrating the Black-Scholes dynamics model could be extended to other science and engineering models that may be expressed in terms of stochastic dynamical systems.
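The two parameters being calibrated can be estimated in closed form in the basic setting (a simplified sketch with made-up prices, not the paper's statistical tests): under Black-Scholes dynamics, equispaced log-returns are i.i.d. normal, so volatility and drift have maximum-likelihood estimates, and the fitted volatility then feeds the standard call-price formula.

```python
import math

prices = [100.0, 101.2, 100.6, 102.3, 103.0, 102.1, 104.0, 105.2]  # made-up closes
dt = 1 / 252                      # daily observations, in years

rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
mean = sum(rets) / len(rets)
var = sum((r - mean) ** 2 for r in rets) / len(rets)   # MLE (no Bessel correction)
sigma = math.sqrt(var / dt)       # annualized volatility estimate
mu = mean / dt + 0.5 * sigma**2   # annualized drift estimate

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, vol):
    """European call under Black-Scholes."""
    d1 = (math.log(s / k) + (r + 0.5 * vol**2) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

price = bs_call(s=105.2, k=105.0, t=0.25, r=0.02, vol=sigma)
print(round(sigma, 3), round(price, 2))
```

The Black-Scholes-Barenblatt price intervals mentioned above arise when, instead of a single fitted sigma, the volatility is only known to lie in a band [sigma_min, sigma_max]; pricing at the band endpoints brackets the option value.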

12. A comparison of various modelling approaches applied to Cholera ...

African Journals Online (AJOL)

The application of a methodology that proposes the use of spectral methods to inform the development of statistical forecasting models for cholera case data is explored in this paper. The seasonal behaviour of the target variable (cholera cases) is analysed using singular spectrum analysis followed by spectrum estimation ...

13. A comparison of various modelling approaches applied to Cholera ...

African Journals Online (AJOL)

Abstract. The application of a methodology that proposes the use of spectral methods to inform the development of statistical forecasting models for cholera case data is explored in this paper. The seasonal behaviour of the target variable (cholera cases) is analysed using singular spectrum analysis followed by spectrum ...

14. An Analytical Model for Learning: An Applied Approach.

Science.gov (United States)

Kassebaum, Peter Arthur

A mediated-learning package, geared toward non-traditional students, was developed for use in the College of Marin's cultural anthropology courses. An analytical model for learning was used in the development of the package, utilizing concepts related to learning objectives, programmed instruction, Gestalt psychology, cognitive psychology, and…

15. Leadership Identity Development: Challenges in Applying a Developmental Model

Science.gov (United States)

Komives, Susan R.; Longerbeam, Susan D.; Mainella, Felicia; Osteen, Laura; Owen, Julie E.; Wagner, Wendy

2009-01-01

The leadership identity development (LID) grounded theory (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005) and related LID model (Komives, Longerbeam, Owen, Mainella, & Osteen, 2006) present a framework for understanding how individual college students develop the social identity of being collaborative, relational leaders…

16. The Limitations of Applying Rational Decision-Making Models

African Journals Online (AJOL)

also assumes that the individual operates as a rational decision-making organism in determining the benefits and costs of taking action to control pregnancies. The model ...

17. A Spatial Lattice Model Applied for Meteorological Visualization and Analysis

Directory of Open Access Journals (Sweden)

Mingyue Lu

2017-03-01

Full Text Available Meteorological information has obvious spatial-temporal characteristics. Although it is meaningful to employ a geographic information system (GIS) to visualize and analyze meteorological information for better identification and forecasting of meteorological weather, so as to reduce meteorological disaster losses, modeling meteorological information based on a GIS is still difficult because meteorological elements generally have no stable shape or clear boundary. To date, there are still few GIS models that can satisfy the requirements of both meteorological visualization and analysis. In this article, a spatial lattice model based on sampling particles is proposed to support both the representation and analysis of meteorological information. In this model, a spatial sampling particle is regarded as the basic element: it carries the meteorological information, together with the location where the particle is placed and a time mark. The location information is generally represented using a point. As these points can be extended to a surface in two dimensions and a voxel in three dimensions, if these surfaces and voxels can occupy a certain space, then this space can be represented using these spatial sampling particles with their point locations and meteorological information. In this case, the full meteorological space can then be represented by arranging numerous particles with their point locations in a certain structure and resolution, i.e., the spatial lattice model, and extended at a higher resolution when necessary. For practical use, the meteorological space is logically classified into three types of spaces, namely the projection surface space, the curved surface space, and the stereoscopic space, and application-oriented spatial lattice models with different organization forms of spatial sampling particles are designed to support the representation, inquiry, and analysis of meteorological information within the three types of spaces. Cases…

18. Evaluation of different approaches for modeling Escherichia coli O157:H7 survival on field lettuce.

Science.gov (United States)

McKellar, Robin C; Pérez-Rodríguez, Fernando; Harris, Linda J; Moyne, Anne-Laure; Blais, Burton; Topp, Ed; Bezanson, Greg; Bach, Susan; Delaquis, Pascal

2014-08-01

The ability to predict the behavior of Escherichia coli O157:H7 on contaminated field lettuce is essential for the development of accurate quantitative microbial risk assessments. The survival pattern of the species was assessed from several data sets derived from field-based experiments, which were analyzed by regression analysis fitting one monophasic model (log-linear) and two biphasic models (Weibull and Cerf's). Probabilistic models were also simulated with @RISK™, integrating the fitted monophasic and biphasic models in order to analyze their impact on the estimated extent of die-off subsequent to a contamination event in the field. Regression analysis indicated that E. coli O157:H7 followed a biphasic decay pattern in most cases, with the Weibull and Cerf's models showing similarly good fit to individual and pooled survival data. Furthermore, results from the stochastic analysis demonstrated that using the log-linear model could lead to different risk estimates from those obtained with biphasic models, with a lower prevalence in the former scenario as no tailing is assumed in this model. The models and results derived from this work provide the first suitable mathematical base upon which to build probabilistic models to predict the fate of E. coli O157:H7 on field-grown leafy green vegetables. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
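The three die-off models compared in the study have simple closed forms. The sketch below, in Python with hypothetical parameters (not fitted to the paper's field data), shows how the monophasic log-linear model differs from the biphasic Weibull and Cerf two-population models, the latter two allowing the tailing behavior the study reports:

```python
import math

def log_linear(t, logN0, k):
    # Monophasic first-order decay on the log10 scale
    return logN0 - k * t

def weibull(t, logN0, delta, p):
    # Biphasic when shape p < 1: fast initial die-off, then tailing
    return logN0 - (t / delta) ** p

def cerf(t, logN0, f, k1, k2):
    # Two subpopulations decaying at rates k1 (sensitive) and k2 (resistant)
    N0 = 10 ** logN0
    return math.log10(N0 * (f * math.exp(-k1 * t) + (1 - f) * math.exp(-k2 * t)))

# Hypothetical parameters, for illustration only
logN0 = 5.0
for t in [0, 1, 2, 5, 10]:
    print(t,
          round(log_linear(t, logN0, 0.5), 2),
          round(weibull(t, logN0, 1.5, 0.6), 2),
          round(cerf(t, logN0, 0.99, 2.0, 0.1), 2))
```

With these parameters the Weibull and Cerf curves drop quickly at first and then flatten (tailing), while the log-linear curve keeps the same slope, which is why the log-linear assumption yields lower late-time prevalence in the risk simulations.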

19. Applying Transtheoretical Model to Promote Physical Activities Among Women

Science.gov (United States)

Pirzadeh, Asiyeh; Mostafavi, Firoozeh; Ghofranipour, Fazllolah; Feizi, Awat

2015-01-01

Background: Physical activity is one of the most important indicators of health in communities, but studies conducted in the provinces of Iran have shown that inactivity is prevalent, especially among women. Objectives: Inadequate regular physical activity among women, the importance of education in promoting physical activity, and the lack of studies on women using the transtheoretical model persuaded us to conduct this study with the aim of determining the application of the transtheoretical model in promoting physical activity among women of Isfahan. Materials and Methods: This research was a quasi-experimental study conducted on 141 women residing in Isfahan, Iran. They were randomly divided into case and control groups. In addition to the demographic information, their physical activity and the constructs of the transtheoretical model (stages of change, processes of change, decisional balance, and self-efficacy) were measured at 3 time points: preintervention, 3 months, and 6 months after the intervention. Finally, the obtained data were analyzed through the t test and repeated measures ANOVA using SPSS version 16. Results: The results showed that education based on the transtheoretical model significantly increased physical activity in the case group over time, in the two aspects of intensive physical activity and walking. A high percentage of participants also progressed through the stages of change, and for the means of the processes-of-change constructs, as well as the pros and cons, a significant difference was observed over time in the case group (P < 0.01). Conclusions: This study showed that interventions based on the transtheoretical model can promote physical activity behavior among women. PMID:26834796

20. Consideration of an applied model of public health program infrastructure.

Science.gov (United States)

Lavinghouze, René; Snyder, Kimberly; Rieker, Patricia; Ottoson, Judith

2013-01-01

Systemic infrastructure is key to public health achievements. Individual public health program infrastructure feeds into this larger system. Although program infrastructure is rarely defined, it needs to be operationalized for effective implementation and evaluation. The Ecological Model of Infrastructure (EMI) is one approach to defining program infrastructure. The EMI consists of 5 core (Leadership, Partnerships, State Plans, Engaged Data, and Managed Resources) and 2 supporting (Strategic Understanding and Tactical Action) elements that are enveloped in a program's context. We conducted a literature search across public health programs to determine support for the EMI. Four of the core elements were consistently addressed, and the other EMI elements were intermittently addressed. The EMI provides an initial and partial model for understanding program infrastructure, but additional work is needed to identify evidence-based indicators of infrastructure elements that can be used to measure success and link infrastructure to public health outcomes, capacity, and sustainability.

1. Applying learning theories and instructional design models for effective instruction.

Science.gov (United States)

Khalil, Mohammed K; Elkhider, Ihsan A

2016-06-01

Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory. Copyright © 2016 The American Physiological Society.

2. Synthetic neural modeling applied to a real-world artifact.

OpenAIRE

Edelman, G M; Reeke, G N; Gall, W E; Tononi, G; Williams, D.; Sporns, O.

1992-01-01

We describe the general design, operating principles, and performance of a neurally organized, multiply adaptive device (NOMAD) under control of a nervous system simulated in a computer. The complete system, Darwin IV, is the latest in a series of models based on the theory of neuronal group selection, which postulates that adaptive behavior is the result of selection in somatic time among synaptic populations. The simulated brain of Darwin IV includes visual and motor areas that are connecte...

3. Applying CIPP Model for Learning-Object Management

Science.gov (United States)

Morgado, Erla M. Morales; Peñalvo, Francisco J. García; Martín, Carlos Muñoz; Gonzalez, Miguel Ángel Conde

Although the knowledge management process needs to be evaluated in order to determine its suitable functionality, there is no clear definition of the stages at which LOs (learning objects) need to be evaluated or of the specific metrics to continuously promote their quality. This paper presents a proposal for LO evaluation during their management in e-learning systems. To achieve this, we suggest specific steps for LO design, implementation, and evaluation within the four stages proposed by the CIPP model (Context, Input, Process, Product).

4. Simulation of Road Traffic Applying Model-Driven Engineering

Directory of Open Access Journals (Sweden)

Alberto FERNÁNDEZ-ISABEL

2016-05-01

Full Text Available Road traffic is an important phenomenon in modern societies. The study of its different aspects in the multiple scenarios where it happens is relevant for a huge number of problems. At the same time, its scale and complexity make it hard to study. Traffic simulations can alleviate these difficulties, simplifying the scenarios to consider and controlling their variables. However, their development also presents difficulties. The main ones come from the need to integrate the ways of working of researchers and developers from multiple fields. Model-Driven Engineering (MDE) addresses these problems using Modelling Languages (MLs) and semi-automatic transformations to organise and describe the development, from requirements to code. This paper presents a domain-specific MDE framework for simulations of road traffic. It comprises an extensible ML, support tools, and development guidelines. The ML adopts an agent-based approach, which is focused on the roles of individuals in road traffic and their decision-making. A case study shows the process to model a traffic theory with the ML, and how to specialise that specification for an existing target platform and its simulations. The results are the basis for comparison with related work.
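The agent-based view of individual driver decision-making can be illustrated independently of the MDE framework with a classic cellular-automaton traffic model. The following is a minimal Nagel-Schreckenberg sketch in Python, not part of the paper's ML or tooling; the ring length, car positions, and speed limit are arbitrary choices:

```python
import random

def step(road, vmax=5, p=0.0, rng=random.Random(0)):
    """One synchronous update of the Nagel-Schreckenberg model.
    road[i] is a car's speed at cell i, or None for an empty cell."""
    n = len(road)
    new = [None] * n
    for i, v in enumerate(road):
        if v is None:
            continue
        # 1) accelerate towards the speed limit
        v = min(v + 1, vmax)
        # 2) brake to the gap to the car ahead (old positions)
        gap = next(d for d in range(1, n + 1) if road[(i + d) % n] is not None) - 1
        v = min(v, gap)
        # 3) random dawdling with probability p
        if v > 0 and rng.random() < p:
            v -= 1
        # 4) move
        new[(i + v) % n] = v
    return new

# three cars on a circular road of 20 cells, all initially stopped
road = [None] * 20
for pos in (0, 3, 10):
    road[pos] = 0
for _ in range(10):
    road = step(road)
```

With the dawdling probability p set to 0 the dynamics are deterministic; raising p reproduces spontaneous jam formation, the kind of emergent behavior an agent-based simulation framework is built to study.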

5. Applying fuzzy analytic network process in quality function deployment model

Directory of Open Access Journals (Sweden)

2012-08-01

Full Text Available In this paper, we propose an empirical study of QFD implementation in which fuzzy numbers are used to handle the uncertainty associated with different components of the proposed model. We implement a fuzzy analytic network to find the relative importance of various criteria, and using fuzzy numbers we calculate the relative importance of these factors. The proposed model uses a fuzzy matrix and the house of quality to study product development in QFD, as well as the second phase, i.e., part deployment. In most research, the primary focus is only on customer requirements (CRs) when implementing quality function deployment, and other criteria such as production and manufacturing costs are disregarded. The results of using the fuzzy analytic network process based on the QFD model in the Daroupat packaging company to develop PVDC show that the most important indexes are being waterproof, resistant pill packages, and production cost. In addition, the PVDC coating is the most important index from the company experts' point of view.
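Fuzzy-number weighting of criteria, as used in fuzzy ANP, can be sketched with triangular fuzzy numbers and centroid defuzzification. The ratings below are hypothetical illustrations, not the company data from the study:

```python
def tfn_add(a, b):
    # Componentwise sum of two triangular fuzzy numbers (l, m, u)
    return tuple(x + y for x, y in zip(a, b))

def defuzzify(a):
    # Centroid of a triangular fuzzy number (l, m, u)
    l, m, u = a
    return (l + m + u) / 3.0

# Hypothetical fuzzy importance ratings for three criteria
ratings = {"waterproof": (7, 9, 10), "resistance": (5, 7, 9), "cost": (3, 5, 7)}
crisp = {k: defuzzify(v) for k, v in ratings.items()}
total = sum(crisp.values())
weights = {k: v / total for k, v in crisp.items()}   # normalized crisp weights
```

The full fuzzy ANP additionally propagates interdependencies between criteria through a supermatrix; this fragment only shows how fuzzy judgments become crisp normalized weights.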

6. Model evaluation based on the negative predictive value for interval-censored survival outcomes.

Science.gov (United States)

Han, Seungbong; Tsui, Kam-Wah; Andrei, Adin-Cristian

2017-04-01

In many cohort studies, time to events such as disease recurrence is recorded in an interval-censored format. An important objective is to predict patient outcomes, and clinicians are interested in predictive covariates. Prediction rules based on the receiver operating characteristic curve alone are not related to the survival endpoint. We propose a model evaluation strategy that assesses predictive accuracy based on negative predictive functions. Our proposed method makes very few assumptions and only requires a working model to obtain the regression coefficients. A nonparametric estimate of the predictive accuracy provides a simple and flexible approach for model evaluation with interval-censored survival outcomes. The implementation effort is minimal; therefore, this method has an increased potential for immediate use in biomedical data analyses. Simulation studies and a breast cancer trial example further illustrate the practical advantages of this approach.
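The negative predictive value underlying the proposed evaluation strategy is easy to state: among subjects the model labels low-risk, the proportion who remain event-free beyond a horizon t. The toy sketch below ignores censoring entirely (the paper's estimator is built precisely to handle interval censoring), and the data and marker labels are made up:

```python
def npv(times, markers, t):
    """Empirical negative predictive value at horizon t:
    P(event-free beyond t | marker-negative), ignoring censoring."""
    neg = [T for T, m in zip(times, markers) if m == 0]
    if not neg:
        return float("nan")
    return sum(1 for T in neg if T > t) / len(neg)

# hypothetical event times and a binary marker (1 = predicted high risk)
times   = [2, 5, 7, 9, 12, 14, 20, 25]
markers = [1, 1, 0, 1,  0,  0,  0,  1]
print(npv(times, markers, 10))  # fraction of marker-negative patients event-free past t=10
```

A useful predictive marker yields an NPV(t) curve that stays high as t grows; with interval-censored data the simple counting above must be replaced by a survival-model-based estimate, which is the paper's contribution.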

7. BAYESIAN INFERENCE OF HIDDEN GAMMA WEAR PROCESS MODEL FOR SURVIVAL DATA WITH TIES.

Science.gov (United States)

Sinha, Arijit; Chi, Zhiyi; Chen, Ming-Hui

2015-10-01

Survival data often contain tied event times. Inference without careful treatment of the ties can lead to biased estimates. This paper develops the Bayesian analysis of a stochastic wear process model to fit survival data that might have a large number of ties. Under a general wear process model, we derive the likelihood of parameters. When the wear process is a Gamma process, the likelihood has a semi-closed form that allows posterior sampling to be carried out for the parameters, hence achieving model selection using Bayesian deviance information criterion. An innovative simulation algorithm via direct forward sampling and Gibbs sampling is developed to sample event times that may have ties in the presence of arbitrary covariates; this provides a tool to assess the precision of inference. An extensive simulation study is reported and a data set is used to further illustrate the proposed methodology.
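The hidden wear-process idea can be sketched by simulating latent Gamma-process paths observed on a discrete time grid: an event occurs when cumulative wear crosses a threshold, and the discrete grid naturally produces tied event times. This is an illustrative forward simulation only, not the paper's Bayesian inference, and all parameters are arbitrary:

```python
import random

def event_time(rng, shape_rate=1.0, scale=1.0, threshold=5.0, dt=1.0, tmax=100.0):
    """First grid time at which a Gamma-process wear path crosses `threshold`.
    Independent increments over dt are Gamma(shape_rate * dt, scale)."""
    wear, t = 0.0, 0.0
    while t < tmax:
        t += dt
        wear += rng.gammavariate(shape_rate * dt, scale)  # nonnegative -> path is nondecreasing
        if wear >= threshold:
            return t
    return None  # censored at tmax

rng = random.Random(42)
times = [event_time(rng) for _ in range(200)]
observed = [t for t in times if t is not None]
ties = len(observed) - len(set(observed))  # many tied event times on the coarse grid
```

Because event times are only observed on the grid, ties are the rule rather than the exception here, which is exactly the data feature the paper's likelihood is designed to handle.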

8. Semi-parametric regression model for survival data: graphical visualization with R.

Science.gov (United States)

Zhang, Zhongheng

2016-12-01

The Cox proportional hazards model is a semi-parametric model that leaves its baseline hazard function unspecified. The rationale for using the Cox proportional hazards model is that (I) the underlying form of the hazard function would otherwise be stringent and unrealistic, and (II) researchers are often interested only in estimating how the hazard changes with covariates (the relative hazard). A Cox regression model can be easily fit with the coxph() function in the survival package. A stratified Cox model may be used for a covariate that violates the proportional hazards assumption. The relative importance of covariates in the population can be examined with the rankhazard package in R. Hazard ratio curves for continuous covariates can be visualized using the smoothHR package; such a curve helps to better understand the effect that each continuous covariate has on the outcome. The population attributable fraction is a classic quantity in epidemiology for evaluating the impact of a risk factor on the occurrence of an event in the population. In survival analysis, the adjusted/unadjusted attributable fraction can be plotted against survival time to obtain the attributable fraction function.
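In R this model is fit with coxph() from the survival package; the underlying computation is maximization of the partial likelihood. As a language-neutral illustration, the sketch below runs Newton-Raphson on the Cox partial log-likelihood for a single binary covariate with no tied event times (toy data, not a replacement for coxph()):

```python
import math

def cox_fit(times, events, x, iters=25):
    """Newton-Raphson for the Cox partial likelihood: one covariate, no ties."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    beta = 0.0
    for _ in range(iters):
        score, info = 0.0, 0.0
        for k, i in enumerate(order):
            if not events[i]:
                continue
            risk = order[k:]                       # subjects still at risk at times[i]
            w = [math.exp(beta * x[j]) for j in risk]
            s0 = sum(w)
            s1 = sum(wj * x[j] for wj, j in zip(w, risk))
            s2 = sum(wj * x[j] ** 2 for wj, j in zip(w, risk))
            score += x[i] - s1 / s0                # derivative of the partial log-likelihood
            info += s2 / s0 - (s1 / s0) ** 2       # observed information
        beta += score / info
    return beta

# toy data: subjects with x = 1 tend to fail earlier -> hazard ratio > 1
times  = [1, 2, 3, 4, 5, 6, 7, 8]
events = [1, 1, 1, 1, 1, 1, 1, 1]
x      = [1, 1, 1, 0, 1, 0, 0, 0]
beta = cox_fit(times, events, x)        # log hazard ratio; exp(beta) is the hazard ratio
```

The estimate exp(beta) is the relative hazard for x = 1 versus x = 0; note the baseline hazard never appears in the computation, which is the point of the semi-parametric formulation.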

9. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

Directory of Open Access Journals (Sweden)

Lois A Gelfand

2016-03-01

Full Text Available Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome - underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
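The PH/AFT connection the comparison relies on can be checked numerically: under a Weibull model the two parameterizations describe the same survival function when beta_AFT = -sigma * beta_PH, with sigma the reciprocal of the Weibull shape. A small sketch with arbitrary parameter values:

```python
import math

def weibull_surv(t, lam, k):
    # Baseline Weibull survival: S0(t) = exp(-(t/lam)^k)
    return math.exp(-(t / lam) ** k)

lam, k = 10.0, 1.5          # baseline Weibull scale and shape (arbitrary)
beta_ph = 0.7               # log hazard ratio (PH parameterization)
beta_aft = -beta_ph / k     # equivalent AFT coefficient: beta_AFT = -sigma * beta_PH, sigma = 1/k

for t in (1.0, 5.0, 12.0):
    s_ph = weibull_surv(t, lam, k) ** math.exp(beta_ph)      # PH: S0(t)^HR
    s_aft = weibull_surv(t * math.exp(-beta_aft), lam, k)    # AFT: S0(t / exp(beta_AFT))
    assert abs(s_ph - s_aft) < 1e-9                          # identical survival curves
```

This equivalence holds only for the Weibull family, which is why the paper's conclusions are conditioned on the Weibull assumptions not being violated.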

10. Nonlinear group survival in Kimura's model for the evolution of altruism.

Science.gov (United States)

Fontanari, José F; Serva, Maurizio

2014-03-01

Establishing the conditions that guarantee the spreading or the sustenance of altruistic traits in a population is the main goal of intergroup selection models. Of particular interest is the balance of the parameters associated with group size, migration, and group survival against the selective advantage of the non-altruistic individuals. Here we use Kimura's diffusion model of intergroup selection to determine those conditions in the case that the group survival rate is a nonlinear non-decreasing function of the proportion of altruists in a group. When this function is linear, there are two possible steady states, which correspond to the non-altruistic and the altruistic phases. At the discontinuous transition line separating these phases there is a non-ergodic coexistence phase. For a continuous concave survival function, we find an ergodic coexistence phase that occupies a finite region of the parameter space in between the altruistic and the non-altruistic phases and is separated from these phases by continuous transition lines. For a convex survival function, the coexistence phase disappears altogether, but a bistable phase appears, for which the choice of the initial condition determines whether the evolutionary dynamics leads to the altruistic or the non-altruistic steady state. Copyright © 2014 Elsevier Inc. All rights reserved.

11. Inverse geothermal modelling applied to Danish sedimentary basins

DEFF Research Database (Denmark)

Poulsen, Soren E.; Balling, Niels; Bording, Thue S.

2017-01-01

. The numerical model was utilized for predicting and contouring temperatures at 2000 and 3000 m depths and for two main geothermal reservoir units, the Gassum (Lower Jurassic-Upper Triassic) and Bunter/Skagerrak (Triassic) reservoirs, both currently utilized for geothermal energy production. Temperature...... gradients to depths of 2000-3000 m are generally around 25-30 °C km(-1), locally up to about 35 °C km(-1). Large regions have geothermal reservoirs with characteristic temperatures ranging from ca. 40-50 °C, at 1000-1500 m depth, to ca. 80-110 °C, at 2500-3500 m, however...

12. Comparison of various modelling approaches applied to cholera case data

CSIR Research Space (South Africa)

Van Den Bergh, F

2008-06-01

Full Text Available and outbreaks of cholera in Bangladesh have been demonstrated by Huq et al. [15]. A study by Gil et al. [12] indicated a relationship between cholera incidence and elevated sea surface temperatures in Peru, including effects from the 1997-1998 El Niño, while... regression has previously been used to model cholera case data in the Bangladesh study of Huq et al. [15]. Further details on Poisson regression may be found in McCullagh & Nelder [22], Agresti [2] or Montgomery et al. [23]. Negative binomial regression...

13. Habitat-specific breeder survival of Florida Scrub-Jays: Inferences from multistate models

Science.gov (United States)

Breininger, D.R.; Nichols, J.D.; Carter, G.M.; Oddy, D.M.

2009-01-01

Quantifying habitat-specific survival and changes in habitat quality within disturbance-prone habitats is critical for understanding population dynamics and variation in fitness, and for managing degraded ecosystems. We used 18 years of color-banding data and multistate capture-recapture models to test whether habitat quality within territories influences survival and detection probability of breeding Florida Scrub-Jays (Aphelocoma coerulescens) and to estimate bird transition probabilities from one territory quality state to another. Our study sites were along central Florida's Atlantic coast and included two of the four largest metapopulations within the species range. We developed Markov models for habitat transitions and compared these to bird transition probabilities. Florida Scrub-Jay detection probabilities ranged from 0.88 in the tall territory state to 0.99 in the optimal state; detection probabilities were intermediate in the short state. Transition probabilities were similar for birds and for habitat in grid cells mapped independently of birds. Thus, bird transitions resulted primarily from habitat transitions between states over time and not from bird movement. Survival ranged from 0.71 in the short state to 0.82 in the optimal state, with tall states being intermediate. We conclude that average Florida Scrub-Jay survival will remain at levels that lead to continued population declines because most current habitat quality is only marginally suitable across most of the species range. Improvements in habitat are likely to be slow and difficult because tall states are resistant to change and the optimal state represents an intermediate transitional stage. The multistate modeling approach is useful for quantifying survival and habitat transition probabilities and for comparing habitat transitions to bird transition probabilities to test for habitat selection in dynamic environments. © 2009 by the Ecological Society of America.
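The Markov habitat-transition idea can be illustrated with a toy transition matrix among the short, optimal, and tall states. The probabilities below are hypothetical, not the paper's estimates; repeated multiplication approximates the long-run state distribution:

```python
def mat_mul(A, B):
    # Plain 2-D list matrix product
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# Hypothetical annual habitat transition probabilities (rows: from; cols: to)
# state order: short, optimal, tall
P = [[0.70, 0.25, 0.05],
     [0.10, 0.60, 0.30],
     [0.05, 0.10, 0.85]]

Pn = P
for _ in range(200):           # P^201 approximates the stationary distribution
    Pn = mat_mul(Pn, P)
stationary = Pn[0]             # every row converges to the same distribution
```

Consistent with the paper's observation that tall states are resistant to change, the high tall-to-tall probability in this toy matrix concentrates the stationary distribution in the tall state; comparing such habitat-driven transitions to the observed bird transitions is the test for habitat selection.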

14. Synthetic neural modeling applied to a real-world artifact.

Science.gov (United States)

Edelman, G M; Reeke, G N; Gall, W E; Tononi, G; Williams, D; Sporns, O

1992-08-01

We describe the general design, operating principles, and performance of a neurally organized, multiply adaptive device (NOMAD) under control of a nervous system simulated in a computer. The complete system, Darwin IV, is the latest in a series of models based on the theory of neuronal group selection, which postulates that adaptive behavior is the result of selection in somatic time among synaptic populations. The simulated brain of Darwin IV includes visual and motor areas that are connected with NOMAD by telemetry. Under suitable conditions, Darwin IV can be trained to track a light moving in a random path. After such training, it can approach colored blocks and collect them to a home position. Following a series of contacts with such blocks, value signals received through a "snout" that senses conductivity allow it to sort these blocks on the basis of differences in color associated with differences in their conductivity. Darwin IV represents a new approach to synthetic neural modeling (SNM), a technique in which large-scale computer simulations are employed to analyze the interactions among the nervous system, the phenotype, and the environment of a designed organism as behavior develops. Darwin IV retains the advantages of SNM while avoiding the difficulties and pitfalls of attempting to simulate a rich environment in addition to a brain.

15. Neighborhood Design, Physical Activity, and Wellbeing: Applying the Walkability Model

Directory of Open Access Journals (Sweden)

2017-01-01

Full Text Available Neighborhood design affects lifestyle physical activity, and ultimately human wellbeing. There are, however, a limited number of studies that examine neighborhood design types. In this research, we examine four types of neighborhood designs: traditional development, suburban development, enclosed community, and cluster housing development, and assess their level of walkability and their effects on physical activity and wellbeing. We examine significant associations through a questionnaire (n = 486) distributed in Tucson, Arizona using the Walkability Model. Among the tested neighborhood design types, traditional development showed significant associations and the highest value for walkability, as well as for each of the two types of walking (recreation and transportation) representing physical activity. Suburban development showed significant associations and the highest mean values for mental health and wellbeing. Cluster housing showed significant associations and the highest mean value for social interactions with neighbors and for perceived safety from crime. Enclosed community did not obtain the highest means for any wellbeing benefit. The Walkability Model proved useful in identifying the walkability categories associated with physical activity and perceived crime. For example, the experience category was strongly and inversely associated with perceived crime. This study provides empirical evidence of the importance of including vegetation, particularly trees, throughout neighborhoods in order to increase physical activity and wellbeing. Likewise, the results suggest that regular maintenance is an important strategy to improve mental health and overall wellbeing in cities.

16. Neighborhood Design, Physical Activity, and Wellbeing: Applying the Walkability Model.

Science.gov (United States)

Zuniga-Teran, Adriana A; Orr, Barron J; Gimblett, Randy H; Chalfoun, Nader V; Guertin, David P; Marsh, Stuart E

2017-01-13

Neighborhood design affects lifestyle physical activity, and ultimately human wellbeing. There are, however, a limited number of studies that examine neighborhood design types. In this research, we examine four types of neighborhood designs: traditional development, suburban development, enclosed community, and cluster housing development, and assess their level of walkability and their effects on physical activity and wellbeing. We examine significant associations through a questionnaire ( n = 486) distributed in Tucson, Arizona using the Walkability Model. Among the tested neighborhood design types, traditional development showed significant associations and the highest value for walkability, as well as for each of the two types of walking (recreation and transportation) representing physical activity. Suburban development showed significant associations and the highest mean values for mental health and wellbeing. Cluster housing showed significant associations and the highest mean value for social interactions with neighbors and for perceived safety from crime. Enclosed community did not obtain the highest means for any wellbeing benefit. The Walkability Model proved useful in identifying the walkability categories associated with physical activity and perceived crime. For example, the experience category was strongly and inversely associated with perceived crime. This study provides empirical evidence of the importance of including vegetation, particularly trees, throughout neighborhoods in order to increase physical activity and wellbeing. Likewise, the results suggest that regular maintenance is an important strategy to improve mental health and overall wellbeing in cities.

17. Analysis of survival in breast cancer patients by using different parametric models

Science.gov (United States)

Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti

2017-09-01

In biomedical applications and clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. Handling censored data properly is important in order to prevent biased estimates in the analysis. Therefore, this study was carried out to analyze right-censored data with three different parametric models: the exponential, Weibull, and log-logistic models. Data on breast cancer patients from Hospital Sultan Ismail, Johor Bahru, from 30 December 2008 until 15 February 2017 were used to illustrate right-censored data. The covariates included in this study are the patient survival time t, the age of each patient X1, and the treatment given to the patient X2. In order to determine the best parametric model for analysing survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the log-likelihood value using the statistical software R. When analysing the breast cancer data, all three distributions showed consistency with the data, with the line graph of the cumulative hazard function resembling a straight line going through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it has the smallest AIC and BIC values and the largest log-likelihood.
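The AIC/BIC comparison used to pick the best parametric model reduces to two formulas, AIC = 2k - 2 ln L and BIC = k ln n - 2 ln L, with smaller values preferred. A sketch with hypothetical log-likelihoods (not the study's fitted values):

```python
import math

def aic(loglik, k):
    # Akaike Information Criterion: 2k - 2 ln L
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    # Bayesian Information Criterion: k ln n - 2 ln L
    return k * math.log(n) - 2 * loglik

# hypothetical fitted log-likelihoods for n = 120 patients
n = 120
fits = {"exponential":  (-410.2, 3),   # (log-likelihood, number of parameters)
        "weibull":      (-398.7, 4),
        "log-logistic": (-396.1, 4)}
best = min(fits, key=lambda m: aic(*fits[m]))   # smallest AIC wins
```

Because ln(120) > 2, BIC penalizes each extra parameter more heavily than AIC, so the two criteria can disagree when models differ in complexity; here the made-up numbers let the log-logistic model win under both.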

18. Applying a Hybrid MCDM Model for Six Sigma Project Selection

Directory of Open Access Journals (Sweden)

Fu-Kwun Wang

2014-01-01

Full Text Available Six Sigma is a project-driven methodology; the projects that provide the maximum financial benefits and other impacts to the organization must be prioritized. Project selection (PS) is a type of multiple criteria decision making (MCDM) problem. In this study, we present a hybrid MCDM model combining the decision-making trial and evaluation laboratory (DEMATEL) technique, the analytic network process (ANP), and the VIKOR method to evaluate and improve Six Sigma projects for reducing performance gaps in each criterion and dimension. We consider the film printing industry of Taiwan as an empirical case. The results show that our model not only supports selection of the best project, but can also be used to analyze the gaps between existing performance values and aspiration levels, improving the gaps in each dimension and criterion based on the influential network relation map.
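The VIKOR step of the hybrid model ranks alternatives by closeness to the ideal solution. A compact sketch with a hypothetical decision matrix and weights (not the film-printing case data):

```python
def vikor(F, w, v=0.5):
    """Rank alternatives (rows of F) over benefit criteria with VIKOR.
    Returns Q values; lower Q = closer to the ideal solution.
    Assumes criteria values and the resulting S, R are non-degenerate."""
    m, n = len(F), len(F[0])
    fstar = [max(F[i][j] for i in range(m)) for j in range(n)]    # ideal values
    fminus = [min(F[i][j] for i in range(m)) for j in range(n)]   # anti-ideal values
    S, R = [], []
    for i in range(m):
        terms = [w[j] * (fstar[j] - F[i][j]) / (fstar[j] - fminus[j]) for j in range(n)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    Smin, Smax, Rmin, Rmax = min(S), max(S), min(R), max(R)
    return [v * (S[i] - Smin) / (Smax - Smin)
            + (1 - v) * (R[i] - Rmin) / (Rmax - Rmin) for i in range(m)]

# hypothetical scores for 3 Six Sigma candidate projects on 3 benefit criteria
F = [[7.0, 5.0, 8.0],
     [6.0, 9.0, 6.0],
     [9.0, 7.0, 7.0]]
w = [0.5, 0.3, 0.2]
Q = vikor(F, w)
best = Q.index(min(Q))   # project with the lowest Q is ranked first
```

In the hybrid model the weights w would come from the DEMATEL-adjusted ANP rather than being assigned directly, and the per-criterion gap terms are what point to where improvement effort should go.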

19. Two Dimensional Projection Pursuit Applied to Gaussian Mixture Model Fitting

Directory of Open Access Journals (Sweden)

Natella Likhterov

2003-08-01

Full Text Available In this paper we seek a Gaussian mixture model (GMM) of an n-variate probability density function. Usually the parameters of GMMs are determined by a maximum likelihood (ML) criterion. A practical deficiency of ML fitting of GMMs is poor performance when dealing with high-dimensional data, since a large sample size is needed to match the accuracy that is possible in low dimensions. We propose a method to fit the GMM to multivariate data which is based on the two-dimensional projection pursuit (PP) method. By means of simulations we compare the proposed method with a one-dimensional PP method for GMM. We conclude that a combination of one- and two-dimensional PP methods could be useful in some applications.
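Maximum likelihood fitting of a GMM is typically done with the EM algorithm. As a point of reference for what ML fitting entails (the paper's projection pursuit approach is a different remedy for the high-dimensional case), here is a self-contained one-dimensional, two-component EM sketch on synthetic data:

```python
import math, random

def em_gmm(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture (maximum likelihood)."""
    mu = [min(data), max(data)]          # crude initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk, 1e-6)
    return pi, mu, var

rng = random.Random(0)
data = ([rng.gauss(-2, 0.5) for _ in range(150)]
        + [rng.gauss(3, 0.8) for _ in range(150)])
pi, mu, var = em_gmm(data)   # recovers means near -2 and 3
```

In one dimension this works well with modest samples; the sample-size burden the abstract describes appears when the covariance matrices to be estimated grow quadratically with dimension, which is what motivates projecting to one or two dimensions first.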

20. Virtual building environments (VBE) - Applying information modeling to buildings

Energy Technology Data Exchange (ETDEWEB)

2004-06-21

A Virtual Building Environment (VBE) is a "place" where building industry project staff can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up, and characteristics of operation. It reports on the VBE Initiative and the benefits from a couple of early VBE projects.

1. Sensorless position estimator applied to nonlinear IPMC model

Science.gov (United States)

Bernat, Jakub; Kolota, Jakub

2016-11-01

This paper addresses the issue of estimating position for an ionic polymer metal composite (IPMC), a kind of electroactive polymer (EAP). The key step is the construction of a sensorless mode considering only current feedback. This work takes into account nonlinearities caused by electrochemical effects in the material. Owing to a recent observer design technique, the authors obtained both a Lyapunov-function-based estimation law and a sliding mode observer. To accomplish the observer design, the IPMC model was identified through a series of experiments comprising time domain measurements. The identification process was completed by means of geometric scaling of three test samples. In the proposed design, the estimated position accurately tracks the polymer position, which is illustrated by the experiments.
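The general observer idea, correcting a model-based state estimate with the measured output, can be sketched with a linear Luenberger observer on a toy two-state system. The paper's IPMC model and sliding-mode observer are nonlinear; the matrices and gains below are invented purely for illustration:

```python
# toy 2-state linear plant: x' = A x, measured output y = C x (a current-like signal)
A = [[-1.0, 0.0],
     [ 0.5, -0.2]]
C = [1.0, 0.0]
L = [2.0, 1.0]          # observer gain, chosen ad hoc so A - L C is stable

def step(x, xhat, dt=0.01):
    """One explicit-Euler step of plant and observer.
    Observer: xhat' = A xhat + L (y - C xhat)."""
    y = C[0] * x[0] + C[1] * x[1]
    yhat = C[0] * xhat[0] + C[1] * xhat[1]
    dx = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    dxh = [A[0][0] * xhat[0] + A[0][1] * xhat[1] + L[0] * (y - yhat),
           A[1][0] * xhat[0] + A[1][1] * xhat[1] + L[1] * (y - yhat)]
    return ([x[i] + dt * dx[i] for i in (0, 1)],
            [xhat[i] + dt * dxh[i] for i in (0, 1)])

x, xhat = [1.0, 0.0], [0.0, 0.0]     # observer starts from a wrong initial state
for _ in range(2000):                 # simulate 20 s
    x, xhat = step(x, xhat)
err = abs(x[1] - xhat[1])             # estimation error on the unmeasured state
```

The estimation error obeys e' = (A - L C) e, so with a stabilizing gain the unmeasured second state is recovered from the measured first one alone; the sensorless position estimation in the paper follows the same principle with a nonlinear model and observer.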

2. Applying the welfare model to at-own-risk discharges.

Science.gov (United States)

Krishna, Lalit Kumar Radha; Menon, Sumytra; Kanesvaran, Ravindran

2017-08-01

"At-own-risk discharges" or "self-discharges" evidence an irretrievable breakdown in the patient-clinician relationship when patients leave care facilities before completion of medical treatment and against medical advice. Dissolution of the therapeutic relationship terminates the physician's duty of care and professional liability with respect to care of the patient. Acquiescence to an at-own-risk discharge by the clinician is seen as respecting patient autonomy. The validity of such requests pivots on the assumptions that the patient is fully informed and competent to invoke an at-own-risk discharge and that care up to the point of the at-own-risk discharge meets prevailing clinical standards. Palliative care's use of a multidisciplinary team approach challenges both these assumptions. First, it establishes multiple independent therapeutic relationships between professionals in the multidisciplinary team and the patient, relationships that persist despite an at-own-risk discharge. These enduring therapeutic relationships negate the suggestion that no duty of care is owed to the patient. Second, the continued employment of collusion, familial determinations, and the circumnavigation of direct patient involvement in family-centric societies compromises the patient's decision-making capacity and raises questions as to their ability to assume responsibility for the repercussions of invoking an at-own-risk discharge. With the validity of at-own-risk discharge requests in question and patient welfare and interests at stake, an alternative approach to assessing at-own-risk discharge requests is called for. The welfare model circumnavigates these concerns and preserves the patient's welfare through a multidisciplinary-team-guided holistic appraisal of the patient's specific situation that is informed by clinical and institutional standards and evidence-based practice. The welfare model provides a robust decision-making framework for

3. Nonspherical Radiation Driven Wind Models Applied to Be Stars

Science.gov (United States)

Arauxo, F. X.

1990-11-01

4. Developing and Validating a Survival Prediction Model for NSCLC Patients Through Distributed Learning Across 3 Countries.

Science.gov (United States)

Jochems, Arthur; Deist, Timo M; El Naqa, Issam; Kessler, Marc; Mayo, Chuck; Reeves, Jackson; Jolly, Shruti; Matuszak, Martha; Ten Haken, Randall; van Soest, Johan; Oberije, Cary; Faivre-Finn, Corinne; Price, Gareth; de Ruysscher, Dirk; Lambin, Philippe; Dekker, Andre

2017-10-01

Tools for survival prediction for non-small cell lung cancer (NSCLC) patients treated with chemoradiation or radiation therapy are of limited quality. In this work, we developed a predictive model of survival at 2 years. The model is based on a large volume of historical patient data and serves as a proof of concept to demonstrate the distributed learning approach. Clinical data from 698 lung cancer patients, treated with curative intent with chemoradiation or radiation therapy alone, were collected and stored at 2 different cancer institutes (559 patients at Maastro clinic (Netherlands) and 139 at the University of Michigan (United States)). The model was further validated on 196 patients originating from The Christie (United Kingdom). A Bayesian network model was adapted for distributed learning (the animation can be viewed at https://www.youtube.com/watch?v=ZDJFOxpwqEA). Two-year posttreatment survival was chosen as the endpoint. The Maastro clinic cohort data are publicly available at https://www.cancerdata.org/publication/developing-and-validating-survival-prediction-model-nsclc-patients-through-distributed, and the developed models can be found at www.predictcancer.org. Variables included in the final model were T and N category, age, performance status, and total tumor dose. The model has an area under the curve (AUC) of 0.66 on the external validation set and an AUC of 0.62 on 5-fold cross-validation. A model based on the T and N category alone performed with an AUC of 0.47 on the validation set, significantly worse than our model (P<.001). Learning the model in a centralized or distributed fashion yields a minor difference in the probabilities of the conditional probability tables (0.6%); the discriminative performance of the models on the validation set is similar (P=.26). Distributed learning from federated databases allows learning of predictive models on data originating from multiple institutions while avoiding many of the data-sharing barriers. We believe
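With a binary 2-year endpoint, the reported AUCs can be read as the Mann-Whitney statistic: the probability that a randomly chosen survivor receives a higher predicted survival score than a randomly chosen non-survivor. A minimal sketch on invented data (not the study's cohort):

```python
# AUC computed as the Mann-Whitney U statistic on a toy binary cohort.
# All labels/scores below are invented for illustration.

def auc(labels, scores):
    """AUC = P(score of a positive > score of a negative); ties count 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]                 # 1 = alive at 2 years
scores = [0.9, 0.7, 0.4, 0.6, 0.3, 0.2]     # predicted survival probability
print(round(auc(labels, scores), 3))        # → 0.889
```

An AUC of 0.5 corresponds to random ranking, which is why the TN-only baseline's 0.47 is essentially uninformative.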

5. Modelling p-value distributions to improve theme-driven survival analysis of cancer transcriptome datasets

Directory of Open Access Journals (Sweden)

Brors Benedikt

2010-01-01

Full Text Available Abstract Background Theme-driven cancer survival studies address whether the expression signature of genes related to a biological process can predict patient survival time. Although this should ideally be achieved by testing two separate null hypotheses, current methods treat both hypotheses as one. The first test should assess whether a geneset, independent of its composition, is associated with prognosis (frequently done with a survival test). The second test then verifies whether the theme of the geneset is relevant (usually done with an empirical test that compares the geneset of interest with random genesets). Current methods do not test this second null hypothesis because it has been assumed that the distribution of p-values for random genesets (when tested against the first null hypothesis) is uniform. Here we demonstrate that such an assumption is generally incorrect and that, consequently, such methods may erroneously associate the biology of a particular geneset with cancer prognosis. Results To assess the impact of non-uniform distributions for random genesets in such studies, an automated theme-driven method was developed. This method empirically approximates the p-value distribution of sets of unrelated genes based on a permutation approach, and tests whether predefined sets of biologically related genes are associated with survival. The results from a comparison with a published theme-driven approach revealed non-uniform distributions, suggesting a significant problem exists with false positive rates in the original study. When applied to two public cancer datasets our technique revealed novel ontological categories with prognostic power, including significant associations of "fatty acid metabolism" with overall survival in breast cancer, and of "receptor mediated endocytosis", "brain development", "apical plasma membrane" and "MAPK signaling pathway" with overall survival in lung cancer. Conclusions Current methods of theme
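The second, empirical test described above can be sketched as follows: score the predefined geneset, then compare it against many random genesets of the same size, yielding an empirical p-value without assuming the null distribution is uniform. All statistics below are synthetic, not the study's data:

```python
# Empirical geneset test via permutation (synthetic data throughout).
import random

random.seed(0)
# Hypothetical per-gene association statistics, e.g. from a survival test.
gene_stats = {f"g{i}": random.gauss(0, 1) for i in range(1000)}
geneset = ["g1", "g2", "g3", "g4", "g5"]  # hypothetical "theme" geneset

def score(genes):
    """Mean absolute association statistic of a geneset."""
    return sum(abs(gene_stats[g]) for g in genes) / len(genes)

observed = score(geneset)
names = list(gene_stats)
# Null distribution: scores of random genesets of the same size.
null = [score(random.sample(names, len(geneset))) for _ in range(2000)]
# Empirical p-value with the standard +1 correction to avoid p = 0.
p_emp = (1 + sum(s >= observed for s in null)) / (1 + len(null))
print(0.0 < p_emp <= 1.0)  # → True
```

Because the null scores are obtained empirically rather than assumed uniform, the test remains calibrated even when random genesets have a skewed p-value distribution.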

6. Atomistic Method Applied to Computational Modeling of Surface Alloys

Science.gov (United States)

Bozzolo, Guillermo H.; Abel, Phillip B.

2000-01-01

The formation of surface alloys is a growing research field that, in terms of the surface structure of multicomponent systems, defines the frontier both for experimental and theoretical techniques. Because of the impact that the formation of surface alloys has on surface properties, researchers need reliable methods to predict new surface alloys and to help interpret unknown structures. The structure of surface alloys and when, and even if, they form are largely unpredictable from the known properties of the participating elements. No unified theory or model to date can infer surface alloy structures from the constituents' properties or their bulk alloy characteristics. In spite of these severe limitations, a growing catalogue of such systems has been developed during the last decade, and only recently are global theories being advanced to fully understand the phenomenon. None of the methods used in other areas of surface science can properly model even the already known cases. Aware of these limitations, the Computational Materials Group at the NASA Glenn Research Center at Lewis Field has developed a useful, computationally economical, and physically sound methodology to enable the systematic study of surface alloy formation in metals. This tool has been tested successfully on several known systems for which hard experimental evidence exists and has been used to predict ternary surface alloy formation (results to be published: Garces, J.E.; Bozzolo, G.; and Mosca, H.: Atomistic Modeling of Pd/Cu(100) Surface Alloy Formation. Surf. Sci., 2000 (in press); Mosca, H.; Garces J.E.; and Bozzolo, G.: Surface Ternary Alloys of (Cu,Au)/Ni(110). (Accepted for publication in Surf. Sci., 2000.); and Garces, J.E.; Bozzolo, G.; Mosca, H.; and Abel, P.: A New Approach for Atomistic Modeling of Pd/Cu(110) Surface Alloy Formation. (Submitted to Appl. Surf. Sci.)). Ternary alloy formation is a field yet to be fully explored experimentally. The computational tool, which is based on

7. Applying revised gap analysis model in measuring hotel service quality.

Science.gov (United States)

Lee, Yu-Cheng; Wang, Yu-Che; Chien, Chih-Hung; Wu, Chia-Huei; Lu, Shu-Chiung; Tsai, Sang-Bing; Dong, Weiwei

2016-01-01

The number of tourists coming to Taiwan has grown by 10-20% since 2010, driven by an increasing number of foreign visitors, particularly after deregulation admitted tourist groups, and later individual tourists, from mainland China. The purpose of this study is to propose a revised gap model to evaluate and improve service quality in the Taiwanese hotel industry. With such a model, service quality can be clearly measured through gap analysis, which is more effective for offering direction in developing and improving service quality. The HOLSERV instrument was used to identify and analyze service gaps from the perceptions of internal and external customers. The sample for this study included three main categories of respondents: tourists, employees, and managers. The results show that five gaps influenced tourists' evaluations of service quality. In particular, the study revealed that Gap 1 (management perceptions vs. customer expectations) and Gap 9 (service provider perceptions of management perceptions vs. service delivery) were more critical than the others in affecting perceived service quality, making service delivery the main area of improvement. This study contributes toward an evaluation of the service quality of the Taiwanese hotel industry from the perspectives of customers, service providers, and managers, which is considerably valuable for hotel managers. It was the aim of this study to explore all of these together in order to better understand the possible gaps in the hotel industry in Taiwan.

8. Optical Neural Network Models Applied To Logic Program Execution

Science.gov (United States)

Stormon, Charles D.

1988-05-01

Logic programming is being used extensively by Artificial Intelligence researchers to solve problems including natural language processing and expert systems. These languages, of which Prolog is the most widely used, promise to revolutionize software engineering, but much greater performance is needed. Researchers have demonstrated the applicability of neural network models to the solution of certain NP-complete problems, but these methods are not obviously applicable to the execution of logic programs. This paper outlines the use of neural networks in four aspects of the logic program execution cycle, and discusses results of a simulation of three of these. Four neural network functional units are described, called the substitution agent, the clause filter, the structure processor, and the heuristics generator, respectively. Simulation results suggest that the system described may provide several orders of magnitude improvement in execution speed for large logic programs. However, practical implementation of the proposed architecture will require the application of optical computing techniques due to the large number of neurons required, and the need for massive, adaptive connectivity.

9. A model of smoldering combustion applied to flexible polyurethane foams

Science.gov (United States)

Ohlemiller, T. J.; Rogers, F.; Bellan, J.

1979-01-01

Smoldering combustion, particularly in upholstery and bedding materials, has been proven a serious life hazard. The simplest representation of this hazard situation is one-dimensional downward propagation of a smolder wave against a buoyant upflow (cocurrent smolder); the configuration treated here is identical in all respects to this except for the presence of a forced flow replacing the buoyant one. The complex degradation chemistry of the polyurethanes is here reduced to the two major overall reactions of char formation and char oxidation. The model solutions, which are in reasonable agreement with experimental results, show the smolder process to be oxygen-limited, which leads to some very simple trends. More subtle behavior aspects determine actual propagation velocity, fraction of fuel consumed, and apparent equivalence ratio (all of which are variable). The self-insulating character of the smolder wave makes it viable in a wide-ranging set of conditions if the igniting stimulus is sufficiently long. These results have significant implications regarding the problem of smolder prevention or hindrance.

10. A variable polytrope index applied to planet and material models

Science.gov (United States)

Thielen, Kevin; Weppner, Stephen; Zielinski, Alexander

2016-01-01

We introduce a new approach to a century-old assumption which enhances not only planetary interior calculations but also high-pressure material physics. We show that the polytropic index is the derivative of the bulk modulus with respect to pressure. We then augment the traditional polytrope theory by including a variable polytrope index within the confines of the Lane-Emden differential equation. To investigate the possibilities of this method, we create a high-quality universal equation of state, transforming the traditional polytrope method to a tool with the potential for excellent predictive power. The theoretical foundation of our equation of state is the same elastic observable which we found equivalent to the polytrope index, the derivative of the bulk modulus with respect to pressure. We calculate the density-pressure relations of six common materials up to 10^18 Pa and mass-radius relationships for the same materials, and produce plausible density-radius models for the rocky planets of our Solar system. We argue that the bulk modulus and its derivatives have been underutilized in previous planet formation methods. We constrain the material surface observables for the inner core, outer core, and mantle of planet Earth in a systematic way including pressure, bulk modulus, and the polytrope index in the analysis. We believe that this variable polytrope method has the necessary apparatus to be extended further to gas giants and stars. As supplemental material we provide computer code to calculate multi-layered planets.
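The Lane-Emden equation the abstract builds on can be integrated numerically. Below is a minimal sketch for the classical constant-index case (the article's variable-index extension would replace the fixed n with a pressure-dependent one), checked against the exact n = 1 solution θ(ξ) = sin(ξ)/ξ:

```python
# RK4 integration of the Lane-Emden equation θ'' + (2/ξ)θ' + θ^n = 0
# with θ(0) = 1, θ'(0) = 0, started just off the ξ = 0 singularity
# using the series expansion θ ≈ 1 - ξ²/6, θ' ≈ -ξ/3.
import math

def lane_emden(n, xi_end, h=1e-4):
    xi = 1e-6
    y = (1.0 - xi * xi / 6.0, -xi / 3.0)   # (θ, θ')

    def f(xi, y):
        th, dth = y
        # max(th, 0) guards against fractional powers of tiny negatives
        return (dth, -2.0 / xi * dth - max(th, 0.0) ** n)

    while xi < xi_end:
        k1 = f(xi, y)
        k2 = f(xi + h / 2, (y[0] + h / 2 * k1[0], y[1] + h / 2 * k1[1]))
        k3 = f(xi + h / 2, (y[0] + h / 2 * k2[0], y[1] + h / 2 * k2[1]))
        k4 = f(xi + h, (y[0] + h * k3[0], y[1] + h * k3[1]))
        y = (y[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
             y[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))
        xi += h
    return y[0]

# n = 1 has the exact solution θ(ξ) = sin(ξ)/ξ, a convenient check at ξ = 1.
print(abs(lane_emden(1, 1.0) - math.sin(1.0)) < 1e-4)  # → True
```

A variable-index scheme would evaluate n from the material's bulk-modulus derivative at the local pressure on each step instead of holding it constant.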

11. Interdependent multi-layer networks: modeling and survivability analysis with applications to space-based networks.

Science.gov (United States)

Castet, Jean-Francois; Saleh, Joseph H

2013-01-01

This article develops a novel approach and algorithmic tools for the modeling and survivability analysis of networks with heterogeneous nodes, and examines their application to space-based networks. Space-based networks (SBNs) allow the sharing of spacecraft on-orbit resources, such as data storage, processing, and downlink. Each spacecraft in the network can have a different subsystem composition and functionality, thus resulting in node heterogeneity. Most traditional survivability analyses of networks assume node homogeneity and, as a result, are not suited for the analysis of SBNs. This work proposes that heterogeneous networks can be modeled as interdependent multi-layer networks, which enables their survivability analysis. The multi-layer aspect captures the breakdown of the network according to common functionalities across the different nodes, and it allows the emergence of homogeneous sub-networks, while the interdependency aspect constrains the network to capture the physical characteristics of each node. Definitions of primitives of failure propagation are devised. Formal characterization of interdependent multi-layer networks, as well as algorithmic tools for the analysis of failure propagation across the network, are developed and illustrated with space applications. The SBN applications considered consist of several networked spacecraft that can tap into each other's Command and Data Handling subsystem should their own fail, including the Telemetry, Tracking and Command, the Control Processor, and the Data Handling sub-subsystems. Various design insights are derived and discussed, and the capability to perform trade-space analysis with the proposed approach for various network characteristics is indicated. The selected results shown here quantify the incremental survivability gains (with respect to a particular class of threats) of the SBN over the traditional monolith spacecraft. Failure of the connectivity between nodes is also examined, and the
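The failure-propagation idea can be illustrated with a toy cascade over interdependency edges. This is a minimal illustration on an invented two-layer example, not the article's formal primitives:

```python
# Toy failure cascade across two interdependent layers: a node fails if
# any node it depends on has failed. Names and edges are hypothetical.
from collections import deque

depends_on = {                       # directed interdependency edges
    "A1": ["B1"], "A2": ["B2"], "A3": ["B2"],   # layer A: data handling
    "B1": [], "B2": [],                         # layer B: power
}
# Invert the map: who depends on each node?
dependents = {n: [] for n in depends_on}
for node, deps in depends_on.items():
    for d in deps:
        dependents[d].append(node)

def cascade(initial_failures):
    """Breadth-first propagation; returns the full set of failed nodes."""
    failed = set(initial_failures)
    queue = deque(initial_failures)
    while queue:
        node = queue.popleft()
        for dep in dependents[node]:   # dependents of a failed node fail too
            if dep not in failed:
                failed.add(dep)
                queue.append(dep)
    return failed

print(sorted(cascade({"B2"})))
```

Here a single power-layer failure takes down every data-handling node that depends on it, which is the kind of cross-layer amplification a survivability analysis must quantify.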

12. Interdependent multi-layer networks: modeling and survivability analysis with applications to space-based networks.

Directory of Open Access Journals (Sweden)

Jean-Francois Castet

Full Text Available This article develops a novel approach and algorithmic tools for the modeling and survivability analysis of networks with heterogeneous nodes, and examines their application to space-based networks. Space-based networks (SBNs) allow the sharing of spacecraft on-orbit resources, such as data storage, processing, and downlink. Each spacecraft in the network can have a different subsystem composition and functionality, thus resulting in node heterogeneity. Most traditional survivability analyses of networks assume node homogeneity and, as a result, are not suited for the analysis of SBNs. This work proposes that heterogeneous networks can be modeled as interdependent multi-layer networks, which enables their survivability analysis. The multi-layer aspect captures the breakdown of the network according to common functionalities across the different nodes, and it allows the emergence of homogeneous sub-networks, while the interdependency aspect constrains the network to capture the physical characteristics of each node. Definitions of primitives of failure propagation are devised. Formal characterization of interdependent multi-layer networks, as well as algorithmic tools for the analysis of failure propagation across the network, are developed and illustrated with space applications. The SBN applications considered consist of several networked spacecraft that can tap into each other's Command and Data Handling subsystem should their own fail, including the Telemetry, Tracking and Command, the Control Processor, and the Data Handling sub-subsystems. Various design insights are derived and discussed, and the capability to perform trade-space analysis with the proposed approach for various network characteristics is indicated. The selected results shown here quantify the incremental survivability gains (with respect to a particular class of threats) of the SBN over the traditional monolith spacecraft. Failure of the connectivity between nodes is also

13. Survival analysis of stochastic competitive models in a polluted environment and stochastic competitive exclusion principle.

Science.gov (United States)

Liu, Meng; Wang, Ke; Wu, Qiong

2011-09-01

Stochastic competitive models with pollution and without pollution are proposed and studied. For the first system with pollution, sufficient criteria for extinction, nonpersistence in the mean, weak persistence in the mean, strong persistence in the mean, and stochastic permanence are established. The threshold between weak persistence in the mean and extinction for each population is obtained. It is found that stochastic disturbance is favorable for the survival of one species and is unfavorable for the survival of the other species. For the second system with pollution, sufficient conditions for extinction and weak persistence are obtained. For the model without pollution, a partial stochastic competitive exclusion principle is derived. © Society for Mathematical Biology 2010
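The persistence-versus-extinction behavior described above can be explored numerically with an Euler-Maruyama scheme. The following is an illustrative sketch of a generic two-species stochastic competitive system with white noise on the growth rates; parameters are my own toy values, not the article's model:

```python
# Euler-Maruyama simulation of a two-species stochastic competition model:
# dx = x(r1 - a11*x - a12*y)dt + s1*x*dW1,  dy analogous. Toy parameters.
import random

random.seed(42)

def simulate(sigma1, sigma2, T=100.0, dt=0.001):
    x, y = 0.5, 0.5
    r1, r2 = 1.0, 1.0
    a11, a12, a21, a22 = 1.0, 0.5, 0.5, 1.0
    for _ in range(int(T / dt)):
        dw1 = random.gauss(0.0, dt ** 0.5)   # Brownian increments
        dw2 = random.gauss(0.0, dt ** 0.5)
        x += x * (r1 - a11 * x - a12 * y) * dt + sigma1 * x * dw1
        y += y * (r2 - a21 * x - a22 * y) * dt + sigma2 * y * dw2
        x, y = max(x, 0.0), max(y, 0.0)      # populations stay non-negative
    return x, y

x, y = simulate(0.1, 0.1)
print(x > 0 and y > 0)   # with weak noise, both species persist here
```

Raising sigma for one species while keeping the other small is the kind of experiment that exposes the asymmetry the abstract notes: stochastic disturbance can favor one species' survival at the expense of the other's.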

14. Interleukin-7 Ameliorates Immune Dysfunction and Improves Survival in a 2-Hit Model of Fungal Sepsis

OpenAIRE

Unsinger, Jacqueline; Burnham, Carey-Ann D.; McDonough, Jacquelyn; Morre, Michel; Prakash, Priya S.; Caldwell, Charles C.; Dunne, W. Michael; Hotchkiss, Richard S.

2012-01-01

Background. Secondary hospital-acquired fungal infections are common in critically-ill patients and mortality remains high despite antimicrobial therapy. Interleukin-7 (IL-7) is a potent immunotherapeutic agent that improves host immunity and has shown efficacy in bacterial and viral models of infection. This study examined the ability of IL-7, which is currently in multiple clinical trials (including hepatitis and human immunodeficiency virus), to improve survival in a clinically relevant 2-...

15. Metabolomics with Nuclear Magnetic Resonance Spectroscopy in a Drosophila melanogaster Model of Surviving Sepsis

Science.gov (United States)

Bakalov, Veli; Amathieu, Roland; Triba, Mohamed N.; Clément, Marie-Jeanne; Reyes Uribe, Laura; Le Moyec, Laurence; Kaynar, Ata Murat

2016-01-01

Patients surviving sepsis demonstrate sustained inflammation, which has been associated with long-term complications. One of the main mechanisms behind sustained inflammation is a metabolic switch in parenchymal and immune cells, thus understanding metabolic alterations after sepsis may provide important insights to the pathophysiology of sepsis recovery. In this study, we explored metabolomics in a novel Drosophila melanogaster model of surviving sepsis using Nuclear Magnetic Resonance (NMR), to determine metabolite profiles. We used a model of percutaneous infection in Drosophila melanogaster to mimic sepsis. We had three experimental groups: sepsis survivors (infected with Staphylococcus aureus and treated with oral linezolid), sham (pricked with an aseptic needle), and unmanipulated (positive control). We performed metabolic measurements seven days after sepsis. We then implemented metabolites detected in NMR spectra into the MetExplore web server in order to identify the metabolic pathway alterations in sepsis surviving Drosophila. Our NMR metabolomic approach in a Drosophila model of recovery from sepsis clearly distinguished between all three groups and showed two different metabolomic signatures of inflammation. Sham flies had decreased levels of maltose, alanine, and glutamine, while their level of choline was increased. Sepsis survivors had a metabolic signature characterized by decreased glucose, maltose, tyrosine, beta-alanine, acetate, glutamine, and succinate. PMID:28009836

16. Non-viral VEGF165 gene therapy – magnetofection of acoustically active magnetic lipospheres (‘magnetobubbles’) increases tissue survival in an oversized skin flap model

Science.gov (United States)

Holzbach, Thomas; Vlaskou, Dialekti; Neshkova, Iva; Konerding, Moritz A; Wörtler, Klaus; Mykhaylyk, Olga; Gänsbacher, Bernd; Machens, H-G; Plank, Christian; Giunta, Riccardo E

2010-01-01

Abstract Adenoviral transduction of the VEGF gene in an oversized skin flap increases flap survival and perfusion. In this study, we investigated the potential of magnetofection of magnetic lipospheres containing VEGF165-cDNA on survival and perfusion of ischemic skin flaps, and evaluated the method with respect to the significance of the applied magnetic field and ultrasound. We prepared perfluoropropane-filled magnetic lipospheres (‘magnetobubbles’) from Tween60-coated magnetic nanoparticles, Metafectene, soybean oil and cDNA, and studied the effect in an oversized random-pattern flap model in rats (n = 46). VEGF-cDNA-magnetobubbles were administered under a magnetic field with simultaneously applied ultrasound, under a magnetic field alone, and with applied ultrasound alone. Therapy was conducted 7 days pre-operatively. Flap survival and necrosis were measured 7 days post-operatively. Flap perfusion, VEGF-protein concentration in target and surrounding tissue, and the formation and appearance of new vessels were also analysed. Magnetofection with VEGF-cDNA-magnetobubbles increased flap survival by 50% and increased flap perfusion (P < 0.05). Without ultrasound or without the magnetic field, the effect was weakened. VEGF concentration in target tissue was elevated (P < 0.05), while the underlying muscle was not affected. Our results demonstrate successful VEGF gene therapy by means of magnetobubble magnetofection. Here, the method of magnetofection of magnetic lipospheres is as efficient as adenoviral transduction, but with a presumably superior safety profile. PMID:19040418

17. Non-viral VEGF(165) gene therapy--magnetofection of acoustically active magnetic lipospheres ('magnetobubbles') increases tissue survival in an oversized skin flap model.

Science.gov (United States)

Holzbach, Thomas; Vlaskou, Dialekti; Neshkova, Iva; Konerding, Moritz A; Wörtler, Klaus; Mykhaylyk, Olga; Gänsbacher, Bernd; Machens, H-G; Plank, Christian; Giunta, Riccardo E

2010-03-01

Adenoviral transduction of the VEGF gene in an oversized skin flap increases flap survival and perfusion. In this study, we investigated the potential of magnetofection of magnetic lipospheres containing VEGF(165)-cDNA on survival and perfusion of ischemic skin flaps, and evaluated the method with respect to the significance of the applied magnetic field and ultrasound. We prepared perfluoropropane-filled magnetic lipospheres ('magnetobubbles') from Tween60-coated magnetic nanoparticles, Metafectene, soybean oil and cDNA, and studied the effect in an oversized random-pattern flap model in rats (n = 46). VEGF-cDNA-magnetobubbles were administered under a magnetic field with simultaneously applied ultrasound, under a magnetic field alone, and with applied ultrasound alone. Therapy was conducted 7 days pre-operatively. Flap survival and necrosis were measured 7 days post-operatively. Flap perfusion, VEGF-protein concentration in target and surrounding tissue, and the formation and appearance of new vessels were also analysed. Magnetofection with VEGF-cDNA-magnetobubbles increased flap survival by 50% and increased flap perfusion (P < 0.05). Without ultrasound or without the magnetic field, the effect was weakened. VEGF concentration in target tissue was elevated (P < 0.05), while the underlying muscle was not affected. Our results demonstrate successful VEGF gene therapy by means of magnetobubble magnetofection. Here, the method of magnetofection of magnetic lipospheres is as efficient as adenoviral transduction, but with a presumably superior safety profile.

18. Applying horizontal diffusion on pressure surface to mesoscale models on terrain-following coordinates

Science.gov (United States)

Hann-Ming Henry Juang; Ching-Teng Lee; Yongxin Zhang; Yucheng Song; Ming-Chin Wu; Yi-Leng Chen; Kevin Kodama; Shyh-Chin Chen

2005-01-01

The National Centers for Environmental Prediction regional spectral model and mesoscale spectral model (NCEP RSM/MSM) use a spectral computation on perturbation. The perturbation is defined as a deviation between RSM/MSM forecast value and their outer model or analysis value on model sigma-coordinate surfaces. The horizontal diffusion used in the models applies...

19. Leptin-deficient obesity prolongs survival in a murine model of myelodysplastic syndrome.

Science.gov (United States)

Kraakman, Michael J; Kammoun, Helene L; Dragoljevic, Dragana; Al-Sharea, Annas; Lee, Man K S; Flynn, Michelle C; Stolz, Christian J; Guirguis, Andrew A; Lancaster, Graeme I; Chin-Dusting, Jaye; Curtis, David J; Murphy, Andrew J

2018-01-25

Obesity enhances the risk of developing myelodysplastic syndromes. However, the effect of obesity on survival is unclear. Obese people present with monocytosis due to inflammatory signals emanating from obese adipose tissue. We hypothesized that obesity-induced myelopoiesis would promote the transition of myelodysplastic syndrome to acute myeloid leukemia and accelerate mortality in obesity. Obese Ob/Ob mice or their lean littermate controls received a bone marrow transplant from NUP98-HOXD13 transgenic mice, a model of myelodysplastic syndrome. The metabolic parameters of the mice were examined throughout the course of the study, as were blood leukocytes. Myeloid cells were analyzed in the bone, spleen, liver and adipose tissue by flow cytometry halfway through the disease progression and at the endpoint. Survival curves were also calculated. Contrary to our hypothesis, transplantation of NUP98-HOXD13 bone marrow into obese recipient mice significantly increased survival time compared with lean recipient controls. While monocyte skewing was exacerbated in obese mice receiving NUP98-HOXD13 bone marrow, transformation to acute myeloid leukemia was not enhanced. Increased survival of obese mice was associated with a preservation of fat mass as well as increased myeloid cell deposition within the adipose tissue and a concomitant reduction in detrimental myeloid cell accumulation within other organs. This study revealed that obesity increases survival in animals with myelodysplastic syndrome. This may be due to the greater fat mass of Ob/Ob mice, which acts as a sink for myeloid cells, preventing their accumulation in other key organs such as the liver. Copyright © 2018, Ferrata Storti Foundation.
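Survival curves such as those calculated in this study are typically Kaplan-Meier product-limit estimates. A minimal sketch on invented follow-up data (not the study's animals), assuming no tied event times:

```python
# Kaplan-Meier product-limit estimator on a toy cohort.
# times/events below are invented; 1 = death observed, 0 = censored.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each observed death time."""
    at_risk = len(times)
    s = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e == 1:                         # a death reduces S(t)
            s *= (at_risk - 1) / at_risk
            curve.append((t, round(s, 3)))
        at_risk -= 1                       # everyone leaves the risk set
    return curve

times  = [3, 5, 7, 8, 10]   # weeks of follow-up
events = [1, 1, 0, 1, 0]    # deaths at weeks 3, 5, 8; two animals censored
print(kaplan_meier(times, events))  # → [(3, 0.8), (5, 0.6), (8, 0.3)]
```

Censored subjects leave the risk set without dropping the curve, which is how differing follow-up between obese and lean cohorts is handled without bias.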

20. A Validated Prediction Model for Overall Survival From Stage III Non-Small Cell Lung Cancer: Toward Survival Prediction for Individual Patients

Energy Technology Data Exchange (ETDEWEB)

Oberije, Cary, E-mail: cary.oberije@maastro.nl [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands); De Ruysscher, Dirk [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands); Universitaire Ziekenhuizen Leuven, KU Leuven (Belgium); Houben, Ruud [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands); Heuvel, Michel van de; Uyterlinde, Wilma [Department of Thoracic Oncology, Netherlands Cancer Institute, Amsterdam (Netherlands); Deasy, Joseph O. [Memorial Sloan Kettering Cancer Center, New York (United States); Belderbos, Jose [Department of Radiation Oncology, Netherlands Cancer Institute, Amsterdam (Netherlands); Dingemans, Anne-Marie C. [Department of Pulmonology, University Hospital Maastricht, Research Institute GROW of Oncology, Maastricht (Netherlands); Rimner, Andreas; Din, Shaun [Memorial Sloan Kettering Cancer Center, New York (United States); Lambin, Philippe [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands)

2015-07-15

Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.
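The c statistic reported here is Harrell's concordance index: among usable patient pairs, the fraction where the patient predicted to be at higher risk actually dies earlier. A minimal sketch on invented data (not the study's cohort), handling right censoring:

```python
# Harrell's concordance index (c statistic) on a toy survival data set.
# All times/events/risks below are invented for illustration.

def c_index(times, events, risks):
    """times: follow-up; events: 1 = death observed; risks: higher = worse."""
    num, den = 0.0, 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is usable only if i's death occurred before j's
            # follow-up ended; censored i cannot anchor a comparison.
            if events[i] == 1 and times[i] < times[j]:
                den += 1
                if risks[i] > risks[j]:
                    num += 1
                elif risks[i] == risks[j]:
                    num += 0.5
    return num / den

times  = [5, 8, 12, 20]    # months
events = [1, 1, 0, 1]      # 1 = death observed, 0 = censored
risks  = [0.9, 0.6, 0.4, 0.2]
print(c_index(times, events, risks))  # → 1.0 (perfectly concordant toy data)
```

A c statistic of 0.5 is chance-level discrimination, so the reported 0.62 indicates modest but real ordering of patients by risk.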

1. Prognostic model for survival in patients with early stage cervical cancer.

Science.gov (United States)

Biewenga, Petra; van der Velden, Jacobus; Mol, Ben Willem J; Stalpers, Lukas J A; Schilthuis, Marten S; van der Steeg, Jan Willem; Burger, Matthé P M; Buist, Marrije R

2011-02-15

In the management of early stage cervical cancer, knowledge about the prognosis is critical. Although many factors have an impact on survival, their relative importance remains controversial. This study aims to develop a prognostic model for survival in early stage cervical cancer patients and to reconsider grounds for adjuvant treatment. A multivariate Cox regression model was used to identify the prognostic weight of clinical and histological factors for disease-specific survival (DSS) in 710 consecutive patients who had surgery for early stage cervical cancer (FIGO [International Federation of Gynecology and Obstetrics] stage IA2-IIA). Prognostic scores were derived by converting the regression coefficients for each prognostic marker and used in a score chart. The discriminative capacity was expressed as the area under the curve (AUC) of the receiver operating characteristic. The 5-year DSS was 92%. Tumor diameter, histological type, lymph node metastasis, depth of stromal invasion, lymph vascular space invasion, and parametrial extension were independently associated with DSS and were included in a Cox regression model. This prognostic model, corrected for the 9% overfit shown by internal validation, showed a fair discriminative capacity (AUC, 0.73). The derived score chart predicting 5-year DSS showed a good discriminative capacity (AUC, 0.85). In patients with early stage cervical cancer, DSS can be predicted with a statistical model. Models, such as that presented here, should be used in clinical trials on the effects of adjuvant treatments in high-risk early cervical cancer patients, both to stratify and to include patients. Copyright © 2010 American Cancer Society.
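Score charts of the kind derived here are typically built by rescaling Cox regression coefficients to integer points that a clinician can sum. The coefficients and marker names below are invented for illustration, not taken from the study:

```python
# Converting Cox coefficients (log hazard ratios) into a points-based
# score chart. All coefficients here are hypothetical.

coefs = {
    "tumor_diameter_>4cm":   0.69,
    "lymph_node_metastasis": 1.10,
    "deep_stromal_invasion": 0.41,
    "parametrial_extension": 0.55,
}

# Scale so the smallest coefficient maps to 1 point, then round.
smallest = min(coefs.values())
points = {marker: round(beta / smallest) for marker, beta in coefs.items()}

# A hypothetical patient's total score is the sum of her positive markers.
patient = ["lymph_node_metastasis", "deep_stromal_invasion"]
total = sum(points[m] for m in patient)
print(points, total)
```

The rounding trades a little discriminative capacity for bedside usability, which is why the abstract reports separate AUCs for the full model and the derived score chart.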

2. Model Proposition for the Fiscal Policies Analysis Applied in Economic Field

Directory of Open Access Journals (Sweden)

Larisa Preda

2007-05-01

Full Text Available This paper presents a study of fiscal policy as applied to economic development. Correlations between macroeconomic and fiscal indicators form the first step in our analysis. The next step is the proposal of a new model for fiscal and budgetary choices. This model is then applied to data from the Romanian case.

3. Development of a model to predict breast cancer survival using data from the National Cancer Data Base.

Science.gov (United States)

Asare, Elliot A; Liu, Lei; Hess, Kenneth R; Gordon, Elisa J; Paruch, Jennifer L; Palis, Bryan; Dahlke, Allison R; McCabe, Ryan; Cohen, Mark E; Winchester, David P; Bilimoria, Karl Y

2016-02-01

With the large amounts of data on patient, tumor, and treatment factors available to clinicians, it has become critically important to harness this information to guide clinicians in discussing a patient's prognosis. However, no widely accepted survival calculator is available that uses national data and includes multiple prognostic factors. Our objective was to develop a model for predicting survival among patients diagnosed with breast cancer using the National Cancer Data Base (NCDB) to serve as a prototype for the Commission on Cancer's "Cancer Survival Prognostic Calculator." A retrospective cohort of patients diagnosed with breast cancer (2003-2006) in the NCDB was included. A multivariable Cox proportional hazards regression model to predict overall survival was developed. Model discrimination by 10-fold internal cross-validation and calibration was assessed. There were 296,284 patients for model development and internal validation. The c-index for the 10-fold cross-validation ranged from 0.779 to 0.788 after inclusion of all available pertinent prognostic factors. A plot of the observed versus predicted 5 year overall survival showed minimal deviation from the reference line. This breast cancer survival prognostic model to be used as a prototype for building the Commission on Cancer's "Cancer Survival Prognostic Calculator" will offer patients and clinicians an objective opportunity to estimate personalized long-term survival based on patient demographic characteristics, tumor factors, and treatment delivered. Copyright © 2016 Elsevier Inc. All rights reserved.

4. Structural Modeling and Analysis of a Wave Energy Converter Applying Dynamical Substructuring Method

DEFF Research Database (Denmark)

Zurkinden, Andrew Stephen; Damkilde, Lars; Gao, Zhen

2013-01-01

This paper deals with structural modeling and analysis of a wave energy converter. The device, called Wavestar, is a bottom-fixed structure located in a shallow water environment at the Danish northwest coast. The analysis is concentrated on a single float and its structural arm, which connects the WEC to a jack-up structure. The wave energy converter is characterized by having an operational and a survival mode. The survival mode drastically reduces the exposure to waves and therefore the wave loads. Structural response analysis of the Wavestar arm is carried out in this study. Due…

5. Modeling the survivability of brucella to exposure of Ultraviolet radiation and temperature

Science.gov (United States)

Howe, R.

Accumulated summation of daily Ultraviolet-B (UV-B, 290 to 320 nm) data from the USDA Ultraviolet Radiation Monitoring Program shows good correlation (R^2 = 77%) with daily temperature data during the five-month period from February through June, 1998. Exposure of disease organisms, such as brucella, to the effects of accumulated UV-B radiation can therefore be modeled over this period. Estimates of a lethal UV-B dosage for brucella in the environment depend on minimum/maximum temperature and solar zenith angle for the time period. The accumulated increase in temperature over this period also affects the decomposition of an aborted fetus containing brucella. Decomposition begins at a minimum daily temperature of 27 to 30 degrees C and peaks at 39 to 40 degrees C. It is useful to view the summation of temperature as a threshold for other bacterial growth, so that accumulated temperature greater than some value causes decomposition through competition with other bacteria, and brucella die from the accumulated effects of UV-B, temperature, and organism competition. Results of a study (Cook 1998) to determine survivability of brucellosis in the environment through exposure of aborted bovine fetuses show that no single cause can be attributed to death of the disease agent. The combination of daily increase in temperature and accumulated UV-B radiation shows an inverse correlation with survivability data and can be modeled as an indicator of brucella survivability in the environment in arid regions.
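
The abstract's idea, accumulated UV-B dose acting once daily temperature passes a decomposition threshold, can be caricatured in a few lines. All numbers below (threshold, lethal dose, daily series) are illustrative placeholders, not values from the study:

```python
def days_to_inactivation(daily_uvb, daily_tmax, lethal_dose, temp_threshold=27.0):
    """Return the first day the accumulated UV-B dose reaches the lethal level.

    UV-B only counts toward the dose on days warm enough (>= temp_threshold,
    in degrees C) for decomposition/competition to expose the organism; both
    the threshold and the lethal dose are hypothetical placeholders.
    """
    dose = 0.0
    for day, (uvb, tmax) in enumerate(zip(daily_uvb, daily_tmax), start=1):
        if tmax >= temp_threshold:
            dose += uvb
        if dose >= lethal_dose:
            return day
    return None  # organism still viable over the record

uvb = [1.2, 1.5, 2.0, 2.4, 2.8, 3.0]    # arbitrary daily UV-B dose units
tmax = [25.0, 28.0, 29.0, 26.0, 30.0, 31.0]  # daily maximum temperature, deg C
print(days_to_inactivation(uvb, tmax, lethal_dose=7.0))
```

This is only a threshold-and-accumulate sketch; the study's actual model also involves solar zenith angle and minimum temperature.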

6. Effect of natural hirudin on random pattern skin flap survival in a porcine model.

Science.gov (United States)

Zhao, H; Shi, Q; Sun, Z Y; Yin, G Q; Yang, H L

2012-01-01

The effect of local administration of hirudin on random pattern skin flap survival was investigated in a porcine model. Three random pattern skin flaps (4 × 14 cm) were created on each flank of five Chinese minipigs. The experimental group (10 flaps) received 20 antithrombin units of hirudin, injected subdermally into the distal half immediately after surgery and on days 1 and 2; a control group (10 flaps) was injected with saline and a sham group (10 flaps) was not injected. All flaps were followed for 10 days postoperatively. Macroscopically, the congested/necrotic length in the experimental group was significantly decreased compared with the other two groups by day 3. Histopathological evaluation revealed venous congestion and inflammation in the control and sham groups from day 1, but minimal changes in the experimental group. By day 10, the mean ± SD surviving area was significantly greater in the experimental group (67.6 ± 2.1%) than in the control (45.2 ± 1.4%) or sham (48.3 ± 1.1%) groups. Local administration of hirudin can significantly increase the surviving area in overdimensioned random pattern skin flaps, in a porcine model.

7. Inelastic cross section and survival probabilities at the LHC in minijet models

Science.gov (United States)

Fagundes, Daniel A.; Grau, Agnes; Pancheri, Giulia; Shekhovtsova, Olga; Srivastava, Yogendra N.

2017-09-01

Recent results for the total and inelastic hadronic cross sections from LHC experiments are compared with predictions from a single-channel eikonal minijet model driven by parton density functions and from an empirical model. The role of soft gluon resummation in the infrared region in taming the rise of minijets and their contribution to the increase of the total cross sections at high energies are discussed. Survival probabilities at the LHC, whose theoretical estimates range from circa 10% to a few per mille, are estimated in this model and compared with results from QCD-inspired models and from multichannel eikonal models. We revisit a previous calculation and examine the origin of these discrepancies.

8. Modular degradable dendrimers enable small RNAs to extend survival in an aggressive liver cancer model.

Science.gov (United States)

Zhou, Kejin; Nguyen, Liem H; Miller, Jason B; Yan, Yunfeng; Kos, Petra; Xiong, Hu; Li, Lin; Hao, Jing; Minnig, Jonathan T; Zhu, Hao; Siegwart, Daniel J

2016-01-19

RNA-based cancer therapies are hindered by the lack of delivery vehicles that avoid cancer-induced organ dysfunction, which exacerbates carrier toxicity. We address this issue by reporting modular degradable dendrimers that achieve the required combination of high potency to tumors and low hepatotoxicity to provide a pronounced survival benefit in an aggressive genetic cancer model. More than 1,500 dendrimers were synthesized using sequential, orthogonal reactions where ester degradability was systematically integrated with chemically diversified cores, peripheries, and generations. A lead dendrimer, 5A2-SC8, provided a broad therapeutic window: it was potent at low siRNA doses and tolerated at high dendrimer doses (75 mg/kg repeated dosing). Delivery of let-7g microRNA (miRNA) mimic inhibited tumor growth and dramatically extended survival. Efficacy stemmed from a combination of a small RNA with the dendrimer's own negligible toxicity, therefore illuminating an underappreciated complication in treating cancer with RNA-based drugs.

9. Gene-gene interaction analysis for the survival phenotype based on the Cox model.

Science.gov (United States)

Lee, Seungyeoun; Kwon, Min-Seok; Oh, Jung Mi; Park, Taesung

2012-09-15

For the past few decades, many statistical methods in genome-wide association studies (GWAS) have been developed to identify SNP-SNP interactions in case-control studies. However, there has been less work for prospective cohort studies involving survival time. Recently, Gui et al. (2011) proposed a novel method, called Surv-MDR, for detecting gene-gene interactions associated with survival time. Surv-MDR is an extension of the multifactor dimensionality reduction (MDR) method to the survival phenotype that uses the log-rank test to define a binary attribute. However, the Surv-MDR method has some drawbacks in that it requires more intensive computation and does not allow for covariate adjustment. In this article, we propose a new approach, called Cox-MDR, which is an extension of generalized multifactor dimensionality reduction (GMDR) to the survival phenotype that uses a martingale residual as a score to classify multi-level genotypes as high- and low-risk groups. The advantages of Cox-MDR over Surv-MDR are that it allows for the effects of discrete and quantitative covariates in the framework of the Cox regression model and requires less computation than Surv-MDR. Through simulation studies, we compared the power of Cox-MDR with that of Surv-MDR and the Cox regression model for various combinations of heritability and minor allele frequency, with and without adjusting for a covariate. We found that Cox-MDR and the Cox regression model perform better than Surv-MDR for a low minor allele frequency of 0.2, but Surv-MDR has high power for a minor allele frequency of 0.4. However, when the effect of the covariate is adjusted for, Cox-MDR and the Cox regression model perform much better than Surv-MDR. We also compared the performance of Cox-MDR and Surv-MDR on real data from leukemia patients to detect gene-gene interactions associated with survival time. leesy@sejong.ac.kr; tspark@snu.ac.kr.
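
The martingale residual that Cox-MDR uses to label genotype combinations as high- or low-risk is, under a null model without covariates, the event indicator minus the Nelson-Aalen cumulative hazard at the subject's observed time. A self-contained sketch with toy data (ties handled naively, which is adequate for illustration):

```python
def nelson_aalen(times, events):
    """Nelson-Aalen cumulative hazard estimate at each subject's own time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    cumhaz, H = {}, 0.0
    for i in order:
        if events[i]:
            H += 1.0 / at_risk   # hazard increment at an event time
        cumhaz[i] = H
        at_risk -= 1             # subject leaves the risk set (ties: arbitrary order)
    return [cumhaz[i] for i in range(len(times))]

def martingale_residuals(times, events):
    """delta_i - H(t_i): positive residuals mark earlier-than-expected events."""
    H = nelson_aalen(times, events)
    return [e - h for e, h in zip(events, H)]

times = [2, 4, 4, 7, 9]
events = [1, 1, 0, 1, 0]
res = martingale_residuals(times, events)
# a genotype cell would be labeled high-risk when its residual sum is positive
print([round(r, 3) for r in res])
```

Cox-MDR sums these residuals within each multi-locus genotype cell and compares the sum to zero to assign the high/low-risk label, replacing Surv-MDR's repeated log-rank tests.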

10. Does the interpersonal model apply across eating disorder diagnostic groups? A structural equation modeling approach.

Science.gov (United States)

Ivanova, Iryna V; Tasca, Giorgio A; Proulx, Geneviève; Bissada, Hany

2015-11-01

The interpersonal model has been validated in binge-eating disorder (BED), but it is not yet known whether the model applies across a range of eating disorders (ED). The goal of this study was to investigate the validity of the interpersonal model in anorexia nervosa (restricting type, ANR, and binge-eating/purge type, ANBP), bulimia nervosa (BN), BED, and eating disorder not otherwise specified (EDNOS). Data from a cross-sectional sample of 1459 treatment-seeking women diagnosed with ANR, ANBP, BN, BED and EDNOS were examined for indirect effects of interpersonal problems on ED psychopathology mediated through negative affect. Findings from structural equation modeling demonstrated the mediating role of negative affect in four of the five diagnostic groups. There were significant, medium to large (.239 to .558) indirect effects in the ANR, BN, BED and EDNOS groups but not in the ANBP group. The results of the first reverse model, with interpersonal problems as a mediator between negative affect and ED psychopathology, were nonsignificant, suggesting the specificity of the hypothesized paths. In the second reverse model, however, ED psychopathology was related to interpersonal problems indirectly through negative affect. This is the first study to find support for the interpersonal model of ED in a clinical sample of women with diverse ED diagnoses, though there may be a reciprocal relationship between ED psychopathology and relationship problems through negative affect. Negative affect partially explains the relationship between interpersonal problems and ED psychopathology in women diagnosed with ANR, BN, BED and EDNOS. Interpersonal psychotherapies for ED may address the underlying interpersonal-affective difficulties, thereby reducing ED psychopathology. Copyright © 2015 Elsevier Inc. All rights reserved.

11. The effects of aspirated thickened water on survival and pulmonary injury in a rabbit model.

Science.gov (United States)

Nativ-Zeltzer, Nogah; Kuhn, Maggie A; Imai, Denise M; Traslavina, Ryan P; Domer, Amanda S; Litts, Juliana K; Adams, Brett; Belafsky, Peter C

2018-02-01

Liquid thickeners are one of the most frequently utilized treatment strategies for persons with oropharyngeal swallowing dysfunction. The effect of commercially available thickeners on lung injury is uncertain. The purpose of this study was to compare the effects of aspiration of water alone, xanthan gum (XG)-thickened water, and cornstarch (CS)-thickened water on survival and lung morphology in a rabbit model. Animal model; prospective small animal clinical trial. Adult New Zealand White rabbits (n = 24) were divided into three groups of eight rabbits. The groups underwent 3 consecutive days of 1.5 mL/kg intratracheal instillation of water (n = 8), XG-thickened water (n = 8), and CS-thickened water (n = 8). The animals were euthanized on day 4, and survival and pulmonary histopathology were compared between groups. In all, 12.5% of rabbits (n = 8) instilled with CS-thickened water survived until the endpoint of the study (day 4). All animals instilled with water (n = 8) or XG-thickened water (n = 8) survived. A mild increase in intra-alveolar hemorrhage was observed for the animals instilled with CS-thickened water compared to the other groups. Water thickened with XG resulted in greater pulmonary inflammation, pulmonary interstitial congestion, and alveolar edema than water alone. These findings suggest that aspirations of CS-thickened water are fatal, and that XG-thickened water is more injurious than aspirated water alone. Additional research is necessary to further delineate the dangers of aspirated thickened liquids. Laryngoscope, 128:327-331, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

12. Optimal selection for and mutation testing using a combination of 'easy to apply' probability models

National Research Council Canada - National Science Library

2006-01-01

... optimal ascertainment, many risk assessment models and prior probability models have been developed and evaluated. Four such models, the Claus, Gilpin, Frank, and Evans models, are empirically derived scoring systems that are easy to apply in daily practice with the use of a pencil and paper and easy to understand for both counsellor...

13. Intratumoral delivery of bortezomib: impact on survival in an intracranial glioma tumor model.

Science.gov (United States)

Wang, Weijun; Cho, Hee-Yeon; Rosenstein-Sisson, Rachel; Marín Ramos, Nagore I; Price, Ryan; Hurth, Kyle; Schönthal, Axel H; Hofman, Florence M; Chen, Thomas C

2017-04-14

OBJECTIVE Glioblastoma (GBM) is the most prevalent and the most aggressive of primary brain tumors. There is currently no effective treatment for this tumor. The proteasome inhibitor bortezomib is effective for a variety of tumors, but not for GBM. The authors' goal was to demonstrate that bortezomib can be effective in the orthotopic GBM murine model if the appropriate method of drug delivery is used. In this study the Alzet mini-osmotic pump was used to bring the drug directly to the tumor in the brain, circumventing the blood-brain barrier; thus making bortezomib an effective treatment for GBM. METHODS The 2 human glioma cell lines, U87 and U251, were labeled with luciferase and used in the subcutaneous and intracranial in vivo tumor models. Glioma cells were implanted subcutaneously into the right flank, or intracranially into the frontal cortex of athymic nude mice. Mice bearing intracranial glioma tumors were implanted with an Alzet mini-osmotic pump containing different doses of bortezomib. The Alzet pumps were introduced directly into the tumor bed in the brain. Survival was documented for mice with intracranial tumors. RESULTS Glioma cells were sensitive to bortezomib at nanomolar quantities in vitro. In the subcutaneous in vivo xenograft tumor model, bortezomib given intravenously was effective in reducing tumor progression. However, in the intracranial glioma model, bortezomib given systemically did not affect survival. By sharp contrast, animals treated with bortezomib intracranially at the tumor site exhibited significantly increased survival. CONCLUSIONS Bypassing the blood-brain barrier by using the osmotic pump resulted in an increase in the efficacy of bortezomib for the treatment of intracranial tumors. Thus, the intratumoral administration of bortezomib into the cranial cavity is an effective approach for glioma therapy.

14. Practical considerations when analyzing discrete survival times using the grouped relative risk model.

Science.gov (United States)

Altman, Rachel MacKay; Henrey, Andrew

2017-10-11

The grouped relative risk model (GRRM) is a popular semi-parametric model for analyzing discrete survival time data. The maximum likelihood estimators (MLEs) of the regression coefficients in this model are often asymptotically efficient relative to those based on a more restrictive, parametric model. However, in settings with a small number of sampling units, the usual properties of the MLEs are not assured. In this paper, we discuss computational issues that can arise when fitting a GRRM to small samples, and describe conditions under which the MLEs can be ill-behaved. We find that, overall, estimators based on a penalized score function behave substantially better than the MLEs in this setting and, in particular, can be far more efficient. We also provide methods of assessing the fit of a GRRM to small samples.
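
In the grouped relative risk model, the hazard in interval j for a subject with covariates x is 1 - (1 - h0_j)^exp(x'beta), i.e. a proportional-hazards model observed on a discrete time grid. A sketch of the implied survival curve, with hypothetical baseline interval hazards and a single binary covariate (not data from the paper):

```python
import math

def grouped_rr_survival(base_hazards, beta, x):
    """Survival curve under the grouped relative risk model.

    Interval hazard: h_j(x) = 1 - (1 - h0_j) ** exp(x . beta), so survival
    is the running product of (1 - h0_j) ** exp(x . beta) over intervals.
    """
    rr = math.exp(sum(b * xi for b, xi in zip(beta, x)))
    surv, out = 1.0, []
    for h0 in base_hazards:
        surv *= (1.0 - h0) ** rr
        out.append(surv)
    return out

# hypothetical baseline interval hazards and one binary covariate
base = [0.1, 0.15, 0.2]
curve = grouped_rr_survival(base, beta=[0.7], x=[1])
print([round(s, 3) for s in curve])
```

The paper's point is that with few sampling units the maximum likelihood estimate of beta in this model can be ill-behaved, and penalized score estimators do substantially better.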

15. Cell survival in carbon beams - comparison of amorphous track model predictions

DEFF Research Database (Denmark)

Grzanka, L.; Greilich, S.; Korcyl, M.

Introduction: Predictions of the radiobiological effectiveness (RBE) play an essential role in treatment planning with heavy charged particles. Amorphous track models [1], [2] (also referred to as track structure models) currently provide the most suitable description of cell survival under ion irradiation [2]. In addition, a new approach based on microdosimetric distributions is presented and investigated [3]. Material and methods: A suitable software library embracing the mentioned amorphous track models, including numerous submodels with respect to delta-electron range models and radial dose … References: 1. … neutrons, stopped pions, and heavy ion beams. Nucl Instrum Meth. 1973;111:93-116. 2. Weyrather WK, Kraft G. RBE of carbon ions: experimental data and the strategy of RBE calculation for treatment planning. Radiother Oncol. 2004;73(Suppl 2):161-9. 3. Greilich S, Grzanka L, Bassler N, Andersen CE, Jäkel O …

16. A parameter estimation and identifiability analysis methodology applied to a street canyon air pollution model

DEFF Research Database (Denmark)

Ottosen, Thor Bjørn; Ketzel, Matthias; Skov, Henrik

2016-01-01

Mathematical models are increasingly used in environmental science, which increases the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model, the Operational Street… The approach applied for the uncertainty calculations underestimated the parameter uncertainties. The model parameter uncertainty was qualitatively assessed to be significant, and reduction strategies were identified. © 2016 Elsevier Ltd.

17. Lipid emulsion improves survival in animal models of local anesthetic toxicity: a meta-analysis.

Science.gov (United States)

Fettiplace, Michael R; McCabe, Daniel J

2017-08-01

The Lipid Emulsion Therapy workgroup, organized by the American Academy of Clinical Toxicology, recently conducted a systematic review, which subjectively evaluated lipid emulsion as a treatment for local anesthetic toxicity. We re-extracted data and conducted a meta-analysis of survival in animal models. We extracted survival data from 26 publications and conducted a random-effect meta-analysis based on odds ratio weighted by inverse variance. We assessed the benefit of lipid emulsion as an independent variable in resuscitative models (16 studies). We measured Cochran's Q for heterogeneity and I2 to determine variance contributed by heterogeneity. Finally, we conducted a funnel plot analysis and Egger's test to assess for publication bias in studies. Lipid emulsion reduced the odds of death in resuscitative models (OR =0.24; 95%CI: 0.1-0.56, p = .0012). Heterogeneity analysis indicated a homogenous distribution. Funnel plot analysis did not indicate publication bias in experimental models. Meta-analysis of animal data supports the use of lipid emulsion (in combination with other resuscitative measures) for the treatment of local anesthetic toxicity, specifically from bupivacaine. Our conclusion differed from the original review. Analysis of outliers reinforced the need for good life support measures (securement of airway and chest compressions) along with prompt treatment with lipid.
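
A random-effects meta-analysis of odds ratios weighted by inverse variance, as described, is commonly done with the DerSimonian-Laird estimator. The sketch below uses invented per-study log odds ratios and variances, not the 16 resuscitative studies analyzed in the paper:

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooled odds ratio via the DerSimonian-Laird estimator.

    Fixed-effect weights 1/v give Cochran's Q; the heterogeneity variance
    tau^2 then shifts every weight to 1/(v + tau^2) before re-pooling.
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))  # Cochran's Q
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    return math.exp(pooled), tau2

# hypothetical per-study odds ratios (on the log scale) and variances
log_ors = [math.log(0.10), math.log(0.60), math.log(0.15), math.log(0.50)]
variances = [0.30, 0.25, 0.40, 0.20]
or_pooled, tau2 = dersimonian_laird(log_ors, variances)
print(round(or_pooled, 2), round(tau2, 2))
```

A pooled OR well below 1, as in the paper's OR of 0.24, corresponds to reduced odds of death under lipid emulsion treatment.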

18. Integrative genomic testing of cancer survival using semiparametric linear transformation models.

Science.gov (United States)

Huang, Yen-Tsung; Cai, Tianxi; Kim, Eunhee

2016-07-20

The wide availability of multi-dimensional genomic data has spurred increasing interests in integrating multi-platform genomic data. Integrative analysis of cancer genome landscape can potentially lead to deeper understanding of the biological process of cancer. We integrate epigenetics (DNA methylation and microRNA expression) and gene expression data in tumor genome to delineate the association between different aspects of the biological processes and brain tumor survival. To model the association, we employ a flexible semiparametric linear transformation model that incorporates both the main effects of these genomic measures as well as the possible interactions among them. We develop variance component tests to examine different coordinated effects by testing various subsets of model coefficients for the genomic markers. A Monte Carlo perturbation procedure is constructed to approximate the null distribution of the proposed test statistics. We further propose omnibus testing procedures to synthesize information from fitting various parsimonious sub-models to improve power. Simulation results suggest that our proposed testing procedures maintain proper size under the null and outperform standard score tests. We further illustrate the utility of our procedure in two genomic analyses for survival of glioblastoma multiforme patients. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.

19. Influence analysis for skew-normal semiparametric joint models of multivariate longitudinal and multivariate survival data.

Science.gov (United States)

Tang, An-Min; Tang, Nian-Sheng; Zhu, Hongtu

2017-04-30

The normality assumption of measurement error is a widely used distribution in joint models of longitudinal and survival data, but it may lead to unreasonable or even misleading results when longitudinal data reveal skewness feature. This paper proposes a new joint model for multivariate longitudinal and multivariate survival data by incorporating a nonparametric function into the trajectory function and hazard function and assuming that measurement errors in longitudinal measurement models follow a skew-normal distribution. A Monte Carlo Expectation-Maximization (EM) algorithm together with the penalized-splines technique and the Metropolis-Hastings algorithm within the Gibbs sampler is developed to estimate parameters and nonparametric functions in the considered joint models. Case deletion diagnostic measures are proposed to identify the potential influential observations, and an extended local influence method is presented to assess local influence of minor perturbations. Simulation studies and a real example from a clinical trial are presented to illustrate the proposed methodologies. Copyright © 2017 John Wiley & Sons, Ltd. Copyright © 2017 John Wiley & Sons, Ltd.

20. Resveratrol improves survival, hemodynamics and energetics in a rat model of hypertension leading to heart failure.

Science.gov (United States)

Rimbaud, Stéphanie; Ruiz, Matthieu; Piquereau, Jérôme; Mateo, Philippe; Fortin, Dominique; Veksler, Vladimir; Garnier, Anne; Ventura-Clapier, Renée

2011-01-01

Heart failure (HF) is characterized by contractile dysfunction associated with altered energy metabolism. This study was aimed at determining whether resveratrol, a polyphenol known to activate energy metabolism, could be beneficial as a metabolic therapy of HF. Survival, ventricular and vascular function as well as cardiac and skeletal muscle energy metabolism were assessed in a hypertensive model of HF, the Dahl salt-sensitive rat fed with a high-salt diet (HS-NT). Resveratrol (18 mg/kg/day; HS-RSV) was given for 8 weeks after hypertension and cardiac hypertrophy were established (which occurred 3 weeks after salt addition). Resveratrol treatment improved survival (64% in HS-RSV versus 15% in HS-NT, p<0.001) and prevented the 25% reduction in body weight in HS-NT (P<0.001). Moreover, RSV counteracted the development of cardiac dysfunction (fractional shortening -34% in HS-NT) as evaluated by echocardiography, which occurred without regression of hypertension or hypertrophy. Moreover, aortic endothelial dysfunction present in HS-NT was prevented in resveratrol-treated rats. Resveratrol treatment tended to preserve mitochondrial mass and biogenesis and completely protected mitochondrial fatty acid oxidation and PPARα (peroxisome proliferator-activated receptor α) expression. We conclude that resveratrol treatment exerts beneficial protective effects on survival, endothelium-dependent smooth muscle relaxation and cardiac contractile and mitochondrial function, suggesting that resveratrol or metabolic activators could be a relevant therapy in hypertension-induced HF.

1. Modeling of thermal stresses and probability of survival of tubular SOFC

Energy Technology Data Exchange (ETDEWEB)

Nakajo, Arata [Laboratory for Industrial Energy Systems (LENI), Faculty of Engineering, Swiss Federal Institute of Technology, 1015 Lausanne (Switzerland); Stiller, Christoph; Bolland, Olav [Department of Energy and Process Engineering, Norwegian University of Science and Technology, Trondheim N-7491 (Norway); Haerkegaard, Gunnar [Department of Engineering Design and Materials, Norwegian University of Science and Technology, Trondheim N-7491 (Norway)

2006-07-14

The temperature profile generated by a thermo-electro-chemical model was used to calculate the thermal stress distribution in a tubular solid oxide fuel cell (SOFC). The solid heat balances were calculated separately for each layer of the MEA (membrane electrode assembly) in order to detect the radial thermal gradients more precisely. It appeared that the electrolyte undergoes high tensile stresses at the ends of the cell in limited areas and that the anode is subjected to moderate tensile stresses. A simplified version of the widely used Weibull analysis was used to calculate the global probability of survival for the assessment of the risks related to both operating points and load changes. The state of the cell at room temperature was also considered and proved to be critical. As a general trend, the computed probabilities of survival were too low for the typical requirements of a commercial product. A sensitivity analysis showed a strong influence of the thermal expansion mismatch between the layers of the MEA on the probability of survival. The lack of knowledge of mechanical material properties, as well as uncertainties about the phenomena occurring in the cell, proved to be a limiting factor for the simulation of thermal stresses. (author)
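
A simplified Weibull analysis of a ceramic layer amounts to weakest-link statistics: each volume element under tensile stress contributes a volume-weighted power of its stress to a failure integral, and the global probability of survival is the exponential of its negative. A generic sketch, with illustrative element stresses and Weibull parameters rather than the SOFC values from the study:

```python
import math

def weibull_survival(stresses, volumes, sigma0, m, v0=1.0):
    """Global probability of survival of a ceramic component.

    Weakest-link Weibull statistics: each volume element with tensile
    stress sigma_i contributes (V_i/V0) * (sigma_i/sigma0)^m to the
    failure integral; compressive (negative) elements are ignored.
    """
    risk = sum((v / v0) * (s / sigma0) ** m
               for s, v in zip(stresses, volumes) if s > 0)
    return math.exp(-risk)

# illustrative element stresses (MPa) and volumes (mm^3), not SOFC data
stresses = [40.0, 85.0, 120.0, -30.0]   # negative = compressive, ignored
volumes = [2.0, 1.0, 0.5, 2.0]
print(round(weibull_survival(stresses, volumes, sigma0=150.0, m=8.0), 2))
```

With a high Weibull modulus m, the small highly stressed region dominates the failure risk, which is why localized tensile stresses at the cell ends matter so much.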

2. Resveratrol improves survival, hemodynamics and energetics in a rat model of hypertension leading to heart failure.

Directory of Open Access Journals (Sweden)

Stéphanie Rimbaud

Full Text Available Heart failure (HF is characterized by contractile dysfunction associated with altered energy metabolism. This study was aimed at determining whether resveratrol, a polyphenol known to activate energy metabolism, could be beneficial as a metabolic therapy of HF. Survival, ventricular and vascular function as well as cardiac and skeletal muscle energy metabolism were assessed in a hypertensive model of HF, the Dahl salt-sensitive rat fed with a high-salt diet (HS-NT. Resveratrol (18 mg/kg/day; HS-RSV was given for 8 weeks after hypertension and cardiac hypertrophy were established (which occurred 3 weeks after salt addition. Resveratrol treatment improved survival (64% in HS-RSV versus 15% in HS-NT, p<0.001, and prevented the 25% reduction in body weight in HS-NT (P<0.001. Moreover, RSV counteracted the development of cardiac dysfunction (fractional shortening -34% in HS-NT as evaluated by echocardiography, which occurred without regression of hypertension or hypertrophy. Moreover, aortic endothelial dysfunction present in HS-NT was prevented in resveratrol-treated rats. Resveratrol treatment tended to preserve mitochondrial mass and biogenesis and completely protected mitochondrial fatty acid oxidation and PPARα (peroxisome proliferator-activated receptor α expression. We conclude that resveratrol treatment exerts beneficial protective effects on survival, endothelium-dependent smooth muscle relaxation and cardiac contractile and mitochondrial function, suggesting that resveratrol or metabolic activators could be a relevant therapy in hypertension-induced HF.

3. Resveratrol Improves Survival, Hemodynamics and Energetics in a Rat Model of Hypertension Leading to Heart Failure

Science.gov (United States)

Rimbaud, Stéphanie; Ruiz, Matthieu; Piquereau, Jérôme; Mateo, Philippe; Fortin, Dominique; Veksler, Vladimir; Garnier, Anne; Ventura-Clapier, Renée

2011-01-01

Heart failure (HF) is characterized by contractile dysfunction associated with altered energy metabolism. This study was aimed at determining whether resveratrol, a polyphenol known to activate energy metabolism, could be beneficial as a metabolic therapy of HF. Survival, ventricular and vascular function as well as cardiac and skeletal muscle energy metabolism were assessed in a hypertensive model of HF, the Dahl salt-sensitive rat fed with a high-salt diet (HS-NT). Resveratrol (18 mg/kg/day; HS-RSV) was given for 8 weeks after hypertension and cardiac hypertrophy were established (which occurred 3 weeks after salt addition). Resveratrol treatment improved survival (64% in HS-RSV versus 15% in HS-NT, p<0.001), and prevented the 25% reduction in body weight in HS-NT (P<0.001). Moreover, RSV counteracted the development of cardiac dysfunction (fractional shortening −34% in HS-NT) as evaluated by echocardiography, which occurred without regression of hypertension or hypertrophy. Moreover, aortic endothelial dysfunction present in HS-NT was prevented in resveratrol-treated rats. Resveratrol treatment tended to preserve mitochondrial mass and biogenesis and completely protected mitochondrial fatty acid oxidation and PPARα (peroxisome proliferator-activated receptor α) expression. We conclude that resveratrol treatment exerts beneficial protective effects on survival, endothelium–dependent smooth muscle relaxation and cardiac contractile and mitochondrial function, suggesting that resveratrol or metabolic activators could be a relevant therapy in hypertension-induced HF. PMID:22028869

4. Development of a likelihood of survival scoring system for hospitalized equine neonates using generalized boosted regression modeling.

Directory of Open Access Journals (Sweden)

Katarzyna A Dembek

Full Text Available BACKGROUND: Medical management of critically ill equine neonates (foals) can be expensive and labor intensive. Predicting the odds of foal survival using clinical information could facilitate the decision-making process for owners and clinicians. Numerous prognostic indicators and mathematical models to predict outcome in foals have been published; however, a validated scoring method to predict survival in sick foals has not been reported. The goal of this study was to develop and validate a scoring system that can be used by clinicians to predict the likelihood of survival of equine neonates based on clinical data obtained on admission. METHODS AND RESULTS: Data from 339 hospitalized foals of less than four days of age admitted to three equine hospitals were included to develop the model. Thirty-seven variables, including historical information, physical examination, and laboratory findings, were analyzed by generalized boosted regression modeling (GBM) to determine which ones would be included in the survival score. Of these, six variables were retained in the final model. The weight for each variable was calculated using a generalized linear model, and the probability of survival for each total score was determined. The highest (7) and the lowest (0) scores represented 97% and 3% probability of survival, respectively. The accuracy of this survival score was validated in a prospective study on data from 283 hospitalized foals from the same three hospitals. Sensitivity, specificity, and positive and negative predictive values for the survival score in the prospective population were 96%, 71%, 91%, and 85%, respectively. CONCLUSIONS: The survival score developed in our study was validated in a large number of foals with a wide range of diseases and can be easily implemented using data available in most equine hospitals. GBM was a useful tool to develop the survival score. Further evaluations of this scoring system in field conditions are needed.
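The score-to-probability mapping reported above (a total score of 0 corresponding to 3% survival and 7 to 97%) can be sketched with a logistic curve anchored at those two published endpoints. The logistic form and the intermediate values are illustrative assumptions, not the study's actual calibration:

```python
import math

def survival_probability(score, p_lo=0.03, p_hi=0.97, s_max=7):
    """Map an integer risk score to a survival probability via a logistic
    curve anchored at the reported endpoints (score 0 -> p_lo, s_max -> p_hi).
    The logistic shape is an illustrative assumption, not the published model."""
    logit = lambda p: math.log(p / (1 - p))
    intercept = logit(p_lo)
    slope = (logit(p_hi) - logit(p_lo)) / s_max
    z = intercept + slope * score
    return 1 / (1 + math.exp(-z))

for s in range(8):
    print(s, round(survival_probability(s), 3))
```

By construction the mapping reproduces the two published anchor probabilities exactly and interpolates monotonically between them.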

5. A comparison of economic evaluation models as applied to geothermal energy technology

Science.gov (United States)

Ziman, G. M.; Rosenberg, L. S.

1983-01-01

Several cost estimation and financial cash flow models have been applied to a series of geothermal case studies. In order to draw conclusions about the relative performance and applicability of these models to geothermal projects, the consistency of results was assessed. The model outputs of principal interest in this study were net present value, internal rate of return, or levelized breakeven price. The models used were VENVAL, a venture analysis model; the Geothermal Probabilistic Cost Model (GPC Model); the Alternative Power Systems Economic Analysis Model (APSEAM); the Geothermal Loan Guarantee Cash Flow Model (GCFM); and the GEOCOST and GEOCITY geothermal models. The case studies to which the models were applied include a geothermal reservoir at Heber, CA; a geothermal electric power plant to be located at the Heber site; an alcohol fuels production facility to be built at Raft River, ID; and a direct-use, district heating system in Susanville, CA.

6. Multivariable model development and internal validation for prostate cancer specific survival and overall survival after whole-gland salvage Iodine-125 prostate brachytherapy.

Science.gov (United States)

Peters, Max; van der Voort van Zyp, Jochem R N; Moerland, Marinus A; Hoekstra, Carel J; van de Pol, Sandrine; Westendorp, Hendrik; Maenhout, Metha; Kattevilder, Rob; Verkooijen, Helena M; van Rossum, Peter S N; Ahmed, Hashim U; Shah, Taimur T; Emberton, Mark; van Vulpen, Marco

2016-04-01

Whole-gland salvage Iodine-125 brachytherapy is a potentially curative treatment strategy for localised prostate cancer (PCa) recurrences after radiotherapy. Prognostic factors influencing PCa-specific and overall survival (PCaSS & OS) are not known. The objective of this study was to develop a multivariable, internally validated prognostic model for survival after whole-gland salvage I-125 brachytherapy. Whole-gland salvage I-125 brachytherapy patients treated in the Netherlands from 1993-2010 were included. Eligible patients had a transrectal ultrasound-guided biopsy-confirmed localised recurrence after biochemical failure (clinical judgement, ASTRO or Phoenix definition). Recurrences were assessed clinically and with CT and/or MRI. Metastases were excluded using CT/MRI and technetium-99m scintigraphy. Multivariable Cox regression was used to assess the predictive value of clinical characteristics in relation to PCa-specific and overall mortality. PCa-specific mortality was defined as patients dying with distant metastases present. Missing data were handled using multiple imputation (20 imputed sets). Internal validation was performed and the C-statistic calculated. Calibration plots were created to visually assess the goodness-of-fit of the final model. Optimism-corrected survival proportions were calculated. All analyses were performed according to the TRIPOD statement. Median total follow-up was 78 months (range 5-139). A total of 62 patients were treated, of whom 28 (45%) died from PCa after a mean (±SD) of 82 (±36) months. Overall, 36 patients (58%) died after a mean of 84 (±40) months. PSA doubling time (PSADT) remained a predictive factor for both types of mortality (PCa-specific and overall): corrected hazard ratios (HRs) 0.92 (95% CI: 0.86-0.98, p=0.02) and 0.94 (95% CI: 0.90-0.99, p=0.01), respectively (C-statistics 0.71 and 0.69, respectively). Calibration was accurate up to 96 months of follow-up. Over 80% of patients can survive 8 years if PSADT>24
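The C-statistics quoted above (0.71 and 0.69) are concordance indices. A minimal NumPy sketch of Harrell's C for right-censored data, simplified to ignore tied event times:

```python
import numpy as np

def concordance_index(time, event, risk):
    """Harrell's C: among comparable pairs (where the earlier time is an
    observed event), the fraction in which the higher predicted risk failed
    earlier. Ties in predicted risk count as 1/2. A minimal sketch that
    ignores tied event times; not an optimized implementation."""
    time, event, risk = map(np.asarray, (time, event, risk))
    num = den = 0.0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                       # censored subjects anchor no pair
        for j in range(n):
            if time[j] > time[i]:          # subject j outlived event i
                den += 1
                if risk[i] > risk[j]:
                    num += 1
                elif risk[i] == risk[j]:
                    num += 0.5
    return num / den

# toy data: predicted risk perfectly ordered with event times -> C = 1.0
t = [2, 4, 6, 8]; e = [1, 1, 0, 1]; r = [4.0, 3.0, 2.0, 1.0]
print(concordance_index(t, e, r))
```

A value of 0.5 corresponds to random prediction and 1.0 to perfect concordance, so the reported 0.69-0.71 indicates moderate discrimination.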

7. Survival benefits of antiretroviral therapy in Brazil: a model-based analysis

Science.gov (United States)

Luz, Paula M; Girouard, Michael P; Grinsztejn, Beatriz; Freedberg, Kenneth A; Veloso, Valdilea G; Losina, Elena; Struchiner, Claudio J; MacLean, Rachel L; Parker, Robert A; Paltiel, A David; Walensky, Rochelle P

2016-01-01

Objective In Brazil, universal provision of antiretroviral therapy (ART) has been guaranteed free of charge to eligible HIV-positive patients since December 1996. We sought to quantify the survival benefits of ART attributable to this programme. Methods We used a previously published microsimulation model of HIV disease and treatment (CEPAC-International) and data from Brazil to estimate life expectancy increase for HIV-positive patients initiating ART in Brazil. We divided the period of 1997 to 2014 into six eras reflecting increased drug regimen efficacy, regimen availability and era-specific mean CD4 count at ART initiation. Patients were simulated first without ART and then with ART. The 2014-censored and lifetime survival benefits attributable to ART in each era were calculated as the product of the number of patients initiating ART in a given era and the increase in life expectancy attributable to ART in that era. Results In total, we estimated that 598,741 individuals initiated ART. Projected life expectancy increased from 2.7, 3.3, 4.1, 4.9, 5.5 and 7.1 years without ART to 11.0, 17.5, 20.7, 23.0, 25.3, and 27.0 years with ART in Eras 1 through 6, respectively. Of the total projected lifetime survival benefit of 9.3 million life-years, 16% (or 1.5 million life-years) has been realized as of December 2014. Conclusions Provision of ART through a national programme has led to dramatic survival benefits in Brazil, the majority of which are still to be realized. Improvements in initial and subsequent ART regimens and higher CD4 counts at ART initiation have contributed to these increasing benefits. PMID:27029828
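The era-level benefit described above is the product of the number of ART initiators and the gain in life expectancy. A sketch using the published per-era life expectancies; the per-era initiator counts are hypothetical placeholders that merely sum to the reported total of 598,741:

```python
# Published era-specific life expectancies (years) for patients initiating ART;
# the per-era initiator counts below are hypothetical placeholders.
le_without = [2.7, 3.3, 4.1, 4.9, 5.5, 7.1]
le_with = [11.0, 17.5, 20.7, 23.0, 25.3, 27.0]
initiators = [50_000, 80_000, 100_000, 110_000, 120_000, 138_741]  # hypothetical

# Lifetime benefit per era = initiators x increase in life expectancy
benefit = sum(n * (w - wo) for n, w, wo in zip(initiators, le_with, le_without))
print(f"total initiators: {sum(initiators):,}")
print(f"projected lifetime benefit: {benefit / 1e6:.1f} million life-years")
```

With these placeholder counts the sketch lands in the same order of magnitude as the 9.3 million life-years reported; the true figure depends on the actual per-era initiator distribution.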

8. Human Engineered Heart Muscles Engraft and Survive Long-Term in a Rodent Myocardial Infarction Model

Science.gov (United States)

Riegler, Johannes; Tiburcy, Malte; Ebert, Antje; Tzatzalos, Evangeline; Raaz, Uwe; Abilez, Oscar J.; Shen, Qi; Kooreman, Nigel G.; Neofytou, Evgenios; Chen, Vincent C.; Wang, Mouer; Meyer, Tim; Tsao, Philip S.; Connolly, Andrew J.; Couture, Larry A.; Gold, Joseph D.; Zimmermann, Wolfram H.; Wu, Joseph C.

2015-01-01

Rationale: Tissue engineering approaches may improve survival and functional benefits from human embryonic stem cell-derived cardiomyocyte (ESC-CM) transplantation, thereby potentially preventing dilative remodelling and progression to heart failure. Objective: Assessment of transport stability, long-term survival, structural organisation, functional benefits, and teratoma risk of engineered heart muscle (EHM) in a chronic myocardial infarction (MI) model. Methods and Results: We constructed EHMs from ESC-CMs and released them for transatlantic shipping following predefined quality control criteria. Two days of shipment did not lead to adverse effects on cell viability or contractile performance of EHMs (n=3, P=0.83, P=0.87). After ischemia/reperfusion (I/R) injury, EHMs were implanted onto immunocompromised rat hearts at 1 month to simulate chronic ischemia. Bioluminescence imaging (BLI) showed stable engraftment with no significant cell loss between week 2 and 12 (n=6, P=0.67), preserving up to 25% of the transplanted cells. Despite high engraftment rates and attenuated disease progression (change in ejection fraction for EHMs −6.7±1.4% vs control −10.9±1.5%, n>12, P=0.05), we observed no difference between EHMs containing viable or non-viable human cardiomyocytes in this chronic xenotransplantation model (n>12, P=0.41). Grafted cardiomyocytes showed enhanced sarcomere alignment and increased connexin 43 expression at 220 days after transplantation. No teratomas or tumors were found in any of the animals (n=14) used for long-term monitoring. Conclusions: EHM transplantation led to high engraftment rates, long-term survival, and progressive maturation of human cardiomyocytes. However, cell engraftment was not correlated with functional improvements in this chronic MI model. Most importantly, the safety of this approach was demonstrated by the lack of tumor or teratoma formation. PMID:26291556

9. A comparative study of two food model systems to test the survival of Campylobacter jejuni at -18 degrees C

DEFF Research Database (Denmark)

Birk, Tina; Rosenquist, Hanne; Brondsted, L.

2006-01-01

The survival of Campylobacter jejuni NCTC 11168 was tested at freezing conditions (-18 degrees C) over a period of 32 days in two food models that simulated either (i) the chicken skin surface (skin model) or (ii) the chicken juice in and around a broiler carcass (liquid model). In the skin model...

10. Making Faces - State-Space Models Applied to Multi-Modal Signal Processing

DEFF Research Database (Denmark)

Lehn-Schiøler, Tue

2005-01-01

The two main focus areas of this thesis are State-Space Models and multi modal signal processing. The general State-Space Model is investigated and an addition to the class of sequential sampling methods is proposed. This new algorithm is denoted as the Parzen Particle Filter. Furthermore...... optimizer can be applied to speed up convergence. The linear version of the State-Space Model, the Kalman Filter, is applied to multi modal signal processing. It is demonstrated how a State-Space Model can be used to map from speech to lip movements. Besides the State-Space Model and the multi modal...

11. Immediate survival focus: synthesizing life history theory and dual process models to explain substance use.

Science.gov (United States)

Richardson, George B; Hardesty, Patrick

2012-01-01

Researchers have recently applied evolutionary life history theory to the understanding of behaviors often conceived of as prosocial or antisocial. In addition, researchers have applied cognitive science to the understanding of substance use and used dual process models, where explicit cognitive processes are modeled as relatively distinct from implicit cognitive processes, to explain and predict substance use behaviors. In this paper we synthesized these two theoretical perspectives to produce an adaptive and cognitive framework for explaining substance use. We contend that this framework provides new insights into the nature of substance use that may be valuable for both clinicians and researchers.

12. Dynamical properties of the Penna aging model applied to the population of wolves

Science.gov (United States)

Makowiec, Danuta

1997-02-01

The parameters of the Penna bit-string model of aging of biological systems are systematically tested to better understand the model itself, as well as the results arising from applying it to studies of the development of a stationary population of Alaska wolves.
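The Penna model itself is compact enough to sketch: each individual carries a bit-string of inherited deleterious mutations, dies once the mutations expressed up to its current age reach a threshold, reproduces past a minimum age, and is additionally culled by a Verhulst (carrying-capacity) factor. The parameter values below are illustrative, not those of the wolf study:

```python
import random

# Minimal sketch of the Penna bit-string aging model; parameter values are
# illustrative, not those used in the wolf-population study.
GENOME_BITS = 32   # maximum attainable age
T = 3              # deleterious-mutation threshold for death
R = 8              # minimum reproduction age
M = 1              # new deleterious mutations per birth
NMAX = 2000        # carrying capacity (Verhulst factor)

def mutate(genome):
    # set one random "bad" bit (a new inherited deleterious mutation)
    return genome | (1 << random.randrange(GENOME_BITS))

def step(pop):
    """One time step: aging, threshold death, Verhulst death, births."""
    survivors, births = [], []
    for genome, age in pop:
        age += 1
        if age >= GENOME_BITS:
            continue                                         # maximum age reached
        expressed = bin(genome & ((1 << age) - 1)).count("1")  # bad bits active so far
        if expressed >= T:
            continue                                         # dies of accumulated mutations
        if random.random() < len(pop) / NMAX:
            continue                                         # dies of overcrowding
        survivors.append((genome, age))
        if age >= R:
            g = genome
            for _ in range(M):
                g = mutate(g)
            births.append((g, 0))
    return survivors + births

random.seed(1)
pop = [(0, 0)] * 500        # start from 500 mutation-free newborns
for _ in range(200):
    pop = step(pop)
print("population after 200 steps:", len(pop))
```

With these settings the population typically settles into a stationary state well below the carrying capacity, which is the regime the record's parameter study explores.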

13. "Analyze, Acquire, Apply, and Write" as a New Learning Model in Science

Science.gov (United States)

Choe, Jeong V.

2015-01-01

I have developed a new teaching and learning model called AAAW, which stands for Analyze, Acquire, Apply, and Write. This model grew out of action research and a unique experience teaching a biochemistry course to high school students who are talented in math and science. In this model, students first "Analyze" lab data to generate…

14. Evaluation of MM5 model resolution when applied to prediction of national fire danger rating indexes

Science.gov (United States)

Jeanne L. Hoadley; Miriam L. Rorig; Larry Bradshaw; Sue A. Ferguson; Kenneth J. Westrick; Scott L. Goodrick; Paul Werth

2006-01-01

Weather predictions from the MM5 mesoscale model were used to compute gridded predictions of National Fire Danger Rating System (NFDRS) indexes. The model output was applied to a case study of the 2000 fire season in Northern Idaho and Western Montana to simulate an extreme event. To determine the preferred resolution for automating NFDRS predictions, model...

15. A mixed linear model controlling for case underascertainment across multiple cancer registries estimated time trends in survival.

Science.gov (United States)

Dahm, Stefan; Bertz, Joachim; Barnes, Benjamin; Kraywinkel, Klaus

2018-01-10

Large temporal and geographical variation in survival rates estimated from epidemiological cancer registries, coupled with heterogeneity in death certificate only (DCO) notifications, makes it difficult to interpret trends in survival. The aim of our study is to introduce a method for estimating such trends while accounting for heterogeneity in DCO notifications in a cancer site-specific manner. We used the data of 4.0 million cancer cases notified in 14 German epidemiological cancer registries. Annual 5-year relative survival rates from 2002 through 2013 were estimated, and proportions of DCO notifications were recorded. "DCO-excluded" survival rates were regressed on DCO proportions and calendar years using a mixed linear model with cancer registry as a random effect. Based on this model, trends in survival rates were estimated for Germany at 0% DCO. For most cancer sites and age groups, we estimated significant positive trends in survival. Age-standardized survival for all cancers combined increased by 7.1 percentage points for women and 10.8 percentage points for men. The described method could be used to estimate trends in cancer survival based on the data from epidemiological cancer registries with differing DCO proportions and with changing DCO proportions over time. Copyright © 2018 Elsevier Inc. All rights reserved.
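The core of the adjustment above is a regression of DCO-excluded survival on DCO proportion and calendar year, with the trend then read off at 0% DCO. A sketch on synthetic data using plain least squares (the study used a mixed model with registry as a random effect, which is omitted here for brevity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic registry data (illustrative): 5-year relative survival (%) that
# improves over calendar time but is biased upward when DCO cases are excluded.
years = np.tile(np.arange(2002, 2014), 5)            # 12 years x 5 registries
dco = rng.uniform(0.0, 0.2, size=years.size)         # DCO proportion per estimate
survival = 60 + 0.5 * (years - 2002) + 30 * dco + rng.normal(0, 1, years.size)

# OLS: survival ~ intercept + calendar year + DCO proportion
X = np.column_stack([np.ones_like(dco), years - 2002, dco])
beta, *_ = np.linalg.lstsq(X, survival, rcond=None)

# Trend at 0% DCO: evaluate the fit with the DCO term dropped
trend_2002 = beta[0]
trend_2013 = beta[0] + beta[1] * 11
print(f"adjusted survival 2002: {trend_2002:.1f}%, 2013: {trend_2013:.1f}%")
```

Predicting at 0% DCO removes the upward bias that DCO exclusion introduces, leaving the calendar-time trend of interest.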

16. Inhibition of the Mitochondrial Fission Protein Drp1 Improves Survival in a Murine Cardiac Arrest Model

Science.gov (United States)

Sharp, Willard W.; Beiser, David G.; Fang, Yong Hu; Han, Mei; Piao, Lin; Varughese, Justin; Archer, Stephen L.

2015-01-01

Objectives: Survival following sudden cardiac arrest is poor despite advances in cardiopulmonary resuscitation (CPR) and the use of therapeutic hypothermia. Dynamin-related protein 1 (Drp1), a regulator of mitochondrial fission, is an important determinant of reactive oxygen species generation, myocardial necrosis, and left ventricular function following ischemia/reperfusion injury, but its role in cardiac arrest is unknown. We hypothesized that Drp1 inhibition would improve survival, cardiac hemodynamics, and mitochondrial function in an in vivo model of cardiac arrest. Design: Laboratory investigation. Setting: University laboratory. Interventions: Anesthetized and ventilated adult female C57BL/6 wild-type mice underwent an 8-min KCl-induced cardiac arrest followed by 90 seconds of CPR. Mice were then blindly randomized to a single intravenous injection of Mdivi-1 (0.24 mg/kg), a small-molecule Drp1 inhibitor, or vehicle (DMSO). Measurements and Main Results: Following resuscitation from cardiac arrest, mitochondrial fission was evidenced by Drp1 translocation to the mitochondrial membrane and a decrease in mitochondrial size. Mitochondrial fission was associated with increased lactate and evidence of oxidative damage. Mdivi-1 administration during CPR inhibited Drp1 activation, preserved mitochondrial morphology, and decreased oxidative damage. Mdivi-1 also reduced the time to return of spontaneous circulation (ROSC; 116±4 vs. 143±7 sec) after cardiac arrest. Conclusions: Post-cardiac arrest inhibition of Drp1 improves time to ROSC and myocardial hemodynamics, resulting in improved survival and neurological outcomes in a murine model of cardiac arrest. Pharmacological targeting of mitochondrial fission may be a promising therapy for cardiac arrest. PMID:25599491

17. Plasma Resuscitation Improved Survival in a Cecal Ligation and Puncture Rat Model of Sepsis.

Science.gov (United States)

Chang, Ronald; Holcomb, John B; Johansson, Par I; Pati, Shibani; Schreiber, Martin A; Wade, Charles E

2017-06-06

The paradigm shift from crystalloid to plasma resuscitation of traumatic hemorrhagic shock has improved patient outcomes due in part to plasma-mediated reversal of catecholamine- and inflammation-induced endothelial injury, decreasing vascular permeability and attenuating organ injury. Since sepsis induces a similar endothelial injury as seen in hemorrhage, we hypothesized that plasma resuscitation would increase 48-hour survival in a rat sepsis model. Adult male Sprague-Dawley rats (375-425 g) were subjected to 35% cecal ligation and puncture (CLP) (t = 0 h). Twenty-two hours post-CLP and prior to resuscitation (t = 22 h), animals were randomized to resuscitation with normal saline (NS, 10 cc/kg/hr) or pooled rat fresh frozen plasma (FFP, 3.33 cc/kg/hr). Resuscitation under general anesthesia proceeded for the next six hours (t = 22 h to t = 28 h); lactate was checked every 2 hours, and fluid volumes were titrated based on lactate clearance. Blood samples were obtained before (t = 22 h) and after resuscitation (t = 28 h), and at death or study conclusion. Lung specimens were obtained for calculation of wet-to-dry weight ratio. Fisher's exact test was used to analyze the primary outcome of 48-hour survival. ANOVA with repeated measures was used to analyze the effect of FFP versus NS resuscitation on blood gas, electrolytes, blood urea nitrogen (BUN), creatinine, interleukin (IL)-6, IL-10, catecholamines, and syndecan-1 (a marker of endothelial injury). A two-tailed alpha level of 0.05 was considered significant. Compared with NS, FFP resuscitation attenuated markers of inflammation and endothelial injury and decreased the lung wet-to-dry weight ratio (5.28 vs 5.94) (all p < 0.05). Compared to crystalloid, plasma resuscitation increased 48-hour survival in a rat sepsis model, improved pulmonary function and decreased pulmonary edema, and attenuated markers for inflammation, endothelial injury, and catecholamines.

18. Rethinking plant functional types in Earth System Models: pan-tropical analysis of tree survival across environmental gradients

Science.gov (United States)

Johnson, D. J.; Needham, J.; Xu, C.; Davies, S. J.; Bunyavejchewin, S.; Giardina, C. P.; Condit, R.; Cordell, S.; Litton, C. M.; Hubbell, S.; Kassim, A. R. B.; Shawn, L. K. Y.; Nasardin, M. B.; Ong, P.; Ostertag, R.; Sack, L.; Tan, S. K. S.; Yap, S.; McDowell, N. G.; McMahon, S.

2016-12-01

Terrestrial carbon cycling is a function of the growth and survival of trees. Current model representations of tree growth and survival at a global scale rely on coarse plant functional traits that are parameterized very generally. In view of the large biodiversity in the tropical forests, it is important that we account for the functional diversity in order to better predict tropical forest responses to future climate changes. Several next generation Earth System Models are moving towards a size-structured, trait-based approach to modelling vegetation globally, but the challenge of which and how many traits are necessary to capture forest complexity remains. Additionally, the challenge of collecting sufficient trait data to describe the vast species richness of tropical forests is enormous. We propose a more fundamental approach to these problems by characterizing forests by their patterns of survival. We expect our approach to distill real-world tree survival into a reasonable number of functional types. Using 10 large-area tropical forest plots that span geographic, edaphic and climatic gradients, we model tree survival as a function of tree size for hundreds of species. We found surprisingly few categories of size-survival functions emerge. This indicates some fundamental strategies at play across diverse forests to constrain the range of possible size-survival functions. Initial cluster analysis indicates that four to eight functional forms are necessary to describe variation in size-survival relations. Temporal variation in size-survival functions can be related to local environmental variation, allowing us to parameterize how demographically similar groups of species respond to perturbations in the ecosystem. We believe this methodology will yield a synthetic approach to classifying forest systems that will greatly reduce uncertainty and complexity in global vegetation models.
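The clustering step described above can be sketched as follows: summarize each species' size-survival relation by the parameters of a fitted curve (here a logistic intercept and slope, with made-up archetype values), then group the parameter pairs into a few functional forms with a tiny k-means:

```python
import numpy as np

rng = np.random.default_rng(42)

# Three illustrative "functional forms" of the size-survival relation,
# parameterized as logistic curves survival = 1/(1+exp(-(a + b*size))).
# The archetype values are made up for this sketch.
archetypes = np.array([[0.5, 1.5],    # survival rises steeply with size
                       [2.0, 0.2],    # uniformly high survival
                       [-0.5, 0.6]])  # low juvenile survival, modest rise
params = np.vstack([a + rng.normal(0, 0.08, size=(60, 2)) for a in archetypes])

def cluster_forms(x, k, iters=50):
    """Tiny k-means over per-species (intercept, slope) pairs.
    Farthest-point initialization keeps the sketch deterministic."""
    centers = [x[0]]
    for _ in range(k - 1):
        d = ((x[:, None] - np.array(centers)[None]) ** 2).sum(-1).min(1)
        centers.append(x[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = ((x[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([x[labels == j].mean(0) for j in range(k)])
    return labels, centers

labels, centers = cluster_forms(params, k=3)
print("species per recovered functional type:", np.bincount(labels))
```

In the real analysis the number of clusters is itself a question (the record reports four to eight forms); here k is fixed at the known number of archetypes purely for illustration.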

19. Applying Retrospective Demographic Models to Assess Sustainable Use: the Maya Management of Xa'an Palms

Directory of Open Access Journals (Sweden)

Andrea Martínez-Ballesté

2005-12-01

Full Text Available Xa'an palm (Sabal yapa has been used to thatch traditional Maya houses for over 3000 years. In the Yucatan Peninsula, this palm has been introduced to pasturelands, maize fields (milpas, and homegardens. These and other traditional management systems are usually believed to be sustainable, but there is as yet little evidence to support this hypothesis. Demographic models have been used for this purpose, mainly focusing on population growth rate (λ. So far, retrospective analysis has not been applied, even though it examines how changes in the life cycle of a species, caused by different management regimes, affect its λ. In this study, we assess whether ecologically sustainable use of xa'an occurs in homegardens, pasturelands, and milpas, and if so, how it is achieved. We constructed matrix population models for four populations of xa'an that were followed for 3 years, and then conducted a retrospective analysis on them. Management in homegardens seems to be oriented to increasing the availability of xa'an leaves, favoring the survival of seedlings, and increasing the density of harvestable-sized palms. However, in the milpa and the pastureland, the population size structure resembles that of unmanaged populations. Our λ values suggest that the traditional use of xa'an in all the studied management regimes is sustainable. Nevertheless, the processes that lead to sustainable use are different in each system, as shown by our retrospective analysis. Although fecundity contributes positively to λ only in homegardens, permanence and growth maintain palm populations at an equilibrium in the pastureland and in the milpa, respectively. Between-year climatic differences had a smaller impact on λ than management practices, which may vary from one year to another, leading to different balances in the sustainable use of the populations involved. Even though no significant differences were found in λ values, Maya achieve sustainable use of xa

20. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

Directory of Open Access Journals (Sweden)

Nils Ternès

2017-05-01

Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming well established, no clear guidance yet exists on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for whom the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
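Once a penalized Cox model yields a linear predictor for each patient, the individual survival probability at a given timepoint follows from the Breslow baseline cumulative hazard, S(t|x) = exp(−Λ₀(t)·exp(xβ)). A NumPy sketch assuming the coefficients (and hence the training linear predictors) have already been estimated:

```python
import numpy as np

def breslow_survival(time, event, lp, t_star, lp_new):
    """Expected survival probability at t_star for a new linear predictor,
    via the Breslow baseline cumulative hazard. `lp` holds the training
    linear predictors x'beta from an already-fitted Cox model."""
    time, event, lp = map(np.asarray, (time, event, lp))
    order = np.argsort(time)
    time, event, lp = time[order], event[order], lp[order]
    risk = np.exp(lp)
    cumhaz0 = 0.0
    for i in np.flatnonzero(event):
        if time[i] > t_star:
            break
        cumhaz0 += 1.0 / risk[time >= time[i]].sum()   # Breslow increment
    return np.exp(-cumhaz0 * np.exp(lp_new))

# toy "fitted" model: higher linear predictor means higher hazard
t = np.array([1., 2., 3., 4., 5.])
e = np.array([1, 1, 0, 1, 1])
lp = np.array([0.8, 0.4, 0.0, -0.4, -0.8])
print(round(float(breslow_survival(t, e, lp, t_star=3.0, lp_new=0.0)), 3))
```

This is the estimation step only; the confidence intervals discussed in the record would come from bootstrapping the whole procedure.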

1. NanOx, a new model to predict cell survival in the context of particle therapy

Science.gov (United States)

Cunha, M.; Monini, C.; Testa, E.; Beuve, M.

2017-02-01

Particle therapy is increasingly attractive for the treatment of tumors and the number of facilities offering it is rising worldwide. Due to the well-known enhanced effectiveness of ions, it is of utmost importance to plan treatments with great care to ensure tumor killing and healthy tissue sparing. Hence, the accurate quantification of the relative biological effectiveness (RBE) of ions, used in the calculation of the biological dose, is critical. Nevertheless, the RBE is a complex function of many parameters and its determination requires modeling. The approaches currently used have allowed particle therapy to thrive, but still show some shortcomings. We present herein a short description of a new theoretical framework, NanOx, to calculate cell survival in the context of particle therapy. It gathers principles from existing approaches, while addressing some of their weaknesses. NanOx is a multiscale model that takes the stochastic nature of radiation at nanometric and micrometric scales fully into account, integrating also the chemical aspects of radiation-matter interaction. The latter are included in the model by means of a chemical specific energy, determined from the production of reactive chemical species induced by irradiation. Such a production represents the accumulation of oxidative stress and sublethal damage in the cell, potentially generating non-local lethal events in NanOx. The complementary local lethal events occur in a very localized region and can, alone, lead to cell death. Both these classes of events contribute to cell death. The comparison between experimental data and model predictions for the V79 cell line shows good agreement. In particular, the dependence of the typical shoulders of cell survival curves on linear energy transfer is well described, as is the effectiveness of different ions, including the overkill effect. These results required the adjustment of a number of parameters compatible with the application of the model in

2. Intracranial AAV-sTRAIL combined with lanatoside C prolongs survival in an orthotopic xenograft mouse model of invasive glioblastoma.

Science.gov (United States)

Crommentuijn, Matheus H W; Maguire, Casey A; Niers, Johanna M; Vandertop, W Peter; Badr, Christian E; Würdinger, Thomas; Tannous, Bakhos A

2016-04-01

3. Environmental enrichment extends photoreceptor survival and visual function in a mouse model of retinitis pigmentosa.

Directory of Open Access Journals (Sweden)

Ilaria Barone

Full Text Available Slow, progressive rod degeneration followed by cone death leading to blindness is the pathological signature of all forms of human retinitis pigmentosa (RP). Therapeutic schemes based on intraocular delivery of neuroprotective agents prolong the lifetime of photoreceptors and have reached the stage of clinical trial. The success of these approaches depends upon optimization of chronic supply and appropriate combination of factors. Environmental enrichment (EE), a novel neuroprotective strategy based on enhanced motor, sensory and social stimulation, has already been shown to exert beneficial effects in animal models of various disorders of the CNS, including Alzheimer's and Huntington's diseases. Here we report the results of prolonged exposure of rd10 mice, a mutant strain undergoing progressive photoreceptor degeneration mimicking human RP, to such an enriched environment from birth. By means of microscopy of retinal tissue, electrophysiological recordings, visual behaviour assessment and molecular analysis, we show that EE considerably preserves retinal morphology and physiology as well as visual perception over time in rd10 mutant mice. We find that the protective effects of EE are accompanied by increased expression of retinal mRNAs for CNTF and mTOR, both factors known to be instrumental to photoreceptor survival. Compared to other rescue approaches used in similar animal models, EE is highly effective, minimally invasive and results in long-lasting retinal protection. These results open novel perspectives of research pointing to environmental strategies as useful tools to extend photoreceptor survival.

4. Survival prediction based on compound covariate under Cox proportional hazard models.

Directory of Open Access Journals (Sweden)

Takeshi Emura

Full Text Available Survival prediction from a large number of covariates is a current focus of statistical and medical research. In this paper, we study a methodology known as the compound covariate prediction performed under univariate Cox proportional hazard models. We demonstrate via simulations and real data analysis that the compound covariate method generally competes well with ridge regression and Lasso methods, both already well-studied methods for predicting survival outcomes with a large number of covariates. Furthermore, we develop a refinement of the compound covariate method by incorporating likelihood information from multivariate Cox models. The new proposal is an adaptive method that borrows information contained in both the univariate and multivariate Cox regression estimators. We show that the new proposal has a theoretical justification from a statistical large sample theory and is naturally interpreted as a shrinkage-type estimator, a popular class of estimators in statistical literature. Two datasets, the primary biliary cirrhosis of the liver data and the non-small-cell lung cancer data, are used for illustration. The proposed method is implemented in R package "compound.Cox" available in CRAN at http://cran.r-project.org/.
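The compound covariate idea can be sketched directly: combine the (separately estimated) univariate Cox coefficients into a single score cᵢ = Σⱼ β̂ⱼxᵢⱼ and use that score for risk stratification. The coefficients and data below are illustrative, not from the paper; the authors' R package compound.Cox implements the full method:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative setting: 100 patients, 5 biomarkers. The univariate Cox
# coefficients are assumed to have been estimated beforehand (one model per
# biomarker); combining them into one score is the compound covariate step.
X = rng.normal(size=(100, 5))
beta_univariate = np.array([0.9, -0.5, 0.3, 0.0, 0.7])  # assumed estimates

score = X @ beta_univariate            # compound covariate per patient
high_risk = score > np.median(score)   # median split into risk groups

# Simulated exponential survival times whose hazard increases with the score,
# to show the compound score separating the groups as intended
T = rng.exponential(scale=np.exp(-score))
print(f"median survival: high-risk {np.median(T[high_risk]):.2f}, "
      f"low-risk {np.median(T[~high_risk]):.2f}")
```

Reducing many biomarkers to one score sidesteps fitting an unstable multivariate model; the paper's refinement then shrinks this score toward the multivariate Cox estimator.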

5. The Antarctic nematode Plectus murrayi: an emerging model to study multiple stress survival.

Science.gov (United States)

Adhikari, Bishwo N; Tomasel, Cecilia M; Li, Grace; Wall, Diana H; Adams, Byron J

2010-11-01

The genus Plectus is one of the most widely distributed and common nematode taxa of freshwater and terrestrial habitats in the world, and is of particular interest because of its phylogenetic position relative to the origin of the Secernentean radiation. Plectus murrayi, a bacteria-feeding nematode, inhabits both semi-aquatic and terrestrial biotopes in the Antarctic McMurdo Dry Valleys (MCM), where its distribution is limited by organic carbon and soil moisture. Plectus nematodes from the MCM can survive extreme desiccation, freezing conditions, and other types of stress. Ongoing investigations of the physiological and molecular aspects of the stress biology of P. murrayi, along with the availability of genomic resources, will likely establish this nematode as an excellent invertebrate model system for studies of extreme environmental survival, and may provide a valuable source of genomic resources for comparative studies in other organisms. Moreover, because P. murrayi and Caenorhabditis elegans share a most recent common ancestor with the rest of the Secernentea, and given the ability of P. murrayi to be cultured at lower temperatures compared to C. elegans, P. murrayi could also be an emerging model system for the study of the evolution of environment-sensitive (stress response) alleles in nematodes.

6. Applying Statistical Models and Parametric Distance Measures for Music Similarity Search

Science.gov (United States)

Lukashevich, Hanna; Dittmar, Christian; Bastuck, Christoph

Automatically deriving similarity relations between music pieces is an inherent field of music information retrieval research. Due to the nearly unrestricted amount of musical data, real-world similarity search algorithms have to be highly efficient and scalable. A possible solution is to represent each music excerpt with a statistical model (e.g., a Gaussian mixture model) and thus to reduce the computational costs by applying parametric distance measures between the models. In this paper we discuss combinations of different parametric modelling techniques and distance measures, and weigh the benefits of each one against the others.
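
As a minimal illustration of a parametric distance between statistical models (the paper evaluates several model/distance combinations; this is the simplest univariate case, not the authors' setup), the KL divergence between two Gaussians has a closed form and is commonly symmetrized for similarity search:

```python
import math

def kl_gauss(m0, s0, m1, s1):
    """KL( N(m0, s0^2) || N(m1, s1^2) ) in closed form."""
    return math.log(s1 / s0) + (s0 ** 2 + (m0 - m1) ** 2) / (2 * s1 ** 2) - 0.5

def symmetric_kl(m0, s0, m1, s1):
    """Symmetrized KL divergence: a common parametric distance between
    feature models of two music excerpts (KL itself is not symmetric)."""
    return kl_gauss(m0, s0, m1, s1) + kl_gauss(m1, s1, m0, s0)
```

For Gaussian mixture models no closed form exists, which is why approximations or sampling are typically used; the univariate case above shows the principle.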

7. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

DEFF Research Database (Denmark)

Nielsen, Jan; Parner, Erik

2010-01-01

In this paper, we model multivariate time-to-event data by a composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied to two types of family studies using the gamma...

8. Fasudil improves survival and promotes skeletal muscle development in a mouse model of spinal muscular atrophy

Directory of Open Access Journals (Sweden)

Bowerman Melissa

2012-03-01

Full Text Available Abstract Background: Spinal muscular atrophy (SMA) is the leading genetic cause of infant death. It is caused by mutations/deletions of the survival motor neuron 1 (SMN1) gene and is typified by the loss of spinal cord motor neurons, muscular atrophy and, in severe cases, death. The SMN protein is ubiquitously expressed, and various cellular- and tissue-specific functions have been investigated to explain the specific motor neuron loss in SMA. We have previously shown that the RhoA/Rho kinase (ROCK) pathway is misregulated in cellular and animal SMA models, and that inhibition of ROCK with the chemical Y-27632 significantly increased the lifespan of a mouse model of SMA. In the present study, we evaluated the therapeutic potential of the clinically approved ROCK inhibitor fasudil. Methods: Fasudil was administered by oral gavage from post-natal day 3 to 21 at a concentration of 30 mg/kg twice daily. The effects of fasudil on lifespan and on the pathological hallmarks of SMA were assessed and compared to vehicle-treated mice. Kaplan-Meier survival curves were compared using the log-rank test; Student's t test for paired variables and one-way analysis of variance (ANOVA) were used to test for differences between samples. Results: Fasudil significantly improves survival of SMA mice. This dramatic phenotypic improvement is not mediated by an up-regulation of Smn protein or via preservation of motor neurons. However, fasudil administration results in a significant increase in muscle fiber and postsynaptic endplate size, and restores normal expression of markers of skeletal muscle development, suggesting that the beneficial effects of fasudil could be muscle-specific. Conclusions: Our work underscores the importance of muscle as a therapeutic target in SMA and highlights the beneficial potential of ROCK inhibitors as a therapeutic strategy for SMA.
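
The log-rank comparison used for Kaplan-Meier survival analyses like the one above can be sketched as follows. This is a generic two-group implementation of the textbook statistic, not the authors' code:

```python
import numpy as np

def logrank_stat(time, event, group):
    """Two-group log-rank chi-square statistic: compares observed vs.
    expected events in group 1 across all distinct event times."""
    time, event, group = map(np.asarray, (time, event, group))
    O = E = V = 0.0
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n = at_risk.sum()                                # subjects still at risk
        n1 = (at_risk & (group == 1)).sum()              # ... of them, in group 1
        d = ((time == t) & (event == 1)).sum()           # deaths at time t
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        O += d1
        E += d * n1 / n                                  # expected group-1 deaths
        if n > 1:                                        # hypergeometric variance
            V += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return (O - E) ** 2 / V
```

The statistic is compared against a chi-square distribution with one degree of freedom (critical value 3.84 at the 5% level).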

9. Hydroxocobalamin and epinephrine both improve survival in a swine model of cyanide-induced cardiac arrest.

Science.gov (United States)

Bebarta, Vikhyat S; Pitotti, Rebecca L; Dixon, Patricia S; Valtier, Sandra; Esquivel, Luis; Bush, Anneke; Little, Charles M

2012-10-01

To determine whether hydroxocobalamin will improve survival compared with epinephrine and saline solution controls in a model of cyanide-induced cardiac arrest. Forty-five swine (38 to 42 kg) were tracheally intubated and anesthetized, and central venous and arterial continuous cardiovascular monitoring catheters were inserted. Potassium cyanide was infused until cardiac arrest developed, defined as mean arterial pressure less than 30 mm Hg. Animals were treated with standardized mechanical chest compressions and were randomly assigned to receive one of 3 intravenous bolus therapies: hydroxocobalamin, epinephrine, or saline solution (control). All animals were monitored for 60 minutes after cardiac arrest. Additional epinephrine infusions were used in all arms of the study after return of spontaneous circulation for systolic blood pressure less than 90 mm Hg. A sample size of 15 animals per group was determined according to a power of 80%, a survival difference of 0.5, and an α of 0.05. Repeated-measures ANOVA was used to determine statistically significant changes between groups over time. Baseline weight, time to arrest, and cyanide dose at cardiac arrest were similar in the 3 groups. Coronary perfusion pressures with chest compressions were greater than 15 mm Hg in both treatment groups, indicating sufficient compression depth. Zero of 15 (95% confidence interval [CI] 0% to 25%) animals in the control group, 11 of 15 (73%; 95% CI 48% to 90%) in the hydroxocobalamin group, and 11 of 15 (73%; 95% CI 48% to 90%) in the epinephrine group survived to the conclusion of the study. Cyanide levels in the hydroxocobalamin group were also lower than those of the epinephrine group from cardiac arrest through the conclusion of the study. Intravenous hydroxocobalamin and epinephrine both independently improved survival compared with saline solution control in our swine model of cyanide-induced cardiac arrest. Hydroxocobalamin improved mean arterial pressure and pH, decreased
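
The stated power calculation (80% power, survival difference of 0.5, α = 0.05) is the kind of thing a standard two-proportion formula produces. A generic sketch follows; the exact baseline proportions and any continuity correction the authors used are not given in the abstract, so the example inputs are illustrative and will not necessarily reproduce the reported 15 per group:

```python
from math import ceil
from statistics import NormalDist

def two_proportion_n(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size for detecting a difference between two
    proportions with a two-sided z-test (simple normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_b = NormalDist().inv_cdf(power)           # power quantile
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_a + z_b) ** 2 * variance / (p1 - p2) ** 2)
```

For example, `two_proportion_n(0.2, 0.7)` (a 0.5 survival difference with hypothetical baselines) gives a per-group size in the low teens, the same order as the study's 15 animals per group.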

10. Modeling fecundity in the presence of a sterile fraction using a semi-parametric transformation model for grouped survival data.

Science.gov (United States)

McLain, Alexander C; Sundaram, Rajeshwari; Buck Louis, Germaine M

2016-02-01

The analysis of fecundity data is challenging and requires consideration of both highly timed and interrelated biologic processes in the context of essential behaviors such as sexual intercourse during the fertile window. Understanding human fecundity is further complicated by the presence of a sterile population, i.e., couples unable to achieve pregnancy. Modeling techniques to date have largely relied upon discrete time-to-pregnancy survival or day-specific probability models to estimate the determinants of time-to-pregnancy or acute effects, respectively. We developed a class of semi-parametric grouped transformation cure models that capture day-level variates purported to affect the cycle-level hazards of conception and, possibly, sterility. Our model's performance is assessed using simulation and longitudinal data from one of the few prospective cohort studies with preconception enrollment of women followed for 12 menstrual cycles at risk for pregnancy. © The Author(s) 2012.
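
The sterile-fraction ("cure") idea can be illustrated with a far simpler discrete-time model than the semi-parametric one the authors develop: a geometric per-cycle conception probability p plus a sterile fraction pi. This toy likelihood is ours, for illustration only:

```python
import numpy as np

def cure_model_loglik(pi, p, cycles, conceived):
    """Log-likelihood of a geometric time-to-pregnancy model with a
    sterile fraction pi; survivor function S(t) = pi + (1-pi)*(1-p)**t."""
    ll = 0.0
    for t, d in zip(cycles, conceived):
        if d:   # conception observed in cycle t (only fertile couples conceive)
            ll += np.log((1 - pi) * p * (1 - p) ** (t - 1))
        else:   # censored after t cycles: sterile, or fertile but not yet pregnant
            ll += np.log(pi + (1 - pi) * (1 - p) ** t)
    return ll
```

Maximizing this over (pi, p) separates "will never conceive" from "has not conceived yet", which is the essential identifiability problem cure models address.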

11. A prognostic model of therapy-related myelodysplastic syndrome for predicting survival and transformation to acute myeloid leukemia.

Science.gov (United States)

Quintás-Cardama, Alfonso; Daver, Naval; Kim, Hawk; Dinardo, Courtney; Jabbour, Elias; Kadia, Tapan; Borthakur, Gautam; Pierce, Sherry; Shan, Jianqin; Cardenas-Turanzas, Marylou; Cortes, Jorge; Ravandi, Farhad; Wierda, William; Estrov, Zeev; Faderl, Stefan; Wei, Yue; Kantarjian, Hagop; Garcia-Manero, Guillermo

2014-10-01

We evaluated the characteristics of a cohort of patients with myelodysplastic syndrome (MDS) related to therapy (t-MDS) to create a prognostic model. We identified 281 patients with MDS who had received previous chemotherapy and/or radiotherapy for previous malignancy. Potential prognostic factors were determined using univariate and multivariate analyses. Multivariate Cox regression analysis identified 7 factors that independently predicted short survival in t-MDS: age ≥ 65 years (hazard ratio [HR], 1.63), Eastern Cooperative Oncology Group performance status 2-4 (HR, 1.86), poor cytogenetics (-7 and/or complex; HR, 2.47), World Health Organization MDS subtype (RARS or RAEB-1/2; HR, 1.92), hemoglobin (HR, 2.24), platelets (HR, 2.01), and transfusion dependency (HR, 1.59). These risk factors were used to create a prognostic model that segregated patients into 3 groups with distinct median overall survival: good (0-2 risk factors; 34 months), intermediate (3-4 risk factors; 12 months), and poor (5-7 risk factors; 5 months) (P < .001) and 1-year leukemia-free survival (96%, 84%, and 72%, respectively, P = .003). This model also identified distinct survival groups according to t-MDS therapy. In summary, we devised a prognostic model specifically for patients with t-MDS that predicted overall survival and leukemia-free survival. This model might facilitate the development of risk-adapted therapeutic strategies. Copyright © 2014 Elsevier Inc. All rights reserved.
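
The published grouping rule is simple enough to encode directly. The factor list and cut-offs below follow the abstract; the helper name is ours:

```python
# Adverse factors in the t-MDS model (from the abstract): age >= 65 years,
# ECOG performance status 2-4, poor cytogenetics (-7 and/or complex),
# WHO subtype RARS or RAEB-1/2, low hemoglobin, low platelets,
# and transfusion dependency.

def tmds_risk_group(n_adverse_factors):
    """Map a count of adverse factors (0-7) to the published risk group."""
    if not 0 <= n_adverse_factors <= 7:
        raise ValueError("expected between 0 and 7 adverse factors")
    if n_adverse_factors <= 2:
        return "good"          # median overall survival 34 months
    if n_adverse_factors <= 4:
        return "intermediate"  # median overall survival 12 months
    return "poor"              # median overall survival 5 months
```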

12. Review and evaluation of performance measures for survival prediction models in external validation settings.

Science.gov (United States)

Rahman, M Shafiqur; Ambler, Gareth; Choodari-Oskooei, Babak; Omar, Rumana Z

2017-04-18

When developing a prediction model for survival data it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell's concordance measure, which tended to increase as censoring increased. We recommend that Uno's concordance measure is used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller's measure could be considered, especially if censoring is very high, but we suggest that the prediction model is re-calibrated first. We also recommend that Royston's D is routinely reported to assess discrimination since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings, and we recommend reporting it routinely. Our recommendation would be to use any of the predictive accuracy measures and provide the corresponding predictive accuracy curves. In addition, we recommend investigating the characteristics
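
For concreteness, Harrell's concordance measure discussed above can be computed with a direct O(n²) pair scan. This is the textbook estimator, not the modified validation-data versions the paper evaluates:

```python
def harrell_c(time, event, risk):
    """Harrell's concordance index: among usable pairs (i failed, and j was
    still under observation at i's failure time), the fraction where the
    higher predicted risk failed first; tied risks count 0.5."""
    concordant = usable = 0.0
    n = len(time)
    for i in range(n):
        if not event[i]:
            continue                    # censored subjects cannot anchor a pair
        for j in range(n):
            if time[j] > time[i]:       # j outlived i, so the pair is orderable
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    concordant += 0.5
    return concordant / usable
```

The dependence on censoring is visible in the usable-pair condition: heavier censoring removes pairs non-randomly, which is why Uno's inverse-probability-weighted version is preferred under moderate censoring.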

13. Automated estimation of the truncation of room impulse response by applying a nonlinear decay model.

Science.gov (United States)

Janković, Marko; Ćirić, Dejan G; Pantić, Aleksandar

2016-03-01

Noise represents one of the most significant disturbances in measured room impulse responses (RIRs), and it has a potentially large impact on evaluation of the decay parameters. In order to reduce noise effects, various methods have been applied, including truncation of an RIR. In this paper, a procedure for the response truncation based on a model of RIR (nonlinear decay model) is presented. The model is represented by an exponential decay plus stationary noise. Unknown parameters of the model are calculated by an optimization that minimizes the difference between the curve generated by the model and the target one of the response to be truncated. Different curves can be applied in the optimization-absolute value of the RIR, logarithmic decay curve, and Schroeder curve obtained by the backward integration of the RIR. The proposed procedure is tested on various synthesized and measured impulse responses. It is compared with the procedure taken from the literature, often applied in practice.
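
The nonlinear decay model (exponential decay plus a stationary noise floor) and the optimization step can be sketched as follows. Since the model is linear in the amplitude and noise terms, one simplification (ours, not necessarily the authors' optimizer) is to scan the decay time and solve for the other two parameters in closed form:

```python
import numpy as np

def fit_decay_plus_noise(t, y):
    """Fit y(t) ~ A*exp(-t/tau) + N by scanning candidate decay times tau
    and solving the remaining linear least-squares problem for (A, N)."""
    best = None
    for tau in np.geomspace(1e-3, t[-1], 200):
        basis = np.column_stack([np.exp(-t / tau), np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
        sse = np.sum((basis @ coef - y) ** 2)
        if best is None or sse < best[0]:
            best = (sse, tau, coef[0], coef[1])
    _, tau, A, N = best
    return A, tau, N

def truncation_point(A, tau, N):
    """Time where the modelled decay meets the noise floor: A*exp(-t/tau) = N."""
    return tau * np.log(A / N)
```

Here `y` stands for a smoothed energy envelope of the RIR; the crossing point of the fitted decay and the noise floor gives the truncation time.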

14. Applying MDA to SDR for Space to Model Real-time Issues

Science.gov (United States)

Blaser, Tammy M.

2007-01-01

NASA space communications systems have the challenge of designing SDRs with highly-constrained Size, Weight and Power (SWaP) resources. A study is being conducted to assess the effectiveness of applying the MDA Platform-Independent Model (PIM) and one or more Platform-Specific Models (PSM) specifically to address NASA space domain real-time issues. This paper will summarize our experiences with applying MDA to SDR for Space to model real-time issues. Real-time issues to be examined, measured, and analyzed are: meeting waveform timing requirements and efficiently applying Real-time Operating System (RTOS) scheduling algorithms, applying safety control measures, and SWaP verification. Real-time waveform algorithms benchmarked with the worst case environment conditions under the heaviest workload will drive the SDR for Space real-time PSM design.

15. SURVIVAL OF MICROORGANISMS FROM MODERN PROBIOTICS IN MODEL CONDITIONS OF THE INTESTINE

Directory of Open Access Journals (Sweden)

Kabluchko TV

2017-03-01

Full Text Available Introduction: The state of the intestinal microflora affects the work of the whole organism. When the composition of the normal intestinal microflora changes, its restoration is required. Nowadays a wide variety of probiotic drugs are available on the market which can be used to solve this problem. Most bacteria with probiotic properties belong to the genera Lactobacillus and Bifidobacterium, which have poor resistance to the acidic content of the stomach and the toxic effects of bile salts. Various studies have clearly shown that in a person with normal acid and bile secretion, lactobacilli and bifidobacteria are not detected after passage through the duodenum, i.e., they perish before reaching the small intestine. In this study we compared the survival of the different microorganisms contained in 9 probiotic drugs in a model of gastric and intestinal environments. Material and methods: In the laboratory of the SI "Mechnikov Institute of Microbiology and Immunology, National Ukrainian Academy of Medical Sciences", in vitro experiments were carried out to test the ability of the different probiotic bacteria contained in 9 probiotic drugs to survive the impact of a model environment of the stomach and duodenum. Bacillus coagulans persistence was evaluated under the impact of the simulated environment of the stomach and duodenum, assessed by the quantity of CFU after incubation on culture medium. The following were studied: Lactobacillus acidophilus, Lactobacillus rhamnosus, Lactobacillus reuteri, Lactobacillus casei, Lactobacillus plantarum, Lactobacillus bulgaricus, Bifidobacterium bifidum, Bifidobacterium longum, Bifidobacterium breve, Bifidobacterium infantis, Bifidobacterium animalis subsp. lactis BB-12, Saccharomyces boulardii, Bacillus coagulans, Bacillus clausii, Enterococcus faecium. Microorganisms were incubated for 3 hours in a model environment of the stomach (pepsin 3 g / l, hydrochloric acid of 160 mmol / l, pH 2

16. A comparative study of generalized linear mixed modelling and artificial neural network approach for the joint modelling of survival and incidence of Dengue patients in Sri Lanka

Science.gov (United States)

Hapugoda, J. C.; Sooriyarachchi, M. R.

2017-09-01

Survival time of patients with a disease and the incidence of that particular disease (count) are frequently observed in medical studies with data of a clustered nature. In many cases, though, the survival times and the count can be correlated in such a way that diseases that occur rarely have shorter survival times, or vice versa. Due to this, joint modelling of these two variables will provide more interesting, and certainly improved, results than modelling them separately. The authors have previously proposed a methodology using Generalized Linear Mixed Models (GLMM), joining the Discrete Time Hazard model with the Poisson Regression model, to jointly model survival and count data. As the Artificial Neural Network (ANN) has become one of the most powerful computational tools for modelling complex non-linear systems, it was proposed to develop a new joint model of the survival and count of Dengue patients in Sri Lanka using that approach. Thus, the objective of this study is to develop a model using the ANN approach and compare the results with the previously developed GLMM model. As the response variables are continuous in nature, the Generalized Regression Neural Network (GRNN) approach was adopted to model the data. To compare model fit, measures such as root mean square error (RMSE), absolute mean error (AME) and the correlation coefficient (R) were used. The measures indicate that the GRNN model fits the data better than the GLMM model.
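
A Generalized Regression Neural Network is, at its core, kernel-weighted averaging (equivalent to Nadaraya-Watson regression). A minimal version, purely illustrative and not the authors' Dengue model:

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """GRNN (Specht): each prediction is a Gaussian-kernel weighted
    average of the training targets; sigma is the smoothing parameter."""
    # squared distances between every query point and every training point
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))          # pattern-layer activations
    return (w @ y_train) / w.sum(axis=1)          # summation / division layers
```

There is no iterative training; the only free parameter is `sigma`, typically chosen by cross-validation against RMSE or a similar criterion, as in the comparison above.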

17. Bayesian Analysis for Dynamic Generalized Linear Latent Model with Application to Tree Survival Rate

Directory of Open Access Journals (Sweden)

Yu-sheng Cheng

2014-01-01

Full Text Available Logistic regression is the most popular regression technique available for modeling categorical data, especially dichotomous variables. The classic logistic regression model is typically used to interpret the relationship between response variables and explanatory variables. However, in real applications, most data sets are collected in follow-up, which leads to temporal correlation among the data. In order to characterize the correlations between the different variables, a new method based on latent variables is introduced in this study. At the same time, latent variables following an AR(1) model are used to depict time dependence. In the framework of Bayesian analysis, parameter estimates and statistical inferences are carried out via a Gibbs sampler with the Metropolis-Hastings (MH) algorithm. Model comparison based on the Bayes factor, and forecasting/smoothing of the survival rate of the tree, are established. A simulation study is conducted to assess the performance of the proposed method, and a pika data set is analyzed to illustrate the real application. Since Bayes factor approaches vary significantly, efficiency tests have been performed in order to decide which solution provides a better tool for the analysis of real relational data sets.
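
The Metropolis-Hastings step used inside such a Gibbs sampler can be sketched generically. The random-walk sampler below is standard; the survival-rate posterior it targets is a toy stand-in (a Binomial likelihood on the logit scale), not the paper's dynamic latent model:

```python
import numpy as np

def metropolis_hastings(logpost, init, n_steps, step=0.2, seed=0):
    """Random-walk Metropolis-Hastings: propose theta' ~ N(theta, step^2)
    and accept with probability min(1, post(theta') / post(theta))."""
    rng = np.random.default_rng(seed)
    theta, lp = init, logpost(init)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal()
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject in log space
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.array(samples)

def logit_survival_logpost(theta, survived=35, n=50):
    """Toy posterior: Binomial likelihood for a survival rate expit(theta)
    with a diffuse N(0, 10^2) prior on the logit (hypothetical data)."""
    p = 1.0 / (1.0 + np.exp(-theta))
    return survived * np.log(p) + (n - survived) * np.log(1 - p) - theta ** 2 / 200.0
```

Within a Gibbs sampler, a step like this updates each block of parameters whose full conditional cannot be sampled directly.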

18. ANP AFFECTS CARDIAC REMODELING, FUNCTION, HEART FAILURE AND SURVIVAL IN A MOUSE MODEL OF DILATED CARDIOMYOPATHY

Science.gov (United States)

Wang, Dong; Gladysheva, Inna P.; Fan, Tai-Hwang M.; Sullivan, Ryan; Houng, Aiilyan K.; Reed, Guy L.

2014-01-01

Dilated cardiomyopathy is a frequent cause of heart failure and death. Atrial natriuretic peptide (ANP) is a biomarker of dilated cardiomyopathy, but there is controversy over whether ANP modulates the development of heart failure. We therefore examined whether ANP affects heart failure, cardiac remodeling, function and survival in a well-characterized transgenic model of dilated cardiomyopathy. Mice with dilated cardiomyopathy and normal ANP levels survived longer than mice with partial ANP deficiency. ANP protected against the development of heart failure, as indicated by reduced lung water, alveolar congestion and pleural effusions. ANP improved systolic function and reduced cardiomegaly. Pathologic cardiac remodeling was diminished in mice with normal ANP, as indicated by decreased ventricular interstitial and perivascular fibrosis. Mice with dilated cardiomyopathy and normal ANP levels had better systolic function than mice with ANP deficiency. Dilated cardiomyopathy was associated with diminished cardiac transcripts for natriuretic peptide receptors A and B in mice with both normal ANP and ANP deficiency, but transcripts for natriuretic peptide receptor C and CNP were selectively altered in mice with dilated cardiomyopathy and ANP deficiency. Taken together, these data indicate that ANP has potent effects in experimental dilated cardiomyopathy that reduce the development of heart failure, prevent pathologic remodeling, preserve systolic function and reduce mortality. Despite the apparent overlap in physiologic function between the natriuretic peptides, these data suggest that the role of ANP in dilated cardiomyopathy and heart failure is not compensated physiologically by other natriuretic peptides. PMID:24379183

19. Cisplatin Resistant Spheroids Model Clinically Relevant Survival Mechanisms in Ovarian Tumors.

Directory of Open Access Journals (Sweden)

Full Text Available The majority of ovarian tumors eventually recur in a drug-resistant form. Using cisplatin-sensitive and -resistant cell lines assembled into 3D spheroids, we profiled gene expression and identified candidate mechanisms and biological pathways associated with cisplatin resistance. OVCAR-8 human ovarian carcinoma cells were exposed to sub-lethal concentrations of cisplatin to create a matched cisplatin-resistant cell line, OVCAR-8R. Genome-wide gene expression profiling of sensitive and resistant ovarian cancer spheroids identified 3,331 significantly differentially expressed probesets coding for 3,139 distinct protein-coding genes (FC > 2, FDR < 0.05; S2 Table). Despite significant expression changes in some transporters including MDR1, cisplatin resistance was not associated with differences in intracellular cisplatin concentration. Cisplatin-resistant cells were significantly enriched for a mesenchymal gene expression signature. OVCAR-8R resistance-derived gene sets were significantly more biased to patients with shorter survival. From the most differentially expressed genes, we derived a 17-gene expression signature that identifies ovarian cancer patients with shorter overall survival in three independent datasets. We propose that the use of cisplatin-resistant cell lines in 3D spheroid models is a viable approach to gain insight into resistance mechanisms relevant to ovarian tumors in patients. Our data support the emerging concept that ovarian cancers can acquire drug resistance through an epithelial-to-mesenchymal transition.

20. Transcolonic Perirectal NOTES Access (PNA): A feasibility study with survival in swine model.

Science.gov (United States)

Oliveira, André L A; Zorron, Ricardo; Oliveira, Flavio M M DE; Santos, Marcelo B Dos; Scheffer, Jussara P; Rios, Marcelo; Antunes, Fernanda

2017-05-01

Transrectal access still has some unsolved issues, such as spatial orientation, infection, access and site closure. This study presents a simple technique to perform transcolonic access with survival in a swine model series. A new technique for NOTES perirectal access to perform retroperitoneoscopy, peritoneoscopy, and liver and lymph node biopsies was performed in 6 pigs, using a totally NOTES technique. The specimens were extracted transanally. The flexible endoscope was inserted through a posterior transmural incision and the retrorectal space. Bacterial cultures were documented for the retroperitoneal space and intra-abdominal cavity after 14 days. The rectal site was closed using non-absorbable sutures. There was no bowel cleansing, nor preoperative fasting. The procedures were performed in 6 pigs through transcolonic natural orifice access using available flexible endoscopic instruments. All animals survived 14 days without complications, and cultures were negative. Histopathologic examination of the rectal closure site showed adequate healing of the suture line and no microabscesses. The feasibility and safety results of experimental transcolonic NOTES potentially bring new frontiers and, in the future, wider applications for minimally invasive surgery. The treatment of colorectal, abdominal and retroperitoneal diseases through a flexible Perirectal NOTES Access (PNA) is a promising new approach.

1. Neuron-specific antioxidant OXR1 extends survival of a mouse model of amyotrophic lateral sclerosis.

Science.gov (United States)

Liu, Kevin X; Edwards, Benjamin; Lee, Sheena; Finelli, Mattéa J; Davies, Ben; Davies, Kay E; Oliver, Peter L

2015-05-01

Amyotrophic lateral sclerosis is a devastating neurodegenerative disorder characterized by the progressive loss of spinal motor neurons. While the aetiological mechanisms underlying the disease remain poorly understood, oxidative stress is a central component of amyotrophic lateral sclerosis and contributes to motor neuron injury. Recently, oxidation resistance 1 (OXR1) has emerged as a critical regulator of neuronal survival in response to oxidative stress, and is upregulated in the spinal cord of patients with amyotrophic lateral sclerosis. Here, we tested the hypothesis that OXR1 is a key neuroprotective factor during amyotrophic lateral sclerosis pathogenesis by crossing a new transgenic mouse line that overexpresses OXR1 in neurons with the SOD1(G93A) mouse model of amyotrophic lateral sclerosis. Interestingly, we report that overexpression of OXR1 significantly extends survival, improves motor deficits, and delays pathology in the spinal cord and in muscles of SOD1(G93A) mice. Furthermore, we find that overexpression of OXR1 in neurons significantly delays non-cell-autonomous neuroinflammatory response, classic complement system activation, and STAT3 activation through transcriptomic analysis of spinal cords of SOD1(G93A) mice. Taken together, these data identify OXR1 as the first neuron-specific antioxidant modulator of pathogenesis and disease progression in SOD1-mediated amyotrophic lateral sclerosis, and suggest that OXR1 may serve as a novel target for future therapeutic strategies. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain.

2. Survival analysis of gastric cancer patients using Cox model: a five year study

Directory of Open Access Journals (Sweden)

Biglarian A

2009-08-01

Full Text Available Background: Gastric cancer is the second most common cancer and the second leading cause of cancer death worldwide. Adenocarcinoma is the most fatal cancer in Iran, and patients with this kind of cancer have a shorter lifetime than others. In this research, the survival of patients with gastric carcinoma who were registered at Taleghani Hospital was studied. Methods: 291 patients with gastric carcinoma who had received care, chemotherapy or chemoradiotherapy at Taleghani Hospital in Tehran from 2002 to 2007 were studied as a historical cohort. Their survival rates and the relationship with 12 risk factors were assessed. Results: Of the 291 patients with gastric carcinoma, 70.1 percent were men and 29.9 percent were women. The mean age at the time of diagnosis was 62.26 years for men and 59.32 years for women. Most patients (93.91%) had advanced-stage disease and metastasis. The Cox proportional hazards model showed that age at diagnosis, tumor stage and histology type had significant relationships with survival time (p=0.039, p=0.042 and p=0.032, respectively). Conclusion: The five-year survival rate and median lifetime of gastric cancer patients who underwent chemotherapy or chemoradiotherapy are very
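
Five-year survival rates and median lifetimes like those reported in such cohort studies come from the Kaplan-Meier product-limit estimator. A minimal generic implementation (not tied to this dataset):

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit estimator: S(t) = prod over event times t_i <= t of
    (1 - d_i / n_i), with d_i deaths among the n_i subjects still at risk."""
    time, event = map(np.asarray, (time, event))
    s, curve = 1.0, []
    for t in np.unique(time[event == 1]):   # step down only at event times
        n_at_risk = np.sum(time >= t)
        d = np.sum((time == t) & (event == 1))
        s *= 1.0 - d / n_at_risk
        curve.append((float(t), float(s)))
    return curve
```

Censored subjects (event = 0) leave the risk set without forcing a step down, which is how the estimator accounts for incomplete follow-up; the median lifetime is the first time at which the curve drops to 0.5 or below.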

3. To Be or Not to Be an Entrepreneur: Applying a Normative Model to Career Decisions

Science.gov (United States)

Callanan, Gerard A.; Zimmerman, Monica

2016-01-01

Reflecting the need for a better and broader understanding of the factors influencing the choices to enter into or exit an entrepreneurial career, this article applies a structured, normative model of career management to the career decision-making of entrepreneurs. The application of a structured model can assist career counselors, college career…

4. An improved k-ε model applied to a wind turbine wake in atmospheric turbulence

DEFF Research Database (Denmark)

Laan, van der, Paul Maarten; Sørensen, Niels N.; Réthoré, Pierre-Elouan

2015-01-01

An improved k-ε turbulence model is developed and applied to a single wind turbine wake in a neutral atmospheric boundary layer using a Reynolds averaged Navier–Stokes solver. The proposed model includes a flow-dependent Cμ that is sensitive to high velocity gradients, e.g., at the edge of a wind...

5. A Dyadic Approach: Applying a Developmental-Conceptual Model to Couples Coping with Chronic Illness

Science.gov (United States)

Checton, Maria G.; Magsamen-Conrad, Kate; Venetis, Maria K.; Greene, Kathryn

2015-01-01

The purpose of the present study was to apply Berg and Upchurch's developmental-conceptual model toward a better understanding of how couples cope with chronic illness. Specifically, a model was hypothesized in which proximal factors (relational quality), dyadic appraisal (illness interference), and dyadic coping (partner support) influence…

6. How High Is the Tramping Track? Mathematising and Applying in a Calculus Model-Eliciting Activity

Science.gov (United States)

Yoon, Caroline; Dreyfus, Tommy; Thomas, Michael O. J.

2010-01-01

Two complementary processes involved in mathematical modelling are mathematising a realistic situation and applying a mathematical technique to a given realistic situation. We present and analyse work from two undergraduate students and two secondary school teachers who engaged in both processes during a mathematical modelling task that required…

7. Divorce and Child Behavior Problems: Applying Latent Change Score Models to Life Event Data

Science.gov (United States)

Malone, Patrick S.; Lansford, Jennifer E.; Castellino, Domini R.; Berlin, Lisa J.; Dodge, Kenneth A.; Bates, John E.; Pettit, Gregory S.

2004-01-01

Effects of parents' divorce on children's adjustment have been studied extensively. This article applies new advances in trajectory modeling to the problem of disentangling the effects of divorce on children's adjustment from related factors such as the child's age at the time of divorce and the child's gender. Latent change score models were used…

8. A generalised chemical precipitation modelling approach in wastewater treatment applied to calcite

DEFF Research Database (Denmark)

Mbamba, Christian Kazadi; Batstone, Damien J.; Flores Alsina, Xavier

2015-01-01

, the present study aims to identify a broadly applicable precipitation modelling approach. The study uses two experimental platforms applied to calcite precipitating from synthetic aqueous solutions to identify and validate the model approach. Firstly, dynamic pH titration tests are performed to define...

9. Applying a Conceptual Model in Sport Sector Work- Integrated Learning Contexts

Science.gov (United States)

Agnew, Deborah; Pill, Shane; Orrell, Janice

2017-01-01

This paper applies a conceptual model for work-integrated learning (WIL) in a multidisciplinary sports degree program. Two examples of WIL in sport will be used to illustrate how the conceptual WIL model is being operationalized. The implications for practice are that curriculum design must recognize a highly flexible approach to the nature of…

10. Regression modeling strategies with applications to linear models, logistic and ordinal regression, and survival analysis

CERN Document Server

Harrell, Jr, Frank E

2015-01-01

This highly anticipated second edition features new chapters and sections, 225 new references, and comprehensive R software. In keeping with the previous edition, this book is about the art and science of data analysis and predictive modeling, which entails choosing and using multiple tools. Instead of presenting isolated techniques, this text emphasizes problem solving strategies that address the many issues arising when developing multivariable models using real data and not standard textbook examples. It includes imputation methods for dealing with missing data effectively, methods for fitting nonlinear relationships and for making the estimation of transformations a formal part of the modeling process, methods for dealing with "too many variables to analyze and not enough observations," and powerful model validation techniques based on the bootstrap.  The reader will gain a keen understanding of predictive accuracy, and the harm of categorizing continuous predictors or outcomes.  This text realistically...
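As an illustrative sketch (not from the book), the bootstrap validation idea mentioned above, estimating the optimism of an apparent performance measure by refitting the model on resamples, can be written for an assumed one-predictor linear model on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data: outcome linearly related to one predictor plus noise.
n = 100
x = rng.normal(size=n)
y = 1.5 * x + rng.normal(size=n)

def fit(xs, ys):
    return np.polyfit(xs, ys, 1)  # slope, intercept

def r2(coef, xs, ys):
    pred = np.polyval(coef, xs)
    ss_res = np.sum((ys - pred) ** 2)
    return 1.0 - ss_res / np.sum((ys - ys.mean()) ** 2)

apparent = r2(fit(x, y), x, y)

# Bootstrap estimate of optimism: refit on each resample, then compare the
# resample's apparent R^2 with the same fit's performance on the original data.
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    coef = fit(x[idx], y[idx])
    optimism.append(r2(coef, x[idx], y[idx]) - r2(coef, x, y))

corrected = apparent - np.mean(optimism)
print(f"apparent R^2: {apparent:.3f}, optimism-corrected: {corrected:.3f}")
```

With only one predictor the optimism is small; it grows as models become more complex relative to the sample size, which is the situation the book's validation chapters address.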

11. DISCRETE ELEMENT MODELING OF BLADE–STRIKE FREQUENCY AND SURVIVAL OF FISH PASSING THROUGH HYDROKINETIC TURBINES

Energy Technology Data Exchange (ETDEWEB)

Romero Gomez, Pedro DJ; Richmond, Marshall C.

2014-04-17

Evaluating the consequences of blade strike on fish by marine hydrokinetic (MHK) turbine blades is essential for incorporating environmental objectives into the integral optimization of machine performance. For instance, experience with conventional hydroelectric turbines has shown that innovative shaping of the blade and other machine components can lead to improved designs that generate more power without increased impacts to fish and other aquatic life. In this work, we used unsteady computational fluid dynamics (CFD) simulations of turbine flow and discrete element modeling (DEM) of particle motion to estimate the frequency and severity of collisions between a horizontal-axis MHK tidal energy device and drifting aquatic organisms or debris. Two metrics are determined with the method: the strike frequency and the survival rate estimate. To illustrate the procedure step-by-step, an exemplary case of a simple runner model was run and compared against a probabilistic model widely used for strike frequency evaluation. The results for the exemplary case showed a strong correlation between the two approaches. In the application case of the MHK turbine flow, turbulent flow was modeled using detached eddy simulation (DES) in conjunction with a full moving rotor at full scale. The CFD-simulated power and thrust were satisfactorily comparable to experimental results conducted in a water tunnel on a reduced-scale (1:8.7) version of the turbine design. A cloud of DEM particles was injected into the domain to simulate fish or debris that were entrained into the turbine flow. The strike frequency was the ratio of the count of colliding particles to the crossing sample size. The fish length and approach velocity were test conditions in the simulations of the MHK turbine. Comparisons showed that DEM-based frequencies tend to be greater than previous results from Lagrangian particles and probabilistic models, mostly because the DEM scheme accounts for both the geometric
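The strike-frequency metric defined in the record (colliding particles over crossing sample size) is simple to compute; the following sketch uses invented numbers and a purely hypothetical lethality probability, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical DEM output: one entry per particle that crossed the rotor plane,
# True where the particle contacted a blade (an assumed 7% collision rate).
collided = rng.random(5000) < 0.07

# Strike frequency = count of colliding particles / crossing sample size.
strike_frequency = collided.sum() / collided.size

# Survival estimate: assume a hypothetical probability of lethal injury given
# a strike (in practice this would come from a strike-severity relation).
p_lethal_given_strike = 0.4
survival_rate = 1.0 - strike_frequency * p_lethal_given_strike

print(f"strike frequency: {strike_frequency:.3f}")
print(f"estimated survival: {survival_rate:.3f}")
```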

12. Modelling of the process of micromycetes survival in fruit and berry syrups

Directory of Open Access Journals (Sweden)

L. Osipova

2017-06-01

Full Text Available In order to develop methods for preserving fruit and berry syrups that avoid high-temperature sterilization and preservatives, the survival of spores of micromycetes (B. nivea molds) was studied in model media with different concentrations of osmotically active food substances (sucrose, ethyl alcohol, citric acid) that exert a lethal effect on microorganisms at certain concentrations. It has been established that model media (juice-based blueberry syrups) with a mass content of 4 % and 6 % alcohol, 50 % sucrose, and 1 % and 2 % titratable acids have a lethal effect on spores of B. nivea molds. A regression equation was obtained expressing the dependence of the number of B. nivea spores on the concentrations of sucrose, acid, and alcohol and on the storage time of the syrups. The form of the dependence and the direction of the relationship between the variables were established: a negative linear regression, expressed as a uniform decrease of the function. The quality of the resulting regression model was assessed, and the deviations of the calculated data from the original data set were computed. The proposed model is sufficiently reliable, since the regression function is defined, interpreted, and justified, and the accuracy of the regression analysis meets the requirements.
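A negative linear regression of the kind the record describes can be fitted by ordinary least squares; this sketch uses invented coefficients and synthetic data, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic design: columns are sucrose (%), titratable acid (%), alcohol (%),
# and storage time (days); ranges chosen loosely to resemble syrup media.
n = 200
X = np.column_stack([
    rng.uniform(30, 60, n),    # sucrose
    rng.uniform(0.5, 2.5, n),  # acid
    rng.uniform(0, 8, n),      # alcohol
    rng.uniform(0, 30, n),     # storage time
])
true_coef = np.array([-0.02, -0.5, -0.15, -0.03])  # assumed negative effects
intercept = 5.0                                    # assumed log10 spore load
y = intercept + X @ true_coef + rng.normal(0, 0.05, n)

# Ordinary least squares fit of the linear survival model.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

print("fitted intercept:", round(beta[0], 2))
print("fitted coefficients:", np.round(beta[1:], 3))
```

All fitted coefficients come out negative, matching the "uniform decrease" form of dependence the record reports.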

13. Survival data analyses in ecotoxicology: critical effect concentrations, methods and models. What should we use?

Science.gov (United States)

Forfait-Dubuc, Carole; Charles, Sandrine; Billoir, Elise; Delignette-Muller, Marie Laure

2012-05-01

In ecotoxicology, critical effect concentrations are the most common indicators to quantitatively assess risks for species exposed to contaminants. Three types of critical effect concentrations are classically used: lowest/no observed effect concentration (LOEC/NOEC), LCx (x% lethal concentration) and NEC (no effect concentration). In this article, for each of these three types of critical effect concentration, we compared methods or models used for their estimation and proposed one as the most appropriate. We then compared these critical effect concentrations to each other. For that, we used nine survival data sets corresponding to exposure of D. magna to nine different contaminants, for which the time-course of the response was monitored. Our results showed that: (i) LOEC/NOEC values at day 21 were method-dependent, and the Cochran-Armitage test with a step-down procedure appeared to be the most protective for the environment; (ii) all the concentration-response models we compared gave close values of LC50 at day 21; nevertheless, the Weibull model had the lowest global mean deviance; (iii) a simple threshold NEC model, dependent on both concentration and time, more completely described the whole data set (i.e. all time points) and enabled a precise estimation of the NEC. We then compared the three critical effect concentrations and argued that the use of the NEC might be a good option for environmental risk assessment.
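A concentration-response curve of the family compared above can be fitted to obtain an LC50; this sketch uses a two-parameter log-logistic form and synthetic survival fractions, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def loglogistic_survival(conc, lc50, slope):
    """Two-parameter log-logistic concentration-response: survival fraction."""
    return 1.0 / (1.0 + (conc / lc50) ** slope)

# Hypothetical day-21 survival fractions across a concentration gradient.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
surv = np.array([0.98, 0.95, 0.80, 0.45, 0.12, 0.02])

(lc50, slope), _ = curve_fit(loglogistic_survival, conc, surv, p0=(3.0, 1.0))
print(f"estimated LC50: {lc50:.2f}, slope: {slope:.2f}")
```

Swapping in a Weibull link instead of the log-logistic one changes only the model function, which is what makes deviance comparisons across candidate curves straightforward.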

14. Increased flexibility for modeling telemetry and nest-survival data using the multistate framework

Science.gov (United States)

Devineau, Olivier; Kendall, William L.; Doherty, Paul F.; Shenk, Tanya M.; White, Gary C.; Lukacs, Paul M.; Burnham, Kenneth P.

2014-01-01

Although telemetry is one of the most common tools used in the study of wildlife, advances in the analysis of telemetry data have lagged compared to progress in the development of telemetry devices. We demonstrate how standard known-fate telemetry and related nest-survival data analysis models are special cases of the more general multistate framework. We present a short theoretical development, and 2 case examples regarding the American black duck and the mallard. We also present a more complex lynx data analysis. Although not necessary in all situations, the multistate framework provides additional flexibility to analyze telemetry data, which may help analysts and biologists better deal with the vagaries of real-world data collection.
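A minimal sketch (with hypothetical counts) of the known-fate setting the record refers to: when each tagged animal's fate is observed in every interval, interval survival is a binomial proportion and cumulative survival is the product over intervals:

```python
# Known-fate telemetry: every tagged animal's fate is observed each interval.
# Hypothetical numbers of radio-tagged animals at risk and surviving.
at_risk  = [50, 47, 45, 41]
survived = [47, 45, 41, 39]

# Cumulative survival is the product of interval-specific survival rates.
cumulative = 1.0
for n, s in zip(at_risk, survived):
    cumulative *= s / n

print(f"estimated survival over the study: {cumulative:.3f}")
```

The multistate framework generalizes exactly this structure by letting animals move among states (e.g. alive, dead, censored) with estimable transition probabilities.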

15. An experimental test of two mathematical models applied to the size-weight illusion.

Science.gov (United States)

Sarris, V; Heineken, E

1976-05-01

Two quantitative models, which make different predictions for the magnitude of the size-weight illusion, were tested according to the psychophysical methods employed by their respective authors (magnitude estimation versus category ratings). Each model was supported under its corresponding method. This casts doubt on Anderson's claim that the validity of both a model and the scale applied to it is sufficiently tested by the so-called joint testing procedure.

16. Applying circular economy innovation theory in business process modeling and analysis

Science.gov (United States)

Popa, V.; Popa, L.

2017-08-01

The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

17. Modeling longitudinal data and its impact on survival in observational nephrology studies: tools and considerations.

Science.gov (United States)

Streja, Elani; Goldstein, Leanne; Soohoo, Melissa; Obi, Yoshitsugu; Kalantar-Zadeh, Kamyar; Rhee, Connie M

2017-04-01

Nephrologists and kidney disease researchers are often interested in monitoring how patients' clinical and laboratory measures change over time, what factors may impact these changes, and how these changes may lead to differences in morbidity, mortality, and other outcomes. When longitudinal data with repeated measures over time in the same patients are available, there are a number of analytical approaches that could be employed to describe the trends and changes in these measures, and to explore the associations of these changes with outcomes. Researchers may choose a streamlined and simplified analytic approach to examine trajectories with subsequent outcomes such as estimating deltas (subtraction of the first observation from the last observation) or estimating per-patient slopes with linear regression. Conversely, they could more fully address the data complexity by using a longitudinal mixed model to estimate change as a predictor or employ a joint model, which can simultaneously model the longitudinal effect and its impact on an outcome such as survival. In this review, we aim to assist nephrologists and clinical researchers by reviewing these approaches in modeling the association of longitudinal change in a marker with outcomes, while appropriately considering the data complexity. Namely, we will discuss the use of simplified approaches for creating predictor variables representing change in measurements, including deltas and patient slopes, as well as more sophisticated longitudinal models including joint models, which can be used in addition to simplified models based on the indications and objectives of the study as warranted. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
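The two simplified change summaries described above, deltas and per-patient slopes, can be sketched for a single hypothetical patient; the follow-up times and lab values below are invented for illustration:

```python
import numpy as np

# Hypothetical longitudinal lab values (e.g. serum albumin, g/dL) for one
# patient, measured at irregular follow-up times in months.
times  = np.array([0.0, 3.0, 6.5, 12.0, 18.0])
values = np.array([4.0, 3.8, 3.7, 3.4, 3.1])

# Simplified change summaries used as predictors of survival:
delta = values[-1] - values[0]                   # last minus first observation
slope, intercept = np.polyfit(times, values, 1)  # per-patient linear slope

print(f"delta: {delta:.2f}")
print(f"slope: {slope:.3f} per month")
```

A mixed or joint model would instead pool all patients' trajectories and, in the joint case, tie the trajectory directly to the survival sub-model rather than reducing it to a single number first.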

18. Low-temperature survival of Salmonella spp. in a model food system with natural microflora.

Science.gov (United States)

Morey, Amit; Singh, Manpreet

2012-03-01

The United States Department of Agriculture requires chilled poultry carcass temperature to be below 4°C (40°F) to inhibit the growth of Salmonella and improve shelf life. Post-process temperature abuse of chicken leads to proliferation of existing bacteria, including Salmonella, which can lead to the increased risk of human infections. While models predicting Salmonella growth at abusive temperatures are developed using sterile media or chicken slurry, there are limited studies of Salmonella growth in the presence of background microflora at 4-10°C. Experiments in this study were conducted to determine the growth of Salmonella Typhimurium and Heidelberg at 4-10°C in brain heart infusion broth (BHI) and non-sterile chicken slurry (CS). Nalidixic acid-resistant Salmonella Typhimurium and S. Heidelberg (3 log CFU/mL) were inoculated separately in CS and sterile BHI in a 12-well microtiter plate and incubated at 4°C, 7°C, and 10°C, following which samples were taken every 24 h for up to 6 days. Samples from each well (n=5) were spread plated on XLT4 agar+nalidixic acid and incubated at 37°C for 24 h. Bacterial populations were reported as CFU/mL. No significant differences (p>0.05) were observed in the survival of both strains in CS and BHI over the period of 6 days at all temperatures except S. Heidelberg at 7°C. Survival populations of both strains at 4°C were significantly different (p ≤ 0.05) than at 7°C and 10°C in both media types. S. Heidelberg showed a maximum growth of 2 logs in BHI at 10°C among all the treatments. Growth patterns and survival of Salmonella at near refrigeration temperatures during carcass chilling can be useful to develop models to predict Salmonella growth post-processing and during storage, hence assisting processors in improving process controls.

19. Recent progress and modern challenges in applied mathematics, modeling and computational science

CERN Document Server

Makarov, Roman; Belair, Jacques

2017-01-01

This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

20. Clinical prediction of 5-year survival in systemic sclerosis: validation of a simple prognostic model in EUSTAR centres

NARCIS (Netherlands)

Fransen, J.; Popa-Diaconu, D.A.; Hesselstrand, R.; Carreira, P.; Valentini, G.; Beretta, L.; Airo, P.; Inanc, M.; Ullman, S.; Balbir-Gurman, A.; Sierakowski, S.; Allanore, Y.; Czirjak, L.; Riccieri, V.; Giacomelli, R.; Gabrielli, A.; Riemekasten, G.; Matucci-Cerinic, M.; Farge, D.; Hunzelmann, N.; Hoogen, F.H. Van den; Vonk, M.C.

2011-01-01

OBJECTIVE: Systemic sclerosis (SSc) is associated with a significant reduction in life expectancy. A simple prognostic model to predict 5-year survival in SSc was developed in 1999 in 280 patients, but it has not been validated in other patients. The predictions of a prognostic model are usually

1. Comparison of the average surviving fraction model with the integral biologically effective dose model for an optimal irradiation scheme.

Science.gov (United States)

Takagi, Ryo; Komiya, Yuriko; Sutherland, Kenneth L; Shirato, Hiroki; Date, Hiroyuki; Mizuta, Masahiro

2018-01-04

2. Modeling the survival responses of a multi-component biofilm to environmental stress

Science.gov (United States)

Carles Brangarí, Albert; Manzoni, Stefano; Sanchez-Vila, Xavier; Fernàndez-Garcia, Daniel

2017-04-01

Biofilms are consortia of microorganisms embedded in self-produced matrices of biopolymers. The survival of such communities depends on their capacity to improve the environmental conditions of their habitat by mitigating, or even benefitting from some adverse external factors. The mechanisms by which the microbial habitat is regulated remain mostly unknown. However, many studies have reported physiological responses to environmental stresses that include the release of extracellular polymeric substances (EPS) and the induction of a dormancy state. A sound understanding of these capacities is required to enhance the knowledge of the microbial dynamics in soils and its potential role in the carbon cycle, with significant implications for the degradation of contaminants and the emission of greenhouse gases, among others. We present a numerical analysis of the dynamics of soil microbes and their responses to environmental stresses. The conceptual model considers a multi-component heterotrophic biofilm made up of active cells, dormant cells, EPS, and extracellular enzymes. Biofilm distribution and properties are defined at the pore-scale and used to determine nutrient availability and water saturation via feedbacks of biofilm on soil hydraulic properties. The pore space micro-habitat is modeled as a simplified pore-network of cylindrical tubes in which biofilms proliferate. Microbial compartments and most of the carbon fluxes are defined at the bulk level. Microbial processes include the synthesis, decay and detachment of biomass, the activation/deactivation of cells, and the release and reutilization of EPS. Results suggest that the release of EPS and the capacity to enter a dormant state offer clear evolutionary advantages in scenarios characterized by environmental stress. On the contrary, when the conditions are favorable, the diversion of carbon into the production of the aforementioned survival mechanisms does not confer any additional benefit and the population

3. Peptides modeled after the alpha-domain of metallothionein induce neurite outgrowth and promote survival of cerebellar granule neurons

DEFF Research Database (Denmark)

Asmussen, Johanne Wirenfeldt; Ambjørn, Malene; Bock, Elisabeth

2009-01-01

Metallothionein (MT) is a metal-binding protein capable of preventing oxidative stress and apoptotic cell death in the central nervous system of mammals, and hence is of putative therapeutic value in the treatment of neurodegenerative disorders. Recently, we demonstrated that a peptide modeled...... after the beta-domain of MT, EmtinB, induced neurite outgrowth and increased neuronal survival through binding to receptors of the low-density lipoprotein receptor family (LDLR). The present study identified two MT alpha-domain-derived peptide sequences termed EmtinAn and EmtinAc, each consisting of 14...... amino acids, as potent stimulators of neuronal differentiation and survival of primary neurons. In addition, we show that a peptide derived from the N-terminus of the MT beta-domain, EmtinBn, promotes neuronal survival. The neuritogenic and survival promoting effects of EmtinAc, similar to MT and Emtin...

4. A novel survival model of cardioplegic arrest and cardiopulmonary bypass in rats: a methodology paper

Directory of Open Access Journals (Sweden)

Podgoreanu Mihai V

2008-08-01

Full Text Available Abstract Background Given the growing population of cardiac surgery patients with impaired preoperative cardiac function and rapidly expanding surgical techniques, continued efforts to improve myocardial protection strategies are warranted. Prior research is mostly limited to either large animal models or ex vivo preparations. We developed a new in vivo survival model that combines administration of antegrade cardioplegia with endoaortic crossclamping during cardiopulmonary bypass (CPB) in the rat. Methods Sprague-Dawley rats were cannulated for CPB (n = 10). With ultrasound guidance, a 3.5 mm balloon angioplasty catheter was positioned via the right common carotid artery with its tip proximal to the aortic valve. To initiate cardioplegic arrest, the balloon was inflated and cardioplegia solution injected. After 30 min of cardioplegic arrest, the balloon was deflated, ventilation resumed, and rats were weaned from CPB and recovered. To rule out any evidence of cerebral ischemia due to right carotid artery ligation, animals were neurologically tested on postoperative day 14, and their brains histologically assessed. Results Thirty minutes of cardioplegic arrest was successfully established in all animals. Functional assessment revealed no neurologic deficits, and histology demonstrated no gross neuronal damage. Conclusion This novel small animal CPB model with cardioplegic arrest allows for both the study of myocardial ischemia-reperfusion injury as well as new cardioprotective strategies. Major advantages of this model include its overall feasibility and cost effectiveness. In future experiments long-term echocardiographic outcomes as well as enzymatic, genetic, and histologic characterization of myocardial injury can be assessed. In the field of myocardial protection, rodent models will be an important avenue of research.

5. Development of a Model to Predict Transplant-free Survival of Patients With Acute Liver Failure.

Science.gov (United States)

Koch, David G; Tillman, Holly; Durkalski, Valerie; Lee, William M; Reuben, Adrian

2016-08-01

Patients with acute liver failure (ALF) have a high risk of death that can be substantially reduced with liver transplantation. It is a challenge to predict which patients with ALF will survive without liver transplant because available prognostic scoring systems are inadequate. We devised a mathematical model, using a large dataset collected by the Acute Liver Failure Study Group, which can predict transplant-free survival in patients with ALF. We performed a retrospective analysis of data from 1974 subjects who met criteria for ALF (coagulopathy and hepatic encephalopathy within 26 weeks of the first symptoms, without pre-existing liver disease) enrolled in the Acute Liver Failure Study Group database from January 1, 1998 through June 11, 2013. We randomly assigned the subjects to development and validation cohorts. Data from the development cohort were analyzed to identify factors associated with transplant-free survival (alive without transplantation by 21 days after admission to the study). Statistically significant variables were used to create a multivariable logistic regression model. Most subjects were women (70%) and white (78%); acetaminophen overdose was the most common cause (48% of subjects). The rate of transplant-free survival was 50%. Admission values of hepatic encephalopathy grade, ALF etiology, vasopressor use, and log transformations of bilirubin and international normalized ratio were significantly associated with transplant-free survival, based on logistic regression analysis. In the validation cohort, the resulting model predicted transplant-free survival with a C statistic value of 0.84, 66.3% accuracy (95% confidence interval, 63.1%-69.4%), 37.1% sensitivity (95% confidence interval, 32.5%-41.8%), and 95.3% specificity (95% confidence interval, 92.9%-97.1%). Using data from the Acute Liver Failure Study Group, we developed a model that predicts transplant-free survival of patients with ALF based on easily identifiable hospital admission
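The C statistic used above to summarize the model's discrimination can be computed directly from predicted probabilities and observed outcomes; the probabilities and outcomes in this sketch are invented, not from the study:

```python
import numpy as np

def c_statistic(score, survived):
    """Concordance: fraction of (survivor, non-survivor) pairs in which the
    survivor has the higher predicted score; ties count as 0.5."""
    pos = score[survived == 1]
    neg = score[survived == 0]
    diffs = pos[:, None] - neg[None, :]
    return ((diffs > 0).sum() + 0.5 * (diffs == 0).sum()) / diffs.size

# Hypothetical predicted transplant-free survival probabilities and outcomes.
p_survive = np.array([0.9, 0.8, 0.7, 0.4, 0.3, 0.2])
outcome   = np.array([1,   1,   0,   1,   0,   0])

print(f"C statistic: {c_statistic(p_survive, outcome):.3f}")
```

A C statistic of 0.5 corresponds to chance discrimination and 1.0 to perfect ranking, which puts the reported 0.84 in context.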

6. Prognostic model for survival in patients with metastatic renal cell carcinoma: results from the international kidney cancer working group.

Science.gov (United States)

Manola, Judith; Royston, Patrick; Elson, Paul; McCormack, Jennifer Bacik; Mazumdar, Madhu; Négrier, Sylvie; Escudier, Bernard; Eisen, Tim; Dutcher, Janice; Atkins, Michael; Heng, Daniel Y C; Choueiri, Toni K; Motzer, Robert; Bukowski, Ronald

2011-08-15

To develop a single validated model for survival in metastatic renal cell carcinoma (mRCC) using a comprehensive international database. A comprehensive database of 3,748 patients including previously reported clinical prognostic factors was established by pooling patient-level data from clinical trials. Following quality control and standardization, descriptive statistics were generated. Univariate analyses were conducted using proportional hazards models. Multivariable analysis using a log-logistic model stratified by center and multivariable fractional polynomials was conducted to identify independent predictors of survival. Missing data were handled using multiple imputation methods. Three risk groups were formed using the 25th and 75th percentiles of the resulting prognostic index. The model was validated using an independent data set of 645 patients treated with tyrosine kinase inhibitor (TKI) therapy. Median survival in the favorable, intermediate and poor risk groups was 26.9 months, 11.5 months, and 4.2 months, respectively. Factors contributing to the prognostic index included treatment, performance status, number of metastatic sites, time from diagnosis to treatment, and pretreatment hemoglobin, white blood count, lactate dehydrogenase, alkaline phosphatase, and serum calcium. The model showed good concordance when tested among patients treated with TKI therapy (C statistic = 0.741, 95% CI: 0.714-0.768). Nine clinical factors can be used to model survival in mRCC and form distinct prognostic groups. The model shows utility among patients treated in the TKI era. ©2011 AACR.
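The grouping step described above, cutting the prognostic index at its 25th and 75th percentiles, can be sketched on synthetic scores (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical prognostic index (linear predictor) for a patient cohort.
prognostic_index = rng.normal(0.0, 1.0, 1000)

# Cut the index at its 25th and 75th percentiles to form three risk groups:
# 0 = favorable, 1 = intermediate, 2 = poor.
q25, q75 = np.percentile(prognostic_index, [25, 75])
group = np.digitize(prognostic_index, [q25, q75])

counts = np.bincount(group, minlength=3)
print("favorable/intermediate/poor:", counts.tolist())
```

By construction the favorable and poor groups each hold roughly a quarter of the cohort, with the middle half forming the intermediate group.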

7. Plants modify biological processes to ensure survival following carbon depletion: a Lolium perenne model.

Directory of Open Access Journals (Sweden)

Julia M Lee

Full Text Available BACKGROUND: Plants, due to their immobility, have evolved mechanisms allowing them to adapt to multiple environmental and management conditions. Short-term undesirable conditions (e.g. moisture deficit, cold temperatures) generally reduce photosynthetic carbon supply while increasing soluble carbohydrate accumulation. It is not known, however, what strategies plants may use in the long-term to adapt to situations resulting in net carbon depletion (i.e. reduced photosynthetic carbon supply and carbohydrate accumulation). In addition, many transcriptomic experiments have typically been undertaken under laboratory conditions; therefore, long-term acclimation strategies that plants use in natural environments are not well understood. METHODOLOGY/PRINCIPAL FINDINGS: Perennial ryegrass (Lolium perenne L.) was used as a model plant to define whether plants adapt to repetitive carbon depletion and to further elucidate their long-term acclimation mechanisms. Transcriptome changes in both lamina and stubble tissues of field-grown plants with depleted carbon reserves were characterised using reverse transcription-quantitative polymerase chain reaction (RT-qPCR). The RT-qPCR data for select key genes indicated that plants reduced fructan degradation, and increased photosynthesis and fructan synthesis capacities following carbon depletion. This acclimatory response was not sufficient to prevent a reduction (P<0.001) in net biomass accumulation, but ensured that the plant survived. CONCLUSIONS: Adaptations of plants with depleted carbon reserves resulted in reduced post-defoliation carbon mobilization and earlier replenishment of carbon reserves, thereby ensuring survival and continued growth. These findings will help pave the way to improve plant biomass production, for either grazing livestock or biofuel purposes.

8. Survival benefit with radium-223 dichloride in a mouse model of breast cancer bone metastasis.

Science.gov (United States)

Suominen, Mari I; Rissanen, Jukka P; Käkönen, Rami; Fagerlund, Katja M; Alhoniemi, Esa; Mumberg, Dominik; Ziegelbauer, Karl; Halleen, Jussi M; Käkönen, Sanna-Maria; Scholz, Arne

2013-06-19

9. A two-fluid model for vertical flow applied to CO2 injection wells

DEFF Research Database (Denmark)

Linga, Gaute; Lund, Halvor

2016-01-01

the well, including tubing, packer fluid, casing, cement or drilling mud, and rock formation. This enables prediction of the temperature in the well fluid and in each layer of the well. The model is applied to sudden shut-in and blowout cases of a CO2 injection well, where we employ the highly accurate...... to thermal stresses and subsequent loss of well integrity, and it is therefore crucial to employ models that can predict this accurately. In this work, we present a model for vertical well flow that includes both two-phase flow and heat conduction. The flow is described by a two-fluid model, where mass...

10. Applying model analysis to a resource-based analysis of the Force and Motion Conceptual Evaluation

Directory of Open Access Journals (Sweden)

Trevor I. Smith

2014-07-01

Full Text Available Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information regarding the results of investigations using these question clusters than normalized gain graphs. We provide examples from two different institutions to show how the use of model analysis with our redefined clusters can provide previously hidden insight into the effectiveness of instruction.

11. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

Science.gov (United States)

Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V

2016-01-01

Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
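As a loose, hypothetical stand-in (not the ACT-R model itself), the idea of identifying optimal parameter values by numerical optimization can be sketched with a one-parameter toy model of response-time speedup:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical mean response times per practice block for the task.
observed = np.array([1.8, 1.2, 0.9, 0.75])

def simulate_rt(d, blocks=np.arange(1, 5)):
    """Assumed power-law speedup, loosely in the spirit of learning curves."""
    return 2.0 * blocks ** (-d)

def objective(d):
    # Squared error between simulated and observed response times.
    return np.sum((simulate_rt(d) - observed) ** 2)

res = minimize_scalar(objective, bounds=(0.01, 2.0), method="bounded")
print(f"best-fit decay: {res.x:.3f}")
```

With a full cognitive model the objective would wrap a (possibly stochastic) simulation run, which is why the article's mathematical reformulations matter: they make the objective smooth enough for efficient optimizers.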

12. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim.

Science.gov (United States)

Pestana, José Medina

2016-10-01

The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. This model, devised by Frederick Taylor, is founded on principles of scientific management that involve planning, rational execution of work, and distribution of responsibilities. The expected result is increased efficiency, improvement of results and optimization of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The improvement of the activities in each module is constant, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, an increase was noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, preserving the humanistic characteristic of providing health care activity.

13. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim

Directory of Open Access Journals (Sweden)

José Medina Pestana

Full Text Available Summary The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. The model is founded on Frederick Taylor's principles of scientific management, which involve planning, rational execution of work, and distribution of responsibilities. The expected results are increased efficiency, improved outcomes, and optimized use of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The activities in each module are improved continuously, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, increases were noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, while preserving the humanistic character of health care.

14. A Dirichlet process mixture model for survival outcome data: assessing nationwide kidney transplant centers.

Science.gov (United States)

Zhao, Lili; Shi, Jingchunzi; Shearon, Tempie H; Li, Yi

2015-04-15

Mortality rates are probably the most important indicator for the performance of kidney transplant centers. Motivated by the national evaluation of mortality rates at kidney transplant centers in the USA, we seek to categorize the transplant centers based on the mortality outcome. We describe a Dirichlet process model and a Dirichlet process mixture model with a half-Cauchy prior for the estimation of the risk-adjusted effects of the transplant centers, with strategies for improving the model performance, interpretability, and classification ability. We derive statistical measures and create graphical tools to rate transplant centers and identify outlying groups of centers with exceptionally good or poor performance. The proposed method was evaluated through simulation and then applied to assess kidney transplant centers from a national organ failure registry. Copyright © 2015 John Wiley & Sons, Ltd.
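The Dirichlet process prior underlying such mixture models can be illustrated with the standard stick-breaking construction. The concentration parameter and truncation level below are arbitrary choices for the sketch, not values from the paper, and the snippet shows only the prior over mixture weights, not the full risk-adjusted survival model.

```python
import random

# Truncated stick-breaking construction of a Dirichlet process prior,
# the building block behind DP mixture models used to cluster effects.
# alpha (concentration) and num_atoms (truncation) are illustrative.

def stick_breaking(alpha, num_atoms, rng):
    """Return truncated DP weights via the stick-breaking construction."""
    weights, remaining = [], 1.0
    for _ in range(num_atoms - 1):
        b = rng.betavariate(1.0, alpha)   # fraction of the remaining stick
        weights.append(remaining * b)
        remaining *= (1.0 - b)
    weights.append(remaining)             # last atom takes what is left
    return weights

rng = random.Random(42)
w = stick_breaking(alpha=2.0, num_atoms=20, rng=rng)
assert abs(sum(w) - 1.0) < 1e-9          # the weights always sum to one
```

A smaller concentration parameter concentrates mass on a few atoms, which is what lets the mixture group centers into a small number of performance classes.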

15. A Bayesian semiparametric multilevel survival modelling of age at first birth in Nigeria

Directory of Open Access Journals (Sweden)

Ezra Gayawan

2013-06-01

Full Text Available BACKGROUND The age at which childbearing begins influences the total number of children a woman bears throughout her reproductive period, in the absence of any active fertility control. For countries in sub-Saharan Africa where contraceptive prevalence rate is still low, younger ages at first birth tend to increase the number of children a woman will have thereby hindering the process of fertility decline. Research has also shown that early childbearing can endanger the health of the mother and her offspring, which can in turn lead to high child and maternal mortality. OBJECTIVE In this paper, an attempt was made to explore possible trends, geographical variation and determinants of timing of first birth in Nigeria, using the 1999 - 2008 Nigeria Demographic and Health Survey data sets. METHODS A structured additive survival model for continuous time data, an approach that simultaneously estimates the nonlinear effect of metrical covariates, fixed effects, spatial effects and smoothing parameters within a Bayesian context in one step is employed for all estimations. All analyses were carried out using BayesX - a software package for Bayesian modelling techniques. RESULTS Results from this paper reveal that variation in age at first birth in Nigeria is determined more by individual household than by community, and that substantial geographical variations in timing of first birth also exist. COMMENTS These findings can guide policymakers in identifying states or districts that are associated with significant risk of early childbirth, which can in turn be used in designing effective strategies and in decision making.

16. Irreversible electroporation of the pancreas is feasible and safe in a porcine survival model.

Science.gov (United States)

Fritz, Stefan; Sommer, Christof M; Vollherbst, Dominik; Wachter, Miguel F; Longerich, Thomas; Sachsenmeier, Milena; Knapp, Jürgen; Radeleff, Boris A; Werner, Jens

2015-07-01

Use of thermal tumor ablation in the pancreatic parenchyma is limited because of the risk of pancreatitis, pancreatic fistula, or hemorrhage. This study aimed to evaluate the feasibility and safety of irreversible electroporation (IRE) in a porcine model. Ten pigs were divided into 2 study groups. In the first group, animals received IRE of the pancreatic tail and were killed after 60 minutes. In the second group, animals received IRE at the head of the pancreas and were followed up for 7 days. Clinical parameters, computed tomography imaging, laboratory results, and histology were obtained. All animals survived IRE ablation, and no cardiac adverse effects were noted. Sixty minutes after IRE, a hypodense lesion on computed tomography imaging indicated the ablation zone. None of the animals developed clinical signs of acute pancreatitis. Only small amounts of ascites fluid, with a transient increase in amylase and lipase levels, were observed, indicating that no pancreatic fistula occurred. This porcine model shows that IRE is feasible and safe in the pancreatic parenchyma. Computed tomography imaging reveals significant changes at 60 minutes after IRE and therefore might serve as an early indicator of therapeutic success. Clinical studies are needed to evaluate the efficacy of IRE in pancreatic cancer.

17. The LIFE Model: A Meta-Theoretical Conceptual Map for Applied Positive Psychology

OpenAIRE

Lomas, Tim; Hefferon, Kate; Ivtzan, Itai

2014-01-01

Since its emergence in 1998, positive psychology has flourished. Among its successes is the burgeoning field of applied positive psychology (APP), involving interventions to promote wellbeing. However, the remit of APP is currently unclear. As such, we offer a meta-theoretical conceptual map delineating the terrain that APP might conceivably cover, namely, the Layered Integrated Framework Example model. The model is based on Wilber’s (J Conscious Stud 4(1):71–92, 1997) Integral Framework, whi...

18. Leaders Are the Network: Applying the Kotter Model in Shaping Future Information Systems

Science.gov (United States)

2010-01-01

"Communications and Information Systems," New York, NY: Springer Publishing, 1997, 183-193. Findley, Mike and Luck, Gary, "Information Overload..."

19. Divorce and Child Behavior Problems: Applying Latent Change Score Models to Life Event Data

OpenAIRE

Malone, Patrick S.; Lansford, Jennifer E.; Castellino, Domini R.; Berlin, Lisa J.; Dodge, Kenneth A.; Bates, John E.; Pettit, Gregory S.

2004-01-01

Effects of parents' divorce on children's adjustment have been studied extensively. This article applies new advances in trajectory modeling to the problem of disentangling the effects of divorce on children's adjustment from related factors such as the child's age at the time of divorce and the child's gender. Latent change score models were used to examine trajectories of externalizing behavior problems in relation to children's experience of their parents' divorce. Participants included 35...

20. Modeling Hierarchically Clustered Longitudinal Survival Processes with Applications to Child Mortality and Maternal Health

Directory of Open Access Journals (Sweden)

Kuate-Defo, Bathélémy

2001-01-01

Full Text Available English: This paper merges two parallel developments since the 1970s of new statistical tools for data analysis: statistical methods known as hazard models that are used for analyzing event-duration data, and statistical methods for analyzing hierarchically clustered data known as multilevel models. These developments have rarely been integrated in research practice, and the formalization and estimation of models for hierarchically clustered survival data remain largely uncharted. I attempt to fill some of this gap and demonstrate the merits of formulating and estimating multilevel hazard models with longitudinal data. French (translated): This study integrates two state-of-the-art statistical approaches to quantitative data analysis developed since the 1970s: statistical methods for analyzing event-history data, known as survival methods, and statistical methods for analyzing hierarchical data, known as multilevel methods. These two approaches have rarely been combined in research practice, so the formulation and estimation of models suited to longitudinal, hierarchically nested data remain an essentially unexplored field. I attempt to fill this gap and use real public-health data to demonstrate the merits and contexts of formulating and estimating multilevel, multistate models for event-history and longitudinal data.

1. Modeling the current distribution in HTS tapes with transport current and applied magnetic field

NARCIS (Netherlands)

Yazawa, Takashi; Rabbers, J.J.; Chevtchenko, O.A.; ten Haken, Bernard; ten Kate, Herman H.J.; Maeda, Hideaki

1999-01-01

A numerical model is developed for the current distribution in a high temperature superconducting (HTS) tape, (Bi,Pb)2Sr2 Ca2Cu3Ox-Ag, subjected to a combination of a transport current and an applied magnetic field. This analysis is based on a two-dimensional formulation of Maxwell's equations in

2. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

Science.gov (United States)

What Works Clearinghouse, 2010

2010-01-01

The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours…

3. Problems and advantages of applying the e-learning model to the teaching of English

OpenAIRE

Shaparenko, А.; Golikova, А.

2013-01-01

In this article we discuss some potential and observed problems, as well as advantages, of applying the e-learning model to the teaching of English. Much has been done in the area of foreign language teaching, but there are constant attempts at new solutions. One further option for e-learning is the hybrid course.

4. Flipped Classroom Adapted to the ARCS Model of Motivation and Applied to a Physics Course

Science.gov (United States)

Asiksoy, Gülsüm; Özdamli, Fezile

2016-01-01

This study aims to determine the effect on the achievement, motivation and self-sufficiency of students of the flipped classroom approach adapted to Keller's ARCS (Attention, Relevance, Confidence and Satisfaction) motivation model and applied to a physics course. The study involved 66 students divided into two classes of a physics course. The…

5. Applying the cube model to pediatric psychology: development of research competency skills at the doctoral level.

Science.gov (United States)

Madan-Swain, Avi; Hankins, Shirley L; Gilliam, Margaux Barnes; Ross, Kelly; Reynolds, Nina; Milby, Jesse; Schwebel, David C

2012-03-01

This article considers the development of research competencies in professional psychology and how that movement might be applied to training in pediatric psychology. The field of pediatric psychology has a short but rich history, and experts have identified critical competencies. However, pediatric psychology has not yet detailed a set of research-based competencies. This article initially reviews the competency initiative in professional psychology, including the cube model as it relates to research training. Next, we review and adapt the knowledge-based/foundational and applied/functional research competencies proposed by health psychology into a cube model for pediatric psychology. We focus especially on graduate-level training but allude to its application throughout professional development. We present the cube model as it is currently being applied to the development of a systematic research competency evaluation for graduate training at our medical/clinical psychology doctoral program at the University of Alabama at Birmingham. Based on the review and synthesis of the literature on research competency in professional psychology we propose future initiatives to develop these competencies for the field of pediatric psychology. The cube model can be successfully applied to the development of research training competencies in pediatric psychology. Future research should address the development, implementation, and assessment of the research competencies for training and career development of future pediatric psychologists.

6. Applying footprint models to investigate MO-dissimilarity over heterogeneous areas

NARCIS (Netherlands)

Boer, van de A.; Graf, A.; Moene, A.F.; Schüttemeyer, D.

2012-01-01

Monin-Obukhov Similarity Theory (MOST) is one of the cornerstones of surface-layer meteorology and as such it is widely applied in models and for data analysis. One major disadvantage of using MOST for describing land-atmosphere interactions is that all turbulence properties are described as a

7. An Investigation of Employees' Use of E-Learning Systems: Applying the Technology Acceptance Model

Science.gov (United States)

Lee, Yi-Hsuan; Hsieh, Yi-Chuan; Chen, Yen-Hsun

2013-01-01

The purpose of this study is to apply the technology acceptance model to examine the employees' attitudes and acceptance of electronic learning (e-learning) systems in organisations. This study examines four factors (organisational support, computer self-efficacy, prior experience and task equivocality) that are believed to influence employees'…

8. Designing and Applying Web Assisted Activities to Be Used in Flipped Classroom Model

Science.gov (United States)

Çetinkaya, Murat

2017-01-01

The purpose of this study is to develop personalized web-assisted activities for the flipped classroom model applied in the "Human and Environment Interactions" unit of the science lesson and to research their effect on students' achievement. The study was conducted with the participation of 74 7th-grade science students within a period…

9. Applying the Transtheoretical Model to Reality Television: The Biggest Loser Case Study

Science.gov (United States)

Barry, Adam E.; Piazza-Gardner, Anna K.

2012-01-01

This teaching idea presents a heuristic example using reality television as a tool for applying health behavior theory. It utilizes The Biggest Loser (TBL) to provide "real world" cases depicting how individuals progress through/experience the Transtheoretical Model (TTM). Observing TBL contestants provides students practice grounding…

10. Risk assessment and food allergy: the probabilistic model applied to allergens

NARCIS (Netherlands)

Spanjersberg, M.Q.I.; Kruizinga, A.G.; Rennen, M.A.J.; Houben, G.F.

2007-01-01

In order to assess the risk of unintended exposure to food allergens, traditional deterministic risk assessment is usually applied, leading to inconsequential conclusions such as 'an allergic reaction cannot be excluded'. TNO therefore developed a quantitative risk assessment model for allergens based on

11. Constructing a model for breast cancer survival analysis using support vector machines, logistic regression, and decision trees.

Science.gov (United States)

Chao, Cheng-Min; Yu, Ya-Wen; Cheng, Bor-Wen; Kuo, Yao-Lung

2014-10-01

The aim of the paper is to use data mining technology to establish a classification of breast cancer survival patterns, and to offer a treatment decision-making reference regarding the survival ability of women diagnosed with breast cancer in Taiwan. We studied patients with breast cancer in a specific hospital in Central Taiwan to obtain 1,340 data sets. We employed a support vector machine, logistic regression, and a C5.0 decision tree to construct classification models of breast cancer patients' survival rates, and used a 10-fold cross-validation approach to validate the models. The results show that all of the classification models yielded an average accuracy rate of more than 90%, with the SVM providing the best method for constructing the three-category classification system for survival mode. The experiment shows that the three methods used to create the classification system established a high accuracy rate, predicted the survival ability of women diagnosed with breast cancer more accurately, and could be used as a reference when creating a medical decision-making framework.
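The 10-fold cross-validation scheme used to validate the models can be sketched as follows. The "classifier" here is a stand-in majority-class rule on synthetic data, not the SVM, logistic regression, or C5.0 models, and not the Taiwanese patient records.

```python
import random

# Minimal sketch of 10-fold cross-validation: split the data into k
# folds, train on k-1 folds, test on the held-out fold, and average
# the per-fold accuracies. The classifier is a majority-class rule.

def k_fold_accuracy(data, k, rng):
    data = data[:]
    rng.shuffle(data)
    folds = [data[i::k] for i in range(k)]
    accuracies = []
    for i in range(k):
        test = folds[i]
        train = [row for j, f in enumerate(folds) if j != i for row in f]
        labels = [label for _, label in train]
        majority = max(set(labels), key=labels.count)   # "train" the rule
        correct = sum(1 for _, label in test if label == majority)
        accuracies.append(correct / len(test))
    return sum(accuracies) / k

rng = random.Random(0)
data = [(x, 1 if x > 30 else 0) for x in range(100)]    # synthetic: 69% class 1
acc = k_fold_accuracy(data, 10, rng)
print(round(acc, 2))  # → 0.69, the base rate a majority rule achieves
```

A real classifier would replace the majority rule with a model fitted on the training folds; the folding and averaging logic stays the same.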

12. Uncertainty from model calibration: applying a new method to transport energy demand modelling

NARCIS (Netherlands)

van Ruijven, B.J.; van der Sluijs, J.P.; van Vuuren, D.P.; Janssen, Peter; Heuberger, P.S.C.; de Vries, B.

2009-01-01

Uncertainties in energy demand modelling originate from both limited understanding of the real-world system and a lack of data for model development, calibration and validation. These uncertainties allow for the development of different models, but also leave room for different calibrations of a

13. Uncertainty from Model Calibration : Applying a New Method to Transport Energy Demand Modelling

NARCIS (Netherlands)

Ruijven, B.; Van der Sluijs, J.P.; Van Vuuren, D.P.; Janssen, P.; Heuberger, P.S.C.; De Vries, B.

2009-01-01

Uncertainties in energy demand modelling originate from both limited understanding of the real-world system and a lack of data for model development, calibration and validation. These uncertainties allow for the development of different models, but also leave room for different calibrations of a

14. Accelerating quality improvement within your organization: Applying the Model for Improvement.

Science.gov (United States)

Crowl, Ashley; Sharma, Anita; Sorge, Lindsay; Sorensen, Todd

2015-01-01

To discuss the fundamentals of the Model for Improvement and how the model can be applied to quality improvement activities associated with medication use, including understanding the three essential questions that guide quality improvement, applying a process for actively testing change within an organization, and measuring the success of these changes on care delivery. PubMed from 1990 through April 2014 using the search terms quality improvement, process improvement, hospitals, and primary care. At the authors' discretion, studies were selected based on their relevance in demonstrating the quality improvement process and tests of change within an organization. Organizations are continuously seeking to enhance quality in patient care services, and much of this work focuses on improving care delivery processes. Yet change in these systems is often slow, which can lead to frustration or apathy among frontline practitioners. Adopting and applying the Model for Improvement as a core strategy for quality improvement efforts can accelerate the process. While the model is frequently well known in hospitals and primary care settings, it is not always familiar to pharmacists. In addition, while some organizations may be familiar with the "plan, do, study, act" (PDSA) cycles-one element of the Model for Improvement-many do not apply it effectively. The goal of the model is to combine a continuous process of small tests of change (PDSA cycles) within an overarching aim with a longitudinal measurement process. This process differs from other forms of improvement work that plan and implement large-scale change over an extended period, followed by months of data collection. In this scenario it may take months or years to determine whether an intervention will have a positive impact. By following the Model for Improvement, frontline practitioners and their organizational leaders quickly identify strategies that make a positive difference and result in a greater degree of

15. Clinical variables serve as prognostic factors in a model for survival from glioblastoma multiforme

DEFF Research Database (Denmark)

Michaelsen, Signe Regner; Christensen, Ib Jarle; Grunnet, Kirsten

2013-01-01

Although implementation of temozolomide (TMZ) as a part of primary therapy for glioblastoma multiforme (GBM) has resulted in improved patient survival, the disease is still incurable. Previous studies have correlated various parameters to survival, although no single parameter has yet been...

16. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

Directory of Open Access Journals (Sweden)

L Potgieter

2012-12-01

Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of and interaction between normal and sterile E. saccharina moths in a temporally variable, but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified and according to which E. saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1-sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E. saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
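The core idea of a sterile-insect difference-equation model can be sketched in a few lines: the fertile population grows logistically, while released sterile insects dilute the fraction of matings that produce offspring. The functional form and all parameter values below are hypothetical, much simpler than the multi-life-stage E. saccharina model in the paper.

```python
# Illustrative difference-equation sketch of the sterile insect technique.
# Fertile pests grow logistically; matings with released sterile insects
# produce no offspring. Parameters are hypothetical, not from the paper.

def step(n_fertile, n_sterile, growth, capacity):
    """One generation of the fertile population."""
    if n_fertile + n_sterile == 0:
        return 0.0
    fertile_fraction = n_fertile / (n_fertile + n_sterile)
    n_next = growth * n_fertile * fertile_fraction * (1.0 - n_fertile / capacity)
    return max(n_next, 0.0)

def simulate(n0, release, generations, growth=3.0, capacity=1000.0):
    n = n0
    for _ in range(generations):
        n = step(n, release, growth, capacity)
    return n

no_release = simulate(100.0, release=0.0, generations=20)
with_release = simulate(100.0, release=400.0, generations=20)
assert with_release < no_release   # sterile releases suppress the population
```

The release ratio and timing questions the paper addresses correspond here to choosing `release` and when to apply it so that the fertile population is driven toward zero at minimum cost.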

17. A simple model for fatigue crack growth in concrete applied to a hinge beam model

DEFF Research Database (Denmark)

Skar, Asmus; Poulsen, Peter Noe; Olesen, John Forbes

2017-01-01

In concrete structures, fatigue is one of the major causes of material deterioration. Repeated loads result in formation of cracks. Propagation of these cracks causes internal progressive damage within the concrete material which ultimately leads to failure. This paper presents a simplified general concept for non-linear analysis of concrete subjected to cyclic loading. The model is based on the fracture mechanics concepts of the fictitious crack model, considering a fiber of concrete material, and a simple energy-based approach for estimating the bridging stress under cyclic loading. Further, the uni-axial fiber response is incorporated in a numerical hinge model for beam analysis. Finally, the hinge model is implemented into a finite element beam element on a constitutive level. The proposed model is compared to experimental results on both fiber and beam level. The proposed model shows good...

18. Analytical Compliance Modeling of Serial Flexure-Based Compliant Mechanism Under Arbitrary Applied Load

Science.gov (United States)

Wang, Li-Ping; Jiang, Yao; Li, Tie-Min

2017-07-01

An analytical compliance model is vital to a flexure-based compliant mechanism for its mechanical design and motion control. The matrix method is a common and effective approach to compliance modeling, but it is not well developed for closed-loop serial and parallel compliant mechanisms and is not applicable when external loads are applied on the flexure members. Concise and explicit analytical compliance models of serial flexure-based compliant mechanisms under arbitrary loads are derived by using the matrix method. An equivalent method is proposed to deal with the situation when the external loads are applied on the flexure members: the external loads are transformed to concentrated forces applied on the rigid links, which satisfy the equations of static equilibrium and also guarantee that the deformations at the displacement output point remain unchanged. The matrix method can then still be adopted for the compliance analysis of the compliant mechanism. Finally, several specific examples and an experimental test are given to verify the effectiveness of the compliance models and the force-equivalent method. The research enriches the matrix method and provides concise analytical compliance models for serial compliant mechanisms.
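The basic matrix-method fact the abstract relies on can be shown in a toy example: for flexure elements connected in series, once each element's compliance matrix is expressed in the common output frame, the total compliance is the sum of the element matrices, and deflection follows from C·load. The 3x3 planar compliance values below are hypothetical, and the frame-transformation step (the adjoint transform the full method uses) is omitted for brevity.

```python
# Series connection in the matrix method: with all element compliances
# already expressed in the output frame, C_total = C1 + C2, and
# deflection = C_total * load. Values are hypothetical (planar: x, y, theta).

def mat_add(a, b):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def mat_vec(m, v):
    return [sum(x * y for x, y in zip(row, v)) for row in m]

c1 = [[2.0, 0.0, 0.0],
      [0.0, 5.0, 1.0],
      [0.0, 1.0, 8.0]]
c2 = [[1.0, 0.0, 0.0],
      [0.0, 3.0, 0.5],
      [0.0, 0.5, 4.0]]

c_total = mat_add(c1, c2)            # compliances of series elements add
load = [1.0, 0.0, 0.0]               # unit force along x
deflection = mat_vec(c_total, load)
print(deflection)  # → [3.0, 0.0, 0.0]
```

The paper's contribution is precisely what this sketch leaves out: handling loads applied mid-chain on the flexure members themselves, by replacing them with statically equivalent forces on the rigid links.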

19. Description and validation of a Markov model of survival for individuals free of cardiovascular disease that uses Framingham risk factors

Directory of Open Access Journals (Sweden)

Martin Chris

2004-05-01

Full Text Available Abstract Background Estimation of cardiovascular disease risk is increasingly used to inform decisions on interventions, such as the use of antihypertensives and statins, or to communicate the risks of smoking. Crude 10-year cardiovascular disease risks may not give a realistic view of the likely impact of an intervention over a lifetime and will underestimate the risks of smoking. A validated model of survival to act as a decision aid in the consultation may help to address these problems. This study aims to describe the development of such a model for use with people free of cardiovascular disease and evaluates its accuracy against data from a United Kingdom cohort. Methods A Markov cycle tree evaluated using cohort simulation was developed utilizing Framingham estimates of cardiovascular risk, 1998 United Kingdom mortality data, the relative risk for smoking-related non-cardiovascular disease risk, and changes in systolic blood pressure and serum total cholesterol with age. The model's estimates of survival at 20 years for 1391 members of the Whickham survey cohort between the ages of 35 and 65 were compared with the observed survival at 20-year follow-up. Results The model estimate for survival was 75% and the observed survival was 75.4%. The correlation between estimated and observed survival was 0.933 over 39 subgroups of the cohort stratified by estimated survival, 0.992 for the seven 5-year age bands from 35 to 64, 0.936 for the ten 10 mmHg systolic blood pressure bands between 100 mmHg and 200 mmHg, and 0.693 for the fifteen 0.5 mmol/l total cholesterol bands between 3.0 and 10.0 mmol/l. The model significantly underestimated mortality in those people with a systolic blood pressure greater than or equal to 180 mmHg (p = 0.006). The average gain in life expectancy from the elimination of cardiovascular disease risk as a cause of death was 4.0 years for all the 35 year-old men in the sample (n = 24), and 1.8 years

20. Description and validation of a Markov model of survival for individuals free of cardiovascular disease that uses Framingham risk factors.

Science.gov (United States)

Martin, Chris; Vanderpump, Mark; French, Joyce

2004-05-24

Estimation of cardiovascular disease risk is increasingly used to inform decisions on interventions, such as the use of antihypertensives and statins, or to communicate the risks of smoking. Crude 10-year cardiovascular disease risks may not give a realistic view of the likely impact of an intervention over a lifetime and will underestimate the risks of smoking. A validated model of survival to act as a decision aid in the consultation may help to address these problems. This study aims to describe the development of such a model for use with people free of cardiovascular disease and evaluates its accuracy against data from a United Kingdom cohort. A Markov cycle tree evaluated using cohort simulation was developed utilizing Framingham estimates of cardiovascular risk, 1998 United Kingdom mortality data, the relative risk for smoking-related non-cardiovascular disease risk, and changes in systolic blood pressure and serum total cholesterol with age. The model's estimates of survival at 20 years for 1391 members of the Whickham survey cohort between the ages of 35 and 65 were compared with the observed survival at 20-year follow-up. The model estimate for survival was 75% and the observed survival was 75.4%. The correlation between estimated and observed survival was 0.933 over 39 subgroups of the cohort stratified by estimated survival, 0.992 for the seven 5-year age bands from 35 to 64, 0.936 for the ten 10 mmHg systolic blood pressure bands between 100 mmHg and 200 mmHg, and 0.693 for the fifteen 0.5 mmol/l total cholesterol bands between 3.0 and 10.0 mmol/l. The model significantly underestimated mortality in those people with a systolic blood pressure greater than or equal to 180 mmHg (p = 0.006). The average gain in life expectancy from the elimination of cardiovascular disease risk as a cause of death was 4.0 years for all the 35 year-old men in the sample (n = 24), and 1.8 years for all the 35 year-old women in the sample (n = 32)
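The cohort-simulation idea behind such a Markov survival model can be sketched very simply: each annual cycle, a fraction of the living cohort dies of cardiovascular or other causes, and 20-year survival is the product of the annual survival probabilities. The annual probabilities below are hypothetical round numbers, not the Framingham or UK mortality inputs the paper actually uses.

```python
# Sketch of a Markov cohort simulation of 20-year survival. Each annual
# cycle removes a fraction of the living cohort via two competing causes
# of death. Annual probabilities are illustrative, not Framingham values.

def cohort_survival(years, p_cvd_death, p_other_death):
    """Fraction of the cohort alive after the given number of annual cycles."""
    alive = 1.0
    for _ in range(years):
        p_death = p_cvd_death + p_other_death   # competing causes, small probs
        alive *= (1.0 - p_death)
    return alive

surv = cohort_survival(20, p_cvd_death=0.005, p_other_death=0.009)
print(round(surv, 3))  # about 0.75 for these illustrative annual probabilities
```

The full model lets the transition probabilities depend on age, blood pressure, cholesterol, and smoking in each cycle, which is what the Markov cycle tree adds over this constant-hazard toy.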

1. Generation of a convalescent model of virulent Francisella tularensis infection for assessment of host requirements for survival of tularemia.

Directory of Open Access Journals (Sweden)

Deborah D Crane

Full Text Available Francisella tularensis is a facultative intracellular bacterium and the causative agent of tularemia. Development of novel vaccines and therapeutics for tularemia has been hampered by the lack of understanding of which immune components are required to survive infection. Defining these requirements for protection against virulent F. tularensis, such as strain SchuS4, has been difficult since experimentally infected animals typically die within 5 days after exposure to as few as 10 bacteria. Such a short mean time to death typically precludes development, and therefore assessment, of immune responses directed against virulent F. tularensis. To enable identification of the components of the immune system that are required for survival of virulent F. tularensis, we developed a convalescent model of tularemia in C57BL/6 mice using low-dose antibiotic therapy in which the host immune response is ultimately responsible for clearance of the bacterium. Using this model we demonstrate that αβTCR+ cells, γδTCR+ cells, and B cells are necessary to survive primary SchuS4 infection. Analysis of mice deficient in specific soluble mediators shows that IL-12p40 and IL-12p35 are essential for survival of SchuS4 infection. We also show that IFN-γ is required for survival of SchuS4 infection since mice lacking IFN-γR succumb to disease during the course of antibiotic therapy. Finally, we found that both CD4+ and CD8+ cells are the primary producers of IFN-γ and that γδTCR+ cells and NK cells make a minimal contribution toward production of this cytokine throughout infection. Together these data provide a novel model that identifies key cells and cytokines required for survival or exacerbation of infection with virulent F. tularensis and provides evidence that this model will be a useful tool for better understanding the dynamics of tularemia infection.

2. Comparison among Models to Estimate the Shielding Effectiveness Applied to Conductive Textiles

Directory of Open Access Journals (Sweden)

Alberto Lopez

2013-01-01

Full Text Available The purpose of this paper is to present a comparison between two models, together with measurements, for calculating the shielding effectiveness of electromagnetic barriers, applied to conductive textiles. Each model treats a conductive textile as either (1) a wire mesh screen or (2) a compact material. The objective is therefore to analyse the models in order to determine which one is a better approximation for electromagnetic shielding fabrics. To provide results for the comparison, the shielding effectiveness of the sample was measured according to the standard ASTM D4935-99.
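Shielding effectiveness itself is conventionally reported in decibels as the ratio of incident to transmitted field strength, SE = 20·log10(E_incident/E_transmitted). A minimal sketch of that calculation (the field values below are hypothetical, not measurements from the paper):

```python
import math

def shielding_effectiveness_db(e_incident, e_transmitted):
    """Shielding effectiveness in dB from electric field strengths:
    SE = 20 * log10(E_incident / E_transmitted)."""
    return 20 * math.log10(e_incident / e_transmitted)

# Hypothetical example: a barrier that attenuates the field 100-fold
print(f"{shielding_effectiveness_db(1.0, 0.01):.1f} dB")
```

A 100-fold field attenuation corresponds to 40 dB; each extra factor of 10 adds 20 dB.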

3. Dynamic plant uptake model applied for drip irrigation of an insecticide to pepper fruit plants

DEFF Research Database (Denmark)

Legind, Charlotte Nielsen; Kennedy, C. M.; Rein, Arno

2011-01-01

irrigation, its application for a soil-applied insecticide and a sensitivity analysis of the model parameters. RESULTS: The model predicted the measured increase and decline of residues following two soil applications of an insecticide to peppers, with an absolute error between model and measurement ranging from 0.002 to 0.034 mg kg fw(-1). Maximum measured concentrations in pepper fruit were approximately 0.22 mg kg fw(-1). Temperature was the most sensitive component for predicting the peak and final concentration in pepper fruit, through its influence on soil and plant degradation rates...

4. Cisplatin plus paclitaxel and maintenance of bevacizumab on tumour progression, dissemination, and survival of ovarian carcinoma xenograft models.

Science.gov (United States)

Oliva, P; Decio, A; Castiglioni, V; Bassi, A; Pesenti, E; Cesca, M; Scanziani, E; Belotti, D; Giavazzi, R

2012-07-10

Bevacizumab is being incorporated as first-line therapy with standard-of-care chemotherapy on epithelial ovarian carcinoma (EOC). We investigated bevacizumab combined with chemotherapy on tumour progression and mouse survival in EOC xenograft models. Bevacizumab was administered concomitantly with cisplatin plus paclitaxel (DDP+PTX), continued after induction (maintenance) or started after chemotherapy. The effect on tumour progression was monitored by bioluminescence imaging (BLI) (1A9-luc xenograft). Tumour dissemination into the peritoneal organs and ascites formation (HOC22 xenograft) was evaluated by histological analysis at the end of treatment (interim) and at euthanasia (survival). The effects on overall survival (OS) were investigated in both EOC models. Bevacizumab with PTX+DDP delayed tumour progression in mice bearing EOC xenografts. OS was significantly extended, with complete responses, by bevacizumab continued after stopping chemotherapy in the HOC22 xenograft. Bevacizumab alone inhibited ascites formation, with only limited effect on tumour burden, but combined with PTX+DDP reduced ascites and metastases. Bevacizumab started after induction with PTX+DDP and maintained was equally effective on tumour progression and survival on 1A9-luc xenograft. Bevacizumab combined with chemotherapy not only affected tumour progression, but when administered as maintenance regimen significantly prolonged survival, reducing ascites, and tumour dissemination. We believe our findings are consistent with the clinical results and shed light on the potential effects of this kind of treatment on tumour progression.

5. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

Directory of Open Access Journals (Sweden)

Михаил Юрьевич Чернышов

2013-12-01

Full Text Available A software complex (SC), elaborated by the authors on the basis of the language LMPL and representing a software tool for the synthesis of applied software models and meta-models constructed on mathematical programming (MP) principles, is described. LMPL provides for an explicit form of declarative representation of MP-models, presumes automatic constructing and transformation of models, and allows external software packages to be added. The following software versions of the SC have been implemented: (1) a SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling) and (2) a SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

6. Club cell secretory protein improves survival in a murine obliterative bronchiolitis model.

Science.gov (United States)

Wendt, Christine; Tram, Kevin; Price, Andrew; England, Kristen; Stiehm, Andrew; Panoskaltsis-Mortari, Angela

2013-11-01

Club cell secretory protein (CCSP) is an indirect phospholipase A2 inhibitor with some immunosuppressive and antiproliferative properties that is expressed in bronchiolar Club cells. In our murine bone marrow transplant (BMT) model of obliterative bronchiolitis (OB), CCSP is diminished; however, its role is unknown. To determine the role of CCSP, B6 wild-type (WT) or CCSP-deficient (CCSP(-/-)) mice were lethally conditioned and given allogeneic bone marrow with a sublethal dose of allogeneic splenic T cells to induce OB. We found that CCSP(-/-) mice demonstrated a higher mortality following BMT-induced OB compared with WT mice. Mice were analyzed 60 days post-BMT for protein expression, pulmonary function, and histology. CCSP levels were reduced in WT mice with BMT-induced OB, and lower levels correlated with decreased lung compliance. CCSP(-/-) mice had a higher degree of injury and fibrosis as measured by hydroxyproline, along with increased lung resistance and the inflammatory markers leukotriene B4 and CXCL1. Replacement with recombinant intravenous CCSP partially reversed the weight loss and improved survival in the CCSP(-/-) mice. In addition, CCSP replacement improved histology and decreased inflammatory cells and markers. These findings indicate that CCSP has a regulatory role in OB and may have potential as a preventive therapy.

7. Intranasal Oncolytic Virotherapy with CXCR4-Enhanced Stem Cells Extends Survival in Mouse Model of Glioma.

Science.gov (United States)

Dey, Mahua; Yu, Dou; Kanojia, Deepak; Li, Gina; Sukhanova, Madina; Spencer, Drew A; Pituch, Katatzyna C; Zhang, Lingjiao; Han, Yu; Ahmed, Atique U; Aboody, Karen S; Lesniak, Maciej S; Balyasnikova, Irina V

2016-09-13

The challenges to effective drug delivery to brain tumors are twofold: (1) there is a lack of non-invasive methods of local delivery and (2) the blood-brain barrier limits systemic delivery. Intranasal delivery of therapeutics to the brain overcomes both challenges. In a mouse model of malignant glioma, we observed that a small fraction of intranasally delivered neural stem cells (NSCs) can migrate to the brain tumor site. Here, we demonstrate that hypoxic preconditioning or overexpression of CXCR4 significantly enhances the tumor-targeting ability of NSCs, but without altering their phenotype only in genetically modified NSCs. Modified NSCs deliver oncolytic virus to glioma more efficiently and extend the survival of experimental animals in the context of radiotherapy. Our findings indicate that intranasal delivery of stem cell-based therapeutics could be optimized for future clinical applications, allowing for safe and repeated administration of biological therapies to brain tumors and other CNS disorders. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

8. [Applying temporally-adjusted land use regression models to estimate ambient air pollution exposure during pregnancy].

Science.gov (United States)

Zhang, Y J; Xue, F X; Bai, Z P

2017-03-06

The impact of maternal air pollution exposure on offspring health has received much attention. Precise and feasible exposure estimation is particularly important for clarifying exposure-response relationships and reducing heterogeneity among studies. Temporally-adjusted land use regression (LUR) models are exposure assessment methods developed in recent years that have the advantage of having high spatial-temporal resolution. Studies on the health effects of outdoor air pollution exposure during pregnancy have been increasingly carried out using this model. In China, research applying LUR models was done mostly at the model construction stage, and findings from related epidemiological studies were rarely reported. In this paper, the sources of heterogeneity and research progress of meta-analysis research on the associations between air pollution and adverse pregnancy outcomes were analyzed. The methods of the characteristics of temporally-adjusted LUR models were introduced. The current epidemiological studies on adverse pregnancy outcomes that applied this model were systematically summarized. Recommendations for the development and application of LUR models in China are presented. This will encourage the implementation of more valid exposure predictions during pregnancy in large-scale epidemiological studies on the health effects of air pollution in China.

9. Explained variation in a fully specified model for data-grouped survival data.

Science.gov (United States)

Pipper, C B; Ritz, C; Scheike, T H

2011-12-01

An additive hazards model may be used to quantify the effect of genetic and environmental predictors on flowering of sugar beet plants recorded as data-grouped time-to-event data. Estimated predictor effects have an intuitive interpretation rooted in the underlying time dynamics of the flowering process. However, agricultural experiments are often designed using several plots containing a large number of plants that are subsequently being monitored. In this article, we consider an additive hazards model with an additional plot structure induced by latent shared frailty variables. This approach enables us to derive a method to assess the quality of predictors in terms of how much plot variation they explain. We apply the method to a large data set exploring flowering of sugar beet and conclude that the genetic predictor biotype, which has a strong effect, also explains a substantial amount of the plot variation. The method is also applied to a data set from medical research concerning days to virus positivity of serum samples in AIDS patients. © 2011, The International Biometric Society.
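The additive hazards structure described above, h(t|x) = b0(t) + Σj bj(t)·xj, and the survival probability it implies can be sketched in a few lines; the baseline and covariate-effect functions below are purely illustrative, not the fitted functions from the study:

```python
import math

def additive_hazard(t, x, baseline, effects):
    """Aalen-type additive hazard: h(t | x) = b0(t) + sum_j b_j(t) * x_j."""
    return baseline(t) + sum(b(t) * xj for b, xj in zip(effects, x))

def survival_prob(t, x, baseline, effects, steps=2000):
    """S(t | x) = exp(-integral of h from 0 to t), trapezoidal approximation."""
    h = lambda u: additive_hazard(u, x, baseline, effects)
    dt = t / steps
    integral = sum((h(i * dt) + h((i + 1) * dt)) / 2 * dt for i in range(steps))
    return math.exp(-integral)

# Illustrative: constant baseline hazard plus one time-increasing covariate effect
b0 = lambda t: 0.02          # baseline hazard per unit time (hypothetical)
b1 = lambda t: 0.01 * t      # covariate effect that grows over time (hypothetical)
print(survival_prob(10.0, [1.0], b0, [b1]))
```

With these functions the cumulative hazard at t = 10 is 0.2 + 0.5 = 0.7, so S(10) = exp(-0.7) ≈ 0.497.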

10. A new formalism for modelling parameters α and β of the linear-quadratic model of cell survival for hadron therapy

Science.gov (United States)

Vassiliev, Oleg N.; Grosshans, David R.; Mohan, Radhe

2017-10-01

We propose a new formalism for calculating parameters α and β of the linear-quadratic model of cell survival. This formalism, primarily intended for calculating relative biological effectiveness (RBE) for treatment planning in hadron therapy, is based on a recently proposed microdosimetric revision of the single-target multi-hit model. The main advantage of our formalism is that it reliably produces α and β that have correct general properties with respect to their dependence on physical properties of the beam, including the asymptotic behavior for very low and high linear energy transfer (LET) beams. For example, in the case of monoenergetic beams, our formalism predicts that, as a function of LET, (a) α has a maximum and (b) the α/β ratio increases monotonically with increasing LET. No prior models reviewed in this study predict both properties (a) and (b) correctly, and therefore, these prior models are valid only within a limited LET range. We first present our formalism in a general form, for polyenergetic beams. A significant new result in this general case is that parameter β is represented as an average over the joint distribution of energies E1 and E2 of two particles in the beam. This result is consistent with the role of the quadratic term in the linear-quadratic model. It accounts for the two-track mechanism of cell kill, in which two particles, one after another, damage the same site in the cell nucleus. We then present simplified versions of the formalism, and discuss predicted properties of α and β. Finally, to demonstrate consistency of our formalism with experimental data, we apply it to fit two sets of experimental data: (1) α for heavy ions, covering a broad range of LETs, and (2) β for protons. In both cases, good agreement is achieved.
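The survival curve that the parameters α and β feed into is the standard linear-quadratic form S(D) = exp(-αD - βD²). A minimal sketch with hypothetical parameter values (the paper's formalism derives α and β from beam properties, which is not reproduced here):

```python
import math

def lq_surviving_fraction(dose_gy, alpha, beta):
    """Linear-quadratic model of cell survival: S(D) = exp(-alpha*D - beta*D^2)."""
    return math.exp(-alpha * dose_gy - beta * dose_gy ** 2)

# Hypothetical parameters; the alpha/beta ratio (here 3 Gy) controls curve shape
alpha, beta = 0.15, 0.05  # units: Gy^-1 and Gy^-2
for d in (0.0, 2.0, 4.0, 8.0):
    print(f"D = {d:>3} Gy  S = {lq_surviving_fraction(d, alpha, beta):.4f}")
```

The linear term dominates at low doses and the quadratic (two-track) term at high doses, which is why the α/β ratio is the clinically quoted summary of curve shape.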

11. Positive end-expiratory pressure improves survival in a rodent model of cardiopulmonary resuscitation using high-dose epinephrine.

LENUS (Irish Health Repository)

McCaul, Conán

2009-10-01

Multiple interventions have been tested in models of cardiopulmonary resuscitation (CPR) to optimize drug use, chest compressions, and ventilation. None has studied the effects of positive end-expiratory pressure (PEEP) on outcome. We hypothesized that because PEEP can reverse pulmonary atelectasis, lower pulmonary vascular resistance, and potentially improve cardiac output, its use during CPR would increase survival.

12. External validation of a model to predict the survival of patients presenting with a spinal epidural metastasis

NARCIS (Netherlands)

Bartels, R.H.M.A.; Feuth, T.; Rades, D.; Hedlund, R.; Villas, C.; Linden, Y. van der; Borm, W.; Kappelle, A.C.; Maazen, R.W. van der; Grotenhuis, J.A.; Verbeek, A.L.M.

2011-01-01

The surgical treatment of spinal metastases is evolving. The major problem is the selection of patients who may benefit from surgical treatment. One of the criteria is an expected survival of at least 3 months. A prediction model has been previously developed. The present study has been performed in

13. Individual-tree basal area growth, survival, and total height models for upland hardwoods in the Boston Mountains of Arkansas

Science.gov (United States)

Paul A. Murphy; David L. Graney

1988-01-01

Models were developed for individual-tree basal area growth, survival, and total heights for different species of upland hardwoods in the Boston Mountains of north Arkansas. Data used were from 87 permanent plots located in an array of different sites and stand ages; the plots were thinned to different stocking levels and included unthinned controls. To test these...

14. Anti-CD45 radioimmunotherapy using 211At with bone marrow transplantation prolongs survival in a disseminated murine leukemia model

Energy Technology Data Exchange (ETDEWEB)

Orozco, Johnnie J.; Back, Tom; Kenoyer, Aimee L.; Balkin, Ethan R.; Hamlin, Donald K.; Wilbur, D. Scott; Fisher, Darrell R.; Frayo, Shani; Hylarides, Mark; Green, Damian J.; Gopal, Ajay K.; Press, Oliver W.; Pagel, John M.

2013-05-15

Despite aggressive chemotherapy combined with hematopoietic cell transplant (HCT), many patients with acute myeloid leukemia (AML) relapse. Radioimmunotherapy (RIT) using antibodies (Ab) labeled primarily with beta-emitting radionuclides has been explored to reduce relapse.

15. Classification Models to Predict Survival of Kidney Transplant Recipients Using Two Intelligent Techniques of Data Mining and Logistic Regression.

Science.gov (United States)

Nematollahi, M; Akbari, R; Nikeghbalian, S; Salehnasab, C

2017-01-01

Kidney transplantation is the treatment of choice for patients with end-stage renal disease (ESRD). Prediction of the transplant survival is of paramount importance. The objective of this study was to develop a model for predicting survival in kidney transplant recipients. In a cross-sectional study, 717 patients with ESRD admitted to Nemazee Hospital during 2008-2012 for renal transplantation were studied and the transplant survival was predicted for 5 years. The multilayer perceptron of artificial neural networks (MLP-ANN), logistic regression (LR), Support Vector Machine (SVM), and evaluation tools were used to verify the determinant models of the predictions and determine the independent predictors. The accuracy, area under curve (AUC), sensitivity, and specificity of SVM, MLP-ANN, and LR models were 90.4%, 86.5%, 98.2%, and 49.6%; 85.9%, 76.9%, 97.3%, and 26.1%; and 84.7%, 77.4%, 97.5%, and 17.4%, respectively. Meanwhile, the independent predictors were discharge time creatinine level, recipient age, donor age, donor blood group, cause of ESRD, recipient hypertension after transplantation, and duration of dialysis before transplantation. SVM and MLP-ANN models could efficiently be used for determining survival prediction in kidney transplant recipients.
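The accuracy, sensitivity, and specificity figures reported above follow directly from a model's confusion matrix. A minimal sketch of that calculation (the counts below are hypothetical, not the study's data):

```python
def classification_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity (recall), and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)    # true positive rate
    specificity = tn / (tn + fp)    # true negative rate
    return accuracy, sensitivity, specificity

# Hypothetical confusion matrix for a binary transplant-survival classifier
acc, sens, spec = classification_metrics(tp=550, fp=60, tn=80, fn=27)
print(f"accuracy={acc:.3f} sensitivity={sens:.3f} specificity={spec:.3f}")
```

The pattern in the abstract (high sensitivity, low specificity) typically arises when the positive class dominates, as with mostly surviving grafts; AUC is then the more informative comparison than raw accuracy.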

16. Low dose radiation risks for women surviving the a-bombs in Japan: generalized additive model.

Science.gov (United States)

Dropkin, Greg

2016-11-24

Analyses of cancer mortality and incidence in Japanese A-bomb survivors have been used to estimate radiation risks, which are generally higher for women. Relative Risk (RR) is usually modelled as a linear function of dose. Extrapolation from data including high doses predicts small risks at low doses. Generalized Additive Models (GAMs) are flexible methods for modelling non-linear behaviour. GAMs are applied to cancer incidence in female low dose subcohorts, using anonymous public data for the 1958 - 1998 Life Span Study, to test for linearity, explore interactions, adjust for the skewed dose distribution, examine significance below 100 mGy, and estimate risks at 10 mGy. For all solid cancer incidence, RR estimated from 0 - 100 mGy and 0 - 20 mGy subcohorts is significantly raised. The response tapers above 150 mGy. At low doses, RR increases with age-at-exposure and decreases with time-since-exposure, the preferred covariate. Using the empirical cumulative distribution of dose improves model fit, and capacity to detect non-linear responses. RR is elevated over wide ranges of covariate values. Results are stable under simulation, or when removing exceptional data cells, or adjusting neutron RBE. Estimates of Excess RR at 10 mGy using the cumulative dose distribution are 10 - 45 times higher than extrapolations from a linear model fitted to the full cohort. Below 100 mGy, quasipoisson models find significant effects for all solid, squamous, uterus, corpus, and thyroid cancers, and for respiratory cancers when age-at-exposure > 35 yrs. Results for the thyroid are compatible with studies of children treated for tinea capitis, and Chernobyl survivors. Results for the uterus are compatible with studies of UK nuclear workers and the Techa River cohort. Non-linear models find large, significant cancer risks for Japanese women exposed to low dose radiation from the atomic bombings. The risks should be reflected in protection standards.
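The gap between a full-cohort linear extrapolation and a low-dose fit can be illustrated with the standard excess-relative-risk form RR(d) = 1 + β·d; the slopes below are hypothetical, chosen only to show how a low-dose estimate can be many times the extrapolated one:

```python
def relative_risk_linear(dose_gy, err_per_gy):
    """Linear excess-relative-risk model: RR(d) = 1 + beta * d."""
    return 1.0 + err_per_gy * dose_gy

dose = 0.010              # 10 mGy expressed in Gy
beta_full_cohort = 0.5    # hypothetical ERR/Gy from a fit to the full dose range
beta_low_dose = 10.0      # hypothetical ERR/Gy from a low-dose subcohort fit

err_full = relative_risk_linear(dose, beta_full_cohort) - 1.0
err_low = relative_risk_linear(dose, beta_low_dose) - 1.0
print(f"ERR at 10 mGy: full-cohort {err_full:.4f}, low-dose {err_low:.4f}")
```

With these illustrative slopes the low-dose estimate is 20 times the extrapolated one, the same kind of discrepancy (10-45x in the paper) that motivates the non-linear GAM analysis.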

17. Addressing dependability by applying an approach for model-based risk assessment

Energy Technology Data Exchange (ETDEWEB)

Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

2007-11-15

This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as a part of a trust case development.

18. Dynamics modeling and vibration analysis of a piezoelectric diaphragm applied in valveless micropump

Science.gov (United States)

He, Xiuhua; Xu, Wei; Lin, Nan; Uzoejinwa, B. B.; Deng, Zhidan

2017-09-01

This paper presents the dynamical model involved with the load of fluid pressure, electric-solid coupling simulation, and experimental performance of the piezoelectric diaphragm fabricated and applied in a valveless micropump. The model is based on the theory of plates and shells with small deflection, considering the two-layer structure of piezoelectric ceramic and elastic substrate. The high-order non-homogeneous vibration equation of the piezoelectric diaphragm, derived in the course of the study, was solved by being divided into a homogeneous Bessel equation and a non-homogeneous static equation according to the superposition principle. The amplitude of the piezoelectric diaphragm driven by a sinusoidal voltage against the load of fluid pressure was obtained from the solution of the vibration equation. Also, finite element simulation of the electric-solid coupling between the displacement of the piezoelectric diaphragm due to an applied voltage and the resulting deformation of the membrane was considered. The simulation result showed that the maximum deflection of the diaphragm is 9.51 μm at a quarter cycle time when a peak-to-peak voltage of 150 V with a frequency of 90 Hz is applied, and the displacement distribution along the direction of the radius was demonstrated. Experiments were performed to verify the predictions of the dynamic modeling and the coupling simulation; the experimental data showed good agreement with the dynamical model and simulation.

19. The Motivational Knowledge Management Model: proposal to apply it in the library sector

Directory of Open Access Journals (Sweden)

Daniel López-Fernández

2016-12-01

Full Text Available In professional environments, attention paid to aspects such as supervisory styles, interpersonal relationships and workers' eagerness can have a positive impact on employee motivation and, consequently, on their performance and well-being. To achieve this, knowledge management models such as those presented here can be applied. This model generates diagnoses of motivation and recommendations for improvement, both systematically and scientifically. Consequently, it is especially useful for managers and human resource departments. The proposed model can be adapted to different kinds of professional groups, including those in library and documentation services. The suitability, reliability and usefulness of the proposed model have been empirically checked through case studies with 92 students and 166 professionals. The positive results allow us to conclude that the model is effective and useful for assessing and improving motivation.

20. Predictive Modeling of Influenza Shows the Promise of Applied Evolutionary Biology.

Science.gov (United States)

Morris, Dylan H; Gostic, Katelyn M; Pompei, Simone; Bedford, Trevor; Łuksza, Marta; Neher, Richard A; Grenfell, Bryan T; Lässig, Michael; McCauley, John W

2017-10-30

Seasonal influenza is controlled through vaccination campaigns. Evolution of influenza virus antigens means that vaccines must be updated to match novel strains, and vaccine effectiveness depends on the ability of scientists to predict nearly a year in advance which influenza variants will dominate in upcoming seasons. In this review, we highlight a promising new surveillance tool: predictive models. Developed through data-sharing and close collaboration between the World Health Organization and academic scientists, these models use surveillance data to make quantitative predictions regarding influenza evolution. Predictive models demonstrate the potential of applied evolutionary biology to improve public health and disease control. We review the state of influenza predictive modeling and discuss next steps and recommendations to ensure that these models deliver upon their considerable biomedical promise. Copyright © 2017 Elsevier Ltd. All rights reserved.

1. Positive Mathematical Programming Approaches – Recent Developments in Literature and Applied Modelling

Directory of Open Access Journals (Sweden)

Thomas Heckelei

2012-05-01

Full Text Available This paper reviews and discusses the more recent literature and application of Positive Mathematical Programming in the context of agricultural supply models. Specifically, advances in the empirical foundation of parameter specifications as well as the economic rationalisation of PMP models – both criticized in earlier reviews – are investigated. Moreover, the paper provides an overview of a larger set of models with regular/repeated policy application that apply variants of PMP. Results show that most applications today avoid arbitrary parameter specifications and rely on exogenous information on supply responses to calibrate model parameters. However, only a few approaches use multiple observations to estimate parameters, which is likely due to the still considerable technical challenges associated with it. Equally, we found only limited reflection on the behavioral or technological assumptions that could rationalise the PMP model structure while still keeping the model's advantages.

2. Modelling of composite concrete block pavement systems applying a cohesive zone model

DEFF Research Database (Denmark)

Skar, Asmus; Poulsen, Peter Noe

This paper presents a numerical analysis of the fracture behaviour of the cement bound base material in composite concrete block pavement systems, using a cohesive zone model. The functionality of the proposed model is tested on experimental and numerical investigations of beam bending tests. ... The pavement is modelled as a simple slab on grade structure and parameters influencing the response, such as analysis technique, geometry and material parameters, are studied. Moreover, the analysis is extended to a real scale example, modelling the pavement as a three-layered structure. It is found ... block pavements. It is envisaged that the methodology implemented in this study can be extended and thereby contribute to the ongoing development of rational failure criteria that can replace the empirical formulas currently used in pavement engineering. ...

3. Development and internal validation of a prognostic model to predict recurrence free survival in patients with adult granulosa cell tumors of the ovary

NARCIS (Netherlands)

van Meurs, Hannah S.; Schuit, Ewoud; Horlings, Hugo M.; van der Velden, Jacobus; van Driel, Willemien J.; Mol, Ben Willem J.; Kenter, Gemma G.; Buist, Marrije R.

2014-01-01

Models to predict the probability of recurrence free survival exist for various types of malignancies, but a model for recurrence free survival in individuals with an adult granulosa cell tumor (GCT) of the ovary is lacking. We aimed to develop and internally validate such a prognostic model. We

4. A stereovision model applied in bio-micromanipulation system based on stereo light microscope.

Science.gov (United States)

Wang, Yuezong

2017-12-01

A bio-micromanipulation system is designed for manipulating micro-objects with a length scale of tens or hundreds of microns based on a stereo light microscope. The world coordinate reconstruction of points on the surface of micro-objects is an important goal for the micromanipulation. The traditional pinhole camera model is widely applied in macro-scale computer vision. However, this model outputs data with considerable error if it is directly used to reconstruct three-dimensional world coordinates for a stereo light microscope. Therefore, a novel and improved pinhole camera model applied in the bio-micromanipulation system is proposed in this article. The new model is composed of a binocular-pinhole model and an error-correction model. The binocular-pinhole model is used to output the basic world coordinates. The error-correction model is used to correct the errors from the basic world coordinates and outputs the final high-precision world coordinates. The results show that the new model achieves a precision of 0.01 mm in the X direction, 0.01 mm in the Y direction, and 0.015 mm in the Z direction within a maximum reconstruction distance of 4.1 mm in the X direction, 2.9 mm in the Y direction, and 2.25 mm in the Z direction, and that the traditional pinhole camera model achieves a lower and unsatisfactory precision of about 0.1 mm. © 2017 Wiley Periodicals, Inc.

5. Test of a trust and confidence model in the applied context of electromagnetic field (EMF) risks.

Science.gov (United States)

Siegrist, Michael; Earle, Timothy C; Gutscher, Heinz

2003-08-01

Trust is an important factor in risk management. There is little agreement among researchers, however, on how trust in risk management should be studied. Based on a comprehensive review of the trust literature a "dual-mode model of social trust and confidence" is proposed. Trust and confidence are separate but, under some circumstances, interacting sources of cooperation. Trust is based on value similarity, and confidence is based on performance. According to our model, judging similarity between an observer's currently active values and the values attributed to others determines social trust. Thus, the basis for trust is a judgment that the person to be trusted would act as the trusting person would. Interpretation of the other's performance influences confidence. Both social trust and confidence have an impact on people's willingness to cooperate (e.g., accept electromagnetic fields or EMF in the neighborhood). The postulated model was tested in the applied context of EMF risks. Structural equation modeling procedures and data from a random sample of 1,313 Swiss citizens between 18 and 74 years old were used. Results indicated that after minor modifications the model explained the data very well. In the applied context of EMF risks, both trust and confidence had an impact on cooperation. Results suggest that the dual-mode model of social trust and confidence could be used as a common framework in the field of trust and risk management. Practical implications of the results are discussed.

6. Applying risk and resilience models to predicting the effects of media violence on development.

Science.gov (United States)

Prot, Sara; Gentile, Douglas A

2014-01-01

Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective.

7. Applying trait-based models to achieve functional targets for theory-driven ecological restoration.

Science.gov (United States)

Laughlin, Daniel C

2014-07-01

Manipulating community assemblages to achieve functional targets is a key component of restoring degraded ecosystems. The response-and-effect trait framework provides a conceptual foundation for translating restoration goals into functional trait targets, but a quantitative framework has been lacking for translating trait targets into assemblages of species that practitioners can actually manipulate. This study describes new trait-based models that can be used to generate ranges of species abundances to test theories about which traits, which trait values and which species assemblages are most effective for achieving functional outcomes. These models are generalisable, flexible tools that can be widely applied across many terrestrial ecosystems. Examples illustrate how the framework generates assemblages of indigenous species to (1) achieve desired community responses by applying the theories of environmental filtering, limiting similarity and competitive hierarchies, or (2) achieve desired effects on ecosystem functions by applying the theories of mass ratios and niche complementarity. Experimental applications of this framework will advance our understanding of how to set functional trait targets to achieve the desired restoration goals. A trait-based framework provides restoration ecology with a robust scaffold on which to apply fundamental ecological theory to maintain resilient and functioning ecosystems in a rapidly changing world. © 2014 John Wiley & Sons Ltd/CNRS.

8. Mathematical Modeling Applied to Drilling Engineering: An Application of Bourgoyne and Young ROP Model to a Presalt Case Study

Directory of Open Access Journals (Sweden)

Andreas Nascimento

2015-01-01

Full Text Available Several mathematical ROP models have been developed in the petroleum industry over the last five decades, ranging from rather simple but less reliable R-W-N (drilling rate, weight on bit, and rotary speed) formulations to more comprehensive and complete approaches such as the Bourgoyne and Young ROP model (BYM), widely used in the petroleum industry. The paper emphasizes the BYM formulation, how it is applied in terms of ROP modeling, identifies the main drilling parameters driving each subfunction, and introduces how they were developed; it also addresses the normalization factors and modeling coefficients that have a significant influence on the model. The present work details three simulations aimed at understanding the approach by applying the formulation to a presalt layer, and at assessing how modifications of the main method may impact the fitting process. The simulation runs show that relative error measures can be seen as a more reliable fitting verification than R-squared alone. Applying normalization factors and allowing a wider range of applicable drillability coefficients improved the regression fit of the simulation to real data from 54% to 73%, an improvement of roughly 20 percentage points.
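
The BYM rate of penetration is usually written as an exponential of a linear combination of eight subfunctions, ROP = exp(a1 + sum of a_j*x_j for j = 2..8). A minimal sketch of that functional form, with purely hypothetical coefficients and normalized subfunction values (the real x_j are computed from drilling parameters not reproduced here):

```python
import math

def bym_rop(a, x):
    """Rate of penetration from the Bourgoyne & Young functional form.

    a: eight regression coefficients a1..a8
    x: seven subfunction values x2..x8 (formation strength, compaction,
       pressure differential, bit weight, rotary speed, bit wear,
       hydraulics) -- illustrative placeholders here, not field data.
    """
    return math.exp(a[0] + sum(aj * xj for aj, xj in zip(a[1:], x)))

# hypothetical coefficients and normalized subfunction values
a = [1.2, 0.5, 0.3, 0.2, 0.4, 0.3, -0.2, 0.1]
x = [0.8, 0.6, 0.5, 0.7, 0.5, 0.3, 0.9]
rop = bym_rop(a, x)
```

Fitting the coefficients a_j to field data is a regression problem; the article's point is that normalization factors and a wider admissible range for the drillability coefficients change how well that regression fits.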

9. Applying Infinite State Model Checking and Other Analysis Techniques to Tabular Requirements Specifications of Safety-Critical Systems

National Research Council Canada - National Science Library

Bultan, Tevfik; Heitmeyer, Constance

2006-01-01

Although it is most often applied to finite state models, in recent years, symbolic model checking has been extended to infinite state models using symbolic representations that encode infinite sets...

10. Gene–gene interaction analysis for the survival phenotype based on the Cox model

OpenAIRE

Lee, Seungyeoun; Kwon, Min-Seok; Oh, Jung Mi; Park, Taesung

2012-01-01

Motivation: For the past few decades, many statistical methods in genome-wide association studies (GWAS) have been developed to identify SNP–SNP interactions for case-control studies. However, there has been less work for prospective cohort studies, involving the survival time. Recently, Gui et al. (2011) proposed a novel method, called Surv-MDR, for detecting gene–gene interactions associated with survival time. Surv-MDR is an extension of the multifactor dimensionality reduction (MDR) metho...

11. Comparison of Cox Model and K-Nearest Neighbor to Estimation of Survival in Kidney Transplant Patients

Directory of Open Access Journals (Sweden)

2016-01-01

Full Text Available Introduction & Objective: The Cox model is a common method for estimating survival, and the validity of its results depends on the proportional hazards assumption. K-nearest neighbor is a nonparametric method for estimating survival probability in heterogeneous communities. The purpose of this study was to compare the performance of the k-nearest neighbor method (k-NN) with the Cox model. Materials & Methods: This retrospective cohort study was conducted in Hamadan Province on 475 patients who had undergone kidney transplantation from 1994 to 2011. Data were extracted from patients' medical records using a checklist. The duration between kidney transplantation and rejection was considered the survival time. The Cox model and the k-nearest neighbor method were used for data modeling, and the Brier score prediction error was used to compare the performance of the models. Results: Out of 475 transplantations, 55 episodes of rejection occurred. The 5-, 10- and 15-year survival rates of transplantation were 91.70%, 84.90% and 74.50%, respectively. The optimal number of neighbors, determined by cross-validation, was 45. The cumulative Brier scores of the k-NN algorithm for t = 5, 10 and 15 years were 0.003, 0.006 and 0.007, respectively; the cumulative Brier scores of the Cox model for the same horizons were 0.036, 0.058 and 0.058. The prediction error of the k-NN algorithm was lower than that of the Cox model at all three horizons, showing that the k-NN method outperforms it. Conclusions: The results of this study show that k-NN predictions have higher accuracy than the Cox model when the sample size and the number of predictor variables are high. Sci J Hamadan Univ Med Sci. 2016; 22(4): 300-308
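
The Brier score used to compare the two models is the mean squared difference between the predicted survival probability at a horizon t and the observed status at t. A minimal sketch on hypothetical data; it skips the inverse-probability-of-censoring weights that a full analysis such as the article's would require:

```python
def brier_score(pred_surv, event_time, event, t):
    """Empirical Brier score at horizon t (simplified: no IPCW
    censoring weights).

    pred_surv:  predicted P(T > t) for each subject
    event_time: observed follow-up time
    event:      1 if failure (rejection) was observed, 0 if censored
    """
    total = 0.0
    n = 0
    for p, time, d in zip(pred_surv, event_time, event):
        if time <= t and d == 1:
            # failure before t: ideal prediction would be 0
            total += (0.0 - p) ** 2
            n += 1
        elif time > t:
            # still event-free at t: ideal prediction would be 1
            total += (1.0 - p) ** 2
            n += 1
        # subjects censored before t are skipped in this sketch
    return total / n

# hypothetical predictions and outcomes for four patients
preds = [0.9, 0.8, 0.4, 0.2]
times = [10, 3, 8, 2]
events = [0, 1, 0, 1]
bs = brier_score(preds, times, events, t=5)
```

Lower is better; a model predicting high survival for actual survivors and low survival for early failures scores near zero.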

12. Bang-Bang Control Applied on an HIV-1 within host Model

Directory of Open Access Journals (Sweden)

A. Rahmoun

2016-03-01

Full Text Available Local controllability of an HIV infection model with three effective controls is investigated. The optimal control policy, which minimizes the number of infected cells and the number of free virus while maximizing the number of healthy cells, is formulated and solved as an optimal bang-bang control problem (all-or-nothing command), first for each control separately and then for all controls applied at once. Numerical examples are given to illustrate the obtained results.
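
Bang-bang control means the control only ever takes its extreme values (here a drug efficacy u in {0, 1}). A toy forward-Euler simulation of a standard three-compartment within-host model illustrates the idea; all parameter values and the switching schedule are hypothetical, not taken from the article:

```python
def simulate(u_schedule, dt=0.01, steps=1000):
    """Toy within-host dynamics (hypothetical parameters):
         T' = s - d*T - (1-u)*beta*T*V    healthy cells
         I' = (1-u)*beta*T*V - delta*I    infected cells
         V' = p*I - c*V                   free virus
       u in {0, 1} is the all-or-nothing drug efficacy."""
    s, d, beta, delta, p, c = 10.0, 0.01, 2e-4, 0.5, 5.0, 3.0
    T, I, V = 500.0, 10.0, 100.0
    for k in range(steps):
        u = u_schedule(k * dt)
        dT = s - d * T - (1 - u) * beta * T * V
        dI = (1 - u) * beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
    return T, I, V

# bang-bang schedule: full treatment for the first 5 time units, then off
T_on, I_on, V_on = simulate(lambda t: 1 if t < 5 else 0)
# no treatment at all, for comparison
T_off, I_off, V_off = simulate(lambda t: 0)
```

The comparison run shows the qualitative effect the objective rewards: more healthy cells and fewer infected cells under the on/off policy than with no control.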

13. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

OpenAIRE

Shrirang Ambaji KULKARNI; Raghavendra G. RAO

2017-01-01

Routing data packets in a dynamic network is a difficult and important problem in computer networks. Because the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth constraints. Existing shortest-path algorithms fail to converge to better solutions under dynamic network conditions, whereas reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique ...

14. Applying nonlinear MODM model to supply chain management with quantity discount policy under complex fuzzy environment

Directory of Open Access Journals (Sweden)

Zhe Zhang

2014-06-01

Full Text Available Purpose: The aim of this paper is to deal with supply chain management (SCM) with a quantity discount policy under a complex fuzzy environment, characterized by bi-fuzzy variables. Taking into account the strategy and the process of decision making, a bi-fuzzy nonlinear multiple objective decision making (MODM) model is presented to solve the proposed problem. Design/methodology/approach: The bi-fuzzy variables in the MODM model are transformed into trapezoidal fuzzy variables by the DMs' degrees of optimism ?1 and ?2, and subsequently de-fuzzified by the expected value index. For solving the complex nonlinear model, a multi-objective adaptive particle swarm optimization algorithm (MO-APSO) is designed as the solution method. Findings: The proposed model and algorithm are applied to a typical example of an SCM problem to illustrate their effectiveness. Based on the sensitivity analysis of the results, the bi-fuzzy nonlinear MODM SCM model is shown to be sensitive to the possibility level ?1. Practical implications: The study focuses on SCM under a complex fuzzy environment, which has great practical significance. The bi-fuzzy MODM model and MO-APSO can therefore be further applied to SCM problems with quantity discount policies. Originality/value: The bi-fuzzy variable is employed in the nonlinear MODM model of SCM to characterize the hybrid uncertain environment, and this work is original. In addition, a hybrid crisp approach is proposed to transform the model into an equivalent crisp one by means of the DMs' degree of optimism and the expected value index. Because the MODM model considers both the bi-fuzzy environment and the quantity discount policy, this work has great practical significance.
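
The expected value index used for de-fuzzification has a simple closed form for a trapezoidal fuzzy variable (a, b, c, d): E = (a + b + c + d) / 4 (Liu's credibility-based expected value). A one-function sketch with hypothetical numbers:

```python
def trapezoidal_expected_value(a, b, c, d):
    """Expected value of a trapezoidal fuzzy variable (a, b, c, d)
    under the credibility-based expected value operator:
    E = (a + b + c + d) / 4."""
    return (a + b + c + d) / 4.0

# hypothetical bi-fuzzy demand, already reduced to a trapezoid
# by a chosen degree of optimism:
ev = trapezoidal_expected_value(80, 100, 120, 140)
```

Replacing each fuzzy coefficient by such an expected value is what turns the bi-fuzzy MODM model into a crisp nonlinear program that MO-APSO can search.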

15. NTCP modelling of lung toxicity after SBRT comparing the universal survival curve and the linear quadratic model for fractionation correction.

Science.gov (United States)

Wennberg, Berit M; Baumann, Pia; Gagliardi, Giovanna; Nyman, Jan; Drugge, Ninni; Hoyer, Morten; Traberg, Anders; Nilsson, Kristina; Morhed, Elisabeth; Ekberg, Lars; Wittgren, Lena; Lund, Jo-Åsmund; Levin, Nina; Sederholm, Christer; Lewensohn, Rolf; Lax, Ingmar

2011-05-01

In SBRT of lung tumours no established relationship between dose-volume parameters and the incidence of lung toxicity has been found. The aim of this study is to compare the LQ model and the universal survival curve (USC) for calculating biologically equivalent doses in SBRT, to see whether this improves knowledge of this relationship. Toxicity data on radiation pneumonitis grade 2 or more (RP2+) from 57 patients were used; 10.5% were diagnosed with RP2+. The lung DVHs were corrected for fractionation (LQ and USC) and analysed with the Lyman-Kutcher-Burman (LKB) model. In the LQ correction α/β = 3 Gy was used, and the USC parameters were: α/β = 3 Gy, D(0) = 1.0 Gy, [Formula: see text] = 10, α = 0.206 Gy(-1) and d(T) = 5.8 Gy. In order to understand the relative contribution of different dose levels to the calculated NTCP, the concept of fractional NTCP was used. This may give insight into the question of whether "high doses to small volumes" or "low doses to large volumes" are most important for lung toxicity. NTCP analysis with the LKB model using parameters m = 0.4, D(50) = 30 Gy resulted in a volume-dependence parameter (n) of n = 0.87 with LQ correction and n = 0.71 with USC correction. Using parameters m = 0.3, D(50) = 20 Gy gave n = 0.93 with LQ correction and n = 0.83 with USC correction. In SBRT of lung tumours, NTCP modelling of lung toxicity comparing models (LQ, USC) for fractionation correction shows that low doses contribute less and high doses more to the NTCP when using the USC model. Comparing NTCP modelling of SBRT data with data from breast cancer, lung cancer and whole-lung irradiation implies that the response of the lung is treatment specific. More data are needed, however, for more reliable modelling.
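
The LKB computation reduces a (fractionation-corrected) DVH to a generalized equivalent uniform dose, gEUD = (sum of v_i * D_i^(1/n))^n, and maps it through a normal CDF: NTCP = Phi((gEUD - D50)/(m * D50)). A minimal sketch using parameter values quoted in the abstract; the DVH itself is made up:

```python
import math

def lkb_ntcp(dvh, n, m, d50):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.

    dvh: list of (dose_Gy, volume_fraction) bins, fractions summing to 1
    n:   volume-dependence parameter; m: slope; d50: 50% tolerance dose
    """
    geud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
    t = (geud - d50) / (m * d50)
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# uniform 30 Gy to the whole lung with the abstract's LQ-corrected
# parameters (m = 0.4, D50 = 30 Gy, n = 0.87) lands exactly at NTCP = 0.5
ntcp_at_d50 = lkb_ntcp([(30.0, 1.0)], n=0.87, m=0.4, d50=30.0)
```

A larger n makes gEUD (and hence NTCP) more sensitive to low doses spread over large volumes, which is exactly the quantity the LQ-versus-USC comparison shifts.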

16. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy

Science.gov (United States)

Nordstrom, D. Kirk

2012-01-01

Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

17. Applying global sensitivity analysis to the modelling of flow and water quality in sewers.

Science.gov (United States)

Gamerith, Valentin; Neumann, Marc B; Muschalla, Dirk

2013-09-01

While several approaches for global sensitivity analysis (GSA) have been proposed in the literature, only a few applications exist in urban drainage modelling. This contribution discusses two GSA methods applied to a sewer flow and sewer water quality model: Standardised Regression Coefficients (SRCs) using Monte-Carlo simulation, and the Morris screening method. For selected model variables we evaluate how the sensitivities are influenced by the choice of the rainfall event. The aims are to i) compare both methods concerning the similarity of results and their applicability, ii) discuss the implications for factor fixing (identifying non-influential parameters) and factor prioritisation (identifying important parameters) and iii) rank the important parameters for the investigated model. It was shown that both methods lead to similar results for the hydraulic model. Parameter interactions and non-linearity were identified for the water quality model, and the parameter ranking differs between the methods. For the investigated model the results allow a sound choice of output variables and rainfall events in view of detailed uncertainty analysis or model calibration. We advocate the simultaneous use of both methods for a first model assessment, as together they answer both factor fixing and factor prioritisation at low computational cost. Copyright © 2013 Elsevier Ltd. All rights reserved.
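
Standardised Regression Coefficients come from regressing Monte-Carlo model output on the sampled inputs; for independent inputs and a near-linear model, SRC_i equals the correlation between input i and the output, and SRC_i squared approximates that input's share of the output variance. A self-contained sketch on a made-up three-input model, where the third input is deliberately non-influential (the factor-fixing case):

```python
import math
import random

def src(model, dists, n=4000, seed=1):
    """Monte-Carlo Standardised Regression Coefficients for a model
    with independent inputs (SRC_i = corr(x_i, y) in that case)."""
    rng = random.Random(seed)
    xs = [[d(rng) for d in dists] for _ in range(n)]
    ys = [model(x) for x in xs]
    my = sum(ys) / n
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    out = []
    for i in range(len(dists)):
        xi = [x[i] for x in xs]
        mi = sum(xi) / n
        si = math.sqrt(sum((v - mi) ** 2 for v in xi) / n)
        cov = sum((v - mi) * (y - my) for v, y in zip(xi, ys)) / n
        out.append(cov / (si * sy))
    return out

# toy stand-in for a sewer model: y = 3*x0 + x1, with x2 irrelevant
coeffs = src(lambda x: 3 * x[0] + x[1], [lambda r: r.random()] * 3)
```

For this linear toy, SRC_0 is about 3/sqrt(10) and SRC_2 about 0, so x2 could be fixed and x0 prioritised, mirroring the factor fixing/prioritisation distinction in the abstract.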

18. Nonlinear models applied to seed germination of Rhipsalis cereuscula Haw (Cactaceae

Directory of Open Access Journals (Sweden)

Terezinha Aparecida Guedes

2014-09-01

Full Text Available The objective of this analysis was to fit germination data of Rhipsalis cereuscula Haw seeds to the three-parameter Weibull model using Frequentist and Bayesian methods. Five parameterizations were compared in the Bayesian analysis to fit a prior distribution. The parameter estimates from the Frequentist method were similar to the Bayesian responses under the following non-informative a priori distributions for the parameter vectors: gamma(10³, 10³) in model M1, normal(0, 10⁶) in model M2, uniform(0, Lsup) in model M3, exp(μ) in model M4 and Lnormal(μ, 10⁶) in model M5. However, to achieve convergence in models M4 and M5, we applied the μ from the estimates of the Frequentist approach. The best models fitted by the Bayesian method were M1 and M3. The adequacy of these models was based on their advantages over the Frequentist method, such as reduced computational effort and the possibility of model comparison.
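
The three-parameter Weibull curve models the cumulative germination fraction as F(t) = 1 - exp(-((t - loc)/scale)^shape) for t > loc. A minimal Frequentist-flavoured sketch, fitting one parameter by least squares over a coarse grid on synthetic data (the article's actual estimation uses full Frequentist and Bayesian machinery):

```python
import math

def weibull_cdf(t, shape, scale, loc):
    """Three-parameter Weibull CDF used as a germination curve."""
    if t <= loc:
        return 0.0
    return 1.0 - math.exp(-((t - loc) / scale) ** shape)

def fit_scale(ts, ys, shape, loc, grid):
    """Least-squares fit of the scale parameter over a coarse grid,
    a deliberately crude stand-in for proper estimation."""
    def sse(scale):
        return sum((weibull_cdf(t, shape, scale, loc) - y) ** 2
                   for t, y in zip(ts, ys))
    return min(grid, key=sse)

# synthetic germination fractions generated with scale = 5
ts = [2, 4, 6, 8, 10, 12]
obs = [weibull_cdf(t, 2.0, 5.0, 0.0) for t in ts]
best_scale = fit_scale(ts, obs, shape=2.0, loc=0.0,
                       grid=[3.0, 4.0, 5.0, 6.0, 7.0])
```

The location parameter loc plays the role of a germination lag time; fixing shape and loc here is only to keep the grid search one-dimensional.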

19. An Integrated Optimization Design Method Based on Surrogate Modeling Applied to Diverging Duct Design

Science.gov (United States)

Hanan, Lu; Qiushi, Li; Shaobin, Li

2016-12-01

This paper presents an integrated optimization design method in which uniform design, response surface methodology and a genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain, and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to acquire the optimal solution subject to constraints. The method has been applied to the optimization design of an axisymmetric diverging duct, dealing with three design variables: one qualitative variable and two quantitative variables. The method performs well in improving the duct aerodynamic performance and can also be applied to wider fields of mechanical design, serving as a useful tool for engineering designers by reducing design time and computational cost.
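
The last stage of the method, a genetic algorithm searching a cheap surrogate instead of the CFD model, can be sketched as follows; the quadratic "response surface" and all GA settings below are hypothetical:

```python
import random

def ga_minimize(f, bounds, pop=30, gens=60, seed=7):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend (midpoint) crossover, Gaussian mutation, box constraints."""
    rng = random.Random(seed)
    lo, hi = zip(*bounds)
    P = [[rng.uniform(l, h) for l, h in bounds] for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = min(rng.sample(P, 3), key=f)   # tournament winner 1
            b = min(rng.sample(P, 3), key=f)   # tournament winner 2
            child = [(x + y) / 2 + rng.gauss(0, 0.05 * (h - l))
                     for x, y, l, h in zip(a, b, lo, hi)]
            nxt.append([min(max(c, l), h)
                        for c, l, h in zip(child, lo, hi)])
        P = nxt
    return min(P, key=f)

# hypothetical quadratic response surface with its optimum at (0.3, 0.7)
surrogate = lambda x: (x[0] - 0.3) ** 2 + 2 * (x[1] - 0.7) ** 2
best = ga_minimize(surrogate, [(0.0, 1.0), (0.0, 1.0)])
```

Because each evaluation of the surrogate is essentially free, the GA can afford thousands of calls that would be prohibitive against the CFD solver directly, which is the point of the surrogate-based design loop.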

20. Active lubrication applied to radial gas journal bearings. Part 2: Modelling improvement and experimental validation

DEFF Research Database (Denmark)

Pierart, Fabián G.; Santos, Ilmar F.

2016-01-01

Actively-controlled lubrication techniques are applied to radial gas bearings aiming at enhancing one of their most critical drawbacks, their lack of damping. A model-based control design approach is presented using simple feedback control laws, i.e. proportional controllers. The design approach...... by finite element method and the global model is used as control design tool. Active lubrication allows for significant increase in damping factor of the rotor-bearing system. Very good agreement between theory and experiment is obtained, supporting the multi-physic design tool developed....

1. A semiparametric joint model for terminal trend of quality of life and survival in palliative care research.

Science.gov (United States)

Li, Zhigang; Frost, H R; Tosteson, Tor D; Zhao, Lihui; Liu, Lei; Lyons, Kathleen; Chen, Huaihou; Cole, Bernard; Currow, David; Bakitas, Marie

2017-12-20

Palliative medicine is an interdisciplinary specialty focusing on improving quality of life (QOL) for patients with serious illness and their families. Palliative care programs are available or under development at over 80% of large US hospitals (300+ beds). Palliative care clinical trials present unique analytic challenges in evaluating treatment efficacy, since the aim is to improve patients' diminishing QOL as disease progresses towards end of life (EOL). A unique feature of palliative care clinical trials is that patients will experience decreasing QOL during the trial despite potentially beneficial treatment. Longitudinal QOL and survival data are often highly correlated, which, in the face of censoring, makes it challenging to properly analyze and interpret the terminal QOL trend. To address these issues, we propose a novel semiparametric statistical approach to jointly model the terminal trend of QOL and survival data. There are two sub-models in our approach: a semiparametric mixed effects model for longitudinal QOL and a Cox model for survival. We use the regression spline method to estimate the nonparametric curves and AIC to select knots. We assess the model performance through simulation to establish a novel modeling approach that could be used in future palliative care research trials. An application of our approach in a recently completed palliative care clinical trial is also presented. Copyright © 2017 John Wiley & Sons, Ltd.

2. Applying downscaled global climate model data to a hydrodynamic surface-water and groundwater model

Science.gov (United States)

Swain, Eric; Stefanova, Lydia; Smith, Thomas

2014-01-01

Precipitation data from global climate models have been downscaled to smaller regions. Adapting these downscaled precipitation data to a coupled hydrodynamic surface-water/groundwater model of southern Florida allows an examination of future conditions and their effect on groundwater levels, inundation patterns, surface-water stage and flows, and salinity. The downscaled rainfall data include the 1996-2001 time series from the European Centre for Medium-Range Weather Forecasts ERA-40 simulation and both the 1996-1999 and 2038-2057 time series from two global climate models: the Community Climate System Model (CCSM) and the Geophysical Fluid Dynamics Laboratory (GFDL) model. Synthesized surface-water inflow datasets were developed for the 2038-2057 simulations. The resulting hydrologic simulations, with and without a 30-cm sea-level rise, were compared with each other and with field data to analyze a range of projected conditions. Simulations predicted generally higher future stage and groundwater levels and surface-water flows, with sea-level rise inducing higher coastal salinities. A coincident rise in sea level, precipitation and surface-water flows resulted in a narrower inland saline/fresh transition zone. The inland areas were affected more by the rainfall differences than by the sea-level rise; the rainfall differences made little difference in coastal inundation but a larger difference in coastal salinities.

3. Modelling the Progression of Bird Migration with Conditional Autoregressive Models Applied to Ringing Data

Science.gov (United States)

Ambrosini, Roberto; Borgoni, Riccardo; Rubolini, Diego; Sicurella, Beatrice; Fiedler, Wolfgang; Bairlein, Franz; Baillie, Stephen R.; Robinson, Robert A.; Clark, Jacquie A.; Spina, Fernando; Saino, Nicola

2014-01-01

Migration is a fundamental stage in the life history of several taxa, including birds, and is under strong selective pressure. At present, the only data that may allow for both an assessment of patterns of bird migration and for retrospective analyses of changes in migration timing are the databases of ring recoveries. We used ring recoveries of the Barn Swallow Hirundo rustica collected from 1908–2008 in Europe to model the calendar date at which a given proportion of birds is expected to have reached a given geographical area (‘progression of migration’) and to investigate the change in timing of migration over the same areas between three time periods (1908–1969, 1970–1990, 1991–2008). The analyses were conducted using binomial conditional autoregressive (CAR) mixed models. We first concentrated on data from the British Isles and then expanded the models to western Europe and north Africa. We produced maps of the progression of migration that disclosed local patterns of migration consistent with those obtained from the analyses of the movements of ringed individuals. Timing of migration estimated from our model is consistent with data on migration phenology of the Barn Swallow available in the literature, but in some cases it is later than that estimated by data collected at ringing stations, which, however, may not be representative of migration phenology over large geographical areas. The comparison of median migration date estimated over the same geographical area among time periods showed no significant advancement of spring migration over the whole of Europe, but a significant advancement of autumn migration in southern Europe. Our modelling approach can be generalized to any records of ringing date and locality of individuals including those which have not been recovered subsequently, as well as to geo-referenced databases of sightings of migratory individuals. PMID:25047331
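
The quantity being modelled, the proportion of birds expected to have reached an area by a given calendar date, is a monotone curve in date. A logistic curve is a simplified stand-in for the article's binomial CAR mixed model (the median date and scale below are hypothetical):

```python
import math

def prop_arrived(day, d50, scale):
    """Logistic progression-of-migration curve: expected proportion
    of birds having reached an area by a given day of year.
    d50 is the median migration date; scale controls how spread out
    the passage is."""
    return 1.0 / (1.0 + math.exp(-(day - d50) / scale))

# hypothetical: half the swallows reach the area by day 105 (mid-April),
# with the bulk of passage spread over a couple of weeks
p_day110 = prop_arrived(110, d50=105.0, scale=4.0)
```

Comparing fitted d50 values over the same area in different periods (e.g. 1908-1969 vs 1991-2008) is exactly the kind of change-in-timing comparison the article performs, only with spatial autocorrelation handled by the CAR structure.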

5. Identifying 'unhealthy' food advertising on television: a case study applying the UK Nutrient Profile model.

Science.gov (United States)

Jenkin, Gabrielle; Wilson, Nick; Hermanson, Nicole

2009-05-01

6. Online Cancer Information Seeking: Applying and Extending the Comprehensive Model of Information Seeking.

Science.gov (United States)

Van Stee, Stephanie K; Yang, Qinghua

2017-10-30

This study applied the comprehensive model of information seeking (CMIS) to online cancer information and extended the model by incorporating an exogenous variable: interest in online health information exchange with health providers. A nationally representative sample from the Health Information National Trends Survey 4 Cycle 4 was analyzed to examine the extended CMIS in predicting online cancer information seeking. Findings from a structural equation model supported most of the hypotheses derived from the CMIS, as well as the extension of the model related to interest in online health information exchange. In particular, socioeconomic status, beliefs, and interest in online health information exchange predicted utility. Utility, in turn, predicted online cancer information seeking, as did information-carrier characteristics. An unexpected but important finding from the study was the significant, direct relationship between cancer worry and online cancer information seeking. Theoretical and practical implications are discussed.

7. Adapted strategic planning model applied to small business: a case study in the fitness area

Directory of Open Access Journals (Sweden)

Eduarda Tirelli Hennig

2012-06-01

Full Text Available Strategic planning is an important management tool in the corporate world and should not be restricted to big companies. However, this kind of planning process in small businesses may need special adaptations due to their own characteristics. This paper aims to identify and adapt existing models of strategic planning to the scenario of a small business in the fitness area. Initially, a comparative study of models by different authors is carried out to identify their phases and activities. Then, the phases and activities that should be present in a model for a small business are defined. That model was applied to a Pilates studio; it involves the establishment of an organizational identity, an environmental analysis, and the definition of strategic goals, strategies and actions to reach them. Finally, benefits to the organization could be identified, as well as hurdles in the implementation of the tool.

8. Evaluation model applied to TRANSPETRO's Marine Terminals Standardization Program

Energy Technology Data Exchange (ETDEWEB)

Almeida, Maria Fatima Ludovico de; Mueller, Gabriela [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Garcia, Luciano Maldonado [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

2009-07-01

This paper describes an innovative evaluation model applied to TRANSPETRO's 'Marine Terminals Standardization Program', based on current approaches to program evaluation and organizational learning. Since the program was launched in 2004, the need for an evaluation model able to track its implementation progress, measure the degree of standards compliance, and assess its potential economic, social and environmental impacts has become evident. Within a vision of safe and environmentally responsible operation of marine terminals, this evaluation model was jointly designed by TRANSPETRO and PUC-Rio to promote continuous improvement and learning in operational practices and in the standardization process itself. TRANSPETRO believes that standardization supports its services and management innovation capability by creating objective and internationally recognized parameters, targets and metrology for its business activities. The conceptual model and application guidelines for this important tool are presented in this paper, as well as the next steps towards its implementation. (author)

9. Neural networks and differential evolution algorithm applied for modelling the depollution process of some gaseous streams.

Science.gov (United States)

Curteanu, Silvia; Suditu, Gabriel Dan; Buburuzan, Adela Marina; Dragoi, Elena Niculina

2014-11-01

The depollution of gaseous streams containing n-hexane is studied by adsorption in a fixed-bed column, under dynamic conditions, using granular activated carbon and two types of non-functionalized hypercross-linked polymeric resins. In order to model the process, a new neuro-evolutionary approach is proposed: a combination of a modified differential evolution (DE) algorithm with neural networks (NNs) and two local search algorithms, the global and local optimizers working together to determine the optimal NN model. The main elements that characterize the applied variant of DE are an opposition-based learning initialization, a simple self-adaptive procedure for the control parameters, and a modified mutation principle that uses the fitness function as a criterion for reorganization. The results obtained show that the proposed algorithm is able to determine a good model of the considered process, its performance being better than that of an available phenomenological model.
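
Two of the DE ingredients named above are easy to sketch: opposition-based initialization (each random individual competes with its mirrored "opposite" point and the better one enters the population) embedded in a standard DE/rand/1/bin loop. The objective here is a plain sphere function standing in for the NN prediction error; population size, F and CR are ordinary textbook values, not the article's self-adapted ones:

```python
import random

def de_optimize(f, bounds, np_=20, gens=80, F=0.7, CR=0.9, seed=3):
    """DE/rand/1/bin with opposition-based initialization."""
    rng = random.Random(seed)
    lo, hi = zip(*bounds)
    dim = len(bounds)
    pop = []
    for _ in range(np_):
        x = [rng.uniform(l, h) for l, h in bounds]
        x_opp = [l + h - v for v, l, h in zip(x, lo, hi)]  # opposite point
        pop.append(min((x, x_opp), key=f))
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample(
                [p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantees one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = a[j] + F * (b[j] - c[j])
                else:
                    v = pop[i][j]
                trial.append(min(max(v, lo[j]), hi[j]))
            if f(trial) <= f(pop[i]):   # greedy one-to-one replacement
                pop[i] = trial
    return min(pop, key=f)

# sphere function as a stand-in for the NN model's prediction error
best = de_optimize(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)
```

In the article's setting f would instead train/evaluate a candidate network, so each fitness call is expensive; the opposition trick earns its keep by improving the starting population at the cost of one extra evaluation per individual.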

10. Pharmacological Amelioration of Cone Survival and Vision in a Mouse Model for Leber Congenital Amaurosis.

Science.gov (United States)

Li, Songhua; Samardzija, Marijana; Yang, Zhihui; Grimm, Christian; Jin, Minghao

2016-05-25

improve cone survival and function in patients with LCA caused by RPE65 mutations. Using a mouse model carrying the most frequent LCA-associated mutation (R91W), we found that the mutant RPE65 underwent ubiquitination-dependent proteasomal degradation due to misfolding. Treatment of the mice with a chemical chaperone partially corrected stability, enzymatic activity, and subcellular localization of R91W RPE65, which was also accompanied by improvement of cone survival and vision. These findings identify an in vivo molecular pathogenic mechanism for the R91W mutation and provide a feasible pharmacological approach that can delay vision loss in patients with RPE65 mutations. Copyright © 2016 the authors.

11. Breeding or assisted reproduction? Relevance of the horse model applied to the conservation of endangered equids.

Science.gov (United States)

Smits, K; Hoogewijs, M; Woelders, H; Daels, P; Van Soom, A

2012-08-01

Many wild equids are at present endangered in the wild. Concurrently, increased mechanization has reduced the numbers of some old native horse breeds to levels that are no longer compatible with the breeds' survival. Strong concern has arisen over the last decade about preserving animal biodiversity, including that of rare horse breeds. Genome Resource Banking refers to the cryostorage of genetic material and is an approach for ex situ conservation that should be applied in combination with in situ conservation programmes. In this review, we propose that, owing to the great reproductive similarity among the different members of the genus Equus, the domestic horse can be used to optimize cryopreservation and embryo production protocols for future application in wild equids. We give this hypothesis a scientific underpinning by listing successful applications of epididymal sperm freezing, embryo freezing, intracytoplasmic sperm injection, oocyte vitrification and somatic cell nuclear transfer in domestic horses. Some assisted reproductive technology (ART) fertilization methods may be performed with semen of very low quality or with oocytes obtained after the death of the mare. © 2012 Blackwell Verlag GmbH.

12. Iron-Chelating Drugs Enhance Cone Photoreceptor Survival in a Mouse Model of Retinitis Pigmentosa.

Science.gov (United States)

Wang, Ke; Peng, Bo; Xiao, Jia; Weinreb, Orly; Youdim, Moussa B H; Lin, Bin

2017-10-01

Retinitis pigmentosa (RP) is a group of hereditary retinal degenerations in which mutations commonly result in an initial phase of rod cell death followed by gradual cone cell death. The mechanisms by which these mutations lead to photoreceptor cell death in RP have not been clearly elucidated, and there is currently no effective treatment for RP. The purpose of this work was to explore iron chelation therapy for improving cone survival and function in the rd10 mouse model of RP. Two iron-chelating drugs, 5-[4-(2-hydroxyethyl)piperazin-1-ylmethyl]-8-hydroxyquinoline (VK28) and its chimeric derivative 5-(N-methyl-N-propargylaminomethyl)-quinoline-8-ol dihydrochloride (VAR10303), were injected intraperitoneally into rd10 mice every other day starting from postnatal day 14. We investigated the effects of the two compounds on cone rescue at three time points, using a combination of immunocytochemistry, RT-PCR, Western blot analysis, and a series of visual function tests. VK28 and VAR10303 treatments partially rescued cones and significantly improved visual function in rd10 mice. Moreover, we showed that the neuroprotective effects of VK28 and VAR10303 were correlated with inhibition of neuroinflammation, oxidative stress, and apoptosis. Furthermore, we demonstrated that downregulation of NF-κB and p53 likely underlies the reduction of proinflammatory mediators and of apoptosis, respectively, in the rd10 retina. VK28 and VAR10303 thus provided partial histologic and functional rescue of cones in rd10 mice. Our study demonstrates that iron chelation therapy might represent an effective therapeutic strategy for RP patients.

13. The 5D to 4D projection model applied as a Lepton to Galaxy Creation model

CERN Document Server

Wong, Kai-Wai; Jungner, Högne

2013-01-01

The 5D to 4D projection is presented in a simple geometry, in accordance with the Perelman Theorem, resulting in a 3D doughnut structure for the space manifold of the Lorentz space-time. It is shown that in the lowest quantum state this Lorentz manifold confines the massless 5D e-trinos and gives rise to the de Broglie leptons. On the scale of the universe, it provides a model for the creation of galaxies.

14. Implementing team huddles in small rural hospitals: How does the Kotter model of change apply?

Science.gov (United States)

Baloh, Jure; Zhu, Xi; Ward, Marcia M

2017-12-17

To examine how the process of change prescribed in Kotter's change model applies to implementing team huddles, and to assess the impact of the execution of early change phases on change success in later phases. Kotter's model can help guide hospital leaders to implement change and potentially improve success rates. However, the model is understudied, particularly in health care. We followed eight hospitals implementing team huddles for two years, interviewing the change teams quarterly to inquire about implementation progress. We assessed how the hospitals performed in the three overarching phases of the Kotter model and examined whether performance in the initial phase influenced subsequent performance. In half of the hospitals, change processes were congruent with Kotter's model, and performance in the initial phase influenced their success in subsequent phases. In the other hospitals, change processes were incongruent with the model, and their success depended on implementation scope and the strategies employed. We found mixed support for the Kotter model: it better fits implementations that aim to spread to multiple hospital units, whereas when the scope is limited, changes can succeed even when steps are skipped. Kotter's model can be a useful guide for nurse managers implementing changes. © 2017 John Wiley & Sons Ltd.

15. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

Science.gov (United States)

Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

2017-07-28

Random survival forest (RSF) models have been identified as alternatives to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and conditional inference forests for time-to-event data have therefore been suggested. Conditional inference forests (CIF) are known to correct this bias in RSF models by separating the procedure for selecting the best covariate to split on from the search for the best split point of the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset concerns the survival of children under five years of age in Uganda and consists of categorical covariates, most of them having more than two levels (many split-points). The second dataset concerns the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The findings indicate that the conditional inference forest model is superior to random survival forest models for time-to-event data whose covariates have many split-points, based on the bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models for time-to-event data whose covariates have fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to choose the forest model based on the nature of the covariates of the dataset in question.
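The split-point bias described here is easy to quantify: an exhaustive split search over a nominal covariate with k levels considers 2^(k-1) − 1 distinct binary partitions, so many-level covariates offer exponentially more chances to produce a spuriously good split. A minimal sketch:

```python
def n_binary_splits(k_levels: int) -> int:
    """Number of distinct ways to split a nominal covariate with k levels
    into two non-empty groups at a binary tree node: 2**(k - 1) - 1."""
    return 2 ** (k_levels - 1) - 1

# A two-level covariate (as in the XDR TB data) offers a single candidate
# split, while a six-level covariate offers 31 -- the source of the bias
# that conditional inference forests correct.
```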

16. A methodology for estimating the uncertainty in model parameters applying the robust Bayesian inferences

Energy Technology Data Exchange (ETDEWEB)

Kim, Joo Yeon; Lee, Seung Hyun; Park, Tai Jin [Korean Association for Radiation Application, Seoul (Korea, Republic of)

2016-06-15

Any real application of Bayesian inference must acknowledge that both the prior distribution and the likelihood function have only been specified as more or less convenient approximations to whatever the analyst's true beliefs might be. If the inferences from a Bayesian analysis are to be trusted, it is important to determine that they are robust to such variations of prior and likelihood as might also be consistent with the analyst's stated beliefs. Robust Bayesian inference was applied here to atmospheric dispersion assessment using a Gaussian plume model. The scope of contamination was specified through uncertainty in the distribution type and parametric variability. The probability distributions of the model parameters were assumed to be contaminated within the symmetric unimodal and the unimodal distribution classes. The distribution of the sector-averaged relative concentrations was then calculated by applying the contaminated priors to the model parameters. The sector-averaged concentrations for each stability class were compared under the symmetric unimodal and the unimodal priors, respectively, taken as the contaminating classes of the ε-contamination model. With ε set to 10%, the medians obtained with the symmetric unimodal priors agreed within 10% of those obtained with the plausible priors, whereas the medians obtained with the unimodal priors agreed only within 20% of the plausible-prior values at a few downwind distances. Robustness was assessed by estimating how far the results of the Bayesian inferences vary under reasonable variations of the plausible priors. Given these results, it is reasonable to apply the symmetric unimodal priors when analysing the robustness of the Bayesian inferences.
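The ε-contamination class used here takes the prior to be π = (1 − ε)π₀ + ε q, with π₀ the plausible prior and q drawn from the contaminating class. A minimal sketch of mixture sampling from such a contaminated prior, with π₀ and q chosen purely for illustration (they are assumptions, not the paper's distributions):

```python
import random
import statistics

def contaminated_prior_samples(base, contaminant, eps, n, seed=0):
    """Draw n samples from the eps-contaminated prior
    pi = (1 - eps) * pi0 + eps * q via mixture sampling."""
    rng = random.Random(seed)
    return [contaminant(rng) if rng.random() < eps else base(rng)
            for _ in range(n)]

pi0 = lambda rng: rng.gauss(1.0, 0.2)    # plausible prior (assumed)
q = lambda rng: rng.uniform(0.0, 2.0)    # contaminating distribution (assumed)
draws = contaminated_prior_samples(pi0, q, eps=0.10, n=20000)
```

With ε = 10% the median of the contaminated draws stays close to the median of π₀ alone, mirroring the robustness the paper reports for the medians of the sector-averaged concentrations.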

17. Innovations’ Survival

Directory of Open Access Journals (Sweden)

Jakub Tabas

2016-01-01

Full Text Available Innovations currently represent a tool for maintaining the going concern of a business entity and its competitiveness. However, the effects of innovations are not infinite; if innovation is to constantly preserve the life of a business entity, it has to be a continual chain of innovations, i.e. a continual process. The effective life of a single innovation is limited, and the limit derives especially from the industry. The paper presents the results of research on the effects of innovations on the financial performance of small and medium-sized enterprises in the Czech Republic. The objective of this paper is to determine the length and intensity of the effects of technical innovations on a company's financial performance. The economic effect of innovations has been measured by applying the company's gross production power, with Deviation Analysis applied to three-year time series; Survival Analysis was then applied. The analyses are elaborated for three statistical samples of SMEs constructed according to industry. The results show significant differences in innovations' survival among these three samples of enterprises. The results are quite specific to the industries and are compared and discussed with the results of the authors' former research on the issue.
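The Survival Analysis step applied to innovation effect durations can be illustrated with a plain Kaplan-Meier estimator; the helper below is a generic sketch, not the authors' code, and treats an innovation whose effect is still visible at the end of observation as censored:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  `events[i]` is True if the
    innovation's effect ended at times[i], False if it was censored.
    Returns [(t, S(t))] at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:
            deaths += bool(data[i][1])
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve
```

For example, durations [1, 2, 2, 3, 4] with the second year-2 observation and the year-4 observation censored give S(1) = 0.8, S(2) = 0.6, S(3) = 0.3.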

18. Using simulation to interpret a discrete time survival model in a complex biological system: fertility and lameness in dairy cows.

Directory of Open Access Journals (Sweden)

Christopher D Hudson

Full Text Available The ever-growing volume of data routinely collected and stored in everyday life presents researchers with a number of opportunities to gain insight and make predictions. This study aimed to demonstrate the usefulness, in a specific clinical context, of a simulation-based technique called probabilistic sensitivity analysis (PSA) in interpreting the results of a discrete time survival model based on a large dataset of routinely collected dairy herd management data. Data from 12,515 dairy cows (from 39 herds) were used to construct a multilevel discrete time survival model in which the outcome was the probability of a cow becoming pregnant during a given two-day period of risk, with the presence or absence of a recorded lameness event during various time frames relative to the risk period amongst the potential explanatory variables. A separate simulation model was then constructed to evaluate, using PSA, the wider clinical implications of the model results (i.e. the potential for a herd's incidence rate of lameness to influence its overall reproductive performance). Although the discrete time survival analysis revealed some relatively large associations between lameness events and risk of pregnancy (for example, occurrence of a lameness case within 14 days of a risk period was associated with a 25% reduction in the risk of the cow becoming pregnant during that risk period), PSA revealed that, when viewed in the context of a realistic clinical situation, a herd's lameness incidence rate is highly unlikely to influence its overall reproductive performance to a meaningful extent in the vast majority of situations. Construction of a simulation model within a PSA framework proved to be a very useful additional step in contextualising the results from a discrete time survival model, especially where the research is designed to guide on-farm management decisions at population (i.e. herd) rather than individual level.
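In a discrete time survival model the per-period hazard h_t accumulates as P(event by period T) = 1 − Π(1 − h_t). The sketch below illustrates how a 25% per-period hazard reduction propagates to a cumulative pregnancy probability; the baseline hazard and number of risk periods are assumed for illustration and are not taken from the study:

```python
def cum_event_prob(hazards):
    """Cumulative event probability over successive discrete risk
    periods: P = 1 - prod(1 - h_t)."""
    p_no_event = 1.0
    for h in hazards:
        p_no_event *= 1.0 - h
    return 1.0 - p_no_event

# Assumed baseline: 5% chance of conception per two-day risk period,
# over 20 risk periods; lameness multiplies each hazard by 0.75.
p_base = cum_event_prob([0.05] * 20)
p_lame = cum_event_prob([0.05 * 0.75] * 20)
```

Because the hazards compound, a 25% per-period reduction translates into a much smaller relative drop in the cumulative probability over many risk periods, which is one intuition behind the PSA finding that herd-level reproductive performance is largely insensitive to lameness incidence.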

19. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

Directory of Open Access Journals (Sweden)