#### Sample records for survival model applied

1. A Log Logistic Survival Model Applied to Hypobaric Decompression Sickness

Science.gov (United States)

Conkin, Johnny

2001-01-01

Decompression sickness (DCS) is a complex, multivariable problem. A mathematical description or model of the likelihood of DCS requires a large amount of quality research data, ideas on how to define a decompression dose using physical and physiological variables, and an appropriate analytical approach. It also requires a high-performance computer with specialized software. I have used published DCS data to develop my decompression doses, which are variants of equilibrium expressions for evolved gas plus other explanatory variables. My analytical approach is survival analysis, where the time of DCS occurrence is modeled. My conclusions can be applied to simple hypobaric decompressions - ascents lasting from 5 to 30 minutes - and, after minutes to hours, to denitrogenation (prebreathing). They are also applicable to long or short exposures, and can be used whether the sufferer of DCS is at rest or exercising at altitude. Ultimately I would like my models to be applied to astronauts to reduce the risk of DCS during spacewalks, as well as to future spaceflight crews on the Moon and Mars.
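As an illustration of the model family named in this record, here is a minimal sketch of the log-logistic survival and hazard functions. The parameterisation and the symbols `alpha` (scale, equal to the median) and `beta` (shape) are generic textbook choices, not Conkin's fitted decompression-dose model:

```python
import math

def loglogistic_survival(t, alpha, beta):
    """S(t) = 1 / (1 + (t/alpha)**beta): probability the event has not occurred by time t."""
    return 1.0 / (1.0 + (t / alpha) ** beta)

def loglogistic_hazard(t, alpha, beta):
    """h(t) = f(t)/S(t); for beta > 1 the hazard rises and then falls (unimodal)."""
    u = (t / alpha) ** beta
    return (beta / t) * u / (1.0 + u)
```

A unimodal hazard is one reason the log-logistic form suits DCS onset times: risk peaks some time after decompression rather than growing without bound.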

2. Applied survival analysis using R

CERN Document Server

Moore, Dirk F

2016-01-01

Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data analysis, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survival between groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...
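The censoring-aware survival-curve estimation that this text teaches can be illustrated with a hand-rolled Kaplan-Meier estimator. The sketch below is stdlib Python rather than R, and is not code from the book:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival curve.

    times:  observed follow-up times
    events: 1 if the event occurred at that time, 0 if right-censored
    Returns a list of (time, S(t)) steps at each distinct event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at this time
        n_ties = sum(1 for tt, _ in data if tt == t)   # subjects leaving the risk set
        if deaths > 0:
            s *= 1.0 - deaths / n_at_risk              # KM product-limit step
            curve.append((t, s))
        n_at_risk -= n_ties
        i += n_ties
    return curve
```

Censored subjects shrink the risk set without forcing a drop in the curve, which is exactly the "proper accounting for censoring" the blurb refers to.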

3. Modelling survival

DEFF Research Database (Denmark)

Ashauer, Roman; Albert, Carlo; Augustine, Starrlight

2016-01-01

The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test...

International Nuclear Information System (INIS)

Zackrisson, B.

1992-01-01

A central issue in clinical radiobiological research is the prediction of responses to different radiation qualities. The choice of cell survival and dose-response model greatly influences the results. In this context the relationship between theory and model is emphasized. Generally, the interpretations of experimental data depend on the model. Cell survival models are systematized with respect to their relations to radiobiological theories of cell kill. The growing knowledge of biological, physical, and chemical mechanisms is reflected in the formulation of new models. The present overview shows that recent modelling has been more oriented towards the stochastic fluctuations connected to radiation energy deposition. This implies that the traditional cell survival models ought to be complemented by models of stochastic energy deposition processes and repair processes at the intracellular level. (orig.)

5. Survival analysis models and applications

CERN Document Server

Liu, Xian

2012-01-01

Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading on to some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enabling...

6. Probabilistic Survivability Versus Time Modeling

Science.gov (United States)

Joyner, James J., Sr.

2016-01-01

This presentation documents Kennedy Space Center's Independent Assessment work on three assessments for the Ground Systems Development and Operations (GSDO) Program, completed to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews; the work provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground-worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.

7. Applied impulsive mathematical models

CERN Document Server

Stamova, Ivanka

2016-01-01

Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.

8. Repair models of cell survival and corresponding computer program for survival curve fitting

International Nuclear Information System (INIS)

Shen Xun; Hu Yiwei

1992-01-01

Some basic concepts and formulations of two repair models of survival, the incomplete repair (IR) model and the lethal-potentially lethal (LPL) model, are introduced. An IBM-PC computer program for survival curve fitting with these models was developed and applied to fit the survival of human melanoma cells HX118 irradiated at different dose rates. Comparison was made between the repair models and two non-repair models, the multitarget-single-hit model and the linear-quadratic model, in the fitting and analysis of the survival-dose curves. It was shown that either the IR model or the LPL model can fit a set of survival curves at different dose rates with the same parameters and provide information on the repair capacity of cells. These two mathematical models could be very useful in quantitative studies of the radiosensitivity and repair capacity of cells.
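The linear-quadratic model mentioned in this record, S(D) = exp(-(aD + bD^2)), becomes linear in its parameters after a log transform, so a least-squares fit needs only a 2x2 normal-equation solve. This sketch is illustrative and is not the IBM-PC program described above:

```python
import math

def fit_linear_quadratic(doses, survivals):
    """Fit S(D) = exp(-(a*D + b*D**2)) by least squares on y = -ln S(D).

    After the log transform the model is y = a*D + b*D**2, linear in (a, b),
    so the normal equations form a 2x2 system solved here by Cramer's rule.
    """
    y = [-math.log(s) for s in survivals]
    s11 = sum(d * d for d in doses)
    s12 = sum(d ** 3 for d in doses)
    s22 = sum(d ** 4 for d in doses)
    r1 = sum(d * yi for d, yi in zip(doses, y))
    r2 = sum(d * d * yi for d, yi in zip(doses, y))
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - r2 * s12) / det
    b = (s11 * r2 - s12 * r1) / det
    return a, b
```

On noiseless data the fit recovers the generating parameters exactly, which makes the transform-then-solve approach easy to validate before applying it to measured survival fractions.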

9. Applied stochastic modelling

CERN Document Server

Morgan, Byron JT; Tanner, Martin Abba; Carlin, Bradley P

2008-01-01

Contents: Introduction and Examples (introduction; examples of data sets); Basic Model Fitting (maximum-likelihood estimation for a geometric model; maximum likelihood for the beta-geometric model; modelling polyspermy; which model?; what is a model for?; mechanistic models); Function Optimisation (MATLAB: graphs and finite differences; deterministic search methods; stochastic search methods; accuracy and a hybrid approach); Basic Likelihood Tools (estimating standard errors and correlations; looking at surfaces: profile log-likelihoods; confidence regions from profiles; hypothesis testing in model selection; score and Wald tests; classical goodness of fit; model selection bias); General Principles (parameterisation; parameter redundancy; boundary estimates; regression and influence; the EM algorithm; alternative methods of model fitting; non-regular problems); Simulation Techniques (simulating random variables; integral estimation; verification; Monte Carlo inference; estimating sampling distributions...)

10. Applied Bayesian modelling

CERN Document Server

Congdon, Peter

2014-01-01

This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS...

11. Life-Cycle Models for Survivable Systems

National Research Council Canada - National Science Library

Linger, Richard

2002-01-01

Current software development life-cycle models are not focused on creating survivable systems, and exhibit shortcomings when the goal is to develop systems with a high degree of assurance of survivability...

12. Survivability Assessment: Modeling A Recovery Process

OpenAIRE

Paputungan, Irving Vitra; Abdullah, Azween

2009-01-01

Survivability is the ability of a system to continue operating, in a timely manner, in the presence of attacks, failures, or accidents. Recovery in survivability is the process by which a system heals or recovers from damage as early as possible so as to fulfill its mission as conditions permit. In this paper, we show a preliminary recovery model to enhance system survivability. The model focuses on how we preserve the system and resume its critical services under attack as soon as possible.

13. Locally Applied Valproate Enhances Survival in Rats after Neocortical Treatment with Tetanus Toxin and Cobalt Chloride

Directory of Open Access Journals (Sweden)

Dirk-Matthias Altenmüller

2013-01-01

Purpose. In neocortical epilepsies not satisfactorily responsive to systemic antiepileptic drug therapy, local application of antiepileptic agents onto the epileptic focus may enhance treatment efficacy and tolerability. We describe the effects of focally applied valproate (VPA) in a newly emerging rat model of neocortical epilepsy induced by tetanus toxin (TeT) plus cobalt chloride (CoCl2). Methods. In rats, VPA- or sodium chloride (NaCl)-containing polycaprolactone (PCL) implants were applied onto the right motor cortex, treated beforehand with a triple injection of 75 ng TeT plus 15 mg CoCl2. Video-EEG monitoring was performed with intracortical depth electrodes. Results. All rats randomized to the NaCl group died within one week after surgery. In contrast, the rats treated with local VPA survived significantly longer. In both groups, witnessed deaths occurred in the context of seizures. Rats surviving the first postoperative day developed neocortical epilepsy with recurrent spontaneous seizures. Conclusions. The novel TeT/CoCl2 approach establishes a new model of neocortical epilepsy in rats and allows the investigation of local epilepsy therapy strategies. In this vehicle-controlled study, local application of VPA significantly enhanced survival in rats, possibly by focal antiepileptic or antiepileptogenic mechanisms.

14. Predictive model for survival in patients with gastric cancer.

Science.gov (United States)

Goshayeshi, Ladan; Hoseini, Benyamin; Yousefli, Zahra; Khooie, Alireza; Etminani, Kobra; Esmaeilzadeh, Abbas; Golabpour, Amin

2017-12-01

Gastric cancer is one of the most prevalent cancers in the world. Characterized by poor prognosis, it is a frequent cause of cancer in Iran. The aim of the study was to design a predictive model of survival time for patients suffering from gastric cancer. This was a historical cohort study conducted between 2011 and 2016. The study population comprised 277 patients suffering from gastric cancer. Data were gathered from the Iranian Cancer Registry and the laboratory of Emam Reza Hospital in Mashhad, Iran. Patients or their relatives were interviewed where needed. Missing values were imputed by data mining techniques. Fifteen factors were analyzed. Survival was addressed as the dependent variable. The predictive model was then designed by combining a genetic algorithm with logistic regression, using Matlab 2014 software. Of the 277 patients, survival was available for only 80 patients, whose data were used for designing the predictive model. The mean ± SD of missing values per patient was 4.43 ± 0.41. The combined predictive model achieved 72.57% accuracy. Sex, birth year, age at diagnosis, age at diagnosis of patients' family members, family history of gastric cancer, and family history of other gastrointestinal cancers were the six parameters associated with patient survival. The study revealed that imputing missing values by data mining techniques has good accuracy, and that the six parameters extracted by the genetic algorithm affect the survival of patients with gastric cancer. Our combined predictive model, with its good accuracy, is appropriate for forecasting the survival of patients suffering from gastric cancer. We therefore suggest that policy makers and specialists apply it to predict patients' survival.
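The record does not detail how the genetic algorithm and logistic regression were combined; one common arrangement is a GA searching over feature-subset bitmasks with a pluggable fitness function (for instance, a logistic model's validation accuracy). The sketch below is hypothetical and uses a generic fitness callable rather than the authors' pipeline:

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=30, seed=0):
    """Tiny genetic algorithm over feature-subset bitmasks.

    fitness: callable taking a tuple of 0/1 flags (1 = feature included) and
    returning a score to maximize. Any scorer can be plugged in, e.g. the
    cross-validated accuracy of a logistic regression on the selected columns.
    """
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in range(n_features)) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)     # one-point crossover
            child = list(p1[:cut] + p2[cut:])
            i = rng.randrange(n_features)          # single point mutation
            child[i] ^= 1
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=fitness)
```

Because the top half of each generation is carried over unchanged, the best subset found never degrades between generations.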

15. Modelling the joint distribution of competing risks survival times using copula functions

OpenAIRE

Kaishev, V. K.; Haberman, S.; Dimitrova, D. S.

2005-01-01

The problem of modelling the joint distribution of survival times in a competing risks model, using copula functions, is considered. In order to evaluate this joint distribution and the related overall survival function, a system of non-linear differential equations is solved, which relates the crude and net survival functions of the modelled competing risks through the copula. A similar approach to modelling dependent multiple decrements was applied by Carriere (1994) who used a Gaussian cop...

16. A generalized additive regression model for survival times

DEFF Research Database (Denmark)

Scheike, Thomas H.

2001-01-01

Additive Aalen model; counting process; disability model; illness-death model; generalized additive models; multiple time-scales; non-parametric estimation; survival data; varying-coefficient models

17. Standard model group: Survival of the fittest

Science.gov (United States)

Nielsen, H. B.; Brene, N.

1983-09-01

The essential content of this paper is related to random dynamics. We speculate that the world seen through a sub-Planck-scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some "world (gauge) group". We see that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. We further argue that the subgroup which survives as the end product of a possible chain of collapses is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property.

18. Standard model group: survival of the fittest

Energy Technology Data Exchange (ETDEWEB)

Nielsen, H.B. (Niels Bohr Inst., Copenhagen (Denmark); Nordisk Inst. for Teoretisk Atomfysik, Copenhagen (Denmark)); Brene, N. (Niels Bohr Inst., Copenhagen (Denmark))

1983-09-19

The essential content of this paper is related to random dynamics. We speculate that the world seen through a sub-Planck-scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some ''world (gauge) group''. We see that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. We further argue that the subgroup which survives as the end product of a possible chain of collapse is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property.

19. Standard model group: survival of the fittest

International Nuclear Information System (INIS)

Nielsen, H.B.; Brene, N.

1983-01-01

The essential content of this paper is related to random dynamics. We speculate that the world seen through a sub-Planck-scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some ''world (gauge) group''. We see that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. We further argue that the subgroup which survives as the end product of a possible chain of collapse is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property. (orig.)

20. Standard model group survival of the fittest

International Nuclear Information System (INIS)

Nielsen, H.B.; Brene, N.

1983-02-01

The essential content of this note is related to random dynamics. The authors speculate that the world seen through a sub Planck scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some ''world (gauge) group''. It is seen that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. It is further argued that the subgroup which survives as the end product of a possible chain of collapses is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property. (Auth.)

1. Comparison of Cox and Gray's survival models in severe sepsis

DEFF Research Database (Denmark)

Kasal, Jan; Andersen, Zorana Jovanovic; Clermont, Gilles

2004-01-01

Although survival is traditionally modeled using Cox proportional hazards modeling, this approach may be inappropriate in sepsis, in which the proportional hazards assumption does not hold. Newer, more flexible models, such as Gray's model, may be more appropriate....
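For context on what Cox modeling estimates, here is a minimal evaluator of the negative log partial likelihood for a single covariate. It assumes no tied event times (an assumption the Breslow and Efron corrections relax) and is an illustrative sketch, not code from this study:

```python
import math

def cox_neg_log_partial_likelihood(beta, times, events, x):
    """Negative log partial likelihood for a one-covariate Cox model.

    Assumes no tied event times. The risk set at an event time is every
    subject whose follow-up time is >= that event time; censored subjects
    contribute only through the risk sets.
    """
    nll = 0.0
    for i, (ti, di) in enumerate(zip(times, events)):
        if not di:
            continue  # censored: no event-time term of its own
        risk = [math.exp(beta * x[j]) for j in range(len(times)) if times[j] >= ti]
        nll -= beta * x[i] - math.log(sum(risk))
    return nll
```

A useful sanity property: when the covariate is constant across subjects, the partial likelihood carries no information about beta, so the function is flat in beta.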

2. Extensions and Applications of the Cox-Aalen Survival Model

DEFF Research Database (Denmark)

Scheike, Thomas H.; Zhang, Mei-Jie

2003-01-01

Aalen additive risk model; competing risk; counting processes; Cox model; cumulative incidence function; goodness of fit; prediction of survival probability; time-varying effects

3. Predicting and Modelling of Survival Data when Cox's Regression Model does not hold

DEFF Research Database (Denmark)

Scheike, Thomas H.; Zhang, Mei-Jie

2002-01-01

Aalen model; additive risk model; counting processes; competing risk; Cox regression; flexible modeling; goodness of fit; prediction of survival; survival analysis; time-varying effects

4. Effect of topically applied minoxidil on the survival of rat dorsal skin flap.

Science.gov (United States)

Gümüş, Nazım; Odemiş, Yusuf; Yılmaz, Sarper; Tuncer, Ersin

2012-12-01

Flap necrosis is still a challenging problem in reconstructive surgery that results in irreversible tissue loss. This study evaluated the effect of topically applied minoxidil on angiogenesis and survival of a caudally based dorsal rat skin flap. For this study, 24 male Wistar rats were randomly divided into three groups of eight each. A caudally based dorsal skin flap with the dimensions of 9 × 3 cm was raised. After elevation of the flaps, they were sutured back into their initial positions. In group 1 (control group), 1 ml of isotonic saline was applied topically to the flaps of all the animals for 14 days. In group 2, minoxidil solution was spread uniformly over the flap surface for 7 days after the flap elevation. In group 3, minoxidil solution was applied topically to the flap surface during a 14-day period. On day 7 after the flap elevation, the rats were killed. The average area of flap survival was determined for each rat. Subdermal vascular architecture and angiogenesis were evaluated under a light microscope after two full-thickness skin biopsy specimens had been obtained from the midline of the flaps. The lowest flap survival rate was observed in group 1, and no difference was observed between groups 1 and 2. Compared with groups 1 and 2, group 3 had a significantly increased percentage of flap survival. These findings indicate that the main effect of minoxidil is vasodilation and that prolonged use before flap elevation leads to angiogenesis, increasing flap viability.

5. Geostatistical methods applied to field model residuals

DEFF Research Database (Denmark)

Maule, Fox; Mosegaard, K.; Olsen, Nils

The field model residual (which consists of measurement errors and unmodelled signal) is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based...

6. Stage-specific predictive models for breast cancer survivability.

Science.gov (United States)

2017-01-01

Survivability rates vary widely among various stages of breast cancer. Although machine learning models built in the past to predict breast cancer survivability were given stage as one of the features, they were not trained or evaluated separately for each stage. To investigate whether there are differences in performance of machine learning models trained and evaluated across different stages for predicting breast cancer survivability, we built models using three different machine learning methods to predict breast cancer survivability separately for each stage and compared them with the traditional joint models built for all the stages. We also evaluated the models separately for each stage and together for all the stages. Our results show that the most suitable model to predict survivability for a specific stage is the model trained for that particular stage. In our experiments, using additional examples of other stages during training did not help; in fact, it made performance worse in some cases. The most important features for predicting survivability were also found to be different for different stages. By evaluating the models separately on different stages we found that performance varied widely across them. We also demonstrate that evaluating predictive models for survivability on all the stages together, as was done in the past, is misleading because it overestimates performance. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

7. Developing a scalable modeling architecture for studying survivability technologies

Science.gov (United States)

Mohammad, Syed; Bounker, Paul; Mason, James; Brister, Jason; Shady, Dan; Tucker, David

2006-05-01

To facilitate interoperability of models in a scalable environment, and provide a relevant virtual environment in which Survivability technologies can be evaluated, the US Army Research Development and Engineering Command (RDECOM) Modeling Architecture for Technology Research and Experimentation (MATREX) Science and Technology Objective (STO) program has initiated the Survivability Thread which will seek to address some of the many technical and programmatic challenges associated with the effort. In coordination with different Thread customers, such as the Survivability branches of various Army labs, a collaborative group has been formed to define the requirements for the simulation environment that would in turn provide them a value-added tool for assessing models and gauge system-level performance relevant to Future Combat Systems (FCS) and the Survivability requirements of other burgeoning programs. An initial set of customer requirements has been generated in coordination with the RDECOM Survivability IPT lead, through the Survivability Technology Area at RDECOM Tank-automotive Research Development and Engineering Center (TARDEC, Warren, MI). The results of this project are aimed at a culminating experiment and demonstration scheduled for September, 2006, which will include a multitude of components from within RDECOM and provide the framework for future experiments to support Survivability research. This paper details the components with which the MATREX Survivability Thread was created and executed, and provides insight into the capabilities currently demanded by the Survivability faculty within RDECOM.

8. Applying the WEAP Model to Water Resource

DEFF Research Database (Denmark)

Gao, Jingjing; Christensen, Per; Li, Wei

Water resources assessment is a tool to provide decision makers with an appropriate basis to make informed judgments regarding the objectives and targets to be addressed during the Strategic Environmental Assessment (SEA) process. The study shows how water resources assessment can be applied in SEA in assessing the effects on water resources, using a case study on a Coal Industry Development Plan in an arid region in North Western China. In the case, the WEAP model (Water Evaluation And Planning System) was used to simulate various scenarios using a diversity of technological instruments like irrigation efficiency, treatment and reuse of water. The WEAP model was applied to the Ordos catchment, where it was used for the first time in China. The changes in water resource utilization in the Ordos basin were assessed with the model. It was found that the WEAP model is a useful tool for water resource assessment...

9. Efficient estimation of semiparametric copula models for bivariate survival data

KAUST Repository

Cheng, Guang

2014-01-01

A semiparametric copula model for bivariate survival data is characterized by a parametric copula model of dependence and nonparametric models of two marginal survival functions. Efficient estimation for the semiparametric copula model has been recently studied for the complete data case. When the survival data are censored, semiparametric efficient estimation has only been considered for some specific copula models such as the Gaussian copulas. In this paper, we obtain the semiparametric efficiency bound and efficient estimation for general semiparametric copula models for possibly censored data. We construct an approximate maximum likelihood estimator by approximating the log baseline hazard functions with spline functions. We show that our estimates of the copula dependence parameter and the survival functions are asymptotically normal and efficient. Simple consistent covariance estimators are also provided. Numerical results are used to illustrate the finite sample performance of the proposed estimators. © 2013 Elsevier Inc.
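A concrete instance of the copula construction discussed in this record is the Clayton family, which joins two marginal survival functions into a joint survival function with a single dependence parameter. This sketch is illustrative only and is not the paper's semiparametric estimator:

```python
def clayton_joint_survival(s1, s2, theta):
    """Joint survival P(T1 > t1, T2 > t2) built from marginal survivals
    s1 = S1(t1), s2 = S2(t2) via a Clayton (survival) copula.

    theta > 0 controls dependence strength; larger theta means stronger
    positive dependence between the two survival times.
    """
    return (s1 ** (-theta) + s2 ** (-theta) - 1.0) ** (-1.0 / theta)
```

Two properties worth checking: when one margin equals 1 the joint reduces to the other margin, and for theta > 0 the joint survival exceeds the independence product s1 * s2.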

10. Applied probability models with optimization applications

CERN Document Server

Ross, Sheldon M

1992-01-01

Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

11. SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL

Directory of Open Access Journals (Sweden)

Jenq-Daw Lee

2008-07-01

A setting of a trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can only occur after other events. The Stanford Heart Transplant data are reanalyzed using a trivariate Weibull distribution model with the proposed survival function.
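The Weibull marginals underlying such a model can be sketched directly. The parameter values below are arbitrary, and the record's trivariate semi-competing-risks construction is not reproduced here:

```python
import math

def weibull_survival(t, lam, k):
    """S(t) = exp(-(t/lam)**k); lam is the scale and k the shape parameter."""
    return math.exp(-((t / lam) ** k))

def weibull_median(lam, k):
    """Closed-form solution of S(t) = 0.5 for t."""
    return lam * math.log(2.0) ** (1.0 / k)
```

With k > 1 the hazard is increasing, with k < 1 decreasing, and k = 1 recovers the exponential distribution, which is why Weibull marginals are a common default in parametric survival modeling.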

12. Applied Integer Programming Modeling and Solution

CERN Document Server

Chen, Der-San; Dang, Yu

2011-01-01

An accessible treatment of the modeling and solution of integer programming problems, featuring modern applications and software. In order to fully comprehend the algorithms associated with integer programming, it is important to understand not only how algorithms work, but also why they work. Applied Integer Programming features a unique emphasis on this point, focusing on problem modeling and solution using commercial software. Taking an application-oriented approach, this book addresses the art and science of mathematical modeling related to the mixed integer programming (MIP) framework and...

13. Applied research in uncertainty modeling and analysis

CERN Document Server

Ayyub, Bilal

2005-01-01

Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

14. Applied Mathematics, Modelling and Computational Science

CERN Document Server

Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

2015-01-01

The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. These proceedings contain refereed papers contributed by participants of AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics...

15. Applying incentive sensitization models to behavioral addiction

DEFF Research Database (Denmark)

Rømer Thomsen, Kristine; Fjorback, Lone; Møller, Arne

2014-01-01

The incentive sensitization theory is a promising model for understanding the mechanisms underlying drug addiction, and has received support in animal and human studies. So far the theory has not been applied to the case of behavioral addictions like Gambling Disorder, despite sharing clinical symptoms and underlying neurobiology. We examine the relevance of this theory for Gambling Disorder and point to predictions for future studies. The theory promises a significant contribution to the understanding of behavioral addiction and opens new avenues for treatment.

16. Causal inference for long-term survival in randomised trials with treatment switching: Should re-censoring be applied when estimating counterfactual survival times?

OpenAIRE

Latimer, N.R.; White, I.R.; Abrams, K.R.; Siebert, U.

2017-01-01

Treatment switching often has a crucial impact on estimates of effectiveness and cost-effectiveness of new oncology treatments. Rank preserving structural failure time models (RPSFTM) and two-stage estimation (TSE) methods estimate ‘counterfactual’ (i.e. had there been no switching) survival times and incorporate re-censoring to guard against informative censoring in the counterfactual dataset. However, re-censoring causes a loss of longer term survival information which is problematic when e...

17. Determining factors influencing survival of breast cancer by fuzzy logistic regression model.

Science.gov (United States)

Nikbakht, Roya; Bahrampour, Abbas

2017-01-01

Fuzzy logistic regression can be used to determine influential factors of a disease. This study explores important predictive factors for survival of breast cancer patients. We used breast cancer data collected by the cancer registry of Kerman University of Medical Sciences during the period 2000-2007. Variables such as morphology, grade, age, and treatments (surgery, radiotherapy, and chemotherapy) were entered into the fuzzy logistic regression model. Performance of the model was assessed in terms of mean degree of membership (MDM). The results showed that almost 41% of patients were in the neoplasm and malignant group, and more than two-thirds of them were still alive after 5-year follow-up. Based on the fuzzy logistic model, the most important factors influencing survival were chemotherapy, morphology, and radiotherapy, respectively. Furthermore, the MDM criterion shows that the fuzzy logistic regression has a good fit to the data (MDM = 0.86). The fuzzy logistic regression model showed that chemotherapy is more important than radiotherapy for the survival of patients with breast cancer. Another ability of this model is calculating the possibilistic odds of survival in cancer patients. The results of this study can be applied in clinical research. Since few studies have applied fuzzy logistic models, we recommend using this model in various research areas.

18. Applied Regression Modeling A Business Approach

CERN Document Server

Pardoe, Iain

2012-01-01

An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

19. Terahertz spectroscopy applied to food model systems

DEFF Research Database (Denmark)

Møller, Uffe

Water plays a crucial role in the quality of food. Apart from the natural water content of a food product, the state of that water is very important. Water can be found integrated into the biological material or it can be added during production of the product. Currently it is difficult to differentiate between these types of water in subsequent quality controls. This thesis describes terahertz time-domain spectroscopy applied to aqueous food model systems, with particular focus on ethanol-water mixtures and confined water pools in inverse micelles.

20. Commercial Consolidation Model Applied to Transport Infrastructure

Energy Technology Data Exchange (ETDEWEB)

Guilherme de Aragão, J.J.; Santos Fontes Pereira, L. dos; Yamashita, Y.

2016-07-01

Since the 1990s, transport concessions, including public-private partnerships (PPPs), have been increasingly adopted by governments as an alternative for financing and operations in public investments, especially in transport infrastructure. The advantage pointed out by proponents of these models lies in merging the expertise and capital of the private sector with the public interest. Several arrangements are possible and have been employed in different cases. After the duration of the first PPP contracts in transportation, many authors have analyzed the success and failure factors of partnerships. The occurrence of failures in some stages of the process can greatly encumber the public administration, incurring losses to the fiscal responsibility of the competent bodies. This article aims to propose a new commercial consolidation model applied to transport infrastructure to ensure fiscal sustainability and overcome the weaknesses of current models. Initially, a systematic review of the literature covering studies on transport concessions between 1990 and 2015 is offered, where the different approaches of various countries are compared and the critical success factors indicated in the studies are identified. In the subsequent part of the paper, an approach for the commercial consolidation of infrastructure concessions is presented, where the concessionaire is paid following a finalistic performance model, which includes the overall fiscal balance of regional growth. Finally, the paper analyses the usefulness of the model in coping with the critical success factors explained before. (Author)

1. Repair-misrepair model of cell survival

International Nuclear Information System (INIS)

Tobias, C.A.; Blakely, E.A.; Ngo, F.Q.H.

1980-01-01

During the last three years a new model, the repair-misrepair model (RMR), has been proposed to interpret radiobiological experiments with heavy ions. In using the RMR model it became apparent that some of its features are suitable for handling the effects produced by a variety of environmental agents in addition to ionizing radiation. Two separate sequences of events are assumed to take place in an irradiated cell. The first sequence begins with an initial energy transfer consisting of ionizations and excitations, culminating via fast secondary physical and chemical processes in established macromolecular lesions in essential cell structures. The second sequence contains the responses of the cell to the lesions and consists of the processes of recognition and molecular repair. In normal cells there exists one repair process, or at most a few enzymatic repair processes, for each essential macromolecular lesion. The enzymatic repair processes may last from minutes to hours, and can be separated in time from the initial physicochemical and later genetic phases

2. Exploring location influences on firm survival rates using parametric duration models

NARCIS (Netherlands)

Manzato, G.G.; Arentze, T.A.; Timmermans, H.J.P.; Ettema, D.F.; Timmermans, H.J.P.; Vries, de B.

2010-01-01

Using parametric duration models applied to an office firm dataset, we carried out an exploratory study about the location influences on firm survival rates. Amongst the variables included, we found that accessibility to infrastructure supply, regional effects, demographic and economic aspects, and

3. Exploring location influences on firm survival rates using parametric duration models

NARCIS (Netherlands)

Manzato, G.G.; Arentze, T.A.; Timmermans, H.J.P.; Ettema, D.F.

2011-01-01

Using parametric duration models applied to an office firm dataset, we carried out an exploratory study about the location influences on firm survival rates. Amongst the variables included, we found that accessibility to infrastructure supply, regional effects, demographic and economic aspects, and

4. Exploration of location influences on firm survival rates using parametric duration models

NARCIS (Netherlands)

Manzato, G.G.; Arentze, T.A.; Timmermans, H.J.P.; Ettema, D.F.

2011-01-01

This study explored the influences of location on business firm survival rates with the use of parametric duration models applied to a data set. Of the variables included, those found to be the most significant were accessibility to infrastructure supply, regional effects, demographic and economic

5. Modeling survival: application of the Andersen-Gill model to Yellowstone grizzly bears

Science.gov (United States)

Johnson, Christopher J.; Boyce, Mark S.; Schwartz, Charles C.; Haroldson, Mark A.

2004-01-01

Wildlife ecologists often use the Kaplan-Meier procedure or Cox proportional hazards model to estimate survival rates, distributions, and magnitude of risk factors. The Andersen-Gill formulation (A-G) of the Cox proportional hazards model has seen limited application to mark-resight data but has a number of advantages, including the ability to accommodate left-censored data, time-varying covariates, multiple events, and discontinuous intervals of risks. We introduce the A-G model including structure of data, interpretation of results, and assessment of assumptions. We then apply the model to 22 years of radiotelemetry data for grizzly bears (Ursus arctos) of the Greater Yellowstone Grizzly Bear Recovery Zone in Montana, Idaho, and Wyoming, USA. We used Akaike's Information Criterion (AICc) and multi-model inference to assess a number of potentially useful predictive models relative to explanatory covariates for demography, human disturbance, and habitat. Using the most parsimonious models, we generated risk ratios, hypothetical survival curves, and a map of the spatial distribution of high-risk areas across the recovery zone. Our results were in agreement with past studies of mortality factors for Yellowstone grizzly bears. Holding other covariates constant, mortality was highest for bears that were subjected to repeated management actions and inhabited areas with high road densities outside Yellowstone National Park. Hazard models developed with covariates descriptive of foraging habitats were not the most parsimonious, but they suggested that high-elevation areas offered lower risks of mortality when compared to agricultural areas.

6. Statistical models and methods for reliability and survival analysis

CERN Document Server

Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo

2013-01-01

Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

7. Modelling survival: exposure pattern, species sensitivity and uncertainty.

Science.gov (United States)

Ashauer, Roman; Albert, Carlo; Augustine, Starrlight; Cedergreen, Nina; Charles, Sandrine; Ducrot, Virginie; Focks, Andreas; Gabsi, Faten; Gergs, André; Goussen, Benoit; Jager, Tjalling; Kramer, Nynke I; Nyman, Anna-Maija; Poulsen, Veronique; Reichenberger, Stefan; Schäfer, Ralf B; Van den Brink, Paul J; Veltman, Karin; Vogel, Sören; Zimmer, Elke I; Preuss, Thomas G

2016-07-06

The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test the ability of GUTS to predict survival of aquatic organisms across different pesticide exposure patterns, time scales and species. Firstly, using synthetic data, we identified experimental data requirements which allow for the estimation of all parameters of the GUTS proper model. Secondly, we assessed how well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates to build species sensitivity distributions for different exposure patterns. We find that GUTS adequately predicts survival across exposure patterns that vary over time. When toxicity is assessed for time-variable concentrations species may differ in their responses depending on the exposure profile. This can result in different species sensitivity rankings and safe levels. The interplay of exposure pattern and species sensitivity deserves systematic investigation in order to better understand how organisms respond to stress, including humans.
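The core mechanics GUTS builds on can be illustrated with its stochastic-death (SD) special case: scaled internal concentration follows one-compartment toxicokinetics, and the hazard rate rises linearly once that concentration exceeds a threshold. All parameter values and the pulsed exposure profile below are made up for the sketch, not calibrated to any of the paper's datasets:

```python
import math

# Minimal sketch of the GUTS stochastic-death (SD) variant under a pulsed
# exposure. Parameters are illustrative: dominant rate constant ke,
# killing rate b, threshold z, background hazard hb.
ke, b, z, hb = 0.5, 0.3, 1.0, 0.01

def external(t):
    """Pulsed exposure: 4 concentration units for two days, then clean water."""
    return 4.0 if t < 2.0 else 0.0

def survival(t_end, dt=0.001):
    """S(t_end) = exp(-integral of hazard), integrated by forward Euler."""
    c, cum_hazard, t = 0.0, 0.0, 0.0
    while t < t_end:
        c += ke * (external(t) - c) * dt            # scaled internal concentration
        cum_hazard += (b * max(0.0, c - z) + hb) * dt
        t += dt
    return math.exp(-cum_hazard)

print(round(survival(4.0), 3))  # survival probability after the pulse
```

Because the hazard depends on the internal rather than external concentration, the same model predicts effects of pulses, constant exposures, and recovery phases, which is the property the three studies test.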

8. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

DEFF Research Database (Denmark)

Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

2014-01-01

A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics although the discussion presented co...... applications. The methods presented are implemented in such a way that large and complex quantitative genetic data can be analyzed......A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics although the discussion presented...... concentrates on longevity studies. The framework presented allows to combine models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function will be assumed to be piece-wise constant...

9. Multivariate Survival Mixed Models for Genetic Analysis of Longevity Traits

DEFF Research Database (Denmark)

Pimentel Maia, Rafael; Madsen, Per; Labouriau, Rodrigo

2013-01-01

A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics although the discussion presented co...... applications. The methods presented are implemented in such a way that large and complex quantitative genetic data can be analyzed......A class of multivariate mixed survival models for continuous and discrete time with a complex covariance structure is introduced in a context of quantitative genetic applications. The methods introduced can be used in many applications in quantitative genetics although the discussion presented...... concentrates on longevity studies. The framework presented allows to combine models based on continuous time with models based on discrete time in a joint analysis. The continuous time models are approximations of the frailty model in which the hazard function will be assumed to be piece-wise constant...

10. Discrete dynamic modeling of T cell survival signaling networks

Science.gov (United States)

Zhang, Ranran

2009-03-01

Biochemistry-based frameworks are often not applicable for the modeling of heterogeneous regulatory systems that are sparsely documented in terms of quantitative information. As an alternative, qualitative models assuming a small set of discrete states are gaining acceptance. This talk will present a discrete dynamic model of the signaling network responsible for the survival and long-term competence of cytotoxic T cells in the blood cancer T-LGL leukemia. We integrated the signaling pathways involved in normal T cell activation and the known deregulations of survival signaling in leukemic T-LGL, and formulated the regulation of each network element as a Boolean (logic) rule. Our model suggests that the persistence of two signals is sufficient to reproduce all known deregulations in leukemic T-LGL. It also indicates the nodes whose inactivity is necessary and sufficient for the reversal of the T-LGL state. We have experimentally validated several model predictions, including: (i) Inhibiting PDGF signaling induces apoptosis in leukemic T-LGL. (ii) Sphingosine kinase 1 and NFκB are essential for the long-term survival of T cells in T-LGL leukemia. (iii) T box expressed in T cells (T-bet) is constitutively activated in the T-LGL state. The model has identified potential therapeutic targets for T-LGL leukemia and can be used for generating long-term competent CTL necessary for tumor and cancer vaccine development. The success of this model, and of other discrete dynamic models, suggests that the organization of signaling networks has a determining role in their dynamics. Reference: R. Zhang, M. V. Shah, J. Yang, S. B. Nyland, X. Liu, J. K. Yun, R. Albert, T. P. Loughran, Jr., Network Model of Survival Signaling in LGL Leukemia, PNAS 105, 16308-16313 (2008).
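The Boolean-rule formalism the talk describes can be sketched with a three-node toy network; the nodes and rules below are illustrative stand-ins inspired by the abstract (sustained PDGF keeping apoptosis off), not the published 60-node model:

```python
# Toy Boolean-rule simulation in the spirit of the T-LGL survival network.
# Synchronous updates are iterated until the state stops changing.
def step(state, pdgf_input):
    return {
        "SPHK1": pdgf_input,             # sustained PDGF input keeps SPHK1 on
        "NFKB": state["SPHK1"],          # SPHK1 activates NFkB
        "Apoptosis": not state["NFKB"],  # active NFkB blocks apoptosis
    }

def fixed_point(pdgf_input):
    state = {"SPHK1": False, "NFKB": False, "Apoptosis": False}
    for _ in range(10):                  # tiny network: converges in a few steps
        nxt = step(state, pdgf_input)
        if nxt == state:
            break
        state = nxt
    return state

print(fixed_point(True))   # persistent PDGF: apoptosis node settles to off
print(fixed_point(False))  # PDGF inhibited: apoptosis node settles to on
```

Even this toy version reproduces the logic of prediction (i): removing the persistent input signal flips the apoptosis node on at the attractor.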

11. Modelling survival and connectivity of Mnemiopsis leidyi in the south-western North Sea and Scheldt estuaries

DEFF Research Database (Denmark)

van der Molen, J.; van Beek, J.; Augustine, Starrlight

2015-01-01

Three different models were applied to study the reproduction, survival and dispersal of Mnemiopsis leidyi in the Scheldt estuaries and the southern North Sea: a high-resolution particle tracking model with passive particles, a low-resolution particle tracking model with a reproduction model coup...

12. Parameter resolution in two models for cell survival after radiation

International Nuclear Information System (INIS)

Di Cera, E.; Andreasi Bassi, F.; Arcovito, G.

1989-01-01

The resolvability of model parameters for the linear-quadratic and the repair-misrepair models for cell survival after radiation has been studied by Monte Carlo simulations as a function of the number of experimental data points collected in a given dose range and the experimental error. Statistical analysis of the results reveals the range of experimental conditions under which the model parameters can be resolved with sufficient accuracy, and points out some differences in the operational aspects of the two models. (orig.)
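The simulation setup for the linear-quadratic case can be sketched as follows: generate noisy "measured" values of -ln S(D) = αD + βD² over a dose range, then recover α and β by least squares. The parameter values, dose grid, and noise level are invented for illustration:

```python
import random

# Sketch of a parameter-resolution experiment for the linear-quadratic
# model, -ln S(D) = alpha*D + beta*D^2, with illustrative parameters.
ALPHA, BETA = 0.2, 0.05  # assumed "true" values, units Gy^-1 and Gy^-2

def simulate(doses, sigma, rng):
    """Noisy observations of y = -ln S(D)."""
    return [ALPHA*d + BETA*d*d + rng.gauss(0.0, sigma) for d in doses]

def fit_lq(doses, y):
    """Least squares for y = a*D + b*D^2 (no intercept): 2x2 normal equations."""
    s11 = sum(d**2 for d in doses)
    s12 = sum(d**3 for d in doses)
    s22 = sum(d**4 for d in doses)
    b1 = sum(d*yi for d, yi in zip(doses, y))
    b2 = sum(d*d*yi for d, yi in zip(doses, y))
    det = s11*s22 - s12*s12
    return (b1*s22 - b2*s12)/det, (s11*b2 - s12*b1)/det

rng = random.Random(42)
doses = [1, 2, 3, 4, 5, 6, 7, 8]
a_hat, b_hat = fit_lq(doses, simulate(doses, sigma=0.02, rng=rng))
print(a_hat, b_hat)  # estimates scatter around the true alpha and beta
```

Repeating the fit over many simulated datasets, while varying the number of dose points and the noise level, yields the distribution of (a_hat, b_hat) from which resolvability is judged.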

13. Survival prediction model for postoperative hepatocellular carcinoma patients.

Science.gov (United States)

Ren, Zhihui; He, Shasha; Fan, Xiaotang; He, Fangping; Sang, Wei; Bao, Yongxing; Ren, Weixin; Zhao, Jinming; Ji, Xuewen; Wen, Hao

2017-09-01

This study aims to establish a predictive index (PI) model of the 5-year survival rate for patients with hepatocellular carcinoma (HCC) after radical resection and to evaluate its prediction sensitivity, specificity, and accuracy. Patients who underwent HCC surgical resection were enrolled and randomly divided into a prediction model group (101 patients) and a model evaluation group (100 patients). The Cox regression model was used for univariate and multivariate survival analysis. A PI model was established based on the multivariate analysis and a receiver operating characteristic (ROC) curve was drawn accordingly. The area under the ROC curve (AUROC) and the PI cutoff value were identified. Multiple Cox regression analysis of the prediction model group showed that neutrophil to lymphocyte ratio (NLR), histological grade (HG), microvascular invasion (MVI), positive resection margin (PRM), number of tumors (NT), and postoperative transcatheter arterial chemoembolization (TACE) treatment were the independent predictors of the 5-year survival rate for HCC patients. The model was PI = 0.377 × NLR + 0.554 × HG + 0.927 × PRM + 0.778 × MVI + 0.740 × NT - 0.831 × TACE. In the prediction model group, the AUROC was 0.832 and the PI cutoff value was 3.38. The sensitivity, specificity, and accuracy were 78.0%, 80.0%, and 79.2%, respectively. In the model evaluation group, the AUROC was 0.822, and the PI cutoff value corresponded well to the prediction model group, with sensitivity, specificity, and accuracy of 85.0%, 83.3%, and 84.0%, respectively. The PI model can quantify the mortality risk of hepatitis B related HCC with high sensitivity, specificity, and accuracy.
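The reported index is a plain linear score, so it can be transcribed directly; note the abstract does not specify how each covariate is coded (e.g. how grade or tumor number are scored), so the input values below are hypothetical:

```python
# Direct transcription of the predictive index from the abstract:
# PI = 0.377*NLR + 0.554*HG + 0.927*PRM + 0.778*MVI + 0.740*NT - 0.831*TACE,
# compared against the reported cutoff of 3.38.
def predictive_index(nlr, hg, prm, mvi, nt, tace):
    return 0.377*nlr + 0.554*hg + 0.927*prm + 0.778*mvi + 0.740*nt - 0.831*tace

def high_risk(pi, cutoff=3.38):
    return pi > cutoff

# Hypothetical patient: NLR 3.0, grade 2, positive margin, microvascular
# invasion, one tumor score unit, no postoperative TACE.
pi = predictive_index(nlr=3.0, hg=2, prm=1, mvi=1, nt=1, tace=0)
print(round(pi, 3), high_risk(pi))
```

The negative TACE coefficient matches the abstract's direction of effect: postoperative TACE lowers the index and hence the predicted mortality risk.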

14. Predicting water main failures using Bayesian model averaging and survival modelling approach

International Nuclear Information System (INIS)

Kabir, Golam; Tesfamariam, Solomon; Sadiq, Rehan

2015-01-01

To develop an effective preventive or proactive repair and replacement action plan, water utilities often rely on water main failure prediction models. However, in predicting the failure of water mains, uncertainty is inherent regardless of the quality and quantity of data used in the model. To improve the understanding of water main failure, a Bayesian framework is developed for predicting the failure of water mains considering uncertainties. In this study, the Bayesian model averaging method (BMA) is presented to identify the influential pipe-dependent and time-dependent covariates considering model uncertainties, whereas a Bayesian Weibull Proportional Hazard Model (BWPHM) is applied to develop the survival curves and to predict the failure rates of water mains. To validate the proposed framework, it is implemented to predict the failure of cast iron (CI) and ductile iron (DI) pipes of the water distribution network of the City of Calgary, Alberta, Canada. Results indicate that the predicted 95% uncertainty bounds of the proposed BWPHMs effectively capture the observed breaks for both CI and DI water mains. Moreover, the performance of the proposed BWPHMs is better than that of the Cox proportional hazards model (Cox-PHM), owing to the Weibull distribution used for the baseline hazard function and the treatment of model uncertainties. - Highlights: • Prioritize rehabilitation and replacement (R/R) strategies of water mains. • Consider the uncertainties for the failure prediction. • Improve the prediction capability of the water main failure models. • Identify the influential and appropriate covariates for different models. • Determine the effects of the covariates on failure
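The survival function a Weibull proportional-hazards model produces has a closed form, which can be sketched directly; the shape, scale, and coefficient values below are invented for illustration (the paper estimates them in a Bayesian framework, with BMA selecting the covariates):

```python
import math

# Sketch of a Weibull proportional-hazards survival curve for a water main:
# S(t | x) = exp( -(t/lam)^k * exp(x . beta) ),
# i.e. a Weibull baseline survival scaled in the hazard by the covariates.
def weibull_ph_survival(t, x, beta, k=1.8, lam=60.0):
    """P(no failure by age t years) for covariate vector x (values assumed)."""
    lin_pred = sum(xi * bi for xi, bi in zip(x, beta))
    return math.exp(-((t / lam) ** k) * math.exp(lin_pred))

beta = [0.02, 0.4]  # e.g. pipe length per 100 m, corrosive-soil flag (assumed)
old_pipe = weibull_ph_survival(40.0, [3.0, 1.0], beta)  # 40-year-old, corrosive soil
new_pipe = weibull_ph_survival(10.0, [3.0, 0.0], beta)  # 10-year-old, benign soil
print(round(old_pipe, 3), round(new_pipe, 3))
```

With shape k > 1 the baseline hazard increases with pipe age, which is the usual motivation for preferring a Weibull baseline over the unspecified baseline of a Cox model when extrapolating failure rates.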

15. Joint modelling of longitudinal CEA tumour marker progression and survival data on breast cancer

Science.gov (United States)

Borges, Ana; Sousa, Inês; Castro, Luis

2017-06-01

This work proposes the use of biostatistical methods to study breast cancer in patients of the Senology Unit of Braga Hospital, located in Portugal. The primary motivation is to contribute to the understanding of the progression of breast cancer within the Portuguese population, using statistical model assumptions more complex than the traditional analysis, taking into account the possible existence of a serial correlation structure among observations of the same subject. We aim to infer which risk factors affect the survival of Braga Hospital patients diagnosed with a breast tumour, whilst analysing risk factors that affect a tumour marker used in the surveillance of disease progression, the carcinoembryonic antigen (CEA). As the survival and longitudinal processes may be associated, it is important to model these two processes together. Hence, a joint modelling of these two processes was conducted to infer on their association. A data set of 540 patients, along with 50 variables, was collected from medical records of the hospital. A joint model approach was used to analyse these data. Two different joint models, with different parameterizations that give different interpretations to the model parameters, were applied to the same data set. These were used by convenience as the ones implemented in R software, and results from the two models were compared. Results from the joint models showed that the longitudinal CEA values were significantly associated with the survival probability of these patients. A comparison between parameter estimates obtained in this analysis and previous independent survival [4] and longitudinal analyses [5][6] leads us to conclude that independent analysis yields biased parameter estimates. Hence, an assumption of association between the two processes in a joint model of breast cancer data is necessary. Results indicate that the longitudinal progression of CEA is significantly associated with the probability of survival of these patients. Hence, an assumption of

16. Mathematical Modeling Applied to Maritime Security

OpenAIRE

Center for Homeland Defense and Security

2010-01-01

Center for Homeland Defense and Security, OUT OF THE CLASSROOM. Download the paper: "Layered Defense: Modeling Terrorist Transfer Threat Networks and Optimizing Network Risk Reduction". Students in Ted Lewis' Critical Infrastructure Protection course are taught how mathematical modeling can provide...

17. Applying Modeling Tools to Ground System Procedures

Science.gov (United States)

Di Pasquale, Peter

2012-01-01

As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

18. Applied Geography Internships: Operational Canadian Models.

Science.gov (United States)

Foster, L. T.

1982-01-01

Anxious to maintain student enrollments, geography departments have placed greater emphasis on the applied nature of the discipline. Described are (1) the advantages of internships in college geography curricula that enable students to gain firsthand knowledge about the usefulness of geography in real world situations and (2) operational models…

19. Applied Creativity: The Creative Marketing Breakthrough Model

Science.gov (United States)

Titus, Philip A.

2007-01-01

Despite the increasing importance of personal creativity in today's business environment, few conceptual creativity frameworks have been presented in the marketing education literature. The purpose of this article is to advance the integration of creativity instruction into marketing classrooms by presenting an applied creative marketing…

20. Applying Olap Model On Public Finance Management

OpenAIRE

Dorde Pavlovic; Branko Gledovic

2011-01-01

Budget control derives from one of the main functions of the budget: ensuring that the budget is a control instrument for the acquisition and spending of budget funds. The OLAP model is an instrument that finds its place in the budget planning process, the executive phases of the budget, accountancy, etc. There is a direct correlation between the OLAP model and the public finance management process.

1. Applying the Sport Education Model to Tennis

Science.gov (United States)

Ayvazo, Shiri

2009-01-01

The physical education field abounds with theoretically sound curricular approaches such as fitness education, skill theme approach, tactical approach, and sport education. In an era that emphasizes authentic sport experiences, the Sport Education Model includes unique features that sets it apart from other curricular models and can be a valuable…

2. Estimating the average treatment effect on survival based on observational data and using partly conditional modeling.

Science.gov (United States)

Gong, Qi; Schaubel, Douglas E

2017-03-01

Treatments are frequently evaluated in terms of their effect on patient survival. In settings where randomization of treatment is not feasible, observational data are employed, necessitating correction for covariate imbalances. Treatments are usually compared using a hazard ratio. Most existing methods which quantify the treatment effect through the survival function are applicable to treatments assigned at time 0. In the data structure of our interest, subjects typically begin follow-up untreated; time-until-treatment, and the pretreatment death hazard are both heavily influenced by longitudinal covariates; and subjects may experience periods of treatment ineligibility. We propose semiparametric methods for estimating the average difference in restricted mean survival time attributable to a time-dependent treatment, the average effect of treatment among the treated, under current treatment assignment patterns. The pre- and posttreatment models are partly conditional, in that they use the covariate history up to the time of treatment. The pre-treatment model is estimated through recently developed landmark analysis methods. For each treated patient, fitted pre- and posttreatment survival curves are projected out, then averaged in a manner which accounts for the censoring of treatment times. Asymptotic properties are derived and evaluated through simulation. The proposed methods are applied to liver transplant data in order to estimate the effect of liver transplantation on survival among transplant recipients under current practice patterns. © 2016, The International Biometric Society.
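The estimand the abstract targets, a difference in restricted mean survival time (RMST), is the difference in area under two survival curves up to a horizon. The exponential curves and hazard values below are placeholders for the fitted pre- and posttreatment curves, chosen only to make the arithmetic concrete:

```python
import math

# Sketch: restricted mean survival time (RMST) is the area under a survival
# curve up to a horizon tau; the treatment effect is a difference of areas.
def rmst(surv, tau, n=10_000):
    """Trapezoidal integral of surv(t) over [0, tau]."""
    h = tau / n
    total = 0.5 * (surv(0.0) + surv(tau))
    total += sum(surv(i * h) for i in range(1, n))
    return total * h

pre = lambda t: math.exp(-0.25 * t)   # assumed hazard 0.25/yr without treatment
post = lambda t: math.exp(-0.10 * t)  # assumed hazard 0.10/yr after treatment

tau = 10.0
gain = rmst(post, tau) - rmst(pre, tau)
print(round(gain, 2))  # extra expected life-years within the 10-year window
```

In the paper's setting the post curve is projected from each treated patient's covariate history at treatment time rather than assumed, and the gains are averaged over treated patients with a correction for censored treatment times.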

3. Analyzing sickness absence with statistical models for survival data

DEFF Research Database (Denmark)

Christensen, Karl Bang; Andersen, Per Kragh; Smith-Hansen, Lars

2007-01-01

OBJECTIVES: Sickness absence is the outcome in many epidemiologic studies and is often based on summary measures such as the number of sickness absences per year. In this study the use of modern statistical methods was examined by making better use of the available information. Since sickness absence data deal with events occurring over time, the use of statistical models for survival data has been reviewed, and the use of frailty models has been proposed for the analysis of such data. METHODS: Three methods for analyzing data on sickness absences were compared using a simulation study involving the following: (i) Poisson regression using a single outcome variable (number of sickness absences), (ii) analysis of time to first event using the Cox proportional hazards model, and (iii) frailty models, which are random effects proportional hazards models. Data from a study of the relation...

4. Machine learning models in breast cancer survival prediction.

Science.gov (United States)

2016-01-01

Breast cancer is one of the most common cancers, with a high mortality rate among women. With early diagnosis of breast cancer, survival will increase from 56% to more than 86%. Therefore, an accurate and reliable system is necessary for the early diagnosis of this cancer. The proposed model is a combination of rules and different machine learning techniques. Machine learning models can help physicians to reduce the number of false decisions. They try to exploit patterns and relationships among a large number of cases and predict the outcome of a disease using historical cases stored in datasets. The objective of this study is to propose a rule-based classification method with machine learning techniques for the prediction of different types of breast cancer survival. We used a dataset with eight attributes containing the records of 900 patients, of whom 876 (97.3%) were female and 24 (2.7%) male. Naive Bayes (NB), Trees Random Forest (TRF), 1-Nearest Neighbor (1NN), AdaBoost (AD), Support Vector Machine (SVM), RBF Network (RBFN), and Multilayer Perceptron (MLP) machine learning techniques with 10-fold cross-validation were used with the proposed model for the prediction of breast cancer survival. The performance of the machine learning techniques was evaluated with accuracy, precision, sensitivity, specificity, and area under the ROC curve. Out of 900 patients, 803 patients and 97 patients were alive and dead, respectively. In this study, the Trees Random Forest (TRF) technique showed better results in comparison to the other techniques (NB, 1NN, AD, SVM, RBFN, MLP). The accuracy, sensitivity, and area under the ROC curve of TRF are 96%, 96%, and 93%, respectively. However, the 1NN machine learning technique provided poor performance (accuracy 91%, sensitivity 91%, and area under the ROC curve 78%). This study demonstrates that the Trees Random Forest model (TRF), which is a rule-based classification model, was the best model with the highest level of

5. Estimating true instead of apparent survival using spatial Cormack-Jolly-Seber models

Science.gov (United States)

Schaub, Michael; Royle, J. Andrew

2014-01-01

Survival is often estimated from capture–recapture data using Cormack–Jolly–Seber (CJS) models, where mortality and emigration cannot be distinguished, and the estimated apparent survival probability is the product of the probabilities of true survival and of study area fidelity. Consequently, apparent survival is lower than true survival unless study area fidelity equals one. Underestimation of true survival from capture–recapture data is a main limitation of the method.
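The identity underlying this abstract is simple to state: apparent survival is the product of true survival and study-area fidelity, so it understates true survival whenever fidelity is below one. A tiny illustration with hypothetical probabilities:

```python
def apparent_survival(true_survival, fidelity):
    """CJS apparent survival = P(survive) * P(remain in study area)."""
    return true_survival * fidelity

# With 90% true survival but only 80% site fidelity, apparent
# survival drops to 72% -- an 18-point underestimate.
print(round(apparent_survival(0.9, 0.8), 2))  # 0.72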

6. Applied modelling and computing in social science

CERN Document Server

Povh, Janez

2015-01-01

In social science, outstanding results are achieved with advanced simulation methods based on state-of-the-art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extraction from big data, better understanding and prediction of social behaviour, and modelling of health and environmental changes.

7. Survival models for harvest management of mourning dove populations

Science.gov (United States)

Otis, D.L.

2002-01-01

Quantitative models of the relationship between annual survival and harvest rate of migratory game-bird populations are essential to science-based harvest management strategies. I used the best available band-recovery and harvest data for mourning doves (Zenaida macroura) to build a set of models based on different assumptions about compensatory harvest mortality. Although these models suffer from lack of contemporary data, they can be used in development of an initial set of population models that synthesize existing demographic data on a management-unit scale, and serve as a tool for prioritization of population demographic information needs. Credible harvest management plans for mourning dove populations will require a long-term commitment to population monitoring and iterative population analysis.

8. Flexible Modeling of Survival Data with Covariates Subject to Detection Limits via Multiple Imputation.

Science.gov (United States)

Bernhardt, Paul W; Wang, Huixia Judy; Zhang, Daowen

2014-01-01

Models for survival data generally assume that covariates are fully observed. However, in medical studies it is not uncommon for biomarkers to be censored at known detection limits. A computationally-efficient multiple imputation procedure for modeling survival data with covariates subject to detection limits is proposed. This procedure is developed in the context of an accelerated failure time model with a flexible seminonparametric error distribution. The consistency and asymptotic normality of the multiple imputation estimator are established and a consistent variance estimator is provided. An iterative version of the proposed multiple imputation algorithm that approximates the EM algorithm for maximum likelihood is also suggested. Simulation studies demonstrate that the proposed multiple imputation methods work well while alternative methods lead to estimates that are either biased or more variable. The proposed methods are applied to analyze the dataset from a recently-conducted GenIMS study.
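Whatever model is fitted to each completed dataset, a multiple-imputation procedure like the one described here ends by pooling the per-imputation estimates with Rubin's rules (average the point estimates; combine within- and between-imputation variance). A minimal sketch of the pooling step, independent of the paper's specific AFT model:

```python
def rubin_pool(estimates, variances):
    """Pool m imputation-specific estimates and variances via Rubin's rules."""
    m = len(estimates)
    q_bar = sum(estimates) / m                          # pooled point estimate
    within = sum(variances) / m                         # avg within-imputation variance
    between = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)
    total = within + (1 + 1 / m) * between              # total variance
    return q_bar, total

# Three hypothetical imputation-specific fits of the same coefficient:
est, var = rubin_pool([1.0, 1.2, 1.1], [0.04, 0.05, 0.045])
```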

9. Connecting single-stock assessment models through correlated survival

DEFF Research Database (Denmark)

Albertsen, Christoffer Moesgaard; Nielsen, Anders; Thygesen, Uffe Høgsbro

2017-01-01

… times. We propose a simple alternative. In three case studies, each with two stocks, we improve the single-stock models, as measured by the Akaike information criterion, by adding correlation in the cohort survival. To limit the number of parameters, the correlations are parameterized through … the corresponding partial correlations. We consider six models where the partial correlation matrix between stocks follows a band structure ranging from independent assessments to complex correlation structures. Further, a simulation study illustrates the importance of handling correlated data sufficiently … by investigating the coverage of confidence intervals for estimated fishing mortality. The results presented will allow managers to evaluate stock statuses based on a more accurate evaluation of model output uncertainty. The methods are directly implementable for stocks with an analytical assessment and do …

10. Predictive modelling of Lactobacillus casei KN291 survival in fermented soy beverage.

Science.gov (United States)

Zielińska, Dorota; Dorota, Zielińska; Kołożyn-Krajewska, Danuta; Danuta, Kołożyn-Krajewska; Goryl, Antoni; Antoni, Goryl; Motyl, Ilona

2014-02-01

The aim of the study was to construct and verify predictive growth and survival models of a potentially probiotic bacterium in fermented soy beverage. The research material included natural soy beverage (Polgrunt, Poland) and a strain of lactic acid bacteria (LAB), Lactobacillus casei KN291. To construct predictive models for the growth and survival of L. casei KN291 in the fermented soy beverage, we designed an experiment that allowed the collection of CFU data. Fermented soy beverage samples were stored at various temperatures (5, 10, 15, and 20°C) for 28 days. On the basis of the obtained data concerning the survival of L. casei KN291 in soy beverage under different temperature and time conditions, two non-linear models (r² = 0.68-0.93) and two surface models (r² = 0.76-0.79) were constructed; these models described the behaviour of the bacteria in the product to a satisfactory extent. Verification of the surface models was carried out using validation data collected at 7°C over 28 days. The applied models were found to be well fitted and subject to only small systematic errors, as evidenced by the accuracy factor (Af), bias factor (Bf), and mean squared error (MSE). The constructed microbiological growth and survival models of L. casei KN291 in fermented soy beverage enable estimation of the product's shelf-life, which in this case is defined by the requirement that the level of the bacteria remain above 10⁶ CFU/cm³. The constructed models may be useful as a tool for manufacturers of probiotic foods to estimate shelf-life.
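The shelf-life criterion used here (counts remaining above 10⁶ CFU/cm³) can be estimated from a fitted decay constant. A hedged sketch assuming simple first-order exponential decline; the paper's actual models are nonlinear and surface models, and the numbers below are hypothetical:

```python
import math

def shelf_life_days(n0, k_per_day, threshold=1e6):
    """Days until the count N(t) = n0 * exp(-k*t) falls to the threshold."""
    return math.log(n0 / threshold) / k_per_day

# Hypothetical inputs: starting at 1e8 CFU/cm3 with k = 0.1 per day,
# the product stays above 1e6 CFU/cm3 for about 46 days.
print(round(shelf_life_days(1e8, 0.1)))  # 46
```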

11. Private healthcare quality: applying a SERVQUAL model.

Science.gov (United States)

Butt, Mohsin Muhammad; de Run, Ernest Cyril

2010-01-01

This paper seeks to develop and test the SERVQUAL model scale for measuring Malaysian private health service quality. The study consists of 340 randomly selected participants visiting a private healthcare facility during a three-month data collection period. Data were analyzed using means, correlations, principal component and confirmatory factor analysis to establish the modified SERVQUAL scale's reliability, underlying dimensionality and convergent, discriminant validity. Results indicate a moderate negative quality gap for overall Malaysian private healthcare service quality. Results also indicate a moderate negative quality gap on each service quality scale dimension. However, scale development analysis yielded excellent results, which can be used in wider healthcare policy and practice. Respondents were skewed towards a younger population, causing concern that the results might not represent all Malaysian age groups. The study's major contribution is that it offers a way to assess private healthcare service quality. Second, it successfully develops a scale that can be used to measure health service quality in Malaysian contexts.

12. SU-E-T-131: Artificial Neural Networks Applied to Overall Survival Prediction for Patients with Periampullary Carcinoma

Energy Technology Data Exchange (ETDEWEB)

Gong, Y; Yu, J; Yeung, V; Palmer, J; Yu, Y; Lu, B; Babinsky, L; Burkhart, R; Leiby, B; Siow, V; Lavu, H; Rosato, E; Winter, J; Lewis, N; Sama, A; Mitchell, E; Anne, P; Hurwitz, M; Yeo, C; Bar-Ad, V [Thomas Jefferson University Hospital, Philadelphia, PA (United States); and others

2015-06-15

Purpose: Artificial neural networks (ANN) can be used to discover complex relations within datasets to help with medical decision making. This study aimed to develop an ANN method to predict two-year overall survival of patients with peri-ampullary cancer (PAC) following resection. Methods: Data were collected from 334 patients with PAC following resection treated in our institutional pancreatic tumor registry between 2006 and 2012. The dataset contains 14 variables including age, gender, T-stage, tumor differentiation, positive-lymph-node ratio, positive resection margins, chemotherapy, radiation therapy, and tumor histology. After censoring for two-year survival analysis, 309 patients remained, of whom 44 (∼15%) were randomly selected to form the testing set. The remaining 265 cases were randomly divided into a training set (211 cases, ∼80% of 265) and a validation set (54 cases, ∼20% of 265) 20 times to build 20 ANN models. Each ANN has one hidden layer with 5 units. The 20 ANN models were ranked according to their concordance index (c-index) of prediction on the validation sets. To further improve prediction, the top 10% of ANN models were selected and their outputs averaged for prediction on the testing set. Results: By random division, the 44 cases in the testing set and the remaining 265 cases have approximately equal two-year survival rates, 36.4% and 35.5% respectively. The 20 ANN models, which were trained and validated on the 265 cases, yielded mean c-indexes of 0.59 and 0.63 on the validation sets and the testing set, respectively. The c-index was 0.72 when the two best ANN models (the top 10%) were used in prediction on the testing set. The c-index of Cox regression analysis was 0.63. Conclusion: ANN improved survival prediction for patients with PAC. More patient data and further analysis of additional factors may be needed for a more robust model, which will help guide physicians in providing optimal post-operative care. This project was supported by a PA CURE Grant.
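The ensemble step in the Methods (rank the 20 models by validation c-index, then average the outputs of the top 10%) can be sketched generically. All names and numbers here are illustrative, not the study's trained networks:

```python
def average_top_models(models, top_fraction=0.10):
    """models: list of (validation_c_index, predictions) pairs.
    Returns the element-wise mean of the predictions from the
    top `top_fraction` of models ranked by validation c-index."""
    n_keep = max(1, round(len(models) * top_fraction))
    best = sorted(models, key=lambda m: m[0], reverse=True)[:n_keep]
    preds = [p for _, p in best]
    return [sum(col) / n_keep for col in zip(*preds)]

# Four hypothetical models with per-patient survival probabilities;
# top_fraction=0.5 keeps the two best (c-index 0.72 and 0.71).
models = [
    (0.59, [0.2, 0.8]),
    (0.71, [0.3, 0.9]),
    (0.72, [0.5, 0.7]),
    (0.55, [0.1, 0.6]),
]
ensemble = average_top_models(models, top_fraction=0.5)
```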

13. SU-E-T-131: Artificial Neural Networks Applied to Overall Survival Prediction for Patients with Periampullary Carcinoma

International Nuclear Information System (INIS)

Gong, Y; Yu, J; Yeung, V; Palmer, J; Yu, Y; Lu, B; Babinsky, L; Burkhart, R; Leiby, B; Siow, V; Lavu, H; Rosato, E; Winter, J; Lewis, N; Sama, A; Mitchell, E; Anne, P; Hurwitz, M; Yeo, C; Bar-Ad, V

2015-01-01

Purpose: Artificial neural networks (ANN) can be used to discover complex relations within datasets to help with medical decision making. This study aimed to develop an ANN method to predict two-year overall survival of patients with peri-ampullary cancer (PAC) following resection. Methods: Data were collected from 334 patients with PAC following resection treated in our institutional pancreatic tumor registry between 2006 and 2012. The dataset contains 14 variables including age, gender, T-stage, tumor differentiation, positive-lymph-node ratio, positive resection margins, chemotherapy, radiation therapy, and tumor histology. After censoring for two-year survival analysis, 309 patients remained, of whom 44 (∼15%) were randomly selected to form the testing set. The remaining 265 cases were randomly divided into a training set (211 cases, ∼80% of 265) and a validation set (54 cases, ∼20% of 265) 20 times to build 20 ANN models. Each ANN has one hidden layer with 5 units. The 20 ANN models were ranked according to their concordance index (c-index) of prediction on the validation sets. To further improve prediction, the top 10% of ANN models were selected and their outputs averaged for prediction on the testing set. Results: By random division, the 44 cases in the testing set and the remaining 265 cases have approximately equal two-year survival rates, 36.4% and 35.5% respectively. The 20 ANN models, which were trained and validated on the 265 cases, yielded mean c-indexes of 0.59 and 0.63 on the validation sets and the testing set, respectively. The c-index was 0.72 when the two best ANN models (the top 10%) were used in prediction on the testing set. The c-index of Cox regression analysis was 0.63. Conclusion: ANN improved survival prediction for patients with PAC. More patient data and further analysis of additional factors may be needed for a more robust model, which will help guide physicians in providing optimal post-operative care. This project was supported by a PA CURE Grant.

14. Learning to Apply Models of Materials While Explaining Their Properties

Science.gov (United States)

Karpin, Tiia; Juuti, Kalle; Lavonen, Jari

2014-01-01

Background: Applying structural models is important to chemistry education at the upper secondary level, but it is considered one of the most difficult topics to learn. Purpose: This study analyses to what extent in designed lessons students learned to apply structural models in explaining the properties and behaviours of various materials.…

15. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

International Nuclear Information System (INIS)

Taktak, Azzam F G; Fisher, Anthony C; Damato, Bertil E

2004-01-01

This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set of 34 patients unseen by both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100, the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The RMS error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2 years, respectively. We concluded that the AI system can match, if not better, the clinical expert's prediction. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model is more accurate.

16. Modelling survival after treatment of intraocular melanoma using artificial neural networks and Bayes theorem

Energy Technology Data Exchange (ETDEWEB)

Taktak, Azzam F G [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Fisher, Anthony C [Department of Clinical Engineering, Duncan Building, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom); Damato, Bertil E [Department of Ophthalmology, Royal Liverpool University Hospital, Liverpool L7 8XP (United Kingdom)

2004-01-07

This paper describes the development of an artificial intelligence (AI) system for survival prediction from intraocular melanoma. The system used artificial neural networks (ANNs) with five input parameters: coronal and sagittal tumour location, anterior tumour margin, largest basal tumour diameter and the cell type. After excluding records with missing data, 2331 patients were included in the study. These were split randomly into training and test sets. Date censorship was applied to the records to deal with patients who were lost to follow-up and patients who died from general causes. Bayes theorem was then applied to the ANN output to construct survival probability curves. A validation set of 34 patients unseen by both training and test sets was used to compare the AI system with Cox's regression (CR) and Kaplan-Meier (KM) analyses. Results showed large differences in the mean 5-year survival probability figures when the number of records with matching characteristics was small. However, as the number of matches increased to >100, the system tended to agree with CR and KM. The validation set was also used to compare the system with a clinical expert in predicting time to metastatic death. The RMS error was 3.7 years for the system and 4.3 years for the clinical expert for 15-year survival. For <10-year survival, these figures were 2.7 and 4.2 years, respectively. We concluded that the AI system can match, if not better, the clinical expert's prediction. There were significant differences with CR and KM analyses when the number of records was small, but it was not known which model is more accurate.

17. Models for cell survival with low LET radiation

International Nuclear Information System (INIS)

Payne, M.G.; Garrett, W.R.

1975-01-01

A model for cell survival under low-LET irradiation was developed in which the cell is considered to have N₀ independent sensitive sites, each of which can exist in either an undamaged state (state A) or one of two damaged states. Radiation can change the sensitive sites from the undamaged state to either of the two damaged states. The first damaged state (state B) can either be repaired or be promoted to the second damaged state (state C), which is irreparable. The promotion from the first damaged state to the second can occur due to any of the following: (1) further radiation damage, (2) an abortive attempt to repair the site, or (3) the arrival at a part of the cell cycle where the damage is "fixed." Subject to the further assumptions that radiation damage can occur either indirectly (i.e., through radiation products) or through direct interaction, and that repair of the first damaged state is a one-step process, expressions can be derived for P(N_A, N_B, t), the probability that after time t a cell will have N_A sites in state A and N_B sites in state B. The problem of determining P(N_A, N_B, t) is formulated for arbitrary time dependences of the radiation field and of all rate coefficients. A large family of cell-survival models can be described by interpreting the sensitive sites in different ways and by making different choices of rate coefficients and of the combinations of numbers of sites in different states that lead to cell death. (U.S.)
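The A/B/C kinetics described here can be written as linear rate equations for the per-site state probabilities and integrated numerically. A sketch with illustrative rate constants (damage rate d for A→B, repair rate r for B→A, promotion rate f for B→C), not the paper's fitted values, and with constant irradiation for simplicity:

```python
def site_state_probs(d=0.5, r=1.0, f=0.2, t_end=5.0, dt=1e-3):
    """Euler-integrate per-site state probabilities for the chain
    A --d--> B, B --r--> A (repair), B --f--> C (irreparable).
    Returns (pA, pB, pC) at time t_end."""
    pA, pB, pC = 1.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dA = -d * pA + r * pB
        dB = d * pA - (r + f) * pB
        dC = f * pB
        pA += dA * dt
        pB += dB * dt
        pC += dC * dt
    return pA, pB, pC

pA, pB, pC = site_state_probs()
# the three states always form a probability distribution
print(round(pA + pB + pC, 6))  # 1.0
```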

18. A nonlinear interface model applied to masonry structures

Science.gov (United States)

Lebon, Frédéric; Raffa, Maria Letizia; Rizzoni, Raffaella

2015-12-01

In this paper, a new imperfect interface model is presented. The model includes finite strains, micro-cracks and smooth roughness. The model is consistently derived by coupling a homogenization approach for micro-cracked media and arguments of asymptotic analysis. The model is applied to brick/mortar interfaces. Numerical results are presented.

19. Partitioning of excess mortality in population-based cancer patient survival studies using flexible parametric survival models

Directory of Open Access Journals (Sweden)

Eloranta Sandra

2012-06-01

Background: Relative survival is commonly used for studying survival of cancer patients, as it captures both the direct and indirect contributions of a cancer diagnosis to mortality by comparing the observed survival of the patients to the expected survival in a comparable cancer-free population. However, existing methods do not allow estimation of the impact of isolated conditions (e.g., excess cardiovascular mortality) on the total excess mortality. For this purpose we extend flexible parametric survival models for relative survival, which use restricted cubic splines for the baseline cumulative excess hazard and for any time-dependent effects. Methods: In the extended model we partition the excess mortality associated with a diagnosis of cancer by estimating a separate baseline excess hazard function for each outcome under investigation. This is done by incorporating mutually exclusive background mortality rates, stratified by the underlying causes of death reported in the Swedish population, and by introducing cause of death as a time-dependent effect in the extended model. This approach enables simultaneous modeling of temporal trends in, e.g., excess cardiovascular mortality and remaining cancer excess mortality. Furthermore, we illustrate how the results from the proposed model can be used to derive crude probabilities of death due to the component parts, i.e., probabilities estimated in the presence of competing causes of death. Results: The method is illustrated with examples where the total excess mortality experienced by patients diagnosed with breast cancer is partitioned into excess cardiovascular mortality and remaining cancer excess mortality. Conclusions: The proposed method can be used to simultaneously study disease patterns and temporal trends for various causes of cancer-consequent deaths. Such information should be of interest for patients and clinicians as one way of improving prognosis after cancer is

20. Impact of censoring on learning Bayesian networks in survival modelling.

Science.gov (United States)

Stajduhar, Ivan; Dalbelo-Basić, Bojana; Bogunović, Nikola

2009-11-01

Bayesian networks are commonly used for presenting uncertainty and covariate interactions in an easily interpretable way. Because of their efficient inference and ability to represent causal relationships, they are an excellent choice for medical decision support systems in diagnosis, treatment, and prognosis. Although good procedures for learning Bayesian networks from data have been defined, their performance in learning from censored survival data has not been widely studied. In this paper, we explore how to use these procedures to learn about possible interactions between prognostic factors and their influence on the variate of interest. We study how censoring affects the probability of learning correct Bayesian network structures. Additionally, we analyse the potential usefulness of the learnt models for predicting the time-independent probability of an event of interest. We analysed the influence of censoring with a simulation on synthetic data sampled from randomly generated Bayesian networks. We used two well-known methods for learning Bayesian networks from data: a constraint-based method and a score-based method. We compared the performance of each method under different levels of censoring to those of the naive Bayes classifier and the proportional hazards model. We did additional experiments on several datasets from real-world medical domains. The machine-learning methods treated censored cases in the data as event-free. We report and compare results for several commonly used model evaluation metrics. On average, the proportional hazards method outperformed other methods in most censoring setups. As part of the simulation study, we also analysed structural similarities of the learnt networks. Heavy censoring, as opposed to no censoring, produces up to a 5% surplus and up to 10% missing total arcs. It also produces up to 50% missing arcs that should originally be connected to the variate of interest. Presented methods for learning Bayesian networks from
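The comparison in this study hinges on how censored cases are handled: the machine-learning methods treat them as event-free, whereas a survival estimator such as Kaplan-Meier instead removes them from the risk set at their censoring time. A minimal Kaplan-Meier sketch on toy data (not from the study):

```python
def kaplan_meier(records):
    """records: list of (time, event) pairs, event=1 for an observed
    event and 0 for censoring. Returns (time, survival) at event times."""
    records = sorted(records)
    n_at_risk = len(records)
    surv, curve, i = 1.0, [], 0
    while i < len(records):
        t = records[i][0]
        deaths = sum(1 for time, e in records if time == t and e == 1)
        removed = sum(1 for time, _ in records if time == t)
        if deaths:
            surv *= 1 - deaths / n_at_risk   # product-limit step
            curve.append((t, surv))
        n_at_risk -= removed                  # censored cases leave the risk set
        i += removed
    return curve

# Events at t=1 and t=3; censoring at t=2 and t=4.
print(kaplan_meier([(1, 1), (2, 0), (3, 1), (4, 0)]))  # [(1, 0.75), (3, 0.375)]
```

Treating the two censored cases as event-free would instead estimate a fixed event probability of 2/4, ignoring that the subject censored at t=2 may still have experienced the event later.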

1. Estimation of direct effects for survival data by using the Aalen additive hazards model

DEFF Research Database (Denmark)

Martinussen, Torben; Vansteelandt, Stijn; Gerster, Mette

2011-01-01

We extend the definition of the controlled direct effect of a point exposure on a survival outcome, other than through some given, time-fixed intermediate variable, to the additive hazard scale. We propose two-stage estimators for this effect when the exposure is dichotomous and randomly assigned … Aalen's additive regression for the event time, given exposure, intermediate variable and confounders. The second stage involves applying Aalen's additive model, given the exposure alone, to a modified stochastic process (i.e. a modification of the observed counting process based on the first …

2. Re-evaluating neonatal-age models for ungulates: does model choice affect survival estimates?

Directory of Open Access Journals (Sweden)

Troy W Grovenburg

New-hoof growth is regarded as the most reliable metric for predicting the age of newborn ungulates, but variation in estimated age among the hoof-growth equations that have been developed may affect estimates of survival in staggered-entry models. We used known-age newborns to evaluate variation in age estimates among existing hoof-growth equations and to determine the consequences of that variation on survival estimates. During 2001-2009, we captured and radiocollared 174 newborn (≤24 hrs old) ungulates: 76 white-tailed deer (Odocoileus virginianus) in Minnesota and South Dakota, 61 mule deer (O. hemionus) in California, and 37 pronghorn (Antilocapra americana) in South Dakota. Estimated age of known-age newborns differed among hoof-growth models and varied by >15 days for white-tailed deer, >20 days for mule deer, and >10 days for pronghorn. Accuracy (i.e., the proportion of neonates assigned to the correct age) in aging newborns using published equations ranged from 0.0% to 39.4% in white-tailed deer, 0.0% to 3.3% in mule deer, and was 0.0% for pronghorns. Results of survival modeling indicated that variability in estimates of age-at-capture affected short-term (i.e., 30-day) estimates of survival for white-tailed deer and mule deer, and survival estimates over a longer time frame (i.e., 120 days) for mule deer. Conversely, survival estimates for pronghorn were not affected by estimates of age. Our analyses indicate that modeling survival in daily intervals is too fine a temporal scale when age-at-capture is unknown, given the potential inaccuracies among equations used to estimate the age of neonates. Instead, weekly survival intervals are more appropriate because most models accurately predicted ages within 1 week of the known age. Variation among results of neonatal-age models on short- and long-term estimates of survival for known-age young emphasizes the importance of selecting an appropriate hoof-growth equation and appropriately defining intervals (i

3. Addressing issues associated with evaluating prediction models for survival endpoints based on the concordance statistic.

Science.gov (United States)

Wang, Ming; Long, Qi

2016-09-01

Prediction models for disease risk and prognosis play an important role in biomedical research, and evaluating their predictive accuracy in the presence of censored data is of substantial interest. The standard concordance (c) statistic has been extended to provide a summary measure of predictive accuracy for survival models. Motivated by a prostate cancer study, we address several issues associated with evaluating survival prediction models based on the c-statistic, with a focus on estimators using the technique of inverse probability of censoring weighting (IPCW). Compared to the existing work, we provide complete results on the asymptotic properties of the IPCW estimators under the assumption of coarsening at random (CAR), and propose a sensitivity analysis under the mechanism of noncoarsening at random (NCAR). In addition, we extend the IPCW approach as well as the sensitivity analysis to high-dimensional settings. The predictive accuracy of prediction models for cancer recurrence after prostatectomy is assessed by applying the proposed approaches. We find that the estimated predictive accuracy for the models under consideration is sensitive to the NCAR assumption, and thus identify the best predictive model. Finally, we further evaluate the performance of the proposed methods in both low-dimensional and high-dimensional settings under CAR and NCAR through simulations. © 2016, The International Biometric Society.
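The c-statistic at issue counts, among usable pairs (where the earlier time is an observed event), how often the model assigns the higher risk to the subject who failed earlier. A plain Harrell-style sketch without the IPCW correction the paper studies; the toy data are illustrative:

```python
def c_index(times, events, risks):
    """Harrell's c: fraction of usable pairs ordered correctly by risk score.
    events[i] = 1 if the i-th time is an observed event, 0 if censored."""
    concordant, ties, usable = 0, 0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is usable when subject i fails while j is still at risk
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / usable

# Toy data: higher risk scores accompany earlier events, so c = 1.0
print(c_index([2, 4, 6], [1, 1, 0], [0.9, 0.5, 0.1]))  # 1.0
```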

4. Comparison of hypertabastic survival model with other unimodal hazard rate functions using a goodness-of-fit test.

Science.gov (United States)

Tahir, M Ramzan; Tran, Quang X; Nikulin, Mikhail S

2017-05-30

We studied the problem of testing a hypothesized distribution in survival regression models when the data are right-censored and survival times are influenced by covariates. A modified chi-squared type test, known as the Nikulin-Rao-Robson statistic, is applied for the comparison of accelerated failure time models. This statistic is used to test the goodness-of-fit of the hypertabastic survival model and four other unimodal hazard rate functions. The results of the simulation study showed that the hypertabastic distribution can be used as an alternative to the log-logistic and log-normal distributions. In statistical modeling, because of the flexible shape of its hazard functions, this distribution can also be used as a competitor of the Birnbaum-Saunders and inverse Gaussian distributions. The results for the real data application are shown. Copyright © 2017 John Wiley & Sons, Ltd.

5. Nonlinear Eddy Viscosity Models applied to Wind Turbine Wakes

DEFF Research Database (Denmark)

Laan, van der, Paul Maarten; Sørensen, Niels N.; Réthoré, Pierre-Elouan

2013-01-01

The linear k−ε eddy viscosity model and modified versions of two existing nonlinear eddy viscosity models are applied to single wind turbine wake simulations using a Reynolds Averaged Navier-Stokes code. Results are compared with field wake measurements. The nonlinear models give better results...

6. The Limitations of Applying Rational Decision-Making Models

African Journals Online (AJOL)

… decision-making models as applied to child spacing and more specifically to the use … also assumes that the individual operates as a rational decision-making organism in … work involves: Motivation; Counselling; Distribution of IEC mate-…

7. A BRDF statistical model applying to space target materials modeling

Science.gov (United States)

Liu, Chenghao; Li, Zhi; Xu, Can; Tian, Qichen

2017-10-01

To address the poor performance of the five-parameter semi-empirical model in fitting high-density measured BRDF data, a refined statistical BRDF model suitable for modeling multiple classes of space-target materials is proposed. The refined model improves on the Torrance-Sparrow model while retaining the modeling advantages of the five-parameter model. Compared with the existing empirical model, the refined model contains six simple parameters, which can approximate the roughness distribution of the material surface, the intensity of the Fresnel reflectance phenomenon, and the attenuation of the reflected light's brightness as the azimuth angle changes. The model achieves parameter inversion quickly with no extra loss of accuracy. A genetic algorithm was used to invert the parameters for 11 different samples of materials commonly used on space targets, and the fitting errors for all materials were below 6%, much lower than those of the five-parameter model. The effectiveness of the refined model is verified by comparing the fitting results for three samples at different incident zenith angles at a 0° azimuth angle. Finally, three-dimensional visualizations of these samples over the upper hemisphere are given, in which the strength of the optical scattering of different materials can be clearly seen, demonstrating the refined model's ability to characterize materials.

8. Survival, transport, and sources of fecal bacteria in streams and survival in land-applied poultry litter in the upper Shoal Creek basin, southwestern Missouri, 2001-2002

Science.gov (United States)

Schumacher, John G.

2003-01-01

five sampling sites along the 5.7-mi study reach of Shoal Creek, but the trends at successive downstream sites were out of phase and could not be explained by simple advection and dispersion. At base-flow conditions, the travel time of bacteria in Shoal Creek along the 5.7-mi reach between State Highway W (site 2) and the MDNR sampling site (site 3) was about 26 hours. Substantial dispersion and dilution occurs along the upper 4.1 mi of this reach because of inflows from a number of springs and tributaries and the presence of several long pools and channel meanders. Minimal dispersion and dilution occurs along the 1.6-mi reach immediately upstream from the MDNR sampling site. Measurements of fecal bacteria decay in Shoal Creek during July 2001 indicated that about 8 percent of fecal coliform and E. coli bacteria decay each hour with an average first-order decay constant of 0.084 h-1 (per hour). Results of field test plots indicated that substantial numbers of fecal bacteria present in poultry litter can survive in fields for as much as 8 weeks after the application of the litter to the land surface. Median densities of fecal coliform and E. coli in slurry-water samples collected from fields increased from less than 60 col/100 mL before the application of turkey and broiler litter to as large as 420,000 and 290,000 col/100 mL, respectively, after the application of litter. Bacteria densities in the test plots generally decreased in an exponential manner over time, with decay rates ranging from 0.085 to 0.185 d-1 (per day) for fecal coliform and between 0.100 and 0.250 d-1 for E. coli. The apparent survival of significant numbers of fecal bacteria on fields where poultry litter has been applied indicates that runoff from these fields is a potential source of fecal bacteria to nearby streams for many weeks following litter application.
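The reported hourly loss and the first-order decay constant above are consistent with each other; a minimal sketch (function name is illustrative, not from the study) showing that k = 0.084 h⁻¹ corresponds to roughly an 8 percent loss per hour:

```python
import math

def surviving_fraction(k, t):
    """First-order decay: N(t)/N0 = exp(-k * t)."""
    return math.exp(-k * t)

# With the reported stream decay constant k = 0.084 per hour,
# about 8% of the bacteria are lost each hour:
hourly_loss = 1.0 - surviving_fraction(0.084, 1.0)

# The field-plot decay rates (per day) imply much slower die-off:
# k = 0.185 per day leaves roughly 83% of the bacteria after one day.
daily_fraction = surviving_fraction(0.185, 1.0)
```

The same exponential form underlies both the per-hour stream rates and the per-day field-plot rates; only the time unit of k changes.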

9. Comparison of two multiaxial fatigue models applied to dental implants

Directory of Open Access Journals (Sweden)

JM. Ayllon

2015-07-01

Full Text Available This paper presents two multiaxial fatigue life prediction models applied to a commercial dental implant. One model is called Variable Initiation Length Model and takes into account both the crack initiation and propagation phases. The second model combines the Theory of Critical Distance with a critical plane damage model to characterise the initiation and initial propagation of micro/meso cracks in the material. This paper discusses which material properties are necessary for the implementation of these models and how to obtain them in the laboratory from simple test specimens. It also describes the FE models developed for the stress/strain and stress intensity factor characterisation in the implant. The results of applying both life prediction models are compared with experimental results arising from the application of ISO-14801 standard to a commercial dental implant.

10. Sensitivity analysis approaches applied to systems biology models.

Science.gov (United States)

Zi, Z

2011-11-01

With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights into how robust the biological responses are with respect to changes in biological parameters, and which model inputs are the key factors affecting the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models, and the caveats in the interpretation of sensitivity analysis results.
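As a concrete illustration of the local, one-at-a-time approach described above, the sketch below estimates normalized sensitivity coefficients by central finite differences for a toy steady-state model; the function name and the toy model are illustrative, not taken from the review:

```python
def local_sensitivity(f, params, i, rel_step=1e-6):
    """Normalized local sensitivity S_i = (p_i / f(p)) * df/dp_i,
    estimated one-at-a-time with central differences."""
    p = list(params)
    h = rel_step * abs(p[i]) if p[i] != 0 else rel_step
    p[i] = params[i] + h
    up = f(p)
    p[i] = params[i] - h
    down = f(p)
    deriv = (up - down) / (2.0 * h)
    return params[i] * deriv / f(list(params))

# Toy model: steady state x* = k_prod / k_deg of a simple
# production-degradation system; the exact sensitivities are +1 and -1.
model = lambda p: p[0] / p[1]
s_prod = local_sensitivity(model, [2.0, 0.5], 0)
s_deg = local_sensitivity(model, [2.0, 0.5], 1)
```

A global analysis would instead sample the whole parameter space (e.g. Sobol or Morris designs) rather than perturbing around one nominal point.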

11. An extended gravity model with substitution applied to international trade

NARCIS (Netherlands)

Bikker, J.A.|info:eu-repo/dai/nl/06912261X

The traditional gravity model has been applied many times to international trade flows, especially in order to analyze trade creation and trade diversion. However, there are two fundamental objections to the model: it cannot describe substitutions between flows and it lacks a cogent theoretical

12. Exponential models applied to automated processing of radioimmunoassay standard curves

International Nuclear Information System (INIS)

Morin, J.F.; Savina, A.; Caroff, J.; Miossec, J.; Legendre, J.M.; Jacolot, G.; Morin, P.P.

1979-01-01

An improved computer procedure is described for fitting radioimmunoassay standard curves by means of an exponential model on a desk-top calculator. This method has been applied to a variety of radioassays and the results are in accordance with those obtained with more sophisticated models [fr
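The abstract gives no algorithmic detail, but a common way to fit an exponential standard-curve model is linear least squares on the log-transformed response; a minimal sketch under that assumption (function name and data are illustrative):

```python
import math

def fit_exponential(xs, ys):
    """Fit y = a * exp(b * x) by linear least squares on ln(y)."""
    n = len(xs)
    lys = [math.log(y) for y in ys]
    sx, sy = sum(xs), sum(lys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * ly for x, ly in zip(xs, lys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = math.exp((sy - b * sx) / n)
    return a, b

# Recover known parameters from a synthetic "standard curve":
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [5.0 * math.exp(-0.7 * x) for x in xs]
a, b = fit_exponential(xs, ys)
```

Log-transforming before the fit implicitly reweights the points, which is one reason more sophisticated (iteratively weighted, nonlinear) fits can differ slightly.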

13. Modeling in applied sciences a kinetic theory approach

CERN Document Server

Pulvirenti, Mario

2000-01-01

Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis, and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...

14. LEARNING SEMANTICS-ENHANCED LANGUAGE MODELS APPLIED TO UNSUPERVISED WSD

Energy Technology Data Exchange (ETDEWEB)

VERSPOOR, KARIN [Los Alamos National Laboratory; LIN, SHOU-DE [Los Alamos National Laboratory

2007-01-29

An N-gram language model aims at capturing statistical syntactic word order information from corpora. Although the concept of language models has been applied extensively to handle a variety of NLP problems with reasonable success, the standard model does not incorporate semantic information, and consequently limits its applicability to semantic problems such as word sense disambiguation. We propose a framework that integrates semantic information into the language model schema, allowing a system to exploit both syntactic and semantic information to address NLP problems. Furthermore, acknowledging the limited availability of semantically annotated data, we discuss how the proposed model can be learned without annotated training examples. Finally, we report on a case study showing how the semantics-enhanced language model can be applied to unsupervised word sense disambiguation with promising results.
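For reference, the standard N-gram model mentioned above reduces, for N = 2, to maximum-likelihood bigram probabilities; a minimal sketch of that baseline (not the authors' semantics-enhanced model):

```python
from collections import Counter

def bigram_probs(tokens):
    """Maximum-likelihood bigram model: P(w2 | w1) = c(w1, w2) / c(w1)."""
    unigram_counts = Counter(tokens[:-1])            # counts of w1 positions
    bigram_counts = Counter(zip(tokens[:-1], tokens[1:]))
    return {pair: c / unigram_counts[pair[0]]
            for pair, c in bigram_counts.items()}

tokens = "the cat sat on the mat".split()
p = bigram_probs(tokens)
# "the" is followed once by "cat" and once by "mat",
# so P("cat" | "the") = 0.5
```

The semantics-enhanced schema proposed in the paper augments exactly this kind of syntactic word-order statistic with semantic information.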

15. Analytic model of Applied-B ion diode impedance behavior

International Nuclear Information System (INIS)

Miller, P.A.; Mendel, C.W. Jr.

1987-01-01

An empirical analysis of impedance data from Applied-B ion diodes used in seven inertial confinement fusion research experiments was published recently. The diodes all operated with impedance values well below the Child's-law value. The analysis uncovered an unusual unifying relationship among data from the different experiments. The analysis suggested that closure of the anode-cathode gap by electrode plasma was not a dominant factor in the experiments, but was not able to elaborate the underlying physics. Here we present a new analytic model of Applied-B ion diodes coupled to accelerators. A critical feature of the diode model is based on magnetic insulation theory. The model successfully describes impedance behavior of these diodes and supports stimulating new viewpoints of the physics of Applied-B ion diode operation
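For context, the Child's-law (space-charge-limited) current density against which the diode impedances are compared is, for a planar gap, J = (4ε0/9)·√(2q/m)·V^(3/2)/d². A minimal numeric sketch for protons; the voltage and gap values are illustrative, not from the paper:

```python
import math

EPS0 = 8.854e-12       # vacuum permittivity, F/m
Q_PROTON = 1.602e-19   # proton charge, C
M_PROTON = 1.673e-27   # proton mass, kg

def child_law_current_density(voltage, gap):
    """Child-Langmuir space-charge-limited ion current density (A/m^2)
    for a planar diode: J = (4*eps0/9) * sqrt(2q/m) * V**1.5 / d**2."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * Q_PROTON / M_PROTON) \
           * voltage ** 1.5 / gap ** 2

# Example: 1 MV across a 1 cm anode-cathode gap.
j = child_law_current_density(1.0e6, 0.01)
```

Operating "well below the Child's-law value", as the abstract states, means the measured diode impedance exceeds the value this formula would predict for the same voltage and gap.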

16. Methods for model selection in applied science and engineering.

Energy Technology Data Exchange (ETDEWEB)

Field, Richard V., Jr.

2004-10-01

Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be

17. In-season retail sales forecasting using survival models

African Journals Online (AJOL)

Retail sales forecasting, survival analysis, time series analysis, Holt's smoothing .... where fx(t) is the probability density function of the future lifetime, Tx, of a .... Adjustments were made to the shape of the smoothed mortality rates in light of new.

18. Modeling Organizational Design - Applying A Formalism Model From Theoretical Physics

Directory of Open Access Journals (Sweden)

Robert Fabac

2008-06-01

Full Text Available Modern organizations are exposed to diverse external environment influences. Currently accepted concepts of organizational design take into account structure, its interaction with strategy, processes, people, etc. Organization design and planning aims to align these key organizational design variables. At the higher conceptual level, however, no completely satisfactory formulation for this alignment exists. We develop an approach originating from the application of concepts of theoretical physics to social systems. Under this approach, the allocation of organizational resources is analyzed in terms of social entropy, social free energy and social temperature. This allows us to formalize the dynamic relationship between organizational design variables. In this paper we relate this model to Galbraith's Star Model and we also suggest improvements in the procedure of the complex analytical method in organizational design.

19. Applied data analysis and modeling for energy engineers and scientists

CERN Document Server

Reddy, T Agami

2011-01-01

"Applied Data Analysis and Modeling for Energy Engineers and Scientists" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

20. Estimation and model selection of semiparametric multivariate survival functions under general censorship.

Science.gov (United States)

Chen, Xiaohong; Fan, Yanqin; Pouzo, Demian; Ying, Zhiliang

2010-07-01

We study estimation and model selection of semiparametric models of multivariate survival functions for censored data, which are characterized by possibly misspecified parametric copulas and nonparametric marginal survivals. We obtain the consistency and root-n asymptotic normality of a two-step copula estimator for the pseudo-true copula parameter value defined by the Kullback-Leibler information criterion (KLIC), and provide a simple consistent estimator of its asymptotic variance, allowing for a first-step nonparametric estimation of the marginal survivals. We establish the asymptotic distribution of the penalized pseudo-likelihood ratio statistic for comparing multiple semiparametric multivariate survival functions subject to copula misspecification and general censorship. An empirical application is provided.

1. Linear mixing model applied to coarse resolution satellite data

Science.gov (United States)

Holben, Brent N.; Shimabukuro, Yosio E.

1992-01-01

A linear mixing model typically applied to high resolution data such as Airborne Visible/Infrared Imaging Spectrometer, Thematic Mapper, and Multispectral Scanner System is applied to the NOAA Advanced Very High Resolution Radiometer coarse resolution satellite data. The reflective portion extracted from the middle IR channel 3 (3.55 - 3.93 microns) is used with channels 1 (0.58 - 0.68 microns) and 2 (0.725 - 1.1 microns) to run the Constrained Least Squares model to generate fraction images for an area in the west central region of Brazil. The derived fraction images are compared with an unsupervised classification and the fraction images derived from Landsat TM data acquired on the same day. In addition, the relationship between these fraction images and the well known NDVI images is presented. The results show the great potential of the unmixing techniques for application to coarse resolution data for global studies.
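The constrained least squares unmixing step can be illustrated with a two-endmember toy case, where the sum-to-one constraint is eliminated by substitution; a minimal sketch (the endmember reflectances are invented for illustration, not AVHRR values):

```python
def unmix_two_endmembers(pixel, end_a, end_b):
    """Least-squares fraction f for pixel ≈ f*end_a + (1 - f)*end_b.
    The sum-to-one constraint is built in by the substitution f_b = 1 - f."""
    num = sum((p - b) * (a - b) for p, a, b in zip(pixel, end_a, end_b))
    den = sum((a - b) ** 2 for a, b in zip(end_a, end_b))
    return num / den

# Synthetic reflectances in three bands: a 40% vegetation / 60% soil mix.
veg = [0.05, 0.45, 0.30]
soil = [0.20, 0.25, 0.35]
pixel = [0.4 * v + 0.6 * s for v, s in zip(veg, soil)]
f_veg = unmix_two_endmembers(pixel, veg, soil)
```

With three or more endmembers the same idea becomes a constrained normal-equations solve, which is what the Constrained Least Squares model does per pixel.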

2. The sdg interacting-boson model applied to 168Er

Science.gov (United States)

Yoshinaga, N.; Akiyama, Y.; Arima, A.

1986-03-01

The sdg interacting-boson model is applied to 168Er. Energy levels and E2 transitions are calculated. This model is shown to solve the problem of anharmonicity regarding the excitation energy of the first Kπ=4+ band relative to that of the first Kπ=2+ one. The level scheme including the Kπ=3+ band is well reproduced and the calculated B(E2)'s are consistent with the experimental data.

3. Remarks on orthotropic elastic models applied to wood

Directory of Open Access Journals (Sweden)

2006-09-01

Full Text Available Wood is generally considered an anisotropic material. In terms of engineering elastic models, wood is usually treated as an orthotropic material. This paper presents an analysis of two principal anisotropic elastic models that are usually applied to wood. The first one, the linear orthotropic model, where the material axes L (longitudinal), R (radial) and T (tangential) are coincident with the Cartesian axes (x, y, z), is the more widely accepted elastic model for wood. The other one, the cylindrical orthotropic model, is more adequate to the growth characteristics of wood but mathematically too complex to be adopted in practical terms. Because of its importance for the elastic parameters of wood, this paper deals with the influence of fiber orientation in these models through an adequate transformation of coordinates. As a final result, some examples of the linear model are presented, showing the variation of the elastic moduli, i.e., Young's modulus and shear modulus, with fiber orientation.
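The coordinate-transformation effect on the elastic moduli can be sketched with the standard plane-stress formula for the off-axis Young's modulus of an orthotropic material; the numeric values below are illustrative, not taken from the paper:

```python
import math

def off_axis_modulus(E_L, E_T, G_LT, nu_LT, theta_deg):
    """Young's modulus at angle theta to the grain, from the standard
    plane-stress orthotropic transformation:
    1/E(theta) = c^4/E_L + s^4/E_T + (1/G_LT - 2*nu_LT/E_L) * c^2 * s^2."""
    c = math.cos(math.radians(theta_deg))
    s = math.sin(math.radians(theta_deg))
    inv_E = (c ** 4 / E_L + s ** 4 / E_T
             + (1.0 / G_LT - 2.0 * nu_LT / E_L) * (c * s) ** 2)
    return 1.0 / inv_E

# Illustrative wood-like values (MPa): stiff along the grain,
# much softer across it.
E_grain = off_axis_modulus(10000.0, 500.0, 600.0, 0.4, 0.0)
E_cross = off_axis_modulus(10000.0, 500.0, 600.0, 0.4, 90.0)
```

The formula recovers E_L at 0° and E_T at 90°, and shows the steep drop in stiffness at intermediate fiber angles that the paper's examples illustrate.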

4. An applied general equilibrium model for Dutch agribusiness policy analysis

NARCIS (Netherlands)

Peerlings, J.

1993-01-01

The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of

5. The limitations of applying rational decision-making models to ...

African Journals Online (AJOL)

The aim of this paper is to show the limitations of rational decision-making models as applied to child spacing and more specifically to the use of modern methods of contraception. In the light of factors known to influence low uptake of child spacing services in other African countries, suggestions are made to explain the ...

6. Applying the Flipped Classroom Model to English Language Arts Education

Science.gov (United States)

Young, Carl A., Ed.; Moran, Clarice M., Ed.

2017-01-01

The flipped classroom method, particularly when used with digital video, has recently attracted many supporters within the education field. Now more than ever, language arts educators can benefit tremendously from incorporating flipped classroom techniques into their curriculum. "Applying the Flipped Classroom Model to English Language Arts…

7. Modeling the effect of temperature on survival rate of Listeria monocytogenes in yogurt.

Science.gov (United States)

Szczawiński, J; Szczawińska, M E; Łobacz, A; Jackowska-Tracz, A

2016-01-01

The aim of the study was to (i) evaluate the behavior of Listeria monocytogenes in a commercially produced yogurt, (ii) determine the survival/inactivation rates of L. monocytogenes during cold storage of yogurt and (iii) generate primary and secondary mathematical models to predict the behavior of these bacteria during storage at different temperatures. The samples of yogurt were inoculated with a mixture of three L. monocytogenes strains and stored at 3, 6, 9, 12 and 15°C for 16 days. The number of listeriae was determined after 0, 1, 2, 3, 5, 7, 9, 12, 14 and 16 days of storage. From each sample a series of decimal dilutions was prepared and plated onto ALOA agar (agar for Listeria according to Ottaviani and Agosti). It was found that the applied temperature and storage time significantly influenced the survival rate of listeriae (p<0.05). The lowest reduction in the number of the bacteria was found in the samples stored at 6°C (D-10 value = 243.9 h), whereas the highest reduction in the number of the bacteria was observed in the samples stored at 15°C (D-10 value = 87.0 h). The number of L. monocytogenes was correlated with the pH value of the samples (p<0.05). The generated models can be applied to predict the behavior of these bacteria in yogurt stored under a temperature range from 3 to 15°C; however, the polynomial model gave a better fit to the experimental data.
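The reported D-10 values relate to first-order decay constants by D = ln(10)/k; a minimal sketch of that relation and of the log-reduction implied over the 16-day storage (function names are illustrative, not from the study):

```python
import math

def d_value(k):
    """Time for a 1-log10 (90%) reduction under first-order decay:
    D = ln(10) / k."""
    return math.log(10) / k

def log10_reduction(d, t):
    """Decimal reductions accumulated after time t, given D-value d."""
    return t / d

# With the reported D-10 of 87.0 h at 15°C, the full 16-day (384 h)
# storage corresponds to roughly a 4.4-log10 reduction.
red = log10_reduction(87.0, 16 * 24)
```

At 6°C (D-10 = 243.9 h) the same storage period yields only about 1.6 decimal reductions, which is why the lowest overall reduction was seen at that temperature.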

8. Molecular modeling: An open invitation for applied mathematics

Science.gov (United States)

Mezey, Paul G.

2013-10-01

Molecular modeling methods provide a very wide range of challenges for innovative mathematical and computational techniques, where often high dimensionality, large sets of data, and complicated interrelations imply a multitude of iterative approximations. The physical and chemical basis of these methodologies involves quantum mechanics with several non-intuitive aspects, where classical interpretation and classical analogies are often misleading or outright wrong. Hence, instead of the everyday, common-sense approaches which work so well in engineering, in molecular modeling one often needs to rely on rather abstract mathematical constraints and conditions, again emphasizing the high level of reliance on applied mathematics. Yet, the interdisciplinary aspects of the field of molecular modeling also generate some inertia and a perhaps too conservative reliance on tried and tested methodologies, which is at least partially caused by less than up-to-date involvement in the newest developments of applied mathematics. It is expected that as more applied mathematicians take up the challenge of employing the latest advances of their field in molecular modeling, important breakthroughs may follow. In this presentation some of the current challenges of molecular modeling are discussed.

9. Statistical study of clone survival curves after irradiation in one or two stages. Comparison and generalization of different models

International Nuclear Information System (INIS)

Lachet, Bernard.

1975-01-01

A statistical study was carried out on 208 survival curves for chlorella subjected to γ or particle radiations. The computing programmes used were written in Fortran. The different experimental causes contributing to the variance of a survival rate are analyzed, and consequently the experiments can be planned. Each curve was fitted to four models by the weighted least squares method applied to non-linear functions. The validity of the fits obtained can be checked by the F test. It was possible to define the confidence and prediction zones around an adjusted curve by weighting of the residual variance, in spite of errors in the doses delivered; the confidence limits can then be fixed for a dose estimated from an exact or measured survival. The four models adopted were compared for the precision of their fit (by a non-parametric simultaneous comparison test) and the scattering of their adjusted parameters: Wideroe's model gives a very good fit to the experimental points in return for a scattering of its parameters, which robs them of their presumed meaning. The principal component analysis showed the statistical equivalence of the 1 and 2 hit target models. Division of the irradiation into two doses, the first fixed by the investigator, leads to families of curves for which the equation was established from that of any basic model expressing the dose-survival relationship in one-stage irradiation [fr

10. Applying a realistic evaluation model to occupational safety interventions

DEFF Research Database (Denmark)

Pedersen, Louise Møller

2018-01-01

Background: Recent literature characterizes occupational safety interventions as complex social activities, applied in complex and dynamic social systems. Hence, the actual outcomes of an intervention will vary, depending on the intervention, the implementation process, context, personal characteristics ... and qualitative methods. This revised model has, however, not been applied in a real-life context. Method: The model is applied in a controlled, four-component, integrated behaviour-based and safety culture-based safety intervention study (2008-2010) in a medium-sized wood manufacturing company. The interventions involve the company's safety committee, safety manager, safety groups and 130 workers. Results: The model provides a framework for more valid evidence of what works within injury prevention. Affective commitment and role behaviour among key actors are identified as crucial for the implementation...

11. Applying Model Based Systems Engineering to NASA's Space Communications Networks

Science.gov (United States)

Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

2013-01-01

System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams, simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In case of NASA's space communication networks, since the networks are geographically distributed, and so are its subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

12. A special case of reduced rank models for identification and modelling of time varying effects in survival analysis.

Science.gov (United States)

Perperoglou, Aris

2016-12-10

Flexible survival models are needed when modelling data from long term follow-up studies. In many cases, the assumption of proportionality imposed by a Cox model will not be valid. Instead, a model that can identify time varying effects of fixed covariates can be used. Although there are several approaches that deal with this problem, it is not always straightforward how to choose which covariates should be modelled as having time varying effects and which not. At the same time, it is up to the researcher to define appropriate time functions that describe the dynamic pattern of the effects. In this work, we suggest a model that can deal with both fixed and time varying effects and uses simple hypothesis tests to distinguish which covariates do have dynamic effects. The model is an extension of the parsimonious reduced rank model of rank 1. As such, the number of parameters is kept low, and thus, a flexible set of time functions, such as b-splines, can be used. The basic theory is illustrated along with an efficient fitting algorithm. The proposed method is applied to a dataset of breast cancer patients and compared with a multivariate fractional polynomials approach for modelling time-varying effects. Copyright © 2016 John Wiley & Sons, Ltd.

13. Agrochemical fate models applied in agricultural areas from Colombia

Science.gov (United States)

Garcia-Santos, Glenda; Yang, Jing; Andreoli, Romano; Binder, Claudia

2010-05-01

The misuse of pesticides in predominantly agricultural catchments can lead to severe problems for humans and the environment. Especially in developing countries, where overuse of agrochemicals and incipient or absent water quality monitoring at local and regional levels are common, models are needed for decision making and hot-spot identification. However, the complexity of the water cycle contrasts strongly with the scarce data availability, limiting the number of analyses, techniques, and models available to researchers. There is therefore a strong need for model simplification that matches model complexity to the data while still representing the processes. We have developed a new model, Westpa-Pest, to improve water quality management of an agricultural catchment located in the highlands of Colombia. Westpa-Pest is based on the fully distributed hydrologic model Wetspa and a pesticide fate module. We applied a multi-criteria analysis for model selection under the conditions and data availability found in the region and compared the alternatives with the newly developed Westpa-Pest model. Furthermore, both models were empirically calibrated and validated. The following questions were addressed: i) what are the strengths and weaknesses of the models?, ii) which are the most sensitive parameters of each model?, iii) what happens with uncertainties in soil parameters?, and iv) how sensitive are the transfer coefficients?

14. A general diagnostic model applied to language testing data.

Science.gov (United States)

von Davier, Matthias

2008-11-01

Probabilistic models with one or more latent variables are designed to report on a corresponding number of skills or cognitive attributes. Multidimensional skill profiles offer additional information beyond what a single test score can provide, if the reported skills can be identified and distinguished reliably. Many recent approaches to skill profile models are limited to dichotomous data and have made use of computationally intensive estimation methods such as Markov chain Monte Carlo, since standard maximum likelihood (ML) estimation techniques were deemed infeasible. This paper presents a general diagnostic model (GDM) that can be estimated with standard ML techniques and applies to polytomous response variables as well as to skills with two or more proficiency levels. The paper uses one member of a larger class of diagnostic models, a compensatory diagnostic model for dichotomous and partial credit data. Many well-known models, such as univariate and multivariate versions of the Rasch model and the two-parameter logistic item response theory model, the generalized partial credit model, as well as a variety of skill profile models, are special cases of this GDM. In addition to an introduction to this model, the paper presents a parameter recovery study using simulated data and an application to real data from the field test for TOEFL Internet-based testing.

15. Ensemble of cell survival experiments after ion irradiation for validation of RBE models

Energy Technology Data Exchange (ETDEWEB)

Friedrich, Thomas; Scholz, Uwe; Scholz, Michael [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Durante, Marco [GSI Helmholtzzentrum fuer Schwerionenforschung, Darmstadt (Germany); Institut fuer Festkoerperphysik, TU Darmstadt, Darmstadt (Germany)

2012-07-01

There is persistent interest in understanding the systematics of the relative biological effectiveness (RBE). Models such as the Local Effect Model (LEM) or the Microdosimetric Kinetic Model aim to predict the RBE. For the validation of these models, a collection of many in-vitro cell survival experiments is most appropriate. The set-up of an ensemble of in-vitro cell survival data comprising about 850 survival experiments after both ion and photon irradiation is reported. The survival curves have been taken from publications. The experiments encompass survival curves obtained in different labs, using different ion species from protons to uranium, varying irradiation modalities (shaped or monoenergetic beam), various energies and linear energy transfers, and a whole variety of cell types (human or rodent; normal, mutagenic or tumor; radioresistant or radiosensitive). Each cell survival curve has been parameterized by the linear-quadratic model. The photon parameters have been added to the database to allow calculation of the experimental RBE at any survival level. We report on experimental trends found within the data ensemble. The data will serve as a testing ground for RBE models such as the LEM. Finally, a roadmap for further validation and first model results using the database in combination with the LEM are presented.

16. Applied systems ecology: models, data, and statistical methods

Energy Technology Data Exchange (ETDEWEB)

Eberhardt, L L

1976-01-01

In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

17. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

Science.gov (United States)

Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

2016-01-01

One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.
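As a sketch of the standard survival analysis such a tool performs (not OSA's actual implementation), a minimal Kaplan-Meier estimator looks like this:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate: at each distinct event time t with d events
    among n subjects still at risk, survival is multiplied by (1 - d/n)."""
    pairs = sorted(zip(times, events))   # events: 1 = event, 0 = censored
    n_risk, s, curve, i = len(pairs), 1.0, [], 0
    while i < len(pairs):
        t, d, n_t = pairs[i][0], 0, 0
        while i < len(pairs) and pairs[i][0] == t:
            d += pairs[i][1]
            n_t += 1
            i += 1
        if d:                            # survival drops only at event times
            s *= 1.0 - d / n_risk
            curve.append((t, s))
        n_risk -= n_t                    # censored subjects leave the risk set
    return curve

# Five subjects: events at t=1, 3, 4; censoring at t=2 and t=4
print(kaplan_meier([1, 2, 3, 4, 4], [1, 0, 1, 1, 0]))
```

The censored subject at t=2 reduces the risk set without dropping the curve, which is exactly the "proper accounting for censoring" that dedicated survival software automates.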

18. Modeling the effect of temperature on survival rate of Salmonella Enteritidis in yogurt.

Science.gov (United States)

Szczawiński, J; Szczawińska, M E; Łobacz, A; Jackowska-Tracz, A

2014-01-01

The aim of the study was to determine the inactivation rates of Salmonella Enteritidis in commercially produced yogurt and to generate primary and secondary mathematical models to predict the behaviour of these bacteria during storage at different temperatures. The samples were inoculated with the mixture of three S. Enteritidis strains and stored at 5 degrees C, 10 degrees C, 15 degrees C, 20 degrees C and 25 degrees C for 24 h. The number of salmonellae was determined every two hours. It was found that the number of bacteria decreased linearly with storage time in all samples. Storage temperature and pH of yogurt significantly influenced survival rate of S. Enteritidis (p bacteria was the most dynamic. The natural logarithm of the mean inactivation rates of Salmonella calculated from the primary model was fitted to two secondary models: linear and polynomial. Equations obtained from both secondary models can be applied as a tool for prediction of the inactivation rate of Salmonella in yogurt stored over the temperature range from 5 to 25 degrees C; however, the polynomial model gave a better fit to the experimental data.
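The primary/secondary modelling chain described here can be sketched as follows; the rate values are invented for illustration and are not the paper's measurements:

```python
import numpy as np

def fit_primary(times_h, log_counts):
    """Primary model: log10 N(t) = log10 N0 - k*t (linear decline with time).
    Returns the inactivation rate k (log10 units per hour)."""
    slope, _ = np.polyfit(times_h, log_counts, 1)
    return -slope

# Hypothetical storage data: inactivation speeds up at warmer temperatures
temps_c = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
rates = np.array([0.05, 0.09, 0.17, 0.30, 0.55])   # illustrative k values

# Secondary models relating ln(k) to storage temperature
lin = np.polyfit(temps_c, np.log(rates), 1)        # linear secondary model
poly = np.polyfit(temps_c, np.log(rates), 2)       # polynomial secondary model

def predict_rate(temp_c, coeffs):
    """Predict the inactivation rate at a temperature inside 5-25 C."""
    return float(np.exp(np.polyval(coeffs, temp_c)))

print(predict_rate(12.5, lin))
```

Interpolating at an unmeasured temperature (here 12.5 degrees C) is exactly the predictive use of the secondary models the abstract describes; the extra quadratic term is what lets the polynomial model follow any curvature in ln(k) versus temperature.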

19. Applying model predictive control to power system frequency control

OpenAIRE

Ersdal, AM; Imsland, L; Cecilio, IM; Fabozzi, D; Thornhill, NF

2013-01-01

Model predictive control (MPC) is investigated as a control method which may offer advantages over the control methods applied today in frequency control of power systems, especially in the presence of increased renewable energy penetration. The MPC includes constraints on both generation amount and generation rate of change, and it is tested on a one-area system. The proposed MPC is tested against a conventional proportional-integral (PI) cont...

20. Applied model for the growth of the daytime mixed layer

DEFF Research Database (Denmark)

Batchvarova, E.; Gryning, Sven-Erik

1991-01-01

numerically. When the mixed layer is shallow or the atmosphere nearly neutrally stratified, the growth is controlled mainly by mechanical turbulence. When the layer is deep, its growth is controlled mainly by convective turbulence. The model is applied on a data set of the evolution of the height of the mixed...... layer in the morning hours, when both mechanical and convective turbulence contribute to the growth process. Realistic mixed-layer developments are obtained....

1. Fractional calculus model of electrical impedance applied to human skin.

Science.gov (United States)

Vosika, Zoran B; Lazovic, Goran M; Misevic, Gradimir N; Simic-Krstic, Jovana B

2013-01-01

Fractional calculus is a mathematical approach dealing with derivatives and integrals of arbitrary and complex orders. Therefore, it adds a new dimension to understand and describe the basic nature and behavior of complex systems in an improved way. Here we use the fractional calculus for modeling electrical properties of biological systems. We derived a new class of generalized models for electrical impedance and applied them to human skin by experimental data fitting. The primary model introduces new generalizations of: 1) the Weyl fractional derivative operator, 2) the Cole equation, and 3) the Constant Phase Element (CPE). These generalizations were described by a novel equation which presented a parameter [Formula: see text] related to remnant memory and corrected four essential parameters [Formula: see text]. We further generalized the single generalized element by introducing a specific partial sum of the Maclaurin series determined by parameters [Formula: see text]. We defined individual primary model elements and their serial combination models by the appropriate equations and electrical schemes. The Cole equation is a special case of our generalized class of models for [Formula: see text]. Previous bioimpedance data analyses of living systems using basic Cole and serial Cole models show significant imprecisions. Our new class of models considerably improves the quality of fitting, evaluated by mean square errors, for bioimpedance data obtained from human skin. Our models with new parameters presented in the specific partial sum of the Maclaurin series also extend the representation, understanding and description of the electrical properties of complex systems in terms of remnant memory effects.
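The generalized operators themselves are elided in this record ("[Formula: see text]"), but the baseline Cole impedance model that the paper generalizes is standard and can be stated for orientation:

```latex
% Standard Cole impedance model (the special case recovered by the
% generalized class): R_0 and R_\infty are the low- and high-frequency
% resistances, \tau the characteristic time, and 0 < \alpha \le 1 the
% constant-phase-element (CPE) exponent.
Z(\omega) = R_\infty + \frac{R_0 - R_\infty}{1 + (i\omega\tau)^{\alpha}}
```

Setting alpha = 1 reduces this to the single-dispersion Debye model; the paper's class adds further memory-related parameters on top of this form.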

2. Fractional calculus model of electrical impedance applied to human skin.

Directory of Open Access Journals (Sweden)

Zoran B Vosika

Full Text Available Fractional calculus is a mathematical approach dealing with derivatives and integrals of arbitrary and complex orders. Therefore, it adds a new dimension to understand and describe the basic nature and behavior of complex systems in an improved way. Here we use the fractional calculus for modeling electrical properties of biological systems. We derived a new class of generalized models for electrical impedance and applied them to human skin by experimental data fitting. The primary model introduces new generalizations of: 1) the Weyl fractional derivative operator, 2) the Cole equation, and 3) the Constant Phase Element (CPE). These generalizations were described by a novel equation which presented a parameter [Formula: see text] related to remnant memory and corrected four essential parameters [Formula: see text]. We further generalized the single generalized element by introducing a specific partial sum of the Maclaurin series determined by parameters [Formula: see text]. We defined individual primary model elements and their serial combination models by the appropriate equations and electrical schemes. The Cole equation is a special case of our generalized class of models for [Formula: see text]. Previous bioimpedance data analyses of living systems using basic Cole and serial Cole models show significant imprecisions. Our new class of models considerably improves the quality of fitting, evaluated by mean square errors, for bioimpedance data obtained from human skin. Our models with new parameters presented in the specific partial sum of the Maclaurin series also extend the representation, understanding and description of the electrical properties of complex systems in terms of remnant memory effects.

3. Online traffic flow model applying dynamic flow-density relation

International Nuclear Information System (INIS)

Kim, Y.

2002-01-01

This dissertation describes a new approach to online traffic flow modelling based on the hydrodynamic traffic flow model and an online process to adapt the flow-density relation dynamically. The new modelling approach was tested on real traffic situations in various homogeneous motorway sections and a motorway section with ramps, and gave encouraging simulation results. This work is composed of two parts: first the analysis of traffic flow characteristics and second the development of a new online traffic flow model applying these characteristics. For homogeneous motorway sections traffic flow is classified into six different traffic states with different characteristics. Delimitation criteria were developed to separate these states. The hysteresis phenomena were analysed during the transitions between these traffic states. The traffic states and the transitions are represented on a states diagram with the flow axis and the density axis. For motorway sections with ramps the complicated traffic flow is simplified and classified into three traffic states depending on the propagation of congestion. The traffic states are represented on a phase diagram with the upstream demand axis and the interaction strength axis which was defined in this research. The states diagram and the phase diagram provide a basis for the development of the dynamic flow-density relation. The first-order hydrodynamic traffic flow model was programmed according to the cell-transmission scheme extended by the modification of flow-dependent sending/receiving functions, the classification of cells and the determination strategy for the flow-density relation in the cells. The unreasonable results of macroscopic traffic flow models, which may occur in the first and last cells under certain conditions, are alleviated by applying buffer cells between the traffic data and the model. The sending/receiving functions of the cells are determined dynamically based on the classification of the
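The cell-transmission update at the heart of such first-order schemes can be sketched with a triangular flow-density relation; all parameter values here are illustrative, not from the dissertation:

```python
def step(density, v_free, w_cong, rho_jam, q_max, dx, dt):
    """One cell-transmission update: the flow between adjacent cells is the
    minimum of the upstream sending and downstream receiving functions of a
    triangular flow-density relation; densities are updated conservatively.
    Boundaries are closed for simplicity (no inflow/outflow)."""
    flows = []
    for i in range(len(density) - 1):
        sending = min(v_free * density[i], q_max)
        receiving = min(w_cong * (rho_jam - density[i + 1]), q_max)
        flows.append(min(sending, receiving))
    new = list(density)
    for i, q in enumerate(flows):
        new[i] -= dt / dx * q
        new[i + 1] += dt / dx * q
    return new

density = [0.02, 0.05, 0.08]          # cell densities, illustrative units
print(step(density, v_free=100.0, w_cong=20.0, rho_jam=0.12, q_max=2.0,
           dx=0.5, dt=0.004))
```

The min() of sending and receiving is what lets congestion propagate backwards; the dynamic flow-density relation in the dissertation amounts to adapting v_free, w_cong and q_max online per cell.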

4. Eliciting expert opinion for economic models: an applied example.

Science.gov (United States)

Leal, José; Wordsworth, Sarah; Legood, Rosa; Blair, Edward

2007-01-01

Expert opinion is considered as a legitimate source of information for decision-analytic modeling where required data are unavailable. Our objective was to develop a practical computer-based tool for eliciting expert opinion about the shape of the uncertainty distribution around individual model parameters. We first developed a prepilot survey with departmental colleagues to test a number of alternative approaches to eliciting opinions on the shape of the uncertainty distribution around individual parameters. This information was used to develop a survey instrument for an applied clinical example. This involved eliciting opinions from experts to inform a number of parameters involving Bernoulli processes in an economic model evaluating DNA testing for families with a genetic disease, hypertrophic cardiomyopathy. The experts were cardiologists, clinical geneticists, and laboratory scientists working with cardiomyopathy patient populations and DNA testing. Our initial prepilot work suggested that the more complex elicitation techniques advocated in the literature were difficult to use in practice. In contrast, our approach achieved a reasonable response rate (50%), provided logical answers, and was generally rated as easy to use by respondents. The computer software user interface permitted graphical feedback throughout the elicitation process. The distributions obtained were incorporated into the model, enabling the use of probabilistic sensitivity analysis. There is clearly a gap in the literature between theoretical elicitation techniques and tools that can be used in applied decision-analytic models. The results of this methodological study are potentially valuable for other decision analysts deriving expert opinion.
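One simple textbook device for turning such elicited opinions into distributions for Bernoulli-process parameters (a generic illustration, not necessarily the authors' instrument) is to map an expert's most likely value plus a certainty weight onto a Beta distribution:

```python
def beta_from_mode(mode, certainty):
    """Convert an elicited most-likely value (the mode, in (0,1)) and a
    certainty weight (an effective sample size >= 2) into Beta(a, b)
    parameters for a Bernoulli probability, ready for probabilistic
    sensitivity analysis."""
    a = mode * (certainty - 2.0) + 1.0
    b = (1.0 - mode) * (certainty - 2.0) + 1.0
    return a, b

# Hypothetical expert answer: most likely 0.2, moderately confident
a, b = beta_from_mode(0.2, 20.0)
print(a, b)
```

The construction guarantees the Beta mode (a-1)/(a+b-2) equals the elicited value, while the certainty weight controls how concentrated the distribution is, which is the kind of shape information the survey instrument set out to capture.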

5. Surface-bounded growth modeling applied to human mandibles

DEFF Research Database (Denmark)

Andresen, Per Rønsholt

1999-01-01

This thesis presents mathematical and computational techniques for three-dimensional growth modeling applied to human mandibles. The longitudinal shape changes make the mandible a complex bone. The teeth erupt and the condylar processes change direction, from pointing predominantly backward...... of the common features. 3. model the process that moves the matched points (growth modeling). A local shape feature called the crest line has shown itself to be structurally stable on mandibles. Registration of crest lines (from different mandibles) results in a sparse deformation field, which must be interpolated...... old mandible based on the 3-month-old scan. When using successively more recent scans as the basis for the model, the error drops to 2.0 mm for the 11-year-old scan. Thus, it seems reasonable to assume that the mandibular growth is linear....

6. Fuzzy model predictive control algorithm applied in nuclear power plant

International Nuclear Information System (INIS)

2006-01-01

The aim of this paper is to design a predictive controller based on a fuzzy model. The Takagi-Sugeno fuzzy model with an Adaptive B-splines neuro-fuzzy implementation is used and incorporated as a predictor in a predictive controller. An optimization approach with a simplified gradient technique is used to calculate predictions of the future control actions. In this approach, adaptation of the fuzzy model using dynamic process information is carried out to build the predictive controller. The easy description of the fuzzy model and the easy computation of the gradient sector during the optimization procedure are the main advantages of the computation algorithm. The algorithm is applied to the control of a U-tube steam generation unit (UTSG) used for electricity generation. (author)

7. A survivability model for ejection of green compacts in powder metallurgy technology

Directory of Open Access Journals (Sweden)

Payman Ahi

2012-01-01

Full Text Available Reliability and quality assurance have become major considerations in the design and manufacture of today’s parts and products. Survivability of green compact using powder metallurgy technology is considered as one of the major quality attributes in manufacturing systems today. During powder metallurgy (PM production, the compaction conditions and behavior of the metal powder dictate the stress and density distribution in the green compact prior to sintering. These parameters greatly influence the mechanical properties and overall strength of the final component. In order to improve these properties, higher compaction pressures are usually employed, which make unloading and ejection of green compacts more challenging, especially for the powder-compacted parts with relatively complicated shapes. This study looked at a mathematical survivability model concerning green compact characteristics in PM technology and the stress-strength failure model in reliability engineering. This model depicts the relationship between mechanical loads (stress during ejection, experimentally determined green strength and survivability of green compact. The resulting survivability is the probability that a green compact survives during and after ejection. This survivability model can be used as an efficient tool for selecting the appropriate parameters for the process planning stage in PM technology. A case study is presented here in order to demonstrate the application of the proposed survivability model.
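The stress-strength failure model referred to here has a well-known closed form when both ejection stress and green strength are modelled as independent normal variables; the numerical values below are hypothetical, not the paper's measurements:

```python
import math

def survivability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Stress-strength reliability: P(strength > stress) for independent
    normally distributed green strength and ejection stress.
    R = Phi((mu_strength - mu_stress) / sqrt(sd_strength^2 + sd_stress^2))."""
    z = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical values (MPa): experimentally determined green strength
# versus the mechanical load during ejection
print(survivability(12.0, 1.5, 8.0, 1.0))
```

The result is the probability that a green compact survives ejection, so a process planner can raise compaction pressure only as far as keeps this probability above a chosen target.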

8. Hepatic retransplantation in New England--a regional experience and survival model.

Science.gov (United States)

Powelson, J A; Cosimi, A B; Lewis, W D; Rohrer, R J; Freeman, R B; Vacanti, J P; Jonas, M; Lorber, M I; Marks, W H; Bradley, J

1993-04-01

Hepatic retransplantation (reTx) offers the only alternative to death for patients who have failed primary hepatic transplantation (PTx). Assuming a finite number of donor organs, reTx also denies the chance of survival for some patients awaiting PTx. The impact of reTx on overall survival (i.e., the survival of all candidates for transplantation) must therefore be clarified. Between 1983 and 1991, 651 patients from the New England Organ Bank underwent liver transplantation, and 73 reTx were performed in 71 patients (11% reTx rate). The 1-year actuarial survival for reTx (48%) was significantly less than for PTx (70%, P 365 days, 83%). Patients on the regional waiting list had an 18% mortality rate while awaiting transplantation. These results were incorporated into a mathematical model describing survival as a function of reTx rate, assuming a limited supply of donor livers. ReTx improves the 1-year survival rate for patients undergoing PTx but decreases overall survival (survival of all candidates) for liver transplantation. In the current era of persistently insufficient donor numbers, strategies based on minimizing the use of reTx, especially in the case of patients in whom chances of success are minimal, will result in the best overall rate of patient survival.

9. A comparison of two-component and quadratic models to assess survival of irradiated stage-7 oocytes of Drosophila melanogaster

International Nuclear Information System (INIS)

Peres, C.A.; Koo, J.O.

1981-01-01

In this paper, the quadratic model S/S₀ = exp(-αD - βD²) is proposed for analysing data of this kind, where S and S₀ are defined as before. It is shown that the same biological interpretation can be given to the parameters α and A and to the parameters β and B. Furthermore, it is shown that the quadratic model involves one more probabilistic stage than the two-component model, and therefore the quadratic model would perhaps be more appropriate as a dose-response model for survival of irradiated stage-7 oocytes of Drosophila melanogaster. To apply these results, the data presented by Sankaranarayanan and by Sankaranarayanan and Volkers are reanalysed using the quadratic model. It is shown that the quadratic model fits the data better than the two-component model in most situations. (orig./AJ)
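Because taking logs makes the quadratic model linear in its parameters, it can be fitted by ordinary least squares; the sketch below uses synthetic noise-free data in place of the oocyte measurements:

```python
import numpy as np

def fit_quadratic(dose, surviving_fraction):
    """Fit the quadratic dose-response model S/S0 = exp(-alpha*D - beta*D^2)
    by linear least squares on ln(S/S0) = -alpha*D - beta*D^2."""
    y = np.log(surviving_fraction)
    X = np.column_stack([-dose, -dose**2])
    (alpha, beta), *_ = np.linalg.lstsq(X, y, rcond=None)
    return alpha, beta

# Synthetic survival data generated from known parameters (illustrative)
dose = np.array([0.0, 1.0, 2.0, 3.0, 4.0])           # dose levels
sf = np.exp(-0.3 * dose - 0.04 * dose**2)            # noise-free example
alpha, beta = fit_quadratic(dose, sf)
print(alpha, beta)
```

With real count data one would weight the fit by the binomial variance of each surviving fraction, but the linearized fit shows the mechanics of comparing the quadratic model's fit against an alternative such as the two-component model.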

10. Climate Change and Market Collapse: A Model Applied to Darfur

Directory of Open Access Journals (Sweden)

Ola Olsson

2016-03-01

Full Text Available A recurring argument in the global debate is that climate deterioration is likely to make social conflicts over diminishing natural resources more common in the future. The exact mechanism behind such a development has so far not been successfully characterized in the literature. In this paper, we present a general model of a community populated by farmers and herders who can either divide up land in a market economy or in autarky. The key insight from our model is that decreasing resources can make trade between the two groups collapse, which in turn makes each group’s welfare independent of that of the other. Predictions from the model are then applied to the conflict in Darfur. Our analysis suggests that three decades of drought in the area can at least partially explain the observed disintegration of markets and the subsequent rise of social tensions.

11. Liquid-drop model applied to heavy ions irradiation

International Nuclear Information System (INIS)

De Cicco, Hernan; Alurralde, Martin A.; Saint-Martin, Maria L. G.; Bernaola, Omar A.

1999-01-01

The liquid-drop model, previously applied in the study of radiation damage in metals, is used here in an energy range not covered by molecular dynamics, in order to understand experimental data on particle tracks in an organic material (Makrofol E) which cannot be accurately described by existing theoretical methods. The nuclear and electronic energy depositions are considered for each ion, and the evolution of the thermal explosion is evaluated. The experimental observation of particle tracks in a region previously considered 'prohibited' is justified. Although the model has free parameters and some discrepancies with the experimental diametrical values exist, the agreement obtained is markedly better than that of other existing models. (author)

12. Linear mixing model applied to AVHRR LAC data

Science.gov (United States)

Holben, Brent N.; Shimabukuro, Yosio E.

1993-01-01

A linear mixing model was applied to coarse spatial resolution data from the NOAA Advanced Very High Resolution Radiometer. The reflective component of the 3.55 - 3.93 microns channel was extracted and used with the two reflective channels 0.58 - 0.68 microns and 0.725 - 1.1 microns to run a Constrained Least Squares model to generate vegetation, soil, and shade fraction images for an area in the Western region of Brazil. The Landsat Thematic Mapper data covering the Emas National Park region was used for estimating the spectral response of the mixture components and for evaluating the mixing model results. The fraction images were compared with an unsupervised classification derived from Landsat TM data acquired on the same day. The relationship between the fraction images and normalized difference vegetation index images shows the potential of the unmixing techniques when using coarse resolution data for global studies.
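A minimal form of constrained least squares unmixing solves, per pixel, for endmember fractions that sum to one; the endmember reflectances below are hypothetical stand-ins, not the paper's values:

```python
import numpy as np

def unmix(pixel, endmembers):
    """Sum-to-one constrained least squares unmixing: minimise
    ||E f - pixel||^2 subject to sum(f) = 1, solved via the KKT system
    [2 E^T E, 1; 1^T, 0] [f; lam] = [2 E^T pixel; 1]."""
    E = endmembers
    n = E.shape[1]
    kkt = np.zeros((n + 1, n + 1))
    kkt[:n, :n] = 2.0 * E.T @ E
    kkt[:n, n] = 1.0
    kkt[n, :n] = 1.0
    rhs = np.append(2.0 * E.T @ pixel, 1.0)
    return np.linalg.solve(kkt, rhs)[:n]

# Hypothetical endmember reflectances for three AVHRR channels
# (columns: vegetation, soil, shade)
E = np.array([[0.05, 0.25, 0.02],
              [0.45, 0.30, 0.03],
              [0.10, 0.28, 0.02]])
pixel = E @ np.array([0.6, 0.3, 0.1])   # a synthetic 60/30/10 mixture
print(unmix(pixel, E))
```

Running unmix over every pixel yields the vegetation, soil and shade fraction images; a production implementation would additionally clip or constrain the fractions to be non-negative.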

13. Remote sensing applied to numerical modelling. [water resources pollution

Science.gov (United States)

Sengupta, S.; Lee, S. S.; Veziroglu, T. N.; Bland, R.

1975-01-01

Progress and remaining difficulties in the construction of predictive mathematical models of large bodies of water as ecosystems are reviewed. Surface temperature is at present the only variable that can be measured accurately and reliably by remote sensing techniques, but satellite infrared data are of sufficient resolution for macro-scale modeling of oceans and large lakes, and airborne radiometers are useful in meso-scale analysis (of lakes, bays, and thermal plumes). Finite-element and finite-difference techniques applied to the solution of relevant coupled time-dependent nonlinear partial differential equations are compared, and the specific problem of the Biscayne Bay and environs ecosystem is tackled in a finite-differences treatment using the rigid-lid model and a rigid-line grid system.

14. Apply Functional Modelling to Consequence Analysis in Supervision Systems

DEFF Research Database (Denmark)

Zhang, Xinxin; Lind, Morten; Gola, Giulio

2013-01-01

This paper will first present the purpose and goals of applying functional modelling approach to consequence analysis by adopting Multilevel Flow Modelling (MFM). MFM Models describe a complex system in multiple abstraction levels in both means-end dimension and whole-part dimension. It contains...... consequence analysis to practical or online applications in supervision systems. It will also suggest a multiagent solution as the integration architecture for developing tools to facilitate the utilization results of functional consequence analysis. Finally a prototype of the multiagent reasoning system...... causal relations between functions and goals. A rule base system can be developed to trace the causal relations and perform consequence propagations. This paper will illustrate how to use MFM for consequence reasoning by using rule base technology and describe the challenges for integrating functional...

15. Applying Probabilistic Decision Models to Clinical Trial Design

Science.gov (United States)

Smith, Wade P; Phillips, Mark H

2018-01-01

Clinical trial design most often focuses on a single or several related outcomes with corresponding calculations of statistical power. We consider a clinical trial to be a decision problem, often with competing outcomes. Using a current controversy in the treatment of HPV-positive head and neck cancer, we apply several different probabilistic methods to help define the range of outcomes given different possible trial designs. Our model incorporates the uncertainties in the disease process and treatment response and the inhomogeneities in the patient population. Instead of expected utility, we have used a Markov model to calculate quality adjusted life expectancy as a maximization objective. Monte Carlo simulations over realistic ranges of parameters are used to explore different trial scenarios given the possible ranges of parameters. This modeling approach can be used to better inform the initial trial design so that it will more likely achieve clinical relevance.
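The Markov-model-plus-Monte-Carlo approach described here can be sketched with a toy three-state model; the states, transition probabilities and utilities below are invented for illustration, not the trial's actual parameters:

```python
import random

# Hypothetical 3-state yearly Markov model: well -> recurrence -> dead.
# Utilities weight each cycle lived, so accumulated utility approximates
# quality-adjusted life expectancy (QALE), the maximization objective.
P = {
    "well":  [("well", 0.85), ("recur", 0.10), ("dead", 0.05)],
    "recur": [("recur", 0.70), ("dead", 0.30)],
}
UTILITY = {"well": 0.9, "recur": 0.6, "dead": 0.0}

def simulate_qale(n_patients=20000, horizon=40, seed=1):
    """Monte Carlo estimate of QALE over a fixed horizon of yearly cycles."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_patients):
        state = "well"
        for _ in range(horizon):
            if state == "dead":
                break
            total += UTILITY[state]          # utility accrued this cycle
            r, cum = rng.random(), 0.0
            for nxt, p in P[state]:          # sample the next state
                cum += p
                if r < cum:
                    state = nxt
                    break
    return total / n_patients

print(simulate_qale())
```

Comparing such simulated QALE distributions under alternative treatment arms, with parameters drawn from their uncertainty ranges, is how the modelling approach explores the space of possible trial outcomes before the design is fixed.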

16. Applying a Dynamic Resource Supply Model in a Smart Grid

Directory of Open Access Journals (Sweden)

Kaiyu Wan

2014-09-01

Full Text Available Dynamic resource supply is a complex issue to resolve in a cyber-physical system (CPS). In our previous work, a resource model called the dynamic resource supply model (DRSM) has been proposed to handle resource specification, management and allocation in CPS. In this paper, we integrate the DRSM with service-oriented architecture and apply it to a smart grid (SG), one of the most complex CPS examples. We give the detailed design of the SG for electricity charging requests and electricity allocation between plug-in hybrid electric vehicles (PHEVs) and the DRSM through the Android system. In the design, we explain a mechanism for electricity consumption with data collection and re-allocation through a ZigBee network. In this design, we verify the correctness of this resource model for the expected electricity allocation.

17. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

NARCIS (Netherlands)

Postema, Björn Frits; Remke, Anne Katharina Ingrid; Haverkort, Boudewijn R.H.M.; Ghasemieh, Hamed

2014-01-01

Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to

18. Nature preservation acceptance model applied to tanker oil spill simulations

DEFF Research Database (Denmark)

Friis-Hansen, Peter; Ditlevsen, Ove Dalager

2003-01-01

is exemplified by a study of oil spills due to simulated tanker collisions in the Danish straits. It is found that the distribution of the oil spill volume per spill is well represented by an exponential distribution both in Oeresund and in Great Belt. When applied in the Poisson model, a risk profile reasonably...... acceptance criterion for the pollution of the environment. This NPWI acceptance criterion is applied to the oil spill example....... be defined in a similar way as the so-called Life Quality Index defined by Nathwani et al [Nathwani JS, Lind NC, Padey MD. Affordable safety by choice: the life quality method. Institute for Risk Research, University of Waterloo; Waterloo (Ontario, Canada):1997], and can be used to quantify the risk...
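The combination of a Poisson model for spill occurrences with an exponential distribution for spill volume is a compound Poisson process, which can be simulated directly; the rate and mean volume below are illustrative placeholders, not the study's fitted values:

```python
import random

def annual_spill_volume(rate_per_year, mean_volume, rng):
    """One simulated year of a compound Poisson process: the spill count is
    Poisson(rate_per_year), drawn via exponential inter-arrival times, and
    each spill volume is Exponential with the given mean."""
    count, t = 0, rng.expovariate(rate_per_year)
    while t < 1.0:
        count += 1
        t += rng.expovariate(rate_per_year)
    return sum(rng.expovariate(1.0 / mean_volume) for _ in range(count))

rng = random.Random(42)
years = [annual_spill_volume(0.5, 2000.0, rng) for _ in range(20000)]
print(sum(years) / len(years))   # long-run mean near rate * mean volume
```

Tabulating the simulated annual volumes gives exactly the kind of risk profile the study derives from the tanker collision simulations.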

19. Parametric and semiparametric models with applications to reliability, survival analysis, and quality of life

CERN Document Server

Nikulin, M; Mesbah, M; Limnios, N

2004-01-01

Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.

20. Experimental designs for autoregressive models applied to industrial maintenance

International Nuclear Information System (INIS)

Amo-Salas, M.; López-Fidalgo, J.; Pedregal, D.J.

2015-01-01

Some time series applications require data which are either expensive or technically difficult to obtain. In such cases scheduling the points in time at which the information should be collected is of paramount importance in order to optimize the resources available. In this paper time series models are studied from a new perspective, consisting in the use of the Optimal Experimental Design setup to obtain the best times to take measurements, with the principal aim of saving costs or discarding useless information. The model and the covariance function are expressed in an explicit form to apply the usual techniques of Optimal Experimental Design. Optimal designs for various approaches are computed and their efficiencies are compared. The methods are illustrated with an application to the industrial maintenance of a critical piece of equipment at a petrochemical plant. This simple model allows explicit calculations in order to show openly the procedure to find the correlation structure, needed for computing the optimal experimental design. In this sense the techniques used in this paper to compute optimal designs may be transferred to other situations following the ideas of the paper, but taking into account the increasing difficulty of the procedure for more complex models. - Highlights:
• Optimal experimental design theory is applied to AR models to reduce costs.
• The first observation has an important impact on any optimal design.
• Either the lack of precision or small starting observations calls for large times.
• Reasonable optimal times were obtained by relaxing the efficiency slightly.
• Optimal designs were computed in a predictive maintenance context.

1. Inverse geothermal modelling applied to Danish sedimentary basins

Science.gov (United States)

Poulsen, Søren E.; Balling, Niels; Bording, Thue S.; Mathiesen, Anders; Nielsen, Søren B.

2017-10-01

This paper presents a numerical procedure for predicting subsurface temperatures and heat-flow distribution in 3-D using inverse calibration methodology. The procedure is based on a modified version of the groundwater code MODFLOW by taking advantage of the mathematical similarity between confined groundwater flow (Darcy's law) and heat conduction (Fourier's law). Thermal conductivity, heat production and exponential porosity-depth relations are specified separately for the individual geological units of the model domain. The steady-state temperature model includes a model-based transient correction for the long-term palaeoclimatic thermal disturbance of the subsurface temperature regime. Variable model parameters are estimated by inversion of measured borehole temperatures with uncertainties reflecting their quality. The procedure facilitates uncertainty estimation for temperature predictions. The modelling procedure is applied to Danish onshore areas containing deep sedimentary basins. A 3-D voxel-based model, with 14 lithological units from surface to 5000 m depth, was built from digital geological maps derived from combined analyses of reflection seismic lines and borehole information. Matrix thermal conductivity of model lithologies was estimated by inversion of all available deep borehole temperature data and applied together with prescribed background heat flow to derive the 3-D subsurface temperature distribution. Modelled temperatures are found to agree very well with observations. The numerical model was utilized for predicting and contouring temperatures at 2000 and 3000 m depths and for two main geothermal reservoir units, the Gassum (Lower Jurassic-Upper Triassic) and Bunter/Skagerrak (Triassic) reservoirs, both currently utilized for geothermal energy production. Temperature gradients to depths of 2000-3000 m are generally around 25-30 °C/km, locally up to about 35 °C/km. Large regions have geothermal reservoirs with characteristic temperatures
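The forward problem the inversion calibrates rests on Fourier's law; a 1-D layered sketch (ignoring heat production and palaeoclimatic correction, with assumed conductivities and heat flow, not the paper's calibrated values) shows how such gradients arise:

```python
def temperature_profile(surface_t, heat_flow, layers):
    """Steady-state 1-D conduction through a layered column (Fourier's law):
    with basal heat flow q (W/m^2) and conductivity k (W/m/K), the
    temperature rise across a layer of thickness dz is q * dz / k.
    layers: list of (thickness_m, conductivity_W_per_mK) tuples."""
    temps, t = [surface_t], surface_t
    for thickness, k in layers:
        t += heat_flow * thickness / k
        temps.append(t)
    return temps

# 8 deg C surface, 65 mW/m^2 heat flow, three 1 km layers (assumed values)
profile = temperature_profile(8.0, 0.065, [(1000, 1.5), (1000, 2.5), (1000, 3.0)])
print(profile)
```

Low-conductivity layers steepen the local gradient, which is why the fitted gradients vary around the 25-30 °C/km regional figure; the 3-D MODFLOW-based procedure generalizes this balance across 14 lithological units.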

2. Modeling the airborne survival of influenza virus in a residential setting: the impacts of home humidification

Science.gov (United States)

2010-01-01

Background Laboratory research studies indicate that aerosolized influenza viruses survive for longer periods at low relative humidity (RH) conditions. Further analysis has shown that absolute humidity (AH) may be an improved predictor of virus survival in the environment. Maintaining airborne moisture levels that reduce survival of the virus in the air and on surfaces could be another tool for managing public health risks of influenza. Methods A multi-zone indoor air quality model was used to evaluate the ability of portable humidifiers to control moisture content of the air and the potential related benefit of decreasing survival of influenza viruses in single-family residences. We modeled indoor AH and influenza virus concentrations during winter months (Northeast US) using the CONTAM multi-zone indoor air quality model. A two-story residential template was used under two different ventilation conditions - forced hot air and radiant heating. Humidity was evaluated on a room-specific and whole house basis. Estimates of emission rates for influenza virus were particle-size specific and derived from published studies and included emissions during both tidal breathing and coughing events. The survival of the influenza virus was determined based on the established relationship between AH and virus survival. Results The presence of a portable humidifier with an output of 0.16 kg water per hour in the bedroom resulted in an increase in median sleeping hours AH/RH levels of 11 to 19% compared to periods without a humidifier present. The associated percent decrease in influenza virus survival was 17.5 - 31.6%. Distribution of water vapor through a residence was estimated to yield 3 to 12% increases in AH/RH and 7.8-13.9% reductions in influenza virus survival. Conclusion This modeling analysis demonstrates the potential benefit of portable residential humidifiers in reducing the survival of aerosolized influenza virus by controlling humidity indoors. PMID:20815876
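The humidity analysis above rests on converting indoor relative humidity and temperature into absolute humidity. A minimal sketch of that conversion, using the Magnus approximation for saturation vapor pressure (the coefficients and the example conditions below are illustrative, not values taken from the study):

```python
import math

def absolute_humidity(temp_c: float, rh_percent: float) -> float:
    """Absolute humidity in g/m^3 from temperature (deg C) and relative humidity (%).

    Uses the Magnus approximation for saturation vapor pressure (hPa)
    and the ideal gas law for water vapor (R_v = 461.5 J/(kg K)).
    """
    e_sat = 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))  # hPa
    e = (rh_percent / 100.0) * e_sat                             # vapor pressure, hPa
    return 216.7 * e / (temp_c + 273.15)                         # g/m^3

# Example: raising RH from 20% to 35% at 21 deg C (typical heated indoor air)
low = absolute_humidity(21.0, 20.0)
high = absolute_humidity(21.0, 35.0)
```

At a fixed temperature, raising RH raises AH proportionally, which is why the humidifier scenarios report paired AH/RH increases.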

3. Modeling the airborne survival of influenza virus in a residential setting: the impacts of home humidification

Directory of Open Access Journals (Sweden)

Myatt Theodore A

2010-09-01

Full Text Available Abstract Background Laboratory research studies indicate that aerosolized influenza viruses survive for longer periods at low relative humidity (RH) conditions. Further analysis has shown that absolute humidity (AH) may be an improved predictor of virus survival in the environment. Maintaining airborne moisture levels that reduce survival of the virus in the air and on surfaces could be another tool for managing public health risks of influenza. Methods A multi-zone indoor air quality model was used to evaluate the ability of portable humidifiers to control moisture content of the air and the potential related benefit of decreasing survival of influenza viruses in single-family residences. We modeled indoor AH and influenza virus concentrations during winter months (Northeast US) using the CONTAM multi-zone indoor air quality model. A two-story residential template was used under two different ventilation conditions - forced hot air and radiant heating. Humidity was evaluated on a room-specific and whole house basis. Estimates of emission rates for influenza virus were particle-size specific and derived from published studies and included emissions during both tidal breathing and coughing events. The survival of the influenza virus was determined based on the established relationship between AH and virus survival. Results The presence of a portable humidifier with an output of 0.16 kg water per hour in the bedroom resulted in an increase in median sleeping hours AH/RH levels of 11 to 19% compared to periods without a humidifier present. The associated percent decrease in influenza virus survival was 17.5 - 31.6%. Distribution of water vapor through a residence was estimated to yield 3 to 12% increases in AH/RH and 7.8-13.9% reductions in influenza virus survival. Conclusion This modeling analysis demonstrates the potential benefit of portable residential humidifiers in reducing the survival of aerosolized influenza virus by controlling humidity

4. Modeling the effects of binary mixtures on survival in time.

NARCIS (Netherlands)

Baas, J.; van Houte, B.P.P.; van Gestel, C.A.M.; Kooijman, S.A.L.M.

2007-01-01

In general, effects of mixtures are difficult to describe, and most of the models in use are descriptive in nature and lack a strong mechanistic basis. The aim of this experiment was to develop a process-based model for the interpretation of mixture toxicity measurements, with effects of binary

5. Applied economic model development algorithm for electronics company

Directory of Open Access Journals (Sweden)

Mikhailov I.

2017-01-01

Full Text Available The purpose of this paper is to report on experience gained in creating methods and algorithms that simplify the development of applied decision support systems. It describes an algorithm that is the result of two years of research and more than one year of practical verification. In the case of testing electronic components, the moment of contract conclusion is the point at which the greatest managerial mistakes can be made: at this stage, it is difficult to achieve a realistic assessment of the time limit and wage fund for the future work. Creating an estimating model is one possible way to solve this problem. The article presents an algorithm for building such models, based on the example of developing an analytical model for estimating the amount of work. The paper lists the algorithm’s stages and explains their meaning together with the participants’ goals. Implementing the algorithm made it possible to halve the development time of these models and to fulfil management’s requirements, and the resulting models have had a significant economic effect. A new set of tasks was identified for further theoretical study.

6. Modeling of pathogen survival during simulated gastric digestion.

Science.gov (United States)

Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru

2011-02-01

The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided highly accurate fits under static pH conditions for every pathogen. However, while the residual plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residual plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens.
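As a rough sketch of the modelling chain described in this record, a secondary square-root-type model can map pH to an inactivation rate, which is then integrated over a time-varying gastric pH profile. The coefficients, the pH profile, and the use of plain first-order kinetics in place of the authors' modified logistic model are all illustrative assumptions:

```python
import math

def inactivation_rate(ph: float, b: float = 0.35, ph_max: float = 6.0) -> float:
    """Square-root-type secondary model: sqrt(k) = b * (ph_max - pH) for pH < ph_max.

    b and ph_max are hypothetical coefficients, not the paper's estimates.
    """
    if ph >= ph_max:
        return 0.0
    return (b * (ph_max - ph)) ** 2

def simulate_survival(ph_profile, log10_n0=6.0, dt=1.0):
    """Integrate first-order inactivation dN/dt = -k(pH(t)) N over a pH profile
    (one pH value per time step); returns log10 cell count over time."""
    log10_n = [log10_n0]
    for ph in ph_profile:
        k = inactivation_rate(ph)
        # first-order decay over one time step, expressed in log10 units
        log10_n.append(log10_n[-1] - k * dt / math.log(10))
    return log10_n

# Hypothetical gastric pH profile: pH drops from 5 toward 2 during digestion
profile = [5.0 - 3.0 * min(t / 60.0, 1.0) for t in range(120)]
counts = simulate_survival(profile)
```

The key structural point survives the simplification: inactivation accelerates sharply as gastric pH falls, so most of the log reduction occurs late in the profile.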

7. Modeling of Pathogen Survival during Simulated Gastric Digestion

Science.gov (United States)

Koseki, Shige; Mizuno, Yasuko; Sotome, Itaru

2011-01-01

The objective of the present study was to develop a mathematical model of pathogenic bacterial inactivation kinetics in a gastric environment in order to further understand a part of the infectious dose-response mechanism. The major bacterial pathogens Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella spp. were examined by using simulated gastric fluid adjusted to various pH values. To correspond to the various pHs in a stomach during digestion, a modified logistic differential equation model and the Weibull differential equation model were examined. The specific inactivation rate for each pathogen was successfully described by a square-root model as a function of pH. The square-root models were combined with the modified logistic differential equation to obtain a complete inactivation curve. Both the modified logistic and Weibull models provided highly accurate fits under static pH conditions for every pathogen. However, while the residual plots of the modified logistic model indicated no systematic bias and/or regional prediction problems, the residual plots of the Weibull model showed a systematic bias. The modified logistic model appropriately predicted the pathogen behavior in the simulated gastric digestion process with actual food, including cut lettuce, minced tuna, hamburger, and scrambled egg. Although the developed model enabled us to predict pathogen inactivation during gastric digestion, its results also suggested that the ingested bacteria in the stomach would barely be inactivated in the real digestion process. The results of this study will provide important information on a part of the dose-response mechanism of bacterial pathogens. PMID:21131530

8. A hands-on approach for fitting long-term survival models under the GAMLSS framework.

Science.gov (United States)

de Castro, Mário; Cancho, Vicente G; Rodrigues, Josemar

2010-02-01

In many data sets from clinical studies there are patients insusceptible to the occurrence of the event of interest. Survival models which ignore this fact are generally inadequate. The main goal of this paper is to describe an application of the generalized additive models for location, scale, and shape (GAMLSS) framework to the fitting of long-term survival models. In this work the number of competing causes of the event of interest follows the negative binomial distribution. In this way, some well known models found in the literature are characterized as particular cases of our proposal. The model is conveniently parameterized in terms of the cured fraction, which is then linked to covariates. We explore the use of the gamlss package in R as a powerful tool for inference in long-term survival models. The procedure is illustrated with a numerical example. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
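For intuition, the long-term survival structure described in this record can be sketched as a population survival function with a cured fraction. The sketch below uses the negative binomial competing-causes form S_pop(t) = (1 + η·θ·F(t))^(−1/η) with a Weibull time-to-event distribution for the susceptibles; all parameter values are illustrative, and this is not the paper's fitted GAMLSS model:

```python
import math

def weibull_cdf(t: float, shape: float = 1.5, scale: float = 2.0) -> float:
    """Weibull CDF for the susceptibles' time to event (illustrative parameters)."""
    if t <= 0:
        return 0.0
    return 1.0 - math.exp(-((t / scale) ** shape))

def pop_survival(t: float, theta: float = 1.2, eta: float = 0.5) -> float:
    """Population survival under a negative binomial number of competing causes:
    S_pop(t) = (1 + eta * theta * F(t)) ** (-1/eta), where theta is the mean
    number of causes and eta the dispersion (both hypothetical values here)."""
    return (1.0 + eta * theta * weibull_cdf(t)) ** (-1.0 / eta)

def cured_fraction(theta: float = 1.2, eta: float = 0.5) -> float:
    """Long-term survivors: the limit of S_pop(t) as t grows."""
    return (1.0 + eta * theta) ** (-1.0 / eta)
```

Unlike a proper survival function, S_pop levels off at the cured fraction instead of decaying to zero, which is exactly why models ignoring insusceptible patients are inadequate.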

9. Enhanced PID vs Model Predictive Control Applied to BLDC Motor

Science.gov (United States)

Gaya, M. S.; Muhammad, Auwal; Aliyu Abdulkadir, Rabiu; Salim, S. N. S.; Madugu, I. S.; Tijjani, Aminu; Aminu Yusuf, Lukman; Dauda Umar, Ibrahim; Khairi, M. T. M.

2018-01-01

BrushLess Direct Current (BLDC) motors are multivariable, highly complex nonlinear systems. Variation of internal parameter values with the environment or the reference signal increases the difficulty of controlling the BLDC effectively. Advanced control strategies (such as model predictive control) often have to be integrated to satisfy the control objectives, while enhancing or properly tuning a conventional algorithm can also achieve the desired performance. This paper presents a performance comparison of an enhanced (PSO-tuned) PID controller and Model Predictive Control (MPC) applied to a brushless direct current motor. The simulation results demonstrate that the PSO-PID tracks the trajectory of the reference signal slightly better than the conventional PID and the MPC. The proposed scheme could be a useful algorithm for such systems.
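To make the comparison concrete, a minimal discrete PID loop on a toy first-order plant is sketched below. The plant, gains, and setpoint are illustrative stand-ins, not the BLDC model or the tuning used in the paper:

```python
def simulate_pid(kp=2.0, ki=5.0, kd=0.1, setpoint=1.0, dt=0.01, steps=1000):
    """Discrete PID controlling a toy first-order plant dx/dt = -x + u."""
    x = 0.0                     # plant state (e.g. normalized motor speed)
    integral = 0.0
    prev_error = setpoint - x   # initialize so the first derivative term is zero
    for _ in range(steps):
        error = setpoint - x
        integral += error * dt
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        prev_error = error
        x += dt * (-x + u)      # forward-Euler step of the plant
    return x

final = simulate_pid()
```

The integral term removes the steady-state error, so the output settles on the setpoint; PSO tuning, as in the paper, would search over (kp, ki, kd) for the best tracking.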

10. Elastic models: a comparative study applied to retinal images.

Science.gov (United States)

Karali, E; Lambropoulou, S; Koutsouris, D

2011-01-01

In this work various methods of parametric elastic models are compared, namely the classical snake, the gradient vector field snake (GVF snake) and the topology-adaptive snake (t-snake), as well as the method of the self-affine mapping system as an alternative to elastic models. We also give a brief overview of the methods used. The self-affine mapping system is implemented using an adapting scheme and minimum distance as the optimization criterion, which is more suitable for weak edge detection. All methods are applied to glaucomatous retinal images with the purpose of segmenting the optic disc. The methods are compared in terms of segmentation accuracy and speed, as derived from cross-correlation coefficients between real and algorithm-extracted contours and from segmentation time, respectively. As a result, the method of the self-affine mapping system presents adequate segmentation time and segmentation accuracy, and significant independence from initialization.

11. Modelling Tradescantia fluminensis to assess long term survival

Directory of Open Access Journals (Sweden)

Alex James

2015-06-01

Full Text Available We present a simple Poisson process model for the growth of Tradescantia fluminensis, an invasive plant species that inhibits the regeneration of native forest remnants in New Zealand. The model was parameterised with data derived from field experiments in New Zealand and then verified with independent data. The model gave good predictions which showed that its underlying assumptions are sound. However, this simple model had less predictive power for outputs based on variance suggesting that some assumptions were lacking. Therefore, we extended the model to include higher variability between plants thereby improving its predictions. This high variance model suggests that control measures that promote node death at the base of the plant or restrict the main stem growth rate will be more effective than those that reduce the number of branching events. The extended model forms a good basis for assessing the efficacy of various forms of control of this weed, including the recently-released leaf-feeding tradescantia leaf beetle (Neolema ogloblini).

12. Revealing the equivalence of two clonal survival models by principal component analysis

International Nuclear Information System (INIS)

Lachet, Bernard; Dufour, Jacques

1976-01-01

The principal component analysis of 21 Chlorella cell survival curves, adjusted by one-hit and two-hit target models, leads to quite similar projections on the principal plane: the homologous parameters of these models are linearly correlated; the reason for the statistical equivalence of these two models, in the present state of experimental inaccuracy, is revealed [fr

13. A Multiscale Survival Process for Modeling Human Activity Patterns.

Science.gov (United States)

Zhang, Tianyang; Cui, Peng; Song, Chaoming; Zhu, Wenwu; Yang, Shiqiang

2016-01-01

Human activity plays a central role in understanding large-scale social dynamics. It is well documented that individual activity patterns follow bursty dynamics characterized by heavy-tailed interevent time distributions. Here we study a large-scale online chatting dataset consisting of 5,549,570 users, finding that individual activity patterns vary with timescale, whereas existing models only approximate empirical observations within a limited timescale. We propose a novel approach that models the intensity rate of an individual triggering an activity. We demonstrate that the model precisely captures corresponding human dynamics across multiple timescales over five orders of magnitude. Our model also allows extracting the population heterogeneity of activity patterns, characterized by a set of individual-specific ingredients. Integrating our approach with social interactions leads to a wide range of implications.

14. A bidirectional coupling procedure applied to multiscale respiratory modeling

Energy Technology Data Exchange (ETDEWEB)

Kuprat, A.P., E-mail: andrew.kuprat@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Kabilan, S., E-mail: senthil.kabilan@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Carson, J.P., E-mail: james.carson@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Corley, R.A., E-mail: rick.corley@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Einstein, D.R., E-mail: daniel.einstein@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States)

2013-07-01

pressure applied to the multiple sets of ODEs. In both the simplified geometry and in the imaging-based geometry, the performance of the method was comparable to that of monolithic schemes, in most cases requiring only a single CFD evaluation per time step. Thus, this new accelerator allows us to begin combining pulmonary CFD models with lower-dimensional models of pulmonary mechanics with little computational overhead. Moreover, because the CFD and lower-dimensional models are totally separate, this framework affords great flexibility in terms of the type and breadth of the adopted lower-dimensional model, allowing the biomedical researcher to appropriately focus on model design. Research funded by the National Heart, Lung, and Blood Institute Award 1RO1HL073598.

15. A bidirectional coupling procedure applied to multiscale respiratory modeling

Science.gov (United States)

Kuprat, A. P.; Kabilan, S.; Carson, J. P.; Corley, R. A.; Einstein, D. R.

2013-07-01

pressure applied to the multiple sets of ODEs. In both the simplified geometry and in the imaging-based geometry, the performance of the method was comparable to that of monolithic schemes, in most cases requiring only a single CFD evaluation per time step. Thus, this new accelerator allows us to begin combining pulmonary CFD models with lower-dimensional models of pulmonary mechanics with little computational overhead. Moreover, because the CFD and lower-dimensional models are totally separate, this framework affords great flexibility in terms of the type and breadth of the adopted lower-dimensional model, allowing the biomedical researcher to appropriately focus on model design. Research funded by the National Heart, Lung, and Blood Institute Award 1RO1HL073598.

16. A bidirectional coupling procedure applied to multiscale respiratory modeling

International Nuclear Information System (INIS)

Kuprat, A.P.; Kabilan, S.; Carson, J.P.; Corley, R.A.; Einstein, D.R.

2013-01-01

pressure applied to the multiple sets of ODEs. In both the simplified geometry and in the imaging-based geometry, the performance of the method was comparable to that of monolithic schemes, in most cases requiring only a single CFD evaluation per time step. Thus, this new accelerator allows us to begin combining pulmonary CFD models with lower-dimensional models of pulmonary mechanics with little computational overhead. Moreover, because the CFD and lower-dimensional models are totally separate, this framework affords great flexibility in terms of the type and breadth of the adopted lower-dimensional model, allowing the biomedical researcher to appropriately focus on model design. Research funded by the National Heart, Lung, and Blood Institute Award 1RO1HL073598

17. Regression models for interval censored survival data: Application to HIV infection in Danish homosexual men

DEFF Research Database (Denmark)

Carstensen, Bendix

1996-01-01

This paper shows how to fit excess and relative risk regression models to interval censored survival data, and how to implement the models in standard statistical software. The methods developed are used for the analysis of HIV infection rates in a cohort of Danish homosexual men.
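The core of fitting interval censored survival data is a likelihood built from survival-function differences: an event known only to lie in (L, R] contributes S(L) − S(R), and a right-censored observation contributes S(L). A minimal sketch with a Weibull model and a crude grid search in place of a proper optimizer (the data and the grid are illustrative; the paper fits excess and relative risk regression models, not this toy):

```python
import math

def weib_surv(t: float, scale: float, shape: float) -> float:
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape)) if t > 0 else 1.0

def neg_loglik(data, scale, shape):
    """data: list of (L, R) intervals; R = None means right-censored at L."""
    nll = 0.0
    for left, right in data:
        if right is None:
            p = weib_surv(left, scale, shape)            # still event-free at L
        else:
            p = weib_surv(left, scale, shape) - weib_surv(right, scale, shape)
        nll -= math.log(max(p, 1e-300))                  # guard against log(0)
    return nll

# Illustrative interval-censored data, e.g. infection detected between visits
data = [(0.0, 2.0), (1.0, 3.0), (2.0, 5.0), (4.0, None), (3.0, 6.0), (5.0, None)]

# Crude grid search over (scale, shape)
best = min(
    ((s, k) for s in [1 + 0.25 * i for i in range(40)]
            for k in [0.5 + 0.1 * j for j in range(30)]),
    key=lambda p: neg_loglik(data, *p),
)
```

In practice the same likelihood is maximized with a real optimizer, and covariates enter through the scale (or the hazard) as in the regression models of the paper.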

18. Risk matrix model applied to the outsourcing of logistics' activities

Directory of Open Access Journals (Sweden)

2015-09-01

Full Text Available Purpose: This paper proposes the application of the risk matrix model in the field of logistics outsourcing. Such an application can serve as the basis for decision making regarding the conduct of risk management in the logistics outsourcing process and allow its prevention. Design/methodology/approach: This study is based on the risk management of logistics outsourcing in the retail sector in Morocco. The authors identify all possible risks and then classify and prioritize them using the Risk Matrix Model; finally, we arrive at four possible decisions for the identified risks. The analysis was made possible through interviews and discussions with the heads of departments and agents who are directly involved in each outsourced activity. Findings and Originality/value: It is possible to improve the risk matrix model by proposing more personalized prevention measures for each company that operates in mass-market retailing. Originality/value: This study is the only one conducted on the logistics outsourcing process in the retail sector in Morocco, with Label’vie as a case study. First, we identified as thoroughly as we could all possible risks; then we applied the Risk Matrix Model to sort them in ascending order of importance and criticality. As a result, we could provide decision-makers with a risk map for effective control of risks and better guidance of the risk management process.
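The Risk Matrix Model reduces to scoring each risk by likelihood × severity and mapping the score to one of four decisions, matching the four possible decisions mentioned in this record. A minimal sketch (the 5-point scales, the threshold values, and the decision labels are hypothetical, not those used in the study):

```python
def classify_risk(likelihood: int, severity: int) -> str:
    """Map a risk scored on 1-5 likelihood and 1-5 severity scales
    to one of four decisions via its matrix score (thresholds are illustrative)."""
    if not (1 <= likelihood <= 5 and 1 <= severity <= 5):
        raise ValueError("scores must be in 1..5")
    score = likelihood * severity
    if score <= 4:
        return "accept"
    if score <= 9:
        return "monitor"
    if score <= 15:
        return "mitigate"
    return "avoid"

# Rank a list of (name, likelihood, severity) risks by criticality
risks = [("supplier delay", 4, 3), ("data loss", 2, 5), ("strike", 1, 2)]
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
```

Sorting by score reproduces the "ascending order of importance and criticality" step; the classification then drives which prevention measure is assigned.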

19. Model output statistics applied to wind power prediction

Energy Technology Data Exchange (ETDEWEB)

Joensen, A; Giebel, G; Landberg, L [Risoe National Lab., Roskilde (Denmark); Madsen, H; Nielsen, H A [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

1999-03-01

Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil-fuelled power plants and a stronger position on electricity spot markets. In this paper prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: Extended Kalman Filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
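Of the estimators compared in this record, recursive least squares is the easiest to sketch: each new (input, output) pair updates the parameter estimate without refitting from scratch, and a forgetting factor λ < 1 lets the correction track time-varying model error. A minimal two-parameter sketch fitting y = a·x + b from a stream (the data are illustrative, not wind-power measurements):

```python
def rls_update(theta, P, phi, y, lam=1.0):
    """One recursive-least-squares step for a 2-parameter model y = phi . theta.

    theta: current estimate [a, b]; P: 2x2 covariance (list of lists);
    phi: regressor [x, 1]; lam: forgetting factor (1.0 = ordinary RLS).
    """
    # P @ phi
    Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
            P[1][0] * phi[0] + P[1][1] * phi[1]]
    denom = lam + phi[0] * Pphi[0] + phi[1] * Pphi[1]
    K = [Pphi[0] / denom, Pphi[1] / denom]          # gain vector
    err = y - (phi[0] * theta[0] + phi[1] * theta[1])
    theta = [theta[0] + K[0] * err, theta[1] + K[1] * err]
    # P = (P - K phi^T P) / lam   (P stays symmetric)
    P = [[(P[0][0] - K[0] * Pphi[0]) / lam, (P[0][1] - K[0] * Pphi[1]) / lam],
         [(P[1][0] - K[1] * Pphi[0]) / lam, (P[1][1] - K[1] * Pphi[1]) / lam]]
    return theta, P

# Fit y = 2x + 1 from a noise-free stream, starting from a vague prior
theta, P = [0.0, 0.0], [[1e6, 0.0], [0.0, 1e6]]
for i in range(50):
    x = i * 0.1
    theta, P = rls_update(theta, P, [x, 1.0], 2.0 * x + 1.0)
```

Setting lam below 1 (e.g. 0.99) discounts old data geometrically, which is the adaptivity the paper needs when the NWP model itself changes over time.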

20. Modeling time-to-event (survival) data using classification tree analysis.

Science.gov (United States)

Linden, Ariel; Yarnold, Paul R

2017-12-01

Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
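Model performance above is assessed with Brier scores; in its simplest uncensored form, the Brier score at a horizon t is the mean squared difference between the predicted survival probability and the observed survival indicator. A stripped-down sketch (it ignores the censoring weights used in the integrated Brier score, and the data are illustrative):

```python
def brier_score(event_times, pred_surv, horizon):
    """Brier score at a fixed horizon for uncensored survival data.

    event_times: observed event times; pred_surv: predicted P(T > horizon)
    for each subject. Lower is better; 0 means perfect prediction.
    """
    total = 0.0
    for t, s_hat in zip(event_times, pred_surv):
        alive = 1.0 if t > horizon else 0.0   # survival indicator at the horizon
        total += (alive - s_hat) ** 2
    return total / len(event_times)

times = [2.0, 5.0, 7.0, 9.0]
perfect = [1.0 if t > 6.0 else 0.0 for t in times]   # oracle predictions
vague = [0.5, 0.5, 0.5, 0.5]                          # uninformative predictions
```

Integrating this score over a grid of horizons (with inverse-probability-of-censoring weights) gives the integrated Brier score the paper uses to compare CTA and Cox models.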

1. Survival analysis

International Nuclear Information System (INIS)

1999-01-01

The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazard model
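The Kaplan-Meier estimator mentioned here multiplies, at each observed event time, the conditional probability of surviving that time: S(t) = Π (1 − d_i/n_i) over event times t_i ≤ t, with censored subjects leaving the risk set without contributing a death. A minimal sketch with made-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times: follow-up times; events: 1 if the event occurred, 0 if censored.
    Returns a list of (event_time, survival_probability) steps.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    steps = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        removed = 0
        # group all subjects tied at time t
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk   # conditional survival at t
            steps.append((t, surv))
        at_risk -= removed                   # censored subjects leave the risk set
    return steps

# Illustrative data: 5 subjects, one censored at t=4
km = kaplan_meier([1, 2, 4, 5, 6], [1, 1, 0, 1, 1])
```

Note how the censored subject at t=4 shrinks the risk set without producing a step, which is exactly how censoring is "accounted for" in the estimate.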

2. Survival Analysis of a Nonautonomous Logistic Model with Stochastic Perturbation

Directory of Open Access Journals (Sweden)

Chun Lu

2012-01-01

Full Text Available Taking white noise into account, a stochastic nonautonomous logistic model is proposed and investigated. Sufficient conditions for extinction, nonpersistence in the mean, weak persistence, stochastic permanence, and global asymptotic stability are established. Moreover, the threshold between weak persistence and extinction is obtained. Finally, we present some numerical simulation figures to illustrate our main results.
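A stochastic logistic model of this kind can be simulated directly. Below is an Euler-Maruyama sketch of dN = N(r − aN)dt + σN dW with constant coefficients (the paper's model is nonautonomous, i.e. has time-varying coefficients; all values here are illustrative):

```python
import random

def simulate_logistic(r=1.0, a=1.0, sigma=0.1, n0=0.5, dt=0.01, t_end=50.0, seed=42):
    """Euler-Maruyama simulation of dN = N(r - a N) dt + sigma N dW."""
    rng = random.Random(seed)
    n = n0
    path = [n]
    for _ in range(int(t_end / dt)):
        dw = rng.gauss(0.0, dt ** 0.5)          # Brownian increment
        n += n * (r - a * n) * dt + sigma * n * dw
        n = max(n, 1e-12)                        # guard: Euler steps can overshoot
        path.append(n)
    return path

path = simulate_logistic()
tail_mean = sum(path[len(path) // 2:]) / (len(path) - len(path) // 2)
```

With weak noise the path fluctuates around the deterministic carrying capacity r/a; increasing sigma pushes the system toward the extinction regime the paper characterizes.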

3. Survival of the Old Tamping Model in Bugis Houses in Kampong Bunne, Regency of Soppeng, South Sulawesi, Indonesia

Science.gov (United States)

Abidah, Andi

2017-10-01

Tamping is a circulation space leading from the terrace into the house, which also serves as a sitting space for community members of lower social rank. The tamping is positioned along one side of the main house. In the old model the tamping floor is slightly lower than the main house floor; this model is seldom found today, as the community prefers the new tamping model, in which the tamping floor is at the same level as the main house floor, and some new Bugis house models omit the tamping altogether. The old house model consists of four modules: three modules forming the main house and one module for the tamping. In the past, the old tamping model kept the tamping floor at a different level from the watangpola (main house) floor, but in many old Bugis houses the tamping floor has since been brought to the same level as the watangpola, while the newest model, called the eppa-eppa house, does not use a tamping at all. The community in Kampong Bunne still preserves the old tamping model on their houses, although several houses have changed their tamping to the form commonly applied today; the old model is still found in around 45 of the houses in the kampong. This study explores the application of the old tamping model of the Bugis house in Kampong Bunne, Regency of Soppeng, South Sulawesi. A qualitative research approach is used, and the study was developed on the basis of sketches, photographs and interviews.

4. Linear model applied to the evaluation of pharmaceutical stability data

Directory of Open Access Journals (Sweden)

Renato Cesar Souza

2013-09-01

Full Text Available The expiry date on the packaging of a product gives the consumer confidence that the product will retain its identity, content, quality and purity throughout its period of validity. In the pharmaceutical industry, the definition of this term is based on stability data obtained during product registration. Accordingly, this work aims to apply linear regression, following guideline ICH Q1E (2003), to evaluate some aspects of a product undergoing the registration phase in Brazil. The evaluation was carried out with the development center of a multinational company in Brazil, using samples of three different batches composed of two active pharmaceutical ingredients in two different packages. Based on the preliminary results obtained, it was possible to observe the different degradation tendencies of the product in the two packages and the relationship between the variables studied, adding knowledge so that new linear models can be applied and developed for other products.
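The core ICH Q1E procedure can be sketched numerically: fit assay (%) against time by ordinary least squares, compute the one-sided 95% lower confidence bound on the fitted mean, and take the shelf life as the latest time at which that bound still meets the specification. The data, the hard-coded t quantile, and the 90% spec limit below are all illustrative, not the study's values:

```python
import math

months = [0, 3, 6, 9, 12, 18, 24, 36]
assay = [100.2, 98.4, 97.1, 95.4, 94.2, 91.1, 87.9, 82.2]  # % of label claim

# Ordinary least squares fit of assay vs time
n = len(months)
xbar = sum(months) / n
ybar = sum(assay) / n
sxx = sum((x - xbar) ** 2 for x in months)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(months, assay)) / sxx
intercept = ybar - slope * xbar
# residual standard deviation, df = n - 2
sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(months, assay))
s = math.sqrt(sse / (n - 2))
T95 = 1.943  # one-sided 95% t quantile for df = 6, hard-coded for this n

def lower_bound(t):
    """One-sided 95% lower confidence bound on the mean assay at time t."""
    se = s * math.sqrt(1.0 / n + (t - xbar) ** 2 / sxx)
    return intercept + slope * t - T95 * se

# Shelf life: latest time (months, 0.1 grid) where the bound meets the 90% spec
shelf_life = max(t / 10.0 for t in range(0, 601) if lower_bound(t / 10.0) >= 90.0)
```

Because the confidence band widens away from the mean of the time points, the shelf life lands somewhat before the fitted mean line itself crosses the specification.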

5. Surviving the present: Modeling tools for organizational change

International Nuclear Information System (INIS)

Pangaro, P.

1992-01-01

The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them

6. Applying threshold models to donations to a green electricity fund

International Nuclear Information System (INIS)

Ito, Nobuyuki; Takeuchi, Kenji; Tsuge, Takahiro; Kishimoto, Atsuo

2010-01-01

This study applies a previously proposed threshold model to analyze the diffusion process of donating behavior for renewable energy. We first use a stated preference survey to estimate the determinants of a decision to support the donation scheme under various predicted participation rates. Using the estimated coefficients, we simulate how herd behavior spreads and how the participation rate reaches equilibrium. The participation rate at equilibrium is estimated as 37.88% when the suggested donation is 500 yen, and 17.76% when the suggested amount is 1000 yen. The influence of environmentalism and altruism is also examined, and we find that these motivations increase the participation rate by 31.51% on average.
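The equilibrium participation rate described here is a fixed point: each person's decision depends on the predicted participation rate, and iterating that mapping until it stabilizes gives the equilibrium. A minimal sketch with a logistic adoption probability (the coefficients are hypothetical, not the study's estimates; in the study this role is played by the model fitted to the stated-preference survey):

```python
import math

def adoption_prob(predicted_rate, alpha=-1.5, beta=3.0):
    """Probability of donating given the predicted participation rate
    (logistic response with hypothetical coefficients)."""
    return 1.0 / (1.0 + math.exp(-(alpha + beta * predicted_rate)))

def equilibrium_rate(p0=0.1, tol=1e-9, max_iter=500):
    """Iterate p <- adoption probability at predicted rate p to a fixed point."""
    p = p0
    for _ in range(max_iter):
        p_next = adoption_prob(p)
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    return p

p_star = equilibrium_rate()
```

A positive beta encodes the herd effect: the more people are expected to participate, the more likely each individual is to join, and the iteration converges where expectation and behavior agree.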

7. Transient heat conduction in a pebble fuel applying fractional model

International Nuclear Information System (INIS)

Gomez A, R.; Espinosa P, G.

2009-10-01

In this paper we present the time-fractional thermal diffusion equation in one-dimensional space in spherical coordinates, with the objective of analyzing the heat transfer between the fuel and coolant in a fuel element of a Pebble Bed Modular Reactor. The pebble fuel is a heterogeneous system made of microspheres of UO2, pyrolytic carbon and silicon carbide mixed with graphite. To describe the heat transfer phenomena in the pebble fuel we applied a fractional (non-Fourier) constitutive law, and a numerical model is developed in order to analyze the transient behaviour of the temperature distribution in the pebble fuel with anomalous thermal diffusion effects. (Author)

8. The Eyring-Stover theory of survival applied to life-span radiation effects studies in animals

International Nuclear Information System (INIS)

Stover, B.J.; Wrenn, M.E.; Jee, W.S.S.; Atherton, D.R.

1986-01-01

The Eyring-Stover theory of survival describes the observed biological phenomena of damage and repair as steady-state processes that can be expressed in the formalism of absolute reaction rate theory. The steady-state formulation, rather than that of dynamic equilibrium, is invoked since biological phenomena, in contrast with most chemical and physical phenomena, are time irreversible. The theory is appropriate for calculating life shortening that results from environmental factors such as irradiation, since it does not require universality and intrinsicality as do some theories of aging. The theory gives not only midrange mortality rate values but also end-range values, which are difficult to predict empirically. The previously calculated life shortening of mice after external x-irradiation and of beagles after internal irradiation from 239Pu or 226Ra is reviewed; life shortening at low dose levels of 226Ra is presented. 21 refs., 1 tab

9. Using cure models for analyzing the influence of pathogens on salmon survival

Science.gov (United States)

Ray, Adam R; Perry, Russell W.; Som, Nicholas A.; Bartholomew, Jerri L

2014-01-01

Parasites and pathogens influence the size and stability of wildlife populations, yet many population models ignore the population-level effects of pathogens. Standard survival analysis methods (e.g., accelerated failure time models) are used to assess how survival rates are influenced by disease. However, they assume that each individual is equally susceptible and will eventually experience the event of interest; this assumption is not typically satisfied with regard to pathogens of wildlife populations. In contrast, mixture cure models, which comprise logistic regression and survival analysis components, allow for different covariates to be entered into each part of the model and provide better predictions of survival when a fraction of the population is expected to survive a disease outbreak. We fitted mixture cure models to the host–pathogen dynamics of Chinook Salmon Oncorhynchus tshawytscha and Coho Salmon O. kisutch and the myxozoan parasite Ceratomyxa shasta. Total parasite concentration, water temperature, and discharge were used as covariates to predict the observed parasite-induced mortality in juvenile salmonids collected as part of a long-term monitoring program in the Klamath River, California. The mixture cure models predicted the observed total mortality well, but some of the variability in observed mortality rates was not captured by the models. Parasite concentration and water temperature were positively associated with total mortality and the mortality rate of both Chinook Salmon and Coho Salmon. Discharge was positively associated with total mortality for both species but only affected the mortality rate for Coho Salmon. The mixture cure models provide insights into how daily survival rates change over time in Chinook Salmon and Coho Salmon after they become infected with C. shasta.
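The core mixture cure idea, S(t) = pi + (1 - pi) * S_u(t), can be sketched with a deliberately simplified model: a single cure fraction pi and exponential failure times for the susceptible fraction, fitted by maximum likelihood on simulated data. The sample size, rates, and follow-up time are invented; the actual study enters covariates such as parasite concentration and temperature into both model components.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)

# Simulated cohort: a "cured" fraction never experiences the event; the
# susceptible fraction has exponential event times (illustrative choices).
n, true_pi, true_rate, tau = 5000, 0.3, 0.5, 8.0   # tau = end of follow-up
cured = rng.random(n) < true_pi
times = np.where(cured, np.inf, rng.exponential(1.0 / true_rate, n))
event = times <= tau                    # True = mortality observed
t_obs = np.minimum(times, tau)          # censor at end of follow-up

def neg_loglik(params):
    """Mixture cure likelihood with S(t) = pi + (1 - pi) * exp(-lam * t)."""
    pi = expit(params[0])               # cure fraction, on the logit scale
    lam = np.exp(params[1])             # hazard of the susceptible group
    ll_event = np.log((1.0 - pi) * lam) - lam * t_obs[event]
    ll_cens = np.log(pi + (1.0 - pi) * np.exp(-lam * t_obs[~event]))
    return -(ll_event.sum() + ll_cens.sum())

fit = minimize(neg_loglik, x0=[0.0, 0.0])
pi_hat, lam_hat = expit(fit.x[0]), np.exp(fit.x[1])
```

The censored contribution uses the survivor function of the whole mixture, which is what lets the "cured" fraction be identified even though cured and censored-susceptible fish look identical in the data.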

10. A Survival Model for Shortleaf Pine Trees Growing in Uneven-Aged Stands

Science.gov (United States)

Thomas B. Lynch; Lawrence R. Gering; Michael M. Huebschmann; Paul A. Murphy

1999-01-01

A survival model for shortleaf pine (Pinus echinata Mill.) trees growing in uneven-aged stands was developed using data from permanently established plots maintained by an industrial forestry company in western Arkansas. Parameters were fitted to a logistic regression model with a Bernoulli dependent variable in which "0" represented...
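A logistic survival model of this kind can be sketched as follows; the diameter covariate and coefficient values are invented for illustration and are not the fitted shortleaf pine parameters.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(2)

# Hypothetical tree records: survival modeled on diameter (dbh) via
# logistic regression with a Bernoulli outcome (1 = tree survived).
n = 4000
dbh = rng.uniform(5.0, 40.0, n)              # diameter at breast height, cm
beta_true = np.array([-1.0, 0.12])           # intercept and dbh effect (assumed)
p_survive = expit(beta_true[0] + beta_true[1] * dbh)
alive = rng.random(n) < p_survive            # Bernoulli dependent variable

X = np.column_stack([np.ones(n), dbh])

def neg_loglik(beta):
    eta = X @ beta
    # Bernoulli log-likelihood in a numerically stable form
    return np.sum(np.logaddexp(0.0, eta) - alive * eta)

beta_hat = minimize(neg_loglik, x0=np.zeros(2)).x
```

With enough plot records, the fitted coefficients recover the assumed values; in practice stand-level covariates (basal area, site index) would enter the linear predictor the same way.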

11. Hierarchic stochastic modelling applied to intracellular Ca(2+) signals.

Directory of Open Access Journals (Sweden)

Gregor Moenke

Full Text Available Important biological processes like cell signalling and gene expression have noisy components and are very complex at the same time. Mathematical analysis of such systems has often been limited to the study of isolated subsystems, or approximations are used that are difficult to justify. Here we extend a recently published method (Thurley and Falcke, PNAS 2011) which is formulated in observable system configurations instead of molecular transitions. This reduces the number of system states by several orders of magnitude and avoids fitting of kinetic parameters. The method is applied to Ca(2+) signalling. Ca(2+) is a ubiquitous second messenger transmitting information by stochastic sequences of concentration spikes, which arise by coupling of subcellular Ca(2+) release events (puffs). We derive analytical expressions for a mechanistic Ca(2+) model, based on recent data from live cell imaging, and calculate Ca(2+) spike statistics in dependence on cellular parameters like stimulus strength or number of Ca(2+) channels. The new approach substantiates a generic Ca(2+) model, which is a very convenient way to simulate Ca(2+) spike sequences with correct spiking statistics.

12. Survival behavior in the cyclic Lotka-Volterra model with a randomly switching reaction rate

Science.gov (United States)

West, Robert; Mobilia, Mauro; Rucklidge, Alastair M.

2018-02-01

We study the influence of a randomly switching reproduction-predation rate on the survival behavior of the nonspatial cyclic Lotka-Volterra model, also known as the zero-sum rock-paper-scissors game, used to metaphorically describe the cyclic competition between three species. In large and finite populations, demographic fluctuations (internal noise) drive two species to extinction in a finite time, while the species with the smallest reproduction-predation rate is the most likely to be the surviving one (law of the weakest). Here we model environmental (external) noise by assuming that the reproduction-predation rate of the strongest species (the fastest to reproduce and predate) in a given static environment randomly switches between two values corresponding to more and less favorable external conditions. We study the joint effect of environmental and demographic noise on the species survival probabilities and on the mean extinction time. In particular, we investigate whether the survival probabilities follow the law of the weakest and analyze their dependence on the external noise intensity and switching rate. Remarkably, when, on average, there is a finite number of switches prior to extinction, the survival probability of the predator of the species whose reaction rate switches typically varies nonmonotonically with the external noise intensity (with optimal survival about a critical noise strength). We also outline the relationship with the case where all reaction rates switch on markedly different time scales.
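A toy individual-based simulation of the zero-sum cyclic competition illustrates the demographic-noise part of the story (the population size and equal rates are arbitrary choices here; the environmental switching of one species' rate studied in the paper is omitted for brevity):

```python
import numpy as np

rng = np.random.default_rng(3)

def surviving_species(n=60, rates=(1.0, 1.0, 1.0), max_steps=2_000_000):
    """Zero-sum rock-paper-scissors in a well-mixed population: species i
    preys on species i+1 (mod 3); each reaction converts one prey into one
    predator, so the total population stays constant. Demographic noise
    eventually drives two species to extinction."""
    counts = np.array([n, n, n], dtype=int)
    rates = np.asarray(rates, dtype=float)
    for _ in range(max_steps):
        if (counts > 0).sum() == 1:
            return int(np.argmax(counts))       # index of the sole survivor
        prey = np.roll(counts, -1)              # prey of species i is i+1
        w = rates * counts * prey               # reaction propensities
        i = rng.choice(3, p=w / w.sum())
        counts[i] += 1                          # predator reproduces
        counts[(i + 1) % 3] -= 1                # its prey loses an individual
    raise RuntimeError("no extinction within max_steps")

winners = [surviving_species() for _ in range(3)]
```

Running many such replicates (optionally with unequal `rates`, or with `rates[0]` flipped between two values during the loop to mimic dichotomous environmental noise) gives empirical survival probabilities against which the "law of the weakest" can be checked.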

14. Applying the Expectancy-Value Model to understand health values.

Science.gov (United States)

Zhang, Xu-Hao; Xie, Feng; Wee, Hwee-Lin; Thumboo, Julian; Li, Shu-Chuen

2008-03-01

Expectancy-Value Model (EVM) is the most structured model in psychology to predict attitudes by measuring attitudinal attributes (AAs) and relevant external variables. Because health value can be categorized as an attitude, we aimed to apply EVM to explore its usefulness in explaining variance in health values and to investigate underlying factors. A focus group discussion was carried out to identify the most common and significant AAs toward 5 different health states (coded as 11111, 11121, 21221, 32323, and 33333 in the EuroQol Five-Dimension (EQ-5D) descriptive system). AAs were measured as a sum of multiplications of subjective probability (expectancy) and perceived value of attributes on 7-point Likert scales. Health values were measured using visual analog scales (VAS, range 0-1). External variables (age, sex, ethnicity, education, housing, marital status, and concurrent chronic diseases) were also incorporated into a survey questionnaire distributed by convenience sampling among eligible respondents. Univariate analyses were used to identify external variables causing significant differences in VAS. A multiple linear regression model (MLR) and a hierarchical regression model were used to investigate the explanatory power of the AAs and possible significant external variable(s), separately or in combination, for each individual health state and a mixed scenario of five states, respectively. Four AAs were identified, namely, "worsening your quality of life in terms of health" (WQoL), "adding a burden to your family" (BTF), "making you less independent" (MLI) and "unable to work or study" (UWS). Data were analyzed based on 232 respondents (mean [SD] age: 27.7 [15.07] years, 49.1% female). Health values varied significantly across the 5 health states, ranging from 0.12 (33333) to 0.97 (11111). With no significant external variables identified, EVM explained up to 62% of the variance in health values across the 5 health states. The explanatory power of the 4 AAs was found to be between 13
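The EVM scoring step reduces to a sum of expectancy-times-value products over the attributes. The four AA names below are taken from the abstract; the Likert ratings are hypothetical.

```python
import numpy as np

# Attitudinal attributes from the abstract; ratings are hypothetical.
attributes = ["WQoL", "BTF", "MLI", "UWS"]
expectancy = np.array([6, 5, 4, 3])    # subjective probability, 7-point Likert
value = np.array([7, 6, 5, 6])         # perceived value, 7-point Likert

# EVM attitude score: sum over attributes of expectancy * value
attitude = int(np.sum(expectancy * value))   # 6*7 + 5*6 + 4*5 + 3*6 = 110
```

In the study these per-respondent scores become the predictors in the MLR and hierarchical regressions of the VAS health values.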

15. Cancer survival analysis using semi-supervised learning method based on Cox and AFT models with L1/2 regularization.

Science.gov (United States)

Liang, Yong; Chai, Hua; Liu, Xiao-Ying; Xu, Zong-Ben; Zhang, Hai; Leung, Kwong-Sak

2016-03-01

One of the most important objectives of clinical cancer research is to diagnose cancer more accurately based on patients' gene expression profiles. Both the Cox proportional hazards model (Cox) and the accelerated failure time model (AFT) have been widely adopted for high-risk and low-risk classification or survival time prediction in patients' clinical treatment. Nevertheless, two main dilemmas limit the accuracy of these prediction methods. One is that the small sample size and censored data remain a bottleneck for training a robust and accurate Cox classification model. In addition, tumours with similar phenotypes and prognoses are actually completely different diseases at the genotype and molecular level. Thus, the utility of the AFT model for survival time prediction is limited when such biological differences of the diseases have not been previously identified. To overcome these two main dilemmas, we proposed a novel semi-supervised learning method based on the Cox and AFT models to accurately predict the treatment risk and the survival time of patients. Moreover, we adopted the efficient L1/2 regularization approach in the semi-supervised learning method to select the relevant genes, which are significantly associated with the disease. The results of the simulation experiments show that the semi-supervised learning model can significantly improve the predictive performance of the Cox and AFT models in survival analysis. The proposed procedures have been successfully applied to four real microarray gene expression and artificial evaluation datasets. The advantages of our proposed semi-supervised learning method include: 1) significantly increasing the available training samples from censored data; 2) high capability for identifying the survival risk classes of patients in the Cox model; 3) high predictive accuracy for patients' survival time in the AFT model; 4) strong capability for relevant biomarker selection. Consequently, our proposed semi

16. Iterative Bayesian Model Averaging: a method for the application of survival analysis to high-dimensional microarray data

Directory of Open Access Journals (Sweden)

2009-02-01

Full Text Available Abstract Background Microarray technology is increasingly used to identify potential biomarkers for cancer prognostics and diagnostics. Previously, we have developed the iterative Bayesian Model Averaging (BMA) algorithm for use in classification. Here, we extend the iterative BMA algorithm for application to survival analysis on high-dimensional microarray data. The main goal in applying survival analysis to microarray data is to determine a highly predictive model of patients' time to event (such as death, relapse, or metastasis) using a small number of selected genes. Our multivariate procedure combines the effectiveness of multiple contending models by calculating the weighted average of their posterior probability distributions. Our results demonstrate that our iterative BMA algorithm for survival analysis achieves high prediction accuracy while consistently selecting a small and cost-effective number of predictor genes. Results We applied the iterative BMA algorithm to two cancer datasets: breast cancer and diffuse large B-cell lymphoma (DLBCL) data. On the breast cancer data, the algorithm selected a total of 15 predictor genes across 84 contending models from the training data. The maximum likelihood estimates of the selected genes and the posterior probabilities of the selected models from the training data were used to divide patients in the test (or validation) dataset into high- and low-risk categories. Using the genes and models determined from the training data, we assigned patients from the test data into highly distinct risk groups (as indicated by a p-value of 7.26e-05 from the log-rank test). Moreover, we achieved comparable results using only the 5 top selected genes with 100% posterior probabilities. On the DLBCL data, our iterative BMA procedure selected a total of 25 genes across 3 contending models from the training data. Once again, we assigned the patients in the validation set to significantly distinct risk groups (p

17. Cellular Automata Models Applied to the Study of Landslide Dynamics

Science.gov (United States)

Liucci, Luisa; Melelli, Laura; Suteanu, Cristian

2015-04-01

Landslides are caused by complex processes controlled by the interaction of numerous factors. Increasing efforts are being made to understand the spatial and temporal evolution of this phenomenon, and the use of remote sensing data is making significant contributions to improving forecasts. This paper studies landslides seen as complex dynamic systems, in order to investigate their potential Self-Organized Critical (SOC) behavior, and in particular the scale-invariant aspects of the processes governing the spatial development of landslides and their temporal evolution, as well as the mechanisms involved in driving the system and keeping it in a critical state. For this purpose, we build Cellular Automata Models, which have been shown to be capable of reproducing the complexity of real-world features using a small number of variables and simple rules, thus allowing for the reduction of the number of input parameters commonly used in the study of processes governing landslide evolution, such as those linked to the geomechanical properties of soils. This type of model has already been successfully applied in studying the dynamics of other natural hazards, such as earthquakes and forest fires. The basic structure of the model is composed of three modules: (i) An initialization module, which defines the topographic surface at time zero as a grid of square cells, each described by an altitude value; the surface is acquired from real Digital Elevation Models (DEMs). (ii) A transition function, which defines the rules used by the model to update the state of the system at each iteration. The rules use a stability criterion based on the slope angle and introduce a variable describing the weakening of the material over time, caused for example by rainfall. The weakening brings some sites of the system out of equilibrium, thus causing the triggering of landslides, which propagate within the system through local interactions between neighboring cells. By using different rates of
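Module (ii)'s slope-threshold rule can be sketched as follows; the random grid (standing in for a real DEM), the critical drop, and the half-excess transfer rule are illustrative choices, not the calibrated model.

```python
import numpy as np

rng = np.random.default_rng(5)

size, crit = 30, 4.0
z = rng.uniform(0.0, 50.0, (size, size))   # random surface standing in for a DEM

def relax(z, crit):
    """One CA sweep: every interior cell whose drop to its lowest
    4-neighbour exceeds `crit` sheds half of the excess height to that
    neighbour, mimicking downslope material transfer."""
    z = z.copy()
    moves = 0
    for i in range(1, z.shape[0] - 1):
        for j in range(1, z.shape[1] - 1):
            nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            k = min(nbrs, key=lambda ij: z[ij])
            drop = z[i, j] - z[k]
            if drop > crit:
                z[i, j] -= (drop - crit) / 2   # material leaves the cell...
                z[k] += (drop - crit) / 2      # ...and accumulates downslope
                moves += 1
    return z, moves

total = z.sum()                                # mass before relaxation
steps = 0
while steps < 500:
    z, moves = relax(z, crit)
    steps += 1
    if moves == 0:                             # surface has stabilized
        break
```

Material is conserved by construction; in the full model a weakening term would lower the effective threshold between sweeps, retriggering avalanches whose size distribution can then be examined for SOC signatures.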

18. Dose-rate dependent stochastic effects in radiation cell-survival models

International Nuclear Information System (INIS)

Sachs, R.K.; Hlatky, L.R.

1990-01-01

When cells are subjected to ionizing radiation the specific energy rate (microscopic analog of dose-rate) varies from cell to cell. Within one cell, this rate fluctuates during the course of time; a crossing of a sensitive cellular site by a high energy charged particle produces many ionizations almost simultaneously, but during the interval between events no ionizations occur. In any cell-survival model one can incorporate the effect of such fluctuations without changing the basic biological assumptions. Using stochastic differential equations and Monte Carlo methods to take into account stochastic effects, we calculated the dose-survival relationships in a number of current cell survival models. Some of the models assume quadratic misrepair; others assume saturable repair enzyme systems. It was found that a significant effect of random fluctuations is to decrease the theoretically predicted amount of dose-rate sparing. In the limit of low dose-rates, neglecting the stochastic nature of specific energy rates often leads to qualitatively misleading results by drastically overestimating the surviving fraction. In the opposite limit of acute irradiation, analyzing the fluctuations in rates merely amounts to analyzing fluctuations in total specific energy via the usual microdosimetric specific energy distribution function, and neglecting fluctuations usually underestimates the surviving fraction. The Monte Carlo methods interpolate systematically between the low dose-rate and high dose-rate limits. As in other approaches, the slope of the survival curve at low dose-rates is virtually independent of dose and equals the initial slope of the survival curve for acute radiation. (orig.)

19. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

Science.gov (United States)

Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

2016-01-01

Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing the SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on the outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
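The AFT side of this comparison can be sketched with a hand-rolled Weibull AFT likelihood (rather than SAS LIFEREG); the covariate, true parameters, and censoring time below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Weibull AFT: log(T) = b0 + b1*x + sigma*eps, eps ~ standard extreme value.
n, tau = 3000, 6.0                              # tau = administrative censoring
x = rng.binomial(1, 0.5, n)                     # e.g. a treatment indicator
b0, b1, sigma = 1.0, -0.5, 0.8                  # assumed true parameters
eps = np.log(rng.exponential(1.0, n))           # standard Gumbel(min) errors
t = np.exp(b0 + b1 * x + sigma * eps)           # Weibull-distributed times
event = t <= tau                                # False = censored at tau
t_obs = np.minimum(t, tau)

def neg_loglik(p):
    mu = p[0] + p[1] * x
    s = np.exp(p[2])                            # scale kept positive
    z = (np.log(t_obs) - mu) / s
    # events contribute the log-density, censored times the log-survivor
    ll = np.where(event, z - np.exp(z) - np.log(s), -np.exp(z))
    return -ll.sum()

p_hat = minimize(neg_loglik, x0=[0.0, 0.0, 0.0]).x
b1_hat, sigma_hat = p_hat[1], np.exp(p_hat[2])
```

In a mediation setting, the same likelihood is fitted with the mediator added to `mu`, and the direct and indirect effect coefficients are read off on the log-time scale, which is what makes AFT models combine naturally with linear mediation models.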

20. Defining Requirements and Applying Information Modeling for Protecting Enterprise Assets

Science.gov (United States)

Fortier, Stephen C.; Volk, Jennifer H.

The advent of terrorist threats has heightened local, regional, and national governments' interest in emergency response and disaster preparedness. The threat of natural disasters also challenges emergency responders to act swiftly and in a coordinated fashion. When a disaster occurs, an ad hoc coalition of pre-planned groups usually forms to respond to the incident. History has shown that these “systems of systems” do not interoperate very well. Communications between fire, police and rescue components either do not work or are inefficient. Government agencies, non-governmental organizations (NGOs), and private industry use a wide array of software platforms for managing data about emergency conditions, resources and response activities. Most of these are stand-alone systems with very limited capability for data sharing with other agencies or other levels of government. Information technology advances have facilitated the movement towards an integrated and coordinated approach to emergency management. Other communication mechanisms, such as video teleconferencing, digital television and radio broadcasting, are being utilized to combat the challenges of emergency information exchange. Recent disasters, such as Hurricane Katrina and the tsunami in Indonesia, have illuminated the weaknesses in emergency response. This paper will discuss the need for defining requirements for components of ad hoc coalitions which are formed to respond to disasters. A goal of our effort was to develop a proof of concept that applying information modeling to the business processes used to protect and mitigate potential loss of an enterprise was feasible. These activities would be modeled both pre- and post-incident.

1. Comparison of six different models describing survival of mammalian cells after irradiation

International Nuclear Information System (INIS)

Sontag, W.

1990-01-01

Six different cell-survival models have been compared. All models are based on the similar assumption that irradiated cells are able to exist in one of three states: S A is the state of a totally repaired cell, in state S C the cell contains lethal lesions, and in state S B the cell contains potentially lethal lesions, i.e. those which can either be repaired or converted into lethal lesions. The differences between the six models lie in the different mathematical relationships between the three states. To test the six models, six different sets of experimental data were used which describe cell survival at different repair times after irradiation with sparsely ionizing radiation. In order to compare the models, a goodness-of-fit function was used. The differences between the six models were tested by use of the nonparametric Mann-Whitney two-sample test. Based on the 95% confidence limit, this resulted in a separation into three groups. (orig.)
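The shared three-state skeleton can be written as linear kinetics in which potentially lethal lesions (S B) either repair to S A at rate r or convert to lethal lesions (S C) at rate k; the rates below are arbitrary illustration, and the six models differ precisely in how these transitions are formulated.

```python
import numpy as np
from scipy.integrate import solve_ivp

r, k, B0 = 0.8, 0.2, 10.0     # repair rate, conversion rate, initial lesions

def rhs(t, y):
    """Linear kinetics: S_B lesions repair to S_A (rate r) or become
    lethal S_C lesions (rate k)."""
    A, B, C = y
    return [r * B, -(r + k) * B, k * B]

sol = solve_ivp(rhs, (0.0, 50.0), [0.0, B0, 0.0], rtol=1e-8, atol=1e-10)
A_inf, B_inf, C_inf = sol.y[:, -1]
# Closed form for this linear case: C(inf) = B0 * k / (r + k), here 2.0
```

Nonlinear variants (e.g. quadratic misrepair, in which the conversion rate depends on B itself) replace the constant k and no longer admit this simple closed form, which is one axis along which the six compared models differ.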

2. Applied genre analysis: a multi-perspective model

Directory of Open Access Journals (Sweden)

Vijay K Bhatia

2002-04-01

Full Text Available Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication-based curriculum contexts, they can be used as an analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

3. Applying Quality Function Deployment Model in Burn Unit Service Improvement.

Science.gov (United States)

Keshtkaran, Ali; Hashemi, Neda; Kharazmi, Erfan; Abbasi, Mehdi

2016-01-01

Quality function deployment (QFD) is one of the most effective quality design tools. This study applies the QFD technique to improve the quality of burn unit services in Ghotbedin Hospital in Shiraz, Iran. First, the patients' expectations of burn unit services and their priorities were determined through the Delphi method. Thereafter, burn unit service specifications were determined, also through the Delphi method. Further, the relationships between the patients' expectations and service specifications, as well as the relationships among service specifications, were determined through an expert group's opinion. Last, the final importance scores of the service specifications were calculated through the simple additive weighting method. The findings show that burn unit patients have 40 expectations in six different areas. These expectations fall into 16 priority levels. Burn units also have 45 service specifications in six different areas. There are four-level relationships between the patients' expectations and service specifications and four-level relationships among service specifications. The most important burn unit service specifications have been identified in this study. The QFD model developed in the study can serve as a general guideline for QFD planners and executives.
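The final scoring step (simple additive weighting) reduces to a weighted sum of relationship strengths; the expectation weights and the 9/3/1 relationship values below are hypothetical stand-ins for the study's Delphi-derived numbers.

```python
import numpy as np

weights = np.array([0.5, 0.3, 0.2])      # hypothetical expectation weights
relationship = np.array([[9, 3, 1],      # spec A vs. the three expectations
                         [3, 9, 3],      # spec B
                         [1, 1, 9]])     # spec C (9/3/1 = strong/medium/weak)

# Simple additive weighting: importance of each specification is the
# weighted sum of its relationship strengths with patient expectations.
scores = relationship @ weights          # [5.6, 4.8, 2.6]
ranking = np.argsort(scores)[::-1]       # specs ordered by importance
```

In a full house-of-quality exercise the relationship matrix has one row per service specification and one column per patient expectation, but the arithmetic is exactly this matrix-vector product.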

4. Applying Petri nets in modelling the human factor

International Nuclear Information System (INIS)

Bedreaga, Luminita; Constntinescu, Cristina; Guzun, Basarab

2007-01-01

Usually, in the reliability analysis performed for complex systems, we determine the probability of success together with other performance indices, i.e. the likelihood associated with a given state. The possible values assigned to system states can be derived using inductive methods. If one wants to calculate the probability that a particular event occurs in the system, then deductive methods should be applied. In the particular case of human reliability analysis, as part of probabilistic safety analysis, international regulatory commissions have developed specific guides and procedures to perform such assessments. The paper presents a way to quantify human reliability using the Petri net approach, which is an efficient means of assessing system reliability because of its specific features. The examples shown in the paper are from human reliability documentation without a detailed (qualitative) human factor analysis. We present human action modelling using both event trees and the Petri net approach. The results obtained by these two kinds of methods are in good concordance. (authors)

5. Electrodynamic modeling applied to micro-strip gas chambers

International Nuclear Information System (INIS)

Fang, R.

1998-01-01

Gas gain variations as functions of time, counting rate and substrate resistivity have been observed with Micro-Strip Gas Chambers (MSGC). Such a chamber is here treated as a system of 2 dielectrics, gas and substrate, with finite resistivities. Electric charging at their interface results in variations of the electric field and the gas gain. The electrodynamic equations (including time dependence) for such a system are proposed. A Rule of Charge Accumulation (RCA) is then derived which allows one to determine the quantity and sign of charges accumulated on the surface at equilibrium. In order to apply the equations and the rule to MSGCs, a model of gas conductance induced by ionizing radiation is proposed, and a differential equation and some formulae are derived to calculate the rms dispersion and the spatial distribution of electrons (ions) in inhomogeneous electric fields. RCA coupled with a precise simulation of the electric fields gives the first quantitative explanation of gas gain variations of MSGCs. Finally, an electrodynamic simulation program is made to reproduce the dynamic process of gain variation due to surface charging with an uncertainty of at most 15% relative to experimental data. As a consequence, methods for stabilizing the operation of MSGCs are proposed. (author)

6. On the analysis of clonogenic survival data: Statistical alternatives to the linear-quadratic model

International Nuclear Information System (INIS)

Unkel, Steffen; Belka, Claus; Lauber, Kirsten

2016-01-01

The most frequently used method to quantitatively describe the response to ionizing irradiation in terms of clonogenic survival is the linear-quadratic (LQ) model. In the LQ model, the logarithm of the surviving fraction is regressed on the radiation dose by means of a second-degree polynomial. The ratio of the estimated parameters for the linear and quadratic terms, respectively, represents the dose at which both terms have the same weight in the abrogation of clonogenic survival. This ratio is known as the α/β ratio. However, there are plausible scenarios in which the α/β ratio fails to sufficiently reflect differences between dose-response curves, for example when curves with similar α/β ratio but different overall steepness are being compared. In such situations, the interpretation of the LQ model is severely limited. Colony formation assays were performed in order to measure the clonogenic survival of nine human pancreatic cancer cell lines and immortalized human pancreatic ductal epithelial cells upon irradiation at 0-10 Gy. The resulting dataset was subjected to LQ regression and non-linear log-logistic regression. Dimensionality reduction of the data was performed by cluster analysis and principal component analysis. Both the LQ model and the non-linear log-logistic regression model resulted in accurate approximations of the observed dose-response relationships in the dataset of clonogenic survival. However, in contrast to the LQ model, the non-linear regression model allowed the discrimination of curves with different overall steepness but similar α/β ratio and revealed an improved goodness-of-fit. Additionally, the estimated parameters in the non-linear model exhibit a more direct interpretation than the α/β ratio. Dimensionality reduction of clonogenic survival data by means of cluster analysis was shown to be a useful tool for classifying radioresistant and sensitive cell lines. More quantitatively, principal component analysis allowed
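The LQ fit itself is an ordinary least-squares problem in dose and dose squared; the α and β used to generate the synthetic (noiseless) surviving fractions below are illustrative, not values from the study.

```python
import numpy as np

dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])   # Gy
alpha_true, beta_true = 0.30, 0.03                       # assumed parameters
lnS = -(alpha_true * dose + beta_true * dose**2)         # ln(surviving fraction)

# LQ regression: -ln S = alpha*d + beta*d^2 (second-degree, no intercept)
X = np.column_stack([dose, dose**2])
(alpha_hat, beta_hat), *_ = np.linalg.lstsq(X, -lnS, rcond=None)
ab_ratio = alpha_hat / beta_hat                          # alpha/beta = 10 Gy
```

Two curves with the same 10 Gy ratio but proportionally larger α and β illustrate the limitation discussed above: identical α/β, different overall steepness, indistinguishable by the ratio alone.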

7. International Conference on Applied Mathematics, Modeling and Computational Science & Annual meeting of the Canadian Applied and Industrial Mathematics

CERN Document Server

Bélair, Jacques; Kunze, Herb; Makarov, Roman; Melnik, Roderick; Spiteri, Raymond J

2016-01-01

Focusing on five main groups of interdisciplinary problems, this book covers a wide range of topics in mathematical modeling, computational science and applied mathematics. It presents a wealth of new results in the development of modeling theories and methods, advancing diverse areas of applications and promoting interdisciplinary interactions between mathematicians, scientists, engineers and representatives from other disciplines. The book offers a valuable source of methods, ideas, and tools developed for a variety of disciplines, including the natural and social sciences, medicine, engineering, and technology. Original results are presented on both the fundamental and applied level, accompanied by an ample number of real-world problems and examples emphasizing the interdisciplinary nature and universality of mathematical modeling, and providing an excellent outline of today’s challenges. Mathematical modeling, with applied and computational methods and tools, plays a fundamental role in modern science a...

8. Regulatory activity based risk model identifies survival of stage II and III colorectal carcinoma.

Science.gov (United States)

Liu, Gang; Dong, Chuanpeng; Wang, Xing; Hou, Guojun; Zheng, Yu; Xu, Huilin; Zhan, Xiaohui; Liu, Lei

2017-11-17

Clinical and pathological indicators are inadequate for prognosis of stage II and III colorectal carcinoma (CRC). In this study, we utilized the activity of regulatory factors, univariate Cox regression and random forests for variable selection, and developed a multivariate Cox model to predict the overall survival of stage II/III colorectal carcinoma in the GSE39582 dataset (469 samples). Patients in the low-risk group showed significantly longer overall survival and recurrence-free survival than those in the high-risk group. This finding was further validated in five other independent datasets (GSE14333, GSE17536, GSE17537, GSE33113, and GSE37892). In addition, associations between clinicopathological information and the risk score were analyzed, and a nomogram including the risk score was plotted to facilitate its use. The risk score model was also shown to be effective in predicting both overall and recurrence-free survival of patients who received chemotherapy. Gene Set Enrichment Analysis (GSEA) between the high- and low-risk groups identified several cell-cell interaction KEGG pathways. Funnel plots showed no publication bias in these datasets. In summary, by utilizing regulatory activity in stage II and III colorectal carcinoma, the risk score successfully predicted the survival of 1021 stage II/III CRC patients in six independent datasets.

9. Effects of temperature on development, survival and reproduction of insects: Experimental design, data analysis and modeling

Science.gov (United States)

Jacques Regniere; James Powell; Barbara Bentz; Vincent Nealis

2012-01-01

The developmental response of insects to temperature is important in understanding the ecology of insect life histories. Temperature-dependent phenology models permit examination of the impacts of temperature on the geographical distributions, population dynamics and management of insects. The measurement of insect developmental, survival and reproductive responses to...

10. Inference for shared-frailty survival models with left-truncated data

NARCIS (Netherlands)

van den Berg, G.J.; Drepper, B.

2016-01-01

Shared-frailty survival models specify that systematic unobserved determinants of duration outcomes are identical within groups of individuals. We consider random-effects likelihood-based statistical inference if the duration data are subject to left-truncation. Such inference with left-truncated

11. Applied Research Consultants (ARC): A Vertical Practicum Model of Training Applied Research

Science.gov (United States)

Nadler, Joel T.; Cundiff, Nicole L.

2009-01-01

The demand for highly trained evaluation consultants is increasing. Furthermore, the gap between job seekers' evaluation competencies and job recruiters' expectations suggests a need for providing practical training experiences. A model using a vertical practicum (advanced students assisting in the training of newer students) is suggested as an…

12. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

OpenAIRE

Biggs, Matthew B.; Papin, Jason A.

2013-01-01

Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid mod...

13. A comparative study of machine learning methods for time-to-event survival data for radiomics risk modelling.

Science.gov (United States)

Leger, Stefan; Zwanenburg, Alex; Pilz, Karoline; Lohaus, Fabian; Linge, Annett; Zöphel, Klaus; Kotzerke, Jörg; Schreiber, Andreas; Tinhofer, Inge; Budach, Volker; Sak, Ali; Stuschke, Martin; Balermpas, Panagiotis; Rödel, Claus; Ganswindt, Ute; Belka, Claus; Pigorsch, Steffi; Combs, Stephanie E; Mönnich, David; Zips, Daniel; Krause, Mechthild; Baumann, Michael; Troost, Esther G C; Löck, Steffen; Richter, Christian

2017-10-16

Radiomics applies machine learning algorithms to quantitative imaging data to characterise the tumour phenotype and predict clinical outcome. For the development of radiomics risk models, a variety of different algorithms is available and it is not clear which one gives optimal results. Therefore, we assessed the performance of 11 machine learning algorithms combined with 12 feature selection methods by the concordance index (C-Index), to predict loco-regional tumour control (LRC) and overall survival for patients with head and neck squamous cell carcinoma. The considered algorithms are able to deal with continuous time-to-event survival data. Feature selection and model building were performed on a multicentre cohort (213 patients) and validated using an independent cohort (80 patients). We found several combinations of machine learning algorithms and feature selection methods which achieve similar results, e.g. C-Index = 0.71 and BT-COX: C-Index = 0.70 in combination with Spearman feature selection. Using the best performing models, patients were stratified into groups of low and high risk of recurrence. Significant differences in LRC were obtained between both groups on the validation cohort. Based on the presented analysis, we identified a subset of algorithms which should be considered in future radiomics studies to develop stable and clinically relevant predictive models for time-to-event endpoints.

14. Applying Functional Modeling for Accident Management of Nuclear Power Plant

DEFF Research Database (Denmark)

Lind, Morten; Zhang, Xinxin

2014-01-01

An introduction to Multilevel Flow Modeling is given, with a detailed presentation of the foundational means-end concepts, and the conditions for their proper use in modelling accidents are identified. It is shown that Multilevel Flow Modeling can be used for modelling and reasoning about design basis accidents. Its possible role for information sharing and decision support in accidents beyond design basis is also indicated. A modelling example demonstrating the application of Multilevel Flow Modelling and reasoning for a PWR LOCA is presented.

16. Object oriented business process modelling in RFID applied computing environment

NARCIS (Netherlands)

Zhao, X.; Liu, Chengfei; Lin, T.; Ranasinghe, D.C.; Sheng, Q.Z.

2010-01-01

As a tracking technology, Radio Frequency Identification (RFID) is now widely applied to enhance the context awareness of enterprise information systems. Such awareness provides great opportunities to facilitate business process automation and thereby improve operation efficiency and accuracy. With

17. Learning Survival Models with On-Line Simulation Activities in the Actuarial Science Degree

Directory of Open Access Journals (Sweden)

Antonio Fernandez-Morales

2011-03-01

Full Text Available The aim of this paper is to describe an on-line survival laboratory designed to enhance teaching and learning in the Statistics courses of the Actuarial Science Degree of the University of Málaga. The objective of the on-line survival lab is to help students, through a guided program of simulation activities, understand the most important statistical concepts in the stochastic modeling of human survival from an actuarial point of view. The graphical interactive simulator is implemented as Java applets for the web version, and as a Javascript animation for a lite iPhone/iPod touch version. Finally, the results of a survey carried out at the end of the course are discussed to provide a preliminary assessment of the students’ satisfaction with the resources and their perception of the usefulness for their learning process.
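
The kind of simulation activity described can be reproduced in a few lines; the sketch below (not the paper's applet) draws random human lifetimes from a Gompertz law of mortality, the classical actuarial model with hazard h(t) = B·cᵗ, by inverse-transform sampling of the survival function S(t) = exp(-(B/ln c)(cᵗ - 1)).

```python
import numpy as np

def sample_gompertz_lifetimes(B, c, size, rng):
    """Inverse-transform sampling from the Gompertz law of mortality.
    Solving S(t) = u gives t = ln(1 - ln(u) * ln(c) / B) / ln(c)."""
    u = 1.0 - rng.random(size)            # uniform on (0, 1]
    return np.log(1.0 - np.log(u) * np.log(c) / B) / np.log(c)

rng = np.random.default_rng(0)
# illustrative mortality parameters (hypothetical, not fitted to any table)
t = sample_gompertz_lifetimes(B=1e-4, c=1.1, size=100_000, rng=rng)
```

With these parameters the analytic median lifetime is about 68 years, so roughly half of the simulated lifetimes fall above that value, which is exactly the kind of empirical check a guided simulation exercise would ask students to make.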

18. Statistical modelling of survival data with random effects h-likelihood approach

CERN Document Server

Ha, Il Do; Lee, Youngjo

2017-01-01

This book provides a groundbreaking introduction to the likelihood inference for correlated survival data via the hierarchical (or h-) likelihood in order to obtain the (marginal) likelihood and to address the computational difficulties in inferences and extensions. The approach presented in the book overcomes shortcomings in the traditional likelihood-based methods for clustered survival data such as intractable integration. The text includes technical materials such as derivations and proofs in each chapter, as well as recently developed software programs in R (“frailtyHL”), while the real-world data examples together with an R package, “frailtyHL” in CRAN, provide readers with useful hands-on tools. Reviewing new developments since the introduction of the h-likelihood to survival analysis (methods for interval estimation of the individual frailty and for variable selection of the fixed effects in the general class of frailty models) and guiding future directions, the book is of interest to research...

19. An approach to the drone fleet survivability assessment based on a stochastic continuous-time model

Science.gov (United States)

Kharchenko, Vyacheslav; Fesenko, Herman; Doukas, Nikos

2017-09-01

An approach and an algorithm for drone fleet survivability assessment based on a stochastic continuous-time model are proposed. The input data are the number of drones, the drone fleet redundancy coefficient, the drone stability and restoration rate, the limit deviation from the norms of drone fleet recovery, the drone fleet operational availability coefficient, the probability of drone failure-free operation, and the time needed for the drone fleet to perform the required tasks. Ways of improving the survivability of a recoverable drone fleet, taking into account factors of system accidents, are suggested. Dependencies of the drone fleet survivability rate on both the drone stability and the number of drones are analysed.
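
As a simplified illustration of the quantities involved (not the authors' model), each drone can be treated as an independent repairable unit with failure rate λ and restoration rate μ, giving steady-state availability μ/(λ+μ); the fleet "survives" if at least k of its n drones are operational.

```python
from math import comb

def fleet_survivability(n, k, fail_rate, restore_rate):
    """P(at least k of n independent repairable drones are operational)
    in steady state, under a two-state Markov up/down model per drone."""
    a = restore_rate / (fail_rate + restore_rate)   # single-drone availability
    # binomial tail: probability that i >= k drones are simultaneously up
    return sum(comb(n, i) * a**i * (1 - a)**(n - i) for i in range(k, n + 1))
```

For example, with equal failure and restoration rates each drone is available half the time, and a 4-drone fleet needing 2 operational drones survives with probability 11/16; this captures the abstract's dependence of survivability on both drone stability and fleet size, though the cited model is richer.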

20. Factors associated with supermarket and convenience store closure: a discrete time spatial survival modelling approach.

Science.gov (United States)

Warren, Joshua L; Gordon-Larsen, Penny

2018-06-01

While there is a literature on the distribution of food stores across geographic and social space, much of this research uses cross-sectional data. Analyses attempting to understand whether the availability of stores across neighborhoods is associated with diet and/or health outcomes are limited by a lack of understanding of factors that shape the emergence of new stores and the closure of others. We used quarterly data on supermarket and convenience store locations spanning seven years (2006-2012) and tract-level census data in four US cities: Birmingham, Alabama; Chicago, Illinois; Minneapolis, Minnesota; San Francisco, California. A spatial discrete-time survival model was used to identify factors associated with an earlier and/or later closure time of a store. Sales volume was typically the strongest indicator of store survival. We identified heterogeneity in the association between tract-level poverty and racial composition with respect to store survival. Stores in high poverty, non-White tracts were often at a disadvantage in terms of survival length. The observed patterns of store survival varied by some of the same neighborhood sociodemographic factors associated with lifestyle and health outcomes, which could lead to confusion in interpretation in studies of the estimated effects of introduction of food stores into neighborhoods on health.
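
A discrete-time survival model of this kind rests on period-specific hazards: in each quarter, the hazard is the fraction of at-risk stores that close, and the survival curve is the running product of one minus these hazards. A minimal life-table sketch with hypothetical counts:

```python
def life_table_survival(periods):
    """periods: list of (n_at_risk, n_closures) per discrete quarter.
    Returns the survival curve S(t) = prod_j (1 - d_j / n_j)."""
    surv, curve = 1.0, []
    for n_risk, d in periods:
        hazard = d / n_risk          # discrete-time closure hazard this quarter
        surv *= 1.0 - hazard
        curve.append(surv)
    return curve

# hypothetical store counts over three quarters
curve = life_table_survival([(100, 10), (90, 9), (81, 9)])
```

The spatial discrete-time model in the abstract additionally regresses each period's hazard on covariates (sales volume, tract poverty, racial composition) with spatial random effects, but the survival curve is still assembled from period hazards exactly as above.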

1. Combined treatment with atorvastatin and imipenem improves survival and vascular functions in mouse model of sepsis.

Science.gov (United States)

Choudhury, Soumen; Kannan, Kandasamy; Pule Addison, M; Darzi, Sazad A; Singh, Vishakha; Singh, Thakur Uttam; Thangamalai, Ramasamy; Dash, Jeevan Ranjan; Parida, Subhashree; Debroy, Biplab; Paul, Avishek; Mishra, Santosh Kumar

2015-08-01

We have recently reported that pre-treatment, but not the post-treatment with atorvastatin showed survival benefit and improved hemodynamic functions in cecal ligation and puncture (CLP) model of sepsis in mice. Here we examined whether combined treatment with atorvastatin and imipenem after onset of sepsis can prolong survival and improve vascular functions. At 6 and 18h after sepsis induction, treatment with atorvastatin plus imipenem, atorvastatin or imipenem alone or placebo was initiated. Ex vivo experiments were done on mouse aorta to examine the vascular reactivity to nor-adrenaline and acetylcholine and mRNA expressions of α1D AR, GRK2 and eNOS. Atorvastatin plus imipenem extended the survival time to 56.00±4.62h from 20.00±1.66h observed in CLP mice. The survival time with atorvastatin or imipenem alone was 20.50±1.89h and 27.00±4.09h, respectively. The combined treatment reversed the hyporeactivity to nor-adrenaline through preservation of α1D AR mRNA/protein expression and reversal of α1D AR desensitization mediated by GRK2/Gβγ pathway. The treatment also restored endothelium-dependent relaxation to ACh through restoration of aortic eNOS mRNA expression and NO availability. In conclusion, combined treatment with atorvastatin and imipenem exhibited survival benefit and improved vascular functions in septic mice. Copyright © 2015 Elsevier Inc. All rights reserved.

2. Genetic Determinants Associated With in Vivo Survival of Burkholderia cenocepacia in the Caenorhabditis elegans Model

KAUST Repository

Wong, Yee-Chin

2018-05-29

A Burkholderia cenocepacia infection usually leads to reduced survival and fatal cepacia syndrome in cystic fibrosis patients. The identification of B. cenocepacia genes essential for in vivo survival is key to designing new anti-infective therapies. We used the Transposon-Directed Insertion Sequencing (TraDIS) approach to identify genes required for B. cenocepacia survival in the model infection host, Caenorhabditis elegans. A B. cenocepacia J2315 transposon pool of ∼500,000 mutants was used to infect C. elegans. We identified 178 genes as crucial for B. cenocepacia survival in the infected nematode. The majority of these genes code for proteins of unknown function, many of which are encoded by the genomic island BcenGI13, while other gene products are involved in nutrient acquisition, general stress responses and LPS O-antigen biosynthesis. Deletion of the glycosyltransferase gene wbxB and a histone-like nucleoid structuring (H-NS) protein-encoding gene (BCAL0154) reduced bacterial accumulation and attenuated virulence in C. elegans. Further analysis using quantitative RT-PCR indicated that BCAL0154 modulates B. cenocepacia pathogenesis via transcriptional regulation of motility-associated genes including fliC, fliG, flhD, and cheB1. This screen has successfully identified genes required for B. cenocepacia survival within the host-associated environment, many of which are potential targets for developing new antimicrobials.

3. Genetic Determinants Associated With in Vivo Survival of Burkholderia cenocepacia in the Caenorhabditis elegans Model

KAUST Repository

Wong, Yee-Chin; Abd El Ghany, Moataz; Ghazzali, Raeece N. M.; Yap, Soon-Joo; Hoh, Chee-Choong; Pain, Arnab; Nathan, Sheila

2018-01-01

A Burkholderia cenocepacia infection usually leads to reduced survival and fatal cepacia syndrome in cystic fibrosis patients. The identification of B. cenocepacia genes essential for in vivo survival is key to designing new anti-infective therapies. We used the Transposon-Directed Insertion Sequencing (TraDIS) approach to identify genes required for B. cenocepacia survival in the model infection host, Caenorhabditis elegans. A B. cenocepacia J2315 transposon pool of ∼500,000 mutants was used to infect C. elegans. We identified 178 genes as crucial for B. cenocepacia survival in the infected nematode. The majority of these genes code for proteins of unknown function, many of which are encoded by the genomic island BcenGI13, while other gene products are involved in nutrient acquisition, general stress responses and LPS O-antigen biosynthesis. Deletion of the glycosyltransferase gene wbxB and a histone-like nucleoid structuring (H-NS) protein-encoding gene (BCAL0154) reduced bacterial accumulation and attenuated virulence in C. elegans. Further analysis using quantitative RT-PCR indicated that BCAL0154 modulates B. cenocepacia pathogenesis via transcriptional regulation of motility-associated genes including fliC, fliG, flhD, and cheB1. This screen has successfully identified genes required for B. cenocepacia survival within the host-associated environment, many of which are potential targets for developing new antimicrobials.

4. Experiences in applying Bayesian integrative models in interdisciplinary modeling: the computational and human challenges

DEFF Research Database (Denmark)

Kuikka, Sakari; Haapasaari, Päivi Elisabet; Helle, Inari

2011-01-01

We review the experience obtained in using integrative Bayesian models in interdisciplinary analysis focusing on sustainable use of marine resources and environmental management tasks. We have applied Bayesian models to both fisheries and environmental risk analysis problems. Bayesian belief...... be time consuming and research projects can be difficult to manage due to unpredictable technical problems related to parameter estimation. Biology, sociology and environmental economics have their own scientific traditions. Bayesian models are becoming traditional tools in fisheries biology, where...

5. Model predictive control based on reduced order models applied to belt conveyor system.

Science.gov (United States)

Chen, Wei; Li, Xin

2016-11-01

In the paper, a model predictive controller based on a reduced-order model is proposed to control a belt conveyor system, which is a complex electro-mechanical system with a long visco-elastic body. Firstly, in order to design a low-order controller, the balanced truncation method is used for belt conveyor model reduction. Secondly, an MPC algorithm based on the reduced-order model of the belt conveyor system is presented. Because of the error bound between the full-order model and the reduced-order model, two Kalman state estimators are applied in the control scheme to achieve better system performance. Finally, simulation experiments show that the balanced truncation method can significantly reduce the model order with high accuracy, and that model predictive control based on the reduced-order model performs well in controlling the belt conveyor system. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
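
The balanced truncation step can be sketched with SciPy's Lyapunov solver. This is a generic square-root implementation on an arbitrary stable test system, not the authors' code: the controllability and observability Gramians are computed, balanced so both equal the diagonal matrix of Hankel singular values, and the weakly coupled states are discarded.

```python
import numpy as np
from scipy.linalg import cholesky, solve_continuous_lyapunov, svd

def balanced_truncation(A, B, C, r):
    """Square-root balanced truncation of a stable system (A, B, C) to order r."""
    # Gramians: A Wc + Wc A^T + B B^T = 0 and A^T Wo + Wo A + C^T C = 0
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    # square-root method: SVD of Lo^T Lc yields the Hankel singular values
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, hsv, Vt = svd(Lo.T @ Lc)
    S = np.diag(hsv[:r] ** -0.5)
    T = Lc @ Vt[:r].T @ S           # reduced -> full coordinates
    Ti = S @ U[:, :r].T @ Lo.T      # full -> reduced coordinates
    return Ti @ A @ T, Ti @ B, C @ T, hsv

# arbitrary stable 4-state SISO test system (not a conveyor model)
A = np.diag([-1.0, -2.0, -3.0, -4.0])
B = np.ones((4, 1))
C = np.ones((1, 4))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
```

The discarded Hankel singular values bound the approximation error (‖G - G_r‖∞ ≤ 2·Σ_{i>r} σ_i), which is the "error bound between the full-order model and reduced-order model" the abstract compensates for with Kalman estimators.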

6. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

Directory of Open Access Journals (Sweden)

Matthew B Biggs

Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

7. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

Science.gov (United States)

Biggs, Matthew B; Papin, Jason A

2013-01-01

Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

8. Empirical modeling and data analysis for engineers and applied scientists

CERN Document Server

Pardo, Scott A

2016-01-01

This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

9. A comparison of various modelling approaches applied to Cholera ...

African Journals Online (AJOL)

linear models, ARIMA time series modelling, and dynamic regression are ... to certain environmental parameters, and to investigate the feasibility of .... in the SSA literature, the term noise is used to refer to both stochastic noise, as well as.

10. Comparison of various modelling approaches applied to cholera case data

CSIR Research Space (South Africa)

Van Den Bergh, F

2008-06-01

Full Text Available cross-wavelet technique, which is used to compute lead times for co-varying variables, and suggests transformations that enhance co-varying behaviour. Several statistical modelling techniques, including generalised linear models, ARIMA time series...

11. Adequateness of applying the Zmijewski model on Serbian companies

Directory of Open Access Journals (Sweden)

2012-12-01

Full Text Available The aim of the paper is to determine the accuracy of prediction of the Zmijewski model in Serbia on an eligible sample. At the same time, the paper identifies the model's strengths, weaknesses and the limitations of its possible application. Bearing in mind that the economic environment in Serbia is not similar to that of the United States at the time the model was developed, the Zmijewski model is surprisingly accurate in the case of Serbian companies. The accuracy was slightly weaker than the model's results in the U.S. in its original form, but much better than the results the model gave in the U.S. in the periods 1988-1991 and 1992-1999. The model also gave better results in Serbia than in Croatia, even though the model had been adjusted for Croatia.
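
For orientation, the Zmijewski model is a probit bankruptcy score over three financial ratios. The sketch below uses the coefficient values commonly quoted in the literature for Zmijewski (1984); verify them against the original paper before any real use.

```python
from math import erf, sqrt

def zmijewski(ni_ta, tl_ta, ca_cl):
    """X-score and distress probability (probit link).
    ni_ta: net income / total assets (ROA)
    tl_ta: total liabilities / total assets (leverage)
    ca_cl: current assets / current liabilities (liquidity)
    Coefficients as commonly cited, not verified against the 1984 paper."""
    x = -4.336 - 4.513 * ni_ta + 5.679 * tl_ta + 0.004 * ca_cl
    prob_distress = 0.5 * (1.0 + erf(x / sqrt(2.0)))   # standard normal CDF
    return x, prob_distress
```

A profitable, lightly leveraged firm scores a low distress probability, while a loss-making, highly leveraged firm scores above 0.5, which is the usual classification threshold applied when testing the model on samples such as the Serbian one here.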

12. Application of accelerated failure time models for breast cancer patients' survival in Kurdistan Province of Iran.

Science.gov (United States)

Karimi, Asrin; Delpisheh, Ali; Sayehmiri, Kourosh

2016-01-01

Breast cancer is the most common cancer and the second most common cause of cancer-induced mortality in Iranian women. There has been rapid development in hazard models and survival analysis in the last decade. The aim of this study was to evaluate the prognostic factors of overall survival (OS) in breast cancer patients using accelerated failure time (AFT) models. This was a retrospective analytic cohort study. A total of 313 women with a pathologically proven diagnosis of breast cancer who had been treated during a 7-year period (from January 2006 until March 2014) in Sanandaj City, Kurdistan Province of Iran, were recruited. Performance among AFT models was assessed using goodness-of-fit methods. Discrimination among the exponential, Weibull, generalized gamma, log-logistic, and log-normal distributions was done using the Akaike information criterion and maximum likelihood. The 5-year OS was 75% (95% CI = 74.57-75.43). The main results in terms of survival were found for the different categories of the clinical stage covariate, tumor metastasis, and relapse of cancer. Survival times in breast cancer patients without tumor metastasis and without relapse were 4-fold and 2-fold longer, respectively, than in patients with metastasis or relapse. One of the most important undermining prognostic factors in breast cancer is metastasis; hence, knowledge of the mechanisms of metastasis is necessary to prevent its occurrence, treat metastatic breast cancer, and ultimately extend the lifetime of patients.
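
The simplest of the AFT distributions compared above, the exponential, has a closed-form maximum-likelihood estimate under right censoring: the estimated hazard is the number of observed events divided by the total follow-up time. A sketch with hypothetical data:

```python
import math

def exponential_mle(times, events):
    """MLE of a constant hazard with right censoring.
    events[i] = 1 if the event was observed, 0 if censored."""
    d = sum(events)                  # number of observed events
    total = sum(times)               # total time at risk (all subjects)
    lam = d / total                  # hazard estimate
    return lam, math.log(2) / lam    # hazard and implied median survival

# hypothetical follow-up times (years) with two censored subjects
lam, median = exponential_mle([2.0, 5.0, 3.5, 7.0, 4.5], [1, 1, 0, 0, 1])
```

The richer distributions named in the abstract (Weibull, generalized gamma, log-logistic, log-normal) relax the constant-hazard assumption and are compared via AIC; the exponential case is the baseline they all generalize.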

13. Modeling the kinetics of survival of Staphylococcus aureus in regional yogurt from goat's milk.

Science.gov (United States)

Bednarko-Młynarczyk, E; Szteyn, J; Białobrzewski, I; Wiszniewska-Łaszczych, A; Liedtke, K

2015-01-01

The aim of this study was to determine the kinetics of survival of a test strain of Staphylococcus aureus in the product investigated. Yogurt samples were contaminated with S. aureus to an initial level of 10(3)-10(4) cfu/g. The samples were then stored at four temperatures: 4, 6, 20, and 22°C. During storage, the number of colony-forming S. aureus in a gram of yogurt was determined every two hours. Based on the results of the culture analysis, survival curves were plotted. Three primary models were selected to describe the kinetics of changes in the bacterial count: Cole's model, a modified Gompertz model, and the model of Baranyi and Roberts. Analysis of model fit, based on the average values of Pearson's correlation coefficient between the modeled and measured values, showed that Cole's model had the worst fit. The modified Gompertz model returned the count of S. aureus as a negative value. These drawbacks were not observed in the model of Baranyi and Roberts. For this reason, that model best reflects the kinetics of changes in the number of staphylococci in yogurt.
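
The modified Gompertz model referred to is usually written in the Zwietering reparameterisation for the change in log count; below is a sketch of fitting it with SciPy on synthetic, noise-free data (not the study's measurements; survival curves that decline would use the same machinery with an adapted sign or parameterisation).

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, mu, lam):
    """Zwietering modified Gompertz: A = asymptotic change in log count,
    mu = maximum rate, lam = lag time."""
    return A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1.0))

t = np.linspace(0, 24, 25)                    # hours
y = gompertz(t, 3.0, 0.5, 4.0)                # synthetic "data", no noise

popt, _ = curve_fit(gompertz, t, y, p0=[2.5, 0.4, 3.0])
```

Because the sigmoid never crosses zero in this form, a negative predicted count, the defect the abstract reports for the Gompertz fit, arises only when the model is recast for declining counts, which is one reason the Baranyi-Roberts model behaved better there.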

14. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

Science.gov (United States)

Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

2016-01-01

Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906

15. Mediation Analysis with Survival Outcomes: Accelerated Failure Time Versus Proportional Hazards Models

Directory of Open Access Journals (Sweden)

Lois A Gelfand

2016-03-01

Full Text Available Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH and fully parametric accelerated failure time (AFT approaches for illustration.Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively under varied data conditions, some including censoring. A simulated data set illustrates the findings.Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome – underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG.Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.

16. A predictive model for survival in metastatic cancer patients attending an outpatient palliative radiotherapy clinic

International Nuclear Information System (INIS)

Chow, Edward; Fung, KinWah; Panzarella, Tony; Bezjak, Andrea; Danjoux, Cyril; Tannock, Ian

2002-01-01

Purpose: To develop a predictive model for survival from the time of presentation in an outpatient palliative radiotherapy clinic. Methods and Materials: Sixteen factors were analyzed prospectively in 395 patients seen in a dedicated palliative radiotherapy clinic in a large tertiary cancer center using Cox's proportional hazards regression model. Results: Six prognostic factors had a statistically significant impact on survival, as follows: primary cancer site, site of metastases, Karnofsky performance score (KPS), and fatigue, appetite, and shortness of breath scores from the modified Edmonton Symptom Assessment Scale. Risk group stratification was performed (1) by assigning weights to the prognostic factors based on their levels of significance, and (2) by the number of risk factors present. The weighting method provided a Survival Prediction Score (SPS), ranging from 0 to 32. The survival probability at 3, 6, and 12 months was 83%, 70%, and 51%, respectively, for patients with SPS ≤13 (n=133); 67%, 41%, and 20% for patients with SPS 14-19 (n=129); and 36%, 18%, and 4% for patients with SPS ≥20 (n=133) (p<0.0001). Corresponding survival probabilities based on number of risk factors were as follows: 85%, 72%, and 52% (≤3 risk factors) (n=98); 68%, 47%, and 24% (4 risk factors) (n=117); and 46%, 24%, and 11% (≥5 factors) (n=180) (p<0.0001). Conclusion: Clinical prognostic factors can be used to predict prognosis among patients attending a palliative radiotherapy clinic. If validated in an independent series of patients, the model can be used to guide clinical decisions, plan supportive services, and allocate resource use
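The weighted-score stratification described in this abstract can be sketched in a few lines of Python. The risk-group cut-points (SPS ≤13, 14-19, ≥20) and survival probabilities come from the abstract, but the individual factor weights below are hypothetical, since the abstract does not list them:

```python
def survival_prediction_score(weights, factors):
    """Sum weighted prognostic factors into a Survival Prediction Score (0-32 in the study)."""
    return sum(weights[name] * present for name, present in factors.items())

def risk_group(sps):
    """Stratify by the cut-points reported in the abstract."""
    if sps <= 13:
        return "low risk"      # 3/6/12-month survival: 83%/70%/51%
    if sps <= 19:
        return "intermediate"  # 67%/41%/20%
    return "high risk"         # 36%/18%/4%

# Hypothetical weights -- the paper assigns them by level of significance.
weights = {"primary_site": 8, "metastatic_site": 6, "kps": 7,
           "fatigue": 4, "appetite": 4, "dyspnea": 3}
patient = {"primary_site": 1, "metastatic_site": 0, "kps": 1,
           "fatigue": 1, "appetite": 0, "dyspnea": 0}
print(risk_group(survival_prediction_score(weights, patient)))
```

The alternative NRF method in the same study simply counts `sum(factors.values())` and stratifies on that count instead.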

17. Comparison of two anisotropic layer models applied to induction motors

NARCIS (Netherlands)

Sprangers, R.L.J.; Paulides, J.J.H.; Boynov, K.O.; Waarma, J.; Lomonova, E.

2013-01-01

A general description of the Anisotropic Layer Theory, derived in the polar coordinate system and applied to the analysis of squirrel-cage induction motors (IMs), is presented. The theory considers non-conductive layers, layers with predefined current density, and layers with induced current density.

18. Applying the Job Characteristics Model to the College Education Experience

Science.gov (United States)

Kass, Steven J.; Vodanovich, Stephen J.; Khosravi, Jasmine Y.

2011-01-01

Boredom is one of the most common complaints among university students, with studies suggesting its link to poor grades, drop out, and behavioral problems. Principles borrowed from industrial-organizational psychology may help prevent boredom and enrich the classroom experience. In the current study, we applied the core dimensions of the job…

19. Process Modeling Applied to Metal Forming and Thermomechanical Processing

Science.gov (United States)

1984-09-01

measured (Lloyd & Kenny, 1982; Kohara & Katsuta, 1978). The interpretation of these relations is qualitative at this stage (Lloyd et al., 1978). Applied Science Publishers, London. Kelly, P.N. (1971) J. Aust. Inst. Metals, 16, 104. Kohara, S. and Katsuta, M. (1978) J. Jap. In

20. Comparison of two anisotropic layer models applied to induction motors

NARCIS (Netherlands)

Sprangers, R.L.J.; Paulides, J.J.H.; Boynov, K.O.; Lomonova, E.A.; Waarma, J.

2014-01-01

A general description of the Anisotropic Layer Theory, derived in the polar coordinate system and applied to the analysis of squirrel-cage induction motors (IMs), is presented. The theory considers non-conductive layers, layers with predefined current density, and layers with induced current density.

1. Applying Functional Modeling for Accident Management of Nuclear Power Plant

Energy Technology Data Exchange (ETDEWEB)

Lind, Morten; Zhang Xinxin [Harbin Engineering University, Harbin (China)

2014-08-15

The paper investigates applications of functional modeling for accident management in complex industrial plants, with special reference to nuclear power production. Main applications for information sharing among decision makers and for decision support are identified. An overview of Multilevel Flow Modeling is given, the foundational means-end concepts are presented in detail, and the conditions for their proper use in modelling accidents are identified. It is shown that Multilevel Flow Modeling can be used for modelling and reasoning about design basis accidents. Its possible role for information sharing and decision support in accidents beyond the design basis is also indicated. A modelling example demonstrating the application of Multilevel Flow Modelling and reasoning for a PWR LOCA is presented.

2. Analysis of survival in breast cancer patients by using different parametric models

Science.gov (United States)

Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti

2017-09-01

In biomedical applications and clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. Handling censored data properly is important to prevent bias in the analysis. Therefore, this study analyzed right-censored data with three different parametric models: the exponential, Weibull, and log-logistic models. Data on breast cancer patients from Hospital Sultan Ismail, Johor Bahru, from 30 December 2008 until 15 February 2017, were used to illustrate right censoring. The covariates included in this study are the survival time t, the age of each patient X1, and the treatment given to the patient X2. To determine the best parametric model for analysing the survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the log-likelihood value, using the statistical software R. In the breast cancer data, all three distributions were consistent with the data, with the line graph of the cumulative hazard function resembling a straight line through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it has the smallest AIC and BIC values and the largest log-likelihood.
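The model-selection step described above reduces to comparing information criteria across fitted models. A minimal sketch (the log-likelihood values and parameter counts below are made up for illustration; in the study they come from models fitted in R):

```python
import math

def aic(loglik, k):
    # Akaike Information Criterion: 2k - 2*log-likelihood; smaller is better.
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    # Bayesian Information Criterion: penalizes parameters more as n grows.
    return k * math.log(n) - 2 * loglik

# Hypothetical fitted (log-likelihood, number of parameters) for n = 100 patients.
fits = {"exponential": (-420.5, 3), "weibull": (-412.8, 4),
        "log-logistic": (-409.1, 4)}
n = 100
best = min(fits, key=lambda m: aic(fits[m][0], fits[m][1]))
print(best)  # the candidate with the smallest AIC
```

With real data the same comparison would also check BIC and the raw log-likelihood, as the abstract describes.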

3. Developing and Validating a Survival Prediction Model for NSCLC Patients Through Distributed Learning Across 3 Countries.

Science.gov (United States)

Jochems, Arthur; Deist, Timo M; El Naqa, Issam; Kessler, Marc; Mayo, Chuck; Reeves, Jackson; Jolly, Shruti; Matuszak, Martha; Ten Haken, Randall; van Soest, Johan; Oberije, Cary; Faivre-Finn, Corinne; Price, Gareth; de Ruysscher, Dirk; Lambin, Philippe; Dekker, Andre

2017-10-01

Tools for survival prediction for non-small cell lung cancer (NSCLC) patients treated with chemoradiation or radiation therapy are of limited quality. In this work, we developed a predictive model of survival at 2 years. The model is based on a large volume of historical patient data and serves as a proof of concept to demonstrate the distributed learning approach. Clinical data from 698 lung cancer patients, treated with curative intent with chemoradiation or radiation therapy alone, were collected and stored at 2 different cancer institutes (559 patients at the Maastro clinic [Netherlands] and 139 at the University of Michigan [United States]). The model was further validated on 196 patients originating from The Christie (United Kingdom). A Bayesian network model was adapted for distributed learning (the animation can be viewed at https://www.youtube.com/watch?v=ZDJFOxpwqEA). Two-year posttreatment survival was chosen as the endpoint. The Maastro clinic cohort data are publicly available at https://www.cancerdata.org/publication/developing-and-validating-survival-prediction-model-nsclc-patients-through-distributed, and the developed models can be found at www.predictcancer.org. Variables included in the final model were T and N category, age, performance status, and total tumor dose. The model has an area under the curve (AUC) of 0.66 on the external validation set and an AUC of 0.62 on a 5-fold cross validation. A model based on the T and N category alone performed with an AUC of 0.47 on the validation set, significantly worse than our model (P…). Learning the model in a centralized or distributed fashion yields a minor difference in the probabilities of the conditional probability tables (0.6%); the discriminative performance of the models on the validation set is similar (P=.26). Distributed learning from federated databases allows learning of predictive models on data originating from multiple institutions while avoiding many of the data-sharing barriers. We believe that

4. Applying Model Checking to Industrial-Sized PLC Programs

CERN Document Server

AUTHOR|(CDS)2079190; Darvas, Daniel; Blanco Vinuela, Enrique; Tournier, Jean-Charles; Bliudze, Simon; Blech, Jan Olaf; Gonzalez Suarez, Victor M

2015-01-01

Programmable logic controllers (PLCs) are embedded computers widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software but is still underused in industry due to the complexity of building and managing formal models of real applications. In this paper, we propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g., CTL, LTL) on PLC programs. This methodology is based on an intermediate model (IM), meant to transform PLC programs written in various standard languages (ST, SFC, etc.) to different modeling languages of verification tools. We present the syntax and semantics of the IM and the transformation rules of the ST and SFC languages to the nuXmv model checker passing through the intermediate model. Finally, two real case studies of CERN PLC programs, written mainly in th...

5. Polarimetric SAR interferometry applied to land ice: modeling

DEFF Research Database (Denmark)

Dall, Jørgen; Papathanassiou, Konstantinos; Skriver, Henning

2004-01-01

This paper introduces a few simple scattering models intended for the application of polarimetric SAR interferometry to land ice. The principal aim is to eliminate the penetration bias hampering ice sheet elevation maps generated with single-channel SAR interferometry. The polarimetric coherent scattering models are similar to the oriented-volume model and the random-volume-over-ground model used in vegetation studies, but the ice models are adapted to the different geometry of land ice. Also, due to compaction, land ice is not uniform, a fact that must be taken into account for large penetration depths. The validity of the scattering models is examined using L-band polarimetric interferometric SAR data acquired with the EMISAR system over an ice cap located in the percolation zone of the Greenland ice sheet. Radar reflectors were deployed on the ice surface prior to the data acquisition in order...

6. Applied exposure modeling for residual radioactivity and release criteria

International Nuclear Information System (INIS)

Lee, D.W.

1989-01-01

The protection of public health and the environment from the release of materials with residual radioactivity for recycle or disposal as wastes without radioactive contents of concern presents a formidable challenge. Existing regulatory criteria are based on technical judgment concerning detectability and simple modeling. Recently, exposure modeling methodologies have been developed to provide a more consistent level of health protection. Release criteria derived from the application of exposure modeling methodologies share the same basic elements of analysis but are developed to serve a variety of purposes. Models for the support of regulations for all applications rely on conservative interpretations of generalized conditions while models developed to show compliance incorporate specific conditions not likely to be duplicated at other sites. Research models represent yet another type of modeling which strives to simulate the actual behavior of released material. In spite of these differing purposes, exposure modeling permits the application of sound and reasoned principles of radiation protection to the release of materials with residual levels of radioactivity. Examples of the similarities and differences of these models are presented and an application to the disposal of materials with residual levels of uranium contamination is discussed. 5 refs., 2 tabs

7. Validation of a Predictive Model for Survival in Metastatic Cancer Patients Attending an Outpatient Palliative Radiotherapy Clinic

International Nuclear Information System (INIS)

Chow, Edward; Abdolell, Mohamed; Panzarella, Tony; Harris, Kristin; Bezjak, Andrea; Warde, Padraig; Tannock, Ian

2009-01-01

Purpose: To validate a predictive model for survival of patients attending a palliative radiotherapy clinic. Methods and Materials: We described previously a model that had good predictive value for survival of patients referred during 1999 (1). The six prognostic factors (primary cancer site, site of metastases, Karnofsky performance score, and the fatigue, appetite and shortness-of-breath items from the Edmonton Symptom Assessment Scale) identified in this training set were extracted from the prospective database for the year 2000. We generated a partial score whereby each prognostic factor was assigned a value proportional to its prognostic weight. The sum of the partial scores for each patient was used to construct a survival prediction score (SPS). Patients were also grouped according to the number of these risk factors (NRF) that they possessed. The probability of survival at 3, 6, and 12 months was generated. The models were evaluated for their ability to predict survival in this validation set with appropriate statistical tests. Results: The median survival and survival probabilities of the training and validation sets were similar when separated into three groups using both SPS and NRF methods. There was no statistical difference in the performance of the SPS and NRF methods in survival prediction. Conclusion: Both the SPS and NRF models for predicting survival in patients referred for palliative radiotherapy have been validated. The NRF model is preferred because it is simpler and avoids the need to remember the weightings among the prognostic factors

8. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

Science.gov (United States)

Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

9. Applying Hierarchical Model Calibration to Automatically Generated Items.

Science.gov (United States)

Williamson, David M.; Johnson, Matthew S.; Sinharay, Sandip; Bejar, Isaac I.

This study explored the application of hierarchical model calibration as a means of reducing, if not eliminating, the need for pretesting of automatically generated items from a common item model prior to operational use. Ultimately the successful development of automatic item generation (AIG) systems capable of producing items with highly similar…

10. Surface-bounded growth modeling applied to human mandibles

DEFF Research Database (Denmark)

Andresen, Per Rønsholt; Brookstein, F. L.; Conradsen, Knut

2000-01-01

From a set of longitudinal three-dimensional scans of the same anatomical structure, the authors have accurately modeled the temporal shape and size changes using a linear shape model. On a total of 31 computed tomography scans of the mandible from six patients, 14,851 semilandmarks are found...

11. An electricity billing model | Adetona | Journal of Applied Science ...

African Journals Online (AJOL)

Linear regression analysis has been employed to develop a model for accurately predicting electricity billing for commercial consumers in Ogun State (Nigeria) at a faster rate. The electricity billing model was implemented, executed and tested using embedded MATLAB function blocks. The correlations between the ...

12. Geographically Weighted Logistic Regression Applied to Credit Scoring Models

Directory of Open Access Journals (Sweden)

Pedro Henrique Melo Albuquerque

Full Text Available Abstract This study used real data from a Brazilian financial institution on transactions involving Consumer Direct Credit (CDC), granted to clients residing in the Distrito Federal (DF), to construct credit scoring models via Logistic Regression and Geographically Weighted Logistic Regression (GWLR) techniques. The aims were: to verify whether the factors that influence credit risk differ according to the borrower’s geographic location; to compare the set of models estimated via GWLR with the global model estimated via Logistic Regression, in terms of predictive power and financial losses for the institution; and to verify the viability of using the GWLR technique to develop credit scoring models. The metrics used to compare the models developed via the two techniques were the AICc informational criterion, the accuracy of the models, the percentage of false positives, the sum of the value of false positive debt, and the expected monetary value of portfolio default compared with the monetary value of defaults observed. The models estimated for each region in the DF were distinct in their variables and coefficients (parameters), with it being concluded that credit risk was influenced differently in each region in the study. The Logistic Regression and GWLR methodologies presented very close results, in terms of predictive power and financial losses for the institution, and the study demonstrated viability in using the GWLR technique to develop credit scoring models for the target population in the study.
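The "geographically weighted" part of GWLR amounts to fitting a local logistic regression at each location, with observations down-weighted by their distance from the regression point. A sketch of the commonly used bisquare kernel (the bandwidth value here is an assumption for illustration; in studies like this one it is typically chosen by the AICc criterion):

```python
def bisquare_weight(distance, bandwidth):
    """Bisquare kernel: weight 1 at the regression point, falling to 0 at the bandwidth."""
    if distance >= bandwidth:
        return 0.0
    return (1.0 - (distance / bandwidth) ** 2) ** 2

# Observations closer to the point being fitted get more influence
# in that point's local logistic regression.
weights = [bisquare_weight(d, bandwidth=10.0) for d in (0.0, 5.0, 10.0, 15.0)]
print(weights)  # [1.0, 0.5625, 0.0, 0.0]
```

Each local model then maximizes a weighted log-likelihood, so the fitted coefficients vary across regions, which is exactly the regional variation the abstract reports.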

13. Nonstandard Finite Difference Method Applied to a Linear Pharmacokinetics Model

Directory of Open Access Journals (Sweden)

Oluwaseun Egbelowo

2017-05-01

Full Text Available We extend the nonstandard finite difference method of solution to the study of pharmacokinetic–pharmacodynamic models. Pharmacokinetic (PK) models are commonly used to predict drug concentrations that drive controlled intravenous (I.V.) transfers (or infusion) and oral transfers, while pharmacokinetic and pharmacodynamic (PD) interaction models are used to provide predictions of drug concentrations affecting the response of these clinical drugs. We structure a nonstandard finite difference (NSFD) scheme for the relevant system of equations which models this pharmacokinetic process. We compare the results obtained to standard methods. The scheme is dynamically consistent and reliable in replicating complex dynamic properties of the relevant continuous models for varying step sizes. This study provides assistance in understanding the long-term behavior of the drug in the system, and validation of the efficiency of the nonstandard finite difference scheme as the method of choice.
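For the simplest one-compartment elimination equation dC/dt = -kC, the nonstandard idea can be shown concretely: replacing the step size h in the update by the denominator function phi(h) = (1 - exp(-kh))/k makes the difference scheme reproduce the continuous decay exactly at the grid points. This is only a sketch of the NSFD principle on the linear case; the paper treats fuller PK/PD systems:

```python
import math

def nsfd_decay(c0, k, h, steps):
    """NSFD scheme C_{n+1} = C_n - k*phi(h)*C_n with phi(h) = (1 - exp(-k*h))/k.
    For this linear model the scheme is exact at the grid points."""
    phi = (1.0 - math.exp(-k * h)) / k
    c = c0
    for _ in range(steps):
        c = c - k * phi * c   # algebraically equivalent to c *= exp(-k*h)
    return c

c0, k, h, steps = 100.0, 0.5, 0.25, 8
approx = nsfd_decay(c0, k, h, steps)
exact = c0 * math.exp(-k * h * steps)
print(abs(approx - exact))  # agrees to machine precision
```

A standard forward Euler step (phi(h) = h) would instead accumulate error and can even go negative for large k*h, which is the kind of dynamical inconsistency NSFD schemes are designed to avoid.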

14. The Cheshire Cat principle applied to hybrid bag models

International Nuclear Information System (INIS)

Nielsen, H.B.; Wirzba, A.

1987-05-01

We argue for the Cheshire Cat point of view, according to which the bag itself has only notational, but no physical, significance. It is explained, in a 1+1 dimensional exact Cheshire Cat model, how a fermion can escape from the bag by means of an anomaly. We also suggest that suitably constructed hybrid bag models may be used to fix parameters of effective Lagrangians that can otherwise be obtained only from experiments. This idea is illustrated in a calculation of the mass of the pseudoscalar η' meson in 1+1 dimensions. Thus there is hope of finding a construction principle for a phenomenologically sensible model. (orig.)

15. Trailing edge noise model applied to wind turbine airfoils

Energy Technology Data Exchange (ETDEWEB)

Bertagnolio, F.

2008-01-15

The aim of this work is firstly to provide a quick introduction to the theory of noise generation that is relevant to wind turbine technology, with a focus on trailing edge noise. Secondly, the so-called TNO trailing edge noise model developed by Parchen [1] is described in more detail. The model is tested and validated by comparison with other results from the literature. Finally, this model is used in the optimization process of two reference airfoils in order to reduce their noise signature: the RISOE-B1-18 and the S809 airfoils. (au)

16. Metabolomics with Nuclear Magnetic Resonance Spectroscopy in a Drosophila melanogaster Model of Surviving Sepsis

Science.gov (United States)

Bakalov, Veli; Amathieu, Roland; Triba, Mohamed N.; Clément, Marie-Jeanne; Reyes Uribe, Laura; Le Moyec, Laurence; Kaynar, Ata Murat

2016-01-01

Patients surviving sepsis demonstrate sustained inflammation, which has been associated with long-term complications. One of the main mechanisms behind sustained inflammation is a metabolic switch in parenchymal and immune cells, thus understanding metabolic alterations after sepsis may provide important insights to the pathophysiology of sepsis recovery. In this study, we explored metabolomics in a novel Drosophila melanogaster model of surviving sepsis using Nuclear Magnetic Resonance (NMR), to determine metabolite profiles. We used a model of percutaneous infection in Drosophila melanogaster to mimic sepsis. We had three experimental groups: sepsis survivors (infected with Staphylococcus aureus and treated with oral linezolid), sham (pricked with an aseptic needle), and unmanipulated (positive control). We performed metabolic measurements seven days after sepsis. We then implemented metabolites detected in NMR spectra into the MetExplore web server in order to identify the metabolic pathway alterations in sepsis surviving Drosophila. Our NMR metabolomic approach in a Drosophila model of recovery from sepsis clearly distinguished between all three groups and showed two different metabolomic signatures of inflammation. Sham flies had decreased levels of maltose, alanine, and glutamine, while their level of choline was increased. Sepsis survivors had a metabolic signature characterized by decreased glucose, maltose, tyrosine, beta-alanine, acetate, glutamine, and succinate. PMID:28009836

17. Lecturing and Loving It: Applying the Information-Processing Model.

Science.gov (United States)

Parker, Jonathan K.

1993-01-01

Discusses the benefits of lecturing, when done properly, in high schools. Describes the positive attributes of effective lecturers. Provides a human information-processing model applicable to the task of lecturing to students. (HB)

18. Hydrodynamics and water quality models applied to Sepetiba Bay

Science.gov (United States)

Cunha, Cynara de L. da N.; Rosman, Paulo C. C.; Ferreira, Aldo Pacheco; Carlos do Nascimento Monteiro, Teófilo

2006-10-01

A coupled hydrodynamic and water quality model is used to simulate the pollution in Sepetiba Bay due to sewage effluent. Sepetiba Bay has a complicated geometry and bottom topography, and is located on the Brazilian coast near Rio de Janeiro. In the simulation, the dissolved oxygen (DO) concentration and biochemical oxygen demand (BOD) are used as indicators for the presence of organic matter in the body of water, and as parameters for evaluating the environmental pollution of the eastern part of Sepetiba Bay. Effluent sources in the model are taken from DO and BOD field measurements. The simulation results are consistent with field observations and demonstrate that the model has been correctly calibrated. The model is suitable for evaluating the environmental impact of sewage effluent on Sepetiba Bay from river inflows, assessing the feasibility of different treatment schemes, and developing specific monitoring activities. This approach has general applicability for environmental assessment of complicated coastal bays.
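The DO-BOD relationship that water quality models of this kind solve is classically captured by Streeter-Phelps kinetics, in which the oxygen deficit grows from BOD decay and shrinks through reaeration. A minimal sketch of the analytical deficit solution (the rate constants and loads below are illustrative assumptions, not values from the Sepetiba Bay study, which couples the kinetics to a full hydrodynamic model):

```python
import math

def streeter_phelps(L0, D0, kd, ka, t):
    """Dissolved-oxygen deficit D(t) downstream of a BOD load (Streeter-Phelps).
    L0: initial BOD (mg/L), D0: initial deficit (mg/L),
    kd: BOD decay rate (1/day), ka: reaeration rate (1/day), t: travel time (days)."""
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
        + D0 * math.exp(-ka * t)

# Illustrative values: 10 mg/L BOD input, no initial deficit, 2 days of transport.
deficit = streeter_phelps(L0=10.0, D0=0.0, kd=0.3, ka=0.6, t=2.0)
print(round(deficit, 3))
```

Subtracting the deficit from the saturation concentration gives the DO profile that can be compared against field measurements, which is the calibration step the abstract describes.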

19. A Model-Based Prognostics Approach Applied to Pneumatic Valves

Data.gov (United States)

National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

20. A Model-based Prognostics Approach Applied to Pneumatic Valves

Data.gov (United States)

National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

1. Applying Time Series Analysis Model to Temperature Data in Greenhouses

Directory of Open Access Journals (Sweden)

Abdelhafid Hasni

2011-03-01

Full Text Available The objective of the research is to find an appropriate Seasonal Auto-Regressive Integrated Moving Average (SARIMA) model for fitting the inside air temperature (Tin) of a naturally ventilated greenhouse under Mediterranean conditions by considering the minimum of the Akaike Information Criterion (AIC). The results of fitting were as follows: the best SARIMA model for fitting the air temperature of the greenhouse is SARIMA(1,0,0)(1,0,2)24.

2. The J3 SCR model applied to resonant converter simulation

Science.gov (United States)

Avant, R. L.; Lee, F. C. Y.

1985-01-01

The J3 SCR model is a continuous topology computer model for the SCR. Its circuit analog and parameter estimation procedure are uniformly applicable to popular computer-aided design and analysis programs such as SPICE2 and SCEPTRE. The circuit analog is based on the intrinsic three pn junction structure of the SCR. The parameter estimation procedure requires only manufacturer's specification sheet quantities as a data base.

3. Cell survival in carbon beams - comparison of amorphous track model predictions

DEFF Research Database (Denmark)

Grzanka, L.; Greilich, S.; Korcyl, M.

Introduction: Predictions of the radiobiological effectiveness (RBE) play an essential role in treatment planning with heavy charged particles. Amorphous track models ([1], [2], also referred to as track structure models) currently provide the most suitable description of cell survival under ion irradiation. In addition, a new approach based on microdosimetric distributions is presented and investigated [3]. Material and methods: A suitable software library embracing the mentioned amorphous track models, including numerous submodels with respect to delta-electron range models, radial dose... References (fragments): Amorphous track modelling of luminescence detector efficiency in proton and carbon beams. 4. Tsuruoka C, Suzuki M, Kanai T, et al. LET and ion species dependence for cell killing in normal human skin fibroblasts. Radiat Res. 2005;163:494-500.

4. Agent-Based Modelling applied to 5D model of the HIV infection

Directory of Open Access Journals (Sweden)

Toufik Laroum

2016-12-01

The simplest model was the 3D mathematical model. But the complexity of this phenomenon and the diversity of cells and actors affecting its evolution require new approaches, such as the multi-agent approach we have applied in this paper. The results of our simulator on the 5D model are promising because they are consistent with biological knowledge. Therefore, the proposed approach is well suited to the study of population dynamics in general and could help to understand and predict the dynamics of HIV infection.

5. GIS-Based Population Model Applied to Nevada Transportation Routes

International Nuclear Information System (INIS)

Mills, G.S.; Neuhauser, K.S.

1999-01-01

Recently, a model based on geographic information system (GIS) processing of US Census Block data has made high-resolution population analysis for transportation risk analysis technically and economically feasible. Population density bordering each kilometer of a route may be tabulated, with specific route sections falling into each of three categories (Rural, Suburban or Urban) identified for separate risk analysis. In addition to the improvement in resolution of Urban areas along a route, the model provides a statistically based correction to population densities in Rural and Suburban areas where Census Block dimensions may greatly exceed the 800-meter scale of interest. A semi-automated application of the GIS model to a subset of routes in Nevada (related to the Yucca Mountain project) is presented, and the results are compared to those of previous models, including a model based on published Census and other data. These comparisons demonstrate that meaningful improvement in the accuracy and specificity of transportation risk analyses is dependent on correspondingly accurate and geographically specific population density data
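The Rural/Suburban/Urban split used in such transportation risk models is, at its core, a threshold classification on population density per route kilometer. A sketch with assumed cut-points (the specific thresholds below are illustrative, not taken from this record):

```python
def classify_segment(density_per_km2):
    """Assign a route kilometer to a risk-analysis category (assumed cut-points)."""
    if density_per_km2 < 67:
        return "Rural"
    if density_per_km2 < 1670:
        return "Suburban"
    return "Urban"

# Population densities bordering successive kilometers of a hypothetical route.
route = [12, 45, 300, 2500, 1800, 50]
print([classify_segment(d) for d in route])
```

Each category's segments are then analyzed separately, which is why the resolution of the underlying density data matters so much to the resulting risk estimates.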

6. A Validated Prediction Model for Overall Survival From Stage III Non-Small Cell Lung Cancer: Toward Survival Prediction for Individual Patients

Energy Technology Data Exchange (ETDEWEB)

Oberije, Cary, E-mail: cary.oberije@maastro.nl [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands); De Ruysscher, Dirk [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands); Universitaire Ziekenhuizen Leuven, KU Leuven (Belgium); Houben, Ruud [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands); Heuvel, Michel van de; Uyterlinde, Wilma [Department of Thoracic Oncology, Netherlands Cancer Institute, Amsterdam (Netherlands); Deasy, Joseph O. [Memorial Sloan Kettering Cancer Center, New York (United States); Belderbos, Jose [Department of Radiation Oncology, Netherlands Cancer Institute, Amsterdam (Netherlands); Dingemans, Anne-Marie C. [Department of Pulmonology, University Hospital Maastricht, Research Institute GROW of Oncology, Maastricht (Netherlands); Rimner, Andreas; Din, Shaun [Memorial Sloan Kettering Cancer Center, New York (United States); Lambin, Philippe [Radiation Oncology, Research Institute GROW of Oncology, Maastricht University Medical Center, Maastricht (Netherlands)

2015-07-15

Purpose: Although patients with stage III non-small cell lung cancer (NSCLC) are homogeneous according to the TNM staging system, they form a heterogeneous group, which is reflected in the survival outcome. The increasing amount of information for an individual patient and the growing number of treatment options facilitate personalized treatment, but they also complicate treatment decision making. Decision support systems (DSS), which provide individualized prognostic information, can overcome this but are currently lacking. A DSS for stage III NSCLC requires the development and integration of multiple models. The current study takes the first step in this process by developing and validating a model that can provide physicians with a survival probability for an individual NSCLC patient. Methods and Materials: Data from 548 patients with stage III NSCLC were available to enable the development of a prediction model, using stratified Cox regression. Variables were selected by using a bootstrap procedure. Performance of the model was expressed as the c statistic, assessed internally and on 2 external data sets (n=174 and n=130). Results: The final multivariate model, stratified for treatment, consisted of age, gender, World Health Organization performance status, overall treatment time, equivalent radiation dose, number of positive lymph node stations, and gross tumor volume. The bootstrapped c statistic was 0.62. The model could identify risk groups in external data sets. Nomograms were constructed to predict an individual patient's survival probability (www.predictcancer.org). The data set can be downloaded at https://www.cancerdata.org/10.1016/j.ijrobp.2015.02.048. Conclusions: The prediction model for overall survival of patients with stage III NSCLC highlights the importance of combining patient, clinical, and treatment variables. Nomograms were developed and validated. This tool could be used as a first building block for a decision support system.

7. Applying Four Different Risk Models in Local Ore Selection

International Nuclear Information System (INIS)

Richmond, Andrew

2002-01-01

Given the uncertainty in grade at a mine location, a financially risk-averse decision-maker may prefer to incorporate this uncertainty into the ore selection process. A FORTRAN program, risksel, is presented to calculate local risk-adjusted optimal ore selections using a negative exponential utility function and three dominance models: mean-variance, mean-downside risk, and stochastic dominance. All four methods are demonstrated in a grade control environment. In the case study, the optimal selections vary with the magnitude of financial risk that a decision-maker is prepared to accept. Except for the stochastic dominance method, the risk models reassign material from higher cost to lower cost processing options as the aversion to financial risk increases. The stochastic dominance model was usually unable to determine the optimal local selection.
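
A common simplification of the negative exponential utility approach described above: when profit is approximately normal, ranking options by expected utility is equivalent to ranking by the certainty equivalent CE = mean - (lambda/2) * variance. The option names and numbers below are invented for illustration:

```python
def certainty_equivalent(mean, variance, risk_aversion):
    # CE = mu - (lambda/2) * sigma^2 for exponential utility under normality
    return mean - 0.5 * risk_aversion * variance

def select_option(options, risk_aversion):
    """options: {name: (mean_profit, profit_variance)} -> name with best CE."""
    return max(options,
               key=lambda k: certainty_equivalent(*options[k], risk_aversion))

options = {
    "mill":  (10.0, 40.0),   # higher mean profit, higher uncertainty
    "leach": (8.0, 4.0),     # lower mean, much safer
    "waste": (0.0, 0.0),
}

risk_neutral = select_option(options, 0.0)   # picks the highest mean
risk_averse = select_option(options, 0.5)    # variance penalty flips the choice
```

This reproduces the qualitative behaviour in the abstract: as risk aversion grows, material migrates from the high-variance to the low-variance processing option.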

8. Hidden multidimensional social structure modeling applied to biased social perception

Science.gov (United States)

Maletić, Slobodan; Zhao, Yi

2018-02-01

Intricacies of the structure of social relations are realized by representing a collection of overlapping opinions as a simplicial complex, thus building latent multidimensional structures, through which agents are, virtually, moving as they exchange opinions. The influence of opinion space structure on the distribution of opinions is demonstrated by modeling consensus phenomena when the opinion exchange between individuals may be affected by the false consensus effect. The results indicate that in the cases with and without bias, the road toward consensus is influenced by the structure of multidimensional space of opinions, and in the biased case, complete consensus is achieved. The applications of proposed modeling framework can easily be generalized, as they transcend opinion formation modeling.

9. Differential Evolution algorithm applied to FSW model calibration

Science.gov (United States)

Idagawa, H. S.; Santos, T. F. A.; Ramirez, A. J.

2014-03-01

Friction Stir Welding (FSW) is a solid state welding process that can be modelled using a Computational Fluid Dynamics (CFD) approach. These models use adjustable parameters to control the heat transfer and the heat input to the weld. These parameters are used to calibrate the model and they are generally determined using the conventional trial and error approach. Since this method is not very efficient, we used the Differential Evolution (DE) algorithm to successfully determine these parameters. In order to improve the success rate and to reduce the computational cost of the method, this work studied different characteristics of the DE algorithm, such as the evolution strategy, the objective function, the mutation scaling factor and the crossover rate. The DE algorithm was tested using a friction stir weld performed on a UNS S32205 Duplex Stainless Steel.
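
A minimal sketch of the DE/rand/1/bin scheme the abstract refers to, with the evolution strategy, mutation scaling factor F, and crossover rate CR exposed as parameters. The objective here is a toy squared residual, not the CFD model:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1: mutate a random base vector with a scaled difference
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    trial.append(min(max(v, lo), hi))  # clip to bounds
                else:
                    trial.append(pop[i][j])
            fc = f(trial)
            if fc <= cost[i]:  # greedy one-to-one selection
                pop[i], cost[i] = trial, fc
    best = min(range(pop_size), key=lambda i: cost[i])
    return pop[best], cost[best]

# toy "calibration": recover two parameters by minimizing a squared residual
target = (3.0, -1.5)
def residual(p):
    return (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2

best_x, best_f = differential_evolution(residual, [(-5, 5), (-5, 5)])
```

In a real calibration, `residual` would compare simulated and measured weld temperatures, which is exactly where the choice of objective function studied in the paper matters.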

10. Modeling the microstructure of surface by applying BRDF function

Science.gov (United States)

Plachta, Kamil

2017-06-01

The paper presents the modeling of surface microstructure using a bidirectional reflectance distribution function. This function contains full information about the reflectance properties of flat surfaces: it is possible to determine the shares of the specular, directional and diffuse components in the reflected luminous stream. The software is based on the author's algorithm, which uses selected elements of these function models to determine the share of each component. Based on the obtained data, the surface microstructure of each material can be modeled, which makes it possible to determine the properties of these materials. The concentrator directs the reflected solar radiation onto the photovoltaic surface, increasing at the same time the value of the incident luminous stream. The paper presents an analysis of selected materials that can be used to construct the solar concentrator system. The use of a concentrator increases the power output of the photovoltaic system by up to 17% as compared to the standard solution.

11. Computational modeling applied to stress gradient analysis for metallic alloys

International Nuclear Information System (INIS)

Iglesias, Susana M.; Assis, Joaquim T. de; Monine, Vladimir I.

2009-01-01

Nowadays, composite materials, including materials reinforced by particles, are at the center of researchers' attention. There are problems with stress measurements in these materials, connected with the superficial stress gradient caused by the difference between the stress state of particles on the surface and in the matrix of the composite material. Computer simulation of the diffraction profile formed by superficial layers of material makes it possible to simulate the diffraction experiment and to resolve the problem of stress measurements when the stress state is characterized by a strong gradient. The aim of this paper is the application of a computer simulation technique, initially developed for homogeneous materials, to diffraction line simulation of composite materials and alloys. Specifically, we applied this technique to silumin fabricated by powder metallurgy. (author)

12. Gordon's model applied to nursing care of people with depression.

Science.gov (United States)

Temel, M; Kutlu, F Y

2015-12-01

Psychiatric nurses should consider the patient's biological, psychological and social aspects. Marjory Gordon's Functional Health Pattern Model ensures a holistic approach to the patient. To examine the effectiveness of Gordon's Functional Health Pattern Model in reducing depressive symptoms, increasing self-efficacy in coping with depression and increasing hope in people with depression. A quasi-experimental two-group pre-test and post-test design was adopted. Data were collected from April 2013 to May 2014 from people with depression at the psychiatry clinic of a state hospital in Turkey; they were assigned to the intervention (n = 34) or control group (n = 34). The intervention group received nursing care according to Gordon's Functional Health Pattern Model and routine care, while the control group received routine care only. The Beck Depression Inventory, Beck Hopelessness Scale and Depression Coping Self-Efficacy Scale were used. The intervention group had significantly lower scores on the Beck Depression Inventory and Beck Hopelessness Scale at the post-test and 3-month follow-up; they had higher scores on the Depression Coping Self-Efficacy Scale at the 3-month follow-up when compared with the control group. The study was conducted at only one psychiatry clinic. The intervention and control group patients were at the clinic at the same time and influenced each other. Moreover, because clinical routines were in progress during the study, the results cannot be attributed solely to nursing interventions. Nursing models offer guidance for the care provided. Practices based on the models yield more efficient and systematic caregiving results with fewer health problems. Gordon's Functional Health Pattern Model was effective in improving the health of people with depression and could be introduced as routine care with ongoing evaluation in psychiatric clinics. More research is needed to evaluate the effect of Gordon's nursing model on people with depression.

13. Applying an Employee-Motivation Model to Prevent Student Plagiarism.

Science.gov (United States)

Malouff, John M.; Sims, Randi L.

1996-01-01

A model based on Vroom's expectancy theory of employee motivation posits that instructors can prevent plagiarism by ensuring that students understand the rules of ethical writing, expect assignments to be manageable and have personal benefits, and expect plagiarism to be difficult and have important personal costs. (SK)

14. Applying the knowledge creation model to the management of ...

African Journals Online (AJOL)

In present-day society, the need to manage indigenous knowledge is widely recognised. However, there is a debate in progress on whether or not indigenous knowledge can be easily managed. The purpose of this paper is to examine the possibility of using knowledge management models like knowledge creation theory ...

15. Robust model identification applied to type 1 diabetes

DEFF Research Database (Denmark)

Finan, Daniel Aaron; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

2010-01-01

In many realistic applications, process noise is known to be neither white nor normally distributed. When identifying models in these cases, it may be more effective to minimize a different penalty function than the standard sum of squared errors (as in a least-squares identification method). Thi...

16. Dynamics Model Applied to Pricing Options with Uncertain Volatility

Directory of Open Access Journals (Sweden)

Lorella Fatone

2012-01-01

model is proposed. The data used to test the calibration problem included observations of asset prices over a finite set of known, equispaced discrete time values. Statistical tests were used to estimate the statistical significance of the two parameters of the Black-Scholes model: the volatility and the drift. The effects of these estimates on the option pricing problem were investigated. In particular, the pricing of an option with uncertain volatility in the Black-Scholes framework was revisited, and a statistical significance was associated with the price intervals determined using the Black-Scholes-Barenblatt equations. Numerical experiments involving synthetic and real data were presented. The real data considered were the daily closing values of the S&P500 index and the associated European call and put option prices in the year 2005. The method proposed here for calibrating the Black-Scholes dynamics model could be extended to other science and engineering models that may be expressed in terms of stochastic dynamical systems.
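
For reference, the closed-form Black-Scholes call price that underlies such a calibration, sketched with only the standard library. For a plain call the price is monotone in volatility, so an uncertain volatility interval maps directly to a price interval; the inputs below are illustrative, not the S&P500 data:

```python
import math

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """European call price under Black-Scholes dynamics."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# with volatility only known to lie in [0.15, 0.25], the call price lies in
# the interval [price_lo, price_hi]
price_lo = bs_call(100, 100, 1.0, 0.05, 0.15)
price_hi = bs_call(100, 100, 1.0, 0.05, 0.25)
```

The Black-Scholes-Barenblatt equations generalize this to payoffs that are not monotone in volatility, where the worst- and best-case prices must be solved as nonlinear PDEs.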

17. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

DEFF Research Database (Denmark)

Sørensen, Kresten Kjær; Stoustrup, Jakob

2008-01-01

This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system...

18. A comparison of various modelling approaches applied to Cholera ...

African Journals Online (AJOL)

The analyses are demonstrated on data collected from Beira, Mozambique. Dynamic regression was found to be the preferred forecasting method for this data set. Keywords:Cholera, modelling, signal processing, dynamic regression, negative binomial regression, wavelet analysis, cross-wavelet analysis. ORiON Vol.

19. Leadership Identity Development: Challenges in Applying a Developmental Model

Science.gov (United States)

Komives, Susan R.; Longerbeam, Susan D.; Mainella, Felicia; Osteen, Laura; Owen, Julie E.; Wagner, Wendy

2009-01-01

The leadership identity development (LID) grounded theory (Komives, Owen, Longerbeam, Mainella, & Osteen, 2005) and related LID model (Komives, Longerbeam, Owen, Mainella, & Osteen, 2006) present a framework for understanding how individual college students develop the social identity of being collaborative, relational leaders…

20. Applying the elastic model for various nucleus-nucleus fusion

International Nuclear Information System (INIS)

HASSAN, G.S.; RAGAB, H.S.; SEDDEEK, M.K.

2000-01-01

The Elastic Model of two free parameters m, d given by Scalia has been used over wider energy regions to fit the available experimental data for potential barriers and cross sections. In order to generalize Scalia's formula in both the sub- and above-barrier regions, we calculated m and d for pairs other than those given by Scalia and compared the calculated cross sections with the experimental data. This generalizes the Elastic Model's description of the fusion process. On the other hand, Scalia's range of interacting systems was 24 ≤ A ≤ 194, where A is the compound nucleus mass number. Our extension of the model includes an example of pairs with A larger than his upper limit, with the aim of making the formula general for any type of reactants: light, intermediate or heavy systems. A significant point is the comparison of Elastic Model calculations with well-known methods for studying complete fusion and compound nucleus formation, namely with the results of using the Proximity potential with either sharp or smooth cut-off approximations.

1. Exact results for survival probability in the multistate Landau-Zener model

International Nuclear Information System (INIS)

Volkov, M V; Ostrovsky, V N

2004-01-01

An exact formula is derived for the survival probability in the multistate Landau-Zener model in the special case where the initially populated state corresponds to the extremal (maximum or minimum) slope of a linear diabatic potential curve. The formula was originally conjectured by S Brundobler and V Elser (1993 J. Phys. A: Math. Gen. 26 1211) on the basis of numerical calculations. It is a simple generalization of the expression for the probability of diabatic passage in the famous two-state Landau-Zener model. Our result is obtained via analysis and summation of the entire perturbation theory series.
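
A numerical sketch (in units with ħ = 1) of the structure described above: the extremal-slope survival probability has the two-state Landau-Zener form, with the contributions from the couplings to all other states summed in the exponent. The parameter values are illustrative only:

```python
import math

def lz_two_state(coupling, slope_diff):
    """Diabatic passage probability in the two-state Landau-Zener model."""
    return math.exp(-2.0 * math.pi * coupling ** 2 / abs(slope_diff))

def survival_extremal(couplings, slope_diffs):
    """Survival probability of the extremal-slope state: couplings to the
    other states add up in the exponent (Brundobler-Elser form)."""
    exponent = sum(2.0 * math.pi * g ** 2 / abs(d)
                   for g, d in zip(couplings, slope_diffs))
    return math.exp(-exponent)

p2 = lz_two_state(0.1, 1.0)                       # single avoided crossing
pm = survival_extremal([0.1, 0.1], [1.0, 2.0])    # two crossings traversed
```

With a single coupled state the multistate expression collapses to the familiar two-state result, and each additional crossing only lowers the survival probability.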

2. Genetic evaluation of calf and heifer survival in Iranian Holstein cattle using linear and threshold models.

Science.gov (United States)

Forutan, M; Ansari Mahyari, S; Sargolzaei, M

2015-02-01

Calf and heifer survival are important traits in dairy cattle affecting profitability. This study was carried out to estimate genetic parameters of survival traits in female calves at different age periods, until nearly the first calving. Records of 49,583 female calves born between 1998 and 2009 were considered in five age periods: days 1-30, 31-180, 181-365, 366-760 and the full period (days 1-760). Genetic components were estimated based on linear and threshold sire models and linear animal models. The models included both fixed effects (month of birth, dam's parity number, calving ease and twin/single) and random effects (herd-year, genetic effect of sire or animal, and residual). Rates of death were 2.21, 3.37, 1.97, 4.14 and 12.4% for the above periods, respectively. Heritability estimates were very low, ranging from 0.48 to 3.04, 0.62 to 3.51 and 0.50 to 4.24% for the linear sire model, animal model and threshold sire model, respectively. Rank correlations between random effects of sires obtained with linear and threshold sire models and with linear animal and sire models were 0.82-0.95 and 0.61-0.83, respectively. The estimated genetic correlations between the five different periods were moderate and only significant for 31-180 and 181-365 (r(g) = 0.59), 31-180 and 366-760 (r(g) = 0.52), and 181-365 and 366-760 (r(g) = 0.42). The low genetic correlations in the current study would suggest that survival at different periods may be affected by the same genes with different expression or by different genes. Even though the additive genetic variations of survival traits were small, it might be possible to improve these traits by traditional or genomic selection. © 2014 Blackwell Verlag GmbH.
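
The rank correlations above compare how the two models order the same sires. A minimal Spearman rank correlation sketch (no tie correction) with invented sire effects, not the study's estimates:

```python
def ranks(values):
    # rank 1 = smallest value; assumes no ties for simplicity
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order):
        r[i] = rank + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: 1 - 6*sum(d^2) / (n*(n^2-1))."""
    n = len(x)
    d2 = sum((rx - ry) ** 2 for rx, ry in zip(ranks(x), ranks(y)))
    return 1.0 - 6.0 * d2 / (n * (n ** 2 - 1))

linear_ebv = [0.10, 0.30, 0.20, 0.50, 0.40]     # sire effects, linear model
threshold_ebv = [0.05, 0.25, 0.30, 0.45, 0.40]  # same sires, threshold model
rho = spearman(linear_ebv, threshold_ebv)
```

Correlations in the 0.82-0.95 range, as reported, mean the two models would select largely (but not entirely) the same top sires.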

3. Modeling the survivability of brucella to exposure of Ultraviolet radiation and temperature

Science.gov (United States)

Howe, R.

Accumulated summation of daily ultraviolet-B (UV-B, 290 to 320 nm) data from the USDA Ultraviolet Radiation Monitoring Program shows good correlation (R^2 = 77%) with daily temperature data during the five-month period from February through June, 1998. Exposure of disease organisms such as brucella to the effects of accumulated UV-B radiation can be modeled for this period. Estimates of a lethal dosage of UV-B for brucella in the environment depend on minimum/maximum temperature and solar zenith angle for the time period. The accumulated increase in temperature over this period also affects the decomposition of an aborted fetus containing brucella. Decomposition begins at a minimum daily temperature of 27 to 30 degrees C and peaks at 39 to 40 degrees C. It is useful to view the summation of temperature as a threshold for other bacterial growth, so that accumulated temperature greater than some value causes decomposition through competition with other bacteria, and brucella die from the accumulated effects of UV-B, temperature and organism competition. Results of a study (Cook 1998) to determine the survivability of brucellosis in the environment through exposure of aborted bovine fetuses show that no single cause can be attributed to death of the disease agent. The combination of daily increase in temperature and accumulated UV-B radiation shows an inverse correlation with survivability data and can be modeled as an indicator of brucella survivability in the environment in arid regions.
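
The accumulated-dose idea above can be sketched as a running sum crossing a lethal threshold: the organism is taken as non-viable on the first day the cumulative UV-B dose exceeds it. The daily doses and the threshold below are invented for illustration:

```python
def day_of_lethal_dose(daily_uvb, threshold):
    """Return the 1-based day on which the cumulative UV-B dose first
    reaches the threshold, or None if it never does."""
    total = 0.0
    for day, dose in enumerate(daily_uvb, start=1):
        total += dose
        if total >= threshold:
            return day
    return None

# arbitrary dose units, rising through the season as in February-June
daily_uvb = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
day = day_of_lethal_dose(daily_uvb, threshold=7.0)
```

An accumulated-temperature threshold for the onset of decomposition could be modeled with the same running-sum logic.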

4. A linear-quadratic model of cell survival considering both sublethal and potentially lethal radiation damage

International Nuclear Information System (INIS)

Rutz, H.P.; Coucke, P.A.; Mirimanoff, R.O.

1991-01-01

The authors assessed the dose-dependence of repair of potentially lethal damage in Chinese hamster ovary cells x-irradiated in vitro. The recovery ratio (RR) by which survival (SF) of the irradiated cells was enhanced increased exponentially with a linear and a quadratic component, namely ζ and ψ: RR = exp(ζD + ψD²). Survival of irradiated cells can thus be expressed by a combined linear-quadratic model considering 4 variables, namely α and β for the capacity of the cells to accumulate sublethal damage, and ζ and ψ for their capacity to repair potentially lethal damage: SF = exp((ζ-α)D + (ψ-β)D²). (author). 26 refs.; 1 fig.; 1 tab.
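
A direct transcription of the two expressions quoted above; the parameter values are illustrative, not the fitted CHO values. With ζ = ψ = 0 the model reduces to the standard linear-quadratic survival curve, and the repaired survival is exactly the unrepaired survival times the recovery ratio:

```python
import math

def recovery_ratio(zeta, psi, D):
    # RR = exp(zeta*D + psi*D^2)
    return math.exp(zeta * D + psi * D ** 2)

def surviving_fraction(alpha, beta, zeta, psi, D):
    # SF = exp((zeta - alpha)*D + (psi - beta)*D^2)
    return math.exp((zeta - alpha) * D + (psi - beta) * D ** 2)

alpha, beta = 0.30, 0.030   # sublethal-damage parameters (1/Gy, 1/Gy^2)
zeta, psi = 0.10, 0.005     # potentially-lethal-damage repair parameters
D = 4.0                     # dose in Gy

sf_no_repair = math.exp(-(alpha * D + beta * D ** 2))  # zeta = psi = 0 case
sf_with_repair = surviving_fraction(alpha, beta, zeta, psi, D)
rr = recovery_ratio(zeta, psi, D)
```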

5. Motor fuel demand analysis - applied modelling in the European union

International Nuclear Information System (INIS)

Chorazewiez, S.

1998-01-01

Motor fuel demand in Europe amounts to almost half of petroleum products consumption and to thirty percent of total final energy consumption. This study firstly considers the energy policies of different European countries and the ways in which the consumption of motor gasoline and automotive gas oil has developed. Secondly, it provides an overview of demand models in the energy sector, illustrating their specific characteristics. It then proposes an economic model of automotive fuel consumption, treating motor gasoline and automotive gas oil separately over a period of thirty years (1960-1993) for five main countries in the European Union. Finally, forecasts of consumption of gasoline and diesel up to the year 2020 are given for different scenarios. (author)

6. APPLYING PETRI NETS EXTENSIONS TO MODELING COMMERCIAL BANK ACTIVITY

Directory of Open Access Journals (Sweden)

Igor ENICOV

2017-02-01

The relevance of the study is determined by the need to improve the methods of modeling and simulating commercial bank activity, including for the purpose of calculating, controlling and managing the risk of the bank, in the context of the transition to the application of Basel III standards. This improvement becomes necessary due to a direct transition to new regulatory standards, when the internal assessments of the main risks become the initial data for calculating the capital adequacy of a bank. The purpose of this article is to argue the opportunity to formulate a theory of the commercial bank model on the extensions of Petri nets theory. The main methods of research were the method of scientific abstraction and the method of logical analysis. The main result obtained in the study and presented in the article is the argumentation of the possibility to analyze the quantitative and qualitative characteristics of a commercial bank with the help of Petri net extensions.
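
The basic Petri-net mechanics the article builds on: places hold tokens, and a transition fires only when every input place holds enough tokens. A minimal sketch with an invented banking example (the place and transition names are illustrative, not from the article):

```python
def enabled(marking, transition):
    # a transition is enabled iff every input place has enough tokens
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Return the new marking after firing, or raise if not enabled."""
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n                       # consume input tokens
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n          # produce output tokens
    return m

# toy model: approving a loan consumes free capital and a pending application
approve_loan = {"in": {"free_capital": 2, "pending": 1},
                "out": {"loans": 1}}

m0 = {"free_capital": 5, "pending": 2, "loans": 0}
m1 = fire(m0, approve_loan)
```

Petri-net extensions (colored, timed, stochastic) enrich exactly this firing rule, which is what makes them candidates for quantitative bank risk modeling.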

7. Alexandrium minutum growth controlled by phosphorus An applied model

OpenAIRE

Chapelle, Annie; Labry, Claire; Sourisseau, Marc; Lebreton, Carole; Youenou, Agnes; Crassous, Marie-pierre

2010-01-01

Toxic algae are a worldwide problem threatening aquaculture, public health and tourism. Alexandrium, a toxic dinoflagellate, proliferates in Northwest France estuaries (i.e. the Penze estuary), causing Paralytic Shellfish Poisoning events. Vegetative growth, and in particular the role of nutrient uptake and growth rate, are crucial parameters for understanding toxic blooms. With the goal of modelling in situ Alexandrium blooms related to environmental parameters, we first try to calibrate a zero-dimensional...

8. Applying Mechanistic Dam Breach Models to Historic Levee Breaches

OpenAIRE

Risher Paul; Gibson Stanford

2016-01-01

Hurricane Katrina elevated levee risk in the US national consciousness, motivating agencies to assess and improve their levee risk assessment methodology. Accurate computation of the flood flow magnitude and timing associated with a levee breach remains one of the most difficult and uncertain components of levee risk analysis. Contemporary methods are largely empirical and approximate, introducing substantial uncertainty to the damage and life loss models. Levee breach progressions are often ...

9. Applying CIPP Model for Learning-Object Management

Science.gov (United States)

Morgado, Erla M. Morales; Peñalvo, Francisco J. García; Martín, Carlos Muñoz; Gonzalez, Miguel Ángel Conde

Although the knowledge management process needs to receive some evaluation in order to determine its suitable functionality, there is no clear definition of the stages at which LOs (learning objects) need to be evaluated, or of the specific metrics to continuously promote their quality. This paper presents a proposal for LO evaluation during their management in e-learning systems. To achieve this, we suggest specific steps for LO design, implementation and evaluation within the four stages proposed by the CIPP model (Context, Input, Process, Product).

10. Geometry Based Design Automation : Applied to Aircraft Modelling and Optimization

OpenAIRE

2012-01-01

Product development processes are continuously challenged by demands for increased efficiency. As engineering products become more and more complex, efficient tools and methods for integrated and automated design are needed throughout the development process. Multidisciplinary Design Optimization (MDO) is one promising technique that has the potential to drastically improve concurrent design. MDO frameworks combine several disciplinary models with the aim of gaining a holistic perspective of ...

11. A theoretical intellectual capital model applied to cities

Directory of Open Access Journals (Sweden)

José Luis Alfaro Navarro

2013-06-01

New Management Information Systems (MIS) are necessary at the local level as the main source of wealth creation. Therefore, tools and approaches that provide a full future vision of any organization should be a strategic priority for economic development. In this line, cities are “centers of knowledge and sources of growth and innovation” and integrated urban development policies are necessary. These policies support communication networks and optimize location structures as strategies that provide opportunities for social and democratic participation for the citizens. This paper proposes a theoretical model to measure and evaluate the intellectual capital of cities, which makes it possible to determine what we must take into account to make cities a source of wealth, prosperity, welfare and future growth. Furthermore, local intellectual capital provides a long-run vision. Thus, in this paper we develop and explain how to implement a model to estimate intellectual capital in cities. In this sense, our proposal is to provide a model for measuring and managing intellectual capital using socio-economic indicators for cities. These indicators offer a long-term picture supported by a comprehensive strategy for those who occupy the local space, infrastructure for implementation and management of the environment for its development.

12. Simulation of Road Traffic Applying Model-Driven Engineering

Directory of Open Access Journals (Sweden)

Alberto FERNÁNDEZ-ISABEL

2016-05-01

Road traffic is an important phenomenon in modern societies. The study of its different aspects in the multiple scenarios where it happens is relevant for a huge number of problems. At the same time, its scale and complexity make it hard to study. Traffic simulations can alleviate these difficulties, simplifying the scenarios to consider and controlling their variables. However, their development also presents difficulties. The main ones come from the need to integrate the way of working of researchers and developers from multiple fields. Model-Driven Engineering (MDE) addresses these problems using Modelling Languages (MLs) and semi-automatic transformations to organise and describe the development, from requirements to code. This paper presents a domain-specific MDE framework for simulations of road traffic. It comprises an extensible ML, support tools, and development guidelines. The ML adopts an agent-based approach, which is focused on the roles of individuals in road traffic and their decision-making. A case study shows the process to model a traffic theory with the ML, and how to specialise that specification for an existing target platform and its simulations. The results are the basis for comparison with related work.

13. Modeling a Thermoelectric Generator Applied to Diesel Automotive Heat Recovery

Science.gov (United States)

Espinosa, N.; Lazard, M.; Aixala, L.; Scherrer, H.

2010-09-01

Thermoelectric generators (TEGs) are outstanding devices for automotive waste heat recovery. Their packaging, lack of moving parts, and direct heat to electrical conversion are the main benefits. Usually, TEGs are modeled with a constant hot-source temperature. However, energy in exhaust gases is limited, thus leading to a temperature decrease as heat is recovered. Therefore thermoelectric properties change along the TEG, affecting performance. A thermoelectric generator composed of Mg2Si/Zn4Sb3 for high temperatures followed by Bi2Te3 for low temperatures has been modeled using engineering equation solver (EES) software. The model uses the finite-difference method with a strip-fins convective heat transfer coefficient. It has been validated on a commercial module with well-known properties. The thermoelectric connection and the number of thermoelements have been addressed as well as the optimum proportion of high-temperature material for a given thermoelectric heat exchanger. TEG output power has been estimated for a typical commercial vehicle at 90°C coolant temperature.
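
A back-of-envelope sketch of the standard electrical estimate behind such TEG models: a module with effective Seebeck coefficient alpha and internal resistance R delivers maximum power (alpha*dT)^2 / (4R) into a matched load. The values below are illustrative, not those of the Mg2Si/Zn4Sb3/Bi2Te3 generator:

```python
def teg_power(alpha, R_internal, R_load, dT):
    """Electrical power delivered to R_load from an open-circuit
    voltage alpha * dT across internal resistance R_internal."""
    V = alpha * dT
    I = V / (R_internal + R_load)
    return I ** 2 * R_load

def teg_max_power(alpha, R_internal, dT):
    # matched-load maximum: (alpha*dT)^2 / (4*R_internal)
    return (alpha * dT) ** 2 / (4.0 * R_internal)

alpha = 0.05      # module Seebeck coefficient, V/K (illustrative)
R = 2.0           # internal resistance, ohm
dT = 200.0        # hot-to-cold temperature difference, K

p_matched = teg_power(alpha, R, R, dT)   # load matched to internal resistance
p_max = teg_max_power(alpha, R, dT)
```

The finite-difference model in the paper refines this picture by letting dT, and hence alpha and R of each segment, vary along the exhaust flow as heat is extracted.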

14. Applying a Virtual Economy Model in Mexico's Oil Sector

International Nuclear Information System (INIS)

Baker, G.

1994-01-01

The state of Mexico's oil industry, including the accomplishments of Pemex, Mexico's national oil company, was discussed, with particular reference to the progress made in the period of 1988-1994, and the outlook for innovations in the post-Salinas era. The concept of an evolutionary trend from a command economy (State as sole producer), towards market (State as regulator) or mixed economies (State as business partner) in developing countries, was introduced, placing Pemex within this evolutionary model as moving away from centralized control of oil production and distribution, while achieving international competitiveness. The concept of ''virtual market economy'' was also discussed. This model retains the legal basis of a command economy, while instituting modernization programs in order to stimulate market-economic conditions. This type of economy was considered particularly useful in this instance, since it would allow Pemex units to operate within international performance and price benchmarks while maintaining the state monopoly. Specific details of how Pemex could transform itself to a virtual market economy were outlined. It was recommended that Pemex experiment with the virtual mixed economy model; in essence, making the state a co-producer, co-transporter, and co-distributor of hydrocarbons. The effect of such a move would be to bring non-debt funding to oil and gas production, transmission, and associated industrial activities

15. A GOMS model applied to a simplified control panel design

International Nuclear Information System (INIS)

Chavez, C.; Edwards, R.M.

1992-01-01

The design of the user interface for a new system requires many decisions to be considered. Developing sensitivity to user needs requires understanding user behavior. The how-to-do-it knowledge is a mixture of task-related and interface-related components. A conscientious analysis of these components allows the designer to construct a model in terms of goals, operators, methods, and selection rules (a GOMS model) that can be advantageously used in the design process and evaluation of a user interface. The emphasis of the present work is on describing the importance and use of a GOMS model as a formal user interface analysis tool in the development of a simplified panel for the control of a nuclear power plant. At Pennsylvania State University, a highly automated control system with a greatly simplified human interface has been proposed to improve power plant safety. Supervisory control is to be conducted with a simplified control panel with the following functions: startup, shutdown, increase power, decrease power, reset, and scram. Initial programming of the operator interface has been initiated within the framework of a U.S. Department of Energy funded university project for intelligent distributed control. A hypothesis to be tested is that this scheme can also be used to estimate mental workload content and predict human performance

16. Applying fuzzy analytic network process in quality function deployment model

Directory of Open Access Journals (Sweden)

2012-08-01

In this paper, we propose an empirical study of QFD implementation in which fuzzy numbers are used to handle the uncertainty associated with different components of the proposed model. We implement a fuzzy analytic network process to find the relative importance of various criteria, and using fuzzy numbers we calculate the relative importance of these factors. The proposed model uses a fuzzy matrix and the house of quality to study product development in QFD, as well as the second phase, i.e. part deployment. In most research, the primary focus is only on customer requirements (CRs) when implementing quality function deployment, and other criteria such as production costs, manufacturing costs, etc. are disregarded. The results of using the fuzzy analytic network process based on the QFD model in the Daroupat packaging company to develop PVDC show that the most important indexes are being waterproof, resistant pill packages, and production cost. In addition, the PVDC coating is the most important index from the company experts' point of view.

17. Living donor risk model for predicting kidney allograft and patient survival in an emerging economy.

Science.gov (United States)

Zafar, Mirza Naqi; Wong, Germaine; Aziz, Tahir; Abbas, Khawar; Adibul Hasan Rizvi, S

2018-03-01

Living donor kidney is the main source of donor organs in low- to middle-income countries. We aimed to develop a living donor risk model that predicts graft and patient survival in an emerging economy. We used data from the Sindh Institute of Urology and Transplantation (SIUT) database (n = 2283 recipients and n = 2283 living kidney donors, transplanted between 1993 and 2009) and conducted Cox proportional hazard analyses to develop a composite score that predicts graft and patient survival. Donor factors age, creatinine clearance, nephron dose (estimated by donor/recipient body weight ratio) and human leukocyte antigen (HLA) match were included in the living donor risk model. The adjusted hazard ratios (HRs) for graft failure among those who received a kidney with living donor scores (relative to a donor score of zero) of 1, 2, 3 and 4 were 1.14 (95%CI: 0.94-1.39), 1.24 (95%CI: 1.03-1.49), 1.25 (95%CI: 1.03-1.51) and 1.36 (95%CI: 1.08-1.72) (P-value for trend = 0.05). Similar findings were observed for patient survival. Similar to findings in high-income countries, our study suggests that donor characteristics such as age, nephron dose, creatinine clearance and HLA match are important factors that determine long-term patient and graft survival in low-income countries. However, other crucial but undefined factors may play a role in determining the overall risk of graft failure and mortality in living kidney donor transplant recipients. © 2016 Asian Pacific Society of Nephrology.
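
The shape of such a composite score can be sketched as risk-factor flags feeding a log-linear hazard trend, mirroring the monotone HR sequence reported above. The cut-offs and the per-point log-hazard below are invented for illustration, not the SIUT estimates:

```python
import math

def donor_score(age, creatinine_clearance, weight_ratio, hla_mismatches):
    """Count simple donor risk flags (hypothetical cut-offs)."""
    score = 0
    if age > 45:
        score += 1
    if creatinine_clearance < 90:   # mL/min
        score += 1
    if weight_ratio < 0.9:          # donor/recipient body weight (nephron dose)
        score += 1
    if hla_mismatches > 3:
        score += 1
    return score

def hazard_ratio(score, log_hr_per_point=0.08):
    """HR relative to a score of zero, assuming a log-linear trend in score."""
    return math.exp(log_hr_per_point * score)

hr0 = hazard_ratio(donor_score(30, 110, 1.1, 2))   # low-risk donor profile
hr4 = hazard_ratio(donor_score(60, 70, 0.7, 5))    # all four flags raised
```

In the actual study the per-level HRs come from the fitted Cox model rather than a single log-linear coefficient, but the interpretation of the score is the same.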

18. An Optic Nerve Crush Injury Murine Model to Study Retinal Ganglion Cell Survival

Science.gov (United States)

Tang, Zhongshu; Zhang, Shuihua; Lee, Chunsik; Kumar, Anil; Arjunan, Pachiappan; Li, Yang; Zhang, Fan; Li, Xuri

2011-01-01

Injury to the optic nerve can lead to axonal degeneration, followed by gradual death of retinal ganglion cells (RGCs), resulting in irreversible vision loss. Examples of such diseases in humans include traumatic optic neuropathy and optic nerve degeneration in glaucoma. Glaucoma is characterized by typical changes in the optic nerve head, progressive optic nerve degeneration, and loss of retinal ganglion cells which, if uncontrolled, lead to vision loss and blindness. The optic nerve crush (ONC) injury mouse model is an important experimental disease model for traumatic optic neuropathy, glaucoma, and related conditions. In this model, the crush injury to the optic nerve leads to gradual apoptosis of retinal ganglion cells. The model can be used to study the general processes and mechanisms of neuronal death and survival, which is essential for the development of therapeutic measures. In addition, pharmacological and molecular approaches can be used in this model to identify and test potential therapeutic reagents for different types of optic neuropathy. Here, we provide a step-by-step demonstration of (I) baseline retrograde labeling of retinal ganglion cells (RGCs) at day 1, (II) optic nerve crush injury at day 4, (III) harvesting the retinae and analyzing RGC survival at day 11, and (IV) representative results. PMID:21540827

19. Inverse geothermal modelling applied to Danish sedimentary basins

DEFF Research Database (Denmark)

Poulsen, Soren E.; Balling, Niels; Bording, Thue S.

2017-01-01

. The numerical model was utilized for predicting and contouring temperatures at 2000 and 3000 m depths and for two main geothermal reservoir units, the Gassum (Lower Jurassic-Upper Triassic) and Bunter/Skagerrak (Triassic) reservoirs, both currently utilized for geothermal energy production. Temperature...... gradients to depths of 2000-3000 m are generally around 25-30 degrees C km(-1), locally up to about 35 degrees C km(-1). Large regions have geothermal reservoirs with characteristic temperatures ranging from ca. 40-50 degrees C, at 1000-1500 m depth, to ca. 80-110 degrees C, at 2500-3500 m, however......

20. Modeling external constraints: Applying expert systems to nuclear plants

International Nuclear Information System (INIS)

Beck, C.E.; Behera, A.K.

1993-01-01

Artificial Intelligence (AI) applications in nuclear plants have received much attention over the past decade. Specific applications that have been addressed include development of models and knowledge bases, plant maintenance, operations, procedural guidance, risk assessment, and design tools. This paper examines the issue of external constraints, with a focus on the use of AI and expert systems as design tools. It also provides several suggested methods for addressing these constraints within the AI framework. These methods include a State Matrix scheme, a layered structure for the knowledge base, and application of the dynamic parameter concept.

1. Neighborhood Design, Physical Activity, and Wellbeing: Applying the Walkability Model

Directory of Open Access Journals (Sweden)

2017-01-01

Full Text Available Neighborhood design affects lifestyle physical activity, and ultimately human wellbeing. There are, however, a limited number of studies that examine neighborhood design types. In this research, we examine four types of neighborhood designs: traditional development, suburban development, enclosed community, and cluster housing development, and assess their level of walkability and their effects on physical activity and wellbeing. We examine significant associations through a questionnaire (n = 486) distributed in Tucson, Arizona using the Walkability Model. Among the tested neighborhood design types, traditional development showed significant associations and the highest value for walkability, as well as for each of the two types of walking (recreation and transportation) representing physical activity. Suburban development showed significant associations and the highest mean values for mental health and wellbeing. Cluster housing showed significant associations and the highest mean value for social interactions with neighbors and for perceived safety from crime. Enclosed community did not obtain the highest means for any wellbeing benefit. The Walkability Model proved useful in identifying the walkability categories associated with physical activity and perceived crime. For example, the experience category was strongly and inversely associated with perceived crime. This study provides empirical evidence of the importance of including vegetation, particularly trees, throughout neighborhoods in order to increase physical activity and wellbeing. Likewise, the results suggest that regular maintenance is an important strategy to improve mental health and overall wellbeing in cities.

2. Neighborhood Design, Physical Activity, and Wellbeing: Applying the Walkability Model.

Science.gov (United States)

Zuniga-Teran, Adriana A; Orr, Barron J; Gimblett, Randy H; Chalfoun, Nader V; Guertin, David P; Marsh, Stuart E

2017-01-13

Neighborhood design affects lifestyle physical activity, and ultimately human wellbeing. There are, however, a limited number of studies that examine neighborhood design types. In this research, we examine four types of neighborhood designs: traditional development, suburban development, enclosed community, and cluster housing development, and assess their level of walkability and their effects on physical activity and wellbeing. We examine significant associations through a questionnaire ( n = 486) distributed in Tucson, Arizona using the Walkability Model. Among the tested neighborhood design types, traditional development showed significant associations and the highest value for walkability, as well as for each of the two types of walking (recreation and transportation) representing physical activity. Suburban development showed significant associations and the highest mean values for mental health and wellbeing. Cluster housing showed significant associations and the highest mean value for social interactions with neighbors and for perceived safety from crime. Enclosed community did not obtain the highest means for any wellbeing benefit. The Walkability Model proved useful in identifying the walkability categories associated with physical activity and perceived crime. For example, the experience category was strongly and inversely associated with perceived crime. This study provides empirical evidence of the importance of including vegetation, particularly trees, throughout neighborhoods in order to increase physical activity and wellbeing. Likewise, the results suggest that regular maintenance is an important strategy to improve mental health and overall wellbeing in cities.

3. Applying the Health Belief Model to college students' health behavior

Science.gov (United States)

Kim, Hak-Seon; Ahn, Joo

2012-01-01

The purpose of this research was to investigate how university students' nutrition beliefs influence their health behavioral intentions. This study used an online survey engine (Qualtrics.com) to collect data from college students. Of 253 questionnaires collected, 251 (99.2%) were used for the statistical analysis. Confirmatory Factor Analysis (CFA) revealed that the dimensions "Nutrition Confidence," "Susceptibility," "Severity," "Barrier," "Benefit," "Behavioral Intention to Eat Healthy Food," and "Behavioral Intention to do Physical Activity" had construct validity; Cronbach's alpha coefficients and composite reliabilities were tested for item reliability. The results validate that objective nutrition knowledge was a good predictor of college students' nutrition confidence. The results also clearly showed that two direct measures were significant predictors of behavioral intentions, as hypothesized. Perceived benefit of eating healthy food and perceived barrier to eating healthy food had significant effects on behavioral intentions and were valid measurements for determining behavioral intentions. These findings can enhance the extant literature on the universal applicability of the model and serve as useful references for further investigations of the validity of the model within other health care or foodservice settings and for other health behavior categories. PMID:23346306

4. Evaluation of deconvolution modelling applied to numerical combustion

Science.gov (United States)

Mehl, Cédric; Idier, Jérôme; Fiorina, Benoît

2018-01-01

A possible modelling approach in the large eddy simulation (LES) of reactive flows is to deconvolve resolved scalars. Indeed, by inverting the LES filter, scalars such as mass fractions are reconstructed. This information can be used to close budget terms of filtered species balance equations, such as the filtered reaction rate. Being ill-posed in the mathematical sense, the problem is very sensitive to any numerical perturbation. The objective of the present study is to assess the ability of this kind of methodology to capture the chemical structure of premixed flames. For that purpose, three deconvolution methods are tested on a one-dimensional filtered laminar premixed flame configuration: the approximate deconvolution method based on Van Cittert iterative deconvolution, a Taylor decomposition-based method, and the regularised deconvolution method based on the minimisation of a quadratic criterion. These methods are then extended to the reconstruction of subgrid scale profiles. Two methodologies are proposed: the first one relies on subgrid scale interpolation of deconvolved profiles and the second uses parametric functions to describe small scales. The tests conducted analyse the ability of the methods to capture the chemical filtered flame structure and front propagation speed. Results show that the deconvolution model should include information about small scales in order to regularise the filter inversion. A priori and a posteriori tests showed that the filtered flame propagation speed and structure cannot be captured if the filter size is too large.
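The Van Cittert iteration mentioned above is simple to sketch. Assuming a 3-point binomial smoothing kernel as a stand-in for the LES filter (the paper's actual filters differ), the fixed-point update f_{k+1} = f_k + (g - H f_k) progressively recovers a step-like profile from its filtered version:

```python
def smooth(sig):
    """3-point binomial filter (0.25, 0.5, 0.25) with replicated edges;
    a stand-in for the LES filter H (not the filter used in the paper)."""
    padded = [sig[0]] + list(sig) + [sig[-1]]
    return [0.25 * padded[i] + 0.5 * padded[i + 1] + 0.25 * padded[i + 2]
            for i in range(len(sig))]

def van_cittert(filtered, iters=50):
    """Approximate deconvolution by Van Cittert: f_{k+1} = f_k + (g - H f_k)."""
    f = list(filtered)
    for _ in range(iters):
        hf = smooth(f)
        f = [fi + (gi - hfi) for fi, gi, hfi in zip(f, filtered, hf)]
    return f

# A step-like 'flame front' profile, filtered and then deconvolved.
true = [0.0] * 5 + [1.0] * 5
g = smooth(true)
rec = van_cittert(g)
err_filtered = sum(abs(a - b) for a, b in zip(g, true))
err_deconv = sum(abs(a - b) for a, b in zip(rec, true))
```

The iteration converges here because this kernel's eigenvalues lie in [0, 1]; for filters with a sign-changing response, the raw iteration can diverge, which is one motivation for the regularised variants discussed in the paper.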

5. A unified framework for benchmark dose estimation applied to mixed models and model averaging

DEFF Research Database (Denmark)

Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

2013-01-01

This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both...... for hierarchical data structures, reflecting increasingly common types of assay data. We illustrate the usefulness of the methodology by means of a cytotoxicology example where the sensitivity of two types of assays are evaluated and compared. By means of a simulation study, we show that the proposed framework......

6. Modelling lecturer performance index of private university in Tulungagung by using survival analysis with multivariate adaptive regression spline

Science.gov (United States)

Hasyim, M.; Prastyo, D. D.

2018-03-01

Survival analysis models the relationship between independent variables and survival time as the dependent variable. In practice, not all survival data can be recorded completely, for various reasons; such data are called censored. Moreover, several models for survival analysis require assumptions. One approach in survival analysis is nonparametric modeling, which makes more relaxed assumptions. In this research, the nonparametric approach employed is the Multivariate Adaptive Regression Spline (MARS). This study aims to measure the performance of a private university's lecturers. The survival time in this study is the duration needed by a lecturer to obtain their professional certificate. The results show that research activity is a significant factor, along with developing course materials, good publication in international or national journals, and participation in research collaboration.
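MARS builds its fit from hinge basis functions, max(0, x - c) and max(0, c - x), combined additively. The sketch below evaluates a small MARS-style model; the knots and coefficients are invented for illustration and are not the fitted lecturer-performance model.

```python
def hinge(x, knot, direction=1):
    """MARS basis function: max(0, x - knot) if direction=1,
    max(0, knot - x) if direction=-1."""
    return max(0.0, direction * (x - knot))

def mars_predict(x, intercept, terms):
    """Evaluate an additive MARS-style model: intercept plus a sum of
    coefficient * hinge terms, each given as (coefficient, knot, direction)."""
    return intercept + sum(c * hinge(x, k, d) for c, k, d in terms)

# Hypothetical fitted model: years to certification vs. yearly publications.
# Below 2 publications the prediction is flat; it then drops by 0.8 per paper,
# with the slope easing by +0.5 after 5 papers.
model = [(-0.8, 2.0, 1), (0.5, 5.0, 1)]
t = mars_predict(4.0, 6.0, model)   # predicted years for 4 publications
```

Fitting such a model (forward knot selection followed by backward pruning) is what packages like `py-earth` or R's `earth` automate; only the evaluation step is shown here.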

7. Mathematical Modeling Applied to Prediction of Landslides in Southern Brazil

Science.gov (United States)

Silva, Lúcia; Araújo, João; Braga, Beatriz; Fernandes, Nelson

2013-04-01

Mass movements are natural phenomena that occur on slopes and are important agents in landscape development. These movements have caused serious damage to infrastructure and property. In addition to the mass movements occurring on natural slopes, there is also a large number of accidents induced by human action in the landscape. The change of use and land cover for the introduction of agriculture is a good example of change that has affected the stability of slopes. Land use and/or land cover changes have direct and indirect effects on slope stability and frequently represent a major factor controlling the occurrence of man-induced mass movements. In Brazil, especially in the southern and southeastern regions, areas of original natural rain forest have been continuously replaced by agriculture during the last decades, leading to important modifications in soil mechanical properties and to major changes in hillslope hydrology. In these regions, such effects are amplified by the steep hilly topography, intense summer rainfall events and dense urbanization. In November 2008, a major landslide event took place in a rural area with intensive agriculture in the state of Santa Catarina (Morro do Baú), where many catastrophic landslides were triggered after a long rainy period. In this area, the natural forest has been replaced by huge banana and pine plantations. The state of Santa Catarina has in recent decades been the scene of several such incidents of mass movements. In this study, based on field mapping and modeling, we characterize the role played by geomorphological and geological factors in controlling the spatial distribution of landslides in the Morro do Baú area. In order to attain such objective, a digital elevation model of the basin was generated with a 10m grid in which the topographic parameters were obtained. The spatial distribution of the scars from this major event was mapped from another image, obtained immediately

8. DNA-mediated adjuvant immunotherapy extends survival in two different mouse models of myeloid malignancies.

Science.gov (United States)

Le Pogam, Carole; Patel, Satyananda; Gorombei, Petra; Guerenne, Laura; Krief, Patricia; Omidvar, Nader; Tekin, Nilgun; Bernasconi, Elena; Sicre, Flore; Schlageter, Marie-Helene; Chopin, Martine; Noguera, Maria-Elena; West, Robert; Abu, Ansu; Mathews, Vikram; Pla, Marika; Fenaux, Pierre; Chomienne, Christine; Padua, Rose Ann

2015-10-20

We have previously shown that a specific promyelocytic leukemia-retinoic acid receptor alpha (PML-RARA) DNA vaccine combined with all-trans retinoic acid (ATRA) increases the number of long term survivors with enhanced immune responses in a mouse model of acute promyelocytic leukemia (APL). This study reports the efficacy of a non-specific DNA vaccine, pVAX14Flipper (pVAX14), in both APL and high risk myelodysplastic syndrome (HR-MDS) models. PVAX14 is comprised of novel immunogenic DNA sequences inserted into the pVAX1 therapeutic plasmid. APL mice treated with pVAX14 combined with ATRA had increased survival comparable to that obtained with a specific PML-RARA vaccine. Moreover, the survival advantage correlated with decreased PML-RARA transcript levels and increased anti-RARA antibody production. In HR-MDS mice, pVAX14 significantly improved survival and reduced biomarkers of leukemic transformation such as phosphorylated mitogen-activated protein/extracellular signal-regulated kinase kinase (MEK) 1. In both preclinical models, pVAX14 vaccine significantly increased interferon gamma (IFNγ) production and memory T-cells (memT), reduced the number of colony forming units (CFU) and increased expression of the adapter molecule signalling to NF-κB, MyD88. These results demonstrate the adjuvant properties of pVAX14, thus providing new approaches to improve clinical outcome in two different models of myeloid malignancies, which may have potential for broader applicability in other cancers.

9. "Let's Move" campaign: applying the extended parallel process model.

Science.gov (United States)

Batchelder, Alicia; Matusitz, Jonathan

2014-01-01

This article examines Michelle Obama's health campaign, "Let's Move," through the lens of the extended parallel process model (EPPM). "Let's Move" aims to reduce the childhood obesity epidemic in the United States. Developed by Kim Witte, EPPM rests on the premise that people's attitudes can be changed when fear is exploited as a factor of persuasion. Fear appeals work best (a) when a person feels a concern about the issue or situation, and (b) when he or she believes to have the capability of dealing with that issue or situation. Overall, the analysis found that "Let's Move" is based on past health campaigns that have been successful. An important element of the campaign is the use of fear appeals (as it is postulated by EPPM). For example, part of the campaign's strategies is to explain the severity of the diseases associated with obesity. By looking at the steps of EPPM, readers can also understand the strengths and weaknesses of "Let's Move."

10. On combined gravity gradient components modelling for applied geophysics

International Nuclear Information System (INIS)

2008-01-01

Gravity gradiometry research and development has intensified in recent years, to the extent that technologies providing a resolution of about 1 eotvos per 1 second average will likely soon be available for multiple critical applications such as natural resources exploration, oil reservoir monitoring and defence establishment. Much of the content of this paper was composed a decade ago, and only minor modifications were required for the conclusions to be just as applicable today. In this paper we demonstrate how gravity gradient data can be modelled, and show some examples of how gravity gradient data can be combined in order to extract valuable information. In particular, this study demonstrates the importance of two gravity gradient components, Txz and Tyz, which, when processed together, can provide more information on subsurface density contrasts than that derived solely from the vertical gravity gradient (Tzz).

11. Applying a Hybrid MCDM Model for Six Sigma Project Selection

Directory of Open Access Journals (Sweden)

Fu-Kwun Wang

2014-01-01

Full Text Available Six Sigma is a project-driven methodology; the projects that provide the maximum financial benefits and other impacts to the organization must be prioritized. Project selection (PS) is a type of multiple criteria decision making (MCDM) problem. In this study, we present a hybrid MCDM model combining the decision-making trial and evaluation laboratory (DEMATEL) technique, the analytic network process (ANP), and the VIKOR method to evaluate and improve Six Sigma projects, reducing performance gaps in each criterion and dimension. We consider the film printing industry of Taiwan as an empirical case. The results show that our model not only selects the best project, but can also be used to analyze the gaps between existing performance values and aspiration levels in each dimension and criterion, based on the influential network relation map.

12. Electrostatic Model Applied to ISS Charged Water Droplet Experiment

Science.gov (United States)

Stevenson, Daan; Schaub, Hanspeter; Pettit, Donald R.

2015-01-01

The electrostatic force can be used to create novel relative motion between charged bodies if it can be isolated from the stronger gravitational and dissipative forces. Recently, Coulomb orbital motion was demonstrated on the International Space Station by releasing charged water droplets in the vicinity of a charged knitting needle. In this investigation, the Multi-Sphere Method, an electrostatic model developed to study active spacecraft position control by Coulomb charging, is used to simulate the complex orbital motion of the droplets. When atmospheric drag is introduced, the simulated motion closely mimics that seen in the video footage of the experiment. The electrostatic force's inverse dependency on separation distance near the center of the needle lends itself to analytic predictions of the radial motion.

13. Virtual building environments (VBE) - Applying information modeling to buildings

Energy Technology Data Exchange (ETDEWEB)

2004-06-21

A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.

14. Applying the welfare model to at-own-risk discharges.

Science.gov (United States)

Krishna, Lalit Kumar Radha; Menon, Sumytra; Kanesvaran, Ravindran

2017-08-01

"At-own-risk discharges" or "self-discharges" evidence an irretrievable breakdown in the patient-clinician relationship, occurring when patients leave care facilities before completion of medical treatment and against medical advice. Dissolution of the therapeutic relationship terminates the physician's duty of care and professional liability with respect to care of the patient. Acquiescence to an at-own-risk discharge by the clinician is seen as respecting patient autonomy. The validity of such requests pivots on the assumptions that the patient is fully informed and competent to invoke an at-own-risk discharge and that care up to the point of discharge meets prevailing clinical standards. Palliative care's use of a multidisciplinary team approach challenges both these assumptions. First, it establishes multiple independent therapeutic relationships between professionals in the multidisciplinary team and the patient, relationships that persist despite an at-own-risk discharge; these enduring therapeutic relationships negate the suggestion that no duty of care is owed to the patient. Second, the continued employment of collusion, familial determinations, and the circumnavigation of direct patient involvement in family-centric societies compromises the patient's decision-making capacity and raises questions as to their ability to assume responsibility for the repercussions of invoking an at-own-risk discharge. With the validity of at-own-risk discharge requests in question and patient welfare and interests at stake, an alternative approach to assessing such requests is called for. The welfare model circumnavigates these concerns and preserves the patient's welfare through a multidisciplinary-team-guided holistic appraisal of the patient's specific situation, informed by clinical and institutional standards and evidence-based practice. The welfare model provides a robust decision-making framework for

15. Nonspherical Radiation Driven Wind Models Applied to Be Stars

Science.gov (United States)

Arauxo, F. X.

1990-11-01

16. Impact of sentinel lymphadenectomy on survival in a murine model of melanoma.

Science.gov (United States)

Rebhun, Robert B; Lazar, Alexander J F; Fidler, Isaiah J; Gershenwald, Jeffrey E

2008-01-01

Lymphatic mapping and sentinel lymph node biopsy-also termed sentinel lymphadenectomy (SL)-has become a standard of care for patients with primary invasive cutaneous melanoma. This technique has been shown to provide accurate information about the disease status of the regional lymph node basins at risk for metastasis, provide prognostic information, and provide durable regional lymph node control. The potential survival benefit afforded to patients undergoing SL is controversial. Central to this controversy is whether metastasis to regional lymph nodes occurs independent of or prior to widespread hematogenous dissemination. A related area of uncertainty is whether tumor cells residing within regional lymph nodes have increased metastatic potential. We have used a murine model of primary invasive cutaneous melanoma based on injection of B16-BL6 melanoma cells into the pinna to address two questions: (1) does SL plus wide excision of the primary tumor result in a survival advantage over wide excision alone; and (2) do melanoma cells growing within lymph nodes produce a higher incidence of hematogenous metastases than do cells growing at the primary tumor site? We found that SL significantly improved the survival of mice with small primary tumors. We found no difference in the incidence of lung metastases produced by B16-BL6 melanoma cells growing exclusively within regional lymph nodes and cells growing within the pinna.

17. Accounting for Uncertainty in Decision Analytic Models Using Rank Preserving Structural Failure Time Modeling: Application to Parametric Survival Models.

Science.gov (United States)

Bennett, Iain; Paracha, Noman; Abrams, Keith; Ray, Joshua

2018-01-01

18. The application of cure models in the presence of competing risks: a tool for improved risk communication in population-based cancer patient survival.

Science.gov (United States)

Eloranta, Sandra; Lambert, Paul C; Andersson, Therese M-L; Björkholm, Magnus; Dickman, Paul W

2014-09-01

Quantifying cancer patient survival from the perspective of cure is clinically relevant. However, most cure models estimate cure assuming no competing causes of death. We use a relative survival framework to demonstrate how flexible parametric cure models can be used in combination with competing-risks theory to incorporate noncancer deaths. Under a model that incorporates statistical cure, we present the probabilities that cancer patients (1) have died from their cancer, (2) have died from other causes, (3) will eventually die from their cancer, or (4) will eventually die from other causes, all as a function of time since diagnosis. We further demonstrate how conditional probabilities can be used to update the prognosis among survivors (eg, at 1 or 5 years after diagnosis) by summarizing the proportion of patients who will not die from their cancer. The proposed method is applied to Swedish population-based data for persons diagnosed with melanoma, colon cancer, or acute myeloid leukemia between 1973 and 2007.
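Under the strong simplification of constant cause-specific hazards (the paper uses flexible parametric relative survival models instead), the crude probabilities of dying from cancer versus other causes can be sketched by integrating all-cause survival against each cause-specific hazard:

```python
from math import exp

def crude_probabilities(lam_cancer, lam_other, t, steps=10000):
    """Crude probabilities of having died from cancer vs. other causes by
    time t, assuming constant cause-specific hazards. Each probability is
    P_k(t) = integral_0^t S(u) * lam_k du, with all-cause survival
    S(u) = exp(-(lam_cancer + lam_other) * u), computed by the
    trapezoidal rule. A toy stand-in for the flexible models in the paper."""
    total = lam_cancer + lam_other
    h = t / steps
    integral = 0.0
    for i in range(steps):
        u0, u1 = i * h, (i + 1) * h
        integral += 0.5 * h * (exp(-total * u0) + exp(-total * u1))
    return lam_cancer * integral, lam_other * integral

# Hypothetical hazards: 0.05/year excess (cancer), 0.02/year other causes.
p_cancer, p_other = crude_probabilities(0.05, 0.02, 10.0)
```

The two probabilities plus the all-cause survivor fraction sum to one, which is the bookkeeping that lets the paper partition patients into the four prognostic groups it describes.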

19. Development of a likelihood of survival scoring system for hospitalized equine neonates using generalized boosted regression modeling.

Directory of Open Access Journals (Sweden)

Katarzyna A Dembek

Full Text Available BACKGROUND: Medical management of critically ill equine neonates (foals can be expensive and labor intensive. Predicting the odds of foal survival using clinical information could facilitate the decision-making process for owners and clinicians. Numerous prognostic indicators and mathematical models to predict outcome in foals have been published; however, a validated scoring method to predict survival in sick foals has not been reported. The goal of this study was to develop and validate a scoring system that can be used by clinicians to predict likelihood of survival of equine neonates based on clinical data obtained on admission. METHODS AND RESULTS: Data from 339 hospitalized foals of less than four days of age admitted to three equine hospitals were included to develop the model. Thirty-seven variables including historical information, physical examination and laboratory findings were analyzed by generalized boosted regression modeling (GBM to determine which ones would be included in the survival score. Of these, six variables were retained in the final model. The weight for each variable was calculated using a generalized linear model and the probability of survival for each total score was determined. The highest (7) and the lowest (0) scores represented 97% and 3% probability of survival, respectively. Accuracy of this survival score was validated in a prospective study on data from 283 hospitalized foals from the same three hospitals. Sensitivity, specificity, positive and negative predictive values for the survival score in the prospective population were 96%, 71%, 91%, and 85%, respectively. CONCLUSIONS: The survival score developed in our study was validated in a large number of foals with a wide range of diseases and can be easily implemented using data available in most equine hospitals. GBM was a useful tool to develop the survival score. Further evaluations of this scoring system in field conditions are needed.
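Two pieces of the arithmetic here are easy to sketch: mapping a total score to a survival probability through a logistic link, and computing the validation statistics from a 2x2 table. The intercept, slope, and counts below are illustrative, not the published weights or data.

```python
from math import exp

def survival_prob(score, a=-3.5, b=1.0):
    """Map a 0-7 total score to a survival probability via a logistic link.
    Intercept a and slope b are illustrative, not the published weights."""
    return 1.0 / (1.0 + exp(-(a + b * score)))

def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 validation table
    (survivor predicted/observed counts)."""
    return (tp / (tp + fn),   # sensitivity
            tn / (tn + fp),   # specificity
            tp / (tp + fp),   # positive predictive value
            tn / (tn + fn))   # negative predictive value

# Hypothetical validation counts for one cutoff on the survival score.
sens, spec, ppv, npv = diagnostics(tp=192, fp=8, fn=8, tn=63)
lo, hi = survival_prob(0), survival_prob(7)   # score-to-probability endpoints
```

With these illustrative coefficients the endpoints come out near 3% and 97%, matching the shape of the reported score-to-probability mapping; the actual weights come from the paper's generalized linear model.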

20. Evaluating treatment process redesign by applying the EFQM Excellence Model.

Science.gov (United States)

Nabitz, Udo; Schramade, Mark; Schippers, Gerard

2006-10-01

To evaluate a treatment process redesign programme implementing evidence-based treatment as part of total quality management in a Dutch addiction treatment centre. Quality management was monitored over a period of more than 10 years in an addiction treatment centre with 550 professionals. Changes are evaluated by comparing the scores on the nine criteria of the European Foundation for Quality Management (EFQM) Excellence Model before and after a major redesign of treatment processes and ISO certification. In the course of 10 years, most intake, care, and cure processes were reorganized, the support processes were restructured and ISO certified, 29 evidence-based treatment protocols were developed and implemented, and patient follow-up measuring was established to make clinical outcomes transparent. Comparing the situation before and after the changes shows that the client satisfaction scores are stable, that the evaluation by personnel and society is inconsistent, and that clinical, production, and financial outcomes are positive. The overall EFQM assessment by external assessors in 2004 shows much higher scores on the nine criteria than the assessment in 1994. Evidence-based treatment can successfully be implemented in addiction treatment centres through treatment process redesign as part of a total quality management strategy, but not all results are positive.

1. Non local theory of excitations applied to the Hubbard model

International Nuclear Information System (INIS)

Kakehashi, Y; Nakamura, T; Fulde, P

2010-01-01

We propose a nonlocal theory of single-particle excitations. It is based on an off-diagonal effective medium and the projection operator method for treating the retarded Green function. The theory determines the nonlocal effective medium matrix elements by requiring that they are consistent with those of the self-energy of the Green function. This allows for a description of long-range intersite correlations with high resolution in momentum space. A numerical study of the half-filled Hubbard model on the simple cubic lattice demonstrates that the theory is applicable to the strong correlation regime as well as the intermediate regime of Coulomb interaction strength. Furthermore, the results show that nonlocal excitations cause sub-bands in the strong Coulomb interaction regime due to strong antiferromagnetic correlations, decrease the quasi-particle peak on the Fermi level with increasing Coulomb interaction, and shift the critical Coulomb interaction U_C2 for the divergence of the effective mass towards higher energies by at least a factor of two as compared with the single-site approximation.

2. Applying revised gap analysis model in measuring hotel service quality.

Science.gov (United States)

Lee, Yu-Cheng; Wang, Yu-Che; Chien, Chih-Hung; Wu, Chia-Huei; Lu, Shu-Chiung; Tsai, Sang-Bing; Dong, Weiwei

2016-01-01

The number of tourists visiting Taiwan has grown by 10-20% annually since 2010, largely owing to an increase in foreign visitors, particularly after deregulation allowed the admission of tourist groups, and later individual tourists, from mainland China. The purpose of this study is to propose a revised gap model to evaluate and improve service quality in the Taiwanese hotel industry. Service quality could thereby be clearly measured through gap analysis, which was more effective in offering direction for developing and improving service quality. The HOLSERV instrument was used to identify and analyze service gaps from the perceptions of internal and external customers. The sample for this study included three main categories of respondents: tourists, employees, and managers. The results show that five gaps influenced tourists' evaluations of service quality. In particular, the study revealed that Gap 1 (management perceptions vs. customer expectations) and Gap 9 (service provider perceptions of management perceptions vs. service delivery) were more critical than the others in affecting perceived service quality, making service delivery the main area of improvement. This study contributes toward an evaluation of the service quality of the Taiwanese hotel industry from the perspectives of customers, service providers, and managers, which is considerably valuable for hotel managers. It was the aim of this study to explore all of these together in order to better understand the possible gaps in the hotel industry in Taiwan.
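The gap computation underlying such analyses can be sketched as follows; dimension names and the perception-minus-expectation sign convention are illustrative assumptions, and the actual HOLSERV items are not reproduced here:

```python
def gap_scores(expectations, perceptions):
    """Service-quality gap per dimension: mean perception score minus
    mean expectation score. A negative gap means service falls short
    of expectations. Dimension names are illustrative, not HOLSERV items."""
    return {dim: (sum(perceptions[dim]) / len(perceptions[dim]))
                 - (sum(expectations[dim]) / len(expectations[dim]))
            for dim in expectations}
```

Each gap in the revised model would then be a difference between two such survey means taken from different respondent groups (tourists, employees, or managers).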

3. Applying Dispersive Changes to Lagrangian Particles in Groundwater Transport Models

Science.gov (United States)

Konikow, Leonard F.

2010-01-01

Method-of-characteristics groundwater transport models require that changes in concentrations computed within an Eulerian framework to account for dispersion be transferred to moving particles used to simulate advective transport. A new algorithm was developed to accomplish this transfer between nodal values and advecting particles more precisely and realistically than currently used methods. The new method scales the changes and adjustments of particle concentrations relative to limiting bounds of concentration values determined from the population of adjacent nodal values. The method precludes unrealistic undershoot or overshoot for concentrations of individual particles. In the new method, if dispersion causes cell concentrations to decrease during a time step, those particles in the cell having the highest concentration will decrease the most, and those with the lowest concentration will decrease the least. The converse is true if dispersion is causing concentrations to increase. Furthermore, if the initial concentration on a particle is outside the range of the adjacent nodal values, it will automatically be adjusted in the direction of the acceptable range of values. The new method is inherently mass conservative. © US Government 2010.
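The scaling rule described in the abstract can be sketched for a single cell; this is a hypothetical simplification (the function name, weighting, and clamping are illustrative, not the published algorithm):

```python
def adjust_particles(particles, delta_c, c_min, c_max):
    """Spread a nodal concentration change delta_c over the particles in
    one cell, weighting each particle by its distance from the limiting
    bound, so that when delta_c < 0 the highest-concentration particles
    decrease the most (and conversely for delta_c > 0)."""
    if delta_c < 0:
        weights = [p - c_min for p in particles]  # highest particles drop most
    else:
        weights = [c_max - p for p in particles]  # lowest particles rise most
    total = sum(weights)
    if total == 0:
        return list(particles)  # nothing to redistribute
    n = len(particles)
    # total change is n * delta_c, so the cell mean shifts by delta_c
    adjusted = [p + delta_c * n * w / total for p, w in zip(particles, weights)]
    # clamp to the nodal bounds to preclude overshoot and undershoot
    return [min(max(a, c_min), c_max) for a in adjusted]
```

A fully mass-conservative implementation would also need to redistribute any mass trimmed by the clamping step; that refinement is omitted here.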

4. STI571 (Gleevec) improves tumor growth delay and survival in irradiated mouse models of glioblastoma

International Nuclear Information System (INIS)

Geng Ling; Shinohara, Eric T.; Kim, Dong; Tan Jiahuai; Osusky, Kate; Shyr, Yu; Hallahan, Dennis E.

2006-01-01

5. Survival of viral pathogens in animal feed ingredients under transboundary shipping models

Science.gov (United States)

Bauermann, Fernando V.; Niederwerder, Megan C.; Singrey, Aaron; Clement, Travis; de Lima, Marcelo; Long, Craig; Patterson, Gilbert; Sheahan, Maureen A.; Stoian, Ana M. M.; Petrovan, Vlad; Jones, Cassandra K.; De Jong, Jon; Ji, Ju; Spronk, Gordon D.; Minion, Luke; Christopher-Hennings, Jane; Zimmerman, Jeff J.; Rowland, Raymond R. R.; Nelson, Eric; Sundberg, Paul; Diel, Diego G.

2018-01-01

The goal of this study was to evaluate survival of important viral pathogens of livestock in animal feed ingredients imported daily into the United States under simulated transboundary conditions. Eleven viruses were selected based on global significance and impact to the livestock industry, including Foot and Mouth Disease Virus (FMDV), Classical Swine Fever Virus (CSFV), African Swine Fever Virus (ASFV), Influenza A Virus of Swine (IAV-S), Pseudorabies virus (PRV), Nipah Virus (NiV), Porcine Reproductive and Respiratory Syndrome Virus (PRRSV), Swine Vesicular Disease Virus (SVDV), Vesicular Stomatitis Virus (VSV), Porcine Circovirus Type 2 (PCV2) and Vesicular Exanthema of Swine Virus (VESV). Surrogate viruses with similar genetic and physical properties were used for 6 viruses. Surrogates belonged to the same virus families as target pathogens, and included Senecavirus A (SVA) for FMDV, Bovine Viral Diarrhea Virus (BVDV) for CSFV, Bovine Herpesvirus Type 1 (BHV-1) for PRV, Canine Distemper Virus (CDV) for NiV, Porcine Sapelovirus (PSV) for SVDV and Feline Calicivirus (FCV) for VESV. For the remaining target viruses, actual pathogens were used. Virus survival was evaluated using Trans-Pacific or Trans-Atlantic transboundary models involving representative feed ingredients, transport times and environmental conditions, with samples tested by PCR, VI and/or swine bioassay. SVA (representing FMDV), FCV (representing VESV), BHV-1 (representing PRV), PRRSV, PSV (representing SVDV), ASFV and PCV2 maintained infectivity during transport, while BVDV (representing CSFV), VSV, CDV (representing NiV) and IAV-S did not. Notably, more viruses survived in conventional soybean meal, lysine hydrochloride, choline chloride, vitamin D and pork sausage casings. These results support published data on transboundary risk of PEDV in feed, demonstrate survival of certain viruses in specific feed ingredients (“high-risk combinations”) under conditions simulating transport between

6. A comparative evaluation of risk-adjustment models for benchmarking amputation-free survival after lower extremity bypass.

Science.gov (United States)

Simons, Jessica P; Goodney, Philip P; Flahive, Julie; Hoel, Andrew W; Hallett, John W; Kraiss, Larry W; Schanzer, Andres

2016-04-01

Providing patients and payers with publicly reported risk-adjusted quality metrics for the purpose of benchmarking physicians and institutions has become a national priority. Several prediction models have been developed to estimate outcomes after lower extremity revascularization for critical limb ischemia, but the optimal model to use in contemporary practice has not been defined. We sought to identify the highest-performing risk-adjustment model for amputation-free survival (AFS) at 1 year after lower extremity bypass (LEB). We used the national Society for Vascular Surgery Vascular Quality Initiative (VQI) database (2003-2012) to assess the performance of three previously validated risk-adjustment models for AFS. The Bypass versus Angioplasty in Severe Ischaemia of the Leg (BASIL), Finland National Vascular (FINNVASC) registry, and the modified Project of Ex-vivo vein graft Engineering via Transfection III (PREVENT III [mPIII]) risk scores were applied to the VQI cohort. A novel model for 1-year AFS was also derived using the VQI data set and externally validated using the PIII data set. The relative discrimination (Harrell c-index) and calibration (Hosmer-May goodness-of-fit test) of each model were compared. Among 7754 patients in the VQI who underwent LEB for critical limb ischemia, the AFS was 74% at 1 year. Each of the previously published models for AFS demonstrated similar discriminative performance: c-indices for BASIL, FINNVASC, mPIII were 0.66, 0.60, and 0.64, respectively. The novel VQI-derived model had improved discriminative ability with a c-index of 0.71 and appropriate generalizability on external validation with a c-index of 0.68. The model was well calibrated in both the VQI and PIII data sets (goodness of fit P = not significant). Currently available prediction models for AFS after LEB perform modestly when applied to national contemporary VQI data. Moreover, the performance of each model was inferior to that of the novel VQI-derived model
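The Harrell c-index used to compare these models measures, over all comparable patient pairs, how often the model ranks predicted risk in the same order as observed amputation-free survival. A minimal sketch of the standard statistic (illustrative O(n²) implementation, not the authors' code):

```python
def harrell_c_index(times, events, risk_scores):
    """Harrell's concordance index: among comparable pairs (those where
    the earlier time is an observed event), the fraction in which the
    higher predicted risk goes with the earlier event; ties in risk
    count as 0.5."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # a censored observation cannot anchor a pair
        for j in range(n):
            if times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable
```

A value of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is the scale on which the BASIL (0.66), FINNVASC (0.60), mPIII (0.64) and VQI-derived (0.71) scores above are reported.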

7. A chip-level modeling approach for rail span collapse and survivability analyses

International Nuclear Information System (INIS)

Marvis, D.G.; Alexander, D.R.; Dinger, G.L.

1989-01-01

A general semiautomated analysis technique has been developed for analyzing rail span collapse and survivability of VLSI microcircuits in high ionizing dose rate radiation environments. Hierarchical macrocell modeling permits analyses at the chip level and interactive graphical postprocessing provides a rapid visualization of voltage, current, and power distributions over an entire VLSIC. The technique is demonstrated for a 16k CMOS/SOI SRAM and a CMOS/SOS 8-bit multiplier. The authors also present an efficient method to treat memory arrays as well as a three-dimensional integration technique to compute sapphire photoconduction from the design layout.

8. The synthetic parasite-derived peptide GK1 increases survival in a preclinical mouse melanoma model.

Science.gov (United States)

Pérez-Torres, Armando; Vera-Aguilera, Jesús; Hernaiz-Leonardo, Juan Carlos; Moreno-Aguilera, Eduardo; Monteverde-Suarez, Diego; Vera-Aguilera, Carlos; Estrada-Bárcenas, Daniel

2013-11-01

The therapeutic efficacy of a synthetic parasite-derived peptide GK1, an immune response booster, was evaluated in a mouse melanoma model. This melanoma model correlates with human stage IIb melanoma, which is treated with wide surgical excision; a parallel study employing a surgical treatment was carried out as an instructive goal. C57BL/6 mice were injected subcutaneously in the flank with 2×10(5) B16-F10 murine melanoma cells. When the tumors reached 20 mm3, mice were separated into two different groups; the GK1 group, treated weekly with peritumoral injections of GK1 (10 μg/100 μL of sterile saline solution) and the control group, treated weekly with an antiseptic peritumoral injection of 100 μL of sterile saline solution without further intervention. All mice were monitored daily for clinical appearance, tumor size, and survival. Surgical treatment was performed in parallel when the tumor size was 20 mm3 (group A), 500 mm3 (group B), and >500 mm3 (group C). The GK1 peptide effectively increased the mean survival time by 9.05 days, corresponding to an increase of 42.58%, and significantly delayed tumor growth from day 3 to 12 of treatment. In addition, tumor necrosis was significantly increased (pcancers remains to be determined, and surgical removal remains a challenge for any new experimental treatment of melanoma in mouse models.

9. PREDICT: a new UK prognostic model that predicts survival following surgery for invasive breast cancer.

Science.gov (United States)

Wishart, Gordon C; Azzato, Elizabeth M; Greenberg, David C; Rashbass, Jem; Kearins, Olive; Lawrence, Gill; Caldas, Carlos; Pharoah, Paul D P

2010-01-01

The aim of this study was to develop and validate a prognostication model to predict overall and breast cancer specific survival for women treated for early breast cancer in the UK. Using the Eastern Cancer Registration and Information Centre (ECRIC) dataset, information was collated for 5,694 women who had surgery for invasive breast cancer in East Anglia from 1999 to 2003. Breast cancer mortality models for oestrogen receptor (ER) positive and ER negative tumours were derived from these data using Cox proportional hazards, adjusting for prognostic factors and mode of cancer detection (symptomatic versus screen-detected). An external dataset of 5,468 patients from the West Midlands Cancer Intelligence Unit (WMCIU) was used for validation. Differences between overall actual and predicted mortality were assessed. The model is well calibrated, provides a high degree of discrimination and has been validated in a second UK patient cohort.

10. Improving breast cancer survival analysis through competition-based multidimensional modeling.

Directory of Open Access Journals (Sweden)

Erhan Bilal

Full Text Available Breast cancer is the most common malignancy in women and is responsible for hundreds of thousands of deaths annually. As with most cancers, it is a heterogeneous disease and different breast cancer subtypes are treated differently. Understanding the difference in prognosis for breast cancer based on its molecular and phenotypic features is one avenue for improving treatment by matching the proper treatment with molecular subtypes of the disease. In this work, we employed a competition-based approach to modeling breast cancer prognosis using large datasets containing genomic and clinical information and an online real-time leaderboard program used to speed feedback to the modeling team and to encourage each modeler to work towards achieving a higher ranked submission. We find that machine learning methods combined with molecular features selected based on expert prior knowledge can improve survival predictions compared to current best-in-class methodologies and that ensemble models trained across multiple user submissions systematically outperform individual models within the ensemble. We also find that model scores are highly consistent across multiple independent evaluations. This study serves as the pilot phase of a much larger competition open to the whole research community, with the goal of understanding general strategies for model optimization using clinical and molecular profiling data and providing an objective, transparent system for assessing prognostic models.

11. Selective histone deacetylase 6 inhibition prolongs survival in a lethal two-hit model.

Science.gov (United States)

Cheng, Xin; Liu, Zhengcai; Liu, Baoling; Zhao, Ting; Li, Yongqing; Alam, Hasan B

2015-07-01

Hemorrhagic shock (HS) followed by a subsequent insult ("second hit") often initiates an exaggerated systemic inflammatory response and multiple organ failure. We have previously demonstrated that valproic acid, a pan histone deacetylase inhibitor, could improve survival in a rodent "two-hit" model. In the present study, our goal was to determine whether selective inhibition of histone deacetylase 6 with Tubastatin A (Tub-A) could prolong survival in a two-hit model in which HS was followed by sepsis from cecal ligation and puncture (CLP). C57Bl/6J mice were subjected to sublethal HS (30% blood loss) and then randomly divided into two groups (n = 13 per group): a Tub-A group (treatment) and a vehicle (VEH) group (control). The Tub-A group was given an intraperitoneal injection of Tub-A (70 mg/kg) dissolved in dimethyl sulfoxide (DMSO). The VEH group was injected with DMSO (1 μl/g body weight). After 24 h, all mice were subjected to CLP, followed immediately by another dose of Tub-A or DMSO. Survival was monitored for 10 d. In a parallel study, peritoneal irrigation fluid and liver tissue from Tub-A- or DMSO-treated mice were collected 3 h after CLP. Enzyme-linked immunosorbent assay was performed to quantify the activity of myeloperoxidase and the concentrations of tumor necrosis factor-alpha (TNF-α) and interleukin 6 (IL-6) in the peritoneal irrigation fluid. RNA was isolated from the liver tissue, and real-time polymerase chain reaction was performed to measure relative messenger RNA levels of TNF-α and IL-6. Treatment with Tub-A significantly improved survival compared with that of the control (69.2% versus 15.4%). In addition, Tub-A significantly suppressed myeloperoxidase activity (169.9 ± 8.4 ng/mL versus 70.4 ± 17.4 ng/mL; P hit model. Copyright © 2015 Elsevier Inc. All rights reserved.

12. Novel bifunctional anthracycline and nitrosourea chemotherapy for human bladder cancer: analysis in a preclinical survival model.

Science.gov (United States)

Glaves, D; Murray, M K; Raghavan, D

1996-08-01

13. Conditionally replicating adenovirus expressing TIMP2 increases survival in a mouse model of disseminated ovarian cancer.

Directory of Open Access Journals (Sweden)

Sherry W Yang

14. Monte Carlo model to simulate the effects of DNA damage resulting from accumulation of 125I decays during development of colonies and clonogenic survival assays

International Nuclear Information System (INIS)

Lobachevsky, P.; Karagiannis, T.; Martin, R.F.

1998-01-01

Full text: Exposure of cultured cells to an internal source of ionising radiation, such as a radioactive isotope, differs substantially from external irradiation in the determination of delivered dose. In some cases, the radioactive isotope cannot be quickly and completely removed from cells before plating for a clonogenic survival assay. This provides an additional dose of irradiation which is not easy to calculate. The contribution of this phenomenon to cell survival is especially important if a radioactive isotope is incorporated into DNA, or a DNA-binding ligand is labelled with the isotope. The correction to cell survival due to the additional dose cannot be calculated using a simple analytical expression, since the isotope is present in the cells during colony growth. We have developed a Monte Carlo model which simulates the process of colony growth and takes into account the extent of damage from isotope decays accumulated between successive cell divisions. The model considers such factors as cell cycle time, radiosensitivity, colony growth inhibition, isotope specific (per cell) activity, partition of isotope between daughter cells, isotope half-life, and isotope efflux. The model allows estimation of the impact of irradiation during colony formation on the distribution of colony size, and on the calculation of the survival correction factor, which depends mainly on the isotope cell-specific activity. We applied the model to interpret the difference in survival of K652 cells exposed to 125I decays with various cell-specific activities: 0.45, 3.21 and 7.42 decays/cell/hour. The cells were treated with 125I-labelled Hoechst 33258, which binds to DNA in the cell nucleus. After accumulation of 125I decays under non-growth conditions, cells were plated for a clonogenic survival assay. The survival correction factors calculated from the model for the given values of 125I cell-specific activity are in good correlation with differences between experimental
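A toy version of such a simulation can illustrate the mechanism; all parameters, the per-decay lethality rule, and the function names below are illustrative assumptions, not the authors' calibrated model:

```python
import math
import random

def sample_poisson(rng, lam):
    """Knuth's Poisson sampler; adequate for the small means used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_colony(activity, cycles=6, p_lethal=0.5,
                    half_life_cycles=8.0, seed=42):
    """Grow a colony from one labelled cell. Each cycle every cell
    accumulates a Poisson number of 125I decays; each decay is
    independently lethal with probability p_lethal; survivors divide,
    splitting their remaining (and physically decaying) label between
    the two daughters. Returns the final colony size."""
    rng = random.Random(seed)
    cells = [activity]                       # per-cell activity, decays/cycle
    decay = 0.5 ** (1.0 / half_life_cycles)  # physical decay of the label
    for _ in range(cycles):
        next_gen = []
        for a in cells:
            n = sample_poisson(rng, a)
            if rng.random() < 1.0 - (1.0 - p_lethal) ** n:
                continue                     # cell killed before dividing
            child = a * decay / 2.0          # label partitioned to daughters
            next_gen += [child, child]
        cells = next_gen
        if not cells:
            break
    return len(cells)
```

Repeating this over many seeded colonies and counting those that reach a threshold size yields the kind of survival correction factor the abstract describes.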

15. Rethinking plant functional types in Earth System Models: pan-tropical analysis of tree survival across environmental gradients

Science.gov (United States)

Johnson, D. J.; Needham, J.; Xu, C.; Davies, S. J.; Bunyavejchewin, S.; Giardina, C. P.; Condit, R.; Cordell, S.; Litton, C. M.; Hubbell, S.; Kassim, A. R. B.; Shawn, L. K. Y.; Nasardin, M. B.; Ong, P.; Ostertag, R.; Sack, L.; Tan, S. K. S.; Yap, S.; McDowell, N. G.; McMahon, S.

2016-12-01

Terrestrial carbon cycling is a function of the growth and survival of trees. Current model representations of tree growth and survival at a global scale rely on coarse plant functional traits that are parameterized very generally. In view of the large biodiversity in the tropical forests, it is important that we account for the functional diversity in order to better predict tropical forest responses to future climate changes. Several next generation Earth System Models are moving towards a size-structured, trait-based approach to modelling vegetation globally, but the challenge of which and how many traits are necessary to capture forest complexity remains. Additionally, the challenge of collecting sufficient trait data to describe the vast species richness of tropical forests is enormous. We propose a more fundamental approach to these problems by characterizing forests by their patterns of survival. We expect our approach to distill real-world tree survival into a reasonable number of functional types. Using 10 large-area tropical forest plots that span geographic, edaphic and climatic gradients, we model tree survival as a function of tree size for hundreds of species. We found surprisingly few categories of size-survival functions emerge. This indicates some fundamental strategies at play across diverse forests to constrain the range of possible size-survival functions. Initial cluster analysis indicates that four to eight functional forms are necessary to describe variation in size-survival relations. Temporal variation in size-survival functions can be related to local environmental variation, allowing us to parameterize how demographically similar groups of species respond to perturbations in the ecosystem. We believe this methodology will yield a synthetic approach to classifying forest systems that will greatly reduce uncertainty and complexity in global vegetation models.
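The idea of grouping species by their size-survival curves can be sketched as follows; the logistic functional form, the parameterization, and the deterministic k-means initialization are all illustrative assumptions, not the authors' method:

```python
import math

def size_survival(dbh, p_max, rate, d0):
    """Hypothetical logistic size-survival curve: annual survival
    probability rises with stem diameter (dbh, cm) toward p_max."""
    return p_max / (1.0 + math.exp(-rate * (dbh - d0)))

def kmeans(points, k=2, iters=25):
    """Minimal k-means over per-species parameter vectors, e.g.
    (p_max, rate, d0), initialized deterministically from the
    first k points."""
    centers = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[i].append(p)
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers
```

Cluster centers then play the role of the four to eight "functional forms" mentioned above: each species is assigned to the nearest center rather than carrying its own parameters.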

16. A survival model for fractionated radiotherapy with an application to prostate cancer

Energy Technology Data Exchange (ETDEWEB)

Zaider, Marco [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, NY (United States)]. E-mail: Zaiderm@mskcc.org; Zelefsky, Michael J.; Leibel, Steven A. [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, NY (United States); Hanin, Leonid G. [Department of Mathematics, Idaho State University, Pocatello, ID (United States); Tsodikov, Alexander D.; Yakovlev, Andrei Y. [Department of Oncological Sciences, Huntsman Cancer Institute, University of Utah, Salt Lake City, UT (United States)

2001-10-01

This paper explores the applicability of a mechanistic survival model, based on the distribution of clonogens surviving a course of fractionated radiation therapy, to clinical data on patients with prostate cancer. The study was carried out using data on 1100 patients with clinically localized prostate cancer who were treated with three-dimensional conformal radiation therapy. The patients were stratified by radiation dose (group 1: <67.5 Gy; group 2: 67.5-72.5 Gy; group 3: 72.5-77.5 Gy; group 4: 77.5-87.5 Gy) and prognosis category (favourable, intermediate and unfavourable as defined by pre-treatment PSA and Gleason score). A relapse was recorded when tumour recurrence was diagnosed or when three successive prostate specific antigen (PSA) elevations were observed from a post-treatment nadir PSA level. PSA relapse-free survival was used as the primary end point. The model, which is based on an iterated Yule process, is specified in terms of three parameters: the mean number of tumour clonogens that survive the treatment, the mean of the progression time of post-treatment tumour development and its standard deviation. The model parameters were estimated by the maximum likelihood method. The fact that the proposed model provides an excellent description both of the survivor function and of the hazard rate is prima facie evidence of the validity of the model because closeness of the two survivor functions (empirical and model-based) does not generally imply closeness of the corresponding hazard rates. The estimated cure probabilities for the favourable group are 0.80, 0.74 and 0.87 (for dose groups 1-3, respectively); for the intermediate group: 0.25, 0.51, 0.58 and 0.78 (for dose groups 1-4, respectively) and for the unfavourable group: 0.0, 0.27, 0.33 and 0.64 (for dose groups 1-4, respectively). The distribution of progression time to tumour relapse was found to be independent of prognosis group but dependent on dose. As the dose increases the mean progression
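The structure of such clonogen-based cure models can be illustrated with the common bounded-cumulative-risk form S(t) = exp(-N · F(t)); the lognormal choice for the progression-time distribution F below is an assumed simplification for illustration, not the paper's exact iterated Yule process:

```python
import math

def relapse_free_survival(t, n_clonogens, mu, sigma):
    """S(t) = exp(-N * F(t)), where n_clonogens is the mean number of
    tumour clonogens surviving treatment and F is a lognormal CDF
    (mu, sigma on the log scale) for the progression time of a
    surviving clonogen."""
    if t <= 0:
        return 1.0
    F = 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))
    return math.exp(-n_clonogens * F)

def cure_probability(n_clonogens):
    """As t grows, F(t) -> 1, so S(t) -> exp(-N): the cure fraction."""
    return math.exp(-n_clonogens)
```

This makes the abstract's structure concrete: the fitted cure probabilities (e.g. 0.80 for the favourable low-dose group) correspond to exp(-N) for the estimated mean number of surviving clonogens, while mu and sigma describe the progression-time distribution.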

17. Immediate survival focus: synthesizing life history theory and dual process models to explain substance use.

Science.gov (United States)

Richardson, George B; Hardesty, Patrick

2012-01-01

Researchers have recently applied evolutionary life history theory to the understanding of behaviors often conceived of as prosocial or antisocial. In addition, researchers have applied cognitive science to the understanding of substance use and used dual process models, where explicit cognitive processes are modeled as relatively distinct from implicit cognitive processes, to explain and predict substance use behaviors. In this paper we synthesized these two theoretical perspectives to produce an adaptive and cognitive framework for explaining substance use. We contend that this framework provides new insights into the nature of substance use that may be valuable for both clinicians and researchers.

18. Immediate Survival Focus: Synthesizing Life History Theory and Dual Process Models to Explain Substance Use

Directory of Open Access Journals (Sweden)

George B. Richardson

2012-10-01

Full Text Available Researchers have recently applied evolutionary life history theory to the understanding of behaviors often conceived of as prosocial or antisocial. In addition, researchers have applied cognitive science to the understanding of substance use and used dual process models, where explicit cognitive processes are modeled as relatively distinct from implicit cognitive processes, to explain and predict substance use behaviors. In this paper we synthesized these two theoretical perspectives to produce an adaptive and cognitive framework for explaining substance use. We contend that this framework provides new insights into the nature of substance use that may be valuable for both clinicians and researchers.

19. NanOx, a new model to predict cell survival in the context of particle therapy

Science.gov (United States)

Cunha, M.; Monini, C.; Testa, E.; Beuve, M.

2017-02-01

Particle therapy is increasingly attractive for the treatment of tumors and the number of facilities offering it is rising worldwide. Due to the well-known enhanced effectiveness of ions, it is of utmost importance to plan treatments with great care to ensure tumor killing and healthy tissue sparing. Hence, the accurate quantification of the relative biological effectiveness (RBE) of ions, used in the calculation of the biological dose, is critical. Nevertheless, the RBE is a complex function of many parameters and its determination requires modeling. The approaches currently used have allowed particle therapy to thrive, but still show some shortcomings. We present herein a short description of a new theoretical framework, NanOx, to calculate cell survival in the context of particle therapy. It gathers principles from existing approaches, while addressing some of their weaknesses. NanOx is a multiscale model that takes the stochastic nature of radiation at nanometric and micrometric scales fully into account, integrating also the chemical aspects of radiation-matter interaction. The latter are included in the model by means of a chemical specific energy, determined from the production of reactive chemical species induced by irradiation. Such production represents the accumulation of oxidative stress and sublethal damage in the cell, potentially generating non-local lethal events in NanOx. The complementary local lethal events occur in a very localized region and can, alone, lead to cell death. Both these classes of events contribute to cell death. The comparison between experimental data and model predictions for the V79 cell line shows good agreement. In particular, the dependence of the typical shoulders of cell survival curves on linear energy transfer is well described, as is the effectiveness of different ions, including the overkill effect. These results required the adjustment of a number of parameters compatible with the application of the model in

20. Model Proposition for the Fiscal Policies Analysis Applied in Economic Field

Directory of Open Access Journals (Sweden)

Larisa Preda

2007-05-01

Full Text Available This paper presents a study of fiscal policy as applied to economic development. Correlations between macroeconomic and fiscal indicators constitute the first step in our analysis. The next step is to propose a new model for fiscal and budgetary choices. This model is applied to data from the Romanian case.

1. Intracranial AAV-sTRAIL combined with lanatoside C prolongs survival in an orthotopic xenograft mouse model of invasive glioblastoma.

Science.gov (United States)

Crommentuijn, Matheus H W; Maguire, Casey A; Niers, Johanna M; Vandertop, W Peter; Badr, Christian E; Würdinger, Thomas; Tannous, Bakhos A

2016-04-01

2. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

Directory of Open Access Journals (Sweden)

Nils Ternès

2017-05-01

Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is increasingly well established, no clear guidance yet exists on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
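For a fitted Cox model, the individual survival probability at a given timepoint comes from the baseline cumulative hazard; a minimal unpenalized sketch using the standard Breslow estimator (illustrative names, beta taken as already fitted, not the paper's lasso pipeline):

```python
import math

def breslow_survival(times, events, covariates, beta, x_new, t_query):
    """S(t | x) = exp(-H0(t) * exp(x . beta)), with the Breslow baseline
    cumulative hazard H0(t) = sum over observed event times <= t of
    1 / sum(exp(x_j . beta)) taken over the subjects still at risk."""
    def risk(x):
        return math.exp(sum(b * v for b, v in zip(beta, x)))
    risks = [risk(x) for x in covariates]
    H0 = 0.0
    for i in range(len(times)):
        if events[i] and times[i] <= t_query:
            at_risk = sum(r for j, r in enumerate(risks) if times[j] >= times[i])
            H0 += 1.0 / at_risk
    return math.exp(-H0 * risk(x_new))
```

The penalized framework in the paper estimates beta with the (adaptive) lasso and then wraps this kind of survival-probability calculation in double cross-validation and bootstrap confidence intervals.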

3. Sodium caseinate induces increased survival in leukaemic mouse J774 model.

Science.gov (United States)

Córdova-Galaviz, Yolanda; Ledesma-Martínez, Edgar; Aguíñiga-Sánchez, Itzen; Soldevila-Melgarejo, Gloria; Soto-Cruz, Isabel; Weiss-Steider, Benny; Santiago-Osorio, Edelmiro

2014-01-01

Acute myeloid leukaemia is a neoplastic disease of haematopoietic stem cells. Although there have been recent advances regarding its treatment, mortality remains high. Consequently, therapeutic alternatives continue to be explored. In the present report, we present evidence that sodium caseinate (CasNa), a salt of the principal protein in milk, may possess important anti-leukaemic properties. J774 leukaemia macrophage-like cells were cultured with CasNa and proliferation, viability and differentiation were evaluated. These cells were also inoculated into BALB/c mice as a model of leukemia. We demonstrated that CasNa inhibits the in vitro proliferation and reduces viability of J774 cells, and leads to increased survival in vivo in a leukaemic mouse model. These data indicate that CasNa may be useful in leukaemia therapy. Copyright © 2014 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

4. Does the interpersonal model apply across eating disorder diagnostic groups? A structural equation modeling approach.

Science.gov (United States)

Ivanova, Iryna V; Tasca, Giorgio A; Proulx, Geneviève; Bissada, Hany

2015-11-01

The interpersonal model has been validated with binge-eating disorder (BED), but it is not yet known whether the model applies across a range of eating disorders (ED). The goal of this study was to investigate the validity of the interpersonal model in anorexia nervosa, restricting type (ANR) and binge-eating/purging type (ANBP); bulimia nervosa (BN); BED; and eating disorder not otherwise specified (EDNOS). Data from a cross-sectional sample of 1459 treatment-seeking women diagnosed with ANR, ANBP, BN, BED and EDNOS were examined for indirect effects of interpersonal problems on ED psychopathology mediated through negative affect. Findings from structural equation modeling demonstrated the mediating role of negative affect in four of the five diagnostic groups. There were significant, medium to large (.239 to .558) indirect effects in the ANR, BN, BED and EDNOS groups, but not in the ANBP group. The results of the first reverse model, with interpersonal problems as a mediator between negative affect and ED psychopathology, were nonsignificant, suggesting the specificity of the hypothesized paths. However, in the second reverse model, ED psychopathology was related to interpersonal problems indirectly through negative affect. This is the first study to find support for the interpersonal model of ED in a clinical sample of women with diverse ED diagnoses, though there may be a reciprocal relationship between ED psychopathology and relationship problems through negative affect. Negative affect partially explains the relationship between interpersonal problems and ED psychopathology in women diagnosed with ANR, BN, BED and EDNOS. Interpersonal psychotherapies for ED may be addressing the underlying interpersonal-affective difficulties, thereby reducing ED psychopathology. Copyright © 2015 Elsevier Inc. All rights reserved.
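The indirect (mediated) effect tested here is, at its core, the product-of-coefficients idea: the a-path (predictor to mediator) times the b-path (mediator to outcome, adjusting for the predictor). A toy sketch using ordinary least squares on synthetic data; the study itself used full structural equation modeling:

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(rows, y):
    """OLS coefficients via the normal equations; each design row starts
    with a 1 for the intercept."""
    k = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yy for r, yy in zip(rows, y)) for i in range(k)]
    return solve(xtx, xty)

# Synthetic data: X = interpersonal problems, M = negative affect,
# Y = ED psychopathology; X influences Y only through M.
X = [0, 1, 2, 3, 4, 5, 6, 7]
noise = [0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2]
M_med = [0.5 * x + d for x, d in zip(X, noise)]   # a-path around 0.5
Y = [2.0 + 0.8 * m for m in M_med]                # b-path exactly 0.8

a_path = ols([[1, x] for x in X], M_med)[1]
coefs = ols([[1, x, m] for x, m in zip(X, M_med)], Y)  # [intercept, direct, b]
indirect = a_path * coefs[2]
```

Because Y depends on X only through M here, the direct effect `coefs[1]` comes out near zero while the indirect effect is substantial — the pattern the abstract reports for four of its five diagnostic groups.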

5. Free Base Lysine Increases Survival and Reduces Metastasis in Prostate Cancer Model.

Science.gov (United States)

Ibrahim-Hashim, Arig; Wojtkowiak, Jonathan W; de Lourdes Coelho Ribeiro, Maria; Estrella, Veronica; Bailey, Kate M; Cornnell, Heather H; Gatenby, Robert A; Gillies, Robert J

2011-11-19

Malignant tumor cells typically metabolize glucose anaerobically to lactic acid even under normal oxygen tension, a phenomenon called aerobic glycolysis or the Warburg effect. This results in increased acid production and the acidification of the extracellular microenvironment in solid tumors. H+ ions tend to flow along concentration gradients into peritumoral normal tissue, causing extracellular matrix degradation and increased tumor cell motility, thus promoting invasion and metastasis. We have shown that reducing this acidity with sodium bicarbonate buffer decreases the metastatic fitness of circulating tumor cells in prostate cancer and other cancer models. Mathematical models of the tumor-host dynamics predicted that buffers with a pKa around 7 would be more effective in reducing intra- and peri-tumoral acidosis, and thus possibly more effective in inhibiting tumor metastasis, than sodium bicarbonate, which has a pKa around 6. Here we test this prediction by examining the efficacy of free base lysine, a non-bicarbonate/non-volatile buffer with a higher pKa (~10), in a prostate tumor metastasis model. Oxygen consumption and acid production rates of PC3M prostate cancer cells and normal prostate cells were determined using the Seahorse Extracellular Flux (XF-96) analyzer. The in vivo effect of 200 mM lysine, started four days prior to inoculation, on inhibition of metastasis was examined in the PC3M-LUC-C6 prostate cancer model using SCID mice. Metastases were followed by bioluminescence imaging. PC3M prostate cancer cells are highly acidic in comparison to a normal prostate cell line, indicating that reduction of intra- and peri-tumoral acidosis should inhibit metastasis formation. In vivo administration of 200 mM free base lysine increased survival and reduced metastasis. PC3M prostate cancer cells are highly glycolytic and produce large amounts of acid when compared to normal prostate cells. Administration of non-volatile buffer decreased growth of metastases and improved survival
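The pKa argument in this abstract rests on the Henderson-Hasselbalch relation. A back-of-the-envelope sketch (all concentrations and pH values below are illustrative, not taken from the paper) shows why a high-pKa free base acts as a proton sink at peritumoral pH, while a buffer with pKa near the ambient pH has the larger classical buffer capacity there:

```python
def protonated_fraction(pKa, pH):
    """Henderson-Hasselbalch: fraction of a buffer in its protonated form,
    [HA] / ([HA] + [A-]) = 1 / (1 + 10**(pH - pKa))."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

def buffer_capacity(conc, pKa, pH):
    """Van Slyke buffer capacity (per pH unit) of a monoprotic buffer."""
    ka, h = 10.0 ** -pKa, 10.0 ** -pH
    return 2.303 * conc * ka * h / (ka + h) ** 2

# At an illustrative peritumoral pH of 6.8, nearly every molecule of a
# pKa-10 free base that arrives picks up a proton (a proton sink):
lys = protonated_fraction(10.0, 6.8)      # ≈ 0.999
bicarb = protonated_fraction(6.0, 6.8)    # ≈ 0.137
# whereas classical buffer capacity peaks when pKa is near ambient pH:
cap_bicarb = buffer_capacity(0.2, 6.0, 6.8)
cap_lys = buffer_capacity(0.2, 10.0, 6.8)
```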

6. Nutritional intra-amniotic therapy increases survival in a rabbit model of fetal growth restriction

Science.gov (United States)

Illa, Miriam; Pla, Laura; Zamora, Monica; Crispi, Fatima; Gratacos, Eduard

2018-01-01

Objective To evaluate the perinatal effects of a prenatal therapy based on intra-amniotic nutritional supplementation in a rabbit model of intrauterine growth restriction (IUGR). Methods IUGR was surgically induced in pregnant rabbits at gestational day 25 by ligating 40–50% of the uteroplacental vessels of each gestational sac. At the same time, a modified parenteral nutrition solution (containing glucose, amino acids and electrolytes) was injected into the amniotic sac of nearly half of the IUGR fetuses (IUGR-T group, n = 106), whereas sham injections were performed in the rest of the fetuses (IUGR group, n = 118). A control group without IUGR induction but with sham injection was also included (n = 115). Five days after the ligation procedure, a cesarean section was performed to evaluate fetal cardiac function, survival and birth weight. Results Survival was significantly improved in the IUGR fetuses that were treated with intra-amniotic nutritional supplementation as compared to non-treated IUGR animals (survival rate: controls 71% vs. IUGR 44%, p = 0.003; IUGR-T 63% vs. IUGR 44%, p = 0.02), whereas birth weight (controls mean 43 g ± SD 9 vs. IUGR 36 g ± SD 9 vs. IUGR-T 35 g ± SD 8, p = 0.001) and fetal cardiac function were similar among the IUGR groups. Conclusion Intra-amniotic injection of a modified parenteral nutrient solution appears to be a promising therapy for reducing mortality among IUGR fetuses. These results provide an opportunity to develop new intra-amniotic nutritional strategies to reach the fetus by bypassing the placental insufficiency. PMID:29466434
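The survival-rate comparisons reported here (e.g. controls 71% vs. IUGR 44%) are classic two-proportion tests. A sketch using the pooled normal approximation, with event counts back-calculated approximately from the reported percentages and group sizes, so the p-value will not exactly match the paper's:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test (pooled normal approximation)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF via erf
    p_two_sided = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p_two_sided

# Approximate counts reconstructed from the reported rates:
# controls ~82/115 (71%) vs. untreated IUGR ~52/118 (44%).
z, p = two_proportion_z(82, 115, 52, 118)
```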

7. SURVIVAL OF MICROORGANISMS FROM MODERN PROBIOTICS IN MODEL CONDITIONS OF THE INTESTINE

Directory of Open Access Journals (Sweden)

Kabluchko TV

2017-03-01

Full Text Available Introduction. The state of the intestinal microflora affects the work of the whole organism. When the composition of the normal intestinal microflora changes, its restoration is required. A wide variety of probiotic drugs are available on the market today which can be used to solve this problem. Most bacteria with probiotic properties belong to the genera Lactobacillus and Bifidobacterium, which have poor resistance to the acidic content of the stomach and the toxic effects of bile salts. Various studies have clearly shown that in a person with normal acid and bile secretion, lactobacilli and bifidobacteria are not detected after passage through the duodenum, i.e., they perish before reaching the small intestine. In this study we compared the survival of the different microorganisms contained in 9 probiotic drugs in a model of the gastric and intestinal environments. Material and methods. In the laboratory of the SI "Mechnikov Institute of Microbiology and Immunology, National Academy of Medical Sciences of Ukraine", in vitro experiments were performed to test the ability of the probiotic bacteria contained in 9 probiotic drugs to survive exposure to model environments of the stomach and duodenum. Bacillus coagulans persistence under the simulated stomach and duodenum environments was likewise evaluated, assessed by CFU counts after incubation on culture medium. The following were studied: Lactobacillus acidophilus, Lactobacillus rhamnosus, Lactobacillus reuteri, Lactobacillus casei, Lactobacillus plantarum, Lactobacillus bulgaricus, Bifidobacterium bifidum, Bifidobacterium longum, Bifidobacterium breve, Bifidobacterium infantis, Bifidobacterium animalis subsp. lactis BB-12, Saccharomyces boulardii, Bacillus coagulans, Bacillus clausii, Enterococcus faecium. Microorganisms were incubated for 3 hours in a model environment of the stomach (pepsin 3 g/l, hydrochloric acid 160 mmol/l, pH 2
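Survival under a lethal gastric challenge like this is commonly summarized with a log-linear inactivation model, where the decimal reduction time (D-value) is the exposure needed for a tenfold drop in viable count. A sketch with invented numbers (the study reports CFU counts, not D-values):

```python
def surviving_cfu(n0, minutes, d_value):
    """Log-linear inactivation: every D-value of exposure cuts the viable
    count tenfold, so N(t) = N0 * 10**(-t / D)."""
    return n0 * 10.0 ** (-minutes / d_value)

# Hypothetical strain: 1e9 CFU dosed, D-value of 45 min at pH 2,
# 180 min of simulated gastric residence -> a 4-log reduction.
remaining = surviving_cfu(1e9, 180.0, 45.0)   # 1e5 CFU remain
```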

8. Analyzing multivariate survival data using composite likelihood and flexible parametric modeling of the hazard functions

DEFF Research Database (Denmark)

Nielsen, Jan; Parner, Erik

2010-01-01

In this paper, we model multivariate time-to-event data by a composite likelihood of pairwise frailty likelihoods and marginal hazards using natural cubic splines. Both right- and interval-censored data are considered. The suggested approach is applied to two types of family studies using the gamma...

9. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

Science.gov (United States)

Stamovlasis, Dimitrios; Tsaparlis, Georgios

2012-01-01

In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…

10. Methodology for Applying Cyber Security Risk Evaluation from BN Model to PSA Model

Energy Technology Data Exchange (ETDEWEB)

Shin, Jin Soo; Heo, Gyun Young [Kyung Hee University, Youngin (Korea, Republic of); Kang, Hyun Gook [KAIST, Dajeon (Korea, Republic of); Son, Han Seong [Joongbu University, Chubu (Korea, Republic of)

2014-08-15

There are several advantages to using digital equipment, such as cost, convenience, and availability, and it is inevitable that digital I and C equipment will replace analog systems. Nuclear facilities have already started applying digital systems to the I and C system. However, nuclear facilities must make this change with care, because digital equipment raises difficulties regarding the required high level of safety, irradiation embrittlement, and cyber security. Cyber security, one of the important concerns in using digital equipment, can affect the whole integrity of nuclear facilities. For instance, cyber-attacks on nuclear facilities have occurred, such as the SQL Slammer worm, Stuxnet, Duqu, and Flame. The regulatory authorities have published many regulatory requirement documents, such as U.S. NRC Regulatory Guides 5.71 and 1.152, IAEA guide NSS-17, IEEE standards, and the KINS Regulatory Guide. One of the important problems in cyber security research for nuclear facilities is the difficulty of obtaining data through penetration experiments. Therefore, we build a cyber security risk evaluation model with a Bayesian network (BN) for the nuclear reactor protection system (RPS), one of the safety-critical systems that trips the reactor when an accident occurs at the facility. BN can be used to overcome these problems. We propose a method to apply the BN cyber security model to the probabilistic safety assessment (PSA) model, which has been used for safety assessment of the systems, structures and components of a facility. The proposed method will be able to provide insight into safety as well as cyber risk to the facility.

11. Methodology for Applying Cyber Security Risk Evaluation from BN Model to PSA Model

International Nuclear Information System (INIS)

Shin, Jin Soo; Heo, Gyun Young; Kang, Hyun Gook; Son, Han Seong

2014-01-01

There are several advantages to using digital equipment, such as cost, convenience, and availability, and it is inevitable that digital I and C equipment will replace analog systems. Nuclear facilities have already started applying digital systems to the I and C system. However, nuclear facilities must make this change with care, because digital equipment raises difficulties regarding the required high level of safety, irradiation embrittlement, and cyber security. Cyber security, one of the important concerns in using digital equipment, can affect the whole integrity of nuclear facilities. For instance, cyber-attacks on nuclear facilities have occurred, such as the SQL Slammer worm, Stuxnet, Duqu, and Flame. The regulatory authorities have published many regulatory requirement documents, such as U.S. NRC Regulatory Guides 5.71 and 1.152, IAEA guide NSS-17, IEEE standards, and the KINS Regulatory Guide. One of the important problems in cyber security research for nuclear facilities is the difficulty of obtaining data through penetration experiments. Therefore, we build a cyber security risk evaluation model with a Bayesian network (BN) for the nuclear reactor protection system (RPS), one of the safety-critical systems that trips the reactor when an accident occurs at the facility. BN can be used to overcome these problems. We propose a method to apply the BN cyber security model to the probabilistic safety assessment (PSA) model, which has been used for safety assessment of the systems, structures and components of a facility. The proposed method will be able to provide insight into safety as well as cyber risk to the facility

12. ePCR: an R-package for survival and time-to-event prediction in advanced prostate cancer, applied to real-world patient cohorts.

Science.gov (United States)

Laajala, Teemu D; Murtojärvi, Mika; Virkki, Arho; Aittokallio, Tero

2018-06-15

Prognostic models are widely used in clinical decision-making, such as risk stratification and tailoring treatment strategies, with the aim to improve patient outcomes while reducing overall healthcare costs. While prognostic models have been adopted into clinical use, benchmarking their performance has been difficult due to a lack of open clinical datasets. The recent DREAM 9.5 Prostate Cancer Challenge carried out an extensive benchmarking of prognostic models for metastatic Castration-Resistant Prostate Cancer (mCRPC), based on multiple cohorts of open clinical trial data. We make available an open-source implementation of the top-performing model, ePCR, along with an extended toolbox for its further re-use and development, and demonstrate how to best apply the implemented model to real-world data cohorts of advanced prostate cancer patients. The open-source R-package ePCR and its reference documentation are available at the Central R Archive Network (CRAN): https://CRAN.R-project.org/package=ePCR. The R vignette provides step-by-step examples of ePCR usage. Supplementary data are available at Bioinformatics online.

13. A model of survival following pre-hospital cardiac arrest based on the Victorian Ambulance Cardiac Arrest Register.

Science.gov (United States)

Fridman, Masha; Barnes, Vanessa; Whyman, Andrew; Currell, Alex; Bernard, Stephen; Walker, Tony; Smith, Karen L

2007-11-01

This study describes the epidemiology of sudden cardiac arrest patients in Victoria, Australia, as captured via the Victorian Ambulance Cardiac Arrest Register (VACAR). We used the VACAR data to construct a new model of out-of-hospital cardiac arrest (OHCA), which was specified in accordance with observed trends. The study included all cases of cardiac arrest in Victoria attended by Victorian ambulance services during the period 2002-2005. Overall survival to hospital discharge was 3.8% among 18,827 cases of OHCA. Survival was 15.7% among 1726 bystander-witnessed, adult cardiac arrests of presumed cardiac aetiology, presenting in ventricular fibrillation or ventricular tachycardia (VF/VT), where resuscitation was attempted. In multivariate logistic regression analysis, bystander CPR, cardiac arrest (CA) location, response time, age and sex were predictors of VF/VT, which, in turn, was a strong predictor of survival. The same factors that affected VF/VT made an additional contribution to survival; however, for bystander CPR, CA location and response time this additional contribution was limited to VF/VT patients only. There was no detectable association between survival and age younger than 60 years, or between survival and response times over 15 min. The new model accounts for relationships among predictors of survival. These relationships indicate that interventions such as reduced response times and bystander CPR act in multiple ways to improve survival.
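The multivariate logistic regression behind these results models the log-odds of an outcome as a linear function of predictors. A minimal single-predictor version fit by gradient descent, on invented data loosely patterned on the bystander-CPR effect; with one binary predictor, the fitted odds ratio reproduces the empirical odds ratio:

```python
import math

def fit_logistic(xs, ys, lr=0.5, iters=20000):
    """Logistic regression (intercept + one predictor) via gradient
    descent on the average log-loss."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Invented cohort: x = 1 if bystander CPR was given, y = 1 if survived.
xs = [1] * 10 + [0] * 10
ys = [1] * 4 + [0] * 6 + [1] * 1 + [0] * 9   # 4/10 vs. 1/10 survival
b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)   # matches the empirical OR: (4/6) / (1/9) = 6
```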

14. Model for breast cancer survival: relative prognostic roles of axillary nodal status, TNM stage, estrogen receptor concentration, and tumor necrosis.

Science.gov (United States)

Shek, L L; Godolphin, W

1988-10-01

The independent prognostic effects of certain clinical and pathological variables measured at the time of primary diagnosis were assessed with Cox multivariate regression analysis. The 859 patients with primary breast cancer, on which the proportional hazards model was based, had a median follow-up of 60 months. Axillary nodal status (categorized as N0, N1-3 or N4+) was the most significant and independent factor in overall survival, but inclusion of TNM stage, estrogen receptor (ER) concentration and tumor necrosis significantly improved survival predictions. Predictions made with the model showed striking subset survival differences within stage: 5-year survival from 36% (N4+, loge[ER] = 0, marked necrosis) to 96% (N0, loge[ER] = 6, no necrosis) in TNM I, and from 0 to 70% for the same categories in TNM IV. Results of the model were used to classify patients into four distinct risk groups according to a derived hazard index. An 8-fold variation in survival was seen with the highest (greater than 3) to lowest index values (less than 1). Each hazard index level included patients with varied combinations of the above factors, but could be considered to denote the same degree of risk of breast cancer mortality. A model with ER concentration, nodal status, and tumor necrosis was found to best predict survival after disease recurrence in 369 patients, thus confirming the enduring biological significance of these factors.

15. Cisplatin Resistant Spheroids Model Clinically Relevant Survival Mechanisms in Ovarian Tumors.

Directory of Open Access Journals (Sweden)

Full Text Available The majority of ovarian tumors eventually recur in a drug resistant form. Using cisplatin sensitive and resistant cell lines assembled into 3D spheroids, we profiled gene expression and identified candidate mechanisms and biological pathways associated with cisplatin resistance. OVCAR-8 human ovarian carcinoma cells were exposed to sub-lethal concentrations of cisplatin to create a matched cisplatin-resistant cell line, OVCAR-8R. Genome-wide gene expression profiling of sensitive and resistant ovarian cancer spheroids identified 3,331 significantly differentially expressed probesets coding for 3,139 distinct protein-coding genes (FC > 2, FDR < 0.05; S2 Table). Despite significant expression changes in some transporters, including MDR1, cisplatin resistance was not associated with differences in intracellular cisplatin concentration. Cisplatin resistant cells were significantly enriched for a mesenchymal gene expression signature. OVCAR-8R resistance-derived gene sets were significantly more biased to patients with shorter survival. From the most differentially expressed genes, we derived a 17-gene expression signature that identifies ovarian cancer patients with shorter overall survival in three independent datasets. We propose that the use of cisplatin resistant cell lines in 3D spheroid models is a viable approach to gain insight into resistance mechanisms relevant to ovarian tumors in patients. Our data support the emerging concept that ovarian cancers can acquire drug resistance through an epithelial-to-mesenchymal transition.

16. Neuron-specific antioxidant OXR1 extends survival of a mouse model of amyotrophic lateral sclerosis.

Science.gov (United States)

Liu, Kevin X; Edwards, Benjamin; Lee, Sheena; Finelli, Mattéa J; Davies, Ben; Davies, Kay E; Oliver, Peter L

2015-05-01

Amyotrophic lateral sclerosis is a devastating neurodegenerative disorder characterized by the progressive loss of spinal motor neurons. While the aetiological mechanisms underlying the disease remain poorly understood, oxidative stress is a central component of amyotrophic lateral sclerosis and contributes to motor neuron injury. Recently, oxidation resistance 1 (OXR1) has emerged as a critical regulator of neuronal survival in response to oxidative stress, and is upregulated in the spinal cord of patients with amyotrophic lateral sclerosis. Here, we tested the hypothesis that OXR1 is a key neuroprotective factor during amyotrophic lateral sclerosis pathogenesis by crossing a new transgenic mouse line that overexpresses OXR1 in neurons with the SOD1(G93A) mouse model of amyotrophic lateral sclerosis. Interestingly, we report that overexpression of OXR1 significantly extends survival, improves motor deficits, and delays pathology in the spinal cord and in muscles of SOD1(G93A) mice. Furthermore, we find that overexpression of OXR1 in neurons significantly delays non-cell-autonomous neuroinflammatory response, classic complement system activation, and STAT3 activation through transcriptomic analysis of spinal cords of SOD1(G93A) mice. Taken together, these data identify OXR1 as the first neuron-specific antioxidant modulator of pathogenesis and disease progression in SOD1-mediated amyotrophic lateral sclerosis, and suggest that OXR1 may serve as a novel target for future therapeutic strategies. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain.

17. Method for widespread microRNA-155 inhibition prolongs survival in ALS-model mice

Science.gov (United States)

Koval, Erica D.; Shaner, Carey; Zhang, Peter; du Maine, Xavier; Fischer, Kimberlee; Tay, Jia; Chau, B. Nelson; Wu, Gregory F.; Miller, Timothy M.

2013-01-01

microRNAs (miRNAs) are dysregulated in a variety of disease states, suggesting that this newly discovered class of gene expression repressors may be viable therapeutic targets. A microarray of miRNA changes in ALS-model superoxide dismutase 1 (SOD1)G93A rodents identified 12 miRNAs as significantly changed. Six miRNAs tested in human ALS tissues were confirmed increased. Specifically, miR-155 was increased 5-fold in mice and 2-fold in human spinal cords. To test miRNA inhibition in the central nervous system (CNS) as a potential novel therapeutic, we developed oligonucleotide-based miRNA inhibitors (anti-miRs) that could inhibit miRNAs throughout the CNS and in the periphery. Anti-miR-155 caused global derepression of targets in peritoneal macrophages and, following intraventricular delivery, demonstrated widespread functional distribution in the brain and spinal cord. After treating SOD1G93A mice with anti-miR-155, we significantly extended survival by 10 days and disease duration by 15 days (38%) while a scrambled control anti-miR did not significantly improve survival or disease duration. Therefore, antisense oligonucleotides may be used to successfully inhibit miRNAs throughout the brain and spinal cord, and miR-155 is a promising new therapeutic target for human ALS. PMID:23740943

18. Bayesian Analysis for Dynamic Generalized Linear Latent Model with Application to Tree Survival Rate

Directory of Open Access Journals (Sweden)

Yu-sheng Cheng

2014-01-01

Full Text Available The logistic regression model is the most popular regression technique available for modeling categorical data, especially dichotomous variables. The classic logistic regression model is typically used to interpret the relationship between response variables and explanatory variables. However, in real applications, most data sets are collected in follow-up studies, which leads to temporal correlation among the data. In order to characterize the correlations among the different variables, a new method based on latent variables is introduced in this study. At the same time, latent variables following an AR(1) model are used to capture the time dependence. In the framework of Bayesian analysis, parameter estimation and statistical inference are carried out via a Gibbs sampler with a Metropolis-Hastings (MH) algorithm. Model comparison based on the Bayes factor, and forecasting/smoothing of tree survival rates, are established. A simulation study is conducted to assess the performance of the proposed method, and a pika data set is analyzed to illustrate the real application. Since Bayes factor approaches vary significantly, efficiency tests have been performed to decide which solution provides the better tool for the analysis of real relational data sets.

19. Practical Findings from Applying the PSD Model for Evaluating Software Design Specifications

Science.gov (United States)

Räisänen, Teppo; Lehto, Tuomas; Oinas-Kukkonen, Harri

This paper presents practical findings from applying the PSD model to evaluating the support for persuasive features in software design specifications for a mobile Internet device. On the one hand, our experiences suggest that the PSD model fits relatively well for evaluating design specifications. On the other hand, the model would benefit from more specific heuristics for evaluating each technique to avoid unnecessary subjectivity. Better distinction between the design principles in the social support category would also make the model easier to use. Practitioners who have no theoretical background can apply the PSD model to increase the persuasiveness of the systems they design. The greatest benefit of the PSD model for researchers designing new systems may be achieved when it is applied together with a sound theory, such as the Elaboration Likelihood Model. Using the ELM together with the PSD model, one may increase the chances for attitude change.

20. A comparative study of generalized linear mixed modelling and artificial neural network approach for the joint modelling of survival and incidence of Dengue patients in Sri Lanka

Science.gov (United States)

Hapugoda, J. C.; Sooriyarachchi, M. R.

2017-09-01

Survival time of patients with a disease and the incidence of that particular disease (a count) are frequently observed in medical studies with data of a clustered nature. In many cases, though, the survival times and the counts can be correlated, such that diseases that occur rarely have shorter survival times, or vice versa. Due to this fact, joint modelling of these two variables will provide interesting and certainly improved results compared with modelling them separately. The authors have previously proposed a methodology using Generalized Linear Mixed Models (GLMM), joining the Discrete Time Hazard model with the Poisson Regression model, to jointly model survival and count data. As the Artificial Neural Network (ANN) has become a most powerful computational tool for modelling complex non-linear systems, it was proposed to develop a new joint model of survival and count of Dengue patients in Sri Lanka using that approach. Thus, the objective of this study is to develop a model using the ANN approach and compare the results with the previously developed GLMM model. As the response variables are continuous in nature, the Generalized Regression Neural Network (GRNN) approach was adopted to model the data. To compare model fit, measures such as the root mean square error (RMSE), absolute mean error (AME) and correlation coefficient (R) were used. The measures indicate that the GRNN model fits the data better than the GLMM model.
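A GRNN is essentially Gaussian-kernel-weighted regression (the Nadaraya-Watson estimator), and the three fit measures named in the abstract are straightforward to compute. A sketch on invented one-dimensional data, with AME implemented here as the mean absolute error:

```python
import math

def grnn_predict(train_x, train_y, x, sigma=0.5):
    """GRNN point prediction: Gaussian-kernel weighted average of the
    training targets (the Nadaraya-Watson estimator)."""
    w = [math.exp(-(x - xi) ** 2 / (2.0 * sigma ** 2)) for xi in train_x]
    return sum(wi * yi for wi, yi in zip(w, train_y)) / sum(w)

def rmse(y, yhat):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def mae(y, yhat):  # "absolute mean error" (AME) in the abstract's terms
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def pearson_r(y, yhat):
    n = len(y)
    my, mh = sum(y) / n, sum(yhat) / n
    num = sum((a - my) * (b - mh) for a, b in zip(y, yhat))
    den = math.sqrt(sum((a - my) ** 2 for a in y)
                    * sum((b - mh) ** 2 for b in yhat))
    return num / den

# Invented data, roughly y = x:
train_x = [0.0, 1.0, 2.0, 3.0, 4.0]
train_y = [0.1, 1.1, 1.9, 3.2, 3.9]
preds = [grnn_predict(train_x, train_y, x) for x in train_x]
fit = (rmse(train_y, preds), mae(train_y, preds), pearson_r(train_y, preds))
```

The bandwidth `sigma` plays the role of the GRNN's single smoothing parameter; in practice it is chosen by cross-validation, which the study's comparison against the GLMM would also require.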

1. A new semi-supervised learning model combined with Cox and SP-AFT models in cancer survival analysis.

Science.gov (United States)

Chai, Hua; Li, Zi-Na; Meng, De-Yu; Xia, Liang-Yong; Liang, Yong

2017-10-12

Gene selection is an attractive and important task in cancer survival analysis. Most existing supervised learning methods can only use the labeled biological data, while the censored data (weakly labeled data), which far exceed the labeled data, are ignored in model building. To utilize the information in the censored data, a semi-supervised learning framework (the Cox-AFT model) combining the Cox proportional hazards (Cox) and accelerated failure time (AFT) models has been used in cancer research; it has better performance than the single Cox or AFT model. This method, however, is easily affected by noise. To alleviate this problem, in this paper we combine the Cox-AFT model with the self-paced learning (SPL) method to more effectively employ the information in the censored data in a self-learning way. SPL is a reliable and stable learning mechanism, recently proposed for simulating the human learning process, which helps the AFT model automatically identify and include samples of high confidence in training, minimizing interference from high noise. Utilizing the SPL method produces two direct advantages: (1) the utilization of censored data is further promoted; (2) the noise delivered to the model is greatly decreased. The experimental results demonstrate the effectiveness of the proposed model compared to the traditional Cox-AFT model.
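The self-paced learning mechanism described here — admit only samples whose current loss is below an age parameter lambda, refit, then relax lambda — can be shown on the simplest possible estimator. This toy sketch estimates a location parameter in the presence of one noisy sample; it is not the authors' Cox-AFT-SPL algorithm:

```python
def self_paced_mean(values, lam0=1.0, growth=2.0, rounds=5):
    """Self-paced estimation of a location parameter: only samples whose
    squared loss is under the age parameter lambda are admitted; lambda
    grows each round so harder samples can join later."""
    est = sum(values) / len(values)      # warm start on all samples
    lam = lam0
    for _ in range(rounds):
        selected = [v for v in values if (v - est) ** 2 < lam]
        if selected:
            est = sum(selected) / len(selected)
        lam *= growth
    return est

data = [1.0, 1.1, 0.9, 1.05, 0.95, 8.0]   # one high-noise observation
est = self_paced_mean(data)               # settles near 1.0, ignoring the outlier
```

The naive mean of `data` is pulled to about 2.17 by the noisy sample; the self-paced loop keeps that sample out because its loss never falls under lambda within the schedule, which is the noise-suppression effect the abstract describes.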

2. Nest survival modelling using a multi-species approach in forests managed for timber and biofuel feedstock

Science.gov (United States)

Loman, Zachary G.; Monroe, Adrian; Riffell, Samuel K.; Miller, Darren A.; Vilella, Francisco; Wheat, Bradley R.; Rush, Scott A.; Martin, James A.

2018-01-01

Switchgrass (Panicum virgatum) intercropping is a novel forest management practice for biomass production intended to generate cellulosic feedstocks within intensively managed loblolly pine-dominated landscapes. These pine plantations are important for early-successional bird species, as short rotation times continually maintain early-successional habitat. We tested the efficacy of using community models compared to individual surrogate species models in understanding influences on nest survival. We analysed nest data to test for differences in habitat use for 14 bird species in plots managed for switchgrass intercropping and controls within loblolly pine (Pinus taeda) plantations in Mississippi, USA. We adapted hierarchical models using hyper-parameters to incorporate information from both common and rare species to understand community-level nest survival. This approach incorporates rare species that are often discarded due to low sample sizes, but can inform community-level demographic parameter estimates. We illustrate use of this approach in generating both species-level and community-wide estimates of daily survival rates for songbird nests. We were able to include rare species with low sample size (minimum n = 5) to inform a hyper-prior, allowing us to estimate effects of covariates on daily survival at the community level, then compare this with a single-species approach using surrogate species. Using single-species models, we were unable to generate estimates below a sample size of 21 nests per species. Community model species-level survival and parameter estimates were similar to those generated by five single-species models, with improved precision in community model parameters. Covariates of nest placement indicated that switchgrass at the nest site (<4 m) reduced daily nest survival, although intercropping at the forest stand level increased daily nest survival. Synthesis and applications. Community models represent a viable
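The daily survival rate (DSR) these hierarchical models estimate generalizes the classic Mayfield estimator, in which DSR is one minus failures per exposure-day and whole-period survival is DSR raised to the period length. The basic arithmetic, with invented exposure data:

```python
def mayfield_dsr(exposure_days, failures):
    """Mayfield estimate of daily nest survival rate (DSR)."""
    return 1.0 - failures / exposure_days

def period_survival(dsr, period_days):
    """Probability a nest survives an entire nesting period."""
    return dsr ** period_days

# Invented field data: 500 nest-days of exposure, 20 failed nests,
# 25-day nesting period.
dsr = mayfield_dsr(500.0, 20)        # 0.96
p_nest = period_survival(dsr, 25)    # ≈ 0.36
```

The hierarchical models in the abstract go further by letting covariates act on DSR and by pooling rare species through a hyper-prior, but the quantity being estimated is this same daily rate.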

3. A comparison of economic evaluation models as applied to geothermal energy technology

Science.gov (United States)

Ziman, G. M.; Rosenberg, L. S.

1983-01-01

Several cost estimation and financial cash flow models have been applied to a series of geothermal case studies. In order to draw conclusions about the relative performance and applicability of these models to geothermal projects, the consistency of results was assessed. The model outputs of principal interest in this study were net present value, internal rate of return, or levelized breakeven price. The models used were VENVAL, a venture analysis model; the Geothermal Probabilistic Cost Model (GPC Model); the Alternative Power Systems Economic Analysis Model (APSEAM); the Geothermal Loan Guarantee Cash Flow Model (GCFM); and the GEOCOST and GEOCITY geothermal models. The case studies to which the models were applied include a geothermal reservoir at Heber, CA; a geothermal electric power plant to be located at the Heber site; an alcohol fuels production facility to be built at Raft River, ID; and a direct-use, district heating system in Susanville, CA.
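The model outputs compared in this study — net present value and internal rate of return — reduce to discounted cash flow arithmetic. A minimal sketch with an invented cash-flow stream, with IRR found by bisection on the NPV function:

```python
def npv(rate, cashflows):
    """Net present value of a cash-flow stream; cashflows[0] is at time 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, iters=100):
    """Internal rate of return by bisection on NPV(rate) = 0; assumes a
    conventional stream (one sign change), so NPV decreases with rate."""
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if npv(mid, cashflows) > 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Invented project: 1000 up front, then four years of revenue.
cashflows = [-1000.0, 300.0, 400.0, 500.0, 600.0]
rate = irr(cashflows)   # ≈ 0.249
```

The models compared in the study wrap this arithmetic in cost estimation, tax, and financing detail, which is exactly where their results can diverge.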

4. Structural Modeling and Analysis of a Wave Energy Converter Applying Dynamical Substructuring Method

DEFF Research Database (Denmark)

Zurkinden, Andrew Stephen; Damkilde, Lars; Gao, Zhen

2013-01-01

This paper deals with structural modeling and analysis of a wave energy converter. The device, called Wavestar, is a bottom-fixed structure located in a shallow water environment at the Danish northwest coast. The analysis is concentrated on a single float and its structural arm, which connects the WEC to a jack-up structure. The wave energy converter is characterized by having an operational and a survival mode. The survival mode drastically reduces the exposure to waves and therefore the wave loads. Structural response analysis of the Wavestar arm is carried out in this study. Due to the relatively stiff behavior of the arm, the calculation can be reduced to a quasi-static analysis. The hydrodynamic and the structural analyses are thus performed separately. In order to reduce the computational time of the finite element calculation, the main structure is modeled as a superelement...

5. Modelling of the process of micromycetus survival in fruit and berry syrups

Directory of Open Access Journals (Sweden)

L. Osipova

2017-06-01

Full Text Available In order to develop methods for preserving fruit and berry syrups that avoid high-temperature sterilization and preservatives, the survival of spores of micromycetes (B. nivea molds) was studied in model media with different concentrations of osmotically active food substances (sucrose, ethyl alcohol, citric acid) at concentrations lethal to microorganisms. It has been established that model media (juice-based syrups from blueberries) with a mass content of 4 % and 6 % alcohol, 50 % sucrose, and 1 % and 2 % titrated acids have a lethal effect on spores of B. nivea molds. A regression equation was obtained expressing the dependence of the number of B. nivea mold spores on the concentration of sucrose, acid and alcohol and on the storage time of the syrups. The form of the dependence and the direction of the relationship between the variables were established: a negative linear regression, expressed as a uniform decrease of the function. The quality of the resulting regression model was assessed, and the deviations of the calculated data from the initial data set were computed. The proposed model is sufficiently reliable, since the regression function is defined, interpreted and justified, and the accuracy of the regression analysis meets the requirements.
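The negative linear regression described above can be sketched with ordinary least squares on a single predictor. The spore-count data below are purely illustrative, not the syrup measurements from the study:

```python
# Ordinary least squares for a simple one-predictor linear regression,
# of the kind used to express spore counts as a decreasing function of
# storage time. Fitted by the closed-form normal equations.

def ols_simple(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

# Hypothetical spore counts (log CFU) at storage times 0..4 months.
times = [0, 1, 2, 3, 4]
counts = [5.0, 4.1, 3.2, 2.2, 1.4]
a, b = ols_simple(times, counts)
print(round(a, 2), round(b, 2))  # negative slope -> uniform decrease
```

A negative slope corresponds to the "uniform decrease of the function" the authors describe; the study's actual model is multivariate (sucrose, acid, alcohol, time), which extends the same least-squares idea.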

6. Regression modeling strategies with applications to linear models, logistic and ordinal regression, and survival analysis

CERN Document Server

Harrell, Jr, Frank E

2015-01-01

This highly anticipated second edition features new chapters and sections, 225 new references, and comprehensive R software. In keeping with the previous edition, this book is about the art and science of data analysis and predictive modeling, which entails choosing and using multiple tools. Instead of presenting isolated techniques, this text emphasizes problem solving strategies that address the many issues arising when developing multivariable models using real data and not standard textbook examples. It includes imputation methods for dealing with missing data effectively, methods for fitting nonlinear relationships and for making the estimation of transformations a formal part of the modeling process, methods for dealing with "too many variables to analyze and not enough observations," and powerful model validation techniques based on the bootstrap.  The reader will gain a keen understanding of predictive accuracy, and the harm of categorizing continuous predictors or outcomes.  This text realistically...

7. Single-incision laparoscopic surgery in a survival animal model using a transabdominal magnetic anchoring system.

Science.gov (United States)

Cho, Yong Beom; Park, Chan Ho; Kim, Hee Cheol; Yun, Seong Hyeon; Lee, Woo Yong; Chun, Ho-Kyung

2011-12-01

Though single-incision laparoscopic surgery (SILS) can reduce operative scarring and facilitate postoperative recovery, it has some limitations, such as restricted instrument working space, difficulty in triangulation, and collision of instruments. To overcome these limitations, development of new instruments is needed. The aim of this study was to evaluate the feasibility and safety of a magnetic anchoring system in performing SILS ileocecectomy. Experiments were performed in a living dog model. Five dogs (26.3-29.2 kg) underwent ileocecectomy using a multichannel single port (OCTO port; Darim, Seoul, Korea). The port was inserted at the umbilicus and maintained a CO(2) pneumoperitoneum. Two magnet-fixated vascular clips were attached to the colon using an endoclip applicator and held in place across the abdominal wall by an external handheld magnet. The cecum was then retracted in an upward direction by moving the external handheld magnet, and the mesocolon was dissected with Ultracision(®). Extracorporeal functional end-to-end anastomosis was done using a linear stapler. All animals survived the observational period of 2 weeks, after which re-exploration was performed under general anesthesia to evaluate intra-abdominal healing and complications. Mean operation time was 70 min (range 55-100 min), with each subsequent case taking less time. The magnetic anchoring system was effective in achieving adequate exposure in all cases. All animals survived and convalesced normally without evidence of clinical complications during the observation period. At re-exploration, all anastomoses were completely healed and there were no complications such as abscess, bleeding or organ injury. SILS ileocecectomy using a magnetic anchoring system was safe and effective in a dog model. The development of magnetic anchoring systems may be beneficial for overcoming the limitations of SILS.

8. Increased survival rate by local release of diclofenac in a murine model of recurrent oral carcinoma

Directory of Open Access Journals (Sweden)

Will OM

2016-10-01

Full Text Available Olga Maria Will,1,* Nicolai Purcz,2,* Athena Chalaris,3 Carola Heneweer,4,5 Susann Boretius,1 Larissa Purcz,2 Lila Nikkola,6 Nureddin Ashammakhi,6 Holger Kalthoff,7 Claus-Christian Glüer,1 Jörg Wiltfang,2 Yahya Açil,2 Sanjay Tiwari1 1Section Biomedical Imaging, Clinic for Radiology and Neuroradiology, MOIN CC, 2Department of Oral and Maxillofacial Surgery, University Hospital Schleswig-Holstein, 3Institute of Biochemistry, Christian-Albrechts-Universität zu Kiel, 4Clinic for Radiology and Neuroradiology, University Hospital Schleswig-Holstein, Kiel, 5Institute for Diagnostic and Interventional Radiology, University Hospital Cologne, Cologne, Germany; 6Department of Biomedical Engineering, Tampere University of Technology, Tampere, Finland; 7Institute for Experimental Cancer Research, University Hospital Schleswig-Holstein, Kiel, Germany *These authors contributed equally to this work Abstract: Despite aggressive treatment with radiation and combination chemotherapy following tumor resection, the 5-year survival rate for patients with head and neck cancer is at best only 50%. In this study, we examined the therapeutic potential of localized release of diclofenac from electrospun nanofibers generated from poly(d,l-lactide-co-glycolide) polymer. Diclofenac was chosen since anti-inflammatory agents that inhibit cyclooxygenase have shown great potential in their ability to directly inhibit tumor growth as well as suppress inflammation-mediated tumor growth. A mouse resection model of oral carcinoma was developed by establishing tumor growth in the oral cavity by ultrasound-guided injection of 1 million SCC-9 cells in the floor of the mouth. Following resection, mice were allocated into four groups with the following treatment: 1) no treatment, 2) implanted scaffolds without diclofenac, 3) implanted scaffolds loaded with diclofenac, and 4) diclofenac given orally. Small animal ultrasound and magnetic resonance imaging were utilized for longitudinal...

9. Modeling segregated in- situ combustion processes through a vertical displacement model applied to a Colombian field

International Nuclear Information System (INIS)

Guerra Aristizabal, Jose Julian; Grosso Vargas, Jorge Luis

2005-01-01

The incorporation of horizontal-well technologies into thermal EOR processes such as in-situ combustion (ISC) has recently been proposed. This has led to the conception of new recovery mechanisms, named here segregated in-situ combustion processes, which are conventional in-situ combustion processes with a segregated-flow component. Top-Down Combustion, Combustion Override Split-production Horizontal-well, and Toe-to-Heel Air Injection are three of these processes, which incorporate horizontal producers and gravity-drainage phenomena. When applied to thick reservoirs, a process of this nature can be reasonably modeled using concepts of conventional in-situ combustion and crestal gas injection, especially for heavy oils that are mobile at reservoir conditions. Such a process has been studied through an analytic model conceived for the particular conditions of the Castilla field, a homogeneous thick anticline structure containing high-mobility heavy oil, which appears to be an excellent candidate for the application of these technologies.

10. Ensemble survival tree models to reveal pairwise interactions of variables with time-to-events outcomes in low-dimensional setting

Science.gov (United States)

Dazard, Jean-Eudes; Ishwaran, Hemant; Mehlotra, Rajeev; Weinberg, Aaron; Zimmerman, Peter

2018-01-01

Unraveling interactions among variables such as genetic, clinical, demographic and environmental factors is essential to understand the development of common and complex diseases. To increase the power to detect such variable interactions associated with clinical time-to-event outcomes, we borrowed established concepts from random survival forest (RSF) models. We introduce a novel RSF-based pairwise interaction estimator and derive a randomization method with bootstrap confidence intervals for inferring interaction significance. Using various linear and nonlinear time-to-event survival models in simulation studies, we first show the efficiency of our approach: true pairwise interaction-effects between variables are uncovered, while they may not be accompanied by their corresponding main-effects, and may not be detected by standard semi-parametric regression modeling and test statistics used in survival analysis. Moreover, using a RSF-based cross-validation scheme for generating prediction estimators, we show that informative predictors may be inferred. We applied our approach to an HIV cohort study recording key host gene polymorphisms and their association with HIV change of tropism or AIDS progression. Altogether, this shows how linear or nonlinear pairwise statistical interactions of variables may be efficiently detected with a predictive value in observational studies with time-to-event outcomes. PMID:29453930
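The percentile bootstrap underlying the paper's confidence intervals can be sketched in its generic form. The RSF-based interaction estimator itself is not reproduced here; the statistic (a plain mean) and the data below are illustrative only:

```python
import random

# Percentile bootstrap confidence interval for an arbitrary statistic:
# resample the data with replacement, recompute the statistic, and take
# empirical quantiles of the bootstrap replicates.

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    reps = []
    for _ in range(n_boot):
        sample = [rng.choice(data) for _ in data]  # resample with replacement
        reps.append(stat(sample))
    reps.sort()
    lo = reps[int((alpha / 2) * n_boot)]
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

data = [2.1, 3.4, 1.9, 4.0, 2.8, 3.1, 2.5, 3.7]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(data, mean)
print(lo <= mean(data) <= hi)
```

In the paper, the resampled statistic would be the RSF pairwise-interaction estimate rather than a mean, but the interval construction follows the same pattern.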

11. Development and external validation of a risk-prediction model to predict 5-year overall survival in advanced larynx cancer

NARCIS (Netherlands)

Petersen, Japke F.; Stuiver, Martijn M.; Timmermans, Adriana J.; Chen, Amy; Zhang, Hongzhen; O'Neill, James P.; Deady, Sandra; Vander Poorten, Vincent; Meulemans, Jeroen; Wennerberg, Johan; Skroder, Carl; Day, Andrew T.; Koch, Wayne; van den Brekel, Michiel W. M.

2017-01-01

TNM-classification inadequately estimates patient-specific overall survival (OS). We aimed to improve this by developing a risk-prediction model for patients with advanced larynx cancer. Cohort study. We developed a risk prediction model to estimate the 5-year OS rate based on a cohort of 3,442

12. Possibilities and limitations of applying software reliability growth models to safety-critical software

International Nuclear Information System (INIS)

Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

2007-01-01

It is generally known that software reliability growth models such as the Jelinski-Moranda model and the Goel-Okumoto Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of a piece of software's reliability to software failure data, as well as a lack of sufficient software failure data, is also identified as a possible limitation when applying software reliability growth models to safety-critical software.
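The Goel-Okumoto NHPP mentioned above has the mean value function m(t) = a(1 - e^(-bt)), where a is the finite expected total fault count and b the per-fault detection rate. A minimal sketch with hypothetical parameter values, not estimates fitted to real failure data:

```python
import math

# Goel-Okumoto NHPP: expected cumulative failures detected by time t is
# m(t) = a * (1 - exp(-b * t)). The expected residual fault count at
# time t is then a - m(t).

def go_mean_failures(a, b, t):
    """Expected cumulative failures by time t under Goel-Okumoto."""
    return a * (1.0 - math.exp(-b * t))

a, b = 100.0, 0.05            # hypothetical parameters
detected = go_mean_failures(a, b, 40.0)
remaining = a - detected      # expected residual faults at t = 40
print(round(detected, 1), round(remaining, 1))
```

The data-scarcity problem the record describes shows up here as uncertainty in estimating a and b: with few observed failures, widely different (a, b) pairs fit the data almost equally well.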

13. Review and evaluation of performance measures for survival prediction models in external validation settings

Directory of Open Access Journals (Sweden)

M. Shafiqur Rahman

2017-04-01

Full Text Available Abstract Background When developing a prediction model for survival data it is essential to validate its performance in external validation settings using appropriate performance measures. Although a number of such measures have been proposed, there is only limited guidance regarding their use in the context of model validation. This paper reviewed and evaluated a wide range of performance measures to provide some guidelines for their use in practice. Methods An extensive simulation study based on two clinical datasets was conducted to investigate the performance of the measures in external validation settings. Measures were selected from categories that assess the overall performance, discrimination and calibration of a survival prediction model. Some of these have been modified to allow their use with validation data, and a case study is provided to describe how these measures can be estimated in practice. The measures were evaluated with respect to their robustness to censoring and ease of interpretation. All measures are implemented, or are straightforward to implement, in statistical software. Results Most of the performance measures were reasonably robust to moderate levels of censoring. One exception was Harrell’s concordance measure, which tended to increase as censoring increased. Conclusions We recommend that Uno’s concordance measure is used to quantify concordance when there are moderate levels of censoring. Alternatively, Gönen and Heller’s measure could be considered, especially if censoring is very high, but we suggest that the prediction model is re-calibrated first. We also recommend that Royston’s D is routinely reported to assess discrimination since it has an appealing interpretation. The calibration slope is useful for both internal and external validation settings, and we recommend reporting it routinely. Our recommendation would be to use any of the predictive accuracy measures and provide the corresponding predictive...
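Harrell's concordance measure discussed above counts, among usable pairs of subjects, how often the one predicted to be at higher risk actually fails earlier; a pair is usable only if the shorter observed time is an event. A minimal sketch (tied times are simply skipped, and the data are illustrative):

```python
# Harrell's concordance index (C-index) for right-censored survival data.

def harrell_c(times, events, risk):
    """times: follow-up times; events: 1=event, 0=censored;
    risk: predicted risk scores (higher = expected earlier failure)."""
    concordant, usable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so ti <= tj
            (ti, ei, ri), (tj, ej, rj) = sorted(
                [(times[i], events[i], risk[i]),
                 (times[j], events[j], risk[j])])
            if ti == tj or not ei:
                continue  # unusable pair: tied times, or earlier time censored
            usable += 1
            if ri > rj:
                concordant += 1.0   # higher risk failed earlier: concordant
            elif ri == rj:
                concordant += 0.5   # tied risk scores count half
    return concordant / usable

times = [5, 8, 3, 12, 6]
events = [1, 0, 1, 1, 1]
risk = [0.9, 0.3, 1.2, 0.1, 0.05]
print(round(harrell_c(times, events, risk), 3))
```

The censoring sensitivity the paper reports arises because censored subjects drop pairs from the usable set in a time-dependent way, which is what Uno's inverse-probability-of-censoring weighting corrects.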

14. Applying different quality and safety models in healthcare improvement work: Boundary objects and system thinking

International Nuclear Information System (INIS)

Wiig, Siri; Robert, Glenn; Anderson, Janet E.; Pietikainen, Elina; Reiman, Teemu; Macchi, Luigi; Aase, Karina

2014-01-01

A number of theoretical models can be applied to help guide quality improvement and patient safety interventions in hospitals. However there are often significant differences between such models and, therefore, their potential contribution when applied in diverse contexts. The aim of this paper is to explore how two such models have been applied by hospitals to improve quality and safety. We describe and compare the models: (1) the Organizing for Quality (OQ) model, and (2) the Design for Integrated Safety Culture (DISC) model. We analyze the theoretical foundations of the models, and show, by using a retrospective comparative case study approach from two European hospitals, how these models have been applied to improve quality and safety. The analysis shows that differences appear in the theoretical foundations, practical approaches and applications of the models. Nevertheless, the case studies indicate that the choice between the OQ and DISC models is of less importance for guiding the practice of quality and safety improvement work, as they are both systemic and share some important characteristics. The main contribution of the models lies in their role as boundary objects directing attention towards organizational and systems thinking, culture, and collaboration.

15. Plants modify biological processes to ensure survival following carbon depletion: a Lolium perenne model.

Directory of Open Access Journals (Sweden)

Julia M Lee

Full Text Available BACKGROUND: Plants, due to their immobility, have evolved mechanisms allowing them to adapt to multiple environmental and management conditions. Short-term undesirable conditions (e.g. moisture deficit, cold temperatures) generally reduce photosynthetic carbon supply while increasing soluble carbohydrate accumulation. It is not known, however, what strategies plants may use in the long-term to adapt to situations resulting in net carbon depletion (i.e. reduced photosynthetic carbon supply and carbohydrate accumulation). In addition, many transcriptomic experiments have typically been undertaken under laboratory conditions; therefore, long-term acclimation strategies that plants use in natural environments are not well understood. METHODOLOGY/PRINCIPAL FINDINGS: Perennial ryegrass (Lolium perenne L.) was used as a model plant to define whether plants adapt to repetitive carbon depletion and to further elucidate their long-term acclimation mechanisms. Transcriptome changes in both lamina and stubble tissues of field-grown plants with depleted carbon reserves were characterised using reverse transcription-quantitative polymerase chain reaction (RT-qPCR). The RT-qPCR data for select key genes indicated that plants reduced fructan degradation, and increased photosynthesis and fructan synthesis capacities following carbon depletion. This acclimatory response was not sufficient to prevent a reduction (P<0.001) in net biomass accumulation, but ensured that the plant survived. CONCLUSIONS: Adaptations of plants with depleted carbon reserves resulted in reduced post-defoliation carbon mobilization and earlier replenishment of carbon reserves, thereby ensuring survival and continued growth. These findings will help pave the way to improve plant biomass production, for either grazing livestock or biofuel purposes.

16. Flux balance modeling to predict bacterial survival during pulsed-activity events

Science.gov (United States)

Jose, Nicholas A.; Lau, Rebecca; Swenson, Tami L.; Klitgord, Niels; Garcia-Pichel, Ferran; Bowen, Benjamin P.; Baran, Richard; Northen, Trent R.

2018-04-01

Desert biological soil crusts (BSCs) are cyanobacteria-dominated surface soil microbial communities common to plant interspaces in arid environments. The capability to significantly dampen their metabolism allows them to exist for extended periods in a desiccated dormant state that is highly robust to environmental stresses. However, within minutes of wetting, metabolic functions reboot, maximizing activity during infrequent permissive periods. Microcoleus vaginatus, a primary producer within the crust ecosystem and an early colonizer, initiates crust formation by binding particles in the upper layer of soil via exopolysaccharides, making microbial dominated biological soil crusts highly dependent on the viability of this organism. Previous studies have suggested that biopolymers play a central role in the survival of this organism by powering resuscitation, rapidly forming compatible solutes, and fueling metabolic activity in dark, hydrated conditions. To elucidate the mechanism of this phenomenon and provide a basis for future modeling of BSCs, we developed a manually curated, genome-scale metabolic model of Microcoleus vaginatus (iNJ1153). To validate this model, gas chromatography-mass spectroscopy (GC-MS) and liquid chromatography-mass spectroscopy (LC-MS) were used to characterize the rate of biopolymer accumulation and depletion in hydrated Microcoleus vaginatus under light and dark conditions. Constraint-based flux balance analysis showed agreement between model predictions and experimental reaction fluxes. A significant amount of consumed carbon and light energy is invested into storage molecules glycogen and polyphosphate, while β-polyhydroxybutyrate may function as a secondary resource. Pseudo-steady-state modeling suggests that glycogen, the primary carbon source with the fastest depletion rate, will be exhausted if M. vaginatus experiences dark wetting events 4 times longer than light wetting events.
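The pseudo-steady-state assumption at the core of flux balance analysis requires that the stoichiometric matrix S times the flux vector v be zero, i.e. no net accumulation of internal metabolites. A toy sketch of that mass-balance check (the two-metabolite network is illustrative, not the iNJ1153 model):

```python
# Core FBA constraint: at pseudo-steady state, S @ v = 0 for all
# internal metabolites.

def mass_balance(S, v):
    """Return S @ v computed in pure Python (one entry per metabolite)."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in S]

# Columns: uptake (-> A), conversion (A -> B), secretion (B ->).
S = [
    [1, -1, 0],   # metabolite A: produced by uptake, consumed by conversion
    [0, 1, -1],   # metabolite B: produced by conversion, removed by secretion
]
v = [2.0, 2.0, 2.0]          # a balanced flux distribution
print(mass_balance(S, v))    # all zeros -> steady state holds
```

A full FBA additionally maximizes a biomass objective over all v satisfying this constraint plus flux bounds, which requires a linear-programming solver and is beyond this sketch.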

17. A novel survival model of cardioplegic arrest and cardiopulmonary bypass in rats: a methodology paper

Directory of Open Access Journals (Sweden)

Podgoreanu Mihai V

2008-08-01

Full Text Available Abstract Background Given the growing population of cardiac surgery patients with impaired preoperative cardiac function and rapidly expanding surgical techniques, continued efforts to improve myocardial protection strategies are warranted. Prior research is mostly limited to either large animal models or ex vivo preparations. We developed a new in vivo survival model that combines administration of antegrade cardioplegia with endoaortic crossclamping during cardiopulmonary bypass (CPB) in the rat. Methods Sprague-Dawley rats were cannulated for CPB (n = 10). With ultrasound guidance, a 3.5 mm balloon angioplasty catheter was positioned via the right common carotid artery with its tip proximal to the aortic valve. To initiate cardioplegic arrest, the balloon was inflated and cardioplegia solution injected. After 30 min of cardioplegic arrest, the balloon was deflated, ventilation resumed, and rats were weaned from CPB and recovered. To rule out any evidence of cerebral ischemia due to right carotid artery ligation, animals were neurologically tested on postoperative day 14, and their brains histologically assessed. Results Thirty minutes of cardioplegic arrest was successfully established in all animals. Functional assessment revealed no neurologic deficits, and histology demonstrated no gross neuronal damage. Conclusion This novel small animal CPB model with cardioplegic arrest allows for both the study of myocardial ischemia-reperfusion injury as well as new cardioprotective strategies. Major advantages of this model include its overall feasibility and cost effectiveness. In future experiments long-term echocardiographic outcomes as well as enzymatic, genetic, and histologic characterization of myocardial injury can be assessed. In the field of myocardial protection, rodent models will be an important avenue of research.

18. A Research on the E-commerce Applied to the Construction of Marketing Model

Institute of Scientific and Technical Information of China (English)

2007-01-01

E-commerce is becoming more and more widely applied in many fields, which brings about new challenges and opportunities for the construction of marketing models. It has been shown that the more E-commerce is applied to the construction of marketing, the more precise the forecasts that enterprises can obtain, which is very helpful for their production and marketing. Therefore, research on applying E-commerce to the construction of marketing models is popular today. This paper applie...

19. Unified Modeling of Discrete Event and Control Systems Applied in Manufacturing

Directory of Open Access Journals (Sweden)

Amanda Arêas de Souza

2015-05-01

Full Text Available For the development of both a simulation model and a control system, it is necessary to build, in advance, a conceptual model. This is what is usually suggested by the methodologies applied in projects of this nature. Some conceptual modeling techniques allow for a better understanding of the simulation model, and a clear description of the logic of control systems. Therefore, this paper aims to present and evaluate conceptual languages for unified modeling of discrete event simulation models and control systems applied in manufacturing. The results show that the IDEF-SIM language can be applied both in simulation systems and in process control.

20. [The survival prediction model of advanced gallbladder cancer based on Bayesian network: a multi-institutional study].

Science.gov (United States)

Tang, Z H; Geng, Z M; Chen, C; Si, S B; Cai, Z Q; Song, T Q; Gong, P; Jiang, L; Qiu, Y H; He, Y; Zhai, W L; Li, S P; Zhang, Y C; Yang, Y

2018-05-01

Objective: To investigate the clinical value of Bayesian networks in predicting survival of patients with advanced gallbladder cancer (GBC) who underwent curative intent surgery. Methods: The clinical data of patients with advanced GBC who underwent curative intent surgery in 9 institutions from January 2010 to December 2015 were analyzed retrospectively. A median survival time model based on a tree-augmented naïve Bayes algorithm was established with BayesiaLab software. The survival time, number of metastatic lymph nodes (NMLN), T stage, pathological grade, margin, jaundice, liver invasion, age, sex and tumor morphology were included in this model. A confusion matrix, the receiver operating characteristic curve and the area under the curve were used to evaluate the accuracy of the model. A prior statistical analysis of these 10 variables and a posterior analysis (survival time as the target variable, the remaining factors as attribute variables) were performed. The importance rankings of the variables were calculated with the polymorphic Birnbaum importance calculation based on the posterior analysis results. A survival probability forecast table was constructed based on the top 4 prognostic factors. Survival curves were drawn by the Kaplan-Meier method, and differences in survival curves were compared using the Log-rank test. Results: A total of 316 patients were enrolled, including 109 males and 207 females (male to female ratio 1.0:1.9); the age was (62.0±10.8) years. There were 298 cases (94.3%) of R0 resection and 18 cases (5.7%) of R1 resection. T staging: 287 cases (90.8%) were T3 and 29 cases (9.2%) were T4. The median survival time (MST) was 23.77 months, and the 1-, 3- and 5-year survival rates were 67.4%, 40.8% and 32.0%, respectively. For the Bayesian model, the number of correctly predicted cases was 121 (≤23.77 months) and 115 (>23.77 months), respectively, leading to a 74.86% accuracy of the model. The prior probability of survival time was 0.5032 (≤23.77 months) and 0.4968...
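The Kaplan-Meier method used for the survival curves above is the product-limit estimator: S(t) is the product over event times t_i ≤ t of (1 - d_i / n_i), where d_i events occur among n_i subjects still at risk. A minimal sketch with hypothetical follow-up data, not the study's cohort:

```python
# Kaplan-Meier product-limit estimator for right-censored data.

def kaplan_meier(times, events):
    """Return [(event_time, survival_probability)] at each observed event.
    events: 1 = event observed, 0 = censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = n_t = 0
        while i < len(order) and times[order[i]] == t:
            d += events[order[i]]   # events at this time
            n_t += 1                # subjects leaving the risk set at t
            i += 1
        if d:
            surv *= 1.0 - d / at_risk
            curve.append((t, surv))
        at_risk -= n_t
    return curve

# Hypothetical follow-up times in months.
times = [6, 12, 12, 20, 24, 30]
events = [1, 1, 0, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Censored subjects contribute to the risk set up to their censoring time but never to the numerator d_i, which is how the estimator uses incomplete follow-up without biasing the curve.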

1. Foundations for Survivable System Development: Service Traces, Intrusion Traces, and Evaluation Models

National Research Council Canada - National Science Library

Linger, Richard

2001-01-01

... On the system side, survivability specifications can be defined by essential-service traces that map essential-service workflows, derived from user requirements, into system component dependencies...

2. Applying the ISO 9126 Model to the Evaluation of an E-learning System in Iran

OpenAIRE

2012-01-01

One of the models presented in the field of e-learning quality systems is the ISO 9126 model, which was applied in this research to evaluate the e-learning system of Amirkabir University. This model provides six main variables for evaluation, each of which is measured by several indicators. The ISO 9126 model parameters were accordingly turned into a questionnaire, which was distributed among and completed by a sample of 120 experts and students of Amirkabir University. Based on the re...

3. The comparison of proportional hazards and accelerated failure time models in analyzing the first birth interval survival data

Science.gov (United States)

Faruk, Alfensi

2018-03-01

Survival analysis is a branch of statistics focused on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular for analyzing the effects of several covariates on survival time. However, the assumption of proportional hazards in the PH model is not always satisfied by the data. Violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. On the other hand, the accelerated failure time (AFT) models do not assume proportional hazards in the survival data as the PH model does. The AFT models, moreover, can be used as an alternative to the PH model if the proportional hazards assumption is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting the first birth interval (FBI) data in Indonesia. In this work, the discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. Analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate among the models considered. Results of the best fitted model (log-normal AFT model) showed that covariates such as women’s educational level, husband’s educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
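Of the distributions compared above, the exponential model has a closed-form maximum likelihood estimate under right censoring, which makes the AIC-based comparison easy to illustrate: lambda_hat equals the number of events divided by the total time at risk, and AIC = 2k - 2·logL with k = 1. Data below are hypothetical, not the FBI data set:

```python
import math

# Closed-form censored MLE for the exponential survival model, plus AIC.
# logL = d * log(lambda) - lambda * T, where d = events, T = total time.

def exponential_fit(times, events):
    """times: follow-up times; events: 1 = event, 0 = censored."""
    d = sum(events)              # number of observed events
    total = sum(times)           # total time at risk
    lam = d / total
    loglik = d * math.log(lam) - lam * total
    aic = 2 * 1 - 2 * loglik     # one free parameter
    return lam, aic

times = [6, 12, 12, 20, 24, 30]
events = [1, 1, 0, 1, 0, 1]
lam, aic = exponential_fit(times, events)
print(round(lam, 4), round(aic, 2))
```

The Weibull and log-normal AFT fits used in the study require numerical optimization, but they are compared to this model in exactly the same way: fit each, compute its AIC, and prefer the smallest.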

4. Advanced age negatively impacts survival in an experimental brain tumor model.

Science.gov (United States)

Ladomersky, Erik; Zhai, Lijie; Gritsina, Galina; Genet, Matthew; Lauing, Kristen L; Wu, Meijing; James, C David; Wainwright, Derek A

2016-09-06

Glioblastoma (GBM) is the most common primary malignant brain tumor in adults, with an average age of 64 years at the time of diagnosis. To study GBM, a number of mouse brain tumor models have been utilized. In these animal models, subjects tend to range from 6 to 12 weeks of age, which is analogous to that of a human teenager. Here, we examined the impact of age on host immunity and the gene expression associated with immune evasion in immunocompetent mice engrafted with syngeneic intracranial GL261. The data indicate that, in mice with brain tumors, youth conveys a survival advantage. While age did not affect the tumor-infiltrating T cell phenotype or quantity, we discovered that old mice express higher levels of the immunoevasion enzyme, IDO1, which was decreased by the presence of brain tumor. Interestingly, other genes associated with promoting immunosuppression, including CTLA-4, PD-L1 and FoxP3, were unaffected by age. These data highlight the possibility that IDO1 contributes to faster GBM outgrowth with advanced age, providing rationale for future investigation into immunotherapeutic targeting. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

5. Simvastatin Treatment Improves Survival in a Murine Model of Burn Sepsis

Science.gov (United States)

Beffa, David C; Fischman, Alan J.; Fagan, Shawn P.; Hamrahi, Victoria F.; Kaneki, Masao; Yu, Yong-Ming; Tompkins, Ronald G.; Carter, Edward A.

2014-01-01

Infection is the most common and most serious complication of a major burn injury related to burn size. Despite improvements in antimicrobial therapies, sepsis still accounts for 50–60% of deaths in burn patients. Given the acute onset and unpredictable nature of sepsis, primary prevention has rarely been attempted in its management. However, recent studies have demonstrated that statin treatment can decrease mortality in a murine model of sepsis by preservation of cardiac function and reversal of inflammatory alterations. In addition, it has been shown that treatment with statins is associated with a reduced incidence of sepsis in human patients. In the current study, groups of CD1 male mice (n=12) were anesthetized and subjected to a dorsal 30% TBSA scald burn injury. Starting 2 hours post burn, the animals were divided into a treatment group receiving 0.2 µg/g simvastatin or a sham group receiving placebo. Simvastatin and placebo were administered by intraperitoneal injection with two dosing regimens: once daily and every 12 hours. On post-burn day 7, cecal ligation and puncture with a 21-gauge needle was performed under ketamine/xylazine anesthesia and the two different dosing schedules were continued. A simvastatin dose-dependent improvement in survival was observed in the burn sepsis model. PMID:21145172

6. Irreversible electroporation of the pancreas is feasible and safe in a porcine survival model.

Science.gov (United States)

Fritz, Stefan; Sommer, Christof M; Vollherbst, Dominik; Wachter, Miguel F; Longerich, Thomas; Sachsenmeier, Milena; Knapp, Jürgen; Radeleff, Boris A; Werner, Jens

2015-07-01

Use of thermal tumor ablation in the pancreatic parenchyma is limited because of the risk of pancreatitis, pancreatic fistula, or hemorrhage. This study aimed to evaluate the feasibility and safety of irreversible electroporation (IRE) in a porcine model. Ten pigs were divided into 2 study groups. In the first group, animals received IRE of the pancreatic tail and were killed after 60 minutes. In the second group, animals received IRE at the head of the pancreas and were followed up for 7 days. Clinical parameters, computed tomography imaging, laboratory results, and histology were obtained. All animals survived IRE ablation, and no cardiac adverse effects were noted. Sixty minutes after IRE, a hypodense lesion on computed tomography imaging indicated the ablation zone. None of the animals developed clinical signs of acute pancreatitis. Only small amounts of ascites fluid, with a transient increase in amylase and lipase levels, were observed, indicating that no pancreatic fistula occurred. This porcine model shows that IRE is feasible and safe in the pancreatic parenchyma. Computed tomography imaging reveals significant changes at 60 minutes after IRE and therefore might serve as an early indicator of therapeutic success. Clinical studies are needed to evaluate the efficacy of IRE in pancreatic cancer.

7. Applying MDA to SDR for Space to Model Real-time Issues

Science.gov (United States)

Blaser, Tammy M.

2007-01-01

NASA space communications systems have the challenge of designing SDRs with highly-constrained Size, Weight and Power (SWaP) resources. A study is being conducted to assess the effectiveness of applying the MDA Platform-Independent Model (PIM) and one or more Platform-Specific Models (PSM) specifically to address NASA space domain real-time issues. This paper will summarize our experiences with applying MDA to SDR for Space to model real-time issues. Real-time issues to be examined, measured, and analyzed are: meeting waveform timing requirements and efficiently applying Real-time Operating System (RTOS) scheduling algorithms, applying safety control measures, and SWaP verification. Real-time waveform algorithms benchmarked with the worst case environment conditions under the heaviest workload will drive the SDR for Space real-time PSM design.

DEFF Research Database (Denmark)

Lindgren, Peter; Abdullah, Maizura Ailin

2013-01-01

9. Microencapsulation increases survival of the probiotic Lactobacillus plantarum IS-10506, but not Enterococcus faecium IS-27526 in a dynamic, computer-controlled in vitro model of the upper gastrointestinal tract.

Science.gov (United States)

Surono, I; Verhoeven, J; Verbruggen, S; Venema, K

2018-02-23

To test the effect of microencapsulation on the survival of two probiotic strains isolated from Dadih, Indonesian fermented buffalo milk, in a dynamic, computer-controlled in vitro model of the upper gastrointestinal (GI) tract (TIM-1) simulating human adults. Free or microencapsulated probiotics, Lactobacillus plantarum IS-10506 or Enterococcus faecium IS-27526, resuspended in milk, were studied for survival in the complete TIM-1 system (stomach + small intestine) or in the gastric compartment of TIM-1 only. Hourly samples collected after the ileo-caecal valve or after the pylorus were plated on MRS agar (for Lactobacillus) or S&B agar (for Enterococcus). Survival of free cells after transit through the complete TIM-1 system averaged 15.0% for E. faecium and 18.5% for L. plantarum. Survival of the microencapsulated E. faecium and L. plantarum was 15.7 and 84.5%, respectively. The free cells were further assessed in the gastric compartment of TIM-1 only, where E. faecium and L. plantarum showed average survivals of 39 and 32%, respectively, after gastric passage. The free cells thus show similar sensitivity to gastric acid and similar survival after complete upper GI tract transit, but microencapsulation protected only L. plantarum. Survival of microencapsulated L. plantarum IS-10506 is increased compared with free cells in a validated in vitro model of the upper GI tract, which supports its use as an ingredient of functional foods. © 2018 The Society for Applied Microbiology.

10. Clinical variables serve as prognostic factors in a model for survival from glioblastoma multiforme

DEFF Research Database (Denmark)

Michaelsen, Signe Regner; Christensen, Ib Jarle; Grunnet, Kirsten

2013-01-01

Although implementation of temozolomide (TMZ) as a part of primary therapy for glioblastoma multiforme (GBM) has resulted in improved patient survival, the disease is still incurable. Previous studies have correlated various parameters to survival, although no single parameter has yet been...

11. To Be or Not to Be an Entrepreneur: Applying a Normative Model to Career Decisions

Science.gov (United States)

Callanan, Gerard A.; Zimmerman, Monica

2016-01-01

Reflecting the need for a better and broader understanding of the factors influencing the choices to enter into or exit an entrepreneurial career, this article applies a structured, normative model of career management to the career decision-making of entrepreneurs. The application of a structured model can assist career counselors, college career…

12. How High Is the Tramping Track? Mathematising and Applying in a Calculus Model-Eliciting Activity

Science.gov (United States)

Yoon, Caroline; Dreyfus, Tommy; Thomas, Michael O. J.

2010-01-01

Two complementary processes involved in mathematical modelling are mathematising a realistic situation and applying a mathematical technique to a given realistic situation. We present and analyse work from two undergraduate students and two secondary school teachers who engaged in both processes during a mathematical modelling task that required…

13. Divorce and Child Behavior Problems: Applying Latent Change Score Models to Life Event Data

Science.gov (United States)

Malone, Patrick S.; Lansford, Jennifer E.; Castellino, Domini R.; Berlin, Lisa J.; Dodge, Kenneth A.; Bates, John E.; Pettit, Gregory S.

2004-01-01

Effects of parents' divorce on children's adjustment have been studied extensively. This article applies new advances in trajectory modeling to the problem of disentangling the effects of divorce on children's adjustment from related factors such as the child's age at the time of divorce and the child's gender. Latent change score models were used…

14. Spatial Random Effects Survival Models to Assess Geographical Inequalities in Dengue Fever Using Bayesian Approach: a Case Study

Science.gov (United States)

Astuti Thamrin, Sri; Taufik, Irfan

2018-03-01

Dengue haemorrhagic fever (DHF) is an infectious disease caused by the dengue virus. The number of people with DHF correlates with neighbourhood factors: sub-districts, for example, have characteristics formed from the individuals domiciled in them. Data containing both individuals and sub-districts have a hierarchical structure, which calls for multilevel analysis, and the response variable of interest is frequently the time until an event occurs. Multilevel and spatial models are therefore being increasingly used to obtain substantive information on area-level inequalities in DHF survival. Using a case study approach, we report on the implications of using multilevel spatial survival models to study geographical inequalities in all-cause survival.

15. Modeling nest-survival data: a comparison of recently developed methods that can be implemented in MARK and SAS

Directory of Open Access Journals (Sweden)

Rotella, J. J.

2004-06-01

Full Text Available Estimating nest success and evaluating factors potentially related to the survival rates of nests are key aspects of many studies of avian populations. A strong interest in nest success has led to a rich literature detailing a variety of estimation methods for this vital rate. In recent years, modeling approaches have undergone especially rapid development. Despite these advances, most researchers still employ Mayfield's ad-hoc method (Mayfield, 1961) or, in some cases, the maximum-likelihood estimators of Johnson (1979) and Bart & Robson (1982). Such methods permit analyses of stratified data but do not allow for more complex and realistic models of nest survival rate that include covariates that vary by individual, nest age, time, etc., and that may be continuous or categorical. Methods that allow researchers to rigorously assess the importance of a variety of biological factors that might affect nest survival rates can now be readily implemented in Program MARK and in SAS's Proc GENMOD and Proc NLMIXED. Accordingly, use of Mayfield's estimator without first evaluating the need for more complex models of nest survival rate cannot be justified. With the goal of increasing the use of more flexible methods, we first describe the likelihood used for these models and then consider the question of what the effective sample size is for computation of AICc. Next, we consider the advantages and disadvantages of these different programs in terms of ease of data input and model construction; utility/flexibility of generated estimates and predictions; ease of model selection; and ability to estimate variance components. An example data set is then analyzed using both MARK and SAS to demonstrate implementation of the methods with various models that contain nest-, group- (or block-), and time-specific covariates. Finally, we discuss improvements that would, if they became available, promote a better general understanding of nest survival rates.
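To make the contrast concrete, Mayfield's ad-hoc estimator and the likelihood-based alternative can be compared on a toy data set. A minimal Python sketch with invented monitoring records (not data from the paper):

```python
import math

# Invented nest-monitoring records: (exposure_days, failed) -- not real data.
nests = [(25, False), (14, True), (25, False), (8, True), (25, False)]

# Mayfield (1961) ad-hoc estimator: daily survival rate (DSR)
# = 1 - failures per exposure-day.
exposure_days = sum(days for days, _ in nests)
failures = sum(1 for _, failed in nests if failed)
dsr_mayfield = 1 - failures / exposure_days

# Likelihood-based view (the kind of model MARK or Proc NLMIXED maximizes):
# a surviving nest contributes DSR**days; a failed nest contributes
# DSR**(days - 1) * (1 - DSR), assuming failure on its last observed day.
def neg_log_lik(dsr):
    total = 0.0
    for days, failed in nests:
        if failed:
            total += (days - 1) * math.log(dsr) + math.log(1 - dsr)
        else:
            total += days * math.log(dsr)
    return -total

# Crude grid search stands in for a real optimizer.
dsr_ml = min((i / 10000 for i in range(9000, 10000)), key=neg_log_lik)
print(round(dsr_mayfield, 4), round(dsr_ml, 4))
```

The advantage of the likelihood formulation is that the constant DSR can be replaced by, for example, a logistic function of nest-, group-, or time-specific covariates, which is exactly what the MARK and SAS implementations discussed above provide.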

16. Generation of a convalescent model of virulent Francisella tularensis infection for assessment of host requirements for survival of tularemia.

Directory of Open Access Journals (Sweden)

Deborah D Crane

Full Text Available Francisella tularensis is a facultative intracellular bacterium and the causative agent of tularemia. Development of novel vaccines and therapeutics for tularemia has been hampered by the lack of understanding of which immune components are required to survive infection. Defining these requirements for protection against virulent F. tularensis, such as strain SchuS4, has been difficult since experimentally infected animals typically die within 5 days after exposure to as few as 10 bacteria. Such a short mean time to death typically precludes development, and therefore assessment, of immune responses directed against virulent F. tularensis. To enable identification of the components of the immune system that are required for survival of virulent F. tularensis, we developed a convalescent model of tularemia in C57BL/6 mice using low dose antibiotic therapy in which the host immune response is ultimately responsible for clearance of the bacterium. Using this model we demonstrate that αβTCR+ cells, γδTCR+ cells, and B cells are necessary to survive primary SchuS4 infection. Analysis of mice deficient in specific soluble mediators shows that IL-12p40 and IL-12p35 are essential for survival of SchuS4 infection. We also show that IFN-γ is required for survival of SchuS4 infection, since mice lacking IFN-γR succumb to disease during the course of antibiotic therapy. Finally, we found that both CD4+ and CD8+ cells are the primary producers of IFN-γ, and that γδTCR+ cells and NK cells make a minimal contribution toward production of this cytokine throughout infection. Together these data provide a novel model that identifies key cells and cytokines required for survival or exacerbation of infection with virulent F. tularensis and provides evidence that this model will be a useful tool for better understanding the dynamics of tularemia infection.

17. A clinical-molecular prognostic model to predict survival in patients with post polycythemia vera and post essential thrombocythemia myelofibrosis.

Science.gov (United States)

Passamonti, F; Giorgino, T; Mora, B; Guglielmelli, P; Rumi, E; Maffioli, M; Rambaldi, A; Caramella, M; Komrokji, R; Gotlib, J; Kiladjian, J J; Cervantes, F; Devos, T; Palandri, F; De Stefano, V; Ruggeri, M; Silver, R T; Benevolo, G; Albano, F; Caramazza, D; Merli, M; Pietra, D; Casalone, R; Rotunno, G; Barbui, T; Cazzola, M; Vannucchi, A M

2017-12-01

Polycythemia vera (PV) and essential thrombocythemia (ET) are myeloproliferative neoplasms with variable risk of evolution into post-PV and post-ET myelofibrosis, from now on referred to as secondary myelofibrosis (SMF). No specific tools have been defined for risk stratification in SMF. To develop a prognostic model for predicting survival, we studied 685 JAK2, CALR, and MPL annotated patients with SMF. Median survival of the whole cohort was 9.3 years (95% CI: 8–not reached [NR]). Through penalized Cox regressions we identified negative predictors of survival, and according to the beta risk coefficients we assigned 2 points to hemoglobin level <11 g/dl, to circulating blasts ≥3%, and to CALR-unmutated genotype, 1 point to platelet count <150 × 10⁹/l and to constitutional symptoms, and 0.15 points to any year of age. The Myelofibrosis Secondary to PV and ET Prognostic Model (MYSEC-PM) allocated SMF patients into four risk categories with different survival (P<0.0001): low (median survival NR; 133 patients), intermediate-1 (9.3 years, 95% CI: 8.1–NR; 245 patients), intermediate-2 (4.4 years, 95% CI: 3.2–7.9; 126 patients), and high risk (2 years, 95% CI: 1.7–3.9; 75 patients). Finally, we found that the MYSEC-PM represents the most appropriate tool for SMF decision-making in clinical and trial settings.
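The point assignments quoted above translate directly into a score. A minimal sketch (function and variable names are ours; the abstract does not give the score thresholds that map onto the four risk categories, so none are encoded):

```python
def mysec_pm_score(age_years, hemoglobin_g_dl, blasts_pct,
                   calr_mutated, platelets_10e9_l, constitutional_symptoms):
    """MYSEC-PM risk score assembled from the point weights quoted in the
    abstract. The score-to-category cut-offs are not given there and are
    therefore deliberately not encoded here."""
    score = 0.15 * age_years            # 0.15 points per year of age
    if hemoglobin_g_dl < 11:
        score += 2                      # hemoglobin < 11 g/dl
    if blasts_pct >= 3:
        score += 2                      # circulating blasts >= 3%
    if not calr_mutated:
        score += 2                      # CALR-unmutated genotype
    if platelets_10e9_l < 150:
        score += 1                      # platelets < 150 x 10^9/l
    if constitutional_symptoms:
        score += 1                      # constitutional symptoms present
    return score

# Hypothetical patient: 60 years old, Hb 10.5 g/dl, 1% blasts,
# CALR-mutated, platelets 200 x 10^9/l, no symptoms: 0.15*60 + 2 = 11.0
print(mysec_pm_score(60, 10.5, 1, True, 200, False))
```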

18. The mass effect model of the survival rate's dose effect of organism irradiated with low energy ion beam

International Nuclear Information System (INIS)

Shao Chunlin; Gui Qifu; Yu Zengliang

1995-01-01

The main characteristic of low energy ion mutation is its mass deposition effect. Based on the theory of 'double strand breaking' and the 'mass deposition effect', the authors suggest that the mass deposition products can repair, or further damage, the double strand breaks of DNA. From this consideration, the dose effect model for the survival rate of an organism irradiated by a low energy N⁺ ion beam is deduced as: S = exp{-p[αφ + βφ² - Rφ²exp(-kφ) - Lφ³exp(-kφ)]}, which can be called the 'mass effect model'. In low energy ion beam mutation, the dose effects of many survival rates that could not be fitted by previous models are successfully fitted by this model. The suitable application fields of the model are also discussed.
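With the model parameters fixed, the survival rate S is a function of the dose φ alone. A small sketch evaluating the dose-effect expression (all parameter values are hypothetical placeholders, chosen only to make the curve well-behaved, not fitted values from the paper):

```python
import math

def survival(phi, p=1.0, alpha=0.05, beta=0.002, R=0.001, k=0.01, L=1e-6):
    """Mass effect model: S = exp(-p[a*phi + b*phi^2
    - R*phi^2*exp(-k*phi) - L*phi^3*exp(-k*phi)]).
    All parameter values here are hypothetical placeholders."""
    damage = (alpha * phi + beta * phi**2
              - R * phi**2 * math.exp(-k * phi)
              - L * phi**3 * math.exp(-k * phi))
    return math.exp(-p * damage)

# At zero dose survival is 1; the exp(-k*phi) "repair" terms subtract
# from the linear-quadratic damage at intermediate doses.
print(survival(0.0), survival(10.0), survival(100.0))
```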

19. The design and analysis of salmonid tagging studies in the Columbia basin. Volume 8: A new model for estimating survival probabilities and residualization from a release-recapture study of fall chinook salmon (Oncorhynchus tschawytscha) smolts in the Snake River

International Nuclear Information System (INIS)

Lowther, A.B.; Skalski, J.

1997-09-01

Standard release-recapture analysis using Cormack-Jolly-Seber (CJS) models to estimate survival probabilities between hydroelectric facilities for Snake River fall chinook salmon (Oncorhynchus tschawytscha) ignores the possibility of individual fish residualizing and completing their migration in the year following tagging. These models do not utilize the available capture history data from this second year and thus produce negatively biased estimates of survival probabilities. A new multinomial likelihood model was developed that yields biologically relevant, unbiased estimates of survival probabilities using the full two years of capture history data. This model was applied to 1995 Snake River fall chinook hatchery releases to estimate the true survival probability from one of three upstream release points (Asotin, Billy Creek, and Pittsburgh Landing) to Lower Granite Dam. In the data analyzed here, residualization was not a common physiological response, and thus the CJS models did not yield results appreciably different from the true survival probabilities obtained with the new multinomial likelihood model.

20. Applying circular economy innovation theory in business process modeling and analysis

Science.gov (United States)

Popa, V.; Popa, L.

2017-08-01

The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis that uses circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

1. Applying a Particle-only Model to the HL Tau Disk

OpenAIRE

Tabeshian, Maryam; Wiegert, Paul A.

2018-01-01

Observations have revealed rich structures in protoplanetary disks, offering clues about their embedded planets. Due to the complexities introduced by the abundance of gas in these disks, modeling their structure in detail is computationally intensive, requiring complex hydrodynamic codes and substantial computing power. It would be advantageous if computationally simpler models could provide some preliminary information on these disks. Here we apply a particle-only model (that we developed f...

2. Positive end-expiratory pressure improves survival in a rodent model of cardiopulmonary resuscitation using high-dose epinephrine.

LENUS (Irish Health Repository)

McCaul, Conán

2009-10-01

Multiple interventions have been tested in models of cardiopulmonary resuscitation (CPR) to optimize drug use, chest compressions, and ventilation. None has studied the effects of positive end-expiratory pressure (PEEP) on outcome. We hypothesized that because PEEP can reverse pulmonary atelectasis, lower pulmonary vascular resistance, and potentially improve cardiac output, its use during CPR would increase survival.

3. Human immune cells' behavior and survival under bioenergetically restricted conditions in an in vitro fracture hematoma model

Science.gov (United States)

Hoff, Paula; Maschmeyer, Patrick; Gaber, Timo; Schütze, Tabea; Raue, Tobias; Schmidt-Bleek, Katharina; Dziurla, René; Schellmann, Saskia; Lohanatha, Ferenz Leonard; Röhner, Eric; Ode, Andrea; Burmester, Gerd-Rüdiger; Duda, Georg N; Perka, Carsten; Buttgereit, Frank

2013-01-01

The initial inflammatory phase of bone fracture healing represents a critical step for the outcome of the healing process. However, both the mechanisms initiating this inflammatory phase and the function of immune cells present at the fracture site are poorly understood. In order to study the early events within a fracture hematoma, we established an in vitro fracture hematoma model: we cultured hematomas forming during an osteotomy (artificial bone fracture) of the femur during total hip arthroplasty (THA) in vitro under bioenergetically controlled conditions. This model allowed us to monitor immune cell populations, cell survival and cytokine expression during the early phase following a fracture. Moreover, this model enabled us to change the bioenergetic conditions in order to mimic the in vivo situation, which is assumed to be characterized by hypoxia and restricted amounts of nutrients. Using this model, we found that immune cells adapt to hypoxia via the expression of angiogenic factors, chemoattractants and pro-inflammatory molecules. In addition, combined restriction of oxygen and nutrient supply enhanced the selective survival of lymphocytes in comparison with that of myeloid-derived cells (i.e., neutrophils). Of note, non-restricted bioenergetic conditions did not show any similar effects regarding cytokine expression and/or different survival rates of immune cell subsets. In conclusion, we found that the bioenergetic conditions are among the crucial factors inducing the initial inflammatory phase of fracture healing and are thus a critical step for influencing survival and function of immune cells in the early fracture hematoma. PMID:23396474

4. Modeling of power train by applying the virtual prototype concept; Kaso genkei ni yoru power train no model ka

Energy Technology Data Exchange (ETDEWEB)

Hiramatsu, S; Harada, Y; Arakawa, H; Komori, S [Mazda Motor Corp., Hiroshima (Japan); Sumida, S [U-Shin Corp., Tokyo (Japan)

1997-10-01

This paper describes the simulation of a power train that includes a model developed by applying the virtual prototype concept. Under this concept, subsystem models consisting of functional models and mechanism models are integrated into a total system model. This architectural feature, called the hierarchical structure, makes it easy to model a large-scale system with many units, systems, and parts. Two kinds of computer simulations were performed: engine revolution fluctuation under accessory load input, and gear changes by the automatic transmission. Both were verified to have sufficient accuracy. 2 refs., 12 figs.

5. Mycobacterium tuberculosis PPE18 Protein Reduces Inflammation and Increases Survival in Animal Model of Sepsis.

Science.gov (United States)

Ahmed, Asma; Dolasia, Komal; Mukhopadhyay, Sangita

2018-04-18

Mycobacterium tuberculosis PPE18 is a member of the PPE family. Previous studies have shown that recombinant PPE18 (rPPE18) protein binds to TLR2 and triggers a signaling cascade which reduces levels of TNF-α and IL-12 and increases IL-10 in macrophages. Because TNF-α is a major mediator of the pathophysiology of sepsis and blocking inflammation is a possible line of therapy in such circumstances, we tested the efficacy of rPPE18 in reducing symptoms of sepsis in a mouse model of Escherichia coli-induced septic peritonitis. rPPE18 significantly decreased levels of serum TNF-α, IL-1β, IL-6, and IL-12 and reduced organ damage in mice injected i.p. with high doses of E. coli. Peritoneal cells isolated from rPPE18-treated mice had characteristics of M2 macrophages, which are protective in excessive inflammation. Additionally, rPPE18 inhibited disseminated intravascular coagulation, which can cause organ damage resulting in death, and was able to reduce sepsis-induced mortality when given prophylactically or therapeutically. In a mouse model of cecal ligation and puncture-induced sepsis, rPPE18 reduced TNF-α, alanine transaminase, and creatinine, attenuated organ damage, prevented depletion of monocytes and lymphocytes, and improved survival. Our studies show that rPPE18 has potent anti-inflammatory properties and can serve as a novel therapeutic to control sepsis. Copyright © 2018 by The American Association of Immunologists, Inc.

6. Recent progress and modern challenges in applied mathematics, modeling and computational science

CERN Document Server

Makarov, Roman; Belair, Jacques

2017-01-01

This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

7. Experiment and modeling of an atmospheric pressure arc in an applied oscillating magnetic field

International Nuclear Information System (INIS)

Karasik, Max; Roquemore, A. L.; Zweben, S. J.

2000-01-01

A set of experiments was carried out to measure and understand the response of a free-burning atmospheric pressure carbon arc to applied transverse dc and ac magnetic fields. The arc is found to deflect parabolically for the dc field and assumes a growing sinusoidal structure for the ac field. A simple analytic two-parameter fluid model of the arc dynamics is derived, in which the arc response is governed by the arc jet originating at the cathode, with the applied JxB force balanced by inertia. Time variation of the applied field allows the parameters to be evaluated individually. A fit of the model to the experimental data gives a value for the average jet speed an order of magnitude below Maecker's estimate of the maximum jet speed [H. Maecker, Z. Phys. 141, 198 (1955)]. An example industrial application of the model is considered. (c) 2000 American Institute of Physics

8. A two-zone cosmic ray propagation model and its implication of the surviving fraction of radioactive cosmic ray isotopes

International Nuclear Information System (INIS)

Simon, M.; Scherzer, R.; Enge, W.

1977-01-01

In cosmic ray propagation calculations one usually assumes a homogeneous distribution of interstellar matter. The crucial astrophysical parameters in these models are the path length distribution, the age of the cosmic ray particles and the interstellar matter density; these values are interrelated. The surviving fraction of radioactive cosmic ray isotopes is often used to determine a mean matter density of the region where the cosmic ray particles mainly reside. Using a Monte Carlo propagation program, we calculated quantitatively the change in the surviving fraction assuming a region of higher matter density around the sources. (author)
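The link between surviving fraction and confinement time rests on exponential decay in the particle's dilated frame: a radioactive "clock" isotope of proper mean life τ that has been confined for time t survives with probability exp(-t/(γτ)). A sketch with entirely hypothetical numbers (a simple exponential-decay picture, not the paper's Monte Carlo two-zone calculation):

```python
import math

def surviving_fraction(age_myr, gamma, proper_mean_life_myr):
    """Fraction of a radioactive clock isotope surviving after a
    confinement time age_myr, with Lorentz time-dilation factor gamma.
    All argument values used below are hypothetical."""
    return math.exp(-age_myr / (gamma * proper_mean_life_myr))

# Hypothetical example: 10 Myr confinement, gamma = 2, mean life 2 Myr
# gives exp(-10 / 4) = exp(-2.5).
print(surviving_fraction(10.0, 2.0, 2.0))
```

A measured surviving fraction can then be inverted for the confinement time, and hence for a mean traversed density, which is why a higher-density zone around the sources changes the inferred parameters.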

9. Making Faces - State-Space Models Applied to Multi-Modal Signal Processing

DEFF Research Database (Denmark)

Lehn-Schiøler, Tue

2005-01-01

The two main focus areas of this thesis are State-Space Models and multi modal signal processing. The general State-Space Model is investigated and an addition to the class of sequential sampling methods is proposed. This new algorithm is denoted as the Parzen Particle Filter. Furthermore...... optimizer can be applied to speed up convergence. The linear version of the State-Space Model, the Kalman Filter, is applied to multi modal signal processing. It is demonstrated how a State-Space Model can be used to map from speech to lip movements. Besides the State-Space Model and the multi modal...... application an information theoretic vector quantizer is also proposed. Based on interactions between particles, it is shown how a quantizing scheme based on an analytic cost function can be derived....
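The speech-to-lip mapping mentioned above rests on the standard Kalman predict/update recursion for a linear state-space model. A minimal one-dimensional sketch (all coefficients are hypothetical placeholders, not the thesis's trained model; in the application the state would be lip parameters and the observation a speech-feature vector):

```python
# One-dimensional linear state-space model:
#   x_t = a * x_{t-1} + process noise (variance q)
#   y_t = c * x_t     + observation noise (variance r)
# All numeric values below are hypothetical.
a, c = 0.9, 1.0
q, r = 0.01, 0.1

def kalman_step(x, p, y):
    # Predict step: propagate state estimate and its variance.
    x_pred = a * x
    p_pred = a * p * a + q
    # Update step: blend prediction with the new observation y.
    k = p_pred * c / (c * p_pred * c + r)   # Kalman gain
    x_new = x_pred + k * (y - c * x_pred)
    p_new = (1 - k * c) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0          # initial estimate and variance
for y in [1.0] * 5:      # constant observation stream
    x, p = kalman_step(x, p, y)
print(x, p)              # estimate pulled toward the observations
```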

10. Development and internal validation of a prognostic model to predict recurrence free survival in patients with adult granulosa cell tumors of the ovary

NARCIS (Netherlands)

van Meurs, Hannah S.; Schuit, Ewoud; Horlings, Hugo M.; van der Velden, Jacobus; van Driel, Willemien J.; Mol, Ben Willem J.; Kenter, Gemma G.; Buist, Marrije R.

2014-01-01

Models to predict the probability of recurrence free survival exist for various types of malignancies, but a model for recurrence free survival in individuals with an adult granulosa cell tumor (GCT) of the ovary is lacking. We aimed to develop and internally validate such a prognostic model. We

11. A new formalism for modelling parameters α and β of the linear-quadratic model of cell survival for hadron therapy

Science.gov (United States)

Vassiliev, Oleg N.; Grosshans, David R.; Mohan, Radhe

2017-10-01

We propose a new formalism for calculating parameters α and β of the linear-quadratic model of cell survival. This formalism, primarily intended for calculating relative biological effectiveness (RBE) for treatment planning in hadron therapy, is based on a recently proposed microdosimetric revision of the single-target multi-hit model. The main advantage of our formalism is that it reliably produces α and β that have correct general properties with respect to their dependence on physical properties of the beam, including the asymptotic behavior for very low and high linear energy transfer (LET) beams. For example, in the case of monoenergetic beams, our formalism predicts that, as a function of LET, (a) α has a maximum and (b) the α/β ratio increases monotonically with increasing LET. No prior models reviewed in this study predict both properties (a) and (b) correctly, and therefore, these prior models are valid only within a limited LET range. We first present our formalism in a general form, for polyenergetic beams. A significant new result in this general case is that parameter β is represented as an average over the joint distribution of energies E 1 and E 2 of two particles in the beam. This result is consistent with the role of the quadratic term in the linear-quadratic model. It accounts for the two-track mechanism of cell kill, in which two particles, one after another, damage the same site in the cell nucleus. We then present simplified versions of the formalism, and discuss predicted properties of α and β. Finally, to demonstrate consistency of our formalism with experimental data, we apply it to fit two sets of experimental data: (1) α for heavy ions, covering a broad range of LETs, and (2) β for protons. In both cases, good agreement is achieved.
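Downstream of any such formalism, the linear-quadratic model itself is simple to evaluate: S(D) = exp(-αD - βD²), and an RBE at a fixed survival level is the ratio of the reference-beam dose to the test-beam dose achieving that level. A sketch with hypothetical α and β values (not the fitted values from the paper):

```python
import math

def lq_survival(dose_gy, alpha, beta):
    """Linear-quadratic cell survival: S = exp(-alpha*D - beta*D^2)."""
    return math.exp(-alpha * dose_gy - beta * dose_gy**2)

def dose_for_survival(s_target, alpha, beta):
    """Invert S = exp(-a*D - b*D^2) for D (positive root of the quadratic)."""
    c = -math.log(s_target)
    return (-alpha + math.sqrt(alpha**2 + 4 * beta * c)) / (2 * beta)

# Hypothetical parameters: a reference beam vs a higher-LET beam with
# larger alpha (the dependence of alpha on LET is what the formalism models).
d_ref = dose_for_survival(0.1, alpha=0.15, beta=0.05)
d_test = dose_for_survival(0.1, alpha=0.45, beta=0.05)
rbe_10 = d_ref / d_test   # RBE at the 10% survival level
print(rbe_10)
```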

12. Guidelines for Applying Cohesive Models to the Damage Behaviour of Engineering Materials and Structures

CERN Document Server

Schwalbe, Karl-Heinz; Cornec, Alfred

2013-01-01

This brief provides guidance for the application of cohesive models to determine damage and fracture in materials and structural components. This can be done for configurations with or without a pre-existing crack. Although the brief addresses structural behaviour, the methods described herein may also be applied to any deformation induced material damage and failure, e.g. those occurring during manufacturing processes. The methods described are applicable to the behaviour of ductile metallic materials and structural components made thereof. Hints are also given for applying the cohesive model to other materials.

13. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

Directory of Open Access Journals (Sweden)

Ventura Martins Paula

2017-03-01

Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAMeta-model). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practices models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

14. Modeling the decline of the Porcupine Caribou Herd, 1989-1998: the importance of survival vs. recruitment

OpenAIRE

Arthur, Stephen M.; Whitten, Kenneth R.; Mauer, Francis J.; Cooley, Dorothy

2003-01-01

The Porcupine caribou (Rangifer tarandus granti) herd increased from approximately 100 000 animals during the 1970s to 178 000 in 1989, then declined to 129 000 by 1998. Our objective was to model the dynamics of this herd and investigate the potential that lower calf recruitment, as was observed during 1991-1993, produced the observed population changes. A deterministic model was prepared using estimates of birth and survival rates that reproduced the pattern of population growth from 1971-1...
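A deterministic herd model of the kind described can be sketched in a few lines. The function below is a minimal illustration, not the authors' fitted model; the rate values in the comments are hypothetical:

```python
def project_herd(n0, survival, recruitment, years):
    """Deterministic projection: next year's herd = survivors + recruits.

    survival and recruitment are annual per-capita rates (illustrative
    values, not the estimates fitted for the Porcupine herd).
    """
    n = float(n0)
    trajectory = []
    for _ in range(years):
        n = n * survival + n * recruitment
        trajectory.append(n)
    return trajectory

# The herd grows when survival + recruitment > 1 and declines otherwise,
# which is how lower calf recruitment can drive a decline.
```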

15. A model for website analysis and conception: the Website Canvas Model applied to Eldiario.es

Directory of Open Access Journals (Sweden)

Carles Sanabre Vives

2015-11-01

Full Text Available This article presents the model of ideation and analysis called Website Canvas Model. It allows identifying the key aspects for a website to be successful, and shows how it has been applied to Eldiario.es. As a result, the key factors prompting the success of this digital newspaper have been identified.

16. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim

Directory of Open Access Journals (Sweden)

José Medina Pestana

Full Text Available Summary The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. This model, devised by Frederick Taylor, is founded on principles of scientific management that involve planning, rational execution of work, and distribution of responsibilities. The expected result is increased efficiency, improvement of results and optimization of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The improvement of the activities in each module is constant, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, an increase was noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, preserving the humanistic characteristic of providing health care activity.

17. A pioneering healthcare model applying large-scale production concepts: Principles and performance after more than 11,000 transplants at Hospital do Rim.

Science.gov (United States)

Pestana, José Medina

2016-10-01

The kidney transplant program at Hospital do Rim (hrim) is a unique healthcare model that applies the same principles of repetition of processes used in industrial production. This model, devised by Frederick Taylor, is founded on principles of scientific management that involve planning, rational execution of work, and distribution of responsibilities. The expected result is increased efficiency, improvement of results and optimization of resources. This model, almost completely subsidized by the Unified Health System (SUS, in the Portuguese acronym), has been used at the hrim in more than 11,000 transplants over the last 18 years. The hrim model consists of eight interconnected modules: organ procurement organization, preparation for the transplant, admission for transplant, surgical procedure, post-operative period, outpatient clinic, support units, and coordination and quality control. The flow of medical activities enables organized and systematic care of all patients. The improvement of the activities in each module is constant, with full monitoring of various administrative, health care, and performance indicators. The continuous improvement in clinical results confirms the efficiency of the program. Between 1998 and 2015, an increase was noted in graft survival (77.4 vs. 90.4%, p<0.001) and patient survival (90.5 vs. 95.1%, p=0.001). The high productivity, efficiency, and progressive improvement of the results obtained with this model suggest that it could be applied to other therapeutic areas that require large-scale care, preserving the humanistic characteristic of providing health care activity.

18. NTCP modelling of lung toxicity after SBRT comparing the universal survival curve and the linear quadratic model for fractionation correction

International Nuclear Information System (INIS)

Wennberg, Berit M.; Baumann, Pia; Gagliardi, Giovanna

2011-01-01

Background. In SBRT of lung tumours no established relationship between dose-volume parameters and the incidence of lung toxicity has been found. The aim of this study is to compare the LQ model and the universal survival curve (USC) for calculating biologically equivalent doses in SBRT, to see if this improves knowledge of this relationship. Material and methods. Toxicity data on radiation pneumonitis grade 2 or more (RP2+) from 57 patients were used; 10.5% were diagnosed with RP2+. The lung DVHs were corrected for fractionation (LQ and USC) and analysed with the Lyman-Kutcher-Burman (LKB) model. In the LQ correction α/β = 3 Gy was used, and the USC parameters were: α/β = 3 Gy, D0 = 1.0 Gy, n = 10, α = 0.206 Gy⁻¹ and dT = 5.8 Gy. In order to understand the relative contribution of different dose levels to the calculated NTCP, the concept of fractional NTCP was used. This may give insight into whether 'high doses to small volumes' or 'low doses to large volumes' are most important for lung toxicity. Results and Discussion. NTCP analysis with the LKB model using parameters m = 0.4 and D50 = 30 Gy gave a volume-dependence parameter n = 0.87 with LQ correction and n = 0.71 with USC correction; using parameters m = 0.3 and D50 = 20 Gy, n = 0.93 with LQ correction and n = 0.83 with USC correction. In SBRT of lung tumours, NTCP modelling of lung toxicity comparing models (LQ, USC) for fractionation correction shows that low doses contribute less and high doses more to the NTCP when using the USC model. Comparing NTCP modelling of SBRT data with data from breast cancer, lung cancer and whole-lung irradiation implies that the response of the lung is treatment specific. More data are, however, needed for more reliable modelling.
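The LQ fractionation correction used in such comparisons is the standard EQD2 conversion; a minimal sketch, using α/β = 3 Gy as in the study (the function name and interface are illustrative):

```python
def eqd2_lq(dose_per_fraction, n_fractions, alpha_beta=3.0):
    """Linear-quadratic (LQ) conversion of a fractionation scheme to the
    equivalent total dose delivered in 2 Gy fractions (EQD2):
    EQD2 = D * (d + alpha/beta) / (2 + alpha/beta), with D = n * d."""
    total = n_fractions * dose_per_fraction
    return total * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)
```

The USC differs from the LQ model only above the transition dose dT, where the survival curve is continued with a linear (multitarget) tail of slope 1/D0, which is what reduces the weight of high doses per fraction relative to a pure LQ correction.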

19. Sodium nitroprusside enhanced cardiopulmonary resuscitation improves short term survival in a porcine model of ischemic refractory ventricular fibrillation.

Science.gov (United States)

Yannopoulos, Demetris; Bartos, Jason A; George, Stephen A; Sideris, George; Voicu, Sebastian; Oestreich, Brett; Matsuura, Timothy; Shekar, Kadambari; Rees, Jennifer; Aufderheide, Tom P

2017-01-01

20. A prognostic scoring model for survival after locoregional therapy in de novo stage IV breast cancer.

Science.gov (United States)

Kommalapati, Anuhya; Tella, Sri Harsha; Goyal, Gaurav; Ganti, Apar Kishor; Krishnamurthy, Jairam; Tandra, Pavan Kumar

2018-05-02

The role of locoregional treatment (LRT) remains controversial in de novo stage IV breast cancer (BC). We sought to analyze the role of LRT and prognostic factors of overall survival (OS) in de novo stage IV BC patients treated with LRT utilizing the National Cancer Data Base (NCDB). The objective of the current study is to create and internally validate a prognostic scoring model to predict the long-term OS for de novo stage IV BC patients treated with LRT. We included de novo stage IV BC patients reported to NCDB between 2004 and 2015. Patients were divided into LRT and no-LRT subsets. We randomized LRT subset to training and validation cohorts. In the training cohort, a seventeen-point prognostic scoring system was developed based on the hazard ratios calculated using Cox-proportional method. We stratified both training and validation cohorts into two "groups" [group 1 (0-7 points) and group 2 (7-17 points)]. Kaplan-Meier method and log-rank test were used to compare OS between the two groups. Our prognostic score was validated internally by comparing the OS between the respective groups in both the training and validation cohorts. Among 67,978 patients, LRT subset (21,200) had better median OS as compared to that of no-LRT (45 vs. 24 months; p < 0.0001). The group 1 and group 2 in the training cohort showed a significant difference in the 3-year OS (p < 0.0001) (68 vs. 26%). On internal validation, comparable OS was seen between the respective groups in each cohort (p = 0.77). Our prognostic scoring system will help oncologists to predict the prognosis in de novo stage IV BC patients treated with LRT. Although firm treatment-related conclusions cannot be made due to the retrospective nature of the study, LRT appears to be associated with a better OS in specific subgroups.
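Point-based prognostic scores of this kind are commonly built by assigning integer points proportional to each covariate's log hazard ratio from the Cox model, then stratifying on total points. A minimal sketch; the scale factor and cutoff interface are hypothetical, not the NCDB study's actual weights:

```python
import math

def hr_to_points(hazard_ratio, scale=10.0):
    """Convert a Cox hazard ratio into integer risk points: points
    proportional to log(HR). The scale factor is illustrative only."""
    return max(0, round(scale * math.log(hazard_ratio)))

def risk_group(total_points, cutoff=7):
    """Two-group stratification mirroring the 0-7 / 7-17 point split
    described in the abstract."""
    return 1 if total_points <= cutoff else 2
```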

1. Risk stratification in middle-aged patients with congestive heart failure: prospective comparison of the Heart Failure Survival Score (HFSS) and a simplified two-variable model.

Science.gov (United States)

Zugck, C; Krüger, C; Kell, R; Körber, S; Schellberg, D; Kübler, W; Haass, M

2001-10-01

The performance of a US-American scoring system (Heart Failure Survival Score, HFSS) was prospectively evaluated in a sample of ambulatory patients with congestive heart failure (CHF). Additionally, it was investigated whether the HFSS might be simplified by assessment of the distance ambulated during a 6-min walk test (6'WT) instead of determination of peak oxygen uptake (peak VO(2)). In 208 middle-aged CHF patients (age 54+/-10 years, 82% male, NYHA class 2.3+/-0.7; follow-up 28+/-14 months) the seven variables of the HFSS: CHF aetiology; heart rate; mean arterial pressure; serum sodium concentration; intraventricular conduction time; left ventricular ejection fraction (LVEF); and peak VO(2), were determined. Additionally, a 6'WT was performed. The HFSS allowed discrimination between patients at low, medium and high risk, with mortality rates of 16, 39 and 50%, respectively. However, the prognostic power of the HFSS was not superior to a two-variable model consisting only of LVEF and peak VO(2). The areas under the receiver operating curves (AUC) for prediction of 1-year survival were even higher for the two-variable model (0.84 vs. 0.74, P<0.05). Replacing peak VO(2) with 6'WT resulted in a similar AUC (0.83). The HFSS continued to predict survival when applied to this patient sample. However, the HFSS was inferior to a two-variable model containing only LVEF and either peak VO(2) or 6'WT. As the 6'WT requires no sophisticated equipment, a simplified two-variable model containing only LVEF and 6'WT may be more widely applicable, and is therefore recommended.
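The AUC comparison used to rank the HFSS against the two-variable model can be computed directly from risk scores via the Mann-Whitney statistic. A self-contained sketch (variable names are illustrative):

```python
def auc(scores_events, scores_nonevents):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    probability that a randomly chosen patient who died scores higher
    than a randomly chosen survivor (ties count 0.5)."""
    pairs = len(scores_events) * len(scores_nonevents)
    wins = sum(1.0 if e > n else 0.5 if e == n else 0.0
               for e in scores_events for n in scores_nonevents)
    return wins / pairs
```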

2. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

Science.gov (United States)

Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V

2016-01-01

Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.

3. Development and external validation of a risk-prediction model to predict 5-year overall survival in advanced larynx cancer.

Science.gov (United States)

Petersen, Japke F; Stuiver, Martijn M; Timmermans, Adriana J; Chen, Amy; Zhang, Hongzhen; O'Neill, James P; Deady, Sandra; Vander Poorten, Vincent; Meulemans, Jeroen; Wennerberg, Johan; Skroder, Carl; Day, Andrew T; Koch, Wayne; van den Brekel, Michiel W M

2018-05-01

TNM-classification inadequately estimates patient-specific overall survival (OS). We aimed to improve this by developing a risk-prediction model for patients with advanced larynx cancer. Cohort study. We developed a risk prediction model to estimate the 5-year OS rate based on a cohort of 3,442 patients with T3T4N0N+M0 larynx cancer. The model was internally validated using bootstrapping samples and externally validated on patient data from five external centers (n = 770). The main outcome was performance of the model as tested by discrimination, calibration, and the ability to distinguish risk groups based on tertiles from the derivation dataset. The model performance was compared to a model based on T and N classification only. We included age, gender, T and N classification, and subsite as prognostic variables in the standard model. After external validation, the standard model had a significantly better fit than a model based on T and N classification alone (C statistic, 0.59 vs. 0.55), and extension of the model raised the C statistic to 0.68. A risk prediction model for patients with advanced larynx cancer, consisting of readily available clinical variables, gives more accurate estimates of the 5-year survival rate than a model based on T and N classification alone. 2c. Laryngoscope, 128:1140-1145, 2018. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

4. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

Science.gov (United States)

Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

2017-07-28

Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for selecting the best covariate to split on from the search for the best split point for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest model (CIF) using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them with more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data consisting of covariates with many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.
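The Brier score underlying the comparison measures squared error between predicted event probabilities and observed 0/1 outcomes at a fixed time horizon. The study uses a bootstrap cross-validated *integrated* Brier score with inverse-probability-of-censoring weights; the unweighted, single-horizon version below is a minimal sketch of the core quantity:

```python
def brier_score(pred_event_probs, observed_events):
    """Mean squared error between predicted event probabilities and
    binary outcomes at one time horizon. Lower is better; 0.25 is the
    score of an uninformative 0.5 prediction."""
    n = len(pred_event_probs)
    return sum((p - y) ** 2
               for p, y in zip(pred_event_probs, observed_events)) / n
```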

5. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data

Directory of Open Access Journals (Sweden)

Justine B. Nasejje

2017-07-01

Full Text Available Background: Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for selecting the best covariate to split on from the search for the best split point for the selected covariate. Methods: In this study, we compare the random survival forest model to the conditional inference forest model (CIF) using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of them with more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). Results: The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data consisting of covariates with many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Conclusion: Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.

6. Communication Efficacy and Couples’ Cancer Management: Applying a Dyadic Appraisal Model

OpenAIRE

Magsamen-Conrad, Kate; Checton, Maria G.; Venetis, Maria K.; Greene, Kathryn

2014-01-01

The purpose of the present study was to apply Berg and Upchurch’s (2007) developmental-conceptual model to understand better how couples cope with cancer. Specifically, we hypothesized a dyadic appraisal model in which proximal factors (relational quality), dyadic appraisal (prognosis uncertainty), and dyadic coping (communication efficacy) predicted adjustment (cancer management). The study was cross-sectional and included 83 dyads in which one partner had been diagnosed with and/or treated ...

7. Discussion of the 3P0 model applied to the decay of mesons into two mesons

International Nuclear Information System (INIS)

Bonnaz, R.; Silvestre-Brac, B.

1999-01-01

The 3 P 0 model for the decay of a meson into two mesons is revisited. In particular, the formalism is extended in order to deal with an arbitrary form for the creation vertex and with the exact meson wave functions. A careful analysis of both effects is performed and discussed. The model is then applied to a large class of transitions known experimentally. Two types of quark-antiquark potentials have been tested and compared. (author)

8. Changes in speed distribution: Applying aggregated safety effect models to individual vehicle speeds.

Science.gov (United States)

2017-06-01

This study investigated the effect of applying two aggregated models (the Power model and the Exponential model) to individual vehicle speeds instead of mean speeds. This is of particular interest when the measure introduced affects different parts of the speed distribution differently. The aim was to examine how the estimated overall risk was affected when the models are assumed valid at the individual vehicle level. Speed data from two applications of speed measurements were used in the study: an evaluation of movable speed cameras and a national evaluation of new speed limits in Sweden. The results showed that for the Power model, applying it at the individual vehicle level rather than the aggregated level made essentially no difference in the case of injury accidents. However, for fatalities the difference was greater, especially for roads with new cameras, where those driving fastest reduced their speed the most. For the case with new speed limits, the individual approach estimated a somewhat smaller effect, reflecting that changes in the 15th percentile (P15) were somewhat larger than changes in P85 in this case. For the Exponential model there was also a clear, although small, difference between applying the model to mean speed changes and to individual vehicle speed changes when speed cameras were used. This applied both for injury accidents and fatalities. There were also larger effects for the Exponential model than for the Power model, especially for injury accidents. In conclusion, applying the Power or Exponential model to individual vehicle speeds is an alternative that provides reasonable results in relation to the original Power and Exponential models, but more research is needed to clarify the shape of the individual risk curve. It is not surprising that the impact on severe traffic crashes was larger in situations where those driving fastest reduced their speed the most. Further investigations on use of the Power and/or the
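The two ways of applying the Power model can be contrasted in a few lines. In the aggregated form the accident ratio is (mean speed after / mean speed before) raised to an exponent; in the individual form each vehicle's risk is assumed proportional to its own speed raised to the same exponent. A minimal sketch (the exponent value and data are illustrative):

```python
def power_model_aggregated(speeds_before, speeds_after, exponent):
    """Power model applied to mean speeds: ratio = (v2 / v1) ** exponent."""
    v1 = sum(speeds_before) / len(speeds_before)
    v2 = sum(speeds_after) / len(speeds_after)
    return (v2 / v1) ** exponent

def power_model_individual(speeds_before, speeds_after, exponent):
    """Same model assumed valid per vehicle: each vehicle's risk is taken
    proportional to v ** exponent, and summed risks are compared."""
    return (sum(v ** exponent for v in speeds_after)
            / sum(v ** exponent for v in speeds_before))
```

The two estimates diverge exactly when the speed change is unevenly distributed, e.g. when the fastest drivers change their speed the most, which is the situation the study highlights for fatalities near new cameras.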

9. The impact of applying product-modelling techniques in configurator projects

DEFF Research Database (Denmark)

Hvam, Lars; Kristjansdottir, Katrin; Shafiee, Sara

2018-01-01

This paper aims to increase understanding of the impact of using product-modelling techniques to structure and formalise knowledge in configurator projects. Companies that provide customised products increasingly apply configurators in support of sales and design activities, reaping benefits that include shorter lead times, improved quality of specifications and products, and lower overall product costs. The design and implementation of configurators are a challenging task that calls for scientifically based modelling techniques to support the formal representation of configurator knowledge. Three classes of modelling techniques are considered: (1) UML-based modelling techniques, in which the phenomenon model and information model are considered visually; (2) non-UML-based modelling techniques, in which only the phenomenon model is considered; and (3) non-formal modelling techniques. This study analyses the impact to companies from increased availability of product knowledge and improved control...

10. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

DEFF Research Database (Denmark)

Wang, Chengbo; Luxhøj, James T.; Johansen, John

2004-01-01

This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case-based reasoning. This paper briefly describes the model's theoretical fundamentals and its conceptual structure; conducts a detailed introduction of the critical elements within the model; exhibits a real-world application of the model; and summarizes the review of the model by academia and practice. It finds that the CBRM supports the decision-making process of applying and augmenting organizational knowledge. It provides a new angle to tackle strategic management issues within the manufacturing system of a business operation, and explores a new proposition within strategic manufacturing management by enriching...

11. Hydration kinetics modeling of Portland cement considering the effects of curing temperature and applied pressure

International Nuclear Information System (INIS)

Lin Feng; Meyer, Christian

2009-01-01

A hydration kinetics model for Portland cement is formulated based on thermodynamics of multiphase porous media. The mechanism of cement hydration is discussed based on literature review. The model is then developed considering the effects of chemical composition and fineness of cement, water-cement ratio, curing temperature and applied pressure. The ultimate degree of hydration of Portland cement is also analyzed and a corresponding formula is established. The model is calibrated against the experimental data for eight different Portland cements. Simple relations between the model parameters and cement composition are obtained and used to predict hydration kinetics. The model is used to reproduce experimental results on hydration kinetics, adiabatic temperature rise, and chemical shrinkage of different cement pastes. The comparisons between the model reproductions and the different experimental results demonstrate the applicability of the proposed model, especially for cement hydration at elevated temperature and high pressure.
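The temperature dependence of cement hydration kinetics is commonly handled through an Arrhenius-scaled equivalent age (the maturity method). The sketch below is a generic literature formulation, not the paper's calibrated model; the activation energy and reference temperature are typical values chosen for illustration:

```python
import math

def equivalent_age(dt_hours, temp_kelvin, activation_energy=41570.0,
                   gas_constant=8.314, ref_temp=293.15):
    """Arrhenius-scaled equivalent age: a curing interval at temp_kelvin
    expressed as the equivalent time at the 20 °C reference temperature.
    Higher curing temperature => faster hydration => larger equivalent age."""
    rate_factor = math.exp(-activation_energy / gas_constant
                           * (1.0 / temp_kelvin - 1.0 / ref_temp))
    return dt_hours * rate_factor
```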

12. Opioid withdrawal, craving, and use during and after outpatient buprenorphine stabilization and taper: a discrete survival and growth mixture model.

Science.gov (United States)

Northrup, Thomas F; Stotts, Angela L; Green, Charles; Potter, Jennifer S; Marino, Elise N; Walker, Robrina; Weiss, Roger D; Trivedi, Madhukar

2015-02-01

Most patients relapse to opioids within one month of opioid agonist detoxification, making the antecedents and parallel processes of first use critical for investigation. Craving and withdrawal are often studied in relationship to opioid outcomes, and a novel analytic strategy applied to these two phenomena may indicate targeted intervention strategies. Specifically, this secondary data analysis of the Prescription Opioid Addiction Treatment Study used a discrete-time mixture analysis with time-to-first opioid use (survival) simultaneously predicted by craving and withdrawal growth trajectories. This analysis characterized heterogeneity among prescription opioid-dependent individuals (N=653) into latent classes (i.e., latent class analysis [LCA]) during and after buprenorphine/naloxone stabilization and taper. A 4-latent class solution was selected for overall model fit and clinical parsimony. In order of shortest to longest time-to-first use, the 4 classes were characterized as 1) high craving and withdrawal, 2) intermediate craving and withdrawal, 3) high initial craving with low craving and withdrawal trajectories and 4) a low initial craving with low craving and withdrawal trajectories. Odds ratio calculations showed statistically significant differences in time-to-first use across classes. Generally, participants with lower baseline levels and greater decreases in craving and withdrawal during stabilization combined with slower craving and withdrawal rebound during buprenorphine taper remained opioid-free longer. This exploratory work expanded on the importance of monitoring craving and withdrawal during buprenorphine induction, stabilization, and taper. Future research may allow individually tailored and timely interventions to be developed to extend time-to-first opioid use. Copyright © 2014 Elsevier Ltd. All rights reserved.
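In discrete-time survival analysis of this kind, the per-interval hazard (the probability of first opioid use in interval j, given abstinence so far) determines the survival curve directly. A minimal sketch of that conversion (hazard values are illustrative):

```python
def survival_from_hazards(hazards):
    """Convert per-interval discrete-time hazards h_j into the survival
    curve S_j = prod_{i <= j} (1 - h_i): the probability of remaining
    opioid-free through interval j."""
    s, curve = 1.0, []
    for h in hazards:
        s *= (1.0 - h)
        curve.append(s)
    return curve
```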

13. Applying the cube model to pediatric psychology: development of research competency skills at the doctoral level.

Science.gov (United States)

Madan-Swain, Avi; Hankins, Shirley L; Gilliam, Margaux Barnes; Ross, Kelly; Reynolds, Nina; Milby, Jesse; Schwebel, David C

2012-03-01

This article considers the development of research competencies in professional psychology and how that movement might be applied to training in pediatric psychology. The field of pediatric psychology has a short but rich history, and experts have identified critical competencies. However, pediatric psychology has not yet detailed a set of research-based competencies. This article initially reviews the competency initiative in professional psychology, including the cube model as it relates to research training. Next, we review and adapt the knowledge-based/foundational and applied/functional research competencies proposed by health psychology into a cube model for pediatric psychology. We focus especially on graduate-level training but allude to its application throughout professional development. We present the cube model as it is currently being applied to the development of a systematic research competency evaluation for graduate training at our medical/clinical psychology doctoral program at the University of Alabama at Birmingham. Based on the review and synthesis of the literature on research competency in professional psychology we propose future initiatives to develop these competencies for the field of pediatric psychology. The cube model can be successfully applied to the development of research training competencies in pediatric psychology. Future research should address the development, implementation, and assessment of the research competencies for training and career development of future pediatric psychologists.

14. Modeling the current distribution in HTS tapes with transport current and applied magnetic field

NARCIS (Netherlands)

Yazawa, T.; Yazawa, Takashi; Rabbers, J.J.; Chevtchenko, O.A.; ten Haken, Bernard; ten Kate, Herman H.J.; Maeda, Hideaki

1999-01-01

A numerical model is developed for the current distribution in a high temperature superconducting (HTS) tape, (Bi,Pb)2Sr2Ca2Cu3Ox-Ag, subjected to a combination of a transport current and an applied magnetic field. This analysis is based on a two-dimensional formulation of Maxwell's equations in

15. Risk assessment and food allergy: the probabilistic model applied to allergens

NARCIS (Netherlands)

Spanjersberg, M.Q.I.; Kruizinga, A.G.; Rennen, M.A.J.; Houben, G.F.

2007-01-01

In order to assess the risk of unintended exposure to food allergens, traditional deterministic risk assessment is usually applied, leading to inconsequential conclusions as 'an allergic reaction cannot be excluded'. TNO therefore developed a quantitative risk assessment model for allergens based on
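A quantitative (probabilistic) allergen risk assessment of this kind samples intake, contaminant concentration, and individual thresholds from distributions and estimates the probability that the consumed dose exceeds the threshold. The Monte Carlo sketch below is a hypothetical interface under that general idea, not TNO's actual model:

```python
import random

def reaction_risk(sample_intake_g, sample_allergen_mg_per_g,
                  sample_threshold_mg, n_sim=10000, seed=42):
    """Monte Carlo estimate of P(dose > threshold). The three samplers are
    caller-supplied distribution functions taking a random.Random instance;
    names and units are illustrative."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sim):
        dose_mg = sample_intake_g(rng) * sample_allergen_mg_per_g(rng)
        if dose_mg > sample_threshold_mg(rng):
            exceed += 1
    return exceed / n_sim
```

Unlike a deterministic worst-case assessment, this yields a risk estimate (e.g. fraction of eating occasions with a predicted reaction) rather than the inconclusive "an allergic reaction cannot be excluded".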

16. Applying a Weighted Maximum Likelihood Latent Trait Estimator to the Generalized Partial Credit Model

Science.gov (United States)

Penfield, Randall D.; Bergeron, Jennifer M.

2005-01-01

This article applies a weighted maximum likelihood (WML) latent trait estimator to the generalized partial credit model (GPCM). The relevant equations required to obtain the WML estimator using the Newton-Raphson algorithm are presented, and a simulation study is described that compared the properties of the WML estimator to those of the maximum…
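The GPCM category probabilities that any latent trait estimator for this model must evaluate have a simple closed form: P(X = k | θ) is proportional to exp(Σ_{j≤k} a(θ − b_j)), with an empty sum for category 0. A self-contained sketch (parameter values in the test are illustrative):

```python
import math

def gpcm_probs(theta, discrimination, step_difficulties):
    """Category response probabilities under the generalized partial
    credit model, for categories 0..m where m = len(step_difficulties)."""
    exponents = [0.0]          # empty sum for category 0
    cum = 0.0
    for b in step_difficulties:
        cum += discrimination * (theta - b)
        exponents.append(cum)
    numerators = [math.exp(e) for e in exponents]
    total = sum(numerators)
    return [v / total for v in numerators]
```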

17. An Investigation of Employees' Use of E-Learning Systems: Applying the Technology Acceptance Model

Science.gov (United States)

Lee, Yi-Hsuan; Hsieh, Yi-Chuan; Chen, Yen-Hsun

2013-01-01

The purpose of this study is to apply the technology acceptance model to examine the employees' attitudes and acceptance of electronic learning (e-learning) systems in organisations. This study examines four factors (organisational support, computer self-efficacy, prior experience and task equivocality) that are believed to influence employees'…

18. Reliability Models Applied to a System of Power Converters in Particle Accelerators

OpenAIRE

Siemaszko, D; Speiser, M; Pittet, S

2012-01-01

Several reliability models are studied when applied to a power system containing a large number of power converters. A methodology is proposed and illustrated in the case study of a novel linear particle accelerator designed for reaching high energies. The proposed methods result in the prediction of both reliability and availability of the considered system for optimisation purposes.

19. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

Science.gov (United States)

What Works Clearinghouse, 2010

2010-01-01

The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours…

20. C code generation applied to nonlinear model predictive control for an artificial pancreas

DEFF Research Database (Denmark)

Boiroux, Dimitri; Jørgensen, John Bagterp

2017-01-01

This paper presents a method to generate C code from MATLAB code applied to a nonlinear model predictive control (NMPC) algorithm. The C code generation uses the MATLAB Coder Toolbox. It can drastically reduce the time required for development compared to a manual porting of code from MATLAB to C...

1. Problems and advantages of applying the e-learning model to the teaching of English

OpenAIRE

Shaparenko, А.; Golikova, А.

2013-01-01

In this article we mention some potential and noted problems and advantages of applying the e-learning model to the teaching of English. In the area of foreign language teaching a lot has been done, but there are constant attempts for new solutions. Another option for e-learning is a hybrid course.

2. The asymmetric rotator model applied to odd-mass iridium isotopes

International Nuclear Information System (INIS)

Piepenbring, R.

1980-04-01

The method of inversion of the eigenvalue problem previously developed for nuclei with axial symmetry is extended to asymmetric equilibrium shapes. This new approach of the asymmetric rotator model is applied to the odd-mass iridium isotopes. A satisfactory and coherent description of the observed energy spectra is obtained, especially for the lighter isotopes

3. Edaravone improves survival and neurological outcomes after CPR in a ventricular fibrillation model of rats.

Science.gov (United States)

Qin, Tao; Lei, Ling-Yan; Li, Nuo; Shi, Fangying Ruan; Chen, Meng-Hua; Xie, Lu

2016-10-01

Overproduction of free radicals is a main factor contributing to cerebral injury after cardiac arrest (CA)/cardiopulmonary resuscitation (CPR). We sought to evaluate the impact of edaravone on the survival and neurological outcomes after CA/CPR in rats. Rats were subjected to CA followed by CPR. For the survival study, the rats with restoration of spontaneous circulation (ROSC) were randomly allocated to one of two groups (edaravone and saline group, n=20/each group) to receive edaravone (3 mg/kg) or normal saline. Another 10 rats without experiencing CA and CPR served as the sham group. Survival was observed for 72 hours and the neurological deficit score (NDS) was calculated at 12, 24, 48, and 72 hours after ROSC. For the neurological biochemical analysis study, rats were subjected to the same experimental procedures. Then, the edaravone group (n=24), saline group (n=24) and sham group (n=16) were further divided into 4 subgroups according to the different time intervals (12, 24, 48, and 72 hours following ROSC). Brain tissues were harvested at the relative time intervals for evaluation of oxidative stress, TUNEL staining and apoptotic gene expression. Edaravone improved postresuscitative survival time and neurological deficit, decreased brain malondialdehyde level, increased superoxide dismutase activities, decreased proapoptotic gene expression of caspase-8, caspase-3, and Bax, and increased antiapoptotic Bcl-2 expression at 12, 24, 48, and 72 hours after ROSC. Edaravone improves survival and neurological outcomes following CPR via antioxidative and antiapoptotic effects in rats.
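Survival comparisons of this kind are usually summarized with Kaplan-Meier curves before a log-rank test is applied. A minimal pure-Python sketch of the estimator, using made-up observation times rather than the study's data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: observation times; events: 1 = death observed, 0 = censored."""
    pairs = sorted(zip(times, events))     # sort observations by time
    n_at_risk = len(pairs)
    surv = 1.0
    curve = []                             # (time, S(t)) after each event time
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(e for tt, e in pairs if tt == t)
        n_obs = sum(1 for tt, _ in pairs if tt == t)
        if deaths > 0:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n_obs                 # drop everyone observed at time t
        i += n_obs
    return curve

# Hypothetical survival times (hours) for two small groups
treated = kaplan_meier([12, 24, 48, 72, 72, 72], [1, 1, 0, 0, 0, 0])
control = kaplan_meier([12, 12, 24, 24, 48, 72], [1, 1, 1, 1, 1, 0])
```

Censored animals leave the risk set without stepping the curve down, which is what distinguishes this estimator from a naive fraction-surviving calculation.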

4. Intrastriatal Grafting of Chromospheres: Survival and Functional Effects in the 6-OHDA Rat Model of Parkinson's Disease.

Directory of Open Access Journals (Sweden)

Alejandra Boronat-García

Full Text Available Cell replacement therapy in Parkinson's disease (PD) aims at re-establishing dopamine neurotransmission in the striatum by grafting dopamine-releasing cells. Chromaffin cell (CC) grafts produce some transitory improvements of functional motor deficits in PD animal models, and have the advantage of allowing autologous transplantation. However, CC grafts have exhibited low survival, poor functional effects and dopamine release compared to other cell types. Recently, chromaffin progenitor-like cells were isolated from bovine and human adult adrenal medulla. Under low-attachment conditions, these cells aggregate and grow as spheres, named chromospheres. Here, we found that bovine-derived chromosphere-cell cultures exhibit a greater fraction of cells with a dopaminergic phenotype and higher dopamine release than CC. Chromospheres grafted in a rat model of PD survived in 57% of the total grafted animals. Behavioral tests showed that surviving chromosphere cells induce a reduction in motor alterations for at least 3 months after grafting. Finally, we found that compared with CC, chromosphere grafts survive more and produce more robust and consistent motor improvements. However, further experiments would be necessary to determine whether the functional benefits induced by chromosphere grafts can be improved, and also to elucidate the mechanisms underlying the functional effects of the grafts.

5. Modeling the decline of the Porcupine Caribou Herd, 1989-1998: the importance of survival vs. recruitment

Directory of Open Access Journals (Sweden)

Stephen M. Arthur

2003-04-01

Full Text Available The Porcupine caribou (Rangifer tarandus granti) herd increased from approximately 100 000 animals during the 1970s to 178 000 in 1989, then declined to 129 000 by 1998. Our objective was to model the dynamics of this herd and investigate the potential that lower calf recruitment, as was observed during 1991-1993, produced the observed population changes. A deterministic model was prepared using estimates of birth and survival rates that reproduced the pattern of population growth from 1971-1989. Then, parameters were changed to simulate effects of lower calf recruitment and adult survival. Reducing recruitment for 3 years caused an immediate reduction in population size, but the population began to recover in 5-6 years. Even a dramatic temporary reduction in recruitment did not explain the continuing decline after 1995. In contrast, a slight but persistent reduction in adult survival caused a decline that closely followed the observed pattern. This suggests that survival of adults, and perhaps calves, has declined since the late 1980s.
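A deterministic herd model of the kind described can be sketched in a few lines. The rates below are illustrative assumptions, not the authors' calibrated values; the point is the qualitative contrast between a temporary recruitment drop and a small persistent drop in adult survival:

```python
def project(n0, years, recruitment, adult_survival):
    """Project herd size with per-capita recruitment and adult survival.
    recruitment(y) and adult_survival(y) return the rates in year y."""
    n = n0
    for y in range(years):
        n = n * adult_survival(y) + n * recruitment(y)
    return n

N0 = 178_000  # 1989 estimate from the abstract

# Scenario A: recruitment halved for 3 years, then back to baseline
temp_low_recruit = project(
    N0, 9,
    recruitment=lambda y: 0.06 if y < 3 else 0.12,
    adult_survival=lambda y: 0.88)

# Scenario B: small but persistent drop in adult survival
persistent_low_survival = project(
    N0, 9,
    recruitment=lambda y: 0.12,
    adult_survival=lambda y: 0.845)
```

With these invented rates, the temporary recruitment drop causes an immediate step down after which the herd holds steady (this sketch has no density dependence, so it stabilizes rather than recovers), while the persistent survival drop produces a continuing year-on-year decline.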

6. Using simulation to interpret a discrete time survival model in a complex biological system: fertility and lameness in dairy cows.

Directory of Open Access Journals (Sweden)

Christopher D Hudson

Full Text Available The ever-growing volume of data routinely collected and stored in everyday life presents researchers with a number of opportunities to gain insight and make predictions. This study aimed to demonstrate the usefulness in a specific clinical context of a simulation-based technique called probabilistic sensitivity analysis (PSA) in interpreting the results of a discrete time survival model based on a large dataset of routinely collected dairy herd management data. Data from 12,515 dairy cows (from 39 herds) were used to construct a multilevel discrete time survival model in which the outcome was the probability of a cow becoming pregnant during a given two day period of risk, and presence or absence of a recorded lameness event during various time frames relative to the risk period amongst the potential explanatory variables. A separate simulation model was then constructed to evaluate the wider clinical implications of the model results (i.e. the potential for a herd's incidence rate of lameness to influence its overall reproductive performance) using PSA. Although the discrete time survival analysis revealed some relatively large associations between lameness events and risk of pregnancy (for example, occurrence of a lameness case within 14 days of a risk period was associated with a 25% reduction in the risk of the cow becoming pregnant during that risk period), PSA revealed that, when viewed in the context of a realistic clinical situation, a herd's lameness incidence rate is highly unlikely to influence its overall reproductive performance to a meaningful extent in the vast majority of situations. Construction of a simulation model within a PSA framework proved to be a very useful additional step to aid contextualisation of the results from a discrete time survival model, especially where the research is designed to guide on-farm management decisions at population (i.e. herd) rather than individual level.
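A discrete time survival model is fitted by expanding each subject into one binary row per risk period, so the problem can be cast as a standard binary response model. A minimal sketch of that person-period expansion (field names are illustrative, not from the study):

```python
def person_period(records):
    """Expand (id, periods_at_risk, event) records into one binary row
    per subject-period, the format a discrete time survival model fits.
    event=True means the outcome occurred in the final period at risk."""
    rows = []
    for subj, n_periods, event in records:
        for t in range(1, n_periods + 1):
            outcome = 1 if (event and t == n_periods) else 0
            rows.append({"id": subj, "period": t, "y": outcome})
    return rows

# cow 1 conceives in its 3rd risk period; cow 2 is censored after 2 periods
rows = person_period([(1, 3, True), (2, 2, False)])
```

Each expanded row carries y=0 except the period in which the event occurred, which is why the expanded dataset grows very large and, as the abstract notes, can make MCMC estimation slow.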

7. Spatial occupancy models applied to atlas data show Southern Ground Hornbills strongly depend on protected areas.

Science.gov (United States)

Broms, Kristin M; Johnson, Devin S; Altwegg, Res; Conquest, Loveday L

2014-03-01

Determining the range of a species and exploring species-habitat associations are central questions in ecology and can be answered by analyzing presence-absence data. Often, both the sampling of sites and the desired area of inference involve neighboring sites; thus, positive spatial autocorrelation between these sites is expected. Using survey data for the Southern Ground Hornbill (Bucorvus leadbeateri) from the Southern African Bird Atlas Project, we compared advantages and disadvantages of three increasingly complex models for species occupancy: an occupancy model that accounted for nondetection but assumed all sites were independent, and two spatial occupancy models that accounted for both nondetection and spatial autocorrelation. We modeled the spatial autocorrelation with an intrinsic conditional autoregressive (ICAR) model and with a restricted spatial regression (RSR) model. Both spatial models can readily be applied to any other gridded, presence-absence data set using a newly introduced R package. The RSR model provided the best inference and was able to capture small-scale variation that the other models did not. It showed that ground hornbills are strongly dependent on protected areas in the north of their South African range, but less so further south. The ICAR models did not capture any spatial autocorrelation in the data, and they took an order of magnitude longer than the RSR models to run. Thus, the RSR occupancy model appears to be an attractive choice for modeling occurrences at large spatial domains, while accounting for imperfect detection and spatial autocorrelation.

8. Applying an orographic precipitation model to improve mass balance modeling of the Juneau Icefield, AK

Science.gov (United States)

Roth, A. C.; Hock, R.; Schuler, T.; Bieniek, P.; Aschwanden, A.

2017-12-01

Mass loss from glaciers in Southeast Alaska is expected to alter downstream ecological systems as runoff patterns change. To investigate these potential changes under future climate scenarios, distributed glacier mass balance modeling is required. However, the spatial resolution gap between global or regional climate models and the requirements for glacier mass balance modeling studies must be addressed first. We have used a linear theory (LT) of orographic precipitation model to downscale precipitation from both the Weather Research and Forecasting (WRF) model and ERA-Interim to the Juneau Icefield region over the period 1979-2013. This implementation of the LT model is a unique parameterization that relies on the specification of snow fall speed and rain fall speed as tuning parameters to calculate the cloud time delay, τ. We assessed the LT model results by considering winter precipitation so the effect of melt was minimized. The downscaled precipitation pattern produced by the LT model captures the orographic precipitation pattern absent from the coarse resolution WRF and ERA-Interim precipitation fields. Observational data constraints limited our ability to determine a unique parameter combination and calibrate the LT model to glaciological observations. We established a reference run of parameter values based on literature and performed a sensitivity analysis of the LT model parameters, horizontal resolution, and climate input data on the average winter precipitation. The results of the reference run showed reasonable agreement with the available glaciological measurements. The precipitation pattern produced by the LT model was consistent regardless of parameter combination, horizontal resolution, and climate input data, but the precipitation amount varied strongly with these factors. Due to the consistency of the winter precipitation pattern and the uncertainty in precipitation amount, we suggest a precipitation index map approach to be used in combination with

9. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

Directory of Open Access Journals (Sweden)

L Potgieter

2012-12-01

Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of and interaction between normal and sterile E. saccharina moths in a temporally variable, but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified and according to which E. saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1-sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E. saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
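The paper's full model tracks multiple life stages and F1 sterility; as a much simpler illustration of the underlying idea, here is a one-equation sketch in which sterile releases dilute fertile matings (all rates and numbers are invented for illustration, not the paper's parameters):

```python
def step(f, s_release, r=2.0, k=1_000_000):
    """One generation of a toy sterile-release difference equation.
    Fertile matings are diluted by the ratio f / (f + s_release);
    a logistic-style crowding term caps growth at carrying capacity k."""
    mating_fraction = f / (f + s_release) if (f + s_release) > 0 else 0.0
    return r * f * mating_fraction * max(0.0, 1.0 - f / k)

# With a constant sterile release the fertile population collapses
f = 200_000.0
for gen in range(30):
    f = step(f, s_release=600_000.0)

# Without releases the same equation settles at a positive equilibrium
g = 200_000.0
for gen in range(30):
    g = step(g, s_release=0.0)
```

The contrast between the two runs is the core mechanism of the sterile insect technique: once the sterile-to-fertile ratio is high enough, effective reproduction drops below replacement and the decline is self-reinforcing.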

10. Ensemble Genetic Fuzzy Neuro Model Applied for the Emergency Medical Service via Unbalanced Data Evaluation

Directory of Open Access Journals (Sweden)

2018-03-01

Full Text Available Equally partitioned data are essential for prediction. However, in some important cases, the data distribution is severely unbalanced. In this study, several algorithms are utilized to maximize the learning accuracy when dealing with a highly unbalanced dataset. A linguistic algorithm is applied to evaluate the input and output relationship, namely Fuzzy c-Means (FCM), which is applied as a clustering algorithm for the majority class to balance the minority class data from about 3 million cases. Each cluster is used to train several artificial neural network (ANN) models. Different techniques are applied to generate an ensemble genetic fuzzy neuro model (EGFNM) in order to select the models. The first ensemble technique, the intra-cluster EGFNM, works by evaluating the best combination from all the models generated by each cluster. Another ensemble technique is the inter-cluster EGFNM, which is based on selecting the best model from each cluster. The accuracy of these techniques is evaluated using the receiver operating characteristic (ROC) via its area under the curve (AUC). Results show that the AUC of the unbalanced data is 0.67974. The random cluster and best ANN single model have AUCs of 0.7177 and 0.72806, respectively. For the ensemble evaluations, the intra-cluster and the inter-cluster EGFNMs produce 0.7293 and 0.73038, respectively. In conclusion, this study achieved improved results by performing the EGFNM method compared with the unbalanced training. This study concludes that selecting several best models will produce a better result compared with all models combined.
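The AUC used above to rank the models has a direct probabilistic reading: it is the probability that a randomly chosen positive case outscores a randomly chosen negative one. A minimal sketch of that rank-based (Wilcoxon-Mann-Whitney) computation, with made-up scores:

```python
def auc(scores_pos, scores_neg):
    """AUC as the Wilcoxon-Mann-Whitney statistic: the probability that
    a randomly chosen positive outscores a randomly chosen negative.
    Ties count as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

value = auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2])
```

Because it depends only on the ranking of scores, AUC is insensitive to class imbalance in a way plain accuracy is not, which is why it is the natural metric for the highly unbalanced dataset described here.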

11. Dynamic plant uptake model applied for drip irrigation of an insecticide to pepper fruit plants

DEFF Research Database (Denmark)

Legind, Charlotte Nielsen; Kennedy, C. M.; Rein, Arno

2011-01-01

irrigation, its application for a soil-applied insecticide and a sensitivity analysis of the model parameters. RESULTS: The model predicted the measured increase and decline of residues following two soil applications of an insecticide to peppers, with an absolute error between model and measurement ranging from 0.002 to 0.034 mg kg fw⁻¹. Maximum measured concentrations in pepper fruit were approximately 0.22 mg kg fw⁻¹. Temperature was the most sensitive component for predicting the peak and final concentration in pepper fruit, through its influence on soil and plant degradation rates...

12. The Kadanoff lower-bound variational renormalization group applied to an SU(2) lattice spin model

International Nuclear Information System (INIS)

Thorleifsson, G.; Damgaard, P.H.

1990-07-01

We apply the variational lower-bound Renormalization Group transformation of Kadanoff to an SU(2) lattice spin model in 2 and 3 dimensions. Even in the one-hypercube framework of this renormalization group transformation the present model is characterised by having an infinite basis of fundamental operators. We investigate whether the lower-bound variational renormalization group transformation yields results stable under truncations of this operator basis. Our results show that for this particular spin model this is not the case. (orig.)

13. Comparison among Models to Estimate the Shielding Effectiveness Applied to Conductive Textiles

Directory of Open Access Journals (Sweden)

Alberto Lopez

2013-01-01

Full Text Available The purpose of this paper is to present a comparison of two models, and corresponding measurements, for calculating the shielding effectiveness of electromagnetic barriers, applied here to conductive textiles. Each model treats a conductive textile as either (1) a wire mesh screen or (2) a compact material. The objective is therefore to analyse the models in order to determine which one is a better approximation for electromagnetic shielding fabrics. To provide results for the comparison, the shielding effectiveness of the sample has been measured by means of the standard ASTM D4935-99.
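For the compact-material case, a standard engineering estimate of shielding effectiveness sums a reflection loss and an absorption loss computed from the skin depth (Ott's far-field approximations). The fabric parameters below are hypothetical, not the paper's measured sample:

```python
import math

def shielding_db(f_hz, sigma_rel, mu_rel, thickness_m):
    """Far-field shielding effectiveness (dB) of a homogeneous conductive
    sheet: reflection loss plus absorption loss.
    sigma_rel is conductivity relative to copper; mu_rel is relative
    permeability."""
    sigma = sigma_rel * 5.8e7             # S/m, copper reference
    mu = mu_rel * 4e-7 * math.pi          # H/m
    omega = 2.0 * math.pi * f_hz
    delta = math.sqrt(2.0 / (omega * mu * sigma))       # skin depth, m
    absorption = 8.686 * thickness_m / delta            # dB
    reflection = 168.0 + 10.0 * math.log10(sigma_rel / (mu_rel * f_hz))  # dB
    return reflection + absorption

# hypothetical conductive fabric: 1% of copper's conductivity, 0.5 mm thick,
# evaluated at 1 GHz
se = shielding_db(1e9, sigma_rel=0.01, mu_rel=1.0, thickness_m=0.5e-3)
```

This homogeneous-sheet estimate is an upper bound for a fabric: apertures between fibers (the wire-mesh view of the same textile) reduce the effective shielding, which is exactly the distinction the two compared models capture.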

14. Neuregulin-1/erbB-activation improves cardiac function and survival in models of ischemic, dilated, and viral cardiomyopathy.

Science.gov (United States)

Liu, Xifu; Gu, Xinhua; Li, Zhaoming; Li, Xinyan; Li, Hui; Chang, Jianjie; Chen, Ping; Jin, Jing; Xi, Bing; Chen, Denghong; Lai, Donna; Graham, Robert M; Zhou, Mingdong

2006-10-03

We evaluated the therapeutic potential of a recombinant 61-residue neuregulin-1 (beta2a isoform) receptor-active peptide (rhNRG-1) in multiple animal models of heart disease. Activation of the erbB family of receptor tyrosine kinases by rhNRG-1 could provide a treatment option for heart failure, because neuregulin-stimulated erbB2/erbB4 heterodimerization is not only critical for myocardium formation in early heart development but prevents severe dysfunction of the adult heart and premature death. Disabled erbB-signaling is also implicated in the transition from compensatory hypertrophy to failure, whereas erbB receptor-activation promotes myocardial cell growth and survival and protects against anthracycline-induced cardiomyopathy. rhNRG-1 was administered IV to animal models of ischemic, dilated, and viral cardiomyopathy, and cardiac function and survival were evaluated. Short-term intravenous administration of rhNRG-1 to normal dogs and rats did not alter hemodynamics or cardiac contractility. In contrast, rhNRG-1 improved cardiac performance, attenuated pathological changes, and prolonged survival in rodent models of ischemic, dilated, and viral cardiomyopathy, with the survival benefits in the ischemic model being additive to those of angiotensin-converting enzyme inhibitor therapy. In addition, despite continued pacing, rhNRG-1 produced global improvements in cardiac function in a canine model of pacing-induced heart failure. These beneficial effects make rhNRG-1 promising as a broad-spectrum therapeutic for the treatment of heart failure due to a variety of common cardiac diseases.

15. Real-time slicing algorithm for Stereolithography (STL) CAD model applied in additive manufacturing industry

Science.gov (United States)

Adnan, F. A.; Romlay, F. R. M.; Shafiq, M.

2018-04-01

Owing to the advent of Industry 4.0, the need to further evaluate the processes applied in additive manufacturing, particularly the computational process for slicing, is non-trivial. This paper evaluates a real-time slicing algorithm for slicing an STL formatted computer-aided design (CAD) model. A line-plane intersection equation was applied to perform the slicing procedure at any given height. The application of this algorithm was found to provide a better computational time regardless of the number of facets in the STL model. The performance of this algorithm is evaluated by comparing the results of the computational time for different geometries.
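The line-plane intersection at the heart of such a slicer reduces to linear interpolation along each triangle edge that straddles the slicing plane. A minimal sketch (vertices lying exactly on the plane are skipped here; a production slicer must handle such degenerate cases explicitly):

```python
def slice_triangle(tri, z):
    """Intersect one STL facet (three (x, y, z) vertices) with the plane
    Z = z, returning the 0-2 points where its edges cross the plane."""
    points = []
    for i in range(3):
        (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
        if (z1 - z) * (z2 - z) < 0:          # edge straddles the plane
            t = (z - z1) / (z2 - z1)         # line-plane intersection
            points.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return points

tri = [(0.0, 0.0, 0.0), (1.0, 0.0, 2.0), (0.0, 1.0, 2.0)]
segment = slice_triangle(tri, 1.0)           # one contour segment of the slice
```

Running this over every facet at a given height yields the unordered segments of that layer's contour; a full slicer then chains the segments into closed polygons.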

16. The use of simple reparameterizations to improve the efficiency of Markov chain Monte Carlo estimation for multilevel models with applications to discrete time survival models.

Science.gov (United States)

Browne, William J; Steele, Fiona; Golalizadeh, Mousa; Green, Martin J

2009-06-01

We consider the application of Markov chain Monte Carlo (MCMC) estimation methods to random-effects models and in particular the family of discrete time survival models. Survival models can be used in many situations in the medical and social sciences and we illustrate their use through two examples that differ in terms of both substantive area and data structure. A multilevel discrete time survival analysis involves expanding the data set so that the model can be cast as a standard multilevel binary response model. For such models it has been shown that MCMC methods have advantages in terms of reducing estimate bias. However, the data expansion results in very large data sets for which MCMC estimation is often slow and can produce chains that exhibit poor mixing. Any way of improving the mixing will result in both speeding up the methods and more confidence in the estimates that are produced. The MCMC methodological literature is full of alternative algorithms designed to improve mixing of chains and we describe three reparameterization techniques that are easy to implement in available software. We consider two examples of multilevel survival analysis: incidence of mastitis in dairy cattle and contraceptive use dynamics in Indonesia. For each application we show where the reparameterization techniques can be used and assess their performance.

17. A software complex intended for constructing applied models and meta-models on the basis of mathematical programming principles

Directory of Open Access Journals (Sweden)

Михаил Юрьевич Чернышов

2013-12-01

Full Text Available A software complex (SC) elaborated by the authors on the basis of the language LMPL and representing a software tool intended for synthesis of applied software models and meta-models constructed on the basis of mathematical programming (MP) principles is described. LMPL provides for an explicit form of declarative representation of MP-models, presumes automatic constructing and transformation of models and the capability of adding external software packages. The following software versions of the SC have been implemented: (1) a SC intended for representing the process of choosing an optimal hydroelectric power plant model (on the principles of meta-modeling) and (2) a SC intended for representing the logic-sense relations between the models of a set of discourse formations in the discourse meta-model.

18. Modeling nest survival of cavity-nesting birds in relation to postfire salvage logging

Science.gov (United States)

Vicki Saab; Robin E. Russell; Jay Rotella; Jonathan G. Dudley

2011-01-01

Salvage logging practices in recently burned forests often have direct effects on species associated with dead trees, particularly cavity-nesting birds. As such, evaluation of postfire management practices on nest survival rates of cavity nesters is necessary for determining conservation strategies. We monitored 1,797 nests of 6 cavity-nesting bird species: Lewis'...

19. Using the Q10 model to simulate E. coli survival in cowpats on grazing lands

Science.gov (United States)

Microbiological quality of surface waters can be affected by microbial load in runoff from grazing lands. This effect, with other factors, depends on the survival of microorganisms in animal waste deposited on pastures. Since temperature is a leading environmental parameter affec...
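The Q10 model mentioned in the title adjusts a rate by a fixed factor for every 10 °C departure from a reference temperature. A minimal sketch with an assumed die-off rate and Q10 value (illustrative numbers, not the study's fitted parameters):

```python
def q10_rate(rate_ref, q10, temp_c, temp_ref_c=20.0):
    """Q10 temperature correction: the rate changes q10-fold for every
    10 degree C change away from the reference temperature."""
    return rate_ref * q10 ** ((temp_c - temp_ref_c) / 10.0)

# hypothetical first-order die-off rate of 0.1 per day at 20 C, Q10 = 2
rate_30 = q10_rate(0.1, 2.0, 30.0)   # twice as fast at 30 C
rate_10 = q10_rate(0.1, 2.0, 10.0)   # half as fast at 10 C
```

Feeding daily cowpat temperatures through such a correction gives a temperature-dependent survival curve from a single reference rate, which is the appeal of the Q10 approach for field conditions.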

20. Dimethylaminoparthenolide and gemcitabine: a survival study using a genetically engineered mouse model of pancreatic cancer

International Nuclear Information System (INIS)

Yip-Schneider, Michele T; Wu, Huangbing; Stantz, Keith; Agaram, Narasimhan; Crooks, Peter A; Schmidt, C Max

2013-01-01

Pancreatic cancer remains one of the deadliest cancers due to lack of early detection and absence of effective treatments. Gemcitabine, the current standard-of-care chemotherapy for pancreatic cancer, has limited clinical benefit. Treatment of pancreatic cancer cells with gemcitabine has been shown to induce the activity of the transcription factor nuclear factor-kappaB (NF-κB) which regulates the expression of genes involved in the inflammatory response and tumorigenesis. It has therefore been proposed that gemcitabine-induced NF-κB activation may result in chemoresistance. We hypothesize that NF-κB suppression by the novel inhibitor dimethylaminoparthenolide (DMAPT) may enhance the effect of gemcitabine in pancreatic cancer. The efficacy of DMAPT and gemcitabine was evaluated in a chemoprevention trial using the mutant Kras- and p53-expressing LSL-Kras^G12D/+; LSL-Trp53^R172H; Pdx-1-Cre mouse model of pancreatic cancer. Mice were randomized to treatment groups (placebo, DMAPT [40 mg/kg/day], gemcitabine [50 mg/kg twice weekly], and the combination DMAPT/gemcitabine). Treatment was continued until mice showed signs of ill health at which time they were sacrificed. Plasma cytokine levels were determined using a Bio-Plex immunoassay. Statistical tests used included log-rank test, ANOVA with Dunnett’s post-test, Student’s t-test, and Fisher exact test. Gemcitabine or the combination DMAPT/gemcitabine significantly increased median survival and decreased the incidence and multiplicity of pancreatic adenocarcinomas. The DMAPT/gemcitabine combination also significantly decreased tumor size and the incidence of metastasis to the liver. No significant differences in the percentages of normal pancreatic ducts or premalignant pancreatic lesions were observed between the treatment groups. Pancreata in which no tumors formed were analyzed to determine the extent of pre-neoplasia; mostly normal ducts or low grade pancreatic lesions were observed, suggesting prevention

Directory of Open Access Journals (Sweden)

Marco D’Andrea

2018-01-01

Full Text Available Lung tumors are often associated with a poor prognosis although different schedules and treatment modalities have been extensively tested in the clinical practice. The complexity of this disease and the use of combined therapeutic approaches have been investigated and the use of high dose-rates is emerging as effective strategy. Technological improvements of clinical linear accelerators allow combining high dose-rate and a more conformal dose delivery with accurate imaging modalities pre- and during therapy. This paper aims at reporting the state of the art and future direction in the use of radiobiological models and radiobiological-based optimizations in the clinical practice for the treatment of lung cancer. To address this issue, a search was carried out on PubMed database to identify potential papers reporting tumor control probability and normal tissue complication probability for lung tumors. Full articles were retrieved when the abstract was considered relevant, and only papers published in English language were considered. The bibliographies of retrieved papers were also searched and relevant articles included. At the state of the art, dose–response relationships have been reported in the literature for local tumor control and survival in stage III non-small cell lung cancer. Due to the lack of published radiobiological models for stereotactic body radiotherapy (SBRT), several authors used dose constraints and models derived for conventional fractionation schemes. Recently, several radiobiological models and parameters for SBRT have been published and could be used in prospective trials although external validations are recommended to improve the robustness of model predictive capability. Moreover, radiobiological-based functions have been used within treatment planning systems for plan optimization but the advantages of using this strategy in the clinical practice are still under discussion. Future research should be directed toward combined regimens, in order to
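Fractionation schemes such as those discussed above are commonly compared through the linear-quadratic model's biologically effective dose (BED). A minimal sketch of the standard formula, with an illustrative tumor α/β ratio:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose (Gy) under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

# conventional 30 x 2 Gy vs an SBRT-style 3 x 18 Gy, tumor alpha/beta = 10 Gy
conventional = bed(30, 2.0, 10.0)
sbrt = bed(3, 18.0, 10.0)
```

The comparison shows why extrapolating conventional-fractionation models to SBRT is delicate: at 18 Gy per fraction the quadratic term dominates, so the same physical dose corresponds to a much larger biological effect, and parameters fitted at 2 Gy per fraction may not remain valid.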

2. On the choice of electromagnetic model for short high-intensity arcs, applied to welding

International Nuclear Information System (INIS)

Choquet, Isabelle; Shirvan, Alireza Javidi; Nilsson, Håkan

2012-01-01

We have considered four different approaches for modelling the electromagnetic fields of high-intensity electric arcs: (i) three-dimensional, (ii) two-dimensional axi-symmetric, (iii) the electric potential formulation and (iv) the magnetic field formulation. The underlying assumptions and the differences between these models are described in detail. Models (i) to (iii) reduce to the same limit for an axi-symmetric configuration with negligible radial current density, contrary to model (iv). Models (i) to (iii) were retained and implemented in the open source CFD software OpenFOAM. The simulation results were first validated against the analytic solution of an infinite electric rod. Perfect agreement was obtained for all the models tested. The electromagnetic models (i) to (iii) were then coupled with thermal fluid mechanics, and applied to axi-symmetric gas tungsten arc welding test cases with short arc (2, 3 and 5 mm) and truncated conical electrode tip. Models (i) and (ii) lead to the same simulation results, but not model (iii). Model (iii) is suited in the specific limit of long axi-symmetric arc with negligible electrode tip effect, i.e. negligible radial current density. For short axi-symmetric arc with significant electrode tip effect, the more general axi-symmetric formulation of model (ii) should instead be used. (paper)
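The analytic benchmark mentioned, the field of an infinite rod carrying a uniform current, follows from Ampère's law: the azimuthal field grows linearly with radius inside the conductor and falls off as 1/r outside. A minimal sketch of that reference solution:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def b_field(current, rod_radius, r):
    """Azimuthal magnetic field (T) of an infinite rod carrying a
    uniformly distributed current: linear in r inside, 1/r outside."""
    if r <= rod_radius:
        return MU0 * current * r / (2.0 * math.pi * rod_radius ** 2)
    return MU0 * current / (2.0 * math.pi * r)

# 100 A in a 1 cm radius rod: field peaks at the surface, then decays
inside_edge = b_field(100.0, 0.01, 0.01)
outside = b_field(100.0, 0.01, 0.02)
```

The two branches agree at r equal to the rod radius, which is exactly the continuity property a numerical electromagnetic solver must reproduce to pass this validation case.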

3. Cost-effectiveness Analysis in R Using a Multi-state Modeling Survival Analysis Framework: A Tutorial.

Science.gov (United States)

Williams, Claire; Lewsey, James D; Briggs, Andrew H; Mackay, Daniel F

2017-05-01

This tutorial provides a step-by-step guide to performing cost-effectiveness analysis using a multi-state modeling approach. Alongside the tutorial, we provide easy-to-use functions in the statistics package R. We argue that this multi-state modeling approach using a package such as R has advantages over approaches where models are built in a spreadsheet package. In particular, using a syntax-based approach means there is a written record of what was done and the calculations are transparent. Reproducing the analysis is straightforward as the syntax just needs to be run again. The approach can be thought of as an alternative way to build a Markov decision-analytic model, which also has the option to use a state-arrival extended approach. In the state-arrival extended multi-state model, a covariate that represents patients' history is included, allowing the Markov property to be tested. We illustrate the building of multi-state survival models, making predictions from the models and assessing fits. We then proceed to perform a cost-effectiveness analysis, including deterministic and probabilistic sensitivity analyses. Finally, we show how to create 2 common methods of visualizing the results, namely cost-effectiveness planes and cost-effectiveness acceptability curves. The analysis is implemented entirely within R. It is based on adaptations to functions in the existing R package mstate to accommodate parametric multi-state modeling that facilitates extrapolation of survival curves.
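The tutorial itself works in R with mstate; as a language-neutral illustration of where such an analysis ends up, here is a toy three-state Markov cohort model (well, ill, dead) and the resulting incremental cost-effectiveness ratio (ICER). All transition probabilities, costs, and utilities are invented for illustration:

```python
def cohort_run(transition, costs, qalys, cycles=40, start=(1.0, 0.0, 0.0)):
    """Run a cohort through a 3-state Markov model (well, ill, dead),
    accumulating expected cost and QALYs per cycle (no discounting)."""
    state = list(start)
    total_cost = total_qaly = 0.0
    for _ in range(cycles):
        total_cost += sum(p * c for p, c in zip(state, costs))
        total_qaly += sum(p * q for p, q in zip(state, qalys))
        state = [sum(state[i] * transition[i][j] for i in range(3))
                 for j in range(3)]
    return total_cost, total_qaly

usual = cohort_run(
    [[0.85, 0.10, 0.05], [0.0, 0.80, 0.20], [0.0, 0.0, 1.0]],
    costs=(100.0, 1000.0, 0.0), qalys=(0.9, 0.5, 0.0))
new = cohort_run(
    [[0.90, 0.07, 0.03], [0.0, 0.85, 0.15], [0.0, 0.0, 1.0]],
    costs=(600.0, 1200.0, 0.0), qalys=(0.9, 0.5, 0.0))

icer = (new[0] - usual[0]) / (new[1] - usual[1])  # extra cost per extra QALY
```

The multi-state survival approach in the tutorial replaces the fixed per-cycle transition matrix with transition hazards estimated from data, but the downstream cost-effectiveness arithmetic is the same.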

4. Reynolds stress turbulence model applied to two-phase pressurized thermal shocks in nuclear power plant

Energy Technology Data Exchange (ETDEWEB)

Mérigoux, Nicolas, E-mail: nicolas.merigoux@edf.fr; Laviéville, Jérôme; Mimouni, Stéphane; Guingo, Mathieu; Baudry, Cyril

2016-04-01

Highlights: • NEPTUNE-CFD is used to model two-phase PTS. • The k-ε model produced some satisfactory results but also highlighted some weaknesses. • A more advanced turbulence model has been developed, validated and applied for PTS. • Coupled with LIM, the first results confirmed the increased accuracy of the approach. - Abstract: Nuclear power plants are subjected to a variety of ageing mechanisms and, at the same time, exposed to potential pressurized thermal shock (PTS) – characterized by a rapid cooling of the internal Reactor Pressure Vessel (RPV) surface. In this context, NEPTUNE-CFD is used to model two-phase PTS and give an assessment of the structural integrity of the RPV. The first available choice was to use a standard first-order turbulence model (k-ε) to model the high-Reynolds-number flows encountered in Pressurized Water Reactor (PWR) primary circuits. In a first attempt, the k-ε model produced some satisfactory results in terms of condensation rate and temperature field distribution on integral experiments, but also highlighted some weaknesses in the modelling of highly anisotropic turbulence. One way to improve the turbulence prediction – and consequently the temperature field distribution – is to opt for a more advanced Reynolds stress turbulence model. After various verification and validation steps on separate-effects cases – co-current air/steam-water stratified flows in rectangular channels, water jet impingements on water pool free surfaces – this Reynolds stress turbulence model (R_ij-ε SSG) has been applied for the first time to thermal free surface flows under industrial conditions on the COSI and TOPFLOW-PTS experiments. Coupled with the Large Interface Model, the first results confirmed the adequacy and increased accuracy of the approach in an industrial context.

5. Modelling of composite concrete block pavement systems applying a cohesive zone model

DEFF Research Database (Denmark)

Skar, Asmus; Poulsen, Peter Noe

This paper presents a numerical analysis of the fracture behaviour of the cement bound base material in composite concrete block pavement systems, using a cohesive zone model. The functionality of the proposed model is tested on experimental and numerical investigations of beam bending tests....... The pavement is modelled as a simple slab on grade structure and parameters influencing the response, such as analysis technique, geometry and material parameters are studied. Moreover, the analysis is extended to a real scale example, modelling the pavement as a three-layered structure. It is found...... block pavements. It is envisaged that the methodology implemented in this study can be extended and thereby contribute to the ongoing development of rational failure criteria that can replace the empirical formulas currently used in pavement engineering....

6. Addressing dependability by applying an approach for model-based risk assessment

International Nuclear Information System (INIS)

Gran, Bjorn Axel; Fredriksen, Rune; Thunem, Atoosa P.-J.

2007-01-01

This paper describes how an approach for model-based risk assessment (MBRA) can be applied to address different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and through a series of seven trials provided a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of a trust case development.

7. Addressing dependability by applying an approach for model-based risk assessment

Energy Technology Data Exchange (ETDEWEB)

Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

2007-11-15

This paper describes how an approach for model-based risk assessment (MBRA) can be applied to address different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and through a series of seven trials provided a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of a trust case development.

8. Systemic administration of bevacizumab prolongs survival in an in vivo model of platinum pre-treated ovarian cancer

Science.gov (United States)

REIN, DANIEL T.; VOLKMER, ANNE KATHRIN; VOLKMER, JENS; BEYER, INES M.; JANNI, WOLFGANG; FLEISCH, MARKUS C.; WELTER, ANNE KATHRIN; BAUERSCHLAG, DIRK; SCHÖNDORF, THOMAS; BREIDENBACH, MARTINA

2012-01-01

Ovarian cancer patients often suffer from malignant ascites and pleural effusion. Apart from worsening the outcome, this condition frequently impairs the quality of life in patients who are already distressed by ovarian cancer. This study investigated whether single intraperitoneal administration of the anti-VEGF antibody bevacizumab is capable of reducing the ascites-related body surface and prolonging survival. The study was performed in an orthotopic murine model of peritoneal disseminated platin-resistant ovarian cancer. Mice were treated with bevacizumab and/or paclitaxel or buffer (control). Reduction of body surface and increased survival rates were assessed as therapeutic success. Survival of mice in all treatment groups was significantly enhanced when compared to the non-treatment control group. The combination of paclitaxel plus bevacizumab significantly improved body surface as well as overall survival in comparison to a treatment with only one of the drugs. Treatment of malignant effusion with a single dose of bevacizumab as an intraperitoneal application, with or without cytostatic co-medication, may be a powerful alternative to systemic treatment. PMID:22740945

9. Survival of probiotic lactobacilli in the upper gastrointestinal tract using an in vitro gastric model of digestion.

Science.gov (United States)

Lo Curto, Alberto; Pitino, Iole; Mandalari, Giuseppina; Dainty, Jack Richard; Faulks, Richard Martin; John Wickham, Martin Sean

2011-10-01

The aim of this study was to investigate the survival of three commercial probiotic strains (Lactobacillus casei subsp. shirota, L. casei subsp. immunitas, Lactobacillus acidophilus subsp. johnsonii) in the human upper gastrointestinal (GI) tract using a dynamic gastric model (DGM) of digestion followed by incubation under duodenal conditions. Water and milk were used as food matrices and survival was evaluated in both logarithmic and stationary phase. The percentage recovery in logarithmic phase ranged from 1.0% to 43.8% in water for all tested strains, and from 80.5% to 197% in milk. Higher survival was observed in stationary phase for all strains. L. acidophilus subsp. johnsonii showed the highest survival rate in both water (93.9%) and milk (202.4%). Lactic acid production was higher in stationary phase, with L. casei subsp. shirota producing the highest concentration (98.2 mM) after in vitro gastric plus duodenal digestion. Copyright © 2011 Elsevier Ltd. All rights reserved.

10. Effect of Electroacupuncture at The Zusanli Point (Stomach-36) on Dorsal Random Pattern Skin Flap Survival in a Rat Model.

Science.gov (United States)

Wang, Li-Ren; Cai, Le-Yi; Lin, Ding-Sheng; Cao, Bin; Li, Zhi-Jie

2017-10-01

Random skin flaps are commonly used for wound repair and reconstruction. Electroacupuncture at The Zusanli point could enhance microcirculation and blood perfusion in random skin flaps. To determine whether electroacupuncture at The Zusanli point can improve the survival of random skin flaps in a rat model. Thirty-six male Sprague Dawley rats were randomly divided into 3 groups: control group (no electroacupuncture), Group A (electroacupuncture at a nonacupoint near The Zusanli point), and Group B (electroacupuncture at The Zusanli point). McFarlane flaps were established. On postoperative Day 2, malondialdehyde (MDA) and superoxide dismutase were detected. The flap survival rate was evaluated, inflammation was examined in hematoxylin and eosin-stained slices, and the expression of vascular endothelial growth factor (VEGF) was measured immunohistochemically on Day 7. The mean survival area of the flaps in Group B was significantly larger than that in the control group and Group A. Superoxide dismutase activity and VEGF expression level were significantly higher in Group B than those in the control group and Group A, whereas MDA and inflammation levels in Group B were significantly lower than those in the other 2 groups. Electroacupuncture at The Zusanli point can effectively improve the random flap survival.

11. The Motivational Knowledge Management Model: proposal to apply it in the library sector

Directory of Open Access Journals (Sweden)

Daniel López-Fernández

2016-12-01

Full Text Available In professional environments, attention paid to aspects such as supervisory styles, interpersonal relationships and workers' eagerness can have a positive impact on employee motivation and, consequently, on their performance and well-being. To achieve this, knowledge management models such as those presented here can be applied. This model generates diagnoses of motivation and recommendations for improvement, both systematically and scientifically. Consequently, it is especially useful for managers and human resource departments. The proposed model can be adapted to different kinds of professional groups, including those in library and documentation services. The suitability, reliability and usefulness of the proposed model have been empirically checked through case studies with 92 students and 166 professionals. The positive results allow us to conclude that the model is effective and useful for assessing and improving motivation.

12. Modeling principles applied to the simulation of a joule-heated glass melter

International Nuclear Information System (INIS)

Routt, K.R.

1980-05-01

Three-dimensional conservation equations applicable to the operation of a joule-heated glass melter were rigorously examined and used to develop scaling relationships for modeling purposes. By rigorous application of the conservation equations governing transfer of mass, momentum, energy, and electrical charge in three-dimensional cylindrical coordinates, scaling relationships were derived between a glass melter and a physical model for the following independent and dependent variables: geometrical size (scale), velocity, temperature, pressure, mass input rate, energy input rate, voltage, electrode current, electrode current flux, total power, and electrical resistance. The scaling relationships were then applied to the design and construction of a physical model of the semiworks glass melter for the Defense Waste Processing Facility. The design and construction of such a model using glycerine plus LiCl as a model fluid in a one-half-scale Plexiglas tank is described.

13. Specific-activity and concentration model applied to cesium movement in an oligotrophic lake

International Nuclear Information System (INIS)

Vanderploeg, H.A.; Booth, R.S.; Clark, F.H.

1975-01-01

A linear systems-analysis model was derived to simulate the time-dependent dynamics of specific activity and concentration of radionuclides in aquatic systems. Transfer coefficients were determined for movement of 137Cs in the components of an oligotrophic lake. These coefficients were defined in terms of basic environmental and ecological data so that the model can be applied to a wide variety of sites. Simulations with a model that ignored sediment-water interactions predicted much higher 137Cs specific activities in the lake water and biota than did those with the complete model. Comparing 137Cs concentrations predicted by the model with concentrations reported for the biota of an experimentally contaminated oligotrophic lake indicated that the transfer coefficients derived for the biota are adequate.
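The essence of such a linear systems-analysis model is a set of first-order transfer coefficients between compartments. A minimal two-compartment (water/sediment) sketch with hypothetical coefficients, integrated by forward Euler:

```python
# Two-compartment (water <-> sediment) first-order transfer sketch.
# The coefficients below are hypothetical, for illustration only.
K_WS = 0.010   # water -> sediment transfer coefficient [1/day]
K_SW = 0.002   # sediment -> water transfer coefficient [1/day]

def simulate(q_water, q_sediment, days, dt=0.1):
    """Forward-Euler integration of the closed linear system
    dQw/dt = -K_WS*Qw + K_SW*Qs,  dQs/dt = +K_WS*Qw - K_SW*Qs."""
    qw, qs = q_water, q_sediment
    for _ in range(int(days / dt)):
        flux = (K_WS * qw - K_SW * qs) * dt   # net water -> sediment flux
        qw -= flux
        qs += flux
    return qw, qs

# Start with all activity (arbitrary units) in the water column.
qw, qs = simulate(100.0, 0.0, days=2000)
# Fluxes balance at equilibrium, so qs/qw tends to K_WS/K_SW = 5.
```

Dropping the sediment pathway (K_WS = 0) leaves all activity in the water, mirroring how the incomplete model in the abstract overestimates activities in water and biota.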

14. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

DEFF Research Database (Denmark)

Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist

2015-01-01

The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration...... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring......) identifiability problems. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort....

15. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

International Nuclear Information System (INIS)

Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

2014-01-01

A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection. Nested sampling has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs, so the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.

16. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

Energy Technology Data Exchange (ETDEWEB)

Elsheikh, Ahmed H., E-mail: aelsheikh@ices.utexas.edu [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Institute of Petroleum Engineering, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Wheeler, Mary F. [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Hoteit, Ibrahim [Department of Earth Sciences and Engineering, King Abdullah University of Science and Technology (KAUST), Thuwal (Saudi Arabia)

2014-02-01

A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling, and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection. Nested sampling has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs, so the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.
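A minimal nested-sampling loop illustrates the evidence estimate the abstract refers to. This toy version replaces the paper's HMC constrained step with simple rejection sampling from the prior, which is adequate only for a one-dimensional problem; everything here is an illustrative assumption, not the authors' implementation.

```python
import math, random

random.seed(0)

LO, HI = -5.0, 5.0   # uniform prior support (prior density 1/10)
N, ITERS = 100, 600  # number of live points and NS iterations

def loglike(theta):
    """Standard-normal log-likelihood for a 1-D toy problem."""
    return -0.5 * theta ** 2 - 0.5 * math.log(2.0 * math.pi)

live = [random.uniform(LO, HI) for _ in range(N)]
logL = [loglike(t) for t in live]
Z, X_prev = 0.0, 1.0
for i in range(1, ITERS + 1):
    worst = min(range(N), key=lambda j: logL[j])
    Lw = logL[worst]
    X = math.exp(-i / N)               # expected remaining prior mass
    Z += math.exp(Lw) * (X_prev - X)   # accumulate evidence
    X_prev = X
    # Constrained step: rejection-sample the prior until L > Lw.
    # (Fine for this toy problem; the paper uses HMC here for efficiency.)
    while True:
        cand = random.uniform(LO, HI)
        if loglike(cand) > Lw:
            live[worst], logL[worst] = cand, loglike(cand)
            break
Z += X_prev * sum(math.exp(l) for l in logL) / N   # remaining live points
# True evidence here is ~0.1: (1/10) * integral of N(0,1) over [-5, 5].
```

In higher dimensions the rejection step becomes hopeless, which is precisely why the paper substitutes HMC with SEM-estimated gradients for the constrained sampling.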

17. Positive Mathematical Programming Approaches – Recent Developments in Literature and Applied Modelling

Directory of Open Access Journals (Sweden)

Thomas Heckelei

2012-05-01

Full Text Available This paper reviews and discusses the more recent literature and application of Positive Mathematical Programming in the context of agricultural supply models. Specifically, advances in the empirical foundation of parameter specifications as well as the economic rationalisation of PMP models – both criticized in earlier reviews – are investigated. Moreover, the paper provides an overview of a larger set of models with regular/repeated policy application that apply variants of PMP. Results show that most applications today avoid arbitrary parameter specifications and rely on exogenous information on supply responses to calibrate model parameters. However, only a few approaches use multiple observations to estimate parameters, which is likely due to the still considerable technical challenges associated with it. Equally, we found only limited reflection on the behavioural or technological assumptions that could rationalise the PMP model structure while still keeping the model's advantages.
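The core PMP idea, calibrating a nonlinear (typically quadratic) cost term so that the programming model exactly reproduces an observed activity level, can be sketched for a single activity. All numbers are hypothetical:

```python
# One-activity Positive Mathematical Programming calibration sketch.
# All numbers are hypothetical; real PMP models calibrate many activities
# using the duals of calibration and resource constraints.
p = 250.0     # output price per unit of activity
c = 150.0     # observed (accounting) linear cost per unit
x_obs = 40.0  # observed activity level the model must reproduce

# Choose the quadratic cost slope so marginal cost equals price at x_obs:
# p = c + q * x_obs  =>  q = (p - c) / x_obs
q = (p - c) / x_obs

def profit(x):
    """Calibrated objective: revenue minus quadratic total cost."""
    return p * x - (c * x + 0.5 * q * x ** 2)

# The calibrated model's unconstrained optimum reproduces the observation.
x_star = (p - c) / q
```

The arbitrariness criticized in the reviews enters exactly here: a single observation pins down only one parameter of the quadratic term, which is why the paper emphasizes exogenous supply-response information or multiple observations.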

18. Assessing the effect of quantitative and qualitative predictors on gastric cancer individuals survival using hierarchical artificial neural network models.

Science.gov (United States)

Amiri, Zohreh; Mohammad, Kazem; Mahmoudi, Mahmood; Parsaeian, Mahbubeh; Zeraati, Hojjat

2013-01-01

There are numerous unanswered questions in the application of artificial neural network models for analysis of survival data. In most studies, independent variables have been studied as qualitative dichotomous variables, and the results of using discrete and continuous quantitative, ordinal, or multinomial categorical predictive variables in these models are not well understood in comparison to conventional models. This study was designed and conducted to examine the application of these models in determining the survival of gastric cancer patients, in comparison to the Cox proportional hazards model. We studied the postoperative survival of 330 gastric cancer patients who underwent surgery at a surgical unit of the Iran Cancer Institute over a five-year period. Covariates of age, gender, history of substance abuse, cancer site, type of pathology, presence of metastasis, stage, and number of complementary treatments were entered in the models, and survival probabilities were calculated at 6, 12, 18, 24, 36, 48, and 60 months using the Cox proportional hazards and neural network models. We estimated coefficients of the Cox model and the weights in the neural network (with 3, 5, and 7 nodes in the hidden layer) in the training group, and used them to derive predictions in the study group. Predictions with these two methods were compared with those of the Kaplan-Meier product limit estimator as the gold standard. Comparisons were performed with the Friedman and Kruskal-Wallis tests. Survival probabilities at different times were determined using the Cox proportional hazards model and a neural network with three nodes in the hidden layer; the ratios of the standard errors of these two methods to that of the Kaplan-Meier method were 1.1593 and 1.0071, respectively. The comparisons revealed significant differences between the Cox model and Kaplan-Meier, between the Cox model and the neural network, and between the neural network and the standard (Kaplan-Meier), as well as better accuracy for the neural network (with 3 nodes in the hidden layer).
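The Kaplan-Meier product-limit estimator used here as the gold standard can be computed directly. A minimal sketch with a toy dataset:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns (time, survival probability) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths > 0:
            s *= (at_risk - deaths) / at_risk   # product-limit update
            curve.append((t, s))
        while i < len(data) and data[i][0] == t:  # skip ties at t
            i += 1
    return curve

# Toy data: 5 patients, the third censored at t=3.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 1])
```

The censored observation at t=3 leaves the curve unchanged but still shrinks the risk set, which is exactly the "proper accounting for censoring" that naive survival proportions get wrong.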

19. Mammographic Density Reduction as a Prognostic Marker for Postmenopausal Breast Cancer: Results Using a Joint Longitudinal-Survival Modeling Approach.

Science.gov (United States)

Andersson, Therese M-L; Crowther, Michael J; Czene, Kamila; Hall, Per; Humphreys, Keith

2017-11-01

Previous studies have linked reductions in mammographic density after a breast cancer diagnosis to an improved prognosis. These studies focused on short-term change, using a 2-stage process, treating estimated change as a fixed covariate in a survival model. We propose the use of a joint longitudinal-survival model. This enables us to model long-term trends in density while accounting for dropout as well as for measurement error. We studied the change in mammographic density after a breast cancer diagnosis and its association with prognosis (measured by cause-specific mortality), overall and with respect to hormone replacement therapy and tamoxifen treatment. We included 1,740 women aged 50-74 years, diagnosed with breast cancer in Sweden during 1993-1995, with follow-up until 2008. They had a total of 6,317 mammographic density measures available from the first 5 years of follow-up, including baseline measures. We found that the impact of the withdrawal of hormone replacement therapy on density reduction was larger than that of tamoxifen treatment. Unlike previous studies, we found that there was an association between density reduction and survival, both for tamoxifen-treated women and women who were not treated with tamoxifen. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.

20. Evaluating the Theoretic Adequacy and Applied Potential of Computational Models of the Spacing Effect.

Science.gov (United States)

Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael

2018-03-02

The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review the relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). PPE combines elements of earlier models of learning and memory, including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We also implemented two other previously published computational models of the spacing effect and compared them to PPE using the theoretic and applied criteria as guides. © 2018 Cognitive Science Society, Inc.
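PPE itself is not reproduced here, but the family of models it draws on can be illustrated with an ACT-R-style base-level learning rule in which each memory trace's decay depends on the activation at study time, in the spirit of Pavlik and Anderson's spacing model. The parameter values below are illustrative assumptions, not PPE's parameters:

```python
import math

# ACT-R-style base-level learning with spacing-dependent decay.
# C and A are illustrative values in the spirit of Pavlik & Anderson (2005).
C, A = 0.217, 0.177

def activation(study_times, now):
    """Memory strength at `now` given earlier study times (same time unit).
    Each trace decays as age**(-d), where d grows with the activation at
    the moment of study, so massed repetitions are forgotten faster."""
    traces = []  # (study time, decay rate) per presentation
    for t in study_times:
        if traces:
            m_t = math.log(sum((t - ti) ** -di for ti, di in traces))
            d = C * math.exp(m_t) + A   # higher activation -> faster decay
        else:
            d = A                        # first presentation: baseline decay
        traces.append((t, d))
    return math.log(sum((now - ti) ** -di for ti, di in traces))

massed = activation([0.0, 1.0, 2.0], now=200.0)
spaced = activation([0.0, 50.0, 100.0], now=200.0)
# At a long retention interval the spaced schedule retains more strength.
```

This is one concrete mechanism that satisfies the kind of theoretic criteria the paper discusses: it reproduces both forgetting over time and the advantage of spaced over massed practice at long retention intervals.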

1. The development of a curved beam element model applied to finite elements method

International Nuclear Information System (INIS)

Bento Filho, A.

1980-01-01

A procedure for the evaluation of the stiffness matrix of a thick curved beam element is developed by means of the minimum potential energy principle, applied to finite elements. The displacement field is prescribed through polynomial expansions, and the interpolation model is determined by comparing results obtained with a sample of different expansions. As a limiting case of the curved beam, three cases of straight beams with different dimensional ratios are analysed employing the proposed approach. Finally, an interpolation model is proposed and applied to a curved beam with large curvature. Displacements and internal stresses are determined and the results are compared with those found in the literature. (Author) [pt]
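As a point of reference for the straight-beam limiting cases analysed in the paper, the classical stiffness matrix of a straight two-node Euler-Bernoulli element is shown below. The thick curved element developed in the paper additionally requires shear deformation and curvature coupling terms, which this sketch deliberately omits; the numeric section properties are illustrative.

```python
import numpy as np

def beam_stiffness(E, I, L):
    """Stiffness matrix of a straight 2-node Euler-Bernoulli beam element,
    DOFs [v1, theta1, v2, theta2]. This is the straight-beam limiting case;
    a thick curved element adds shear deformation and curvature coupling."""
    k = E * I / L ** 3
    return k * np.array([
        [ 12.0,      6 * L,   -12.0,      6 * L],
        [6 * L,  4 * L * L,  -6 * L,  2 * L * L],
        [-12.0,     -6 * L,    12.0,     -6 * L],
        [6 * L,  2 * L * L,  -6 * L,  4 * L * L],
    ])

# Illustrative steel member: E = 210 GPa, I = 8.33e-6 m^4, L = 2 m.
K = beam_stiffness(E=210e9, I=8.33e-6, L=2.0)
# K is symmetric and yields zero nodal forces for rigid-body motion,
# two properties any valid element (straight or curved) must satisfy.
```

The same two checks, symmetry and a rigid-body null space, are standard sanity tests for a curved-element stiffness matrix derived from the minimum potential energy principle.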

2. Soil-applied imidacloprid translocates to ornamental flowers and reduces survival of adult Coleomegilla maculata, Harmonia axyridis, and Hippodamia convergens lady beetles, and larval Danaus plexippus and Vanessa cardui butterflies.

Directory of Open Access Journals (Sweden)

Vera Krischik

Full Text Available Integrated Pest Management (IPM) is a decision-making process used to manage pests that relies on many tactics, including cultural and biological control, which are practices that conserve beneficial insects and mites, and, when needed, the use of conventional insecticides. However, systemic, soil-applied neonicotinoid insecticides are translocated to pollen and nectar of flowers, often for months, and may reduce survival of flower-feeding beneficial insects. Imidacloprid seed-treated crops (0.05 mg AI (active ingredient)/canola seed and 1.2 mg AI/corn seed) translocate less than 10 ppb to pollen and nectar. However, higher rates of soil-applied imidacloprid are used in nurseries and urban landscapes, such as 300 mg AI/10 L (3 gallon) pot and 69 g AI applied to the soil under a 61 cm (24 in) diam. tree. Translocation of imidacloprid from soil (300 mg AI) to flowers of Asclepias curassavica resulted in 6,030 ppb in 1X and 10,400 ppb in 2X treatments, which are similar to imidacloprid residues found in another plant species we studied. A second imidacloprid soil application 7 months later resulted in 21,000 ppb in 1X and 45,000 ppb in 2X treatments. Consequently, greenhouse/nursery use of imidacloprid applied to flowering plants can result in 793 to 1,368 times higher concentration compared to an imidacloprid seed treatment (7.6 ppb pollen in seed-treated canola), where most research has focused. These higher imidacloprid levels caused significant mortality in both 1X and 2X treatments in 3 lady beetle species, Coleomegilla maculata, Harmonia axyridis, and Hippodamia convergens, but not in a fourth species, Coccinella septempunctata. Adult survival was not reduced for monarch (Danaus plexippus) and painted lady (Vanessa cardui) butterflies, but larval survival was significantly reduced. The use of the neonicotinoid imidacloprid at greenhouse/nursery rates reduced survival of beneficial insects feeding on pollen and nectar and is incompatible with the principles of IPM.

3. Soil-applied imidacloprid translocates to ornamental flowers and reduces survival of adult Coleomegilla maculata, Harmonia axyridis, and Hippodamia convergens lady beetles, and larval Danaus plexippus and Vanessa cardui butterflies.

Science.gov (United States)

Krischik, Vera; Rogers, Mary; Gupta, Garima; Varshney, Aruna

2015-01-01

Integrated Pest Management (IPM) is a decision-making process used to manage pests that relies on many tactics, including cultural and biological control, which are practices that conserve beneficial insects and mites, and, when needed, the use of conventional insecticides. However, systemic, soil-applied neonicotinoid insecticides are translocated to pollen and nectar of flowers, often for months, and may reduce survival of flower-feeding beneficial insects. Imidacloprid seed-treated crops (0.05 mg AI (active ingredient)/canola seed and 1.2 mg AI/corn seed) translocate less than 10 ppb to pollen and nectar. However, higher rates of soil-applied imidacloprid are used in nurseries and urban landscapes, such as 300 mg AI/10 L (3 gallon) pot and 69 g AI applied to the soil under a 61 cm (24 in) diam. tree. Translocation of imidacloprid from soil (300 mg AI) to flowers of Asclepias curassavica resulted in 6,030 ppb in 1X and 10,400 ppb in 2X treatments, which are similar to imidacloprid residues found in another plant species we studied. A second imidacloprid soil application 7 months later resulted in 21,000 ppb in 1X and 45,000 ppb in 2X treatments. Consequently, greenhouse/nursery use of imidacloprid applied to flowering plants can result in 793 to 1,368 times higher concentration compared to an imidacloprid seed treatment (7.6 ppb pollen in seed-treated canola), where most research has focused. These higher imidacloprid levels caused significant mortality in both 1X and 2X treatments in 3 lady beetle species, Coleomegilla maculata, Harmonia axyridis, and Hippodamia convergens, but not in a fourth species, Coccinella septempunctata. Adult survival was not reduced for monarch (Danaus plexippus) and painted lady (Vanessa cardui) butterflies, but larval survival was significantly reduced. The use of the neonicotinoid imidacloprid at greenhouse/nursery rates reduced survival of beneficial insects feeding on pollen and nectar and is incompatible with the principles of IPM.

4. Soil-Applied Imidacloprid Translocates to Ornamental Flowers and Reduces Survival of Adult Coleomegilla maculata, Harmonia axyridis, and Hippodamia convergens Lady Beetles, and Larval Danaus plexippus and Vanessa cardui Butterflies

Science.gov (United States)

Krischik, Vera; Rogers, Mary; Gupta, Garima; Varshney, Aruna

2015-01-01

Integrated Pest Management (IPM) is a decision-making process used to manage pests that relies on many tactics, including cultural and biological control, which are practices that conserve beneficial insects and mites, and, when needed, the use of conventional insecticides. However, systemic, soil-applied neonicotinoid insecticides are translocated to pollen and nectar of flowers, often for months, and may reduce survival of flower-feeding beneficial insects. Imidacloprid seed-treated crops (0.05 mg AI (active ingredient)/canola seed and 1.2 mg AI/corn seed) translocate less than 10 ppb to pollen and nectar. However, higher rates of soil-applied imidacloprid are used in nurseries and urban landscapes, such as 300 mg AI/10 L (3 gallon) pot and 69 g AI applied to the soil under a 61 cm (24 in) diam. tree. Translocation of imidacloprid from soil (300 mg AI) to flowers of Asclepias curassavica resulted in 6,030 ppb in 1X and 10,400 ppb in 2X treatments, which are similar to imidacloprid residues found in another plant species we studied. A second imidacloprid soil application 7 months later resulted in 21,000 ppb in 1X and 45,000 ppb in 2X treatments. Consequently, greenhouse/nursery use of imidacloprid applied to flowering plants can result in 793 to 1,368 times higher concentration compared to an imidacloprid seed treatment (7.6 ppb pollen in seed-treated canola), where most research has focused. These higher imidacloprid levels caused significant mortality in both 1X and 2X treatments in 3 lady beetle species, Coleomegilla maculata, Harmonia axyridis, and Hippodamia convergens, but not in a fourth species, Coccinella septempunctata. Adult survival was not reduced for monarch (Danaus plexippus) and painted lady (Vanessa cardui) butterflies, but larval survival was significantly reduced. The use of the neonicotinoid imidacloprid at greenhouse/nursery rates reduced survival of beneficial insects feeding on pollen and nectar and is incompatible with the principles of IPM.

5. The IT Advantage Assessment Model: Applying an Expanded Value Chain Model to Academia

Science.gov (United States)

Turner, Walter L.; Stylianou, Antonis C.

2004-01-01

Academia faces an uncertain future as the 21st century unfolds. New demands, discerning students, and increased competition from non-traditional competitors are just a few of the forces demanding a response. The use of information technology (IT) in academia has not kept pace with its use in industry. What has been lacking is a model for the strategic…

6. MODELLING AND SIMULATING RISKS IN THE TRAINING OF THE HUMAN RESOURCES BY APPLYING THE CHAOS THEORY

OpenAIRE

Eugen ROTARESCU

2012-01-01

The article approaches the modelling and simulation of risks in the training of human resources, as well as the forecasting of the degree of human resources training impacted by risks, by applying the mathematical tools offered by Chaos Theory and mathematical statistics. We highlight that the levels of knowledge, skills and abilities of the human resources in an organization are autocorrelated in time and depend on the training level at a previous moment, as well as on ...

7. Aggregate Demand–Inflation Adjustment Model Applied to Southeast European Economies

Directory of Open Access Journals (Sweden)

Apostolov Mico

2016-01-01

Applying the IS-MP-IA model and the Taylor rule to selected Southeast European economies (Albania, Bosnia and Herzegovina, Macedonia and Serbia), we find that a change in the effective exchange rate positively affects output, while a change in the world interest rate affects output negatively or not at all, and that additional world output would help to increase the output of the selected economies.
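For readers unfamiliar with the monetary-policy (MP) side of the IS-MP-IA framework, here is a minimal sketch of the textbook Taylor rule. The 0.5 weights and 2% neutral values are the classic illustrative choices from Taylor (1993), not coefficients estimated in this paper:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_target=2.0):
    """Textbook Taylor rule: nominal policy rate in percent.

    r_star (equilibrium real rate) and pi_target are the classic 2%
    illustrative values; an estimated specification may differ.
    """
    return inflation + r_star + 0.5 * (inflation - pi_target) + 0.5 * output_gap

# At target inflation with a closed output gap, the rule returns the
# neutral nominal rate: 2% real rate + 2% inflation.
print(taylor_rate(inflation=2.0, output_gap=0.0))  # 4.0
```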

8. Multidisciplinary Management: Model of Excellence in the Management Applied to Products and Services

OpenAIRE

Guerreiro , Evandro ,; Costa Neto , Pedro ,; Moreira Filho , Ulysses ,

2014-01-01

Part 1: Knowledge-Based Performance Improvement; International audience; The Multidisciplinary Management is the guiding vision of modern organizations and the systems thinking which requires new approaches to organizational excellence and quality management process. The objective of this article is to present a model for multidisciplinary management of quality applied to products and services based on American, Japanese, and Brazilian National Quality Awards. The methodology used to build th...

9. Assimilation of tourism satellite accounts and applied general equilibrium models to inform tourism policy analysis

OpenAIRE

Rossouw, Riaan; Saayman, Melville

2011-01-01

Historically, tourism policy analysis in South Africa has posed challenges to accurate measurement. The primary reason for this is that tourism is not designated as an 'industry' in standard economic accounts. This paper therefore demonstrates the relevance and need for applied general equilibrium (AGE) models to be completed and extended through an integration with tourism satellite accounts (TSAs) as a tool for policy makers (especially tourism policy makers) in South Africa. The paper sets...

10. Applying nonlinear MODM model to supply chain management with quantity discount policy under complex fuzzy environment

Directory of Open Access Journals (Sweden)

Zhe Zhang

2014-06-01

Purpose: The aim of this paper is to deal with supply chain management (SCM) with a quantity discount policy under a complex fuzzy environment, characterized by bi-fuzzy variables. Taking into account the strategy and the process of decision making, a bi-fuzzy nonlinear multiple objective decision making (MODM) model is presented to solve the proposed problem. Design/methodology/approach: The bi-fuzzy variables in the MODM model are transformed into trapezoidal fuzzy variables by the DMs' degrees of optimism ?1 and ?2, which are subsequently de-fuzzified by the expected value index. For solving the complex nonlinear model, a multi-objective adaptive particle swarm optimization algorithm (MO-APSO) is designed as the solution method. Findings: The proposed model and algorithm are applied to a typical example of an SCM problem to illustrate their effectiveness. Based on a sensitivity analysis of the results, the bi-fuzzy nonlinear MODM SCM model is shown to be sensitive to the possibility level ?1. Practical implications: The study focuses on SCM under a complex fuzzy environment, which has great practical significance; the bi-fuzzy MODM model and MO-APSO can therefore be further applied to SCM problems with quantity discount policies. Originality/value: The bi-fuzzy variable is employed in the nonlinear MODM model of SCM to characterize the hybrid uncertain environment, and this work is original. In addition, a hybrid crisp approach is proposed to transform the model into an equivalent crisp one by the DMs' degree of optimism and the expected value index. Since the MODM model considers the bi-fuzzy environment and the quantity discount policy, the paper has great practical significance.

11. Applying Spatially Distributed Rainfall to a Hydrological Model in a Tropical Watershed, Manoa Watershed, in Hawaii

Science.gov (United States)

Huang, Y. F.; Tsang, Y. P.

2017-12-01

Rainfall in Hawaii is characterized by high spatial and temporal variability. On the south side of Oahu, the Manoa watershed, with an area of 11 km2, has an annual maximum rainfall of 3,900 mm and a minimum of 1,000 mm. Despite this high spatial heterogeneity, the rain gage network seems insufficient to capture this pattern. When simulating stream flow and predicting floods with hydrological models in Hawaii, model performance is often unsatisfactory because of inadequate representation of rainfall data. Longman et al. (in prep.) have developed spatially distributed daily rainfall across the Hawaiian Islands by applying ordinary kriging, yet these data have not been applied to hydrological models. In this study, we used the Soil and Water Assessment Tool (SWAT) model to assess streamflow simulation by applying spatially distributed rainfall in the Manoa watershed. We first used point daily rainfall at Lyon Arboretum from the National Centers for Environmental Information (NCEI) as the uniform rainfall input. Second, we summarized sub-watershed mean rainfall from the daily spatial-statistical rainfall. Both rainfall datasets are available from 1999 to 2014. SWAT was set up with a five-year warm-up, nine-year calibration, and two-year validation. The model parameters were calibrated and validated against four U.S. Geological Survey stream gages. We compared the calibrated watershed parameters and characteristics and assessed the streamflow hydrographs from the two rainfall inputs. The differences and improvements from using spatially distributed rainfall input in SWAT are discussed. In addition to improving the model through better representation of rainfall, this study helped us better understand the hydrological response of watersheds in Hawaii.

12. Dynamic model reduction using data-driven Loewner-framework applied to thermally morphing structures

Science.gov (United States)

Phoenix, Austin A.; Tarazaga, Pablo A.

2017-05-01

The work herein proposes the use of the data-driven Loewner framework for reduced order modeling as applied to dynamic Finite Element Models (FEM) of thermally morphing structures. The Loewner-based modeling approach is computationally efficient and accurately constructs reduced models using analytical output data from a FEM. This paper details the two-step process proposed in the Loewner approach. First, a random vibration FEM simulation is used as the input for the development of a Single Input Single Output (SISO) data-based dynamic Loewner state space model. Second, an SVD-based truncation is used on the Loewner state space model, such that the minimal, dynamically representative, state space model is achieved. For this second part, varying levels of reduction are generated and compared. The work herein can be extended to model generation using experimental measurements by replacing the FEM output data in the first step and following the same procedure. The method is demonstrated on two thermally morphing structures, a rigidly fixed hexapod in multiple geometric configurations and a low-mass anisotropic morphing boom. The paper details the method and identifies the benefits of the reduced-model methodology.
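The core of the two-step procedure described above can be sketched with a toy transfer function standing in for FEM output data. This is an illustration of the standard Loewner construction, not the authors' code: the Loewner matrix is built from two disjoint sets of interpolation data, and its SVD reveals the order at which to truncate.

```python
import numpy as np

# Toy first-order system in place of real FEM frequency-response data.
H = lambda s: 1.0 / (s + 1.0)

mu = 1j * np.array([0.1, 1.0, 10.0])    # "left" interpolation points
lam = 1j * np.array([0.5, 5.0, 50.0])   # "right" interpolation points
V, W = H(mu), H(lam)

# Loewner matrix: L[i, j] = (V[i] - W[j]) / (mu[i] - lam[j])
L = (V[:, None] - W[None, :]) / (mu[:, None] - lam[None, :])

# SVD-based truncation step: the numerical rank of L indicates the
# minimal order of the underlying system (here 1, since H is first order).
s = np.linalg.svd(L, compute_uv=False)
rank = int(np.sum(s > 1e-10 * s[0]))
print(rank)  # 1
```

In the paper's setting the samples come from the FEM simulation rather than a known rational function, and varying the truncation threshold gives the different levels of reduction that are compared.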

13. Nonlinear models applied to seed germination of Rhipsalis cereuscula Haw (Cactaceae

Directory of Open Access Journals (Sweden)

Terezinha Aparecida Guedes

2014-09-01

The objective of this analysis was to fit germination data of Rhipsalis cereuscula Haw seeds to the Weibull model with three parameters using Frequentist and Bayesian methods. Five parameterizations were compared in the Bayesian analysis to set the prior distribution. The parameter estimates from the Frequentist method were similar to the Bayesian responses under the following non-informative a priori distributions for the parameter vectors: gamma(10³, 10³) in model M1, normal(0, 10⁶) in model M2, uniform(0, Lsup) in model M3, exp(μ) in model M4, and Lnormal(μ, 10⁶) in model M5. However, to achieve convergence in models M4 and M5, we applied the μ from the estimates of the Frequentist approach. The best models fitted by the Bayesian method were M1 and M3. The adequacy of these models was based on advantages over the Frequentist method, such as reduced computational effort and the possibility of comparison.
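As a point of reference for the Frequentist side of such a comparison, a three-parameter Weibull germination curve can be fitted by nonlinear least squares. The parameterization below (maximum germination fraction M, scale b, shape c) is one common choice, and the data are synthetic, not the Rhipsalis cereuscula measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull(t, M, b, c):
    # M: maximum germination fraction, b: scale (days), c: shape
    return M * (1.0 - np.exp(-((t / b) ** c)))

# Synthetic cumulative-germination data with small observation noise.
rng = np.random.default_rng(1)
t = np.linspace(1, 30, 30)
y = weibull(t, 0.9, 10.0, 2.5) + rng.normal(0, 0.01, t.size)

popt, _ = curve_fit(weibull, t, y, p0=[0.8, 8.0, 2.0])
print(np.round(popt, 2))  # close to the true (0.9, 10.0, 2.5)
```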

14. Models, validation, and applied geochemistry: Issues in science, communication, and philosophy

International Nuclear Information System (INIS)

Kirk Nordstrom, D.

2012-01-01

Models have become so fashionable that many scientists and engineers cannot imagine working without them. The predominant use of computer codes to execute model calculations has blurred the distinction between code and model. The recent controversy regarding model validation has brought into question what we mean by a ‘model’ and by ‘validation.’ It has become apparent that the usual meaning of validation may be common in engineering practice and seems useful in legal practice but it is contrary to scientific practice and brings into question our understanding of science and how it can best be applied to such problems as hazardous waste characterization, remediation, and aqueous geochemistry in general. This review summarizes arguments against using the phrase model validation and examines efforts to validate models for high-level radioactive waste management and for permitting and monitoring open-pit mines. Part of the controversy comes from a misunderstanding of ‘prediction’ and the need to distinguish logical from temporal prediction. Another problem stems from the difference in the engineering approach contrasted with the scientific approach. The reductionist influence on the way we approach environmental investigations also limits our ability to model the interconnected nature of reality. Guidelines are proposed to improve our perceptions and proper utilization of models. Use of the word ‘validation’ is strongly discouraged when discussing model reliability.

15. A simple mathematical model of society collapse applied to Easter Island

Science.gov (United States)

Bologna, M.; Flores, J. C.

2008-02-01

In this paper we consider a mathematical model for the evolution and collapse of the Easter Island society. Based on historical reports, the available primary resources consisted almost exclusively of trees, so we describe the inhabitants and the resources as an isolated dynamical system. A mathematical and numerical analysis of the Easter Island community collapse is performed. In particular, we analyze the critical values of the fundamental parameters, and a demographic curve is presented. The technological parameter, quantifying the exploitation of the resources, is calculated and applied to the case of another extinct civilization (Copán Maya), confirming the consistency of the adopted model.
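A minimal sketch in the spirit of such resource-population models (a Brander-Taylor-style system with made-up parameters, not the paper's calibrated model) reproduces the characteristic overshoot-and-collapse demographic curve:

```python
import numpy as np

# Isolated society: logistic resource (trees) harvested by a population
# whose growth tracks resource availability. All parameters illustrative.
r, K = 0.05, 1.0          # slow resource regrowth rate, carrying capacity
h, b, d = 0.2, 1.0, 0.05  # harvest rate, conversion efficiency, death rate

dt, steps = 0.05, 8000
R, P = 1.0, 0.01          # start: full forest, small founding population
Rs, Ps = [R], [P]
for _ in range(steps):
    dR = r * R * (1 - R / K) - h * P * R   # regrowth minus harvest
    dP = P * (b * h * R - d)               # growth tied to harvested resource
    R, P = max(R + dR * dt, 0.0), max(P + dP * dt, 0.0)
    Rs.append(R)
    Ps.append(P)

Rs, Ps = np.array(Rs), np.array(Ps)
# Population overshoots its sustainable level, then crashes as trees run out.
print(round(Ps.max(), 3), round(Ps[-1], 3), round(Rs.min(), 3))
```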

16. A generalised chemical precipitation modelling approach in wastewater treatment applied to calcite

DEFF Research Database (Denmark)

Mbamba, Christian Kazadi; Batstone, Damien J.; Flores Alsina, Xavier

2015-01-01

, the present study aims to identify a broadly applicable precipitation modelling approach. The study uses two experimental platforms applied to calcite precipitating from synthetic aqueous solutions to identify and validate the model approach. Firstly, dynamic pH titration tests are performed to define...... an Arrhenius-style correction of kcryst. The influence of magnesium (a common and representative added impurity) on kcryst was found to be significant but was considered an optional correction because of a lesser influence as compared to that of temperature. Other variables such as ionic strength and pH were...

17. Active lubrication applied to radial gas journal bearings. Part 2: Modelling improvement and experimental validation

DEFF Research Database (Denmark)

Pierart, Fabián G.; Santos, Ilmar F.

2016-01-01

Actively-controlled lubrication techniques are applied to radial gas bearings aiming at mitigating one of their most critical drawbacks, their lack of damping. A model-based control design approach is presented using simple feedback control laws, i.e. proportional controllers. The design approach...... by finite element method and the global model is used as a control design tool. Active lubrication allows for a significant increase in the damping factor of the rotor-bearing system. Very good agreement between theory and experiment is obtained, supporting the multi-physics design tool developed....

18. Graft survival and cytokine production profile after limbal transplantation in the experimental mouse model

Czech Academy of Sciences Publication Activity Database

Lenčová, Anna; Pokorná, Kateřina; Zajícová, Alena; Krulová, Magdalena; Filipec, M.; Holáň, Vladimír

2011-01-01

Roč. 24, č. 3 (2011), s. 189-194 ISSN 0966-3274 R&D Projects: GA AV ČR KAN200520804; GA MŠk 1M0506; GA ČR GD310/08/H077 Institutional research plan: CEZ:AV0Z50520514 Keywords : limbal transplantation * graft survival * cytokine response Subject RIV: EC - Immunology Impact factor: 1.459, year: 2011

19. Model description and evaluation of the mark-recapture survival model used to parameterize the 2012 status and threats analysis for the Florida manatee (Trichechus manatus latirostris)

Science.gov (United States)

Langtimm, Catherine A.; Kendall, William L.; Beck, Cathy A.; Kochman, Howard I.; Teague, Amy L.; Meigs-Friend, Gaia; Peñaloza, Claudia L.

2016-11-30

This report provides supporting details and evidence for the rationale, validity and efficacy of a new mark-recapture model, the Barker Robust Design, to estimate regional manatee survival rates used to parameterize several components of the 2012 version of the Manatee Core Biological Model (CBM) and Threats Analysis (TA).  The CBM and TA provide scientific analyses on population viability of the Florida manatee subspecies (Trichechus manatus latirostris) for U.S. Fish and Wildlife Service’s 5-year reviews of the status of the species as listed under the Endangered Species Act.  The model evaluation is presented in a standardized reporting framework, modified from the TRACE (TRAnsparent and Comprehensive model Evaluation) protocol first introduced for environmental threat analyses.  We identify this new protocol as TRACE-MANATEE SURVIVAL and this model evaluation specifically as TRACE-MANATEE SURVIVAL, Barker RD version 1. The longer-term objectives of the manatee standard reporting format are to (1) communicate to resource managers consistent evaluation information over sequential modeling efforts; (2) build understanding and expertise on the structure and function of the models; (3) document changes in model structures and applications in response to evolving management objectives, new biological and ecological knowledge, and new statistical advances; and (4) provide greater transparency for management and research review.

20. Transoral endoscopic esophageal myotomy based on esophageal function testing in a survival porcine model.

Science.gov (United States)

Perretta, Silvana; Dallemagne, Bernard; Donatelli, Gianfranco; Diemunsch, Pierre; Marescaux, Jacques

2011-01-01

The most effective treatment of achalasia is Heller myotomy. To explore a submucosal endoscopic myotomy technique tailored on esophageal physiology testing and to compare it with the open technique. Prospective acute and survival comparative study in pigs (n = 12; 35 kg). University animal research center. Eight acute-4 open and 4 endoscopic-myotomies followed by 4 survival endoscopic procedures. Preoperative and postoperative manometry; esophagogastric junction (EGJ) distensibility before and after selective division of muscular fibers at the EGJ and after the myotomy was prolonged to a standard length by using the EndoFLIP Functional Lumen Imaging Probe (Crospon, Galway, Ireland). All procedures were successful, with no intraoperative and postoperative complications. In the survival group, the animals recovered promptly from surgery. Postoperative manometry demonstrated a 50% drop in mean lower esophageal sphincter pressure (LESp) in the endoscopic group (mean preoperative LESp, 22.2 ± 3.3 mm Hg; mean postoperative LESp, 11.34 ± 2.7 mm Hg; P open procedure group (mean preoperative LESp, 24.2 ± 3.2 mm Hg; mean postoperative LESp, 7.4 ± 4 mm Hg; P myotomy is feasible and safe. The lack of a significant difference in EGJ distensibility between the open and endoscopic procedure is very appealing. Were it to be perfected in a human population, this endoscopic approach could suggest a new strategy in the treatment of selected achalasia patients. Copyright © 2011 American Society for Gastrointestinal Endoscopy. Published by Mosby, Inc. All rights reserved.

1. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety- Critical Software

International Nuclear Information System (INIS)

Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

2006-01-01

As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For estimating the reliability of safety-critical software (software used in safety-critical digital systems), Bayesian Belief Networks (BBNs) seem to be most widely used. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, we can instead directly estimate the reliability of the software using software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally known that software reliability growth models cannot be applied to safety-critical software due to the small number of failures expected from the testing of safety-critical software, we try to find the possibilities, and the corresponding limitations, of applying software reliability growth models to safety-critical software.
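As a concrete illustration of the direct-estimation route mentioned above, the Goel-Okumoto NHPP mean-value function m(t) = a(1 - e^(-bt)) can be fitted to cumulative failure counts from testing. The data below are synthetic; a is the total expected number of faults and b the per-fault detection rate:

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    # Expected cumulative number of failures observed by time t.
    return a * (1.0 - np.exp(-b * t))

t = np.arange(1, 21, dtype=float)   # test weeks
true_a, true_b = 50.0, 0.15         # ground truth for the synthetic data
rng = np.random.default_rng(7)
counts = goel_okumoto(t, true_a, true_b) + rng.normal(0, 1.0, t.size)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, counts, p0=[40.0, 0.1])
remaining = a_hat - goel_okumoto(t[-1], a_hat, b_hat)  # expected residual faults
print(round(a_hat, 1), round(b_hat, 3), round(remaining, 1))
```

The sparsity problem the abstract raises is visible here: with the very few failures expected from safety-critical software testing, a two-parameter fit like this becomes poorly identified.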

2. Identifying 'unhealthy' food advertising on television: a case study applying the UK Nutrient Profile model.

Science.gov (United States)

Jenkin, Gabrielle; Wilson, Nick; Hermanson, Nicole

2009-05-01

3. Applying ecological models to communities of genetic elements: the case of neutral theory.

Science.gov (United States)

Linquist, Stefan; Cottenie, Karl; Elliott, Tyler A; Saylor, Brent; Kremer, Stefan C; Gregory, T Ryan

2015-07-01

A promising recent development in molecular biology involves viewing the genome as a mini-ecosystem, where genetic elements are compared to organisms and the surrounding cellular and genomic structures are regarded as the local environment. Here, we critically evaluate the prospects of ecological neutral theory (ENT), a popular model in ecology, as it applies at the genomic level. This assessment requires an overview of the controversy surrounding neutral models in community ecology. In particular, we discuss the limitations of using ENT both as an explanation of community dynamics and as a null hypothesis. We then analyse a case study in which ENT has been applied to genomic data. Our central finding is that genetic elements do not conform to the requirements of ENT once its assumptions and limitations are made explicit. We further compare this genome-level application of ENT to two other, more familiar approaches in genomics that rely on neutral mechanisms: Kimura's molecular neutral theory and Lynch's mutational-hazard model. Interestingly, this comparison reveals that there are two distinct concepts of neutrality associated with these models, which we dub 'fitness neutrality' and 'competitive neutrality'. This distinction helps to clarify the various roles for neutral models in genomics, for example in explaining the evolution of genome size. © 2015 John Wiley & Sons Ltd.

4. Dynamic Models Applied to Landslides: Study Case Angangueo, MICHOACÁN, MÉXICO.

Science.gov (United States)

Torres Fernandez, L.; Hernández Madrigal, V. M., , Dr; Capra, L.; Domínguez Mota, F. J., , Dr

2017-12-01

Most existing models for landslide zonation are static and do not consider the dynamic behavior of the triggering factor. This results in a limited representation of the actual zonation of slope instability: such models have only short-term validity, cannot be applied to the design of early warning systems, etc. In Mexico in particular, these models are static because they do not consider triggering factors such as precipitation. In this work, we present a numerical evaluation of landslide susceptibility based on probabilistic methods. These rely on time series generated from meteorological stations; because the available information is limited, an interpolation is performed to simulate precipitation in the zone. The resulting information is integrated in PCRaster and, in conjunction with the conditioning factors, makes it possible to generate a dynamic model. This model will be applied to landslide zoning in the municipality of Angangueo, which is characterized by frequent debris and mud flows and by translational and rotational landslides, triggered by atypical precipitation such as that recorded in 2010, which caused economic and human losses. With these models, it would be possible to generate probable scenarios that help the population of Angangueo reduce risk and carry out ongoing resilience activities.

5. Online Cancer Information Seeking: Applying and Extending the Comprehensive Model of Information Seeking.

Science.gov (United States)

Van Stee, Stephanie K; Yang, Qinghua

2017-10-30

This study applied the comprehensive model of information seeking (CMIS) to online cancer information and extended the model by incorporating an exogenous variable: interest in online health information exchange with health providers. A nationally representative sample from the Health Information National Trends Survey 4 Cycle 4 was analyzed to examine the extended CMIS in predicting online cancer information seeking. Findings from a structural equation model supported most of the hypotheses derived from the CMIS, as well as the extension of the model related to interest in online health information exchange. In particular, socioeconomic status, beliefs, and interest in online health information exchange predicted utility. Utility, in turn, predicted online cancer information seeking, as did information-carrier characteristics. An unexpected but important finding from the study was the significant, direct relationship between cancer worry and online cancer information seeking. Theoretical and practical implications are discussed.

6. Adapted strategic planning model applied to small business: a case study in the fitness area

Directory of Open Access Journals (Sweden)

Eduarda Tirelli Hennig

2012-06-01

Strategic planning is an important management tool in the corporate scenario and should not be restricted to big companies. However, this kind of planning process in small businesses may need special adaptations due to their own characteristics. This paper aims to identify and adapt existing models of strategic planning to the scenario of a small business in the fitness area. Initially, a comparative study among models from different authors is carried out to identify their phases and activities. Then, we define which of these phases and activities should be present in a model to be used in a small business. That model was applied to a Pilates studio; it involves the establishment of an organizational identity, an environmental analysis, and the definition of strategic goals, strategies, and actions to reach them. Finally, benefits to the organization could be identified, as well as hurdles in the implementation of the tool.

7. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

DEFF Research Database (Denmark)

Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

2008-01-01

In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine whether one model setup gives a better goodness of fit, conditional on the observations, than the other. Moreover, different methodological investigations of GLUE are conducted in order to test whether the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it proved quite difficult to get good fits of the whole time series.
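The GLUE recipe itself is compact: sample parameter sets from a prior, score each simulation with a likelihood measure such as Nash-Sutcliffe efficiency, discard non-behavioral sets below a threshold, and weight the survivors to form prediction bounds. A minimal sketch on a toy linear-reservoir model (illustrative only; MOUSE is a commercial model and is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
rain = rng.gamma(0.5, 2.0, 100)  # synthetic rainfall forcing

def reservoir(k, rain):
    """Toy linear reservoir: a fraction k of storage drains each step."""
    s, flow = 0.0, []
    for r in rain:
        s += r
        q = k * s
        s -= q
        flow.append(q)
    return np.array(flow)

obs = reservoir(0.3, rain) + rng.normal(0, 0.05, rain.size)  # "observations"

# GLUE: Monte Carlo sample the parameter from a uniform prior, score each
# run with Nash-Sutcliffe efficiency, keep the behavioral sets, weight them.
ks = rng.uniform(0.05, 0.9, 2000)
sims = np.array([reservoir(k, rain) for k in ks])
nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

behavioral = nse > 0.7                          # acceptance threshold
weights = nse[behavioral] / nse[behavioral].sum()
pred = weights @ sims[behavioral]               # likelihood-weighted prediction
band = np.percentile(sims[behavioral], [5, 95], axis=0)  # uncertainty band
print(int(behavioral.sum()), round(float(np.abs(pred - obs).mean()), 3))
```

Comparing different conceptual model setups, as in the paper, amounts to repeating this sampling for each setup and comparing the resulting likelihood distributions.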

8. Twice random, once mixed: applying mixed models to simultaneously analyze random effects of language and participants.

Science.gov (United States)

Janssen, Dirk P

2012-03-01

Psychologists, psycholinguists, and other researchers using language stimuli have been struggling for more than 30 years with the problem of how to analyze experimental data that contain two crossed random effects (items and participants). The classical analysis of variance does not apply; alternatives have been proposed but have failed to catch on, and a statistically unsatisfactory procedure of using two approximations (known as F(1) and F(2)) has become the standard. A simple and elegant solution using mixed-model analysis has been available for 15 years, and recent improvements in statistical software have made mixed-model analysis widely available. The aim of this article is to increase the use of mixed models by giving a concise practical introduction and by giving clear directions for undertaking the analysis in the most popular statistical packages. The article also introduces the DJMIXED add-on package for SPSS, which makes entering the models and reporting their results as straightforward as possible.

9. Evaluation model applied to TRANSPETRO's Marine Terminals Standardization Program

Energy Technology Data Exchange (ETDEWEB)

Almeida, Maria Fatima Ludovico de; Mueller, Gabriela [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Garcia, Luciano Maldonado [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

2009-07-01

This paper describes an innovative evaluation model applied to TRANSPETRO's 'Marine Terminals Standardization Program', based on current approaches to program evaluation and organizational learning. Since the program was launched in 2004, the need for an evaluation model able to evaluate its implementation progress, to measure the degree of standards compliance, and to gauge its potential economic, social and environmental impacts has become evident. Within a vision of safe and environmentally responsible operation of marine terminals, this evaluation model was jointly designed by TRANSPETRO and PUC-Rio to promote continuous improvement and learning in operational practices and in the standardization process itself. TRANSPETRO believes that standardization supports its services and management innovation capability by creating objective and internationally recognized parameters, targets and metrology for its business activities. The conceptual model and application guidelines for this important tool are presented in this paper, as well as the next steps towards its implementation. (author)

10. Growth, survival, and peptidolytic activity of Lactobacillus plantarum I91 in a hard-cheese model.

Science.gov (United States)

Bergamini, C V; Peralta, G H; Milesi, M M; Hynes, E R

2013-09-01

In this work, we studied the growth, survival, and peptidolytic activity of Lactobacillus plantarum I91 in a hard-cheese model consisting of a sterile extract of Reggianito cheese. To assess the influence of the primary starter and initial proteolysis level on these parameters, we prepared the extracts with cheeses that were produced using 2 different starter strains of Lactobacillus helveticus 138 or 209 (Lh138 or Lh209) at 3 ripening times: 3, 90, and 180 d. The experimental extracts were inoculated with Lb. plantarum I91; the control extracts were not inoculated and the blank extracts were heat-treated to inactivate enzymes and were not inoculated. All extracts were incubated at 34°C for 21 d, and then the pH, microbiological counts, and proteolysis profiles were determined. The basal proteolysis profiles in the extracts of young cheeses made with either strain tested were similar, but many differences between the proteolysis profiles of the extracts of the Lh138 and Lh209 cheeses were found when riper cheeses were used. The pH values in the blank and control extracts did not change, and no microbial growth was detected. In contrast, the pH value in experimental extracts decreased, and this decrease was more pronounced in extracts obtained from either of the young cheeses and from the Lh209 cheese at any stage of ripening. Lactobacillus plantarum I91 grew up to 8 log during the first days of incubation in all of the extracts, but then the number of viable cells decreased, the extent of which depended on the starter strain and the age of the cheese used for the extract. The decrease in the counts of Lb. plantarum I91 was observed mainly in the extracts in which the pH had diminished the most. In addition, the extracts that best supported the viability of Lb. plantarum I91 during incubation had the highest free amino acids content. The effect of Lb. plantarum I91 on the proteolysis profile of the extracts was marginal. Significant changes in the content of free

11. Challenges to Applying a Metamodel for Groundwater Flow Beyond Underlying Numerical Model Boundaries

Science.gov (United States)

Reeves, H. W.; Fienen, M. N.; Feinstein, D.

2015-12-01

Metamodels of environmental behavior offer opportunities for decision support, adaptive management, and increased stakeholder engagement through participatory modeling and model exploration. Metamodels are derived from calibrated, computationally demanding, numerical models. They may potentially be applied to non-modeled areas to provide screening or preliminary analysis tools for areas that do not yet have the benefit of more comprehensive study. In this decision-support mode, they may be fulfilling a role often accomplished by application of analytical solutions. The major challenge to transferring a metamodel to a non-modeled area is how to quantify the spatial data in the new area of interest in such a way that it is consistent with the data used to derive the metamodel. Tests based on transferring a metamodel derived from a numerical groundwater-flow model of the Lake Michigan Basin to other glacial settings across the northern U.S. show that the spatial scale of the numerical model must be appropriately scaled to adequately represent different settings. Careful GIS analysis of the numerical model, metamodel, and new area of interest is required for successful transfer of results.

12. Estimating survival of dental fillings on the basis of interval-censored data and multi-state models

DEFF Research Database (Denmark)

Joly, Pierre; Gerds, Thomas A; Qvist, Vibeke

2012-01-01

We aim to compare the life expectancy of a filling in a primary tooth between two types of treatments. We define the probability that a dental filling survives without complication until the permanent tooth erupts from beneath (exfoliation). We relate the time to exfoliation of the tooth… with all these particularities, we propose to use a parametric four-state model with three random effects to take into account the hierarchical cluster structure. For inference, right and interval censoring as well as left truncation have to be dealt with. With the proposed approach, we can conclude… that the estimated probability that a filling survives without complication until exfoliation is larger for one treatment than for the other, for all ages of the child at the time of treatment.
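The core estimation problem in this record, event times known only up to an examination interval, can be sketched with a far simpler model than the authors' four-state random-effects formulation. The sketch below fits a single exponential event-time distribution to hypothetical interval-censored data by maximum likelihood; the data values and the exponential assumption are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Interval-censored observations: the event (e.g. filling complication) is
# known only to occur between two examination times (left, right], in years.
left = np.array([1.0, 2.0, 0.5, 3.0, 1.5])
right = np.array([2.0, 4.0, 1.5, 5.0, 3.0])

def neg_log_lik(rate):
    # Likelihood contribution of an interval-censored exponential time:
    # P(left < T <= right) = exp(-rate*left) - exp(-rate*right)
    p = np.exp(-rate * left) - np.exp(-rate * right)
    return -np.sum(np.log(p))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded")
rate_hat = res.x
# Estimated survival without complication to time t: S(t) = exp(-rate_hat * t)
```

Left truncation and the multi-state structure described in the abstract would each add terms to this likelihood; purpose-built packages handle those extensions.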

13. EGFR inhibitor erlotinib delays disease progression but does not extend survival in the SOD1 mouse model of ALS.

Directory of Open Access Journals (Sweden)

Claire E Le Pichon

Full Text Available Amyotrophic lateral sclerosis (ALS is a fatal neurodegenerative disease that causes progressive paralysis due to motor neuron death. Several lines of published evidence suggested that inhibition of epidermal growth factor receptor (EGFR signaling might protect neurons from degeneration. To test this hypothesis in vivo, we treated the SOD1 transgenic mouse model of ALS with erlotinib, an EGFR inhibitor clinically approved for oncology indications. Although erlotinib failed to extend ALS mouse survival, it did provide a modest but significant delay in the onset of multiple behavioral measures of disease progression. However, given the lack of protection of motor neuron synapses and the lack of survival extension, the small benefits observed after erlotinib treatment appear purely symptomatic, with no modification of disease course.

14. Theoretical modeling of electroosmotic flow in soft microchannels: A variational approach applied to the rectangular geometry

Science.gov (United States)

2018-03-01

Modeling of fluid flow in polyelectrolyte layer (PEL)-grafted microchannels is challenging due to their two-layer nature. Hence, the pertinent studies are limited only to circular and slit geometries for which matching the solutions for inside and outside the PEL is simple. In this paper, a simple variational-based approach is presented for the modeling of fully developed electroosmotic flow in PEL-grafted microchannels by which the whole fluidic area is considered as a single porous medium of variable properties. The model is capable of being applied to microchannels of a complex cross-sectional area. As an application of the method, it is applied to a rectangular microchannel of uniform PEL properties. It is shown that modeling a rectangular channel as a slit may lead to considerable overestimation of the mean velocity especially when both the PEL and electric double layer (EDL) are thick. It is also demonstrated that the mean velocity is an increasing function of the fixed charge density and PEL thickness and a decreasing function of the EDL thickness and PEL friction coefficient. The influence of the PEL thickness on the mean velocity, however, vanishes when both the PEL thickness and friction coefficient are sufficiently high.
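The single-domain formulation described above can be sketched in general terms (a hedged reconstruction, since the abstract does not give the equations): for fully developed flow, the axial velocity satisfies a Brinkman-type equation in which the PEL friction term is switched on only inside the layer, and that linear equation is the Euler-Lagrange equation of a quadratic functional, which is what makes a variational treatment of arbitrary cross-sections possible.

```latex
% Indicator h = 1 inside the PEL, 0 outside; c = PEL friction coefficient,
% \rho_e = net electric charge density, E_x = applied axial field.
\mu \nabla^2 u \;-\; h(y,z)\,\mu c\, u \;+\; \rho_e(y,z)\, E_x \;=\; 0,
% which is the Euler--Lagrange equation of the functional
J[u] \;=\; \int_A \left[ \tfrac{\mu}{2}\,\lvert \nabla u \rvert^2
    \;+\; \tfrac{h \mu c}{2}\, u^2 \;-\; \rho_e E_x\, u \right] \mathrm{d}A .
```

Minimizing J over trial functions that satisfy no-slip on the wall yields the velocity field over the whole cross-section at once, so the fluidic area can indeed be treated as a single porous medium of variable properties.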

15. Fuzzy uncertainty modeling applied to AP1000 nuclear power plant LOCA

International Nuclear Information System (INIS)

Ferreira Guimaraes, Antonio Cesar; Franklin Lapa, Celso Marcelo; Lamego Simoes Filho, Francisco Fernando; Cabral, Denise Cunha

2011-01-01

Research highlights: → This article presents an uncertainty modelling study using a fuzzy approach. → The AP1000 Westinghouse NPP studied is equipped with passive safety systems. → The use of advanced passive safety systems in NPPs has limited operational experience. → Failure rates and basic event probabilities are used in the fault tree analysis. → A fuzzy uncertainty approach was applied to the reliability of the AP1000 large LOCA. - Abstract: This article presents an uncertainty modeling study using a fuzzy approach applied to the Westinghouse advanced nuclear reactor. The AP1000 Westinghouse Nuclear Power Plant (NPP) is equipped with passive safety systems, based on thermophysical phenomena, that require no operator action once an incident has been detected. The use of advanced passive safety systems in NPPs has limited operational experience. As in any reliability study, reporting of statistically non-significant events introduces a significant level of uncertainty into the failure rates and basic event probabilities used in the fault tree analysis (FTA). In order to model this uncertainty, a fuzzy approach was employed in the reliability analysis of the AP1000 large-break Loss of Coolant Accident (LOCA). The final results revealed that the proposed approach may be successfully applied to the modeling of uncertainties in safety studies.
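The fuzzy treatment of basic-event probabilities can be sketched with triangular fuzzy numbers and alpha-cut interval propagation through fault-tree gates. The two basic events and their (low, mode, high) values below are illustrative assumptions, not AP1000 data.

```python
import numpy as np

# Triangular fuzzy failure probabilities (low, mode, high) -- illustrative only.
event_a = (1e-4, 5e-4, 1e-3)
event_b = (2e-4, 8e-4, 2e-3)

def alpha_cut(tri, alpha):
    """Interval of probabilities with membership degree >= alpha."""
    lo, mode, hi = tri
    return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

def and_gate(cuts):
    """AND gate: all inputs fail. The product is monotone increasing in each
    probability, so propagating interval endpoints is exact."""
    return (np.prod([c[0] for c in cuts]), np.prod([c[1] for c in cuts]))

def or_gate(cuts):
    """OR gate: at least one input fails, P = 1 - prod(1 - p_i)."""
    return (1.0 - np.prod([1.0 - c[0] for c in cuts]),
            1.0 - np.prod([1.0 - c[1] for c in cuts]))

# Fuzzy top-event probability as nested alpha-cut intervals.
top = {a: and_gate([alpha_cut(event_a, a), alpha_cut(event_b, a)])
       for a in (0.0, 0.5, 1.0)}
```

At alpha = 1 the interval collapses to the crisp modal estimate, while lower alpha levels widen it, which is how the fuzzy result expresses the limited operational experience behind the input failure rates.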

16. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

Directory of Open Access Journals (Sweden)

Full Text Available Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
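The general workflow described above, wrapping a cognitive model's simulated performance in an objective function and handing it to a standard optimizer, can be sketched as follows. The one-parameter objective is a hypothetical stand-in for the Sugar Factory score, not the ACT-R model itself.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulated_score(decay):
    """Hypothetical average task score as a function of a memory-decay
    parameter: too little decay retains stale instances, too much forgets
    useful ones. A real objective would run the ACT-R simulation."""
    return float(np.exp(-((decay - 0.5) ** 2) / 0.1))

# Maximize the score by minimizing its negative over a bounded range.
res = minimize_scalar(lambda d: -simulated_score(d),
                      bounds=(0.0, 2.0), method="bounded")
best_decay = res.x
```

Once the model is reformulated so the objective is cheap and smooth, derivative-free routines like this one (or the derivative-based methods the authors mention as future work) can search the parameter space far more efficiently than grid simulation.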

17. Modeling of hydrothermal circulation applied to active volcanic areas. The case of Vulcano (Italy)

Energy Technology Data Exchange (ETDEWEB)

Todesco, M. [Dip. Scienze della Terra, Pisa (Italy)]

1995-03-01

Modeling of fluid and heat flows through porous media has to date been widely applied to the study of geothermal reservoirs. Much less has been done to apply the same methodology to the study of active volcanoes and of the associated volcanic hazard. Hydrothermal systems provide direct information on dormant eruptive centers and significant insights into their state of activity and current evolution. For this reason, the evaluation of volcanic hazard is also based on monitoring of hydrothermal activity. Such monitoring, however, provides measurements of surface parameters, such as fluid temperature or composition, that often are only representative of the shallower portion of the system. The interpretation of these data in terms of the global functioning of hydrothermal circulation can therefore be highly misleading. Numerical modeling of hydrothermal activity provides a physical approach to the description of fluid circulation and can contribute to its understanding and to the interpretation of monitoring data. In this work, the TOUGH2 simulator has been applied to study the hydrothermal activity at Vulcano (Italy). Simulations involved an axisymmetric domain heated from below, and focused on the effects of permeability distribution and carbon dioxide. Results are consistent with the present knowledge of the volcanic system and suggest that permeability distribution plays a major role in the evolution of fluid circulation. This parameter should be considered in the interpretation of monitoring data and in the evaluation of volcanic hazard at Vulcano.

18. Comparison analysis of data mining models applied to clinical research in traditional Chinese medicine.

Science.gov (United States)

Zhao, Yufeng; Xie, Qi; He, Liyun; Liu, Baoyan; Li, Kun; Zhang, Xiang; Bai, Wenjing; Luo, Lin; Jing, Xianghong; Huo, Ruili

2014-10-01

To help researchers select appropriate data mining models to provide better evidence for the clinical practice of Traditional Chinese Medicine (TCM) diagnosis and therapy. Clinical issues addressed with data mining models were comprehensively summarized from four significant elements of the clinical studies: symptoms, symptom patterns, herbs, and efficacy. Existing problems were further generalized to determine the factors relevant to the performance of data mining models, e.g. data type, sample size, parameters, and variable labels. Combining these relevant factors, the features of TCM clinical data were compared with regard to statistical characteristics and informatics properties. Data mining models were simultaneously compared in terms of their conditions of application and suitable scopes. The main application problems were data types inconsistent with, and samples too small for, the data mining models used, which led to inappropriate or even erroneous results. These features, i.e. advantages, disadvantages, suitable data types, data mining tasks, and the TCM issues addressed, were summarized and compared. By attending to the specific features of the different data mining models, clinical doctors can select suitable data mining models to resolve TCM problems.

19. Inhibition of intestinal epithelial apoptosis improves survival in a murine model of radiation combined injury.

Directory of Open Access Journals (Sweden)

Enjae Jung