Statistical Validation of Normal Tissue Complication Probability Models
Energy Technology Data Exchange (ETDEWEB)
Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)
2012-09-01
Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
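The permutation-testing idea used here can be sketched in a few lines: shuffle the outcome labels to destroy any real dose-response association, recompute the performance metric on each shuffle, and take the fraction of shuffled scores that match or beat the observed score as a p-value. The following is an illustrative sketch, not the authors' procedure; the AUC uses the Mann-Whitney formulation and the data are invented.

```python
import random

def auc(scores, labels):
    # Wilcoxon-Mann-Whitney formulation of the area under the ROC curve
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def permutation_pvalue(scores, labels, n_perm=1000, seed=0):
    # Null hypothesis: the model scores carry no information about the labels
    rng = random.Random(seed)
    observed = auc(scores, labels)
    shuffled = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)  # break any score-label association
        if auc(scores, shuffled) >= observed:
            hits += 1
    # add-one correction keeps the estimate away from an exact zero
    return observed, (hits + 1) / (n_perm + 1)
```

A small observed p-value indicates that the apparent model performance is unlikely to arise from chance alone.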
Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van 't Veld, Aart A.
2012-01-01
PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator
Energy Technology Data Exchange (ETDEWEB)
Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van 't [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)
2012-03-15
Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
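LASSO-type NTCP models combine a logistic likelihood with an L1 penalty that shrinks small coefficients exactly to zero, which is what makes the resulting models easy to interpret. Below is a minimal pure-Python sketch using proximal gradient descent (ISTA); production implementations use faster solvers (e.g. coordinate descent), and the learning rate, penalty strength, and iteration count here are illustrative choices, not values from the study.

```python
import math

def lasso_logistic(X, y, lam=0.1, lr=0.1, n_iter=2000):
    """L1-penalised logistic regression fitted by proximal gradient
    descent (ISTA); uninformative coefficients are driven exactly to zero."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(n_iter):
        gw = [0.0] * d
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted complication probability
            for j in range(d):
                gw[j] += (p - yi) * xi[j] / n
            gb += (p - yi) / n
        b -= lr * gb  # intercept is not penalised
        for j in range(d):
            wj = w[j] - lr * gw[j]
            # soft-thresholding: the proximal map of the L1 penalty
            w[j] = math.copysign(max(abs(wj) - lr * lam, 0.0), wj)
    return w, b
```

On toy data where only the first feature predicts the outcome, the second coefficient stays at zero, mimicking the automatic variable selection the abstract highlights.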
DEFF Research Database (Denmark)
Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren
2013-01-01
To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC) with adjustment for latency and clinical risk factors.
DEFF Research Database (Denmark)
Rønjom, Marianne F; Brink, Carsten; Bentzen, Søren M
2015-01-01
BACKGROUND: A normal tissue complication probability (NTCP) model for radiation-induced hypothyroidism (RIHT) was previously derived in patients with squamous cell carcinoma of the head and neck (HNSCC) discerning thyroid volume (Vthyroid), mean thyroid dose (Dmean), and latency as predictive...
Energy Technology Data Exchange (ETDEWEB)
Bakhshandeh, Mohsen [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Hashemi, Bijan, E-mail: bhashemi@modares.ac.ir [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Mahdavi, Seied Rabi Mehdi [Department of Medical Physics, Faculty of Medical Sciences, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Nikoofar, Alireza; Vasheghani, Maryam [Department of Radiation Oncology, Hafte-Tir Hospital, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kazemnejad, Anoshirvan [Department of Biostatistics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of)
2013-02-01
Purpose: To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Methods and Materials: Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with α/β = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Results: Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D50 estimated from the models was approximately 44 Gy. Conclusions: The implemented
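The Lyman (LKB) model referenced above maps a DVH to an NTCP in two steps: reduce the DVH to a generalised equivalent uniform dose, then apply a probit dose-response. A minimal sketch is shown below; TD50 = 44 Gy echoes the abstract's D50 estimate, while m and n are placeholder values, not the paper's fitted parameters.

```python
import math

def eud(dose_bins, vol_fracs, n=1.0):
    """Generalised EUD reduction of a DVH; n=1 reduces to the mean dose."""
    return sum(v * d ** (1.0 / n) for d, v in zip(dose_bins, vol_fracs)) ** n

def lkb_ntcp(dose_bins, vol_fracs, td50=44.0, m=0.25, n=1.0):
    # Probit (Lyman) dose-response evaluated on the EUD-reduced DVH
    t = (eud(dose_bins, vol_fracs, n) - td50) / (m * td50)
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

At a uniform dose equal to TD50 the predicted complication probability is 0.5 by construction, and NTCP rises monotonically with dose.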
Energy Technology Data Exchange (ETDEWEB)
Defraene, Gilles, E-mail: gilles.defraene@uzleuven.be [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Van den Bergh, Laura [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Al-Mamgani, Abrahim [Department of Radiation Oncology, Erasmus Medical Center - Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Haustermans, Karin [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Heemsbergen, Wilma [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands); Van den Heuvel, Frank [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Lebesque, Joos V. [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands)
2012-03-01
Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011-0.013) clinical factor was 'previous abdominal surgery.' As second significant (p = 0.012-0.016) factor, 'cardiac history' was included in all three rectal bleeding fits, whereas including 'diabetes' was significant (p = 0.039-0.048) in fecal incontinence modeling, but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003-0.006) from the inclusion of the baseline toxicity score. Rectal bleeding fits had the highest AUC (0.77) of all end points, compared with 0.63 for high stool frequency and 0.68 for fecal incontinence. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two end points
Kierkels, Roel G J; Korevaar, Erik W; Steenbakkers, Roel J H M; Janssen, Tomas; van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis; van der Schaaf, Arjen
2014-09-01
Recently, clinically validated multivariable normal tissue complication probability (NTCP) models for head and neck cancer (HNC) patients have become available. We test the feasibility of using multivariable NTCP models directly in the optimiser for inverse treatment planning of radiotherapy to improve the dose distributions and corresponding NTCP estimates in HNC patients. For 10 HNC cases, intensity-modulated radiotherapy plans were optimised using objective functions based either on the 'generalised equivalent uniform dose' (OFgEUD) or on multivariable NTCP models (OFNTCP). NTCP models for patient-rated xerostomia, physician-rated RTOG grade II-IV dysphagia, and various patient-rated aspects of swallowing dysfunction were incorporated. The NTCP models included dose-volume parameters as well as clinical factors, contributing to a personalised optimisation process. Both optimisation techniques were compared by means of 'pseudo Pareto fronts' (target dose conformity vs. the sum of the NTCPs). Both optimisation techniques resulted in clinically realistic treatment plans with only small differences. For nine patients the sum-NTCP was lower for the OFNTCP optimised plans (on average 5.7%; 95% CI 1.7-9.9%). The OFNTCP approach has fewer optimisation parameters and an intrinsic mechanism of individualisation. Treatment plan optimisation using multivariable NTCP models directly in the OF is feasible, as has been demonstrated for HNC radiotherapy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
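The idea of an NTCP-based objective function can be illustrated with a toy multivariable logistic NTCP that mixes a dose term with a clinical factor, summed over organs at risk alongside a target-dose penalty. All coefficients and the objective form below are invented for illustration; they are not the validated model parameters or the authors' optimiser.

```python
import math

def logistic_ntcp(mean_dose, clinical_factor, b0=-4.0, b_dose=0.06, b_clin=0.5):
    # toy multivariable logistic NTCP: a dose term plus one clinical factor
    z = b0 + b_dose * mean_dose + b_clin * clinical_factor
    return 1.0 / (1.0 + math.exp(-z))

def objective(target_dose, prescribed, organ_mean_doses, clinical_factors, w_target=1.0):
    # penalise target dose deviation plus the summed complication risk,
    # mirroring the "conformity vs. sum of NTCPs" trade-off in the abstract
    target_term = w_target * (target_dose - prescribed) ** 2
    ntcp_term = sum(logistic_ntcp(d, c)
                    for d, c in zip(organ_mean_doses, clinical_factors))
    return target_term + ntcp_term
```

Because patient-specific clinical factors enter the NTCP terms, two patients with identical dose distributions can receive different objective values, which is the "intrinsic individualisation" such objective functions provide.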
Energy Technology Data Exchange (ETDEWEB)
Bazan, Jose G.; Luxton, Gary; Kozak, Margaret M.; Anderson, Eric M.; Hancock, Steven L.; Kapp, Daniel S.; Kidd, Elizabeth A.; Koong, Albert C.; Chang, Daniel T., E-mail: dtchang@stanford.edu
2013-12-01
Purpose: To determine how chemotherapy agents affect radiation dose parameters that correlate with acute hematologic toxicity (HT) in patients treated with pelvic intensity modulated radiation therapy (P-IMRT) and concurrent chemotherapy. Methods and Materials: We assessed HT in 141 patients who received P-IMRT for anal, gynecologic, rectal, or prostate cancers, 95 of whom received concurrent chemotherapy. Patients were separated into 4 groups: mitomycin (MMC) + 5-fluorouracil (5FU, 37 of 141), platinum ± 5FU (Cis, 32 of 141), 5FU (26 of 141), and P-IMRT alone (46 of 141). The pelvic bone was contoured as a surrogate for pelvic bone marrow (PBM) and divided into subsites: ilium, lower pelvis, and lumbosacral spine (LSS). The volumes of each region receiving 5-40 Gy were calculated. The endpoint for HT was grade ≥3 (HT3+) leukopenia, neutropenia or thrombocytopenia. Normal tissue complication probability was calculated using the Lyman-Kutcher-Burman model. Logistic regression was used to analyze association between HT3+ and dosimetric parameters. Results: Twenty-six patients experienced HT3+: 10 of 37 (27%) MMC, 14 of 32 (44%) Cis, 2 of 26 (8%) 5FU, and 0 of 46 P-IMRT. PBM dosimetric parameters were correlated with HT3+ in the MMC group but not in the Cis group. LSS dosimetric parameters were well correlated with HT3+ in both the MMC and Cis groups. Constrained optimization (0
Kierkels, Roel G J; Wopken, Kim; Visser, Ruurd; Korevaar, Erik W; van der Schaaf, Arjen; Bijl, Hendrik P; Langendijk, Johannes A
2016-12-01
Radiotherapy of the head and neck is challenged by the relatively large number of organs-at-risk close to the tumor. Biologically-oriented objective functions (OF) could optimally distribute the dose among the organs-at-risk. We aimed to explore OFs based on multivariable normal tissue complication probability (NTCP) models for grade 2-4 dysphagia (DYS) and tube feeding dependence (TFD). One hundred head and neck cancer patients were studied. In addition to the clinical plan, two more plans (an OFDYS- and an OFTFD-plan) were optimized per patient. The NTCP models included up to four dose-volume parameters and other non-dosimetric factors. A fully automatic plan optimization framework was used to optimize the OFNTCP-based plans. All OFNTCP-based plans were reviewed and classified as clinically acceptable. On average, the differences in dose and NTCP between the OFDYS-plan, OFTFD-plan, and clinical plan were small. For 5% of patients, NTCPTFD was reduced by >5% using OFTFD-based planning compared to the OFDYS-plans. Plan optimization using NTCPDYS- and NTCPTFD-based objective functions resulted in clinically acceptable plans. For patients with considerable risk factors for TFD, the OFTFD steered the optimizer to dose distributions which directly led to slightly lower predicted NTCPTFD values as compared to the other studied plans. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Information-theoretic methods for estimating complicated probability distributions
Zong, Zhi
2006-01-01
Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology has proved very useful, leading to the recent development of information-theory based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is a fundamental task in many fields besides statistics, such as reliability, probabilistic safety analysis (PSA), machine learning, pattern recognition, image processing, neur
Probability and stochastic modeling
Rotar, Vladimir I
2012-01-01
Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...
Szlag, Marta; Slosarek, Krzysztof
2010-01-01
To create a method for presenting TCP and NTCP distributions calculated from the dose distribution for a selected CT slice. Three 24-bit colour maps - of dose distribution, delineated structures and CT information - were converted into m-by-n-by-3 data arrays containing intensities of the red, green, and blue colour components for each pixel. All calculations were performed with Matlab v.6.5. The transformation function, which consists of five linear functions, was prepared to translate the colour map into a one-dimensional data array of dose values. A menu-driven application based on the transformation function and mathematical models of complication risk (NTCP) and tumour control probability (TCP) was designed to allow pixel-by-pixel translation of colour maps into one-dimensional arrays of TCP and NTCP values. The result of this work is an application created to visualize the TCP and NTCP distribution for a single CT scan based on the spatial dose distribution calculated in the treatment planning system. The application allows 10 targets (PTV) and 10 organs at risk (OaR) to be defined. The interface allows alpha/beta values to be inserted for each delineated structure. The application computes TCP and NTCP matrices, which are presented as colour maps superimposed on the corresponding CT slice. There is a set of parameters used for TCP/NTCP calculations which can be defined by the user. Our application is a prototype of an evaluation tool. Although limited to a single plane of the treatment plan, it is believed to be a starting point for further development.
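The pixel-by-pixel translation described here can be mimicked with a toy sigmoidal dose-response applied to each element of a 2-D dose matrix. Python stands in for the authors' Matlab, and the D50 and steepness values are placeholders, not fitted parameters.

```python
def ntcp_pixel(dose, d50=50.0, gamma=2.0):
    # toy sigmoidal dose-response evaluated for a single pixel;
    # d50 is the dose giving 50% complication probability
    return 1.0 / (1.0 + (d50 / max(dose, 1e-9)) ** (4.0 * gamma))

def ntcp_map(dose_map):
    # pixel-by-pixel translation of a 2-D dose matrix into an NTCP matrix,
    # which could then be rendered as a colour map over the CT slice
    return [[ntcp_pixel(d) for d in row] for row in dose_map]
```

The output matrix has the same shape as the dose matrix, so it can be superimposed directly on the corresponding CT slice.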
Trojková, Darina; Judas, Libor; Trojek, Tomáš
2014-11-01
Minimizing the late rectal toxicity of prostate cancer patients is a very important and widely discussed topic. Normal tissue complication probability (NTCP) models can be used to evaluate competing treatment plans. In our work, the parameters of the Lyman-Kutcher-Burman (LKB), Källman, and Logit+EUD models are optimized by minimizing the Brier score for a group of 302 prostate cancer patients. The NTCP values are calculated and compared with the values obtained using previously published parameter values. χ² statistics were calculated as a check of the goodness of the optimization.
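The Brier score used for the optimization is simply the mean squared difference between the predicted NTCP and the observed binary outcome. The sketch below pairs it with a toy one-parameter dose-response and a grid search; the actual study fits LKB, Källman, and Logit+EUD parameters by numerical minimization, so this is only an illustration of the criterion.

```python
def brier_score(predicted, observed):
    """Mean squared difference between predicted NTCP and observed outcome
    (0 = no complication, 1 = complication); lower is better."""
    return sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(predicted)

def fit_d50_by_brier(doses, outcomes, candidates):
    # pick the D50 (with a fixed, invented steepness) minimising the Brier score
    def ntcp(d, d50):
        return 1.0 / (1.0 + (d50 / max(d, 1e-9)) ** 8)
    return min(candidates,
               key=lambda d50: brier_score([ntcp(d, d50) for d in doses], outcomes))
```

A perfectly calibrated and discriminating model attains a Brier score of zero; the grid search picks the candidate D50 that best separates the complication and no-complication groups.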
Blanchard, Pierre; Wong, Andrew J; Gunn, G Brandon; Garden, Adam S; Mohamed, Abdallah S R; Rosenthal, David I; Crutison, Joseph; Wu, Richard; Zhang, Xiaodong; Zhu, X Ronald; Mohan, Radhe; Amin, Mayankkumar V; Fuller, C David; Frank, Steven J
2016-12-01
To externally validate head and neck cancer (HNC) photon-derived normal tissue complication probability (NTCP) models in patients treated with proton beam therapy (PBT). This prospective cohort consisted of HNC patients treated with PBT at a single institution. NTCP models were selected based on the availability of data for validation and evaluated by using the leave-one-out cross-validated area under the curve (AUC) for the receiver operating characteristic curve. 192 patients were included. The most prevalent tumor site was oropharynx (n=86, 45%), followed by sinonasal (n=28), nasopharyngeal (n=27) or parotid (n=27) tumors. Apart from the prediction of acute mucositis (reduction of AUC of 0.17), the models overall performed well. The validation (PBT) AUC and the published AUC were, respectively, 0.90 versus 0.88 for feeding tube dependence 6 months after PBT; 0.70 versus 0.80 for physician-rated dysphagia 6 months after PBT; 0.70 versus 0.68 for dry mouth 6 months after PBT; and 0.73 versus 0.85 for hypothyroidism 12 months after PBT. Although a drop in NTCP model performance was expected for PBT patients, the models showed robustness and remained valid. Further work is warranted, but these results support the validity of the model-based approach for selecting treatment for patients with HNC. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Schaake, Wouter; van der Schaaf, Arjen; van Dijk, Lisanne V.; Bongaerts, Alfons H. H.; van den Bergh, Alfons C. M.; Langendijk, Johannes A.
Background and purpose: Curative radiotherapy for prostate cancer may lead to anorectal side effects, including rectal bleeding, fecal incontinence, increased stool frequency and rectal pain. The main objective of this study was to develop multivariable NTCP models for these side effects. Material
Probability output modeling for support vector machines
Zhang, Xiang; Xiao, Xiaoling; Tian, Jinwen; Liu, Jian
2007-11-01
In this paper we propose an approach to model the posterior probability output of multi-class SVMs. The sigmoid function is used to estimate the posterior probability output in binary classification. The posterior probabilities of multi-class SVMs are then obtained by directly solving equations based on combining the probability outputs of the binary classifiers using Bayes's rule. The differences and different weights among the two-class SVM classifiers, based on the posterior probability, are taken into account when combining their probability outputs. Comparative experimental results show that our method achieves better classification precision and a better posterior probability distribution than the pairwise coupling method and Hastie's optimization method.
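The binary building block, a sigmoid map from SVM decision values to posterior probabilities (Platt scaling), and one simple way to combine pairwise outputs into multi-class posteriors can be sketched as follows. Note this is an assumption-laden illustration: the paper solves a system of equations rather than the normalised voting used here, and the sigmoid parameters below would normally be fitted on held-out decision values.

```python
import math

def platt_posterior(decision_value, a=-1.0, b=0.0):
    # sigmoid mapping of an SVM decision value to P(y=1 | x);
    # a and b are placeholder values, normally fitted by maximum likelihood
    return 1.0 / (1.0 + math.exp(a * decision_value + b))

def multiclass_posteriors(pairwise):
    """Combine pairwise probabilities r[i][j] = P(class i beats class j)
    into class posteriors by normalised voting (one of several coupling
    schemes; simpler than the equation-solving approach in the paper)."""
    k = len(pairwise)
    scores = [sum(pairwise[i][j] for j in range(k) if j != i) for i in range(k)]
    total = sum(scores)
    return [s / total for s in scores]
```

A decision value of zero, lying exactly on the separating hyperplane, maps to a posterior of 0.5, and the combined multi-class posteriors sum to one.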
DEFF Research Database (Denmark)
Korreman, Stine S; Pedersen, Anders N; Juhler-Nøttrup, Trine
2006-01-01
the remaining breast, internal mammary, and periclavicular nodes were optimized for each scan, prescription dose 48 Gy. Normal tissue complication probabilities were calculated using the relative seriality model for the heart, and the model proposed by Burman et al. for the lung. RESULTS: Previous computed tomography studies showed that both voluntary DIBH and IG provided reduction of the lung V50 (relative volume receiving more than 50% of prescription dose) on the order of 30-40%, and an 80-90% reduction of the heart V50 for left-sided cancers. Corresponding pneumonitis probability of 28.1% (range, 0.7-95.6%) for FB could be reduced to 2.6% (range, 0.1-40.1%) for IG, and 4.3% (range, 0.1-59%) for DIBH. The cardiac mortality probability could be reduced from 4.8% (range, 0.1-23.4%) in FB to 0.5% (range, 0.1-2.6%) for IG and 0.1% (range, 0-3.0%) for DIBH. CONCLUSIONS: Remarkable potential is shown for simple...
Integrated statistical modelling of spatial landslide probability
Mergili, M.; Chu, H.-J.
2015-09-01
Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) the landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km2 study area in southern Taiwan, using an inventory of 1399 landslides triggered by Typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
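Step (v), the combination rule, is compact enough to state directly in code. This is a per-pixel sketch of the rule exactly as described in the abstract; the input probabilities themselves would come from the statistical release model, the angle-of-reach distribution, and the empirical zonal curves.

```python
def integrated_probability(p_release, p_impact, p_zonal):
    """Integrated spatial landslide probability for one pixel: the maximum of
    the release probability and the product of the impact probability and the
    zonal release probability (rule (v) in the text)."""
    return max(p_release, p_impact * p_zonal)
```

A pixel unlikely to release material can still carry substantial integrated probability if it lies in the run-out path of a zone that is likely to produce a landslide.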
Comparing linear probability model coefficients across groups
DEFF Research Database (Denmark)
Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt
2015-01-01
This article offers a formal identification analysis of the problem of comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters, and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.
A probability space for quantum models
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints, and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
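The maximum entropy assignment mentioned here can be illustrated for a finite outcome set with a single mean constraint: the maxent solution is exponential in the constrained quantity, p_i ∝ exp(-λ x_i), with the Lagrange multiplier λ chosen so the constraint holds (found below by bisection). The outcome values and target mean are made up; this is a generic maxent sketch, not the paper's quantum construction.

```python
import math

def maxent_distribution(values, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy distribution over a finite outcome set subject to a
    fixed mean: p_i proportional to exp(-lam * x_i)."""
    def mean_for(lam):
        w = [math.exp(-lam * v) for v in values]
        s = sum(w)
        return sum(wi * vi for wi, vi in zip(w, values)) / s
    # mean_for is decreasing in lam, so bisect until the constraint is met
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * v) for v in values]
    s = sum(w)
    return [wi / s for wi in w]
```

With no binding constraint (target mean equal to the unconstrained average), λ goes to zero and the maxent answer is the uniform distribution, which matches the intuition that maximum entropy adds no information beyond the constraints.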
Modelling the probability of building fires
Directory of Open Access Journals (Sweden)
Vojtěch Barták
2014-12-01
Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
Sampling, Probability Models and Statistical Reasoning: Statistical Inference
Indian Academy of Sciences (India)
Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.
Multinomial mixture model with heterogeneous classification probabilities
Holland, M.D.; Gray, B.R.
2011-01-01
Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial parameters and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
Statistical physics of pairwise probability models
DEFF Research Database (Denmark)
Roudi, Yasser; Aurell, Erik; Hertz, John
2009-01-01
Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive, in part because they can be fit using a reasonable amount of data...
Correlations and Non-Linear Probability Models
DEFF Research Database (Denmark)
Breen, Richard; Holm, Anders; Karlson, Kristian Bernt
2014-01-01
Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
Comparing coefficients of nested nonlinear probability models
DEFF Research Database (Denmark)
Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders
2011-01-01
In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements the method. The KHB-method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.
Probability in traffic : A challenge for modelling
Calvert, S.C.; Taale, H.; Snelder, M.; Hoogendoorn, S.P.
2012-01-01
In the past decade an increase in research regarding stochasticity and probability in traffic modelling has occurred. The realisation has grown that simple presumptions and basic stochastic elements are insufficient to give accurate modelling results in many cases. This paper puts forward a strong
Robust Model-Free Multiclass Probability Estimation
Wu, Yichao; Zhang, Hao Helen; Liu, Yufeng
2010-01-01
Classical statistical approaches for multiclass probability estimation are typically based on regression techniques such as multiple logistic regression, or density estimation approaches such as linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). These methods often make certain assumptions on the form of probability functions or on the underlying distributions of subclasses. In this article, we develop a model-free procedure to estimate multiclass probabilities based on large-margin classifiers. In particular, the new estimation scheme is employed by solving a series of weighted large-margin classifiers and then systematically extracting the probability information from these multiple classification rules. A main advantage of the proposed probability estimation technique is that it does not impose any strong parametric assumption on the underlying distribution and can be applied for a wide range of large-margin classification methods. A general computational algorithm is developed for class probability estimation. Furthermore, we establish asymptotic consistency of the probability estimates. Both simulated and real data examples are presented to illustrate competitive performance of the new approach and compare it with several other existing methods. PMID:21113386
Uncertainty: the soul of modeling, probability & statistics
Briggs, William
2016-01-01
This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...
Applied probability models with optimization applications
Ross, Sheldon M
1992-01-01
Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, and much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.
A Probability Model for Belady's Anomaly
McMaster, Kirby; Sambasivam, Samuel E.; Anderson, Nicole
2010-01-01
In demand paging virtual memory systems, the page fault rate of a process varies with the number of memory frames allocated to the process. When an increase in the number of allocated frames leads to an increase in the number of page faults, Belady's anomaly is said to occur. In this paper, we present a probability model for Belady's anomaly. We…
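The anomaly itself is easy to reproduce, independently of the probability model the abstract describes. The sketch below simulates FIFO page replacement on the classic reference string and shows the fault count rising when a fourth frame is added:

```python
def fifo_page_faults(refs, frames):
    """Simulate FIFO page replacement; return the number of page faults."""
    memory = []  # pages currently resident, oldest first
    faults = 0
    for page in refs:
        if page not in memory:
            faults += 1
            if len(memory) == frames:
                memory.pop(0)  # evict the oldest resident page (FIFO)
            memory.append(page)
    return faults

# Classic reference string that exhibits Belady's anomaly under FIFO:
refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
print(fifo_page_faults(refs, 3))  # 9 faults with 3 frames
print(fifo_page_faults(refs, 4))  # 10 faults with 4 frames
```

More frames produce more faults here because FIFO ignores recency, so enlarging memory can flush exactly the pages about to be reused.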
Sampling, Probability Models and Statistical Reasoning
Indian Academy of Sciences (India)
eligible voters who support a particular political party. A random sample of size n is selected from this population and suppose k voters support this party. What is a good estimate of the required proportion? How do we obtain a probability model for the experiment just conducted? Let us examine the following simple example.
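For the experiment sketched in this abstract, the standard probability model is binomial: each of the n sampled voters independently supports the party with unknown probability p, and k supporters are observed. A minimal illustration of the resulting estimate, with made-up numbers rather than any data from the article:

```python
import math

def estimate_support(k, n, z=1.96):
    """Binomial model: point estimate p_hat = k/n plus a
    normal-approximation 95% confidence interval."""
    p_hat = k / n
    se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of p_hat
    return p_hat, (p_hat - z * se, p_hat + z * se)

# Hypothetical poll: 420 of 1000 sampled voters support the party
p_hat, (low, high) = estimate_support(k=420, n=1000)
print(round(p_hat, 2))  # 0.42
```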
The Probability Model of Expectation Disconfirmation Process
Directory of Open Access Journals (Sweden)
Hui-Hsin HUANG
2015-06-01
This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before purchase and the perceived performance after purchase. An experiment is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.
Maximum Probability Domains for Hubbard Models
Acke, Guillaume; Claeys, Pieter W; Van Raemdonck, Mario; Poelmans, Ward; Van Neck, Dimitri; Bultinck, Patrick
2015-01-01
The theory of Maximum Probability Domains (MPDs) is formulated for the Hubbard model in terms of projection operators and generating functions for both exact eigenstates as well as Slater determinants. A fast MPD analysis procedure is proposed, which is subsequently used to analyse numerical results for the Hubbard model. It is shown that the essential physics behind the considered Hubbard models can be exposed using MPDs. Furthermore, the MPDs appear to be in line with what is expected from Valence Bond Theory-based knowledge.
A probability distribution model for rain rate
Kedem, Benjamin; Pavlopoulos, Harry; Guan, Xiaodong; Short, David A.
1994-01-01
A systematic approach is suggested for modeling the probability distribution of rain rate. Rain rate, conditional on rain and averaged over a region, is modeled as a temporally homogeneous diffusion process with appropriate boundary conditions. The approach requires a drift coefficient (the conditional average instantaneous rate of change of rain intensity) as well as a diffusion coefficient (the conditional average magnitude of the rate of growth and decay of rain rate about its drift). Under certain assumptions on the drift and diffusion coefficients compatible with rain rate, a new parametric family, containing the lognormal distribution, is obtained for the continuous part of the stationary limit probability distribution. The family is fitted to tropical rainfall from Darwin and Florida, and it is found that the lognormal distribution provides adequate fits as compared with other members of the family and also with the gamma distribution.
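As a toy illustration of the final fitting step, the lognormal member of such a family can be fitted by maximum likelihood from the log-transformed sample. The data here are synthetic (the Darwin and Florida records are not reproduced in the abstract):

```python
import math
import random

random.seed(1)
# Synthetic positive "rain rate" sample drawn from a known lognormal,
# standing in for observed regional rain-rate data
sample = [random.lognormvariate(mu=0.5, sigma=0.8) for _ in range(10000)]

# Lognormal MLE: mean and standard deviation of the log-transformed data
logs = [math.log(x) for x in sample]
mu_hat = sum(logs) / len(logs)
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / len(logs))
print(round(mu_hat, 2), round(sigma_hat, 2))  # close to the true (0.5, 0.8)
```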
Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling
DEFF Research Database (Denmark)
Scheike, Thomas H.; Zhang, Mei-Jie
2005-01-01
cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling
Statistical physics of pairwise probability models
Directory of Open Access Journals (Sweden)
Yasser Roudi
2009-11-01
Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying and using pairwise models. We build on our previous work on the subject and study the relation between different methods for fitting these models and evaluating their quality. In particular, using data from simulated cortical networks we study how the quality of various approximate methods for inferring the parameters in a pairwise model depends on the time bin chosen for binning the data. We also study the effect of the size of the time bin on the model quality itself, again using simulated data. We show that using finer time bins increases the quality of the pairwise model. We offer new ways of deriving the expressions reported in our previous work for assessing the quality of pairwise models.
Platelet dysfunction contributes to bleeding complications in patients with probable leptospirosis.
Tunjungputri, Rahajeng N; Gasem, Muhammad Hussein; van der Does, Willemijn; Sasongko, Pandu H; Isbandrio, Bambang; Urbanus, Rolf T; de Groot, Philip G; van der Ven, Andre; de Mast, Quirijn
2017-09-01
Severe leptospirosis is frequently complicated by a hemorrhagic diathesis, of which the pathogenesis is still largely unknown. Thrombocytopenia is common, but often not to the degree that spontaneous bleeding is expected. We hypothesized that the hemorrhagic complications are not only related to thrombocytopenia, but also to platelet dysfunction, and that increased binding of von Willebrand factor (VWF) to platelets is involved in both platelet dysfunction and increased platelet clearance. A prospective study was carried out in Semarang, Indonesia, enrolling 33 hospitalized patients with probable leptospirosis, of whom 15 developed clinical bleeding, and 25 healthy controls. Platelet activation and reactivity were determined using flow cytometry by measuring the expression of P-selectin and activation of the αIIbβ3 integrin by the binding of fibrinogen in unstimulated samples and after ex vivo stimulation by the platelet agonists adenosine-diphosphate (ADP) and thrombin-receptor activating peptide (TRAP). Platelet-VWF binding, before and after VWF stimulation by ristocetin, as well as plasma levels of VWF, active VWF, the VWF-inactivating enzyme ADAMTS13, thrombin-antithrombin complexes (TAT) and P-selectin were also measured. Bleeding complications were graded using the WHO bleeding scale. Our study revealed that platelet activation, with a secondary platelet dysfunction, is a feature of patients with probable leptospirosis, especially in those with bleeding manifestations. There was a significant inverse correlation of bleeding score with TRAP-stimulated P-selectin and platelet-fibrinogen binding (R = -0.72, P = 0.003 and R = -0.66, P = 0.01, respectively) but not with platelet count. Patients with bleeding also had a significantly higher platelet-VWF binding. Platelet counts were inversely correlated with platelet-VWF binding (R = -0.74; P = 0.0009). There were no correlations between platelet-VWF binding and the degree of platelet dysfunction, suggesting that…
A COMPLICATED GRIEF INTERVENTION MODEL
African Journals Online (AJOL)
2010-07-29
Bookholane 2004), parental bereavement (Lydall 2004; Strydom & Fourie 1998), the influence of context on complicated grief (Opperman 2004) and a South African perspective on cultural attitudes towards death and dying (Elion & ...
Modeling the rejection probability in plant imports.
Surkov, I V; van der Werf, W; van Kooten, O; Lansink, A G J M Oude
2008-06-01
Phytosanitary inspection of imported plants and flowers is a major means of preventing pest invasions through international trade, but in most countries limited resources prevent inspection of all imports. Prediction of the likelihood of pest infestation in imported shipments could help maximize the efficiency of inspection by targeting shipments with the highest likelihood of infestation. This paper applies a multinomial logistic (MNL) regression model to data on import inspections of ornamental plant commodities in the Netherlands from 1998 to 2001 to investigate whether it is possible to predict the probability that a shipment will be (i) accepted for import, (ii) rejected because of detected pests, or (iii) rejected for other reasons. Four models were estimated: (i) an all-species model, including all plant imports (136,251 shipments) in the data set; (ii) a four-species model, including records on the four ornamental commodities that accounted for 28.9% of inspected and 49.5% of rejected shipments; and two models for single commodities with large import volumes and percentages of rejections, (iii) Dianthus (16.9% of inspected and 46.3% of rejected shipments) and (iv) Chrysanthemum (6.9% and 8.6%, respectively). All models were highly significant (P < 0.001). The models for Dianthus and Chrysanthemum and for the set of four ornamental commodities fitted the data better than the model for all ornamental commodities. Variables characterizing the imported shipment's region of origin, the shipment's size, the importing company, and the season and year of import were significant in most of the estimated models. The combined results suggest that the MNL model can be a useful tool for modeling the probability of rejecting imported commodities even with a small set of explanatory variables, and can help in better targeting of resources for import inspection.
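A multinomial logit model of this kind maps a shipment's covariates to the three outcome probabilities through a softmax over linear scores. The sketch below uses entirely hypothetical coefficients and features; the paper's fitted values are not given in the abstract:

```python
import math

def mnl_probs(x, betas):
    """Multinomial-logit class probabilities: softmax of per-class linear scores."""
    scores = [sum(b * xi for b, xi in zip(beta, x)) for beta in betas]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # subtract max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical coefficients for the outcomes (accepted, rejected: pests,
# rejected: other); features: [intercept, log shipment size, high-risk origin flag]
betas = [[2.0, 0.1, -0.5],
         [0.0, 0.3, 1.2],
         [-1.0, 0.2, 0.4]]
p = mnl_probs([1.0, math.log(500), 1.0], betas)
print([round(v, 3) for v in p])  # three probabilities summing to 1
```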
Datamining approaches for modeling tumor control probability.
Naqa, Issam El; Deasy, Joseph O; Mu, Yi; Huang, Ellen; Hope, Andrew J; Lindsay, Patricia E; Apte, Aditya; Alaly, James; Bradley, Jeffrey D
2010-11-01
Tumor control probability (TCP) after radiotherapy is determined by complex interactions between tumor biology, tumor microenvironment, radiation dosimetry, and patient-related variables. The complexity of these heterogeneous variable interactions constitutes a challenge for building predictive models for routine clinical practice. We describe a datamining framework that can unravel the higher-order relationships among dosimetric dose-volume prognostic variables, interrogate various radiobiological processes, and generalize to unseen data when applied prospectively. Several datamining approaches are discussed, including dose-volume metrics, equivalent uniform dose, the mechanistic Poisson model, and model-building methods using statistical regression and machine learning techniques. Institutional datasets of non-small cell lung cancer (NSCLC) patients are used to demonstrate these methods. The performance of the different methods was evaluated using bivariate Spearman rank correlations (rs). Over-fitting was controlled via resampling methods. Using a dataset of 56 patients with primary NSCLC tumors and 23 candidate variables, we estimated GTV volume and V75 to be the best model parameters for predicting TCP using statistical resampling and a logistic model. Using these variables, the support vector machine (SVM) kernel method provided superior performance for TCP prediction with an rs = 0.68 on leave-one-out testing, compared to logistic regression (rs = 0.4), Poisson-based TCP (rs = 0.33), and the cell kill equivalent uniform dose model (rs = 0.17). The prediction of treatment response can be improved by utilizing datamining approaches, which are able to unravel important non-linear complex interactions among model variables and have the capacity to predict on unseen data for prospective clinical applications.
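Among the mechanistic approaches mentioned, the Poisson TCP model has a compact closed form: if N clonogens each survive dose D with probability exp(-αD), the tumor is controlled when none survive. A sketch with illustrative parameter values (α and the clonogen number here are assumptions, not fitted values from the study):

```python
import math

def poisson_tcp(dose_gy, alpha=0.3, clonogens=1e7):
    """Mechanistic Poisson TCP: TCP = exp(-N * SF(D)), with simple
    single-hit survival SF(D) = exp(-alpha * D). Parameters are illustrative."""
    surviving = clonogens * math.exp(-alpha * dose_gy)  # expected survivors
    return math.exp(-surviving)  # Poisson probability of zero survivors

for d in (40, 60, 80):
    print(d, round(poisson_tcp(d), 3))  # TCP rises steeply with dose
```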
Probability and statistics: models for research
National Research Council Canada - National Science Library
Bailey, Daniel Edgar
1971-01-01
This book is an interpretative presentation of the mathematical and logical basis of probability and statistics, indulging in some mathematics, but concentrating on the logical and scientific meaning...
A COMPLICATED GRIEF INTERVENTION MODEL
African Journals Online (AJOL)
2010-07-29
... is given to the Dual Process Model (Stroebe & Schut 1999) and the Task-Centred approach (a social work approach to therapy) ... of social functioning, for example social isolation and loneliness and role changes ...
Geometric modeling in probability and statistics
Calin, Ovidiu
2014-01-01
This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...
Calculating the Probability of Returning a Loan with Binary Probability Models
Directory of Open Access Journals (Sweden)
Julian Vasilev
2014-12-01
The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability, and several influencing factors are identified here using statistical and econometric models. The main approach applies probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. The probability does depend on the sum of the contract, the remoteness of the loan owner and the month of birth: it increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
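The binary logit underlying such a model turns a linear score into a repayment probability via the logistic function. A minimal sketch with hypothetical coefficients whose signs merely follow the abstract's findings (none of these values come from the article):

```python
import math

def repayment_probability(x, beta):
    """Binary-logit probability P(loan repaid | x) = 1 / (1 + exp(-x'beta))."""
    z = sum(b * xi for b, xi in zip(beta, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for [intercept, loan sum (thousands),
# remoteness (hundreds of km), birth month 1-12]; signs follow the abstract:
# probability rises with sum and remoteness, falls with later birth month.
beta = [0.2, 0.015, 0.3, -0.05]
p = repayment_probability([1.0, 10.0, 2.0, 3.0], beta)
print(round(p, 3))  # 0.69
```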
Stochastic population dynamic models as probability networks
Borsuk, M.E. and D.C. Lee
2009-01-01
The dynamics of a population and its response to environmental change depend on the balance of birth, death and age-at-maturity, and there have been many attempts to mathematically model populations based on these characteristics. Historically, most of these models were deterministic, meaning that the results were strictly determined by the equations of the model and...
Energy Technology Data Exchange (ETDEWEB)
Semenenko, Vladimir A., E-mail: vsemenenko@LandauerMP.com [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Tarima, Sergey S. [Division of Biostatistics, Institute for Health and Society, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Devisetty, Kiran [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Pelizzari, Charles A.; Liauw, Stanley L. [Department of Radiation and Cellular Oncology, University of Chicago Pritzker School of Medicine, Chicago, Illinois (United States)
2013-03-15
Purpose: To perform validation of risk predictions for late rectal toxicity (LRT) in prostate cancer obtained using a new approach to synthesize published normal tissue complication data. Methods and Materials: A published study survey was performed to identify the dose-response relationships for LRT derived from nonoverlapping patient populations. To avoid mixing models based on different symptoms, the emphasis was placed on rectal bleeding. The selected models were used to compute the risk estimates of grade 2+ and grade 3+ LRT for an independent validation cohort composed of 269 prostate cancer patients with known toxicity outcomes. Risk estimates from single studies were combined to produce consolidated risk estimates. An agreement between the actuarial toxicity incidence 3 years after radiation therapy completion and single-study or consolidated risk estimates was evaluated using the concordance correlation coefficient. Goodness of fit for the consolidated risk estimates was assessed using the Hosmer-Lemeshow test. Results: A total of 16 studies of grade 2+ and 5 studies of grade 3+ LRT met the inclusion criteria. The consolidated risk estimates of grade 2+ and 3+ LRT were constructed using 3 studies each. For grade 2+ LRT, the concordance correlation coefficient for the consolidated risk estimates was 0.537 compared with 0.431 for the best-fit single study. For grade 3+ LRT, the concordance correlation coefficient for the consolidated risk estimates was 0.477 compared with 0.448 for the best-fit single study. No evidence was found for a lack of fit for the consolidated risk estimates using the Hosmer-Lemeshow test (P=.531 and P=.397 for grade 2+ and 3+ LRT, respectively). Conclusions: In a large cohort of prostate cancer patients, selected sets of consolidated risk estimates were found to be more accurate predictors of LRT than risk estimates derived from any single study.
Probability Modeling and Thinking: What Can We Learn from Practice?
Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze
2016-01-01
Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…
Model checking meets probability: a gentle introduction
Katoen, Joost P.
2013-01-01
This paper considers fully probabilistic system models. Each transition is quantified with a probability (its likelihood of occurrence). Properties are expressed as automata that either accept or reject system runs. The central question is to determine the fraction of accepted system runs.
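For a finite discrete-time Markov chain, that acceptance fraction is the probability of reaching an accepting state, which satisfies the fixed-point equations p(s) = Σ P(s,s')·p(s'). A toy sketch (the chain and its numbers are hypothetical, not from the paper):

```python
def acceptance_probability(trans, accept, start, iters=200):
    """Probability that a run of a finite DTMC reaches an accepting state,
    via fixed-point iteration on p(s) = sum over s' of P(s,s') * p(s')."""
    states = set(trans) | {t for row in trans.values() for t in row}
    p = {s: (1.0 if s in accept else 0.0) for s in states}
    for _ in range(iters):
        for s in trans:  # absorbing states keep their fixed values
            p[s] = sum(prob * p[t] for t, prob in trans[s].items())
    return p[start]

# Toy chain: from s0 accept with 0.5 or move to s1; from s1 accept with 0.3
# or reject with 0.7; "acc" and "rej" are absorbing.
trans = {"s0": {"acc": 0.5, "s1": 0.5}, "s1": {"acc": 0.3, "rej": 0.7}}
print(round(acceptance_probability(trans, {"acc"}, "s0"), 4))  # 0.65
```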
Modelling spruce bark beetle infestation probability
Paulius Zolubas; Jose Negron; A. Steven Munson
2009-01-01
Spruce bark beetle (Ips typographus L.) risk model, based on pure Norway spruce (Picea abies Karst.) stand characteristics in experimental and control plots was developed using classification and regression tree statistical technique under endemic pest population density. The most significant variable in spruce bark beetle...
Calvert, S.C.; Taale, H.; Hoogendoorn, S.P.
2014-01-01
In this contribution the Core Probability Framework (CPF) is introduced with the application of the Discrete-Element Core Probability Model (DE-CPM) as a new DNL (dynamic network loading) model for dynamic macroscopic modelling of stochastic traffic flow. The model is demonstrated for validation in a test case.
Energy Technology Data Exchange (ETDEWEB)
Daly, Megan E.; Luxton, Gary [Department of Radiation Oncology, Stanford University, Stanford, CA (United States); Choi, Clara Y.H. [Department of Neurosurgery, Stanford University, Stanford, CA (United States); Gibbs, Iris C. [Department of Radiation Oncology, Stanford University, Stanford, CA (United States); Chang, Steven D.; Adler, John R. [Department of Neurosurgery, Stanford University, Stanford, CA (United States); Soltys, Scott G., E-mail: sgsoltys@stanford.edu [Department of Radiation Oncology, Stanford University, Stanford, CA (United States)
2012-04-01
Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear-quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18-30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8-30.9 Gy) and 22.0 Gy (range, 20.2-26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models...
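The simplified LKB computation described here reduces to two steps: collapse the dose-volume histogram to a generalized EUD using the volume parameter n, then evaluate a probit dose-response with slope m around TD50. A sketch with illustrative parameter values (not the fitted cord parameters reported in the paper):

```python
import math

def lkb_ntcp(dvh, n, m, td50):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.
    dvh: list of (dose_Gy, fractional_volume) bins; n: volume parameter;
    m: inverse slope; td50: dose giving a 50% complication probability."""
    geud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n  # generalized EUD
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

# Sanity check: a uniform whole-organ dose exactly at TD50 gives NTCP = 0.5
print(lkb_ntcp([(50.0, 1.0)], n=0.05, m=0.175, td50=50.0))  # 0.5
```

A small n (serial organ) makes the gEUD track the hottest DVH bins, which is why the paper's exploratory increase of n toward parallel behavior lowers the predicted complication rate.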
Probability model for analyzing fire management alternatives: theory and structure
Frederick W. Bratten
1982-01-01
A theoretical probability model has been developed for analyzing program alternatives in fire management. It includes submodels or modules for predicting probabilities of fire behavior, fire occurrence, fire suppression, effects of fire on land resources, and financial effects of fire. Generalized "fire management situations" are used to represent actual fire...
Gendist: An R Package for Generated Probability Distribution Models.
Directory of Open Access Journals (Sweden)
Shaiful Anuar Abu Bakar
In this paper, we introduce the R package gendist that computes the probability density function, the cumulative distribution function, the quantile function and generates random values for several generated probability distribution models including the mixture model, the composite model, the folded model, the skewed symmetric model and the arc tan model. These models are extensively used in the literature and the R functions provided here are flexible enough to accommodate various univariate distributions found in other R packages. We also show its applications in graphing, estimation, simulation and risk measurements.
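Of the generated families listed, the folded model is the simplest to sketch: the density of |X| piles the mass at -x onto x. Below is a Python analogue of a folded-normal density (gendist itself is an R package; this is only an illustration of the construction):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of Normal(mu, sigma) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def folded_pdf(x, mu=0.0, sigma=1.0):
    """Density of |X| for X ~ Normal(mu, sigma): fold negative-x mass onto x >= 0."""
    if x < 0:
        return 0.0
    return normal_pdf(x, mu, sigma) + normal_pdf(-x, mu, sigma)

# For mu = 0 the folded density is exactly twice the normal density on x >= 0:
print(round(folded_pdf(1.0) / normal_pdf(1.0), 6))  # 2.0
```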
Review of Literature for Model Assisted Probability of Detection
Energy Technology Data Exchange (ETDEWEB)
Meyer, Ryan M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Crawford, Susan L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lareau, John P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Anderson, Michael T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2014-09-30
This is a draft technical letter report for an NRC client documenting a literature review of model-assisted probability of detection (MAPOD) for potential application to nuclear power plant components, aimed at improving field NDE performance estimations.
Discriminative training of CRF models with probably submodular constraints
Zaremba, Wojciech; Blaschko, Matthew
2016-01-01
Zaremba W., Blaschko M., ''Discriminative training of CRF models with probably submodular constraints'', IEEE winter conference on applications of computer vision - WACV 2016, 7 pp., March 7-9, 2016, Lake Placid, NY, USA.
Reach/frequency for printed media: Personal probabilities or models
DEFF Research Database (Denmark)
Mortensen, Peter Stendahl
2000-01-01
The author evaluates two different ways of estimating reach and frequency of plans for printed media. The first assigns reading probabilities to groups of respondents and calculates reach and frequency by simulation. The second estimates parameters of a model for reach/frequency. It is concluded that, in order to prevent bias, ratings per group must be used as reading probabilities. Nevertheless, in most cases, the estimates are still biased compared with panel data, thus overestimating net reach. Models with the same assumptions as with assignments of reading probabilities are presented.
Fujiki, Masato; Hashimoto, Koji; Palaios, Emmanouil; Quintini, Cristiano; Aucejo, Federico N; Uso, Teresa Diago; Eghtesad, Bijan; Miller, Charles M
2017-11-01
Hepatic artery thrombosis after liver transplantation is a devastating complication associated with ischemic cholangiopathy that can occur even after successful revascularization. This study explores long-term outcomes after hepatic artery thrombosis in adult liver transplantation recipients, focusing on the probability, risk factors, and resolution of ischemic cholangiopathy. A retrospective chart review of 1,783 consecutive adult liver transplantations performed between 1995 and 2014 identified 44 cases of hepatic artery thrombosis (2.6%); 10 patients underwent immediate retransplantation, and 34 patients received nontransplant treatments, involving revascularization (n = 19) or expectant nonrevascularization management (n = 15). The 1-year graft survival after nontransplant treatment was favorable (82%); however, 16 of the 34 patients who received a nontransplant treatment developed ischemic cholangiopathy and required long-term biliary intervention. A Cox regression model showed that increased serum transaminase and bilirubin levels at the time of hepatic artery thrombosis diagnosis, but not nonrevascularization treatment versus revascularization, were risk factors for the development of ischemic cholangiopathy. Ischemic cholangiopathy in revascularized grafts was less extensive, with a greater likelihood of resolution within 5 years than that in nonrevascularized grafts (100% vs 17%). Most liver abscesses without signs of liver failure also were reversible. Salvage retransplantation after a nontransplant treatment was performed in 8 patients, with a 1-year survival rate equivalent to immediate retransplantation (88% vs 80%). Selective nontransplant treatments for hepatic artery thrombosis resulted in favorable graft survival. Biliary intervention can resolve liver abscess and ischemic cholangiopathy that develop in revascularized grafts in the long term; salvage retransplantation should be considered for ischemic cholangiopathy in nonrevascularized grafts.
Bivariate categorical data analysis using normal linear conditional multinomial probability model.
Sun, Bingrui; Sutradhar, Brajendra
2015-02-10
Bivariate multinomial data such as the left and right eyes retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretation of the correlations between the two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.
Models for probability and statistical inference theory and applications
Stapleton, James H
2007-01-01
This concise, yet thorough, book is enhanced with simulations and graphs to build the intuition of readers. Models for Probability and Statistical Inference was written over a five-year period and serves as a comprehensive treatment of the fundamentals of probability and statistical inference. With detailed theoretical coverage found throughout the book, readers acquire the fundamentals needed to advance to more specialized topics, such as sampling, linear models, design of experiments, statistical computing, survival analysis, and bootstrapping. Ideal as a textbook for a two-semester sequence on probability and statistical inference, early chapters provide coverage on probability and include discussions of: discrete models and random variables; discrete distributions including binomial, hypergeometric, geometric, and Poisson; continuous, normal, gamma, and conditional distributions; and limit theory. Since limit theory is usually the most difficult topic for readers to master, the author thoroughly discusses mo...
Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie; Chang, Kyung Hwan
2018-01-01
The aim of this study was to derive a new plan-scoring index using normal tissue complication probabilities to verify different plans in the selection of personalized treatment. Plans for 12 patients treated with tomotherapy were used to compare scoring for ranking. Dosimetric and biological indexes were analyzed for the plans for a clearly distinguishable group (n = 7) and a similar group (n = 12), using treatment plan verification software that we developed. The quality factor (QF) of our support software for treatment decisions was consistent with the final treatment plan for the clearly distinguishable group (average QF = 1.202, 100% match rate, n = 7) and the similar group (average QF = 1.058, 33% match rate, n = 12). Therefore, we propose a normal tissue complication probability (NTCP)-based plan-scoring index for the verification of different plans in personalized treatment-plan selection. Scoring using the new QF showed a 100% match rate (average NTCP QF = 1.0420). The NTCP-based new QF scoring method was adequate for obtaining biological verification quality and organ-risk saving using the treatment-planning decision-support software we developed for prostate cancer.
Estuarine shoreline and sandline change model skill and predicted probabilities
Smith, Kathryn E. L.; Passeri, Davina; Plant, Nathaniel G.
2016-01-01
The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline and sandline change for study areas in Virginia, Maryland, and New Jersey. The models examined the influence of hydrologic and physical variables related to long-term and event-driven (Hurricane Sandy) estuarine back-barrier shoreline and overwash (sandline) change. Input variables were constructed into a Bayesian Network (BN) using Netica. To evaluate the ability of the BN to reproduce the observations used to train the model, the skill, log likelihood ratio and probability predictions were utilized. These data are the probability and skill metrics for all four models: the long-term (LT) back-barrier shoreline change, event-driven (HS) back-barrier shoreline change, long-term (LT) sandline change, and event-driven (HS) sandline change.
Modeling the probability of giving birth at health institutions among ...
African Journals Online (AJOL)
Background: Although antenatal care and institutional delivery are effective means for reducing maternal morbidity and mortality, the probability of giving birth at health institutions among antenatal care attendants has not been modeled in Ethiopia. Therefore, the objective of this study was to model predictors of giving birth at ...
Naive Probability: Model-Based Estimates of Unique Events.
Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N
2015-08-01
We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.
Energy Technology Data Exchange (ETDEWEB)
Samuels, Stuart E.; Eisbruch, Avraham; Vineberg, Karen; Lee, Jae; Lee, Choonik; Matuszak, Martha M.; Ten Haken, Randall K.; Brock, Kristy K., E-mail: kbrock@med.umich.edu
2016-11-01
Purpose: Strategies to reduce the toxicities of head and neck radiation (ie, dysphagia [difficulty swallowing] and xerostomia [dry mouth]) are currently underway. However, the predicted benefit of dose and planning target volume (PTV) reduction strategies is unknown. The purpose of the present study was to compare the normal tissue complication probabilities (NTCP) for swallowing and salivary structures in standard plans (70 Gy [P70]), dose-reduced plans (60 Gy [P60]), and plans eliminating the PTV margin. Methods and Materials: A total of 38 oropharyngeal cancer (OPC) plans were analyzed. Standard organ-sparing volumetric modulated arc therapy plans (P70) were created and then modified by eliminating the PTVs and treating the clinical tumor volumes (CTVs) only (C70) or maintaining the PTV but reducing the dose to 60 Gy (P60). NTCP dose models for the pharyngeal constrictors, glottis/supraglottic larynx, parotid glands (PGs), and submandibular glands (SMGs) were analyzed. The minimal clinically important benefit was defined as a mean change in NTCP of >5%. The P70 NTCP thresholds and overlap percentages of the organs at risk with the PTVs (56-59 Gy, vPTV56) were evaluated to identify the predictors for NTCP improvement. Results: With the P60 plans, only the ipsilateral PG (iPG) benefited (23.9% vs 16.2%; P<.01). With the C70 plans, only the iPG (23.9% vs 17.5%; P<.01) and contralateral SMG (cSMG) (NTCP 32.1% vs 22.9%; P<.01) benefited. An iPG NTCP threshold of 20% and 30% predicted NTCP benefits for the P60 and C70 plans, respectively (P<.001). A cSMG NTCP threshold of 30% predicted for an NTCP benefit with the C70 plans (P<.001). Furthermore, for the iPG, a vPTV56 >13% predicted benefit with P60 (P<.001) and C70 (P=.002). For the cSMG, a vPTV56 >22% predicted benefit with C70 (P<.01). Conclusions: PTV elimination and dose reduction lowered the NTCP of the iPG, and PTV elimination lowered the NTCP of the cSMG. NTCP thresholds and the
Illustrating Probability through Roulette: A Spreadsheet Simulation Model
Directory of Open Access Journals (Sweden)
Kala Chand Seal
2005-11-01
Teaching probability can be challenging because the mathematical formulas often are too abstract and complex for the students to fully grasp the underlying meaning and effect of the concepts. Games can provide a way to address this issue. For example, the game of roulette can be an exciting application for teaching probability concepts. In this paper, we implement a model of roulette in a spreadsheet that can simulate outcomes of various betting strategies. The simulations can be analyzed to gain better insights into the corresponding probability structures. We use the model to simulate a particular betting strategy known as the bet-doubling, or Martingale, strategy. This strategy is quite popular and is often erroneously perceived as a winning strategy even though the probability analysis shows that such a perception is incorrect. The simulation allows us to present the true implications of such a strategy for a player with a limited betting budget and relate the results to the underlying theoretical probability structure. The overall validation of the model, its use for teaching, and its application to the analysis of other types of betting strategies are discussed.
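The Martingale dynamics described above translate directly into code. The sketch below is a Python analogue of the paper's spreadsheet model (all names and parameter values are illustrative, not taken from the paper), simulating the bet-doubling strategy on an even-money European-roulette bet with a limited budget:

```python
import random

def martingale_session(budget=100, base_bet=1, spins=200, p_win=18/37, seed=None):
    """One session of the bet-doubling (Martingale) strategy on an
    even-money bet (European wheel: 18 winning pockets out of 37).
    Returns the final bankroll; parameters are illustrative."""
    rng = random.Random(seed)
    bankroll = budget
    bet = base_bet
    for _ in range(spins):
        if bet > bankroll:            # cannot cover the doubled bet: effective ruin
            break
        if rng.random() < p_win:      # win: recoup the losing run plus base_bet
            bankroll += bet
            bet = base_bet
        else:                         # loss: double the next bet
            bankroll -= bet
            bet *= 2
    return bankroll

# Average over many sessions: the mean final bankroll sits below the budget,
# exposing the negative expectation despite frequent small wins.
outer = random.Random(42)
n_sessions = 10_000
mean_final = sum(martingale_session(seed=outer.random())
                 for _ in range(n_sessions)) / n_sessions
```

Averaging over sessions reproduces the paper's point: many small wins punctuated by occasional ruinous losing streaks, with an overall loss relative to the starting budget.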
Camera-Model Identification Using Markovian Transition Probability Matrix
Xu, Guanshuo; Gao, Shang; Shi, Yun Qing; Hu, Ruimin; Su, Wei
Detecting the (brands and) models of digital cameras from given digital images has become a popular research topic in the field of digital forensics. As most images are JPEG compressed before they are output from cameras, we propose to use an effective image statistical model to characterize the difference JPEG 2-D arrays of Y and Cb components from the JPEG images taken by various camera models. Specifically, the transition probability matrices derived from four different directional Markov processes applied to the image difference JPEG 2-D arrays are used to identify statistical differences caused by image formation pipelines inside different camera models. All elements of the transition probability matrices, after a thresholding technique, are directly used as features for classification purposes. Multi-class support vector machines (SVM) are used as the classification tool. The effectiveness of our proposed statistical model is demonstrated by large-scale experimental results.
Predicting the Probability of Lightning Occurrence with Generalized Additive Models
Fabsic, Peter; Mayr, Georg; Simon, Thorsten; Zeileis, Achim
2017-04-01
This study investigates the predictability of lightning in complex terrain. The main objective is to estimate the probability of lightning occurrence in the Alpine region during summertime afternoons (12-18 UTC) at a spatial resolution of 64 × 64 km2. Lightning observations are obtained from the ALDIS lightning detection network. The probability of lightning occurrence is estimated using generalized additive models (GAM). GAMs provide a flexible modelling framework to estimate the relationship between covariates and the observations. The covariates, besides spatial and temporal effects, include numerous meteorological fields from the ECMWF ensemble system. The optimal model is chosen based on a forward selection procedure with out-of-sample mean squared error as a performance criterion. Our investigation shows that convective precipitation and mid-layer stability are the most influential meteorological predictors. Both exhibit intuitive, non-linear trends: higher values of convective precipitation indicate higher probability of lightning, and large values of the mid-layer stability measure imply low lightning potential. The performance of the model was evaluated against a climatology model containing both spatial and temporal effects. Taking the climatology model as a reference forecast, our model attains a Brier Skill Score of approximately 46%. The model's performance can be further enhanced by incorporating the information about lightning activity from the previous time step, which yields a Brier Skill Score of 48%. These scores show that the method is able to extract valuable information from the ensemble to produce reliable spatial forecasts of the lightning potential in the Alps.
ESTIMATION OF TRIP MODE PROBABILITY CHOICE USING MULTINOMIAL LOGISTIC MODEL
Directory of Open Access Journals (Sweden)
Bilous, A.
2012-06-01
The modal-split step of the four-step model for the determination of urban travel demand is analyzed. Utility functions are composed, and their coefficients are calibrated in TransCAD. Equations for estimating the probability of trip mode choice are presented, and a numerical illustration of the estimation is given.
Flight Overbooking Problem–Use of a Probability Model
Indian Academy of Sciences (India)
Resonance – Journal of Science Education, Classroom, Volume 7, Issue 3, March 2002, pp. 56-60. Permanent link: http://www.ias.ac.in/article/fulltext/reso/007/03/0056-0060
Modeling highway travel time distribution with conditional probability models
Energy Technology Data Exchange (ETDEWEB)
Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)
2014-01-01
Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distributions as new data or information is added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
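The core convolution idea can be illustrated with small discrete travel-time distributions. The PMF values and the link-to-link conditional matrix below are invented for illustration, not taken from the FPM data:

```python
import numpy as np

# Discrete travel-time PMFs for two successive links on a 1-minute grid.
link_a = np.array([0.1, 0.5, 0.3, 0.1])   # P(T_a = 0, 1, 2, 3 minutes)
link_b = np.array([0.2, 0.6, 0.2])        # P(T_b = 0, 1, 2 minutes)

# If the links were independent, the route PMF would be a plain convolution.
route_indep = np.convolve(link_a, link_b)

# With link-to-link dependence, combine through an assumed conditional
# matrix cond[i, j] = P(T_b = j | T_a = i) (each row sums to 1).
cond = np.array([[0.5, 0.4, 0.1],
                 [0.2, 0.6, 0.2],
                 [0.1, 0.5, 0.4],
                 [0.1, 0.3, 0.6]])
route_cond = np.zeros(len(link_a) + cond.shape[1] - 1)
for i, p_a in enumerate(link_a):
    for j, p_b_given_a in enumerate(cond[i]):
        route_cond[i + j] += p_a * p_b_given_a   # P(T_a = i) * P(T_b = j | T_a = i)
```

Comparing `route_indep` and `route_cond` shows how upstream-downstream speed correlation reshapes the route travel-time distribution, which is the reliability effect the study quantifies.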
Probability bounds analysis for nonlinear population ecology models.
Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A
2015-09-01
Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
Mixture probability distribution functions to model wind speed distributions
Energy Technology Data Exchange (ETDEWEB)
Kollu, Ravindra; Rayapudi, Srinivasa Rao; Pakkurthi, Krishna Mohan [J.N.T. Univ., Kakinada (India). Dept. of Electrical and Electronics Engineering; Narasimham, S.V.L. [J.N.T. Univ., Andhra Pradesh (India). Computer Science and Engineering Dept.
2012-11-01
Accurate wind speed modeling is critical in estimating wind energy potential for harnessing wind power effectively. The quality of wind speed assessment depends on the capability of the chosen probability density function (PDF) to describe the measured wind speed frequency distribution. The objective of this study is to describe (model) wind speed characteristics using three mixture probability density functions, Weibull-generalized extreme value (GEV), Weibull-lognormal, and GEV-lognormal, which have not been tried before. Statistical parameters such as maximum error in the Kolmogorov-Smirnov test, root mean square error, Chi-square error, coefficient of determination, and power density error are considered as judgment criteria to assess the fitness of the probability density functions. Results indicate that the Weibull-GEV PDF is able to describe unimodal as well as bimodal wind distributions accurately, whereas the GEV-lognormal PDF is able to describe the familiar bell-shaped unimodal distribution well. Results show that mixture probability functions are better alternatives to conventional Weibull, two-component mixture Weibull, gamma, and lognormal PDFs for describing wind speed characteristics.
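One of the mixtures can be sketched directly. The snippet below evaluates a two-component Weibull-lognormal mixture density using hand-rolled PDFs; all parameter values are illustrative assumptions, not fitted to any measured wind data:

```python
import numpy as np

def weibull_pdf(v, k, lam):
    """Weibull density with shape k and scale lam (wind speed v > 0)."""
    return (k / lam) * (v / lam) ** (k - 1) * np.exp(-((v / lam) ** k))

def lognorm_pdf(v, mu, sigma):
    """Lognormal density with log-mean mu and log-standard-deviation sigma."""
    return np.exp(-((np.log(v) - mu) ** 2) / (2 * sigma ** 2)) / (
        v * sigma * np.sqrt(2 * np.pi))

def mixture_pdf(v, w, k, lam, mu, sigma):
    """Weibull-lognormal mixture with Weibull mixing weight w in [0, 1]."""
    return w * weibull_pdf(v, k, lam) + (1 - w) * lognorm_pdf(v, mu, sigma)

# Evaluate on a fine grid and check that the density integrates to ~1.
v = np.linspace(0.01, 40.0, 4000)
pdf = mixture_pdf(v, w=0.6, k=2.0, lam=6.0, mu=np.log(8.0), sigma=0.4)
area = pdf.sum() * (v[1] - v[0])   # simple Riemann-sum check
```

Fitting such a mixture to measured wind speeds (e.g. by maximum likelihood) and scoring it with the Kolmogorov-Smirnov and RMSE criteria mentioned above is the natural next step; the component weights let a single model capture bimodal wind regimes.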
Zebrafish : A Model for Understanding Diabetic Complications
Joergens, K.; Hillebrands, J. -L.; Hammes, H. -P.; Kroll, J.
Diabetes mellitus causes several vascular complications in patients, such as macrovascular problems including myocardial infarction, peripheral artery diseases and stroke and microvascular problems including nephropathy and retinopathy. Likewise, diabetes mellitus is associated with other
A propagation model of computer virus with nonlinear vaccination probability
Gan, Chenquan; Yang, Xiaofan; Liu, Wanping; Zhu, Qingyi
2014-01-01
This paper is intended to examine the effect of vaccination on the spread of computer viruses. For that purpose, a novel computer virus propagation model, which incorporates a nonlinear vaccination probability, is proposed. A qualitative analysis of this model reveals that, depending on the value of the basic reproduction number, either the virus-free equilibrium or the viral equilibrium is globally asymptotically stable. The results of simulation experiments not only demonstrate the validity of our model, but also show the effectiveness of nonlinear vaccination strategies. Through parameter analysis, some effective strategies for eradicating viruses are suggested.
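A toy version of such a model can be sketched as a susceptible-infected-vaccinated system in which the vaccination rate grows with the infected fraction. The functional form p(I) = p0 + p1*I and all rate constants below are illustrative assumptions, not the paper's equations:

```python
def simulate_siv(beta=0.5, gamma=0.2, p0=0.05, p1=0.4, eps=0.05,
                 S0=0.9, I0=0.1, V0=0.0, dt=0.01, steps=20_000):
    """Euler integration of a toy susceptible-infected-vaccinated model.
    Susceptibles are vaccinated at the state-dependent rate p(I) = p0 + p1*I
    (a simple nonlinear vaccination probability); protection wanes at rate eps.
    All rates and the functional form are illustrative assumptions."""
    S, I, V = S0, I0, V0
    for _ in range(steps):
        p = p0 + p1 * I                      # vaccination rate rises with prevalence
        dS = -beta * S * I + gamma * I + eps * V - p * S
        dI = beta * S * I - gamma * I
        dV = p * S - eps * V
        S, I, V = S + dt * dS, I + dt * dI, V + dt * dV
    return S, I, V

S, I, V = simulate_siv()   # population fractions after t = steps * dt time units
```

Because dS + dI + dV = 0 at every step, the fractions stay on the simplex; sweeping beta/gamma across the basic reproduction threshold reproduces the qualitative switch between the virus-free and viral equilibria discussed in the abstract.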
A model to assess dust explosion occurrence probability.
Hassan, Junaid; Khan, Faisal; Amyotte, Paul; Ferdous, Refaul
2014-03-15
Dust handling poses a potential explosion hazard in many industrial facilities. The consequences of a dust explosion are often severe and similar to a gas explosion; however, its occurrence is conditional to the presence of five elements: combustible dust, ignition source, oxidant, mixing and confinement. Dust explosion researchers have conducted experiments to study the characteristics of these elements and generate data on explosibility. These experiments are often costly but the generated data has a significant scope in estimating the probability of a dust explosion occurrence. This paper attempts to use existing information (experimental data) to develop a predictive model to assess the probability of a dust explosion occurrence in a given environment. The proposed model considers six key parameters of a dust explosion: dust particle diameter (PD), minimum ignition energy (MIE), minimum explosible concentration (MEC), minimum ignition temperature (MIT), limiting oxygen concentration (LOC) and explosion pressure (Pmax). A conditional probabilistic approach has been developed and embedded in the proposed model to generate a nomograph for assessing dust explosion occurrence. The generated nomograph provides a quick assessment technique to map the occurrence probability of a dust explosion for a given environment defined with the six parameters. Copyright © 2014 Elsevier B.V. All rights reserved.
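Under a (simplifying) conditional-independence assumption, the five-element logic reduces to a product of element probabilities. The sketch below is a hypothetical illustration of that reduction, not the paper's nomograph construction, and the element likelihoods are invented:

```python
def explosion_probability(p_dust, p_ignition, p_oxidant, p_mixing, p_confinement):
    """Probability that all five explosion-pentagon elements coincide,
    assuming conditional independence between the elements (a deliberate
    simplification; the paper's model maps the six measured parameters,
    e.g. MIE and MEC, onto such element likelihoods)."""
    return p_dust * p_ignition * p_oxidant * p_mixing * p_confinement

# Hypothetical element likelihoods for some dust-handling environment
p = explosion_probability(p_dust=0.9, p_ignition=0.3, p_oxidant=1.0,
                          p_mixing=0.5, p_confinement=0.8)
# p = 0.9 * 0.3 * 1.0 * 0.5 * 0.8 = 0.108
```

A nomograph is essentially a pre-computed map of this product over the six measured parameters, so a plant engineer can read off the occurrence probability without repeating the calculation.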
Mortality Probability Model III and Simplified Acute Physiology Score II
Vasilevskis, Eduard E.; Kuzniewicz, Michael W.; Cason, Brian A.; Lane, Rondall K.; Dean, Mitzi L.; Clay, Ted; Rennie, Deborah J.; Vittinghoff, Eric; Dudley, R. Adams
2009-01-01
Background: To develop and compare ICU length-of-stay (LOS) risk-adjustment models using three commonly used mortality or LOS prediction models. Methods: Between 2001 and 2004, we performed a retrospective, observational study of 11,295 ICU patients from 35 hospitals in the California Intensive Care Outcomes Project. We compared the accuracy of the following three LOS models: a recalibrated acute physiology and chronic health evaluation (APACHE) IV-LOS model; and models developed using risk factors in the mortality probability model III at zero hours (MPM0) and the simplified acute physiology score (SAPS) II mortality prediction model. We evaluated models by calculating the following: (1) grouped coefficients of determination; (2) differences between observed and predicted LOS across subgroups; and (3) intraclass correlations of observed/expected LOS ratios between models. Results: The grouped coefficients of determination were APACHE IV with coefficients recalibrated to the LOS values of the study cohort (APACHE IVrecal) [R2 = 0.422], mortality probability model III at zero hours (MPM0 III) [R2 = 0.279], and simplified acute physiology score (SAPS II) [R2 = 0.008]. For each decile of predicted ICU LOS, the mean predicted LOS vs the observed LOS was significantly different (p ≤ 0.05) for three, two, and six deciles using APACHE IVrecal, MPM0 III, and SAPS II, respectively. Plots of the predicted vs the observed LOS ratios of the hospitals revealed a threefold variation in LOS among hospitals with high model correlations. Conclusions: APACHE IV and MPM0 III were more accurate than SAPS II for the prediction of ICU LOS. APACHE IV is the most accurate and best calibrated model. Although it is less accurate, MPM0 III may be a reasonable option if the data collection burden or the treatment effect bias is a consideration. PMID:19363210
Can quantum probability provide a new direction for cognitive modeling?
Pothos, Emmanuel M; Busemeyer, Jerome R
2013-06-01
Classical (Bayesian) probability (CP) theory has led to an influential research tradition for modeling cognitive processes. Cognitive scientists have been trained to work with CP principles for so long that it is hard even to imagine alternative ways to formalize probabilities. However, in physics, quantum probability (QP) theory has been the dominant probabilistic approach for nearly 100 years. Could QP theory provide us with any advantages in cognitive modeling as well? Note first that both CP and QP theory share the fundamental assumption that it is possible to model cognition on the basis of formal, probabilistic principles. But why consider a QP approach? The answers are that (1) there are many well-established empirical findings (e.g., from the influential Tversky, Kahneman research tradition) that are hard to reconcile with CP principles; and (2) these same findings have natural and straightforward explanations with quantum principles. In QP theory, probabilistic assessment is often strongly context- and order-dependent, individual states can be superposition states (that are impossible to associate with specific values), and composite systems can be entangled (they cannot be decomposed into their subsystems). All these characteristics appear perplexing from a classical perspective. However, our thesis is that they provide a more accurate and powerful account of certain cognitive processes. We first introduce QP theory and illustrate its application with psychological examples. We then review empirical findings that motivate the use of quantum theory in cognitive theory, but also discuss ways in which QP and CP theories converge. Finally, we consider the implications of a QP theory approach to cognition for human rationality.
Ruin probabilities in models with a Markov chain dependence structure
Constantinescu, Corina; Kortschak, Dominik; Maume-Deschamps, Véronique
2013-01-01
In this paper we derive explicit expressions for the probability of ruin in a renewal risk model with dependence described by the real-valued random variable Zk = −cτk + Xk, namely the loss between the (k − 1)-th and the k-th claim. Here c represents the constant premium rate, τk the inter-arrival time between the (k − 1)-th and the k-th claim, and Xk is the size of the k-th claim. The dependence structure among (Zk)k>0 is driven by a Markov chain.
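For intuition, the ruin probability in the i.i.d. special case of this model (no Markov-chain dependence, exponential claims) can be estimated by Monte Carlo and checked against the exact Cramer-Lundberg formula. All parameter values below are illustrative:

```python
import math
import random

def ruin_probability(u=5.0, c=1.25, lam=1.0, mu=1.0,
                     n_claims=300, n_paths=5000, seed=7):
    """Monte Carlo ruin probability for the classical renewal risk model:
    surplus = u + c*t - total claims. Inter-arrival times tau_k ~ Exp(lam)
    and claim sizes X_k ~ Exp(mean mu) are independent, i.e. the i.i.d.
    special case (no Markov-chain dependence). Parameters are illustrative."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        for _ in range(n_claims):
            surplus += c * rng.expovariate(lam)    # premium income over tau_k
            surplus -= rng.expovariate(1.0 / mu)   # claim X_k (loss Z_k = X_k - c*tau_k)
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

psi_hat = ruin_probability()
# Exact Cramer-Lundberg value for exponential claims:
# psi(u) = (lam*mu/c) * exp(-(1/mu - lam/c) * u)
psi_exact = (1.0 * 1.0 / 1.25) * math.exp(-(1 / 1.0 - 1.0 / 1.25) * 5.0)
```

The paper's contribution is the analogue of `psi_exact` when the losses Z_k are driven by a Markov chain, where no such simple closed form is available a priori.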
Naive Probability: A Mental Model Theory of Extensional Reasoning.
Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul
1999-01-01
Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…
Modeling evolution using the probability of fixation: history and implications.
McCandlish, David M; Stoltzfus, Arlin
2014-09-01
Many models of evolution calculate the rate of evolution by multiplying the rate at which new mutations originate within a population by a probability of fixation. Here we review the historical origins, contemporary applications, and evolutionary implications of these "origin-fixation" models, which are widely used in evolutionary genetics, molecular evolution, and phylogenetics. Origin-fixation models were first introduced in 1969, in association with an emerging view of "molecular" evolution. Early origin-fixation models were used to calculate an instantaneous rate of evolution across a large number of independently evolving loci; in the 1980s and 1990s, a second wave of origin-fixation models emerged to address a sequence of fixation events at a single locus. Although origin-fixation models have been applied to a broad array of problems in contemporary evolutionary research, their rise in popularity has not been accompanied by an increased appreciation of their restrictive assumptions or their distinctive implications. We argue that origin-fixation models constitute a coherent theory of mutation-limited evolution that contrasts sharply with theories of evolution that rely on the presence of standing genetic variation. A major unsolved question in evolutionary biology is the degree to which these models provide an accurate approximation of evolution in natural populations.
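The basic origin-fixation calculation is short enough to sketch. The snippet below uses Kimura's diffusion approximation for the fixation probability of a single new mutation (one common convention among several; conventions differ in how effective population size and dominance enter, and the parameter values are illustrative):

```python
import math

def p_fix(s, N):
    """Kimura's diffusion approximation to the fixation probability of a
    single new mutation with selection coefficient s in a diploid
    population of size N (initial frequency 1/(2N))."""
    if s == 0:
        return 1.0 / (2 * N)                  # neutral case: the initial frequency
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

def substitution_rate(mu, s, N):
    """Origin-fixation rate: new mutations originate at rate 2*N*mu per
    site per generation, and each fixes with probability p_fix(s, N)."""
    return 2 * N * mu * p_fix(s, N)

mu, N = 1e-8, 10_000
neutral_rate = substitution_rate(mu, 0.0, N)   # 2*N*mu * 1/(2*N) reduces to mu
beneficial = p_fix(0.01, N)                    # roughly 2s for small positive s
```

The neutral case recovers the classic result that the substitution rate equals the mutation rate, independent of population size, which is exactly the "origin times fixation" factorization the review examines.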
Nahorniak, Matthew; Larsen, David P; Volk, Carol; Jordan, Chris E
2015-01-01
In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design-based statistical analysis tools are appropriate for seamless integration of the sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model-based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model-based tools may require, to ensure unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite numerous method-specific tools available to properly account for sampling design, too often in the analysis of ecological data the sample design is ignored and the consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available to researchers to properly account for sampling design in model-based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model-based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model-based analysis tools: linear regression, quantile regression, and boosted regression tree analysis. In all models, using both simulated and actual ecological data, we found inferences to be
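A minimal IPB sketch looks like this; the data are hypothetical, and weighting each unit by the inverse of its inclusion probability is the essential step:

```python
import numpy as np

def ipb_resample(data, incl_prob, rng):
    """Inverse probability bootstrap: draw a with-replacement resample in
    which each unit is selected with weight proportional to 1/incl_prob,
    so the resample behaves approximately like a simple random sample
    from the target population."""
    w = 1.0 / np.asarray(incl_prob)
    w /= w.sum()
    idx = rng.choice(len(data), size=len(data), replace=True, p=w)
    return np.asarray(data)[idx]

# Hypothetical stratified sample: stratum A oversampled at pi = 0.5,
# stratum B undersampled at pi = 0.1. Both strata contain 1000 population
# units, so the true population mean is near (1.0 + 3.0) / 2 = 2.0.
rng = np.random.default_rng(0)
a = rng.normal(1.0, 0.2, 500)                  # 500 units sampled from stratum A
b = rng.normal(3.0, 0.2, 100)                  # 100 units sampled from stratum B
data = np.concatenate([a, b])
pi = np.concatenate([np.full(500, 0.5), np.full(100, 0.1)])

naive_mean = data.mean()                       # biased toward the oversampled stratum
ipb_mean = np.mean([ipb_resample(data, pi, rng).mean() for _ in range(200)])
```

The naive sample mean lands near 1.3, while the IPB resample mean recovers the population value near 2.0; any model-based tool (regression, quantile regression, regression trees) can then be fit to the IPB resamples instead of the raw unequal-probability sample.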
Decision from Models: Generalizing Probability Information to Novel Tasks.
Zhang, Hang; Paily, Jacienta T; Maloney, Laurence T
2015-01-01
We investigate a new type of decision under risk in which, to succeed, participants must generalize their experience in one set of tasks to a novel set of tasks. We asked participants to trade distance for reward in a virtual minefield where each successive step incurred the same fixed probability of failure (referred to as hazard). With constant hazard, the probability of success (the survival function) decreases exponentially with path length. On each trial, participants chose between a shorter path with smaller reward and a longer (more dangerous) path with larger reward. They received feedback in 160 training trials: encountering a mine along their chosen path resulted in zero reward, and successful completion of the path led to the reward associated with the path chosen. They then completed 600 no-feedback test trials with novel combinations of path length and rewards. To maximize expected gain, participants had to learn the correct exponential model in training and generalize it to the test conditions. We compared how participants discounted reward with increasing path length to the predictions of nine choice models, including the correct exponential model. The choices of a majority of the participants were best accounted for by a model of the correct exponential form, although with marked overestimation of the hazard rate. The decision-from-models paradigm differs from experience-based decision paradigms such as decision-from-sampling in the importance assigned to generalizing experience-based information to novel tasks. The task itself is representative of everyday tasks involving repeated decisions in stochastically invariant environments.
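With constant hazard, the trade-off reduces to comparing reward times an exponential survival probability. A small sketch with illustrative numbers (not the experiment's actual rewards or hazard rate):

```python
import math

def expected_gain(reward, length, hazard):
    """Expected gain of a path: reward times the exponential survival
    probability implied by a constant per-step failure rate (hazard)."""
    return reward * math.exp(-hazard * length)

# Illustrative choice at hazard 0.05 per step: the shorter, poorer path has
# the higher expected gain despite offering only half the reward.
short_path = expected_gain(reward=10, length=5, hazard=0.05)    # 10 * e^-0.25
long_path = expected_gain(reward=20, length=20, hazard=0.05)    # 20 * e^-1.0
```

A participant who overestimates the hazard, as the majority did, discounts the longer path even more steeply than this calculation warrants, shifting choices toward short paths.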
A Probability-Based Hybrid User Model for Recommendation System
Directory of Open Access Journals (Sweden)
Jia Hao
2016-01-01
With the rapid development of information communication technology, the available information or knowledge has increased exponentially, causing the well-known information overload phenomenon. This problem is especially serious in product design corporations, where over half of the valuable design time is consumed by knowledge acquisition, which greatly extends the design cycle and weakens competitiveness. Recommender systems have therefore become very important in the product design domain. This research presents a probability-based hybrid user model, which is a combination of collaborative filtering and content-based filtering. This hybrid model utilizes user ratings and item topics or classes, which are available in the product design domain, to predict the knowledge requirement. A comprehensive analysis of the experimental results shows that the proposed method performs better under most parameter settings. This work contributes a probability-based method for implementing recommender systems when only user ratings and item topics are available.
Recent Advances in Model-Assisted Probability of Detection
Thompson, R. Bruce; Brasche, Lisa J.; Lindgren, Eric; Swindell, Paul; Winfree, William P.
2009-01-01
The increased role played by probability of detection (POD) in structural integrity programs, combined with the significant time and cost associated with the purely empirical determination of POD, provides motivation for alternate means to estimate this important metric of NDE techniques. One approach to make the process of POD estimation more efficient is to complement limited empirical experiments with information from physics-based models of the inspection process or controlled laboratory experiments. The Model-Assisted Probability of Detection (MAPOD) Working Group was formed by the Air Force Research Laboratory, the FAA Technical Center, and NASA to explore these possibilities. Since the 2004 inception of the MAPOD Working Group, 11 meetings have been held in conjunction with major NDE conferences. This paper will review the accomplishments of this group, which includes over 90 members from around the world. Included will be a discussion of strategies developed to combine physics-based and empirical understanding, draft protocols that have been developed to guide application of the strategies, and demonstrations that have been or are being carried out in a number of countries. The talk will conclude with a discussion of future directions, which will include documentation of benefits via case studies, development of formal protocols for engineering practice, as well as a number of specific technical issues.
Modelling the Probability of Landslides Impacting Road Networks
Taylor, F. E.; Malamud, B. D.
2012-04-01
During a landslide triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it, and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to give an average density of 1 landslide km-2, i.e. NL = 400 landslide areas chosen randomly for each iteration, and was based on several existing triggered landslide event inventories. A simple road network was chosen in a 'T' shape configuration: one road of 1 x 4000 cells (5 m x 20 km) joined by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event with 400 landslides over a 400 km2 region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area for the 500 iterations (mean ABL) is about 3000 m
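The Monte-Carlo procedure described above can be illustrated with a much-simplified toy model: landslide centres dropped uniformly over a region, with a blockage counted whenever one overlaps a straight road. The fixed landslide footprint below stands in for the paper's inverse-gamma area sampling, and all numbers are only indicative:

```python
import random

def simulate_blockages(n_landslides=400, region_km=20.0, road_x_km=10.0,
                       half_width_km=0.02, iterations=500, seed=1):
    """Toy road-blockage Monte Carlo: landslides fall uniformly over a
    square region; a blockage is counted when one lies within
    half_width_km of a straight north-south road at x = road_x_km.
    (The paper samples landslide areas from a three-parameter inverse
    gamma pdf; a fixed ~40 m footprint keeps this sketch short.)"""
    rng = random.Random(seed)
    counts = []
    for _ in range(iterations):
        n_blocked = 0
        for _ in range(n_landslides):
            x = rng.uniform(0.0, region_km)
            if abs(x - road_x_km) <= half_width_km:
                n_blocked += 1
        counts.append(n_blocked)
    return counts

counts = simulate_blockages()
# With 400 slides of ~40 m footprint across 20 km, the expected number of
# blockages per iteration is about 400 * (0.04 / 20) = 0.8, so individual
# iterations yield a handful of blockages at most, broadly consistent
# with the 0-7 range reported in the abstract.
```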
Detecting gustatory-olfactory flavor mixtures: models of probability summation.
Marks, Lawrence E; Veldhuizen, Maria G; Shepard, Timothy G; Shavit, Adam Y
2012-03-01
Odorants and flavorants typically contain many components. It is generally easier to detect multicomponent stimuli than to detect a single component, through either neural integration or probability summation (PS) (or both). PS assumes that the sensory effects of 2 (or more) stimulus components (e.g., gustatory and olfactory components of a flavorant) are detected in statistically independent channels, that each channel makes a separate decision whether a component is detected, and that the behavioral response depends solely on the separate decisions. Models of PS traditionally assume high thresholds for detecting each component, noise being irrelevant. The core assumptions may be adapted, however, to signal-detection theory, where noise limits detection. The present article derives predictions of high-threshold and signal-detection models of independent-decision PS in detecting gustatory-olfactory flavorants, comparing predictions in yes/no and 2-alternative forced-choice tasks using blocked and intermixed stimulus designs. The models also extend to measures of response times to suprathreshold flavorants. Predictions derived from high-threshold and signal-detection models differ markedly. Available empirical evidence on gustatory-olfactory flavor detection suggests that neither the high-threshold nor the signal-detection versions of PS can readily account for the results, which likely reflect neural integration in the flavor system.
Discrete event simulation: Modeling simultaneous complications and outcomes
Quik, E.H.; Feenstra, T.L.; Krabbe, P.F.M.
2012-01-01
OBJECTIVES: To present an effective and elegant model approach to deal with specific characteristics of complex modeling. METHODS: A discrete event simulation (DES) model with multiple complications and multiple outcomes that each can occur simultaneously was developed. In this DES model parameters,
1982-01-01
This report presents the user instructions and data requirements for SIMCO, a combined simulation and probability computer model developed to quantify and evaluate carbon monoxide in roadside environments. The model permits direct determinations of t...
Energy Technology Data Exchange (ETDEWEB)
Jakobi, Annika, E-mail: Annika.Jakobi@OncoRay.de [OncoRay-National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Bandurska-Luque, Anna [OncoRay-National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Department of Radiation Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Dresden (Germany); Stützer, Kristin; Haase, Robert; Löck, Steffen [OncoRay-National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Wack, Linda-Jacqueline [Section for Biomedical Physics, University Hospital for Radiation Oncology, Eberhard Karls Universität Tübingen (Germany); Mönnich, David [Section for Biomedical Physics, University Hospital for Radiation Oncology, Eberhard Karls Universität Tübingen (Germany); German Cancer Research Center, Heidelberg (Germany); German Cancer Consortium, Tübingen (Germany); Thorwarth, Daniela [Section for Biomedical Physics, University Hospital for Radiation Oncology, Eberhard Karls Universität Tübingen (Germany); and others
2015-08-01
Purpose: The purpose of this study was to determine, by treatment plan comparison along with normal tissue complication probability (NTCP) modeling, whether a subpopulation of patients with head and neck squamous cell carcinoma (HNSCC) could be identified that would gain substantial benefit from proton therapy in terms of NTCP. Methods and Materials: For 45 HNSCC patients, intensity modulated radiation therapy (IMRT) was compared to intensity modulated proton therapy (IMPT). Physical dose distributions were evaluated as well as the resulting NTCP values, using modern models for acute mucositis, xerostomia, aspiration, dysphagia, laryngeal edema, and trismus. Patient subgroups were defined based on primary tumor location. Results: Generally, IMPT reduced the NTCP values while keeping similar target coverage for all patients. Subgroup analyses revealed a higher individual reduction of swallowing-related side effects by IMPT for patients with tumors in the upper head and neck area, whereas the risk reduction of acute mucositis was more pronounced in patients with tumors in the larynx region. More patients with tumors in the upper head and neck area had a reduction in NTCP of more than 10%. Conclusions: Subgrouping can help to identify patients who may benefit more than others from the use of IMPT and, thus, can be a useful tool for a preselection of patients in the clinic where there are limited PT resources. Because the individual benefit differs within a subgroup, the relative merits should additionally be evaluated by individual treatment plan comparisons.
High-resolution urban flood modelling - a joint probability approach
Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen
2017-04-01
(Divoky et al., 2005). Nevertheless, such events occur, and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). The November 2009 fluvial-coastal flooding of Cork City, which caused a €100m loss, was one such incident. This event was used by Olbert et al. (2015) to determine the processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess the combined effect of multiple-source flooding on flood probability and severity in urban areas, and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms, and thresholds defining flood severity. The two main physical processes controlling urban flooding, high sea water levels (coastal flooding) and high river flows (fluvial flooding), and their threshold values above which flooding is likely to occur, are considered in this study. The contributions of coastal and fluvial drivers to flooding, and their impacts, are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows, and ultimately the application of the joint probability method to estimate joint exceedance return periods for combinations of surge, tide and river flow. In the second step, a numerical model of Cork Harbour, MSN_Flood, comprising a cascade of four nested high-resolution models, is used to simulate flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al
A complicated grief intervention model | Drenth | Health SA ...
African Journals Online (AJOL)
In this article, the researchers tabulate some of the models and approaches to bereavement and discuss their applicability to complicated grief. Specific attention is given to the Dual Process Model (Stroebe & Schut 1999) and the Task-Centred approach (a social work approach to therapy) in an attempt to develop a model ...
Potential complications to TB vaccine testing in animal models.
Orme, Ian M
2009-06-01
Testing of new vaccines in animal models has certain advantages and disadvantages. As we better understand the complexity of the immune response to vaccines, new information may be complicating the assessment of the efficacy of new candidate vaccines. Four possible complications are discussed here: (i) induction of Foxp3+ T cells; (ii) induction of memory T cell subsets; (iii) location of extracellular organisms in lung necrosis; and (iv) protection against isolates of high/extreme immunopathology.
Model and test in a fungus of the probability that beneficial mutations survive drift
Gifford, D.R.; Visser, de J.A.G.M.; Wahl, L.M.
2013-01-01
Determining the probability of fixation of beneficial mutations is critically important for building predictive models of adaptive evolution. Despite considerable theoretical work, models of fixation probability have stood untested for nearly a century. However, recent advances in experimental and
Modelling probabilities of heavy precipitation by regional approaches
Gaal, L.; Kysely, J.
2009-09-01
Extreme precipitation events are associated with large negative consequences for human society, mainly as they may trigger floods and landslides. The recent series of flash floods in central Europe (affecting several isolated areas) on June 24-28, 2009, the worst over several decades in the Czech Republic as to the number of persons killed and the extent of damage to buildings and infrastructure, is an example. Estimates of growth curves and design values (corresponding e.g. to 50-yr and 100-yr return periods) of precipitation amounts, together with their uncertainty, are important in hydrological modelling and other applications. The interest in high quantiles of precipitation distributions is also related to possible climate change effects, as climate model simulations tend to project increased severity of precipitation extremes in a warmer climate. The present study compares, by means of Monte Carlo simulation experiments, several methods of modelling probabilities of precipitation extremes that make use of 'regional approaches': the estimation of distributions of extremes takes into account data in a 'region' ('pooling group'), in which one may assume that the distributions at individual sites are identical apart from a site-specific scaling factor (a condition referred to as 'regional homogeneity'). In other words, all data in a region, often weighted in some way, are taken into account when estimating the probability distribution of extremes at a given site. The advantage is that sampling variations in the estimates of model parameters and high quantiles are to a large extent reduced compared to single-site analysis. We focus on the 'region-of-influence' (ROI) method, which is based on the identification of unique pooling groups (forming the database for the estimation) for each site under study. The similarity of sites is evaluated in terms of a set of site attributes related to the distributions of extremes. The issue of
Parametric modeling of probability of bank loan default in Kenya ...
African Journals Online (AJOL)
This makes the study of the probability of a customer defaulting very useful when analyzing credit risk policies. In this paper, we use a raw data set that contains demographic information about the borrowers. The data sets have been used to identify which risk factors associated with the borrowers contribute towards default.
Optimizing an objective function under a bivariate probability model
X. Brusset; N.M. Temme (Nico)
2007-01-01
The motivation of this paper is to obtain an analytical closed form of a quadratic objective function arising from a stochastic decision process with bivariate exponential probability distribution functions that may be dependent. This method is applicable when results need to be
Goh, Yan Mei; Tokala, Ajay; Hany, Tarek; Pursnani, Kishore G.; Date, Ravindra S.
2017-01-01
Portomesenteric venous thrombosis (PMVT) is a rare but well-reported complication following laparoscopic surgery. We present three cases of PMVT following laparoscopic surgery. Our first case is a 71-year-old morbidly obese woman admitted for elective laparoscopic giant hiatus hernia (LGHH) repair. Post-operatively, she developed multi-organ dysfunction, and computed tomography scan revealed portal venous gas and extensive small bowel infarct. The second patient is a 51-year-old man with known previous deep venous thrombosis who also had elective LGHH repair. He presented 8 weeks post-operatively with severe abdominal pain and required major bowel resection. Our third case is an 86-year-old woman who developed worsening abdominal tenderness 3 days after laparoscopic right hemicolectomy for adenocarcinoma and was diagnosed with an incidental finding of thrombus in the portal vein. She did not require further surgical intervention. The current guidelines for thromboprophylaxis follow-up in this patient group may not be adequate for patients at risk. Hence, we propose a prolonged period of thromboprophylaxis in patients undergoing major laparoscopic surgery. PMID:28281480
Study on the Confidence and Reliability of the Mean Seismic Probability Risk Model
Wang, Xiao-Lei; Lu, Da-Gang
2017-01-01
The mean seismic probability risk model has been widely used in seismic design and safety evaluation of critical infrastructures. In this paper, confidence level analysis and error equation derivation for the mean seismic probability risk model are conducted. It is found that the confidence levels and error values of the mean seismic probability risk model vary across sites, and that for most sites the confidence levels are low and the error values are large. Meanwhile...
PROBABILITY MODELS FOR OBTAINING NON-NUMERICAL DATA
Orlov A. I.
2015-01-01
The statistics of objects of non-numerical nature (also called statistics of non-numerical objects, non-numerical data statistics, or non-numeric statistics) is the area of mathematical statistics devoted to methods for analyzing non-numeric data. The basis for applying the results of mathematical statistics is probabilistic-statistical models of real phenomena and processes, the most important (and often the only ones) of which are models for obtaining data. The simplest example of a model for obtaining data is the mod...
Application of Probability Methods to Assess Crash Modeling Uncertainty
Lyle, Karen H.; Stockwell, Alan E.; Hardy, Robin C.
2007-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models, human occupant models, and advanced material models that include nonlinear stress-strain behaviors and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantifying the effects of finite element modeling assumptions on the predicted responses. The vertical drop test of a Fokker F28 fuselage section will be the focus of this paper. The results of a probabilistic analysis using finite element simulations will be compared with experimental data.
Takemura, Kazuhisa; Murakami, Hajime
2016-01-01
A probability weighting function (w(p)) is considered to be a nonlinear function of probability (p) in behavioral decision theory. This study proposes a psychophysical model of probability weighting functions derived from a hyperbolic time discounting model and a geometric distribution. The aim of the study is to view probability weighting functions from the point of view of the decision maker's waiting time. Since the expected value of a geometrically distributed random variable X is 1/p, we formalized the probability weighting function of the expected value model for hyperbolic time discounting as w(p) = (1 - k log p)^(-1). Moreover, the probability weighting function is derived from Loewenstein and Prelec's (1992) generalized hyperbolic time discounting model. The latter model is proved to be equivalent to the hyperbolic-logarithmic weighting function considered by Prelec (1998) and Luce (2001). In this study, we derive a model from the generalized hyperbolic time discounting model assuming Fechner's (1860) psychophysical law of time and a geometric distribution of trials. In addition, we develop median models of hyperbolic time discounting and generalized hyperbolic time discounting. To assess the fit of each model, a psychological experiment was conducted to measure the probability weighting and value functions at the level of the individual participant. The participants were 50 university students. The results of the individual analysis indicated that the expected value model of generalized hyperbolic discounting fitted better than previous probability weighting decision-making models. The theoretical implications of this finding are discussed.
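The weighting function stated in the abstract, w(p) = (1 - k log p)^(-1), is easy to evaluate and check at the boundaries: w(1) = 1, w is increasing in p, and w tends to 0 as p tends to 0. A minimal sketch (the choices of k are illustrative only):

```python
import math

def w(p, k=1.0):
    # Probability weighting function derived from hyperbolic time
    # discounting with a geometric distribution of trials:
    # w(p) = (1 - k*log(p))**-1. Since log(1) = 0, w(1) = 1 exactly,
    # and the denominator grows without bound as p -> 0.
    return 1.0 / (1.0 - k * math.log(p))

# Boundary and monotonicity checks:
print(w(1.0))                           # 1.0
print(w(0.5) > w(0.25))                 # True: w is increasing in p
print(w(0.5, k=2.0) < w(0.5, k=1.0))    # True: larger k bends the curve more
```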
Probabilistic independence networks for hidden Markov probability models.
Smyth, P; Heckerman, D; Jordan, M I
1997-02-15
Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas, including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper presents a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
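The forward pass referred to above (half of the forward-backward algorithm) is short enough to state directly. The two-state model below is a toy example with invented numbers, not one of the paper's speech models:

```python
def forward(obs, init, trans, emit):
    """Forward pass of the forward-backward algorithm: returns the
    likelihood P(observations) of a discrete HMM.
    init[i] = P(state_0 = i); trans[i][j] = P(state j | state i);
    emit[i][o] = P(observation o | state i)."""
    n = len(init)
    # Initialize with the first observation.
    alpha = [init[i] * emit[i][obs[0]] for i in range(n)]
    # Propagate: each new alpha[j] sums over all predecessor states.
    for o in obs[1:]:
        alpha = [sum(alpha[i] * trans[i][j] for i in range(n)) * emit[j][o]
                 for j in range(n)]
    return sum(alpha)

# Toy two-state HMM with two observation symbols (0 and 1).
init = [0.6, 0.4]
trans = [[0.7, 0.3], [0.4, 0.6]]
emit = [[0.9, 0.1], [0.2, 0.8]]
likelihood = forward([0, 1, 0], init, trans, emit)
assert 0.0 < likelihood < 1.0
```

In the graphical-model view of the paper, this recursion is an instance of general message passing on the HMM's independence network.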
Probability distribution analysis of observational extreme events and model evaluation
Yu, Q.; Lau, A. K. H.; Fung, J. C. H.; Tsang, K. T.
2016-12-01
Earth's surface temperatures in 2015 were the warmest since modern record-keeping began in 1880, according to the latest study. In contrast, cold weather occurred in many regions of China in January 2016 and brought Guangzhou, the capital city of Guangdong province, its first snowfall in 67 years. To understand changes in extreme weather events and to project their future scenarios, this study uses statistical models to analyze multiple climate datasets. We first use a Granger-causality test to identify the attribution of the global mean temperature rise and extreme temperature events to CO2 concentration. The four statistical moments (mean, variance, skewness, kurtosis) of the daily maximum temperature distribution are investigated for global climate observational and reanalysis data (1961-2010) and model data (1961-2100). Furthermore, we introduce a new tail index based on the four moments, which is a more robust index for measuring extreme temperatures. Our results show that the CO2 concentration can provide information about the time series of mean and extreme temperature, but not vice versa. Based on our new tail index, we find that, beyond mean and variance, skewness is an important indicator that should be considered when estimating extreme temperature changes and evaluating models. Among the 12 climate models we investigate, the fourth version of the Community Climate System Model (CCSM4) from the National Center for Atmospheric Research performs well on the new index, indicating that the model has a substantial capability to project future changes of extreme temperature in the 21st century. The method also shows its ability to measure extreme precipitation/drought events. In the future we will introduce a new diagram to systematically evaluate the performance of the four statistical moments in climate model output; moreover, the human and economic impacts of extreme weather events will also be assessed.
Brémaud, Pierre
2017-01-01
The emphasis in this book is placed on general models (Markov chains, random fields, random graphs), universal methods (the probabilistic method, the coupling method, the Stein-Chen method, martingale methods, the method of types) and versatile tools (Chernoff's bound, Hoeffding's inequality, Holley's inequality) whose domain of application extends far beyond the present text. Although the examples treated in the book relate to the possible applications, in the communication and computing sciences, in operations research and in physics, this book is in the first instance concerned with theory. The level of the book is that of a beginning graduate course. It is self-contained, the prerequisites consisting merely of basic calculus (series) and basic linear algebra (matrices). The reader is not assumed to be trained in probability since the first chapters give in considerable detail the background necessary to understand the rest of the book.
DEFF Research Database (Denmark)
Azarang, Leyla; Scheike, Thomas; de Uña-Álvarez, Jacobo
2017-01-01
In this work, we present direct regression analysis for the transition probabilities in the possibly non-Markov progressive illness-death model. The method is based on binomial regression, where the response is the indicator of occupancy of the given state along time. Randomly weighted score equations that are able to remove the bias due to censoring are introduced. By solving these equations, one can estimate the possibly time-varying regression coefficients, which have an immediate interpretation as covariate effects on the transition probabilities. The performance of the proposed estimator is investigated through simulations. We apply the method to data from the Registry of Systemic Lupus Erythematosus RELESSER, a multicenter registry created by the Spanish Society of Rheumatology. Specifically, we investigate the effect of age at Lupus diagnosis, sex, and ethnicity on the probability of damage...
Probability models of the x, y pole coordinates data
Sen, A.; Niedzielski, T.; Kosek, W.
2008-12-01
The aim of this study is to find the most appropriate probabilistic model for the x, y pole coordinates time series. We have analyzed the IERS eopc04_05 data set covering the period 1962 - 2008. We have also considered the residual x, y pole coordinate time series, computed as the difference between the original data and the corresponding least-squares model of the Chandler circle and annual ellipse. Using the measures of skewness and kurtosis of the empirical distribution of the data, we find that the x, y pole coordinates and the corresponding residual time series cannot be modeled by a normal (Gaussian) distribution, which has zero skewness and a kurtosis value of 3. We have fitted several non-Gaussian distributions to the datasets, including the Generalized Extreme Value distribution, 4-parameter Beta distribution, Johnson SB and SU distributions, Generalized Pareto distribution and Wakeby distribution. The suitability of these distributions as probabilistic models for the x, y pole coordinates and the corresponding residual time series is discussed.
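The screening step used in this abstract, comparing sample skewness and kurtosis against the Gaussian reference values of 0 and 3, can be sketched as follows (the synthetic sample merely stands in for the pole-coordinate series):

```python
import math
import random

def moments(xs):
    """Sample mean, variance, skewness, and (non-excess) kurtosis: the
    screening statistics used to judge whether a Gaussian model is
    plausible (skewness ~ 0 and kurtosis ~ 3 for a normal sample)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in xs) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in xs) / (n * var ** 2)
    return mean, var, skew, kurt

rng = random.Random(42)
gauss = [rng.gauss(0.0, 1.0) for _ in range(100_000)]
_, _, skew, kurt = moments(gauss)
# For a large Gaussian sample these land near 0 and 3; a clear departure
# from those values motivates fitting heavier-tailed families such as
# the GEV or Wakeby distributions mentioned in the abstract.
```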
Jones, Edmund; Epstein, David; García-Mochón, Leticia
2017-10-01
For health-economic analyses that use multistate Markov models, it is often necessary to convert from transition rates to transition probabilities, and for probabilistic sensitivity analysis and other purposes it is useful to have explicit algebraic formulas for these conversions, to avoid having to resort to numerical methods. However, if there are four or more states then the formulas can be extremely complicated. These calculations can be made using packages such as R, but many analysts and other stakeholders still prefer to use spreadsheets for these decision models. We describe a procedure for deriving formulas that use intermediate variables so that each individual formula is reasonably simple. Once the formulas have been derived, the calculations can be performed in Excel or similar software. The procedure is illustrated by several examples and we discuss how to use a computer algebra system to assist with it. The procedure works in a wide variety of scenarios but cannot be employed when there are several backward transitions and the characteristic equation has no algebraic solution, or when the eigenvalues of the transition rate matrix are very close to each other.
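The computation underlying all of these formulas is the matrix exponential P(t) = exp(Qt) of the transition-rate matrix Q. A pure-Python truncated-series sketch is shown below; the three-state illness-death rates are invented for illustration, and dedicated routines (e.g. R's expm package or scipy.linalg.expm) are preferable for real models:

```python
def rates_to_probs(Q, t=1.0, terms=30):
    """Convert a transition-rate (intensity) matrix Q into a
    transition-probability matrix P(t) = exp(Q*t), via a truncated
    Taylor series sum_k (Q*t)^k / k!. Adequate only for small,
    well-scaled matrices; a sketch, not a production implementation."""
    n = len(Q)
    # term holds (Q*t)^k / k!, starting from the identity (k = 0);
    # P accumulates the partial sums of the series.
    term = [[float(i == j) for j in range(n)] for i in range(n)]
    P = [row[:] for row in term]
    for k in range(1, terms):
        term = [[sum(term[i][m] * Q[m][j] * t for m in range(n)) / k
                 for j in range(n)] for i in range(n)]
        P = [[P[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return P

# Illustrative yearly rates: healthy -> ill -> dead (dead is absorbing).
Q = [[-0.3, 0.2, 0.1],
     [0.0, -0.4, 0.4],
     [0.0, 0.0, 0.0]]
P = rates_to_probs(Q, t=1.0)
# Each row of P is a probability distribution over destination states,
# so rows sum to 1 and the absorbing state maps to itself.
```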
Wind-wave modelling aspects within complicated topography
Directory of Open Access Journals (Sweden)
S. Christopoulos
Wave forecasting aspects for basins with complicated geomorphology, such as the Aegean Sea, are investigated through an intercomparison study. The ability of the available wind models (ECMWF, UKMO) to reproduce wind patterns over such basins, as well as three wave models incorporating different physics and characteristics (WAM, AUT, WACCAS), are tested for selected storm cases representing the typical wind situations over the basin. From the wave results, discussed in terms of time series and statistical parameters, the crucial role of the wind resolution and the differing reliability of the wave models in estimating the wave climate in such a basin are pointed out. The necessary grid resolution is also tested, while for a specific test case (December 1991) ERS-1 satellite data are compared with those of the model.
Directory of Open Access Journals (Sweden)
Ibsen Chivatá Cárdenas
2010-04-01
This article presents a rainfall model constructed by applying non-parametric modelling and imprecise probabilities; these tools were used because there was not enough homogeneous information in the study area. The area's hydrological information regarding rainfall was scarce and the existing hydrological time series were not uniform. A distributed extended rainfall model was constructed from so-called probability boxes (p-boxes), multinomial probability distributions and confidence intervals (a friendly algorithm was constructed for non-parametric modelling by combining the last two tools). This model confirmed the high level of uncertainty involved in local rainfall modelling. Uncertainty encompassed the whole range (domain) of probability values, thereby showing the severe limitations on information and leading to the conclusion that a detailed estimation of probability would lead to significant error. Nevertheless, relevant information was extracted; it was estimated that the maximum daily rainfall threshold (70 mm) would be surpassed at least once every three years, as was the magnitude of uncertainty affecting hydrological parameter estimation. This paper's conclusions may be of interest to non-parametric modellers and decision-makers, as such modelling and imprecise probability represent an alternative for hydrological variable assessment and may be an obligatory procedure in the future. Their potential lies in treating scarce information, and they represent a robust modelling strategy for non-seasonal stochastic modelling conditions
Calisto, H.; Bologna, M.
2007-05-01
We report an exact result for the calculation of the probability distribution of the Bernoulli-Malthus-Verhulst model driven by a multiplicative colored noise. We study the conditions under which the probability distribution of the Malthus-Verhulst model can exhibit a transition from a unimodal to a bimodal distribution depending on the value of a critical parameter. Also we show that the mean value of x(t) in the latter model always approaches asymptotically the value 1.
Cold and hot cognition: quantum probability theory and realistic psychological modeling.
Corr, Philip J
2013-06-01
Typically, human decision making is emotionally "hot" and does not conform to "cold" classical probability (CP) theory. As quantum probability (QP) theory emphasises order, context, superimposition states, and nonlinear dynamic effects, one of its major strengths may be its power to unify formal modeling and realistic psychological theory (e.g., information uncertainty, anxiety, and indecision, as seen in the Prisoner's Dilemma).
Liu, En-Bin; Tang, Meng-Ping; Shi, Yong-Jun; Zhou, Guo-Mo; Li, Yong-Fu
2009-11-01
Aiming at the deficiencies in existing research on probability distribution models for mixed-forest tree measurement factors, a joint maximum entropy probability density function was put forward, based on the maximum entropy principle. This function had three characteristics: 1) each element of the function was linked to the maximum entropy function, and hence it could integrate the information about the probability distribution of measurement factors of the main tree species in mixed forests; 2) the function had a double-weight probability expression, making it possible to reflect the complex structure of mixed forests and to accurately and completely reflect the probability distribution of tree measurement factors through full use of the information about the main tree species; and 3) the function was succinct in structure and excellent in performance. The model was applied and tested in two sampling plots in Tianmu Mountain Nature Reserve. The fitting precision (R2 = 0.9655) and testing accuracy (R2 = 0.9772) were both high, suggesting that this model could be used as a probability distribution model for mixed-forest tree measurement factors and provides a feasible method to fully understand the comprehensive structure of mixed forests.
Estimation and asymptotic theory for transition probabilities in Markov renewal multi-state models.
Spitoni, Cristian; Verduijn, Marion; Putter, Hein
2012-08-07
In this paper we discuss estimation of transition probabilities for semi-Markov multi-state models. Non-parametric and semi-parametric estimators of the transition probabilities for a large class of models (forward-going models) are proposed. Large-sample theory is derived using the functional delta method, and the use of resampling is proposed to derive confidence bands for the transition probabilities. The last part of the paper presents the main ideas of the R implementation of the proposed estimators, and data from a renal replacement study are used to illustrate their behavior.
Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno
2016-01-01
Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
Preclinical models of Graves' disease and associated secondary complications.
Moshkelgosha, Sajad; So, Po-Wah; Diaz-Cano, Salvador; Banga, J Paul
2015-01-01
Autoimmune thyroid disease is the most common organ-specific autoimmune disorder, which consists of two opposing clinical syndromes, Hashimoto's thyroiditis and Graves' disease (hyperthyroidism). Graves' disease is characterized by goiter, hyperthyroidism, and the orbital complication known as Graves' orbitopathy (GO), or thyroid eye disease. The hyperthyroidism in Graves' disease is caused by stimulation of thyrotropin hormone receptor (TSHR) function, resulting from the production of agonist antibodies to the receptor. A variety of induced mouse models of Graves' disease have been developed over the past two decades, with some reproducible models leading to a high incidence of autoimmune hyperthyroidism. However, none of the models shows any signs of the orbital manifestation of GO. We have recently developed an experimental mouse model of GO, induced by immunization with the plasmid-encoded ligand-binding domain of human TSHR cDNA by close-field electroporation, that recapitulates the orbital pathology in GO. As in human GO patients, immune mice with hyperthyroid or hypothyroid disease induced by anti-TSHR antibodies exhibited orbital pathology and chemosis, characterized by inflammation of orbital muscles and extensive adipogenesis leading to expansion of the orbital retrobulbar space. Magnetic resonance imaging of the head region in immune mice showed a significant expansion of the orbital space, concurrent with proptosis. This review discusses the different strategies for developing mouse models of Graves' disease, with a particular focus on GO. Furthermore, it outlines how this new model will facilitate molecular investigations into the pathophysiology of the orbital disease and the evaluation of new therapeutic interventions.
Directory of Open Access Journals (Sweden)
Moritz eBoos
2016-05-01
Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
Optimal selection for and mutation testing using a combination of 'easy to apply' probability models
National Research Council Canada - National Science Library
2006-01-01
... optimal ascertainment, many risk assessment models and prior probability models have been developed and evaluated. Four such models, the Claus, Gilpin, Frank and Evans models, are empirically derived scoring systems, easy to apply in daily practice with the use of a pencil and paper and easy to understand for both counsellor...
[Testing the probability of a model to predict suicide risk in high school and university students].
Sahin, Nesrin Hisli; Batigün, Ayşegül Durak
2009-01-01
The aim of this study was to investigate the validity of a model proposed by Batigün and Sahin regarding suicide probability. The sample was composed of 2343 students aged 15-25 years attending various high schools and universities. According to the proposed model, 2 risk groups were formed from this sample based on their scores on the investigation variables (those that simultaneously received scores 1 standard deviation above the mean on the Problem Solving Inventory, Multidimensional Anger Scale, and Impulsivity Scale). Two other risk groups were formed according to the criterion variable scores (suicide probability scores 1 standard deviation above and below the mean). A series of analyses was conducted to investigate the agreement between the model risk groups and the criterion risk groups. The results reveal that the model had a 43.3% success rate in predicting those with high suicide probability, while the false negative rate was 0%. Discriminant analysis showed that the model correctly discriminated 90.2% of those with low suicide probability and 87.3% of those with high suicide probability. The results support the validity of the proposed model for selecting individuals with high suicide probability. In addition, the model can be used to offer these individuals preventive measures, such as problem solving, communication skills, and anger management training.
Learning Conditional Probabilities for Dynamic Influence Structures in Medical Decision Models
Cao, Cungen; Leong, Tze-Yun
1997-01-01
Based on the DynaMoL (a Dynamic decision Modeling Language) framework, we examine the critical issues in automated learning of numerical parameters from large medical databases; present a Bayesian method for learning conditional probabilities from data; analyze how to elicit prior probabilities from the domain expert; and examine several important issues on pre-processing raw data for application in dynamic decision modeling.
Directory of Open Access Journals (Sweden)
Ondřej Šimpach
2012-12-01
Full Text Available It is estimated that about 40,000–50,000 people in the Czech Republic suffer from celiac disease, a disease of gluten intolerance. At the beginning of the independent Czech Republic, the life expectancy at birth of these people was quite low, because detailed diagnosis of this disease came from abroad only in this period. With increasing age, the probability of death of these people grew faster than that of the total population. The aim of this study is to analyse the probability of death of x-year-old persons during the next five years after the general medical examinations in 1990 and 1995. Both analyses are solved using LOGIT and PROBIT models, and the hypothesis that the probability of death of an x-year-old person suffering from celiac disease decreased a few years after the gaining of new medical knowledge from abroad will be confirmed or refuted.
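The LOGIT and PROBIT links mentioned above differ only in the function mapping the linear predictor to a probability. A minimal sketch, with hypothetical coefficients (b0, b1 below are illustrative, not estimates from the celiac data):

```python
import math

def logit_p(x, b0, b1):
    """LOGIT model: P(death within 5 years) = 1 / (1 + exp(-(b0 + b1*x)))."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def probit_p(x, b0, b1):
    """PROBIT model: P(death within 5 years) = Phi(b0 + b1*x),
    where Phi is the standard normal CDF."""
    return 0.5 * (1.0 + math.erf((b0 + b1 * x) / math.sqrt(2.0)))

# Hypothetical coefficients: the death probability rises with age x.
b0, b1 = -8.0, 0.1
p40 = logit_p(40, b0, b1)   # ~0.018
p80 = logit_p(80, b0, b1)   # 0.5 where the linear predictor crosses zero
```

Both links produce S-shaped curves increasing in age; the choice between them rarely changes qualitative conclusions such as the one tested in this study.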
Brown, Paul D; Kline, Robert W; Petersen, Ivy A; Haddock, Michael G
2004-01-01
The treatment of the inguinal lymph nodes with radiotherapy is strongly influenced by the body habitus of the patient. The effect of 7 radiotherapy techniques on femoral head doses was studied. Three female patients of differing body habitus (ectomorph, mesomorph, endomorph) were selected. Radiation fields included the pelvis and contiguous inguinal regions and were representative of fields used in the treatment of cancers of the lower pelvis. Seven treatment techniques were compared. In the ectomorph and mesomorph, normal tissue complication probability (NTCP) for the femoral heads was lowest with use of anteroposterior (AP) and modified posteroanterior (PA) field with inguinal electron field supplements (technique 1). In the endomorph, NTCP was lowest with use of AP and modified PA field without electron field supplements (technique 2) or a 4-field approach (technique 6). Technique 1 for ectomorphs and mesomorphs and techniques 2 and 6 for endomorphs were optimal techniques for providing relatively homogeneous dose distributions within the target area while minimizing the dose to the femoral heads.
INFLATED PROBABILITY MODEL FOR RISK OF VULNERABILITY TO HIV/AIDS INFECTION AMONG FEMALE MIGRANTS
Himanshu Pandey; Jai Kishun
2009-01-01
This paper is concerned with the development of a probability model, based on the risk of sexually transmitted infection for the number of close boyfriends, to describe the distribution of single unmarried female migrants. The parameters involved in the model are estimated, and the application of the model is illustrated through real data.
Improving the Estimation of Markov Transition Probabilities Using Mechanistic-Empirical Models
Directory of Open Access Journals (Sweden)
Daijiro Mizutani
2017-10-01
Full Text Available In many current state-of-the-art bridge management systems, Markov models are used both for the prediction of deterioration and for the determination of optimal intervention strategies. Although the transition probabilities of Markov models are generally estimated from inspection data, situations are not uncommon where the data available are inadequate to estimate them. In this article, a methodology is proposed to estimate the transition probabilities from mechanistic-empirical models for reinforced concrete elements. The proposed methodology estimates the transition probabilities analytically when possible and otherwise through Bayesian statistics, which requires the formulation of a likelihood function and the use of Markov chain Monte Carlo simulations. In an example, the difference between the average condition predicted over a 100-year period by a Markov model developed with the proposed methodology and the condition predicted by the mechanistic-empirical models was 54% of the corresponding difference under the state-of-the-art methodology, i.e., one that estimates the transition probabilities using best-fit curves based on yearly condition distributions. The variation in accuracy of the Markov model as a function of the number of deterioration paths generated by the mechanistic-empirical models is also shown.
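The Markov prediction step such bridge management systems rely on — propagating a condition-state distribution through a transition matrix year by year — can be sketched as follows. The 3-state matrix is hypothetical, not taken from the article:

```python
# Hypothetical 3-state yearly transition matrix (rows: current state,
# columns: next state); states ordered best -> worst, worst state absorbing.
P = [[0.95, 0.04, 0.01],
     [0.00, 0.90, 0.10],
     [0.00, 0.00, 1.00]]

def step(dist, P):
    """One-year update of the condition-state probability distribution."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0, 0.0]        # a new element starts in the best state
for _ in range(100):          # predict over a 100-year horizon
    dist = step(dist, P)

# Expected condition index (1 = best, 3 = worst) after 100 years
avg_condition = sum(s * p for s, p in enumerate(dist, start=1))
```

The quality of `avg_condition` as a prediction hinges entirely on the transition probabilities, which is why estimating them from mechanistic-empirical models matters when inspection data are sparse.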
Study on the Confidence and Reliability of the Mean Seismic Probability Risk Model
Directory of Open Access Journals (Sweden)
Xiao-Lei Wang
2017-01-01
Full Text Available The mean seismic probability risk model has been widely used in seismic design and safety evaluation of critical infrastructures. In this paper, confidence-level analysis and error-equation derivation for the mean seismic probability risk model are conducted. It is found that the confidence levels and error values of the model change from site to site, and that for most sites the confidence levels are low and the error values are large. Meanwhile, the confidence levels of the ASCE/SEI 43-05 design parameters are analyzed, and the error equation of the achieved performance probabilities based on ASCE/SEI 43-05 is obtained. The confidence levels for design results obtained using the ASCE/SEI 43-05 criteria are not high (less than 95%), a high-confidence uniform risk cannot be achieved with these criteria, and for some sites the error between the risk model with the target confidence level and the mean risk model under the ASCE/SEI 43-05 criteria is large. It is suggested that a seismic risk model with a high confidence level, rather than the mean seismic probability risk model, should be used in the future.
Effects of initial height on the steady-state persistence probability of linear growth models
Chanphana, R.; Chatraphorn, P.; Dasgupta, C.
2013-12-01
The effects of the initial height on the temporal persistence probability of steady-state height fluctuations in up-down symmetric linear models of surface growth are investigated. We study the (1+1)-dimensional Family model and the (1+1)- and (2+1)-dimensional larger curvature (LC) model. Both the Family and LC models have up-down symmetry, so the positive and negative persistence probabilities in the steady state, averaged over all values of the initial height h0, are equal to each other. However, these two probabilities are not equal if one considers a fixed nonzero value of h0. Plots of the positive persistence probability for negative initial height versus time exhibit power-law behavior if the magnitude of the initial height is larger than the interface width at saturation. By symmetry, the negative persistence probability for positive initial height also exhibits the same behavior. The persistence exponent that describes this power-law decay decreases as the magnitude of the initial height is increased. The dependence of the persistence probability on the initial height, the system size, and the discrete sampling time is found to exhibit scaling behavior.
Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model
Vallejo, Jonathon; Hejduk, Matt; Stamey, James
2015-01-01
We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log(sub 10) transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
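The transformation described above can be sketched in a few lines: log10-transform the Pc values, rescale to [0, 1], and let an assumed floor produce the zero inflation. The floor value is an illustrative assumption, not one from the paper:

```python
import math

PC_FLOOR = 1e-10   # assumed "effective zero" floor for Pc (illustrative)

def scale_pc(pc, floor=PC_FLOOR):
    """log10-transform Pc and rescale to [0, 1]; any Pc at or below the
    floor maps exactly to 0, producing the zero inflation."""
    pc = max(pc, floor)
    return 1.0 - math.log10(pc) / math.log10(floor)

scaled = [scale_pc(v) for v in (1e-4, 1e-7, 0.0)]   # approx [0.6, 0.3, 0.0]
```

A zero-inflated Beta distribution is then a natural likelihood for such data: a point mass at 0 for the "effective zero" observations, and a Beta density on (0, 1) for the rest.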
Sulis, William H
2017-10-01
Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities, and has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.
Directory of Open Access Journals (Sweden)
Changhao Fan
2017-01-01
Full Text Available In modeling, only information from the deviation between the output of the support vector regression (SVR) model and the training sample is usually considered, whereas other prior information about the training sample, such as its probability distribution, is ignored. Probability distribution information describes the overall distribution of sample data in a training sample that contains different degrees of noise and potential outliers, and can help develop a high-accuracy model. To mine and use the probability distribution information of a training sample, a new support vector regression model that incorporates probability distribution information, the weighted SVR (PDISVR), is proposed. In the PDISVR model, the probability distribution of each sample is treated as a weight and introduced into the error coefficient and slack variables of SVR. Thus, both the deviation and the probability distribution information of the training sample are used in the PDISVR model to eliminate the influence of noise and outliers in the training sample and to improve predictive performance. Furthermore, examples with different degrees of noise were employed to demonstrate the performance of PDISVR, which was then compared with that of three SVR-based methods. The results showed that PDISVR performs better than the three other methods.
Ordinal probability effect measures for group comparisons in multinomial cumulative link models.
Agresti, Alan; Kateri, Maria
2017-03-01
We consider simple ordinal model-based probability effect measures for comparing distributions of two groups, adjusted for explanatory variables. An "ordinal superiority" measure summarizes the probability that an observation from one distribution falls above an independent observation from the other distribution, adjusted for explanatory variables in a model. The measure applies directly to normal linear models and to a normal latent variable model for ordinal response variables. It equals Φ(β/2) for the corresponding ordinal model that applies a probit link function to cumulative multinomial probabilities, for standard normal cdf Φ and effect β that is the coefficient of the group indicator variable. For the more general latent variable model for ordinal responses that corresponds to a linear model with other possible error distributions and corresponding link functions for cumulative multinomial probabilities, the ordinal superiority measure equals exp(β)/[1+exp(β)] with the log-log link and equals approximately exp(β/2)/[1+exp(β/2)] with the logit link, where β is the group effect. Another ordinal superiority measure generalizes the difference of proportions from binary to ordinal responses. We also present related measures directly for ordinal models for the observed response that need not assume corresponding latent response models. We present confidence intervals for the measures and illustrate with an example. © 2016, The International Biometric Society.
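The closed-form ordinal superiority measures quoted in the abstract can be computed directly from the group-effect coefficient β:

```python
import math

def superiority_probit(beta):
    """P(Y1 > Y2) under the probit cumulative link: Phi(beta / 2)."""
    return 0.5 * (1.0 + math.erf(beta / 2.0 / math.sqrt(2.0)))

def superiority_loglog(beta):
    """Exact measure under the log-log link: exp(beta) / (1 + exp(beta))."""
    return math.exp(beta) / (1.0 + math.exp(beta))

def superiority_logit(beta):
    """Approximate measure under the logit link: exp(beta/2) / (1 + exp(beta/2))."""
    e = math.exp(beta / 2.0)
    return e / (1.0 + e)
```

With β = 0 (no group effect) every link gives 0.5, i.e. an observation from either group is equally likely to fall above one from the other; positive β pushes the measure above 0.5.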
Franceschetti, Donald R; Gire, Elizabeth
2013-06-01
Quantum probability theory offers a viable alternative to classical probability, although there are some ambiguities inherent in transferring the quantum formalism to a less determined realm. A number of physicists are now looking at the applicability of quantum ideas to the assessment of physics learning, an area particularly suited to quantum probability ideas.
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status, and it is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results in which renal failure status was determined using surrogate measures, including: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which the model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized by using bootstrap methods to impute condition status from multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
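The bootstrap-style imputation described above — drawing condition status from model-derived probabilities rather than dichotomizing them — can be sketched as follows. The probabilities and sample are hypothetical, not the study's data:

```python
import random

random.seed(0)

def impute_prevalence(probs, n_boot=200):
    """For each bootstrap replicate, draw each patient's disease status as a
    Bernoulli variable with the model-derived probability, then average the
    resulting prevalence estimates across replicates."""
    n = len(probs)
    estimates = []
    for _ in range(n_boot):
        status = [1 if random.random() < p else 0 for p in probs]
        estimates.append(sum(status) / n)
    return sum(estimates) / n_boot

# Hypothetical model-derived probabilities of severe renal failure
probs = [0.02, 0.10, 0.85, 0.40, 0.05] * 200   # 1000 patients
est = impute_prevalence(probs)
true_mean = sum(probs) / len(probs)            # expected prevalence, 0.284
```

Because each draw preserves the full probability rather than a cutoff, the imputed prevalence is unbiased for the model's expected prevalence, which is the key contrast with categorization.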
Directory of Open Access Journals (Sweden)
Isabel C. Pérez Hoyos
2016-04-01
Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans' rising demand for water resources, so it is imperative to identify the location of GDEs in order to protect them. This paper develops a methodology to estimate the probability that an ecosystem is groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences for model accuracy and the overall proportion of locations where GDEs are found.
On New Cautious Structural Reliability Models in the Framework of Imprecise Probabilities
DEFF Research Database (Denmark)
Utkin, Lev V.; Kozine, Igor
2010-01-01
…in the framework of imprecise probabilities. The probabilities are initially vacuous, reflecting prior ignorance, and become more precise as the number of observations increases. The new imprecise structural reliability models are based on imprecise Bayesian inference and are imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models...
Modeling time-varying exposure using inverse probability of treatment weights
Grafféo, Nathalie; Latouche, Aurélien; Geskus, Ronald B.; Chevret, Sylvie
2017-01-01
For estimating the causal effect of treatment exposure on the occurrence of adverse events, inverse probability weights (IPW) can be used in marginal structural models to correct for time-dependent confounding. The R package ipw allows IPW estimation by modeling the relationship between the exposure
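The weight construction the abstract refers to can be sketched generically. This is plain IPW in Python, not the interface of the R package ipw, and the propensity scores below are hypothetical:

```python
def ipw_weights(treated, propensity, stabilized=True):
    """Inverse probability of treatment weights: 1/ps for the treated,
    1/(1-ps) for the untreated; optionally stabilized by the marginal
    treatment probability, as is usual in marginal structural models."""
    p_treat = sum(treated) / len(treated)
    weights = []
    for t, ps in zip(treated, propensity):
        w = 1.0 / ps if t else 1.0 / (1.0 - ps)
        if stabilized:
            w *= p_treat if t else (1.0 - p_treat)
        weights.append(w)
    return weights

treated = [1, 0, 1, 0]
propensity = [0.8, 0.3, 0.6, 0.5]   # hypothetical P(treatment | covariates)
w = ipw_weights(treated, propensity)
```

For time-varying exposure the same idea applies per time point, with the weights multiplied cumulatively along each subject's history.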
Aggregate and Individual Replication Probability within an Explicit Model of the Research Process
Miller, Jeff; Schwarz, Wolf
2011-01-01
We study a model of the research process in which the true effect size, the replication jitter due to changes in experimental procedure, and the statistical error of effect size measurement are all normally distributed random variables. Within this model, we analyze the probability of successfully replicating an initial experimental result by…
Wildland fire probabilities estimated from weather model-deduced monthly mean fire danger indices
Haiganoush K. Preisler; Shyh-Chin Chen; Francis Fujioka; John W. Benoit; Anthony L. Westerling
2008-01-01
The National Fire Danger Rating System indices deduced from a regional simulation weather model were used to estimate probabilities and numbers of large fire events on monthly and 1-degree grid scales. The weather model simulations and forecasts are ongoing experimental products from the Experimental Climate Prediction Center at the Scripps Institution of Oceanography...
A Monte Carlo model for determining copperhead probability of acquisition and maneuver
Starks, M.
1980-08-01
This report documents AMSAA's Probability of Acquisition and Maneuver (PAM) model. The model is used to develop performance estimates for COPPERHEAD and related weapon systems. A mathematical method for modeling the acquisition and maneuver portions of a COPPERHEAD trajectory is presented. In addition, the report contains a FORTRAN implementation of the model, a description of the required inputs, and a sample case with input and output.
Energy Technology Data Exchange (ETDEWEB)
Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Lin, Guang [Department of Mathematics and School of Mechanical Engineering, Purdue University, West Lafayette Indiana USA; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA
2017-03-01
In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
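The two-stage idea — surrogate-based Monte Carlo with re-evaluation of samples near the failure boundary by the original model — can be sketched with a toy one-dimensional problem. Both models, the threshold, and the band width are illustrative assumptions, not the paper's case study:

```python
import random

random.seed(1)

def true_model(x):
    """Stand-in for the expensive model; failure when the response > threshold."""
    return x ** 3 - 0.5 * x

def surrogate(x):
    """Cheap approximation of the true model (hypothetical, slightly biased)."""
    return x ** 3 - 0.5 * x + 0.02

def two_stage_failure_prob(n=20000, threshold=0.4, band=0.05):
    """Stage 1: evaluate every sample with the surrogate. Stage 2: re-evaluate
    with the true model only the samples whose surrogate response falls within
    a band around the failure boundary, correcting the surrogate's bias."""
    failures = 0
    for _ in range(n):
        x = random.uniform(-1.0, 1.0)
        g = surrogate(x)
        if abs(g - threshold) < band:   # near the boundary: use the true model
            g = true_model(x)
        if g > threshold:
            failures += 1
    return failures / n

p_fail = two_stage_failure_prob()
```

Only the boundary-band samples incur the expensive call, which is what makes the approach attractive for small failure probabilities with CPU-demanding models.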
Transition probabilities of health states for workers in Malaysia using a Markov chain model
Samsuddin, Shamshimah; Ismail, Noriszura
2017-04-01
The aim of our study is to estimate the transition probabilities of health states for workers in Malaysia who contribute to the Employment Injury Scheme under the Social Security Organization Malaysia, using a Markov chain model. Our study uses four health states (active, temporary disability, permanent disability and death) based on data collected from longitudinal studies of workers in Malaysia over 5 years. The transition probabilities vary by health state, age and gender. The results show that male employees are more likely to have higher transition probabilities to any health state than female employees. The transition probabilities can be used to predict the future health of workers as a function of current age, gender and health state.
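The standard maximum-likelihood estimator behind such a Markov chain model counts observed transitions: P(i→j) = n_ij / n_i. A minimal sketch with the four health states above and hypothetical worker histories:

```python
from collections import Counter

STATES = ["active", "temporary", "permanent", "death"]

def estimate_transitions(sequences):
    """Maximum-likelihood transition probabilities from observed yearly
    state sequences: P(i -> j) = n_ij / n_i."""
    counts = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    probs = {}
    for i in STATES:
        row_total = sum(counts[(i, j)] for j in STATES)
        if row_total:
            probs[i] = {j: counts[(i, j)] / row_total for j in STATES}
    return probs

# Hypothetical 5-year histories for three workers
seqs = [["active"] * 5,
        ["active", "active", "temporary", "active", "active"],
        ["active", "temporary", "permanent", "permanent", "death"]]
P = estimate_transitions(seqs)
```

Stratifying the sequences by age group and gender before counting yields the age- and gender-specific probabilities the study reports.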
The probabilities of one- and multi-track events for modeling radiation-induced cell kill
Energy Technology Data Exchange (ETDEWEB)
Schneider, Uwe; Vasi, Fabiano; Besserer, Juergen [University of Zuerich, Department of Physics, Science Faculty, Zurich (Switzerland); Radiotherapy Hirslanden, Zurich (Switzerland)
2017-08-15
In view of the clinical importance of hypofractionated radiotherapy, track models which are based on multi-hit events are currently being reinvestigated. These models are often criticized because it is believed that the probability of multi-track hits is negligible. In this work, the probabilities for one- and multi-track events are determined for different biological targets. The obtained probabilities can be used with nano-dosimetric cluster size distributions to obtain the parameters of track models. We quantitatively determined the probabilities for one- and multi-track events for 100, 500 and 1000 keV electrons, respectively. It is assumed that the single tracks are statistically independent and follow a Poisson distribution. Three different biological targets were investigated: (1) a DNA strand (2 nm scale); (2) two adjacent chromatin fibers (60 nm); and (3) fiber loops (300 nm). It was shown that the probabilities for one- and multi-track events increase with energy, size of the sensitive target structure, and dose. For a 2 x 2 x 2 nm^3 target, one-track events are around 10,000 times more frequent than multi-track events. If the size of the sensitive structure is increased to 100-300 nm, the probabilities for one- and multi-track events are of the same order of magnitude. It was shown that target theories can play a role in describing radiation-induced cell death if the targets are of the size of two adjacent chromatin fibers or fiber loops. The obtained probabilities can be used together with the nano-dosimetric cluster size distributions to determine model parameters for target theories. (orig.)
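Under the stated assumptions (statistically independent tracks, Poisson-distributed track counts), the one- and multi-track probabilities follow directly from the Poisson law. The mean track number below is a hypothetical value chosen to reproduce the roughly 10,000-fold ratio quoted for a nanometre-scale target:

```python
import math

def track_probabilities(mean_tracks):
    """The number of independent tracks through a target is Poisson with the
    given mean; return P(exactly one track) and P(two or more tracks)."""
    p_zero = math.exp(-mean_tracks)
    p_one = mean_tracks * p_zero
    p_multi = 1.0 - p_zero - p_one
    return p_one, p_multi

# Hypothetical mean of 2e-4 tracks for a nanometre-scale target: for small
# means the ratio p_one/p_multi approaches 2/mean, i.e. about 10,000 here.
p_one, p_multi = track_probabilities(2e-4)
```

For the larger 100-300 nm targets the mean track number is much higher, which is why the abstract finds one- and multi-track probabilities of the same order there.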
Modelling the Probability Density Function of IPTV Traffic Packet Delay Variation
Directory of Open Access Journals (Sweden)
Michal Halas
2012-01-01
Full Text Available This article deals with modelling the probability density function of IPTV traffic packet delay variation. This modelling is useful for efficient de-jitter buffer estimation. When an IP packet travels across a network, it experiences delay and delay variation. This variation is caused by routing, queueing systems and other influences such as the processing delay of network nodes. To separate these three types of delay variation, we need a way to measure each type separately. This work focuses on the delay variation caused by queueing systems, which has the greatest influence on the form of the probability density function.
Courey, Karim; Wright, Clara; Asfour, Shihab; Onar, Arzu; Bayliss, Jon; Ludwig, Larry
2009-01-01
In this experiment, an empirical model to quantify the probability of occurrence of an electrical short circuit from tin whiskers as a function of voltage was developed. This empirical model can be used to improve existing risk simulation models. FIB and TEM images of a tin whisker confirm the rare polycrystalline structure on one of the three whiskers studied. FIB cross-section of the card guides verified that the tin finish was bright tin.
Use of the AIC with the EM algorithm: A demonstration of a probability model selection technique
Energy Technology Data Exchange (ETDEWEB)
Glosup, J.G.; Axelrod, M.C. [Lawrence Livermore National Lab., CA (United States)
1994-11-15
The problem of discriminating between two potential probability models, a Gaussian distribution and a mixture of Gaussian distributions, is considered. The focus of our interest is a case where the models are potentially non-nested and the parameters of the mixture model are estimated through the EM algorithm. The AIC, which is frequently used as a criterion for discriminating between non-nested models, is modified to work with the EM algorithm and is shown to provide a model selection tool for this situation. A particular problem involving an infinite mixture distribution known as Middleton's Class A model is used to demonstrate the effectiveness and limitations of this method.
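The selection rule itself is compact once the EM algorithm has produced maximised log-likelihoods: compute AIC = 2k - 2 ln L for each candidate and keep the smaller. A minimal sketch with hypothetical log-likelihood values, not Middleton's Class A data:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: 2k - 2 ln L; smaller is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical maximised log-likelihoods returned by the EM fit:
aic_gaussian = aic(log_likelihood=-1520.3, n_params=2)  # mean, variance
aic_mixture = aic(log_likelihood=-1498.7, n_params=5)   # two means, two variances, one weight
preferred = "mixture" if aic_mixture < aic_gaussian else "gaussian"
```

The paper's contribution is the modification needed to make this comparison valid when the mixture likelihood comes from EM; the arithmetic above is only the final step.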
Assessment of different models for computing the probability of a clear line of sight
Bojin, Sorin; Paulescu, Marius; Badescu, Viorel
2017-12-01
This paper is focused on modeling the morphological properties of cloud fields in terms of the probability of a clear line of sight (PCLOS). PCLOS is defined as the probability that a line of sight between the observer and a given point of the celestial vault passes freely, without intersecting a cloud. A variety of PCLOS models, assuming hemispheric, semi-ellipsoidal and ellipsoidal cloud shapes, are tested. The effective parameters (cloud aspect ratio and absolute cloud fraction) are extracted from high-resolution series of sunshine number measurements. The performance of the PCLOS models is evaluated from the perspective of their ability to retrieve the point cloudiness. The advantages and disadvantages of the tested models are discussed, aiming at a simplified parameterization of PCLOS models.
Tian, Chuan; Sun, Di-Hua
2010-12-01
Considering the effects that the probability of traffic interruption and the friction between two lanes have on car-following behaviour, this paper establishes a new two-lane microscopic car-following model. Based on this microscopic model, a new macroscopic model is deduced via the relevance relation between microscopic and macroscopic scale parameters for two-lane traffic flow. Terms related to lane changing are added to the continuity equations and velocity dynamic equations to investigate the lane change rate. Numerical results verify that the proposed model can efficiently reflect the effect of the probability of traffic interruption on shock waves, rarefaction waves and lane-changing behaviour on two-lane freeways. The model has also been applied to reproducing some complex traffic phenomena caused by traffic-accident interruption.
Modelling detection probabilities to evaluate management and control tools for an invasive species
Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.
2010-01-01
For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By
DEFF Research Database (Denmark)
Asmussen, Søren; Albrecher, Hansjörg
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber-Shiu functions and dependence.
Cool, Geneviève; Lebel, Alexandre; Sadiq, Rehan; Rodriguez, Manuel J
2015-12-01
The regional variability of the probability of occurrence of high total trihalomethane (TTHM) levels was assessed using multilevel logistic regression models that incorporate environmental and infrastructure characteristics. The models were structured in a three-level hierarchical configuration: samples (first level), drinking water utilities (DWUs, second level) and natural regions, an ecological hierarchical division from the Quebec ecological framework of reference (third level). They considered six independent variables: precipitation, temperature, source type, seasons, treatment type and pH. The average probability of TTHM concentrations exceeding the targeted threshold was 18.1%. The probability was influenced by seasons, treatment type, precipitations and temperature. The variance at all levels was significant, showing that the probability of TTHM concentrations exceeding the threshold is most likely to be similar if located within the same DWU and within the same natural region. However, most of the variance initially attributed to natural regions was explained by treatment types and clarified by spatial aggregation on treatment types. Nevertheless, even after controlling for treatment type, there was still significant regional variability of the probability of TTHM concentrations exceeding the threshold. Regional variability was particularly important for DWUs using chlorination alone since they lack the appropriate treatment required to reduce the amount of natural organic matter (NOM) in source water prior to disinfection. Results presented herein could be of interest to authorities in identifying regions with specific needs regarding drinking water quality and for epidemiological studies identifying geographical variations in population exposure to disinfection by-products (DBPs).
Modelling soft error probability in firmware: A case study | Kourie ...
African Journals Online (AJOL)
This case study involves an analysis of firmware that controls explosions in mining operations. The purpose is to estimate the probability that external disruptive events (such as electro-magnetic interference) could drive the firmware into a state which results in an unintended explosion. Two probabilistic models are built, ...
2016-12-01
Modeling of Plutonium Ionization Probabilities for Use in Nuclear Forensic Analysis by Resonance Ionization
Recent advancements in nuclear forensics have enabled the use of lasers via resonance ... release. Distribution is unlimited. ... bandwidth, irradiance, relative timing and spatial distribution, all of which allow for optimization of the RIMS performance [6]. The LION system uses
DEFF Research Database (Denmark)
Falk, Anne Katrine Vinther; Gryning, Sven-Erik
1997-01-01
In this model for atmospheric dispersion particles are simulated by the Langevin Equation, which is a stochastic differential equation. It uses the probability density function (PDF) of the vertical velocity fluctuations as input. The PDF is constructed as an expansion after Hermite polynomials. ...
Energy Technology Data Exchange (ETDEWEB)
Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston
2011-01-01
This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that breakdown duration can be characterized by a hazard model. By generating random flow breakdowns at various levels and capturing the traffic characteristics at the onset of breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
Evans, Jason; Sullivan, Jack
2011-01-01
A priori selection of models for use in phylogeny estimation from molecular sequence data is increasingly important as the number and complexity of available models increases. The Bayesian information criterion (BIC) and the derivative decision-theoretic (DT) approaches rely on a conservative approximation to estimate the posterior probability of a given model. Here, we extended the DT method by using reversible jump Markov chain Monte Carlo approaches to directly estimate model probabilities for an extended candidate pool of all 406 special cases of the general time reversible + Γ family. We analyzed 250 diverse data sets in order to evaluate the effectiveness of the BIC approximation for model selection under the BIC and DT approaches. Model choice under DT differed between the BIC approximation and direct estimation methods for 45% of the data sets (113/250), and differing model choice resulted in significantly different sets of trees in the posterior distributions for 26% of the data sets (64/250). The model with the lowest BIC score differed from the model with the highest posterior probability in 30% of the data sets (76/250). When the data indicate a clear model preference, the BIC approximation works well enough to result in the same model selection as with directly estimated model probabilities, but a substantial proportion of biological data sets lack this characteristic, which leads to selection of underparametrized models.
Austin, Samuel H.; Nelms, David L.
2017-01-01
Climate change raises concern that risks of hydrological drought may be increasing. We estimate hydrological drought probabilities for rivers and streams in the United States (U.S.) using maximum likelihood logistic regression (MLLR). Streamflow data from winter months are used to estimate the chance of hydrological drought during summer months. Daily streamflow data collected from 9,144 stream gages from January 1, 1884 through January 9, 2014 provide hydrological drought streamflow probabilities for July, August, and September as functions of streamflows during October, November, December, January, and February, estimating outcomes 5-11 months ahead of their occurrence. Few drought prediction methods exploit temporal links among streamflows. We find MLLR modeling of drought streamflow probabilities exploits the explanatory power of temporally linked water flows. MLLR models with strong correct classification rates were produced for streams throughout the U.S. One ad hoc test of correct prediction rates of September 2013 hydrological droughts exceeded 90% correct classification. Some of the best-performing models coincide with areas of high concern including the West, the Midwest, Texas, the Southeast, and the Mid-Atlantic. Using hydrological drought MLLR probability estimates in a water management context can inform understanding of drought streamflow conditions, provide warning of future drought conditions, and aid water management decision making.
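The MLLR form amounts to a logistic transform of a linear score in the winter streamflows. A minimal sketch, with hypothetical coefficients rather than fitted values from the 9,144 gages:

```python
import math

def drought_probability(coefs, winter_flows):
    """Logistic-regression (MLLR) form: P = 1 / (1 + exp(-(b0 + sum b_i * x_i))).
    Coefficients and standardised flows here are hypothetical."""
    z = coefs[0] + sum(b * x for b, x in zip(coefs[1:], winter_flows))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical intercept and Oct-Feb flow coefficients for one stream gage;
# negative flow coefficients mean lower winter flows raise summer drought risk.
coefs = [2.0, -0.8, -0.5, -0.3, -0.2, -0.1]
p_drought = drought_probability(coefs, [1.2, 1.0, 0.9, 1.1, 1.3])
```

This is only the prediction step; the study's contribution is fitting such coefficients by maximum likelihood per stream and validating the classification rates.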
Probability distribution of residence times of grains in models of rice piles.
Pradhan, Punyabrata; Dhar, Deepak
2006-02-01
We study the probability distribution of the residence time of a grain at a site, and of its total residence time inside a pile, in different rice pile models. The tails of these distributions are dominated by the grains that get deeply buried in the pile. We show that, for a pile of size L, the probabilities that the residence time at a site or the total residence time is greater than t both decay as 1/[t(ln t)^x] for L^omega << t << exp(L^gamma), where gamma >= 1, and the values of x and omega in the two cases are different. In the Oslo rice pile model we find that the probability of the residence time T_i at a site i being greater than or equal to t is a nonmonotonic function of L for fixed t and does not obey simple scaling. For the model in d dimensions, we show that the probability of the minimum slope configuration in the steady state, for large L, varies as exp(-kappa L^(d+2)), where kappa is a constant, and hence gamma = d + 2.
Computational modeling of the effects of amyloid-beta on release probability at hippocampal synapses
Directory of Open Access Journals (Sweden)
Armando eRomani
2013-01-01
Full Text Available The role of amyloid-beta (Aβ in brain function and in the pathogenesis of Alzheimer’s disease remains elusive. Recent publications reported that an increase in Aβ concentration perturbs pre-synaptic release in hippocampal neurons. In particular, it was shown in vitro that Aβ is an endogenous regulator of synaptic transmission at the CA3-CA1 synapse, enhancing its release probability. How this synaptic modulator influences neuronal output during physiological stimulation patterns, such as those elicited in vivo, is still unknown. Using a realistic model of hippocampal CA1 pyramidal neurons, we first implemented this Aβ-induced enhancement of release probability and validated the model by reproducing the experimental findings. We then demonstrated that this synaptic modification can significantly alter synaptic integration properties in a wide range of physiologically relevant input frequencies (from 5 to 200 Hz. Finally, we used natural input patterns, obtained from CA3 pyramidal neurons in vivo during free exploration of rats in an open field, to investigate the effects of enhanced Aβ on synaptic release under physiological conditions. The model shows that the CA1 neuronal response to these natural patterns is altered in the increased-Aβ condition, especially for frequencies in the theta and gamma ranges. These results suggest that the perturbation of release probability induced by increased Aβ can significantly alter the spike probability of CA1 pyramidal neurons and thus contribute to abnormal hippocampal function during Alzheimer’s disease.
A fault tree model to assess probability of contaminant discharge from shipwrecks.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I
2014-11-15
Shipwrecks on the sea floor around the world may contain hazardous substances that can harm the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on the prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular for estimating the annual probability of hazardous substance discharge. The probability of discharge is assessed using fault tree analysis, which facilitates quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
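A fault tree combines basic-event probabilities through AND/OR gates; assuming independent events, the gate algebra is a few lines. The events and annual probabilities below are hypothetical illustrations, not the study's identified hazardous events:

```python
def or_gate(probs):
    """P(at least one event occurs), for independent basic events."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(probs):
    """P(all events occur), for independent basic events."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

# Hypothetical annual basic-event probabilities for a single wreck:
# discharge requires an opening (corrosion OR trawling damage) AND a
# failed containment barrier.
hull_opening = or_gate([0.05, 0.02])
p_discharge = and_gate([hull_opening, 0.01])
```

Uncertainty and sensitivity analyses, as in the study, would propagate distributions rather than point values through the same gate structure.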
Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, William Louis; Tucker, W. Troy (Applied Biomathematics, Setauket, NY); Zhang, Jianzhong (Iowa State University, Ames, IA); Ginzburg, Lev (Applied Biomathematics, Setauket, NY); Berleant, Daniel J. (Iowa State University, Ames, IA); Ferson, Scott (Applied Biomathematics, Setauket, NY); Hajagos, Janos (Applied Biomathematics, Setauket, NY); Nelsen, Roger B. (Lewis & Clark College, Portland, OR)
2004-10-01
This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
Directory of Open Access Journals (Sweden)
Chenquan Gan
2014-01-01
Full Text Available Vaccination is one of the most effective measures for suppressing the spread of computer viruses, and the bilinear incidence rate assumption of the majority of previous models, which is a good first approximation of the general incidence rate, is in disagreement with reality. In this paper, a new dynamical model with two kinds of generic nonlinear probabilities (incidence rate and vaccination probability) is established. An exhaustive mathematical analysis of this model shows that (a) there are two equilibria, the virus-free equilibrium and the viral equilibrium, and (b) the virus-free (or viral) equilibrium is globally asymptotically stable when the basic reproduction number is less (or greater) than unity. The analysis of the basic reproduction number is also included. Additionally, some numerical examples are given to illustrate the main results, from which it can be seen that the generic nonlinear vaccination helps to strengthen computer security.
Balekian, Alex A; Silvestri, Gerard A; Simkovich, Suzanne M; Mestaz, Peter J; Sanders, Gillian D; Daniel, Jamie; Porcel, Jackie; Gould, Michael K
2013-12-01
Management of pulmonary nodules depends critically on the probability of malignancy. Models to estimate probability have been developed and validated, but most clinicians rely on judgment. The aim of this study was to compare the accuracy of clinical judgment with that of two prediction models. Physician participants reviewed up to five clinical vignettes, selected at random from a larger pool of 35 vignettes, all based on actual patients with lung nodules of known final diagnosis. Vignettes included clinical information and a representative slice from computed tomography. Clinicians estimated the probability of malignancy for each vignette. To examine agreement with models, we calculated intraclass correlation coefficients (ICC) and kappa statistics. To examine accuracy, we compared areas under the receiver operator characteristic curve (AUC). Thirty-six participants completed 179 vignettes, 47% of which described patients with malignant nodules. Agreement between participants and models was fair for the Mayo Clinic model (ICC, 0.37; 95% confidence interval [CI], 0.23-0.50) and moderate for the Veterans Affairs model (ICC, 0.46; 95% CI, 0.34-0.57). There was no difference in accuracy between participants (AUC, 0.70; 95% CI, 0.62-0.77) and the Mayo Clinic model (AUC, 0.71; 95% CI, 0.62-0.80; P = 0.90) or the Veterans Affairs model (AUC, 0.72; 95% CI, 0.64-0.80; P = 0.54). In this vignette-based study, clinical judgment and models appeared to have similar accuracy for lung nodule characterization, but agreement between judgment and the models was modest, suggesting that qualitative and quantitative approaches may provide complementary information.
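The AUC used for the accuracy comparison has a direct probabilistic reading: the chance that a randomly chosen malignant nodule receives a higher probability estimate than a randomly chosen benign one. A minimal sketch with hypothetical clinician estimates:

```python
def auc(scores_malignant, scores_benign):
    """AUC as P(random malignant case scored above random benign case),
    counting ties as one half (the Mann-Whitney formulation)."""
    wins = 0.0
    for m in scores_malignant:
        for b in scores_benign:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(scores_malignant) * len(scores_benign))

# Hypothetical probability-of-malignancy estimates from vignette reviews:
estimated_auc = auc([0.8, 0.6, 0.7], [0.3, 0.6, 0.2])
```

An AUC near 0.70, as reported for both the clinicians and the models, means a randomly chosen malignant nodule outranks a benign one about 70% of the time.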
Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng
2013-01-01
New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to the dust storm definition with bivariate, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
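The copula construction can be sketched directly: a copula joins the two marginals (maximum wind speed and duration), and the joint "AND" return period follows from the joint exceedance probability. The Gumbel family, quantiles and dependence parameter below are hypothetical choices for illustration, not the fitted values for the 79 Inner Mongolia events:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving independence."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def joint_return_period_and(u, v, theta, events_per_year=1.0):
    """Return period (years) of both variables exceeding their marginal
    quantiles u and v: T = 1 / (mu * P(U > u, V > v))."""
    p_both_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / (events_per_year * p_both_exceed)

# Hypothetical: wind speed and duration both at their 90% quantiles,
# with moderate upper-tail dependence (theta = 2).
T_and = joint_return_period_and(0.9, 0.9, theta=2.0)
```

With positive dependence the joint return period sits between the univariate value (10 years here) and the independence value (100 years), which illustrates why univariate return periods can misstate the risk.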
Khokhlov, A.; Hulot, G.
2013-04-01
We introduce and apply the concept of 2-D probability uniformization to palaeomagnetic directional data. 2-D uniformization belongs to a very general class of probability transformations that map multivariate probability distributions into multivariate uniform distributions. Our goal is to produce joint tests of directional data sets assumed generated by a common statistical model, but with different sampling distributions. This situation is encountered when testing so-called Giant Gaussian Process (GGP) models of the Earth's magnetic field against palaeomagnetic directional data collected from different geographical sites, the predicted sampling distributions being site-dependent. To introduce the concept, we first consider 2-D Gaussian distributions in the plane R^2, before turning to Angular Gaussian and more general 2-D distributions on the unit sphere S^2. We detail the approach when applied to the 2-D distributions expected for palaeomagnetic directional data, if these are to be consistent with a GGP model while affected by some Fisherian error. We finally provide some example applications to real palaeomagnetic data. In particular, we show how subtle inhomogeneities in the distribution of the data, such as the so-called right-handed effect in palaeomagnetism, can be detected. This effect, whether of geomagnetic origin or not, affects the Brunhes data in such a way that they cannot easily be reconciled with GGP models originally built with the help of these data. 2-D probability uniformization is a powerful tool which, we argue, could be used to build and test better GGP models of the mean palaeomagnetic field and palaeosecular variation. The software designed in the course of this study is available upon request from the authors. It can also be downloaded from http://geomag.ipgp.fr/download/PSVT.tgz.
Issues in estimating probability of detection of NDT techniques - A model assisted approach.
Rentala, Vamsi Krishna; Mylavarapu, Phani; Gautam, Jai Prakash
2018-02-13
In order to successfully implement Damage Tolerance (DT) methodology for aero-engines, Non-Destructive Testing (NDT) techniques are vital for assessing the remaining life of the component. Probability of Detection (POD), a standard measure of NDT reliability, is usually estimated as per the MIL-HDBK-1823A standard. The POD of any NDT technique can be estimated by both experimental and model-assisted methods. POD depends on many factors such as material, geometry, defect characteristics, inspection technique, etc. These requirements place enormous limitations on generating experimental POD curves, and hence Model Assisted Probability of Detection (MAPOD) curves are currently in vogue. In this study, MAPOD approaches were demonstrated by addressing various issues related to the selection of crack size distributions, challenges involved in censoring and regression, estimation of distribution parameters, etc. Ultrasonic testing of volumetric defects has been identified as a platform to discuss the challenges involved. A COMSOL Multiphysics based FEM numerical model developed to simulate the ultrasonic response from a Ti-6Al-4V cylindrical block has been validated experimentally. Further, the individual ultrasonic responses from various Flat Bottom Hole (FBH) defects following a lognormal distribution have been generated using the numerical model. The a90/95 value (the flaw size detected with 90% probability at 95% confidence) obtained from the POD curve showed that a90/95 increased with an increase in decision threshold. Copyright © 2018 Elsevier B.V. All rights reserved.
Scaling Qualitative Probability
Burgin, Mark
2017-01-01
There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...
Blocking probability in the hose-model optical VPN with different number of wavelengths
Roslyakov, Alexander V.
2017-04-01
Connection setup with guaranteed quality of service (QoS) in an optical virtual private network (OVPN) is a major goal for network providers. To support this, we propose a QoS-based OVPN connection set-up mechanism over a WDM network to the end customer. The proposed WDM network model can be specified in terms of a QoS parameter such as blocking probability. We estimated this QoS parameter based on the hose-model OVPN. In this mechanism the OVPN connections can also be created or deleted according to the availability of wavelengths in the optical path. In this paper we consider the impact of the number of wavelengths on the computation of blocking probability. The goal of the work is to dynamically provide the best OVPN connection during frequent arrival of connection requests with QoS requirements.
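As an illustration of how blocking probability depends on the number of wavelengths, the classic Erlang B recursion (a standard teletraffic sketch under an assumed offered load, not the paper's hose-model computation) shows blocking falling as wavelengths are added:

```python
def erlang_b(offered_erlangs, n_wavelengths):
    """Erlang B blocking probability via the standard stable recursion:
    B(0) = 1, B(k) = A*B(k-1) / (k + A*B(k-1))."""
    b = 1.0
    for k in range(1, n_wavelengths + 1):
        b = (offered_erlangs * b) / (k + offered_erlangs * b)
    return b

# Hypothetical offered load of 10 Erlang of OVPN connection requests:
# blocking falls steeply as the number of wavelengths per link grows.
blocking = {w: erlang_b(10.0, w) for w in (8, 16, 32)}
```

The hose model adds per-endpoint aggregate constraints on top of this kind of per-link blocking calculation.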
Smith, Leonard A.
2010-05-01
This contribution concerns "deep" or "second-order" uncertainty, such as the uncertainty in our probability forecasts themselves. It asks the question: "Is it rational to take (or offer) bets using model-based probabilities as if they were objective probabilities?" If not, what alternative approaches for determining odds, perhaps non-probabilistic odds, might prove useful in practice, given that we know our models are imperfect? We consider the case where the aim is to provide sustainable odds: not to produce a profit but merely to rationally expect to break even in the long run. In other words, to run a quantified risk of ruin that is relatively small. Thus the cooperative insurance schemes of coastal villages provide a more appropriate parallel than a casino. A "better" probability forecast would lead to lower premiums charged and less volatile fluctuations in the cash reserves of the village. Note that the Bayesian paradigm does not constrain one to interpret model distributions as subjective probabilities, unless one believes the model to be empirically adequate for the task at hand. In geophysics, this is rarely the case. When a probability forecast is interpreted as the objective probability of an event, the odds on that event can be easily computed as one divided by the probability of the event, and one need not favour taking either side of the wager. (Here we are using "odds-for" not "odds-to", the difference being whether or not the stake is returned; odds of one to one are equivalent to odds of two for one.) The critical question is how to compute sustainable odds based on information from imperfect models. We suggest that this breaks the symmetry between the odds-on an event and the odds-against it. While a probability distribution can always be translated into odds, interpreting the odds on a set of events might result in "implied-probabilities" that sum to more than one. And/or the set of odds may be incomplete, not covering all events. We ask
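The odds-for computation described above is direct: odds-for equal one divided by the probability, and implied probabilities recovered from quoted odds can sum to more than one. A toy example (the market outcomes and quoted odds are hypothetical):

```python
# "Odds-for" an event = 1 / probability (stake returned): p = 0.25 -> 4-for-1.
def odds_for(p: float) -> float:
    return 1.0 / p

# Implied probabilities recovered from a set of quoted odds-for.
# With imperfect-model odds the implied probabilities can sum to more than one,
# breaking the symmetry between odds-on and odds-against.
quoted_odds = {"A": 2.0, "B": 2.0, "C": 5.0}   # hypothetical three-outcome market
implied = {k: 1.0 / v for k, v in quoted_odds.items()}
print(sum(implied.values()))  # 0.5 + 0.5 + 0.2 = 1.2 > 1
```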
Directory of Open Access Journals (Sweden)
Onur Satir
2016-09-01
Forest fires are one of the most important factors in environmental risk assessment and the main cause of forest destruction in the Mediterranean region. Forestlands have a number of known benefits, such as decreasing soil erosion and containing wildlife habitats. Additionally, forests are important players in the carbon cycle and in mitigating climate change impacts. This paper discusses forest fire probability mapping of a Mediterranean forestland using a multiple data assessment technique. An artificial neural network (ANN) method was used to map forest fire probability in the Upper Seyhan Basin (USB) in Turkey. A multi-layer perceptron (MLP) approach based on the back propagation algorithm was applied to physical, anthropogenic, climate and fire occurrence datasets. The result was validated using relative operating characteristic (ROC) analysis. The coefficient of accuracy of the MLP was 0.83. Landscape features input to the model were assessed statistically to identify the most descriptive factors for forest fire probability mapping using the Pearson correlation coefficient. Landscape features such as elevation (R = −0.43), tree cover (R = 0.93) and temperature (R = 0.42) were strongly correlated with forest fire probability in the USB region.
Directory of Open Access Journals (Sweden)
Samy Ismail Elmahdy
2016-01-01
In the current study, Penang Island, one of several mountainous areas in Malaysia that is often subjected to landslide hazard, was chosen for further investigation. A multi-criteria evaluation with a spatial probability weighted approach and model builder was applied to map and analyse landslides in Penang Island. A set of automated algorithms was used to construct new essential geological and morphometric thematic maps from remote sensing data. The maps were ranked using the weighted probability spatial model based on their contribution to the landslide hazard. Results showed that sites at an elevation of 100–300 m, with steep slopes of 10°–37° and slope direction (aspect) in the E and SE directions, were areas of very high and high probability for landslide occurrence; the total areas were 21.393 km2 (11.84%) and 58.690 km2 (32.48%), respectively. The obtained map was verified by comparing variogram models of the mapped and the observed landslide locations and showed a strong correlation with the locations of observed landslides, indicating that the proposed method can successfully predict the otherwise unpredictable landslide hazard. The method is time and cost effective and can be used as a reference by geological and geotechnical engineers.
Dissolution Model Development: Formulation Effects and Filter Complications
DEFF Research Database (Denmark)
Berthelsen, Ragna; Holm, Rene; Jacobsen, Jette
2016-01-01
This study describes various complications related to sample preparation (filtration) during development of a dissolution method intended to discriminate among different fenofibrate immediate-release formulations. Several dissolution apparatus and sample preparation techniques were tested. The flow… the mini paddle dissolution method demonstrates that sample preparation influenced the results. The investigations show that excipients from the formulations directly affected the drug–filter interaction, thereby affecting the dissolution profiles and the ability to predict the in vivo data… With the tested drug–formulation combination, the best in vivo–in vitro correlation was found after filtration of the dissolution samples through 0.45-μm hydrophobic PTFE membrane filters.
A Stochastic Tick-Borne Disease Model: Exploring the Probability of Pathogen Persistence.
Maliyoni, Milliward; Chirove, Faraimunashe; Gaff, Holly D; Govinder, Keshlan S
2017-07-13
We formulate and analyse a stochastic epidemic model for the transmission dynamics of a tick-borne disease in a single population using a continuous-time Markov chain approach. The stochastic model is based on an existing deterministic metapopulation tick-borne disease model. We compare the disease dynamics of the deterministic and stochastic models in order to determine the effect of randomness in tick-borne disease dynamics. The probability of disease extinction and that of a major outbreak are computed and approximated using the multitype Galton-Watson branching process and numerical simulations, respectively. Analytical and numerical results show some significant differences in model predictions between the stochastic and deterministic models. In particular, we find that a disease outbreak is more likely if the disease is introduced by infected deer as opposed to infected ticks. These insights demonstrate the importance of host movement in the expansion of tick-borne diseases into new geographic areas.
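The branching-process approximation of outbreak probability mentioned above reduces to finding the smallest fixed point of the offspring probability generating function, q = G(q); the outbreak probability is then 1 − q. A single-type sketch with Poisson offspring (an illustrative choice, not the paper's multitype deer/tick model):

```python
import math

def extinction_prob(mean_offspring: float, iters: int = 200) -> float:
    """Extinction probability of a Galton-Watson process with Poisson(m)
    offspring: smallest root of q = exp(m*(q - 1)), by fixed-point iteration
    from q = 0, which converges to the minimal fixed point."""
    q = 0.0
    for _ in range(iters):
        q = math.exp(mean_offspring * (q - 1.0))
    return q

# Subcritical (m < 1): extinction is certain; supercritical: a major outbreak
# occurs with probability 1 - q.
print(extinction_prob(0.8))   # 1.0
print(extinction_prob(2.0))   # ~0.203
```

In the multitype setting of the paper, q becomes a vector and G a vector of generating functions, one per introduction type (infected tick vs. infected deer), which is why the outbreak probability depends on how the disease is introduced.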
Directory of Open Access Journals (Sweden)
Gabriel Rodríguez
2016-06-01
Following Xu and Perron (2014), I applied the extended RLS model to the daily stock market returns of Argentina, Brazil, Chile, Mexico and Peru. This model replaces the constant probability of level shifts for the entire sample with varying probabilities that record periods with extremely negative returns. Furthermore, it incorporates a mean reversion mechanism with which the magnitude and the sign of the level shift component vary in accordance with past level shifts that deviate from the long-term mean. Therefore, four RLS models are estimated: the basic RLS, the RLS with varying probabilities, the RLS with mean reversion, and a combined RLS model with mean reversion and varying probabilities. The results show that the estimated parameters are highly significant, especially those of the mean reversion model. An analysis of ARFIMA and GARCH models is also performed in the presence of level shifts, which shows that once these shifts are taken into account in the modeling, the long memory characteristics and GARCH effects disappear. I also find that the predictive performance of the RLS models is superior to that of classic long-memory models such as the ARFIMA(p,d,q), GARCH and FIGARCH models. The evidence indicates that, with rare exceptions, the RLS models (in all their variants) show the best performance or belong to the 10% of models in the Model Confidence Set (MCS); only occasionally do the GARCH and ARFIMA models appear to dominate. When volatility is measured by squared returns, the great exception is Argentina, where a dominance of GARCH and FIGARCH models is observed.
Kurugol, Sila; Freiman, Moti; Afacan, Onur; Perez-Rossello, Jeannette M; Callahan, Michael J; Warfield, Simon K
2016-08-01
Quantitative diffusion-weighted MR imaging (DW-MRI) of the body enables characterization of the tissue microenvironment by measuring variations in the mobility of water molecules. The diffusion signal decay model parameters are increasingly used to evaluate various diseases of abdominal organs such as the liver and spleen. However, previous signal decay models (i.e., mono-exponential, bi-exponential intra-voxel incoherent motion (IVIM) and stretched exponential models) only provide insight into the average of the distribution of the signal decay rather than explicitly describing the entire range of diffusion scales. In this work, we propose a probability distribution model of incoherent motion that uses a mixture of Gamma distributions to fully characterize the multi-scale nature of diffusion within a voxel. Further, we improve the robustness of the distribution parameter estimates by integrating a spatial homogeneity prior into the probability distribution model of incoherent motion (SPIM) and by using the fusion bootstrap solver (FBM) to estimate the model parameters. We evaluated the improvement in quantitative DW-MRI analysis achieved with the SPIM model in terms of accuracy, precision and reproducibility of parameter estimation in both simulated data and in 68 abdominal in-vivo DW-MRIs. Our results show that the SPIM model not only substantially reduced parameter estimation errors by up to 26%, but also significantly improved the robustness of the parameter estimates (paired Student's t-test, p < 0.0001) by reducing the coefficient of variation (CV) of estimated parameters compared to those produced by previous models. In addition, the SPIM model improves the reproducibility of the parameter estimates for both intra-session (up to 47%) and inter-session (up to 30%) estimates compared to those generated by previous models. Thus, the SPIM model has the potential to improve the accuracy, precision and robustness of quantitative abdominal DW-MRI analysis for clinical applications.
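A mixture of Gamma distributions of diffusivities yields a closed-form signal decay, because the Laplace transform of a Gamma(k, θ) pdf is (1 + bθ)^(−k). The sketch below evaluates that mixture decay; the weights, shapes and scales are illustrative assumptions, not values from the paper.

```python
import numpy as np

def spim_signal(b, weights, shapes, scales):
    """Normalized DW-MRI signal for a mixture-of-Gamma distribution of
    diffusivities D: S(b)/S0 = sum_i w_i * (1 + b*theta_i)^(-k_i),
    the Laplace transform of each Gamma component evaluated at b."""
    b = np.asarray(b, dtype=float)
    s = np.zeros_like(b)
    for w, k, theta in zip(weights, shapes, scales):
        s += w * (1.0 + b * theta) ** (-k)
    return s

b_vals = np.array([0.0, 200.0, 800.0])   # b-values in s/mm^2
# Two hypothetical components: slow (diffusion-like) and fast (perfusion-like).
sig = spim_signal(b_vals, [0.7, 0.3], [2.0, 2.0], [1e-3, 1e-2])
print(sig)  # starts at 1.0 and decays monotonically with b
```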
Directory of Open Access Journals (Sweden)
L. Ya. Klepper
2015-01-01
Breast cancer (BC) is a common malignant disease of the female reproductive system. Currently there are many treatment strategies, depending on tumor location and clinical data. Radiation therapy is an important component of a comprehensive treatment program for BC. Although a fractionation regime of 1.8–2 Gy daily fractions to a total of 50 Gy in 5 weeks is often used, attempts to find new fractionation regimes continue. Published results on hypofractionated regimes show that approaches to the dose per fraction, the number of fractions and the treatment time differ. Doses per fraction have ranged from 2.66 to 3.2 Gy, and more recently regimes with a single dose of 6 Gy have been tested. Empirical data from these studies are important, but one must also be aware of the possibility of applying mathematical methods for computing the probability of tumor cure and of the occurrence of radiation complications. An individual approach to each patient is necessary, selecting the optimal fractionation mode for particular clinical cases. In addition, the search for and improvement of fractionation regimes continues, and the results of clinical trials can tell a lot about how good a chosen model is. This work considers the possibility of applying the synthesized mathematical (SM) model, intended for the description of NTCP, to the description of the probability of local control in early-stage BC.
Gruber, Susan; Logan, Roger W; Jarrín, Inmaculada; Monge, Susana; Hernán, Miguel A
2015-01-15
Inverse probability weights used to fit marginal structural models are typically estimated using logistic regression. However, a data-adaptive procedure may be able to better exploit information available in measured covariates. By combining predictions from multiple algorithms, ensemble learning offers an alternative to logistic regression modeling to further reduce bias in estimated marginal structural model parameters. We describe the application of two ensemble learning approaches to estimating stabilized weights: super learning (SL), an ensemble machine learning approach that relies on V-fold cross validation, and an ensemble learner (EL) that creates a single partition of the data into training and validation sets. Longitudinal data from two multicenter cohort studies in Spain (CoRIS and CoRIS-MD) were analyzed to estimate the mortality hazard ratio for initiation versus no initiation of combined antiretroviral therapy among HIV-positive subjects. Both ensemble approaches produced hazard ratio estimates further away from the null, and with tighter confidence intervals, than logistic regression modeling. Computation time for EL was less than half that of SL. We conclude that ensemble learning using a library of diverse candidate algorithms offers an alternative to parametric modeling of inverse probability weights when fitting marginal structural models. With large datasets, EL provides a rich search over the solution space in less time than SL with comparable results.
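The stabilized weight itself is the ratio P(A = a) / P(A = a | L). The sketch below computes it for a single point treatment with the true propensity plugged in to stay self-contained; the paper's longitudinal product over time points, and the logistic-regression or ensemble estimation of the denominator, are omitted. The confounder model and coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
L = rng.normal(size=n)                         # a measured confounder
p_treat = 1.0 / (1.0 + np.exp(-0.5 * L))       # true P(A=1 | L), assumed known here
A = (rng.random(n) < p_treat).astype(float)

# Stabilized weight for a point treatment: sw = P(A=a) / P(A=a | L).
# (The paper estimates the denominator with logistic regression or an ensemble;
# here the true propensity keeps the sketch self-contained.)
p_marg = A.mean()
sw = np.where(A == 1, p_marg / p_treat, (1 - p_marg) / (1 - p_treat))

# A standard diagnostic: stabilized weights should average close to 1.
print(sw.mean())
```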
Directory of Open Access Journals (Sweden)
Mark A Walker
2017-11-01
Ectopic heartbeats can trigger reentrant arrhythmias, leading to ventricular fibrillation and sudden cardiac death. Such events have been attributed to perturbed Ca2+ handling in cardiac myocytes leading to spontaneous Ca2+ release and delayed afterdepolarizations (DADs). However, the ways in which perturbation of specific molecular mechanisms alters the probability of ectopic beats are not understood. We present a multiscale model of cardiac tissue incorporating a biophysically detailed three-dimensional model of the ventricular myocyte. This model reproduces realistic Ca2+ waves and DADs driven by stochastic Ca2+ release channel (RyR) gating and is used to study mechanisms of DAD variability. In agreement with previous experimental and modeling studies, key factors influencing the distribution of DAD amplitude and timing include cytosolic and sarcoplasmic reticulum Ca2+ concentrations, inwardly rectifying potassium current (IK1) density, and gap junction conductance. The cardiac tissue model is used to investigate how random RyR gating gives rise to probabilistic triggered activity in a one-dimensional myocyte tissue model. A novel spatial-average filtering method for estimating the probability of extreme (i.e., rare, high-amplitude) stochastic events from a limited set of spontaneous Ca2+ release profiles is presented. These events occur when randomly organized clusters of cells exhibit synchronized, high-amplitude Ca2+ release flux. It is shown how reduced IK1 density and gap junction coupling, as observed in heart failure, increase the probability of extreme DADs by multiple orders of magnitude. This method enables prediction of arrhythmia likelihood and its modulation by alterations of other cellular mechanisms.
2014-03-01
[Extraction residue from a report's list of figures and tables; recoverable items: an APFT data collection figure, a receiver operating characteristic curve for a logistic regression model predicting probability of final APFT failure, and predictive equations for the Final APFT Failure and Attrition models.]
Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu
2017-12-01
Despite the development of numerous predictive microbial inactivation models, a model focusing on the variability in time to inactivation for a bacterial population has not been developed. Additionally, an appropriate estimation of the risk of there being any remaining bacterial survivors in foods after the application of an inactivation treatment has not yet been established. Here, the Gamma distribution, as a representative probability distribution, was used to estimate the variability in time to inactivation for a bacterial population. Salmonella enterica serotype Typhimurium was evaluated for survival in a low relative humidity environment. We prepared bacterial cells with an initial concentration that was adjusted to 2 × 10^n colony-forming units/2 μl (n = 1, 2, 3, 4, 5) by performing a serial 10-fold dilution, and then we placed 2 μl of the inocula into each well of 96-well microplates. The microplates were stored in a desiccated environment at 10-20% relative humidity at 5, 15, or 25 °C. The survival or death of bacterial cells for each well in the 96-well microplate was confirmed by adding tryptic soy broth as an enrichment culture. The changes in the death probability of the 96 replicated bacterial populations were described as a cumulative Gamma distribution. The variability in time to inactivation was described by transforming the cumulative Gamma distribution into a Gamma distribution. We further examined the bacterial inactivation on almond kernels and radish sprout seeds. Additionally, we described certainty levels of bacterial inactivation that ensure the death probability of a bacterial population at six decimal reduction levels, ranging from 90 to 99.9999%. Consequently, the probability model developed in the present study enables us to estimate the death probability of bacterial populations in a desiccated environment over time. This probability model may be useful for risk assessment to estimate the amount of remaining bacteria in a given
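Modelling death probability over time as a cumulative Gamma distribution makes the "certainty level" question a quantile lookup: find the storage time at which the cumulative probability reaches, say, 99.9999%. The sketch below does this with a trapezoidal integration of the Gamma pdf; the shape and scale values are illustrative assumptions, not parameters fitted in the study.

```python
import math
import numpy as np

def gamma_cdf(t, shape, scale, n=4000):
    """Cumulative Gamma distribution P(T <= t) by trapezoidal integration."""
    x = np.linspace(0.0, t, n)
    pdf = x ** (shape - 1) * np.exp(-x / scale) / (math.gamma(shape) * scale ** shape)
    return float(np.sum(0.5 * (pdf[1:] + pdf[:-1]) * np.diff(x)))

# Storage time needed to reach a given certainty level of inactivation,
# e.g. 99.9999% death probability (a six-decimal reduction).
shape, scale = 3.0, 2.0   # hypothetical Gamma parameters (time units: days)
for level in (0.90, 0.99, 0.999999):
    t = 0.5
    while gamma_cdf(t, shape, scale) < level:
        t += 0.5
    print(f"{level:.6f} death probability reached by ~{t:.1f} days")
```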
Consistent modeling of scalar mixing for presumed, multiple parameter probability density functions
Mortensen, Mikael
2005-01-01
In this Brief Communication we describe a consistent method for calculating the conditional scalar dissipation (or diffusion) rate for inhomogeneous turbulent flows. The model follows from the transport equation for the conserved scalar probability density function (PDF) using a gradient diffusion closure for the conditional mean velocity and a presumed PDF depending on any number of mixture fraction moments. With the presumed β PDF, the model is an inhomogeneous modification to the homogeneous model of Girimaji ["On the modeling of scalar diffusion in isotropic turbulence," Phys. Fluids A 4, 2529 (1992)]. An important feature of the model is that it makes the classical approach to the conditional moment closure completely conservative for inhomogeneous flows.
Probability of detection model for the non-destructive inspection of steam generator tubes of PWRs
Yusa, N.
2017-06-01
This study proposes a probability of detection (POD) model to discuss the capability of non-destructive testing methods for the detection of stress corrosion cracks appearing in the steam generator tubes of pressurized water reactors. Three-dimensional finite element simulations were conducted to evaluate eddy current signals due to stress corrosion cracks. The simulations consider an absolute type pancake probe and model a stress corrosion crack as a region with a certain electrical conductivity inside to account for eddy currents flowing across a flaw. The probabilistic nature of a non-destructive test is simulated by varying the electrical conductivity of the modelled stress corrosion cracking. A two-dimensional POD model, which provides the POD as a function of the depth and length of a flaw, is presented together with a conventional POD model characterizing a flaw using a single parameter. The effect of the number of the samples on the PODs is also discussed.
Inelastic cross section and survival probabilities at the LHC in minijet models
Fagundes, Daniel A.; Grau, Agnes; Pancheri, Giulia; Shekhovtsova, Olga; Srivastava, Yogendra N.
2017-09-01
Recent results for the total and inelastic hadronic cross sections from LHC experiments are compared with predictions from a single-channel eikonal minijet model driven by parton density functions and from an empirical model. The role of soft gluon resummation in the infrared region in taming the rise of minijets and their contribution to the increase of the total cross sections at high energies are discussed. Survival probabilities at the LHC, whose theoretical estimates range from circa 10% to a few per mille, are estimated in this model and compared with results from QCD-inspired models and from multichannel eikonal models. We revisit a previous calculation and examine the origin of these discrepancies.
A Hidden Semi-Markov Model with Duration-Dependent State Transition Probabilities for Prognostics
Directory of Open Access Journals (Sweden)
Ning Wang
2014-01-01
Realistic prognostic tools are essential for effective condition-based maintenance systems. In this paper, a Duration-Dependent Hidden Semi-Markov Model (DD-HSMM) is proposed, which overcomes the shortcomings of traditional Hidden Markov Models (HMM), including the Hidden Semi-Markov Model (HSMM): (1) it allows explicit modeling of state transition probabilities between the states; (2) it relaxes the observations' independence assumption by accommodating a connection between consecutive observations; and (3) it does not follow the unrealistic Markov chain's memoryless assumption and therefore provides a more powerful modeling and analysis capability for real world problems. To facilitate the computation of the proposed DD-HSMM methodology, a new forward-backward algorithm is developed. The demonstration and evaluation of the proposed methodology are carried out through a case study. The experimental results show that the DD-HSMM methodology is effective for equipment health monitoring and management.
Fishnet model for failure probability tail of nacre-like imbricated lamellar materials
Luo, Wen; Bažant, Zdeněk P.
2017-12-01
Nacre, the iridescent material of the shells of pearl oysters and abalone, consists mostly of aragonite (a form of CaCO3), a brittle constituent of relatively low strength (≈10 MPa). Yet it has astonishing mean tensile strength (≈150 MPa) and fracture energy (≈350 to 1,240 J/m2). The reasons have recently become well understood: (i) the nanoscale thickness (≈300 nm) of nacre's building blocks, the aragonite lamellae (or platelets), and (ii) the imbricated, or staggered, arrangement of these lamellae, bound by biopolymer layers only ≈25 nm thick, occupying […] tail, must be determined. This objective, not pursued previously, is hardly achievable by experiments alone, since >10^8 tests of specimens would be needed. Here we outline a statistical model of strength that resembles a fishnet pulled diagonally, captures the tail of the pdf of strength and, importantly, allows analytical safety assessments of nacreous materials. The analysis shows that, in terms of safety, the imbricated lamellar structure provides a major additional advantage: approximately 10% strength increase at a tail failure probability of 10^-6 and a 1 to 2 orders of magnitude decrease in tail probability at fixed stress. Another advantage is that a high scatter of microstructure properties diminishes the strength difference between the mean and the probability tail, compared with the weakest-link model. These advantages of nacre-like materials are here justified analytically and supported by millions of Monte Carlo simulations.
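The weakest-link baseline against which the fishnet model is compared can be sketched directly: the strength of a chain of N elements is the minimum of N element strengths, and for Weibull elements the tail has a closed form. The sketch below checks Monte Carlo sampling against that formula; the Weibull modulus and scale are illustrative assumptions, and the fishnet's load redistribution between lamellae is not modelled here.

```python
import numpy as np

# Weakest-link (chain) model: structure strength = min of N element strengths.
# The paper's fishnet model adds load redistribution between lamellae; this
# sketch reproduces only the weakest-link baseline it is compared against.
rng = np.random.default_rng(2)
n_elements, n_samples = 100, 50_000
m, s0 = 10.0, 100.0                        # Weibull modulus and scale (illustrative)
element_strengths = s0 * rng.weibull(m, size=(n_samples, n_elements))
structure_strength = element_strengths.min(axis=1)

# Tail probability P(strength < sigma) vs. the exact series-system formula
# 1 - exp(-N * (sigma/s0)^m).
sigma = 40.0
p_mc = (structure_strength < sigma).mean()
p_exact = 1.0 - np.exp(-n_elements * (sigma / s0) ** m)
print(p_mc, p_exact)
```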
Stirk, Emily R; Lythe, Grant; van den Berg, Hugo A; Hurst, Gareth A D; Molina-París, Carmen
2010-04-01
The limiting conditional probability distribution (LCD) has been much studied in the field of mathematical biology, particularly in the context of epidemiology and the persistence of epidemics. However, it has not yet been applied to the immune system. One of the characteristic features of the T cell repertoire is its diversity. This diversity declines in old age, whence the concepts of extinction and persistence are also relevant to the immune system. In this paper we model T cell repertoire maintenance by means of a continuous-time birth and death process on the positive integers, where the origin is an absorbing state. We show that eventual extinction is guaranteed. The late-time behaviour of the process before extinction takes place is modelled by the LCD, which we prove always exists for the process studied here. In most cases, analytic expressions for the LCD cannot be computed but the probability distribution may be approximated by means of the stationary probability distributions of two related processes. We show how these approximations are related to the LCD of the original process and use them to study the LCD in two special cases. We also make use of the large N expansion to derive a further approximation to the LCD. The accuracy of the various approximations is then analysed.
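Numerically, the LCD (quasi-stationary distribution) of a birth-death process with an absorbing origin is the left Perron eigenvector of the dynamics restricted to the transient states, which power iteration with renormalization recovers. The sketch below uses a uniformized discrete skeleton with illustrative linear rates and a reflecting cap at N, not the paper's specific T cell model.

```python
import numpy as np

# Birth-death process on {0,...,N} with absorbing origin: birth rate b*i,
# death rate d*i (subcritical, b < d). Iterating v <- v P on the sub-stochastic
# transition matrix over transient states {1,...,N}, renormalizing each step,
# converges to the quasi-stationary (limiting conditional) distribution.
N, b, d = 50, 0.9, 1.0
states = np.arange(1, N + 1)
rates_up = b * states.astype(float)
rates_up[-1] = 0.0                        # reflecting cap at N (truncation)
rates_down = d * states.astype(float)

dt = 0.9 / (rates_up + rates_down).max()  # uniformization time step
P = np.zeros((N, N))
for i in range(N):
    if i + 1 < N:
        P[i, i + 1] = rates_up[i] * dt
    if i - 1 >= 0:
        P[i, i - 1] = rates_down[i] * dt
    P[i, i] = 1.0 - rates_up[i] * dt - rates_down[i] * dt
# The death transition from state 1 leaks out of {1,...,N}, so P is
# sub-stochastic and renormalizing conditions on non-extinction.

v = np.full(N, 1.0 / N)
for _ in range(20000):
    v = v @ P
    v /= v.sum()
print(v[:5])  # LCD weights on states 1..5
```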
Modelling the impact of creep on the probability of failure of a solid oxide fuel cell stack
DEFF Research Database (Denmark)
Greco, Fabio; Frandsen, Henrik Lund; Nakajo, Arata
2014-01-01
In solid oxide fuel cell (SOFC) technology a major challenge lies in balancing thermal stresses from an inevitable thermal field. The cells are known to creep, changing the stress field over time. The main objective of this study was to assess the influence of creep on the failure probability… of an SOFC stack. A finite element analysis of a single repeating unit of the stack was performed, in which the influence of the mechanical interactions, the temperature-dependent mechanical properties and creep of the SOFC materials are considered. Moreover, stresses from the thermo-mechanical simulation… of sintering of the cells have been obtained and were implemented into the model of the single repeating unit. The significance of the relaxation of stresses by creep in the cell components and its influence on the probability of cell survival was investigated. Finally, the influence of cell size…
A Novel Probability Model for Suppressing Multipath Ghosts in GPR and TWI Imaging: A Numerical Study
Directory of Open Access Journals (Sweden)
Tan Yun-hua
2015-10-01
A novel concept for suppressing the problem of multipath ghosts in Ground Penetrating Radar (GPR) and Through-Wall Imaging (TWI) is presented. Ghosts (i.e., false targets) mainly arise from the use of the Born or single-scattering approximations that lead to linearized imaging algorithms; however, these approximations neglect the effect of multiple scattering (or multipath) between the electromagnetic wavefield and the object under investigation. In contrast to existing methods of suppressing multipath ghosts, the proposed method models for the first time the reflectivity of the probed objects as a probability function up to a normalized factor and introduces the concept of random subapertures by randomly picking up measurement locations from the entire aperture. Thus, the final radar image is a joint probability distribution that corresponds to radar images derived from multiple random subapertures. Finally, numerical experiments are used to demonstrate the performance of the proposed methodology in GPR and TWI imaging.
Li, Zhanling; Li, Zhanjie; Li, Chengcheng
2014-05-01
Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most basins in the humid and semi-humid south and east of China have been studied with probability modeling of high flow extremes, whereas for the inland river basins, which occupy about 35% of the country's area, such studies are limited, partly due to restricted data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peak over threshold (POT) method and the Generalized Pareto Distribution (GPD), in which the selection of the threshold and the inherent assumptions for POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma distributions, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, stability features of model parameters, the return level plot and the inherent independence assumption of POT series, an optimum threshold of 340 m3/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become increasingly uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
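The POT/GPD workflow can be sketched end to end: sample excesses over the threshold, fit the GPD, and compute a return level. To stay self-contained this sketch uses method-of-moments estimators (the study uses maximum likelihood) and synthetic excesses; the exceedance rate and GPD parameters are illustrative assumptions.

```python
import numpy as np

# Peak-over-threshold sketch: excesses over threshold u follow a Generalized
# Pareto Distribution. Method-of-moments estimators are used here for
# self-containment (the study uses maximum likelihood); data are synthetic.
rng = np.random.default_rng(3)
u = 340.0                                  # threshold from the study (m^3/s)
xi_true, sigma_true = 0.1, 60.0
# Inverse-CDF sampling of GPD excesses: x = sigma*((1-U)^(-xi) - 1)/xi.
U = rng.random(500)
excesses = sigma_true * ((1 - U) ** (-xi_true) - 1) / xi_true

mean, var = excesses.mean(), excesses.var()
xi = 0.5 * (1 - mean * mean / var)         # shape (MOM)
sigma = mean * (1 - xi)                    # scale (MOM)

# T-year return level with lam exceedances per year on average:
# x_T = u + (sigma/xi) * ((lam*T)^xi - 1).
lam, T = 3.0, 100.0
x_T = u + sigma / xi * ((lam * T) ** xi - 1)
print(f"xi={xi:.2f}, sigma={sigma:.1f}, 100-yr level ~ {x_T:.0f} m^3/s")
```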
Finite element model updating of concrete structures based on imprecise probability
Biswal, S.; Ramaswamy, A.
2017-09-01
Imprecise probability based methods are developed in this study for parameter estimation in finite element model updating for concrete structures, when the measurements are imprecisely defined. Bayesian analysis using the Metropolis-Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three different cases are considered: (i) imprecision is present in the prior distribution and in the measurements only, (ii) imprecision is present in the parameters of the finite element model and in the measurements only, and (iii) imprecision is present in the prior distribution, in the parameters of the finite element model, and in the measurements. Procedures are also developed for integrating the imprecision in the parameters of the finite element model in the finite element software Abaqus. The proposed methods are then verified against reinforced concrete beams and prestressed concrete beams tested in our laboratory as part of this study.
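The precise-probability core that the study generalizes is a Metropolis-Hastings sampler over the model parameter given measured responses. A minimal sketch follows; the toy frequency-stiffness model, the prior, the noise level and all numbers are illustrative assumptions, not the paper's finite element model.

```python
import math
import random

# Minimal Metropolis-Hastings sketch for Bayesian parameter estimation.
# Toy forward model: measured frequency ~ sqrt(stiffness); all values illustrative.
random.seed(4)

def log_post(k):
    if k <= 0:
        return -math.inf
    f_model = math.sqrt(k)
    f_obs, noise = 10.0, 0.5                      # "measured" frequency, noise std
    log_lik = -0.5 * ((f_obs - f_model) / noise) ** 2
    log_prior = -0.5 * ((k - 90.0) / 30.0) ** 2   # vague Gaussian prior
    return log_lik + log_prior

k, chain = 90.0, []
for i in range(20000):
    prop = k + random.gauss(0.0, 5.0)             # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(k):
        k = prop                                  # accept
    if i >= 5000:                                 # discard burn-in
        chain.append(k)

post_mean = sum(chain) / len(chain)
print(f"posterior mean stiffness ~ {post_mean:.1f}")
```

The imprecise-probability generalization of the paper would replace the single prior and likelihood with sets of distributions, yielding bounds on posterior quantities rather than a single chain.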
Stroh, Rémi; Bect, Julien; Demeyer, Séverine; Fischer, Nicolas; Vazquez, Emmanuel
2017-01-01
A multi-fidelity simulator is a numerical model, in which one of the inputs controls a trade-off between the realism and the computational cost of the simulation. Our goal is to estimate the probability of exceeding a given threshold on a multi-fidelity stochastic simulator. We propose a fully Bayesian approach based on Gaussian processes to compute the posterior probability distribution of this probability. We pay special attention to the hyper-parameters of the model...
A probability model: Tritium release into the coolant of a light water tritium production reactor
Energy Technology Data Exchange (ETDEWEB)
Anderson, D N
1992-04-01
This report presents a probability model of the total amount of tritium that will be released from a core of tritium target rods into the coolant of a light water reactor during a tritium production cycle. The model relates the total tritium released from a core to the release characteristics of an individual target rod within the core. The model captures total tritium release from two sources: release via target rod breach and release via permeation through the target rod. Specifically, under conservative assumptions about the breach characteristics of a target rod, total tritium released from a core is modeled as a function of the probability of a target breach and the mean and standard deviation of the permeation reduction factor (PRF) of an individual target rod. Two dominant facts emerge from the analysis in this report. First, total tritium release cannot be controlled and minimized solely through the PRF characteristics of a target rod. Tritium release via breach must be abated if acceptable tritium production is to be achieved. Second, PRF values have a saturation point to their effectiveness. Specifically, in the presence of any realistic level of PRF variability, increasing PRF values above approximately 1000 contributes little to minimizing total tritium release.
Modelling the probability of ionospheric irregularity occurrence over African low latitude region
Mungufeni, Patrick; Jurua, Edward; Bosco Habarulema, John; Anguma Katrini, Simon
2015-06-01
This study presents models of the geomagnetically quiet time probability of occurrence of ionospheric irregularities over the African low latitude region. GNSS-derived ionospheric total electron content data from Mbarara, Uganda (0.60°S, 30.74°E, geographic, 10.22°S, magnetic) and Libreville, Gabon (0.35°N, 9.68°E, geographic, 8.05°S, magnetic) during the period 2001-2012 were used. First, we established the rate of change of total electron content index (ROTI) value associated with background ionospheric irregularity over the region. This was done by analysing GNSS carrier phases at the L-band frequencies L1 and L2 to identify cycle-slip events associated with ionospheric irregularities. A total of 699 cycle-slip events were identified at the two stations; the corresponding median ROTI value at the epochs of the cycle-slip events was 0.54 TECU/min. The probability of occurrence of ionospheric irregularities associated with ROTI ≥ 0.5 TECU/min was then modelled by fitting cubic B-splines to the data. The model captures the diurnal, seasonal, and solar-flux dependence patterns of the probability of occurrence of ionospheric irregularities. The model developed over Mbarara was validated with data over Mt. Baker, Uganda (0.35°N, 29.90°E, geographic, 9.25°S, magnetic), Kigali, Rwanda (1.94°S, 30.09°E, geographic, 11.62°S, magnetic), and Kampala, Uganda (0.34°N, 32.60°E, geographic, 9.29°S, magnetic). For the periods validated at Mt. Baker (approximately 137.64 km northwest), Kigali (approximately 162.42 km southwest), and Kampala (approximately 237.61 km northeast), the percentages of errors (differences between the observed and the modelled probability of occurrence of ionospheric irregularity) below 0.05 were 97.3, 89.4, and 81.3, respectively.
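The ROTI threshold used in this study (ROTI ≥ 0.5 TECU/min) can be illustrated with the standard definition of ROTI as the windowed standard deviation of the rate of TEC change. The sketch below is not the authors' pipeline; the synthetic TEC series, 0.5-min sampling, and 5-min window are illustrative assumptions.

```python
import numpy as np

def roti(tec, t_minutes, window=5.0):
    """Rate-of-change-of-TEC index: std. dev. of ROT over a sliding window.

    tec       : TEC samples in TECU (evenly spaced in time)
    t_minutes : sample epochs in minutes
    window    : window length in minutes (5 min is a common choice)
    """
    rot = np.diff(tec) / np.diff(t_minutes)        # ROT in TECU/min
    step = t_minutes[1] - t_minutes[0]
    n = max(int(round(window / step)), 2)          # samples per window
    return np.array([rot[i:i + n].std() for i in range(len(rot) - n + 1)])

# Synthetic example: quiet TEC trend plus a short irregularity burst
t = np.arange(0, 60, 0.5)                          # 0.5-min sampling
tec = 20 + 0.01 * t
tec[60:80] += np.random.default_rng(1).normal(0, 0.5, 20)
r = roti(tec, t)
print(r.max() > 0.5)   # burst exceeds the ROTI >= 0.5 TECU/min threshold
```

Quiet windows produce ROTI near zero, while the noisy segment pushes ROTI well above the 0.5 TECU/min threshold used in the paper.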
Modeling of thermal stresses and probability of survival of tubular SOFC
Energy Technology Data Exchange (ETDEWEB)
Nakajo, Arata [Laboratory for Industrial Energy Systems (LENI), Faculty of Engineering, Swiss Federal Institute of Technology, 1015 Lausanne (Switzerland); Stiller, Christoph; Bolland, Olav [Department of Energy and Process Engineering, Norwegian University of Science and Technology, Trondheim N-7491 (Norway); Haerkegaard, Gunnar [Department of Engineering Design and Materials, Norwegian University of Science and Technology, Trondheim N-7491 (Norway)
2006-07-14
The temperature profile generated by a thermo-electro-chemical model was used to calculate the thermal stress distribution in a tubular solid oxide fuel cell (SOFC). The solid heat balances were calculated separately for each layer of the MEA (membrane electrode assembly) in order to detect the radial thermal gradients more precisely. It appeared that the electrolyte undergoes high tensile stresses in limited areas at the ends of the cell and that the anode is subjected to moderate tensile stresses. A simplified version of the widely used Weibull analysis was used to calculate the global probability of survival for the assessment of the risks related to both operating points and load changes. The cell at room temperature was considered and proved to be critical. As a general trend, the computed probabilities of survival were too low for the typical requirements of a commercial product. A sensitivity analysis showed a strong influence of the thermal expansion mismatch between the layers of the MEA on the probability of survival. The lack of knowledge of mechanical material properties, as well as uncertainties about the phenomena occurring in the cell, proved to be a limiting factor for the simulation of thermal stresses. (author)
Fixation Probability in a Two-Locus Model by the Ancestral Recombination–Selection Graph
Lessard, Sabin; Kermany, Amir R.
2012-01-01
We use the ancestral influence graph (AIG) for a two-locus, two-allele selection model in the limit of a large population size to obtain an analytic approximation for the probability of ultimate fixation of a single mutant allele A. We assume that this new mutant is introduced at a given locus into a finite population in which a previous mutant allele B is already segregating with a wild type at another linked locus. We deduce that the fixation probability increases as the recombination rate increases if allele A is either in positive epistatic interaction with B and allele B is beneficial or in no epistatic interaction with B and then allele A itself is beneficial. This holds at least as long as the recombination fraction and the selection intensity are small enough and the population size is large enough. In particular this confirms the Hill–Robertson effect, which predicts that recombination renders more likely the ultimate fixation of beneficial mutants at different loci in a population in the presence of random genetic drift even in the absence of epistasis. More importantly, we show that this is true from weak negative epistasis to positive epistasis, at least under weak selection. In the case of deleterious mutants, the fixation probability decreases as the recombination rate increases. This supports Muller’s ratchet mechanism to explain the accumulation of deleterious mutants in a population lacking recombination. PMID:22095080
Fast Outage Probability Simulation for FSO Links with a Generalized Pointing Error Model
Ben Issaid, Chaouki
2017-02-07
Over the past few years, free-space optical (FSO) communication has gained significant attention. FSO can provide cost-effective and unlicensed links with high bandwidth capacity and low error rate, making it an exciting alternative to traditional wireless radio-frequency communication systems. However, system performance is affected not only by atmospheric turbulence, which occurs due to random fluctuations in the air refractive index, but also by pointing errors. Metrics such as the outage probability, which quantifies the probability that the instantaneous signal-to-noise ratio falls below a given threshold, can be used to analyze the performance of this system. In this work, we consider weak and strong turbulence regimes, and we study the outage probability of an FSO communication system under a generalized pointing error model with both a nonzero boresight component and different horizontal and vertical jitter effects. More specifically, we use an importance sampling approach based on the exponential twisting technique to obtain fast and accurate results.
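The abstract names importance sampling with exponential twisting for rare outage events. As a minimal sketch (not the authors' generalized pointing-error model), the snippet below estimates a deep-outage probability for a toy lognormal-turbulence channel. For a Gaussian log-channel, exponential twisting reduces to a mean shift; all channel parameters here are assumptions for illustration.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Toy FSO channel: SNR = exp(X), X ~ N(mu, sigma^2) (lognormal turbulence);
# outage = P(SNR < gamma_th), a rare event for small gamma_th.
mu, sigma = 0.0, 0.5
gamma_th = 0.05                      # deep-outage threshold
x_th = np.log(gamma_th)

n = 100_000
# Exponential twisting of a Gaussian density is a mean shift:
# sample from the proposal X ~ N(theta, sigma^2), theta on the rare region.
theta = x_th
x = rng.normal(theta, sigma, n)
# likelihood ratio N(mu, sigma) / N(theta, sigma)
w = np.exp(((x - theta) ** 2 - (x - mu) ** 2) / (2 * sigma ** 2))
p_is = np.mean((x < x_th) * w)

# Exact value for comparison: Phi((x_th - mu) / sigma)
p_exact = 0.5 * (1 + erf((x_th - mu) / (sigma * sqrt(2))))
print(p_is, p_exact)
```

A crude Monte Carlo estimator would need on the order of 1/p samples to see a single outage event; the mean-shifted proposal recovers the roughly 1e-9 outage probability to within a few percent with 1e5 samples.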
Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.
2009-05-01
Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
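A minimal sketch of the core BJP ingredients, a Box-Cox marginal transformation followed by a joint Gaussian model and conditioning on a predictor, is shown below on synthetic data. The profile-likelihood grid, the synthetic streamflow, and the single-site, single-predictor setup are simplifying assumptions; the paper's model is multivariate and fully Bayesian via MCMC.

```python
import numpy as np

rng = np.random.default_rng(42)

def boxcox(y, lam):
    """One-parameter Box-Cox transform."""
    return np.log(y) if abs(lam) < 1e-8 else (y ** lam - 1) / lam

# Synthetic positively skewed "streamflow" and a correlated climate predictor
n = 2000
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], n)
predictor = z[:, 0]
flow = np.exp(1 + 0.8 * z[:, 1])     # lognormal-like predictand

def bc_loglik(y, lam):
    """Profile log-likelihood of the Box-Cox parameter (normal errors)."""
    t = boxcox(y, lam)
    return -0.5 * n * np.log(t.var()) + (lam - 1) * np.log(y).sum()

lams = np.linspace(-1, 1, 81)
lam = lams[np.argmax([bc_loglik(flow, l) for l in lams])]

# Joint Gaussian in transformed space, then condition on the predictor
t = boxcox(flow, lam)
m = np.array([predictor.mean(), t.mean()])
C = np.cov(predictor, t)
x0 = 1.5                                  # observed predictor value
cond_mean = m[1] + C[0, 1] / C[0, 0] * (x0 - m[0])
cond_var = C[1, 1] - C[0, 1] ** 2 / C[0, 0]
print(lam, cond_mean, cond_var)
```

Because the synthetic flow is lognormal, the profiled Box-Cox parameter lands near zero (the log transform), and conditioning on an above-average predictor shifts the forecast mean upward while shrinking its variance.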
Directory of Open Access Journals (Sweden)
Daniel Ting
2010-04-01
Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: (1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); (2) filtering of suspect conformations and outliers using B-factors or other features; (3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); (4) the method used for determining probability densities, ranging from simple histograms to modern nonparametric density estimation; and (5) whether they include nearest-neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp.
Probability density of the mass of the standard model Higgs boson
Erler, J
2001-01-01
The CERN LEP Collaborations have reported a small excess of events in their combined Higgs boson analysis at center-of-mass energies √s
Investigation of probability theory on Ising models with different four-spin interactions
Yang, Yuming; Teng, Baohua; Yang, Hongchun; Cui, Haijuan
2017-10-01
Based on probability theory, two types of three-dimensional Ising models with different four-spin interactions are studied. First, the partition function of the system is calculated by considering the local correlation of spins in a given configuration, and then the properties of the phase transition are quantitatively discussed with a series expansion technique and a numerical method. Meanwhile, the rounding errors in this calculation are analyzed, so that the possible source of error in calculations based on mean field theory is pointed out.
A Nonlinear, Nontransitive and Additive-Probability Model for Decisions Under Uncertainty
Fishburn, Peter C.; LaValle, Irving H.
1987-01-01
Let $\succ$ denote a preference relation on a set $F$ of lottery acts. Each $f$ in $F$ maps a state space $S$ into a set $P$ of lotteries on decision outcomes. The paper discusses axioms for $\succ$ on $F$ which imply the existence of an SSB (skew-symmetric bilinear) functional $\phi$ on $P \times P$ and a finitely additive probability measure $\pi$ on $2^S$ such that, for all $f$ and $g$ in $F$, $f \succ g \Leftrightarrow \int_S \phi(f(s), g(s))\, d\pi(s) > 0$. This $S^3B$ (states SSB) model g...
Energy Technology Data Exchange (ETDEWEB)
Duffy, Stephen [Cleveland State Univ., Cleveland, OH (United States)
2013-09-09
This project will implement inelastic constitutive models that will yield the requisite stress-strain information necessary for graphite component design. Accurate knowledge of stress states (both elastic and inelastic) is required to assess how close a nuclear core component is to failure. Strain states are needed to assess deformations in order to ascertain serviceability issues relating to failure, e.g., whether too much shrinkage has taken place for the core to function properly. Failure probabilities, as opposed to safety factors, are required in order to capture the variability in failure strength in tensile regimes. The current stress state is used to predict the probability of failure. Stochastic failure models will be developed that can accommodate possible material anisotropy. This work will also model material damage (i.e., degradation of mechanical properties) due to radiation exposure. The team will design tools for components fabricated from nuclear graphite. These tools must readily interact with finite element software, in particular COMSOL, the software currently utilized by the Idaho National Laboratory. For the elastic response of graphite, the team will adopt anisotropic stress-strain relationships available in COMSOL. Data from the literature will be utilized to characterize the appropriate elastic material constants.
Exact results for the probability and stochastic dynamics of fixation in the Wright-Fisher model.
Shafiey, Hassan; Waxman, David
2017-10-07
In this work we consider fixation of an allele in a population. Fixation is key to understanding the way long-term evolutionary change occurs at the gene and molecular levels. Two basic aspects of fixation are: (i) the chance it occurs and (ii) the way the gene frequency progresses to fixation. We present exact results for both aspects of fixation for the Wright-Fisher model. We give the exact fixation probability for some different schemes of frequency-dependent selection. We also give the corresponding exact stochastic difference equation that generates frequency trajectories which ultimately fix. Exactness of the results means selection need not be weak. There are possible applications of this work to data analysis, modelling, and tests of approximations. The methodology employed illustrates that knowledge of the fixation probability, for all initial frequencies, fully characterises the dynamics of the Wright-Fisher model. The stochastic equations for fixing trajectories allow insight into the way fixation occurs. They provide the alternative picture that fixation is driven by the injection of one carrier of the fixing allele into the population each generation. The stochastic equations allow explicit calculation of some properties of fixing trajectories and their efficient simulation. The results are illustrated and tested with simulations. Copyright © 2017 Elsevier Ltd. All rights reserved.
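The paper derives exact results; as a quick Monte Carlo check of the best-known special case, the standard Wright-Fisher simulation below verifies that a neutral allele at initial frequency p0 fixes with probability p0. The population size, replicate count, and the frequency-independent selection step are illustrative assumptions, not the authors' frequency-dependent schemes.

```python
import numpy as np

rng = np.random.default_rng(7)

def wright_fisher_fixes(N, p0, s=0.0):
    """Run one Wright-Fisher trajectory; return True if allele A fixes."""
    i = int(round(p0 * 2 * N))            # copies of A among 2N genes
    while 0 < i < 2 * N:
        p = i / (2 * N)
        # frequency-independent selection: A has relative fitness 1 + s
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
        i = rng.binomial(2 * N, p_sel)    # multinomial resampling step
    return i == 2 * N

N, p0, reps = 50, 0.1, 10_000
fixed = sum(wright_fisher_fixes(N, p0) for _ in range(reps))
# neutral case: the exact fixation probability equals the initial frequency
print(fixed / reps)
```

With s = 0 the empirical fixation fraction converges on p0 = 0.1, consistent with the exact neutral result; setting s > 0 raises it above p0.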
A multi-state model for wind farms considering operational outage probability
DEFF Research Database (Denmark)
Cheng, Lin; Liu, Manjun; Sun, Yuanzhang
2013-01-01
As one of the most important renewable energy resources, wind power has drawn much attention in recent years. The stochastic characteristics of wind speed lead to generation output uncertainties of the wind energy conversion system (WECS) and affect power system reliability, especially at high wind power penetration levels. Therefore, a more comprehensive analysis of the WECS, as well as an appropriate reliability assessment model, is essential for maintaining the reliable operation of power systems. In this paper, the impact of wind turbine outage probability on system reliability is first investigated, and an outage model is developed by considering the following factors: running time, operating environment, operating conditions, and wind speed fluctuations. A multi-state model for wind farms is also established. Numerical results illustrate that the proposed model can be well applied to power system reliability assessment.
Kondoh, Hiroshi; Matsushita, Mitsugu
1986-10-01
A diffusion-limited aggregation (DLA) model with anisotropic sticking probability Ps is computer-simulated on a two-dimensional square lattice. The cluster grows from a seed particle at the origin in the positive y area with an absorption-type boundary along the x-axis. The cluster is found to grow anisotropically as R∥ ~ N^(ν∥) and R⊥ ~ N^(ν⊥), where R⊥ and R∥ are the radii of gyration of the cluster along the x- and y-axes, respectively, and N is the number of particles constituting the cluster. The two exponents are shown to become asymptotically ν∥ = 2/3, ν⊥ = 1/3 whenever the sticking anisotropy exists. It is also found that the present model is fairly consistent with Hack's law of river networks, suggesting that it is a good candidate for a prototype model of the evolution of river networks.
Zhao, F; Leong, T Y
2000-01-01
Data preprocessing is needed when real-life clinical databases are used as the data sources to learn the probabilities for dynamic decision models. Data preprocessing is challenging as it involves extensive manual effort and time in developing the data operation scripts. This paper presents a framework to facilitate automated and interactive generation of the problem-specific data preprocessing scripts. The framework has three major components: 1) A model parser that parses the decision model definition, 2) A graphical user interface that facilitates the interaction between the user and the system, and 3) A script generator that automatically generates the specific database scripts for the data preprocessing. We have implemented a prototype system of the framework and evaluated its effectiveness via a case study in the clinical domain. Preliminary results demonstrate the practical promise of the framework.
DEFF Research Database (Denmark)
Luo, Yangjun; Wu, Xiaoxiang; Zhou, Mingdong
2015-01-01
Both structural sizes and dimensional tolerances strongly influence the manufacturing cost and the functional performance of a practical product. This paper presents an optimization method to simultaneously find the optimal combination of structural sizes and dimensional tolerances. Based on a probability-interval mixed reliability model, the imprecision of design parameters is modeled as interval uncertainties fluctuating within allowable tolerance bounds. The optimization model is defined as minimizing the total manufacturing cost under mixed reliability index constraints, which are further … points (TPPs) and the worst case points (WCPs), which shows better performance than traditional approaches for highly nonlinear problems. Numerical results reveal that reasonable dimensions and tolerances can be suggested for the minimum manufacturing cost and a desirable structural safety.
Uterine disorders and pregnancy complications: insights from mouse models
Lim, Hyunjung Jade; Wang, Haibin
2010-01-01
Much of our knowledge of human uterine physiology and pathology has been extrapolated from the study of diverse animal models, as there is no ideal system for studying human uterine biology in vitro. Although it remains debatable whether mouse models are the most suitable system for investigating human uterine function(s), gene-manipulated mice are considered by many the most useful tool for mechanistic analysis, and numerous studies have identified many similarities in female reproduction be...
Cannon, Alex
2017-04-01
Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another - the N-dimensional probability density function transform - is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin
The coupon collector urn model with unequal probabilities in ecology and evolution.
Zoroa, N; Lesigne, E; Fernández-Sáez, M J; Zoroa, P; Casas, J
2017-02-01
The sequential sampling of populations with unequal probabilities and with replacement in a closed population is a recurrent problem in ecology and evolution. Examples range from biodiversity sampling and epidemiology to the estimation of signal repertoires in animal communication. Many of these questions can be reformulated as urn problems, often as special cases of the coupon collector problem, most simply expressed as the number of coupons that must be collected to have a complete set. We aimed to apply the coupon collector model in a comprehensive manner to one example, hosts (balls) being searched (draws) and parasitized (ball colour change) by parasitic wasps, to evaluate the influence of differences in sampling probabilities between items on collection speed. Based on the model of a complete multinomial process over time, we define the distribution, distribution function, expectation and variance of the number of hosts parasitized after a given time, as well as the inverse problem, estimating the sampling effort. We develop the relationship between the risk distribution on the set of hosts and the speed of parasitization and propose a more elegant proof of the weak stochastic dominance among speeds of parasitization, using the concept of Schur convexity and the 'Robin Hood transfer' numerical operation. Numerical examples are provided and a conjecture about strong dominance, an ordering characteristic of random variables, is proposed. The speed at which new items are discovered is a function of the entire shape of the sampling probability distribution. The sole comparison of values of variances is not sufficient to compare speeds associated with different distributions, as generally assumed in ecological studies. © 2017 The Author(s).
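The paper's central point, that collection speed depends on the entire shape of the sampling distribution, follows directly from the multinomial expectation E[hosts hit after t draws] = Σᵢ (1 − (1 − pᵢ)ᵗ). A small sketch (the host counts and risk values below are invented for illustration):

```python
import numpy as np

def expected_hit(p, t):
    """Expected number of distinct hosts parasitized after t draws with
    replacement, where host i is drawn with probability p[i]."""
    p = np.asarray(p, dtype=float)
    return np.sum(1.0 - (1.0 - p) ** t)

n, t = 20, 40
uniform = np.full(n, 1 / n)
skewed = np.array([0.30] * 2 + [0.40 / 18] * 18)   # two high-risk hosts

e_u = expected_hit(uniform, t)
e_s = expected_hit(skewed, t)
print(e_u, e_s)   # heterogeneous risk slows discovery: e_s < e_u
```

Both risk vectors sum to one, yet the skewed one discovers fewer distinct hosts in the same number of draws, the Schur-convexity ordering discussed in the paper: a 'Robin Hood transfer' toward equal probabilities can only speed up collection.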
Development of Clinical Prediction Models for Surgery and Complications in Crohn's Disease.
Guizzetti, Leonardo; Zou, Guangyong; Khanna, Reena; Dulai, Parambir S; Sandborn, William J; Jairath, Vipul; Feagan, Brian G
2017-09-19
Crohn's disease-related complications account for a substantial proportion of inflammatory bowel disease-associated healthcare expenditure. Identifying patients at risk for complications may allow for targeted use of early therapeutic interventions to offset this natural course. We aimed to develop risk prediction models for Crohn's disease-related surgery and complications using data from the Randomized Evaluation of an Algorithm for Crohn's Disease cluster-randomized clinical Trial (REACT), which involved 1898 patients from 40 community practices. Separate prediction models were derived and internally validated for predicting Crohn's disease-related surgery and disease-related complications (defined as the first disease-related surgery, hospitalization or complication within 24 months). Model performance was assessed in terms of discrimination and calibration, decision curves and net benefit analyses. There were 130 (6.8%) disease-related surgeries and 504 (26.6%) complications during the 24-month follow-up period. Selected baseline predictors of surgery included age, gender, disease location, Harvey-Bradshaw score, stool frequency, antimetabolite or 5-aminosalicylate use, and the presence of a fistula, abscess or abdominal mass. Selected predictors of complications included those same factors for surgery, plus corticosteroid or anti-tumour necrosis factor use, but excluded 5-aminosalicylate use. Discrimination ability, as measured by validated c-statistics, was 0.70 and 0.62 for the surgery and complication models, respectively. Score charts and nomograms were developed to facilitate future risk score calculation. Separate risk prediction models for Crohn's disease-related surgery and complications were developed using clinical trial data involving community gastroenterology practices. These models could be used to guide Crohn's disease management. External validation is warranted.
Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.
2012-01-01
1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient r package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities
Serfling, Robert; Ogola, Gerald
2016-02-10
Among men, prostate cancer (CaP) is the most common newly diagnosed cancer and the second leading cause of death from cancer. A major issue of very large scale is avoiding both over-treatment and under-treatment of CaP cases. The central challenge is deciding clinical significance or insignificance when the CaP biopsy results are positive but only marginally so. A related concern is deciding how to increase the number of biopsy cores for larger prostates. As a foundation for improved choice of the number of cores and improved interpretation of biopsy results, we develop a probability model for the number of positive cores found in a biopsy, given the total number of cores, the volumes of the tumor nodules, and, very importantly, the prostate volume. Three applications are also carried out: guidelines for the number of cores as a function of prostate volume, decision rules for insignificant versus significant CaP using the number of positive cores, and, using prior distributions on total tumor size, Bayesian posterior probabilities for insignificant CaP and the posterior median CaP. The model-based results have generality of application, take prostate volume into account, and provide attractive tradeoffs of specificity versus sensitivity. Copyright © 2015 John Wiley & Sons, Ltd.
Mu, Chun-sun; Zhang, Ping; Kong, Chun-yan; Li, Yang-ning
2015-09-01
To study the application of a Bayes probability model in differentiating yin and yang jaundice syndromes in neonates. Totally 107 jaundiced neonates admitted to hospital within 10 days after birth were assigned to two groups according to syndrome differentiation, 68 in the yang jaundice syndrome group and 39 in the yin jaundice syndrome group. Data collected for neonates were factors related to jaundice before, during, and after birth. Blood routines, liver and renal functions, and myocardial enzymes were tested on the admission day or the next day. A logistic regression model and Bayes discriminating analysis were used to screen factors important for yin and yang jaundice syndrome differentiation. Finally, a Bayes probability model for yin and yang jaundice syndromes was established and assessed. Factors screened as important for differentiation included mother's age, gestational diabetes mellitus (GDM), gestational age, asphyxia, ABO hemolytic disease, red blood cell distribution width (RDW-SD), platelet-large cell ratio (P-LCR), serum direct bilirubin (DBIL), alkaline phosphatase (ALP), and cholinesterase (CHE). Bayes discriminating analysis was performed with SPSS to obtain the Bayes discriminant function coefficients, from which the discriminant functions were established. Yang jaundice syndrome: y1 = -21.701 + 2.589 × mother's age + 1.037 × GDM - 17.175 × asphyxia + 13.876 × gestational age + 6.303 × ABO hemolytic disease + 2.116 × RDW-SD + 0.831 × DBIL + 0.012 × ALP + 1.697 × LCR + 0.001 × CHE. Yin jaundice syndrome: y2 = -33.511 + 2.991 × mother's age + 3.960 × GDM - 12.877 × asphyxia + 11.848 × gestational age + 1.820 × ABO hemolytic disease + 2.231 × RDW-SD + 0.999 × DBIL + 0.023 × ALP + 1.916 × LCR + 0.002 × CHE. The Bayes discriminant function was hypothesis tested, giving Wilks' λ = 0.393 (P = 0.000). So Bayes
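The two discriminant functions reported above can be applied directly: a neonate is assigned to the syndrome with the larger score. The sketch below transcribes the published coefficients; the example feature vector is entirely hypothetical, and the variable units are assumed to match those in the paper.

```python
import numpy as np

# Discriminant coefficients as reported in the abstract, ordered as:
# [intercept, mother's age, GDM, asphyxia, gestational age,
#  ABO hemolytic disease, RDW-SD, DBIL, ALP, LCR, CHE]
YANG = np.array([-21.701, 2.589, 1.037, -17.175, 13.876,
                 6.303, 2.116, 0.831, 0.012, 1.697, 0.001])
YIN = np.array([-33.511, 2.991, 3.960, -12.877, 11.848,
                1.820, 2.231, 0.999, 0.023, 1.916, 0.002])

def classify(features):
    """Assign the syndrome with the larger discriminant score.

    features: the 10 values in the coefficient order above (no intercept).
    """
    x = np.concatenate(([1.0], features))   # prepend the intercept term
    y1, y2 = YANG @ x, YIN @ x
    return ("yang", y1, y2) if y1 > y2 else ("yin", y1, y2)

# entirely hypothetical neonate, for illustration only
label, y1, y2 = classify([28, 0, 0, 39, 0, 45, 8.0, 150, 30, 5000])
print(label, round(y1, 1), round(y2, 1))
```

This is the standard use of Fisher-style classification functions produced by SPSS: one linear score per class, with the maximum score deciding the class.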
Inferring Pairwise Interactions from Biological Data Using Maximum-Entropy Probability Models.
Directory of Open Access Journals (Sweden)
Richard R Stein
2015-07-01
Maximum entropy-based inference methods have been successfully used to infer direct interactions from biological datasets such as gene expression data or sequence ensembles. Here, we review undirected pairwise maximum-entropy probability models for two categories of data types, those with continuous and those with categorical random variables. As a concrete example, we present recently developed inference methods from the field of protein contact prediction and show that a basic set of assumptions leads to similar solution strategies for inferring the model parameters in both variable types. These parameters reflect interactive couplings between observables, which can be used to predict global properties of the biological system. Such methods are applicable to the important problems of protein 3-D structure prediction and association of gene-gene networks, and they enable potential applications to the analysis of gene alteration patterns and to protein design.
An Improved Upper Bound for the Critical Probability of the Frog Model on Homogeneous Trees
Lebensztayn, Élcio; Machado, Fábio P.; Popov, Serguei
2005-04-01
We study the frog model on homogeneous trees, a discrete-time system of simple symmetric random walks described as follows. There are active and inactive particles living on the vertices. Each active particle performs a simple symmetric random walk with a geometrically distributed random lifetime with parameter (1 - p). When an active particle hits an inactive particle, the latter becomes active. We obtain an improved upper bound for the critical parameter for indefinite survival of active particles, in the case of a one-particle-per-vertex initial configuration. The main tool is the construction of a class of branching processes that are dominated by the frog model, followed by analysis of their supercritical behavior. This approach also allows us to present an upper bound for the critical probability in the case of a random initial configuration.
DEFF Research Database (Denmark)
Lühr, Armin; Löck, Steffen; Jakobi, Annika
2017-01-01
PURPOSE: Objectives of this work are (1) to derive a general clinically relevant approach to model tumor control probability (TCP) for spatially variable risk of failure and (2) to demonstrate its applicability by estimating TCP for patients planned for photon and proton irradiation. METHODS: … clinical target volume (CTV), and elective CTV (CTVE). The risk of a local failure in each of these sub-volumes was taken from the literature. RESULTS: Convenient expressions for D50,i and γ50,i were provided for the Poisson and the logistic model. Comparable TCP estimates were obtained for photon … highest failure rate in the low-risk CTVE) and differing substantially between photon and proton irradiation. CONCLUSIONS: The presented method is of practical value for three reasons: It (a) is based on empirical clinical outcome data; (b) can be applied to non-uniform dose prescriptions as well …
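The logistic sub-volume TCP described above can be sketched as follows. The dose-response form TCP_i = 1/(1 + (D50,i/D_i)^(4·γ50,i)) and the assumption that the overall TCP is the product of independent sub-volume TCPs are standard conventions; the sub-volume doses and parameter values below are hypothetical, not those of the paper.

```python
import math

def tcp_logistic(d, d50, gamma50):
    """Logistic dose-response: TCP = 1 / (1 + (d50/d)**(4*gamma50))."""
    return 1.0 / (1.0 + (d50 / d) ** (4.0 * gamma50))

def tcp_total(subvolumes):
    """Overall TCP as the product of independent sub-volume TCPs."""
    p = 1.0
    for dose, d50, gamma50 in subvolumes:
        p *= tcp_logistic(dose, d50, gamma50)
    return p

# Hypothetical GTV / CTV / CTVE sub-volumes as (dose, D50_i, gamma50_i):
subvolumes = [(70.0, 55.0, 2.0), (60.0, 45.0, 1.5), (54.0, 35.0, 1.0)]
```

By construction, tcp_logistic equals 0.5 at d = d50, and gamma50 sets the normalized slope of the dose-response curve at that point.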
Modeling time-varying exposure using inverse probability of treatment weights.
Grafféo, Nathalie; Latouche, Aurélien; Geskus, Ronald B; Chevret, Sylvie
2017-12-27
For estimating the causal effect of treatment exposure on the occurrence of adverse events, inverse probability weights (IPW) can be used in marginal structural models to correct for time-dependent confounding. The R package ipw allows IPW estimation by modeling the relationship between the exposure and confounders via several regression models, among which is the Cox model. For right-censored data and time-dependent exposures such as treatment switches, the ipw package allows a single switch, assuming that patients are treated once and for all. To accommodate multiple switches, however, we extend this package by implementing a function that allows for multiple and intermittent exposure statuses in the estimation of IPW using a survival model. This extension allows the whole treatment exposure trajectory to be used in the estimation of IPW. The impact of the estimated weights on the estimated causal effect, with both methods, is assessed in a simulation study. The function is then illustrated on a real dataset from a nationwide prospective observational cohort of patients with inflammatory bowel disease. In this study, patients received one or multiple medications (thiopurines, methotrexate, and anti-TNF) over time. We used a Cox marginal structural model to assess the effect of thiopurine exposure on the cause-specific hazard for cancer incidence, considering other treatments as confounding factors. To this end, we used our extended function, which is available online in the Supporting Information. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
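Once numerator and denominator propensity models have been fitted, the stabilized weight per subject is the product over visits of the probabilities of the exposure actually received. A minimal Python sketch with hypothetical propensities (in practice these come from the regression models of exposure on history and confounders discussed above):

```python
import numpy as np

# Sketch of stabilized inverse probability of treatment weights for a
# time-varying, possibly intermittent, binary exposure. Propensity scores
# here are made-up numbers standing in for fitted model predictions.

def stabilized_weights(exposure, p_marginal, p_conditional):
    """
    exposure:      (n_subjects, n_times) array of 0/1 exposure status
    p_marginal:    P(A_t = 1 | exposure history)              -- numerator
    p_conditional: P(A_t = 1 | exposure history, confounders) -- denominator
    Returns one stabilized weight per subject: the product over time of
    numerator/denominator probabilities of the exposure actually received.
    """
    num = np.where(exposure == 1, p_marginal, 1.0 - p_marginal)
    den = np.where(exposure == 1, p_conditional, 1.0 - p_conditional)
    return np.prod(num / den, axis=1)

a = np.array([[0, 1, 1], [0, 0, 1]])               # two subjects, three visits
pm = np.full_like(a, 0.4, dtype=float)             # marginal propensities
pc = np.array([[0.3, 0.6, 0.7], [0.5, 0.2, 0.5]])  # confounder-adjusted
w = stabilized_weights(a, pm, pc)
```

The weights would then enter a weighted outcome model (e.g. a weighted Cox fit) to estimate the marginal structural model parameters.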
Modeling Portfolio Optimization Problem by Probability-Credibility Equilibrium Risk Criterion
Directory of Open Access Journals (Sweden)
Ye Wang
2016-01-01
This paper studies the portfolio selection problem in hybrid uncertain decision systems. First, the return rates are characterized by random fuzzy variables. The objective is to maximize the total expected return rate. For a random fuzzy variable, this paper defines a new equilibrium risk value (ERV) with credibility level β and probability level α. As a result, our portfolio problem is built as a new random fuzzy expected value (EV) model subject to an ERV constraint, which is referred to as the EV-ERV model. Under mild assumptions, the proposed EV-ERV model is a convex programming problem. Furthermore, when the possibility distributions are triangular, trapezoidal, or normal, the EV-ERV model can be transformed into equivalent deterministic convex programming models, which can be solved by general-purpose optimization software. To demonstrate the effectiveness of the proposed equilibrium optimization method, some numerical experiments are conducted. The computational results and a comparison study demonstrate that the developed equilibrium optimization method is effective for modeling portfolio selection optimization problems with twofold uncertain return rates.
He, Xin; Koch, Julian; Sonnenborg, Torben O.; Jørgensen, Flemming; Schamper, Cyril; Christian Refsgaard, Jens
2014-04-01
Geological heterogeneity is a very important factor to consider when developing geological models for hydrological purposes. Using statistically based stochastic geological simulations, the spatial heterogeneity in such models can be accounted for. However, various types of uncertainties are associated with both the geostatistical method and the observation data. In the present study, TProGS is used as the geostatistical modeling tool to simulate structural heterogeneity for glacial deposits in a headwater catchment in Denmark. The focus is on how the observation data uncertainty can be incorporated in the stochastic simulation process. The study uses two types of observation data: borehole data and airborne geophysical data. It is commonly acknowledged that the density of borehole data is usually too sparse to characterize the horizontal heterogeneity. The use of geophysical data gives an unprecedented opportunity to obtain high-resolution information and thus to identify geostatistical properties more accurately, especially in the horizontal direction. However, since such data are not a direct measurement of the lithology, larger uncertainty of point estimates can be expected compared to the use of borehole data. We have proposed a histogram probability matching method in order to link the information on resistivity to hydrofacies while considering the data uncertainty at the same time. Transition probabilities and Markov chain models are established using the transformed geophysical data. It is shown that such transformation is in fact practical; however, the cutoff value for dividing the resistivity data into facies is difficult to determine. The simulated geological realizations indicate significant differences of spatial structure depending on the type of conditioning data selected. It is to our knowledge the first time that grid-to-grid airborne geophysical data, including the data uncertainty, are used in conditional geostatistical simulations in TProGS.
Model Based Adaptive Piecewise Linear Controller for Complicated Control Systems
Directory of Open Access Journals (Sweden)
Tain-Sou Tsay
2014-01-01
A model based adaptive piecewise linear control scheme for industry processes with specifications on peak overshoots and rise times is proposed. It is a gain-stabilized control technique. A large gain is used for large tracking error to get fast response; a small gain is used between large and small tracking error for good performance; and a large gain is used again for small tracking error to cope with large disturbance. Parameters of the three-segment piecewise linear controller are found by an automatic regulating time series, which is a function of the output characteristics of the plant and the reference model. The time series converges to steady values once the time response of the considered system matches that of the reference model. The proposed control scheme is applied to four numerical examples that have been compensated by PID controllers, whose parameters are found by an optimization method. It gives an almost command-independent response and significant improvements in response time and performance.
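The three-segment gain schedule can be sketched directly. The breakpoints and gain values below are hypothetical placeholders for the values that the automatic regulating time series would converge to against the reference model:

```python
# Three-segment piecewise linear gain on the tracking error: large gain for
# large error (fast response), small gain for intermediate error (good
# performance), large gain again near zero error (disturbance rejection).
# Breakpoints e1, e2 and gains are illustrative, not tuned values.

def piecewise_gain(error, e1=0.1, e2=1.0, k_small=8.0, k_mid=2.0, k_large=8.0):
    e = abs(error)
    if e >= e2:      # large tracking error: fast response
        return k_large
    if e >= e1:      # intermediate error: good performance
        return k_mid
    return k_small   # small error: cope with large disturbance

def control(error):
    """Control action: scheduled gain times tracking error."""
    return piecewise_gain(error) * error
```

In the adaptive scheme described above, the three gains would be updated iteratively until the closed-loop response matches the reference model, rather than fixed as here.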
Directory of Open Access Journals (Sweden)
Douglas C. Tozer
2016-12-01
Marsh birds are notoriously elusive, with variation in detection probability across species, regions, seasons, and different times of day and weather. Therefore, it is important to develop regional field survey protocols that maximize detections, but that also produce data for estimating and analytically adjusting for remaining differences in detections. We aimed to improve regional field survey protocols by estimating detection probability of eight elusive marsh bird species throughout two regions that have ongoing marsh bird monitoring programs: the southern Canadian Prairies (Prairie region) and the southern portion of the Great Lakes basin and parts of southern Québec (Great Lakes-St. Lawrence region). We accomplished our goal using generalized binomial N-mixture models and data from ~22,300 marsh bird surveys conducted between 2008 and 2014 by Bird Studies Canada's Prairie, Great Lakes, and Québec Marsh Monitoring Programs. Across all species, on average, detection probability was highest in the Great Lakes-St. Lawrence region from the beginning of May until mid-June, and then fell throughout the remainder of the season until the end of June; was lowest in the Prairie region in mid-May and then increased throughout the remainder of the season until the end of June; was highest during darkness compared with light; and did not vary significantly according to temperature (range: 0-30°C), cloud cover (0%-100%), or wind (0-20 kph), or during morning versus evening. We used our results to formulate improved marsh bird survey protocols for each region. Our analysis and recommendations are useful and contribute to conservation of wetland birds at various scales, from local single-species studies to the continental North American Marsh Bird Monitoring Program.
Directory of Open Access Journals (Sweden)
Damiano Monelli
2010-11-01
We present here two self-consistent implementations of a short-term earthquake probability (STEP) model that produces daily seismicity forecasts for the area of the Italian national seismic network. Both implementations combine a time-varying and a time-invariant contribution, for which we assume that the instrumental Italian earthquake catalog provides the best information. For the time-invariant contribution, the catalog is declustered using the clustering technique of the STEP model; the smoothed seismicity model is generated from the declustered catalog. The time-varying contribution is what distinguishes the two implementations: (1) for one implementation (STEP-LG), the original model parameterization and estimation is used; (2) for the other (STEP-NG), the mean abundance method is used to estimate aftershock productivity. In the STEP-NG implementation, earthquakes with magnitude up to ML = 6.2 are expected to be less productive compared to the STEP-LG implementation, whereas larger earthquakes are expected to be more productive. We have retrospectively tested the performance of these two implementations and applied likelihood tests to evaluate their consistency with observed earthquakes. Both implementations were consistent with the observed earthquake data in space, and STEP-NG performed better than STEP-LG in terms of forecast rates. More generally, we found that testing earthquake forecasts issued at regular intervals does not test the full power of clustering models, and future experiments should allow for more frequent forecasts starting at the times of triggering events.
Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel
2011-12-01
This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km3 that contains low-transmissivity fractures. Fracture density, orientation, and fracture transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on transition probability and Markov chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as calibration targets. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models for simulating fluid flow in fractured geological media.
Directory of Open Access Journals (Sweden)
Akshay Dongre
2014-12-01
In the present paper, flames imitating Moderate and Intense Low Oxygen Dilution (MILD) combustion are studied using the Probability Density Function (PDF) modeling approach. Two burners that imitate MILD combustion are considered for the current study: one is the Adelaide Jet-in-Hot-Coflow (JHC) burner and the other is the Delft Jet-in-Hot-Coflow (DJHC) burner. 2D RANS simulations have been carried out using the Multi-environment Eulerian Probability Density Function (MEPDF) approach along with the Interaction-by-Exchange-with-Mean (IEM) micro-mixing model. A quantitative comparison is made to assess the accuracy and predictive capability of the MEPDF model in the MILD combustion regime. The computations are performed for two different jet speeds corresponding to Reynolds numbers of Re = 4100 and Re = 8800 for the DJHC burner, while Re = 10000 is considered for the Adelaide burner. In the case of the DJHC burner, for Re = 4100, it has been observed that the mean axial velocity profiles and the turbulent kinetic energy profiles are in good agreement with the experimental database, while the temperature profiles are slightly over-predicted in the downstream region. For the higher Reynolds number case (Re = 8800), the accuracy of the predictions is found to be reduced. In the case of the Adelaide burner, the computed profiles of temperature and the mass fractions of major species (CH4, H2, N2, O2) are found to be in excellent agreement with the measurements, while discrepancies are observed in the mass fraction profiles of CO2 and H2O. In addition, effects of differential diffusion are observed due to the presence of H2 in the fuel mixture.
Cheng, Rongjun; Ge, Hongxia; Wang, Jufeng
2017-08-01
Because the maximum velocities and safe headway distances of different vehicles are not exactly the same, an extended macro model of traffic flow that considers multiple optimal velocity functions with probabilities is proposed in this paper. By means of linear stability theory, the new model's linear stability condition with multiple-probability optimal velocities is obtained. The KdV-Burgers equation is derived through nonlinear analysis to describe the propagating behavior of the traffic density wave near the neutral stability line. Numerical simulations of the influences of multiple maximum velocities and multiple safety distances on the model's stability and traffic capacity are carried out. The cases of two different maximum speeds with the same safe headway distance, two different safe headway distances with the same maximum speed, and two different maximum velocities with two different time gaps are all explored. The first cases demonstrate that when the proportion of vehicles with a larger vmax increases, the traffic tends to be unstable, which means that sudden acceleration and braking are not conducive to traffic stability and more easily result in stop-and-go phenomena. The second cases show that when the proportion of vehicles with greater safety spacing increases, the traffic tends to be unstable, which means that overly cautious assumptions or weak driving skill are not conducive to traffic stability. The last cases indicate that an increase of the maximum speed is not conducive to traffic stability, while a reduction of the safe headway distance is conducive to traffic stability. Numerical simulation shows that mixed driving and traffic diversion have no effect on the traffic capacity when traffic density is low or heavy. Numerical results also show that mixed driving should be chosen to increase the traffic capacity when the traffic density is lower, while traffic diversion should be chosen to increase the traffic capacity when the traffic density is higher.
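A common concrete choice for an optimal velocity (OV) function is the tanh form, and the multi-class extension weights one OV function per vehicle class by its probability. The tanh form and all parameter values below are illustrative assumptions, not the paper's calibration:

```python
import math

# Tanh-shaped optimal velocity function (zero velocity at zero headway) and
# a probability-weighted mixture over vehicle classes with different maximum
# speeds v_max and safe headways h_safe. All numbers are illustrative.

def optimal_velocity(headway, v_max, h_safe):
    """V(h) = (v_max/2) * (tanh(h - h_safe) + tanh(h_safe)); V(0) = 0."""
    return 0.5 * v_max * (math.tanh(headway - h_safe) + math.tanh(h_safe))

def mixed_optimal_velocity(headway, classes):
    """classes: list of (probability, v_max, h_safe); probabilities sum to 1."""
    return sum(p * optimal_velocity(headway, v, h) for p, v, h in classes)

# e.g. 70% cars with higher v_max, 30% trucks with larger safe headway
classes = [(0.7, 30.0, 4.0), (0.3, 25.0, 5.0)]
```

The mixture's slope at the steady-state headway is what enters the linear stability condition, so shifting probability mass toward the faster or more cautious class shifts the neutral stability line, consistent with the simulation findings above.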
Directory of Open Access Journals (Sweden)
Meredith L McClure
Wild pigs (Sus scrofa), also known as wild swine, feral pigs, or feral hogs, are one of the most widespread and successful invasive species around the world. Wild pigs have been linked to extensive and costly agricultural damage and present a serious threat to plant and animal communities due to their rooting behavior and omnivorous diet. We modeled the current distribution of wild pigs in the United States to better understand the physiological and ecological factors that may determine their invasive potential and to guide future study and eradication efforts. Using national-scale wild pig occurrence data reported between 1982 and 2012 by wildlife management professionals, we estimated the probability of wild pig occurrence across the United States using a logistic discrimination function and environmental covariates hypothesized to influence the distribution of the species. Our results suggest the distribution of wild pigs in the U.S. was most strongly limited by cold temperatures and availability of water, and that they were most likely to occur where potential home ranges had higher habitat heterogeneity, providing access to multiple key resources including water, forage, and cover. High probability of occurrence was also associated with frequent high temperatures, up to a high threshold. However, this pattern is driven by pigs' historic distribution in warm climates of the southern U.S. Further study of pigs' ability to persist in cold northern climates is needed to better understand whether low temperatures actually limit their distribution. Our model highlights areas at risk of invasion as those with habitat conditions similar to those found in pigs' current range that are also near current populations. This study provides a macro-scale approach to generalist species distribution modeling that is applicable to other generalist and invasive species.
Lura, Derek; Wernke, Matthew; Alqasemi, Redwan; Carey, Stephanie; Dubey, Rajiv
2012-01-01
This paper presents the probability density based gradient projection (GP) of the null space of the Jacobian for a 25 degree-of-freedom bilateral robotic human body model (RHBM). This method was used to predict the inverse kinematics of the RHBM and to maximize the similarity between predicted inverse kinematic poses and recorded data of 10 subjects performing activities of daily living. The density function was created for discrete increments of the workspace. The number of increments in each direction (x, y, and z) was varied from 1 to 20. Performance of the method was evaluated by finding the root mean squared (RMS) difference between the predicted joint angles and the joint angles recorded from motion capture. The amount of data included in the creation of the probability density function was varied from 1 to 10 subjects, creating sets of subjects included in and excluded from the density function. The performance of the GP method for subjects included in and excluded from the density function was evaluated to test the robustness of the method. Accuracy of the GP method varied with the incremental division of the workspace: increasing the number of increments decreased the RMS error of the method, with the average RMS error for included subjects ranging from 7.7° to 3.7°. However, increasing the number of increments also decreased the robustness of the method.
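Gradient projection into the null space of the Jacobian has the standard form q̇ = J⁺ẋ + (I − J⁺J)∇H, where in this work a probability-density-based score supplies the objective gradient ∇H. A minimal numeric sketch with a toy Jacobian and gradient (not the 25-DoF RHBM):

```python
import numpy as np

# Redundancy resolution by gradient projection: the pseudoinverse term
# achieves the task-space velocity, and the null-space projector adds
# self-motion along an objective gradient without disturbing the task.

def gp_joint_velocity(J, x_dot, grad_H):
    J_pinv = np.linalg.pinv(J)
    n = J.shape[1]
    null_proj = np.eye(n) - J_pinv @ J        # null-space projector of J
    return J_pinv @ x_dot + null_proj @ grad_H

J = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])               # toy: 2 task dims, 3 joints
x_dot = np.array([0.2, -0.1])                 # desired task-space velocity
grad_H = np.array([0.0, 0.0, 1.0])            # toy objective gradient
q_dot = gp_joint_velocity(J, x_dot, grad_H)
```

Because J(I − J⁺J) = 0, the projected gradient term changes the joint-space motion (here, biasing it toward high-probability poses) while J q̇ still equals the commanded ẋ exactly.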
LaBudde, Robert A.; Harnly, James M.
2013-01-01
A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive non-target (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given. PMID:22468371
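The basic observed statistic is simply the proportion of replicates identified, and a confidence interval on POI can be attached with any binomial interval. The sketch below uses the Wilson score interval as one common choice; the report's exact interval method may differ.

```python
import math

# Probability of identification (POI): proportion of binary (1 = Identified)
# replicates, with a Wilson score confidence interval. The Wilson interval
# is an assumption here, chosen for good small-sample behavior.

def poi_with_ci(identified, n, z=1.96):
    p = identified / n
    denom = 1.0 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, center - half, center + half

# e.g. 9 of 12 replicates of a target material identified
p, lower, upper = poi_with_ci(9, 12)
```

Plotting POI (with its interval) against analyte concentration or matrix level gives the response curves the report describes, directly analogous to probability-of-detection curves in quantitative validation.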
Compound nucleus formation probability PCN defined within the dynamical cluster-decay model
Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.
2015-01-01
Within the dynamical cluster-decay model (DCM), the compound nucleus fusion/formation probability PCN is defined for the first time, and its variation with CN excitation energy E* and fissility parameter χ is studied. In the DCM, the (total) fusion cross section σfusion is the sum of the compound nucleus (CN) and noncompound nucleus (nCN) decay processes, each calculated as a dynamical fragmentation process. The CN cross section σCN is constituted of the evaporation residues (ER) and fusion-fission (ff), including the intermediate mass fragments (IMFs), each calculated for all contributing decay fragments (A1, A2) in terms of their formation and barrier penetration probabilities P0 and P. The nCN cross section σnCN is determined as the quasi-fission (qf) process, where P0 = 1 and P is calculated for the entrance-channel nuclei. The calculations are presented for six different target-projectile combinations of CN mass A ~ 100 to superheavy, at various center-of-mass energies, with effects of deformations and orientations of the nuclei included. An interesting result is that PCN = 1 for complete fusion, but PCN < 1 or ≪ 1 due to the nCN contribution, depending strongly on both E* and χ.
Flint, Alexander C; Rao, Vivek A; Chan, Sheila L; Cullen, Sean P; Faigeles, Bonnie S; Smith, Wade S; Bath, Philip M; Wahlgren, Nils; Ahmed, Niaz; Donnan, Geoff A; Johnston, S Claiborne
2015-08-01
The Totaled Health Risks in Vascular Events (THRIVE) score is a previously validated ischemic stroke outcome prediction tool. Although simplified scoring systems like the THRIVE score facilitate ease of use, when computers or devices are available at the point of care, a more accurate and patient-specific estimation of outcome probability should be possible by computing the logistic equation with patient-specific continuous variables. We used data from 12 207 subjects from the Virtual International Stroke Trials Archive and the Safe Implementation of Thrombolysis in Stroke - Monitoring Study to develop and validate the performance of a model-derived estimation of outcome probability, the THRIVE-c calculation. Models were built with logistic regression using the underlying predictors from the THRIVE score: age, National Institutes of Health Stroke Scale score, and the Chronic Disease Scale (presence of hypertension, diabetes mellitus, or atrial fibrillation). Receiver operator characteristic analysis was used to assess model performance and compare the THRIVE-c model to the traditional THRIVE score, using a two-tailed Chi-squared test. The THRIVE-c model performed similarly in the randomly chosen development cohort (n = 6194, area under the curve = 0·786, 95% confidence interval 0·774-0·798) and validation cohort (n = 6013, area under the curve = 0·784, 95% confidence interval 0·772-0·796) (P = 0·79). Similar performance was also seen in two separate external validation cohorts. The THRIVE-c model (area under the curve = 0·785, 95% confidence interval 0·777-0·793) had superior performance when compared with the traditional THRIVE score (area under the curve = 0·746, 95% confidence interval 0·737-0·755). By computing the logistic equation with patient-specific continuous variables in the THRIVE-c calculation, outcomes at the individual patient level are more accurately estimated. Given the widespread availability of
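The point of the THRIVE-c calculation is that a logistic equation over continuous predictors replaces the integer score. A sketch of that calculation's shape follows; the coefficients are hypothetical placeholders, NOT the published THRIVE-c fit:

```python
import math

# THRIVE-c-style calculation: logistic model on continuous age and NIHSS
# plus the chronic disease variables. All coefficients below are invented
# placeholders to illustrate the computation, not the published model.

def outcome_probability(age, nihss, htn, dm, afib,
                        b0=4.0, b_age=-0.04, b_nihss=-0.15,
                        b_htn=-0.3, b_dm=-0.3, b_afib=-0.3):
    """P(good outcome) = 1 / (1 + exp(-linear_predictor))."""
    lp = (b0 + b_age * age + b_nihss * nihss
          + b_htn * htn + b_dm * dm + b_afib * afib)
    return 1.0 / (1.0 + math.exp(-lp))

p = outcome_probability(age=72, nihss=14, htn=1, dm=0, afib=1)
```

Unlike a banded score, this yields a distinct probability for, say, age 72 versus age 74, which is the source of the improved discrimination reported above.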
Energy Technology Data Exchange (ETDEWEB)
Kukla, G.; Gavin, J. [Columbia Univ., Palisades, NY (United States). Lamont-Doherty Geological Observatory
1994-05-01
This report was prepared at the Lamont-Doherty Geological Observatory of Columbia University at Palisades, New York, under subcontract to Pacific Northwest Laboratory. It is part of a larger project of global climate studies which supports site characterization work required for the selection of a potential high-level nuclear waste repository, and forms part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work under the PASS Program is currently focusing on the proposed site at Yucca Mountain, Nevada, and is under the overall direction of the Yucca Mountain Project Office, US Department of Energy, Las Vegas, Nevada. The final results of the PNL project will provide input to global atmospheric models designed to test specific climate scenarios, which will be used in the site-specific modeling work of others. The primary purpose of the data bases compiled and of the astronomic predictive models is to aid in the estimation of the probabilities of future climate states. The results will be used by two other teams working on the global climate study under contract to PNL, located at the University of Maine in Orono, Maine, and at the Applied Research Corporation in College Station, Texas. This report presents the results of the third year's work on the global climate change models and the data bases describing past climates.
Derivation of the Tumour Control Probability (TCP) from a Cell Cycle Model
Directory of Open Access Journals (Sweden)
A. Dawson
2006-01-01
In this paper, a model for the radiation treatment of cancer which includes the effects of the cell cycle is derived from first principles. A malignant cell population is divided into two compartments based on radiation sensitivities. The active compartment includes the four phases of the cell cycle, while the quiescent compartment consists of the G0 state. Analysis of this active-quiescent radiation model confirms the classical interpretation of the linear quadratic (LQ) model, which is that a larger α/β ratio corresponds to a fast cell cycle, while a smaller ratio corresponds to a slow cell cycle. Additionally, we find that a large α/β ratio indicates the existence of a significant quiescent phase. The active-quiescent model is extended as a nonlinear birth-death process in order to derive an explicit time-dependent expression for the tumour control probability (TCP). This work extends the TCP formula from Zaider and Minerbo, and it enables the TCP to be calculated for general time-dependent treatment schedules.
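The classical Poisson-LQ backbone that the active-quiescent model generalizes can be written in a few lines. The parameter values below are illustrative only; the paper's TCP replaces this static surviving fraction with a time-dependent birth-death expression:

```python
import math

# Classical LQ / Poisson TCP: surviving fraction per fraction scheme
# S = exp(-n*(alpha*d + beta*d^2)), then TCP = exp(-N0 * S). Parameter
# values are illustrative (alpha/beta = 10 Gy, a fast-cycling assumption).

def surviving_fraction(dose_per_fx, n_fx, alpha, beta):
    return math.exp(-n_fx * (alpha * dose_per_fx + beta * dose_per_fx**2))

def tcp_poisson(n0, dose_per_fx, n_fx, alpha, beta):
    """Probability that zero clonogens survive, under Poisson statistics."""
    return math.exp(-n0 * surviving_fraction(dose_per_fx, n_fx, alpha, beta))

p = tcp_poisson(n0=1e7, dose_per_fx=2.0, n_fx=35, alpha=0.35, beta=0.035)
```

In the Zaider-Minerbo extension and the active-quiescent model above, cell birth, death, and cycling between compartments make the surviving population, and hence the TCP, explicitly time dependent.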
A scan statistic for continuous data based on the normal probability model
Directory of Open Access Journals (Sweden)
Huang Lan
2009-10-01
Temporal, spatial, and space-time scan statistics are commonly used to detect and evaluate the statistical significance of temporal and/or geographical disease clusters, without any prior assumptions on the location, time period, or size of those clusters. Scan statistics are mostly used for count data, such as disease incidence or mortality. Sometimes there is an interest in looking for clusters with respect to a continuous variable, such as lead levels in children or low birth weight. For such continuous data, we present a scan statistic where the likelihood is calculated using the normal probability model. It may also be used for other distributions, while still maintaining the correct alpha level. In an application of the new method, we look for geographical clusters of low birth weight in New York City.
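For a single candidate window, the normal-model scan statistic compares the likelihood of separate inside/outside means against a common mean. A simplified one-dimensional sketch follows; real scan statistics search over locations and sizes and assess significance by Monte Carlo, both of which are reduced here to a brute-force window search:

```python
import math

# Simplified normal-model scan statistic in one dimension: for a window
# [start, end), the log-likelihood ratio compares separate inside/outside
# means (variance pooled under H1) against one common mean (H0 variance).

def normal_scan_llr(values, start, end):
    inside = values[start:end]
    outside = values[:start] + values[end:]
    n, n_in, n_out = len(values), len(inside), len(outside)
    mean_all = sum(values) / n
    mean_in = sum(inside) / n_in
    mean_out = sum(outside) / n_out
    var0 = sum((v - mean_all) ** 2 for v in values) / n
    var1 = (sum((v - mean_in) ** 2 for v in inside)
            + sum((v - mean_out) ** 2 for v in outside)) / n
    return 0.5 * n * math.log(var0 / var1)

def best_window(values, min_size=2):
    """Brute-force scan over all proper windows; returns the best (start, end)."""
    windows = [(s, e) for s in range(len(values))
               for e in range(s + min_size, len(values) + 1)
               if e - s < len(values)]
    return max(windows, key=lambda w: normal_scan_llr(values, *w))

values = [5.0, 5.1, 4.9, 2.0, 2.1, 1.9, 5.0, 5.2]  # low cluster at indices 3..5
```

In the full method, the maximum LLR over all windows is the scan statistic, and its null distribution (and hence the correct alpha level) is obtained by repeating the scan on permuted or simulated data.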
Yan, Binjun; Bai, Xue; Sheng, Yunjie; Li, Fanzhu
2017-09-01
On account of the complicated compositions of products like traditional Chinese medicines (TCMs) and functional foods, it is common practice to determine different sets of analytes in the same product for different purposes. To efficiently develop the corresponding HPLC methods, a statistical model based analytical method adjustment (SMB-AMA) strategy was proposed. In this strategy, HPLC data acquired with design-of-experiments methodology were efficiently utilised to build retention models, with multivariate statistical modelling methods, for all the analytes and interferences shown in the chromatograms. According to the set of analytes under research, Monte-Carlo simulations were conducted based on these retention models to estimate the probability of achieving adequate separations between all the analytes and their interferences. The analytical parameters were then mathematically optimised to a point giving a high value of this probability, to compose a robust HPLC method. Radix Angelica Sinensis (RAS) and its TCM formula with Folium Epimedii (FE) were taken as the complicated samples for case studies. The retention models for the compounds in RAS and FE were built independently, with correlation coefficients all above 0.9799. The analytical parameters were tactfully adjusted to adapt to six cases of different sets of analytes and different sample matrices. In the validation experiments using the adjusted analytical parameters, satisfactory separations were acquired. The results demonstrated that the SMB-AMA strategy was able to develop HPLC methods rationally and rapidly in adaptation to different sets of analytes. Copyright © 2017 John Wiley & Sons, Ltd.
Zebrafish as a Model for the Study of Microvascular Complications of Diabetes and Their Mechanisms
Directory of Open Access Journals (Sweden)
Karl Heckler
2017-09-01
Diabetes mellitus (DM) is a crucial metabolic disease that leads to severe disorders. These include macrovascular complications such as myocardial infarction, stroke, and peripheral artery disease, and microvascular complications including diabetic nephropathy, neuropathy, and retinopathy. Diabetes mellitus, along with its associated organ pathologies, is one of the key problems in today's medicine. Zebrafish is an upcoming disease model organism in diabetes research. Its glucose metabolism and the pathways of reactive metabolite formation are very similar to those of humans. Moreover, several physiological and pathophysiological pathways that also exist in humans and other mammals have been identified in this species or are currently under intense investigation. Zebrafish offer sophisticated imaging techniques and allow simple and fast genetic and pharmacological approaches with high throughput. In this review, we highlight achievements and mechanisms concerning microvascular complications discovered in zebrafish, and we discuss the advantages and disadvantages of zebrafish as a model for studying diabetic complications.
A generative probability model of joint label fusion for multi-atlas based brain segmentation.
Wu, Guorong; Wang, Qian; Zhang, Daoqiang; Nie, Feiping; Huang, Heng; Shen, Dinggang
2014-08-01
Automated labeling of anatomical structures in medical images is very important in many neuroscience studies. Recently, patch-based labeling has been widely investigated to alleviate the possible mis-alignment when registering atlases to the target image. However, the weights used for label fusion from the registered atlases are generally computed independently and thus lack the capability of preventing the ambiguous atlas patches from contributing to the label fusion. More critically, these weights are often calculated based only on the simple patch similarity, thus not necessarily providing optimal solution for label fusion. To address these limitations, we propose a generative probability model to describe the procedure of label fusion in a multi-atlas scenario, for the goal of labeling each point in the target image by the best representative atlas patches that also have the largest labeling unanimity in labeling the underlying point correctly. Specifically, sparsity constraint is imposed upon label fusion weights, in order to select a small number of atlas patches that best represent the underlying target patch, thus reducing the risks of including the misleading atlas patches. The labeling unanimity among atlas patches is achieved by exploring their dependencies, where we model these dependencies as the joint probability of each pair of atlas patches in correctly predicting the labels, by analyzing the correlation of their morphological error patterns and also the labeling consensus among atlases. The patch dependencies will be further recursively updated based on the latest labeling results to correct the possible labeling errors, which falls to the Expectation Maximization (EM) framework. To demonstrate the labeling performance, we have comprehensively evaluated our patch-based labeling method on the whole brain parcellation and hippocampus segmentation. Promising labeling results have been achieved with comparison to the conventional patch-based labeling
A Model of Median Auroral Electron Flux Deduced from Hardy 2008 Model Probability Density Maps
2013-11-01
Dr. Cassandra G. Fesen Edward J. Masterson, Colonel, USAF Program Manager, AFRL...since 1985. The upgraded Hardy model, referred here as H-08, turns out to be a large database and lacks functional representation. The purpose of...CLASSIFICATION OF: 17. LIMITATION OF ABSTRACT 18. NUMBER OF PAGES 19a. NAME OF RESPONSIBLE PERSON Dr. Cassandra G. Fesen a. REPORT Unclassified b. ABSTRACT
Tenkès, Lucille-Marie; Hollerbach, Rainer; Kim, Eun-jin
2017-12-01
A probabilistic description is essential for understanding growth processes in non-stationary states. In this paper, we compute time-dependent probability density functions (PDFs) in order to investigate stochastic logistic and Gompertz models, which are two of the most popular growth models. We consider different types of short-correlated multiplicative and additive noise sources and compare the time-dependent PDFs in the two models, elucidating the effects of the additive and multiplicative noises on the form of PDFs. We demonstrate an interesting transition from a unimodal to a bimodal PDF as the multiplicative noise increases for a fixed value of the additive noise. A much weaker (leaky) attractor in the Gompertz model leads to a significant (singular) growth of the population of a very small size. We point out the limitation of using stationary PDFs, mean value and variance in understanding statistical properties of the growth in non-stationary states, highlighting the importance of time-dependent PDFs. We further compare these two models from the perspective of information change that occurs during the growth process. Specifically, we define an infinitesimal distance at any time by comparing two PDFs at times infinitesimally apart and sum these distances in time. The total distance along the trajectory quantifies the total number of different states that the system undergoes in time, and is called the information length. We show that the time-evolution of the two models become more similar when measured in units of the information length and point out the merit of using the information length in unifying and understanding the dynamic evolution of different growth processes.
Probability state modeling of memory CD8⁺ T-cell differentiation.
Inokuma, Margaret S; Maino, Vernon C; Bagwell, C Bruce
2013-11-29
Flow cytometric analysis enables the simultaneous single-cell interrogation of multiple biomarkers for phenotypic and functional identification of heterogeneous populations. Analysis of polychromatic data has become increasingly complex with more measured parameters. Furthermore, manual gating of multiple populations using standard analysis techniques can lead to errors in data interpretation and difficulties in the standardization of analyses. To characterize high-dimensional cytometric data, we demonstrate the use of probability state modeling (PSM) to visualize the differentiation of effector/memory CD8⁺ T cells. With this model, four major CD8⁺ T-cell subsets can be easily identified using the combination of three markers, CD45RA, CCR7 (CD197), and CD28, with the selection markers CD3, CD4, CD8, and side scatter (SSC). PSM enables the translation of complex multicolor flow cytometric data to pathway-specific cell subtypes, the capability of developing averaged models of healthy donor populations, and the analysis of phenotypic heterogeneity. In this report, we also illustrate the heterogeneity in memory T-cell subpopulations as branched differentiation markers that include CD127, CD62L, CD27, and CD57. © 2013. Published by Elsevier B.V. All rights reserved.
Moment-Based Probability Modeling and Extreme Response Estimation, The FITS Routine Version 1.2
Energy Technology Data Exchange (ETDEWEB)
MANUEL,LANCE; KASHEF,TINA; WINTERSTEIN,STEVEN R.
1999-11-01
This report documents the use of the FITS routine, which provides automated fits of various analytical, commonly used probability models from input data. It is intended to complement the previously distributed FITTING routine documented in RMS Report 14 (Winterstein et al., 1994), which implements relatively complex four-moment distribution models whose parameters are fit with numerical optimization routines. Although these four-moment fits can be quite useful and faithful to the observed data, their complexity can make them difficult to automate within standard fitting algorithms. In contrast, FITS provides more robust (lower moment) fits of simpler, more conventional distribution forms. For each database of interest, the routine estimates the distribution of annual maximum response based on the data values and the duration, T, over which they were recorded. To focus on the upper tails of interest, the user can also supply an arbitrary lower-bound threshold, {chi}{sub low}, above which a shifted distribution model--exponential or Weibull--is fit.
Estimation of Probable Maximum Precipitation in Korea using a Regional Climate Model
Directory of Open Access Journals (Sweden)
Jeonghoon Lee
2017-03-01
Full Text Available Extreme precipitation events have been extensively applied to the design of social infra structures. Thus, a method to more scientifically estimate the extreme event is required. This paper suggests a method to estimate the extreme precipitation in Korea using a regional climate model. First, several historical extreme events are identified and the most extreme event of Typhoon Rusa (2002 is selected. Second, the selected event is reconstructed through the Weather Research and Forecasting (WRF model, one of the Regional Climate Models (RCMs. Third, the reconstructed event is maximized by adjusting initial and boundary conditions. Finally, the Probable Maximum Precipitation (PMP is obtained. The WRF could successfully simulate the observed precipitation in terms of spatial and temporal distribution (R2 = 0.81. The combination of the WRF Single-Moment (WSM 6-class graupel scheme (of microphysics, the Betts-Miller-Janjic scheme (of cumulus parameterization and the Mellor-Yamada-Janjic Turbulent Kinetic Energy (TKE scheme (of planetary boundary layer was determined to be the best combination to reconstruct Typhoon Rusa. The estimated PMP (RCM_PMP was compared with the existing PMP. The RCM_PMP was generally in good agreement with the PMP. The suggested methodology is expected to provide assessments of the existing PMP and to provide a new alternative for estimating PMP.
U.S. Geological Survey, Department of the Interior — The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline...
I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
National Research Council Canada - National Science Library
Bodmer, D; Ligtenberg, M.J.L; Hout, A.H. van der; Gloudemans, S; Ansink, K; Oosterwijk-Wakka, J.C; Hoogerbrugge-van der Linden, N
2006-01-01
To establish an efficient, reliable and easy to apply risk assessment tool to select families with breast and/or ovarian cancer patients for BRCA mutation testing, using available probability models...
National Research Council Canada - National Science Library
Bodmer, D; Ligtenberg, M. J. L; van der Hout, A. H; Gloudemans, S; Ansink, K; Oosterwijk, J. C; Hoogerbrugge, N
2006-01-01
To establish an efficient, reliable and easy to apply risk assessment tool to select families with breast and/or ovarian cancer patients for BRCA mutation testing, using available probability models...
U.S. Geological Survey, Department of the Interior — The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline...
U.S. Geological Survey, Department of the Interior — The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline...
Zhang, Hao H; D'Souza, Warren D; Shi, Leyuan; Meyer, Robert R
2009-08-01
To predict organ-at-risk (OAR) complications as a function of dose-volume (DV) constraint settings without explicit plan computation in a multiplan intensity-modulated radiotherapy (IMRT) framework. Several plans were generated by varying the DV constraints (input features) on the OARs (multiplan framework), and the DV levels achieved by the OARs in the plans (plan properties) were modeled as a function of the imposed DV constraint settings. OAR complications were then predicted for each of the plans by using the imposed DV constraints alone (features) or in combination with modeled DV levels (plan properties) as input to machine learning (ML) algorithms. These ML approaches were used to model two OAR complications after head-and-neck and prostate IMRT: xerostomia, and Grade 2 rectal bleeding. Two-fold cross-validation was used for model verification and mean errors are reported. Errors for modeling the achieved DV values as a function of constraint settings were 0-6%. In the head-and-neck case, the mean absolute prediction error of the saliva flow rate normalized to the pretreatment saliva flow rate was 0.42% with a 95% confidence interval of (0.41-0.43%). In the prostate case, an average prediction accuracy of 97.04% with a 95% confidence interval of (96.67-97.41%) was achieved for Grade 2 rectal bleeding complications. ML can be used for predicting OAR complications during treatment planning allowing for alternative DV constraint settings to be assessed within the planning framework.
Directory of Open Access Journals (Sweden)
A. S. M. Zahid Kausar
2014-01-01
Full Text Available Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part in designing wireless network. In this paper, an accelerated technique of three-dimensional ray tracing is presented, where rough surface scattering is included for making a more accurate ray tracing technique. Here, the rough surface scattering is represented by microfacets, for which it becomes possible to compute the scattering field in all possible directions. New optimization techniques, like dual quadrant skipping (DQS and closest object finder (COF, are implemented for fast characterization of wireless communications and making the ray tracing technique more efficient. In conjunction with the ray tracing technique, probability based coverage optimization algorithm is accumulated with the ray tracing technique to make a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting the unnecessary objects for ray tracing using the DQS technique and by decreasing the ray-object intersection time using the COF technique. On the other hand, the coverage optimization algorithm is based on probability theory, which finds out the minimum number of transmitters and their corresponding positions in order to achieve optimal indoor wireless coverage. Both of the space and time complexities of the proposed algorithm surpass the existing algorithms. For the verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by the experimental results.
Kausar, A S M Zahid; Reza, Ahmed Wasif; Wo, Lau Chun; Ramiah, Harikrishnan
2014-01-01
Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part in designing wireless network. In this paper, an accelerated technique of three-dimensional ray tracing is presented, where rough surface scattering is included for making a more accurate ray tracing technique. Here, the rough surface scattering is represented by microfacets, for which it becomes possible to compute the scattering field in all possible directions. New optimization techniques, like dual quadrant skipping (DQS) and closest object finder (COF), are implemented for fast characterization of wireless communications and making the ray tracing technique more efficient. In conjunction with the ray tracing technique, probability based coverage optimization algorithm is accumulated with the ray tracing technique to make a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting the unnecessary objects for ray tracing using the DQS technique and by decreasing the ray-object intersection time using the COF technique. On the other hand, the coverage optimization algorithm is based on probability theory, which finds out the minimum number of transmitters and their corresponding positions in order to achieve optimal indoor wireless coverage. Both of the space and time complexities of the proposed algorithm surpass the existing algorithms. For the verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by the experimental results.
MEASURING MODEL FOR BAD LOANS IN BANKS. THE DEFAULT PROBABILITY MODEL.
Directory of Open Access Journals (Sweden)
SOCOL ADELA
2010-12-01
Full Text Available The banking sectors of the transition countries have progressed remarkably in the last 20 years. In fact, banking in most transition countries has largely shaken off the traumas of the transition eraAt the start of the 21st century banks in these countries look very much like banks elsewhere. That is, they are by no means problem free but they are struggling with the same issues as banks in other emerging market countries during the financial crises conditions. The institutional environment differs considerably among the countries. The goal we set with this article is to examine in terms of methodology the most important assessment criteria of a measuring model for bad loans.
Directory of Open Access Journals (Sweden)
I.V. Zhalinska
2015-09-01
Full Text Available Diagnostics of enterprise bankruptcy occurrence probability is defined as an important tool ensuring the viability of an organization under conditions of unpredictable dynamic environment. The paper aims to define the basic features of diagnostics of bankruptcy occurrence probability models and their classification. The article grounds the objective increasing of crisis probability in modern enterprises where such increasing leads to the need to improve the efficiency of anti-crisis enterprise activities. The system of anti-crisis management is based on the subsystem of diagnostics of bankruptcy occurrence probability. Such a subsystem is the main one for further measures to prevent and overcome the crisis. The classification of existing models of enterprise bankruptcy occurrence probability has been suggested. The classification is based on methodical and methodological principles of models. The following main groups of models are determined: the models using financial ratios, aggregates and scores, the models of discriminated analysis, the methods of strategic analysis, informal models, artificial intelligence systems and the combination of the models. The classification made it possible to identify the analytical capabilities of each of the groups of models suggested.
Chin, Eu Gene; Ebesutani, Chad; Young, John
2013-01-01
The tripartite model of anxiety and depression has received strong support among child and adolescent populations. Clinical samples of children and adolescents in these studies, however, have usually been referred for treatment of anxiety and depression. This study investigated the fit of the tripartite model with a complicated sample of…
Directory of Open Access Journals (Sweden)
Çiğdem ÖZARİ
2018-01-01
Full Text Available In this study, we have worked on developing a brand-new index called Fuzzy-bankruptcy index. The aim of this index is to find out the default probability of any company X, independent from the sector it belongs. Fuzzy logic is used to state the financial ratiointerruption change related with time and inside different sectors, the new index is created to eliminate the number of the relativity of financial ratios. The four input variables inside the five main input variables used for the fuzzy process, are chosen from both factor analysis and clustering and the last input variable calculated from Merton Model. As we analyze in the past cases of the default history of companies, one could explore different reasons such as managerial arrogance, fraud and managerial mistakes, that are responsible for the very poor endings of prestigious companies like Enron, K-Mart. Because of these kind of situations, we try to design a model which one could be able to get a better view of a company’s financial position, and it couldbe prevent credit loan companies from investing in the wrong company and possibly from losing all investments using our Fuzzy-bankruptcy index.
Ruin probability with claims modeled by a stationary ergodic stable process
Mikosch, T; Samorodnitsky, G
2000-01-01
For a random walk with negative drift we study the exceedance probability (ruin probability) of a high threshold. The steps of this walk (claim sizes) constitute a stationary ergodic stable process. We study how ruin occurs in this situation and evaluate the asymptotic behavior of the ruin
Ezawa, Kiyoshi
2016-08-11
Insertions and deletions (indels) account for more nucleotide differences between two related DNA sequences than substitutions do, and thus it is imperative to develop a stochastic evolutionary model that enables us to reliably calculate the probability of the sequence evolution through indel processes. Recently, indel probabilistic models are mostly based on either hidden Markov models (HMMs) or transducer theories, both of which give the indel component of the probability of a given sequence alignment as a product of either probabilities of column-to-column transitions or block-wise contributions along the alignment. However, it is not a priori clear how these models are related with any genuine stochastic evolutionary model, which describes the stochastic evolution of an entire sequence along the time-axis. Moreover, currently none of these models can fully accommodate biologically realistic features, such as overlapping indels, power-law indel-length distributions, and indel rate variation across regions. Here, we theoretically dissect the ab initio calculation of the probability of a given sequence alignment under a genuine stochastic evolutionary model, more specifically, a general continuous-time Markov model of the evolution of an entire sequence via insertions and deletions. Our model is a simple extension of the general "substitution/insertion/deletion (SID) model". Using the operator representation of indels and the technique of time-dependent perturbation theory, we express the ab initio probability as a summation over all alignment-consistent indel histories. Exploiting the equivalence relations between different indel histories, we find a "sufficient and nearly necessary" set of conditions under which the probability can be factorized into the product of an overall factor and the contributions from regions separated by gapless columns of the alignment, thus providing a sort of generalized HMM. The conditions distinguish evolutionary models with
A Tool for Modelling the Probability of Landslides Impacting Road Networks
Taylor, Faith E.; Santangelo, Michele; Marchesini, Ivan; Malamud, Bruce D.; Guzzetti, Fausto
2014-05-01
Triggers such as earthquakes or heavy rainfall can result in hundreds to thousands of landslides occurring across a region within a short space of time. These landslides can in turn result in blockages across the road network, impacting how people move about a region. Here, we show the development and application of a semi-stochastic model to simulate how landslides intersect with road networks during a triggered landslide event. This was performed by creating 'synthetic' triggered landslide inventory maps and overlaying these with a road network map to identify where road blockages occur. Our landslide-road model has been applied to two regions: (i) the Collazzone basin (79 km2) in Central Italy where 422 landslides were triggered by rapid snowmelt in January 1997, (ii) the Oat Mountain quadrangle (155 km2) in California, USA, where 1,350 landslides were triggered by the Northridge Earthquake (M = 6.7) in January 1994. For both regions, detailed landslide inventory maps for the triggered events were available, in addition to maps of landslide susceptibility and road networks of primary, secondary and tertiary roads. To create 'synthetic' landslide inventory maps, landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL. The number of landslide areas selected was based on the observed density of landslides (number of landslides km-2) in the triggered event inventories. Landslide shapes were approximated as ellipses, where the ratio of the major and minor axes varies with AL. Landslides were then dropped over the region semi-stochastically, conditioned by a landslide susceptibility map, resulting in a synthetic landslide inventory map. The originally available landslide susceptibility maps did not take into account susceptibility changes in the immediate vicinity of roads, therefore
A Novel Adaptive Conditional Probability-Based Predicting Model for User’s Personality Traits
Directory of Open Access Journals (Sweden)
Mengmeng Wang
2015-01-01
Full Text Available With the pervasive increase in social media use, the explosion of users’ generated data provides a potentially very rich source of information, which plays an important role in helping online researchers understand user’s behaviors deeply. Since user’s personality traits are the driving force of user’s behaviors, hence, in this paper, along with social network features, we first extract linguistic features, emotional statistical features, and topic features from user’s Facebook status updates, followed by quantifying importance of features via Kendall correlation coefficient. And then, on the basis of weighted features and dynamic updated thresholds of personality traits, we deploy a novel adaptive conditional probability-based predicting model which considers prior knowledge of correlations between user’s personality traits to predict user’s Big Five personality traits. In the experimental work, we explore the existence of correlations between user’s personality traits which provides a better theoretical support for our proposed method. Moreover, on the same Facebook dataset, compared to other methods, our method can achieve an F1-measure of 80.6% when taking into account correlations between user’s personality traits, and there is an impressive improvement of 5.8% over other approaches.
Interpretations of probability
Khrennikov, Andrei
2009-01-01
This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.
Directory of Open Access Journals (Sweden)
Ning Xiang
2013-02-01
Full Text Available Analysis of gait dynamics in children may help understand the development of neuromuscular control and maturation of locomotor function. This paper applied the nonparametric Parzen-window estimation method to establish the probability density function (PDF models for the stride interval time series of 50 children (25 boys and 25 girls. Four statistical parameters, in terms of averaged stride interval (ASI, variation of stride interval (VSI, PDF skewness (SK, and PDF kurtosis (KU, were computed with the Parzen-window PDFs to study the maturation of stride interval in children. By analyzing the results of the children in three age groups (aged 3–5 years, 6–8 years, and 10–14 years, we summarize the key findings of the present study as follows. (1 The gait cycle duration, in terms of ASI, increases until 14 years of age. On the other hand, the gait variability, in terms of VSI, decreases rapidly until 8 years of age, and then continues to decrease at a slower rate. (2 The SK values of both the histograms and Parzen-window PDFs for all of the three age groups are positive, which indicates an imbalance in the stride interval distribution within an age group. However, such an imbalance would be meliorated when the children grow up. (3 The KU values of both the histograms and Parzen-window PDFs decrease with the body growth in children, which suggests that the musculoskeletal growth enables the children to modulate a gait cadence with ease. (4 The SK and KU results also demonstrate the superiority of the Parzen-window PDF estimation method to the Gaussian distribution modeling, for the study of gait maturation in children.
Directory of Open Access Journals (Sweden)
Suvi E. Heinonen
2015-01-01
Full Text Available Diabetes mellitus is a lifelong, incapacitating metabolic disease associated with chronic macrovascular complications (coronary heart disease, stroke, and peripheral vascular disease and microvascular disorders leading to damage of the kidneys (nephropathy and eyes (retinopathy. Based on the current trends, the rising prevalence of diabetes worldwide will lead to increased cardiovascular morbidity and mortality. Therefore, novel means to prevent and treat these complications are needed. Under the auspices of the IMI (Innovative Medicines Initiative, the SUMMIT (SUrrogate markers for Micro- and Macrovascular hard end points for Innovative diabetes Tools consortium is working on the development of novel animal models that better replicate vascular complications of diabetes and on the characterization of the available models. In the past years, with the high level of genomic information available and more advanced molecular tools, a very large number of models has been created. Selecting the right model for a specific study is not a trivial task and will have an impact on the study results and their interpretation. This review gathers information on the available experimental animal models of diabetic macrovascular complications and evaluates their pros and cons for research purposes as well as for drug development.
Lühr, Armin; Löck, Steffen; Jakobi, Annika; Stützer, Kristin; Bandurska-Luque, Anna; Vogelius, Ivan Richter; Enghardt, Wolfgang; Baumann, Michael; Krause, Mechthild
2017-12-01
Objectives of this work are (1) to derive a general clinically relevant approach to model tumor control probability (TCP) for spatially variable risk of failure and (2) to demonstrate its applicability by estimating TCP for patients planned for photon and proton irradiation. The approach divides the target volume into sub-volumes according to retrospectively observed spatial failure patterns. The product of all sub-volume TCPi values reproduces the observed TCP for the total tumor. The derived formalism provides for each target sub-volume i the tumor control dose (D50,i) and slope (γ50,i) parameters at 50% TCPi. For a simultaneous integrated boost (SIB) prescription for 45 advanced head and neck cancer patients, TCP values for photon and proton irradiation were calculated and compared. The target volume was divided into gross tumor volume (GTV), surrounding clinical target volume (CTV), and elective CTV (CTVE). The risk of a local failure in each of these sub-volumes was taken from the literature. Convenient expressions for D50,i and γ50,i were provided for the Poisson and the logistic model. Comparable TCP estimates were obtained for photon and proton plans of the 45 patients using the sub-volume model, despite notably higher dose levels (on average +4.9%) in the low-risk CTVE for photon irradiation. In contrast, assuming a homogeneous dose response in the entire target volume resulted in TCP estimates contradicting clinical experience (the highest failure rate in the low-risk CTVE) and differing substantially between photon and proton irradiation. The presented method is of practical value for three reasons: It (a) is based on empirical clinical outcome data; (b) can be applied to non-uniform dose prescriptions as well as different tumor entities and dose-response models; and (c) is provided in a convenient compact form. The approach may be utilized to target spatial patterns of local failures observed in patient cohorts by prescribing different doses to
Zhu, Jun; Eickhoff, Jens C; Kaiser, Mark S
2003-12-01
Beta-binomial models are widely used for overdispersed binomial data, with the binomial success probability modeled as following a beta distribution. The number of binary trials in each binomial is assumed to be nonrandom and unrelated to the success probability. In many behavioral studies, however, binomial observations demonstrate more complex structures. In this article, a general beta-binomial-Poisson mixture model is developed, to allow for a relation between the number of trials and the success probability for overdispersed binomial data. An EM algorithm is implemented to compute both the maximum likelihood estimates of the model parameters and the corresponding standard errors. For illustration, the methodology is applied to study the feeding behavior of green-backed herons in two southeastern Missouri streams.
Moore, Lynne; Lavoie, André; Bergeron, Eric; Emond, Marcel
2007-03-01
The International Classification of Disease Injury Severity Score (ICISS) and the Trauma Registry Abbreviated Injury Scale Score (TRAIS) are trauma injury severity scores based on probabilities of survival. They are widely used in logistic regression models as raw probability scores to predict the logit of mortality. The aim of this study was to evaluate whether these severity indicators would offer a more accurate prediction of mortality if they were used with a logit transformation. Analyses were based on 25,111 patients from the trauma registries of the four Level I trauma centers in the province of Quebec, Canada, abstracted between 1998 and 2005. The ICISS and TRAIS were calculated using survival proportions from the National Trauma Data Bank. The performance of the ICISS and TRAIS in their widely used form, proportions varying from 0 to 1, was compared with a logit transformation of the scores in logistic regression models predicting in-hospital mortality. Calibration was assessed with the Hosmer-Lemeshow statistic. Neither the ICISS nor the TRAIS had a linear relation with the logit of mortality. A logit transformation of these scores led to a near-linear association and consequently improved model calibration. The Hosmer-Lemeshow statistic was 68 (35-192) and 69 (41-120) with the logit transformation compared with 272 (227-339) and 204 (166-266) with no transformation, for the ICISS and TRAIS, respectively. In logistic regression models predicting mortality, the ICISS and TRAIS should be used with a logit transformation. This study has direct implications for improving the validity of analyses requiring control for injury severity case mix.
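The logit transformation the study recommends can be sketched in a few lines. The survival probabilities below are invented for illustration (not taken from the Quebec registry data); the point is that raw scores compress differences near 0 and 1, while the logit spreads them onto a scale that relates near-linearly to the logit of mortality:

```python
import math

def logit(p, eps=1e-6):
    """Logit transform; clip p to avoid log(0) at the extremes."""
    p = min(max(p, eps), 1.0 - eps)
    return math.log(p / (1.0 - p))

# Hypothetical ICISS-style survival probabilities for three patients.
scores = [0.99, 0.80, 0.35]

# The transformed values, not the raw proportions, would be entered as the
# severity covariate in the logistic regression predicting mortality.
for raw in scores:
    print(f"raw={raw:.2f}  logit={logit(raw):+.2f}")
```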
A Cost-Utility Model of Care for Peristomal Skin Complications
Neil, Nancy; Inglese, Gary; Manson, Andrea; Townshend, Arden
2016-01-01
PURPOSE: The aim of this study was to evaluate the economic and humanistic implications of using ostomy components to prevent subsequent peristomal skin complications (PSCs) in individuals who experience an initial, leakage-related PSC event. DESIGN: Cost-utility analysis. METHODS: We developed a simple decision model to consider, from a payer's perspective, PSCs managed with and without the use of ostomy components over 1 year. The model evaluated the extent to which outcomes associated with...
Maslennikova, Yu. S.; Nugmanov, I. S.
2016-08-01
The problem of estimating the probability density function of a random process is one of the most common in practice, and there are several methods to solve it. The laboratory work presented here uses methods of mathematical statistics to detect patterns in realizations of a random process. On the basis of ergodic theory, we construct an algorithm for estimating the univariate probability density function of a random process. Correlation analysis of realizations is applied to estimate the necessary sample size and observation time. Hypothesis testing for two probability distributions (normal and Cauchy) is performed on the experimental data using the χ² criterion. To facilitate understanding and clarity of the problem solved, we use the ELVIS II platform and the LabVIEW software package, which allow us to perform the necessary calculations, display the results of the experiment and, most importantly, control the experiment. At the same time, students are introduced to the LabVIEW software package and its capabilities.
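The histogram-plus-χ² workflow described in this lab abstract can be sketched without LabVIEW. This is a minimal stdlib illustration, assuming a synthetic normal realization in place of the measured signal; the bin layout and sample size are arbitrary choices:

```python
import random
import math

random.seed(42)
# Simulated realization of a stationary random process (assumed ergodic),
# drawn here from a standard normal purely for illustration.
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]

# Histogram estimate of the density over fixed bins on [-3, 3].
edges = [-3.0 + 0.5 * i for i in range(13)]
counts = [0] * (len(edges) - 1)
for x in sample:
    for i in range(len(edges) - 1):
        if edges[i] <= x < edges[i + 1]:
            counts[i] += 1
            break

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Chi-squared statistic against the hypothesized normal distribution:
# sum over bins of (observed - expected)^2 / expected.
n = len(sample)
chi2 = 0.0
for i in range(len(counts)):
    p = normal_cdf(edges[i + 1]) - normal_cdf(edges[i])
    expected = n * p
    chi2 += (counts[i] - expected) ** 2 / expected
print(f"chi-squared statistic over {len(counts)} bins: {chi2:.1f}")
```

A statistic near the number of bins is consistent with the normal hypothesis; testing the same data against a Cauchy CDF would inflate it sharply.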
On the Accurate Modeling of Millimeter Wave Fixed Wireless Channels Exceedance Probability
Georgiadou, E. M.; Panagopoulos, A. D.; Chatzarakis, G. E.; Kanellopoulos, J. D.
2006-07-01
The ever increasing demand for high data rate multimedia services has led to the deployment of Fixed Wireless Access (FWA) networks operating at frequencies above 10 GHz. Propagation in such networks involves line-of-sight (LOS) transmissions that are strongly affected by the presence of rain. In this paper, a methodology for evaluating the outage probability of an FWA channel is introduced, making use of the forward scattering amplitude of transmitted signals by distorted raindrops. Expressions for the imaginary part of the scattering amplitude are derived through a regression fitting analysis on the results of the Method of Auxiliary Sources (MAS) applied to the problem of electromagnetic scattering from a Pruppacher-Pitter raindrop. These expressions are employed in an analytical method for evaluating the rain attenuation exceedance probability over a fixed wireless access link. The derived exceedance probabilities are compared with experimental data from the ITU-R databank, with encouraging results.
Energy Technology Data Exchange (ETDEWEB)
Hewson, Alex C [Department of Mathematics, Imperial College, London SW7 2AZ (United Kingdom); Bauer, Johannes [Max-Planck Institute for Solid State Research, Heisenbergstrasse 1, 70569 Stuttgart (Germany)
2010-03-24
We show that information on the probability density of local fluctuations can be obtained from a numerical renormalization group calculation of a reduced density matrix. We apply this approach to the Anderson-Holstein impurity model to calculate the ground state probability density rho(x) for the displacement x of the local oscillator. From this density we can deduce an effective local potential for the oscillator and compare its form with that obtained from a semiclassical approximation as a function of the coupling strength. The method is extended to the infinite dimensional Holstein-Hubbard model using dynamical mean field theory. We use this approach to compare the probability densities for the displacement of the local oscillator in the normal, antiferromagnetic and charge ordered phases.
Scheer, Justin K; Smith, Justin S; Schwab, Frank; Lafage, Virginie; Shaffrey, Christopher I; Bess, Shay; Daniels, Alan H; Hart, Robert A; Protopsaltis, Themistocles S; Mundis, Gregory M; Sciubba, Daniel M; Ailon, Tamir; Burton, Douglas C; Klineberg, Eric; Ames, Christopher P
2017-06-01
OBJECTIVE The operative management of patients with adult spinal deformity (ASD) has a high complication rate and it remains unknown whether baseline patient characteristics and surgical variables can predict early complications (intraoperative and perioperative [within 6 weeks]). The development of an accurate preoperative predictive model can aid in patient counseling, shared decision making, and improved surgical planning. The purpose of this study was to develop a model based on baseline demographic, radiographic, and surgical factors that can predict if patients will sustain an intraoperative or perioperative major complication. METHODS This study was a retrospective analysis of a prospective, multicenter ASD database. The inclusion criteria were age ≥ 18 years and the presence of ASD. In total, 45 variables were used in the initial training of the model including demographic data, comorbidities, modifiable surgical variables, baseline health-related quality of life, and coronal and sagittal radiographic parameters. Patients were grouped as either having at least 1 major intraoperative or perioperative complication (COMP group) or not (NOCOMP group). An ensemble of decision trees was constructed utilizing the C5.0 algorithm with 5 different bootstrapped models. Internal validation was accomplished via a 70/30 data split for training and testing each model, respectively. Overall accuracy, the area under the receiver operating characteristic (AUROC) curve, and predictor importance were calculated. RESULTS Five hundred fifty-seven patients were included: 409 (73.4%) in the NOCOMP group, and 148 (26.6%) in the COMP group. The overall model accuracy was 87.6% correct with an AUROC curve of 0.89 indicating a very good model fit. Twenty variables were determined to be the top predictors (importance ≥ 0.90 as determined by the model) and included (in decreasing importance): age, leg pain, Oswestry Disability Index, number of decompression levels, number of
Simulating low-probability peak discharges for the Rhine basin using resampled climate modeling data
te Linde, A.H.; Aerts, J.C.J.M.; Bakker, A.; Kwadijk, J.
2010-01-01
Climate change will increase winter precipitation, and in combination with earlier snowmelt it will cause a shift in peak discharge in the Rhine basin from spring to winter. This will probably lead to an increase in the frequency and magnitude of extreme floods. In this paper we aim to enhance the
Rasanen, Okko
2011-01-01
Word segmentation from continuous speech is a difficult task that is faced by human infants when they start to learn their native language. Several studies indicate that infants might use several different cues to solve this problem, including intonation, linguistic stress, and transitional probabilities between subsequent speech sounds. In this…
A new model for bed load sampler calibration to replace the probability-matching method
Robert B. Thomas; Jack Lewis
1993-01-01
In 1977 extensive data were collected to calibrate six Helley-Smith bed load samplers with four sediment particle sizes in a flume at the St. Anthony Falls Hydraulic Laboratory at the University of Minnesota. Because sampler data cannot be collected at the same time and place as "true" trap measurements, the "probability-matching...
Ratliff, John K; Balise, Ray; Veeravagu, Anand; Cole, Tyler S; Cheng, Ivan; Olshen, Richard A; Tian, Lu
2016-05-18
Postoperative metrics are increasingly important in determining standards of quality for physicians and hospitals. Although complications following spinal surgery have been described, procedural and patient variables have yet to be incorporated into a predictive model of adverse-event occurrence. We sought to develop a predictive model of complication occurrence after spine surgery. We used longitudinal prospective data from a national claims database and developed a predictive model incorporating complication type and frequency of occurrence following spine surgery procedures. We structured our model to assess the impact of features such as preoperative diagnosis, patient comorbidities, location in the spine, anterior versus posterior approach, whether fusion had been performed, whether instrumentation had been used, number of levels, and use of bone morphogenetic protein (BMP). We assessed a variety of adverse events. Prediction models were built using logistic regression with additive main effects and logistic regression with main effects as well as all 2 and 3-factor interactions. Least absolute shrinkage and selection operator (LASSO) regularization was used to select features. Competing approaches included boosted additive trees and the classification and regression trees (CART) algorithm. The final prediction performance was evaluated by estimating the area under a receiver operating characteristic curve (AUC) as predictions were applied to independent validation data and compared with the Charlson comorbidity score. The model was developed from 279,135 records of patients with a minimum duration of follow-up of 30 days. Preliminary assessment showed an adverse-event rate of 13.95%, well within norms reported in the literature. We used the first 80% of the records for training (to predict adverse events) and the remaining 20% of the records for validation. There was remarkable similarity among methods, with an AUC of 0.70 for predicting the occurrence of
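The final evaluation step of the study above, estimating an AUC on held-out validation data, can be sketched with the rank-sum formulation of the AUC. The risk scores and labels below are simulated placeholders, not the claims-database results:

```python
import random

def auc(labels, scores):
    """AUC via the Mann-Whitney formulation: the probability that a randomly
    chosen positive case outranks a randomly chosen negative one (ties = 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

random.seed(0)
# Hypothetical validation fold: mildly informative risk scores, with an
# adverse-event rate of roughly 14% as in the abstract.
labels = [1] * 100 + [0] * 600
scores = [random.gauss(0.8, 1.0) for _ in range(100)] + \
         [random.gauss(0.0, 1.0) for _ in range(600)]
print(f"validation AUC = {auc(labels, scores):.2f}")
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation; the paper's reported 0.70 sits between the two.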
Directory of Open Access Journals (Sweden)
Gholamreza Norouzi
2015-01-01
In the project management context, time management is one of the most important factors affecting project success. This paper proposes a new method to solve research project scheduling problems (RPSP) containing Fuzzy Graphical Evaluation and Review Technique (FGERT) networks. Through the deliverables of this method, a proper estimation of project completion time (PCT) and success probability can be achieved. Algorithms were developed to cover all features of the problem based on three main parameters: duration, occurrence probability, and success probability. These algorithms are known as PR-FGERT (Parallel and Reversible Fuzzy GERT networks). The main framework involves simplifying the project network and taking regular steps to determine PCT and success probability. Simplifications include (1) equivalent making of parallel and series branches in the fuzzy network, considering the concepts of probabilistic nodes; (2) equivalent making of delay or reversible-to-itself branches, and the impact of changing the parameters of time and probability upon removing the related branches; (3) equivalent making of simple and complex loops; and (4) an algorithm to resolve the no-loop fuzzy network after equivalent making. Finally, the performance of the models was compared with existing methods. The results showed proper and realistic performance of the models in comparison with existing methods.
Guan, Li; Hao, Bibo; Cheng, Qijin; Yip, Paul Sf; Zhu, Tingshao
2015-01-01
Traditional offline assessment of suicide probability is time consuming, and it is difficult to convince at-risk individuals to participate. Identifying individuals with high suicide probability through online social media has the advantage of efficiency and the potential to reach hidden individuals, yet little research has focused on this specific field. The objective of this study was to apply two classification models, Simple Logistic Regression (SLR) and Random Forest (RF), to examine the feasibility and effectiveness of identifying microblog users in China with high suicide probability through profile and linguistic features extracted from Internet-based data. Nine hundred and nine Chinese microblog users completed an Internet survey, and those scoring one SD above the mean of the total Suicide Probability Scale (SPS) score, as well as one SD above the mean in each of the four subscale scores in the participant sample, were labeled as high-risk individuals, respectively. Profile and linguistic features were fed into the two machine learning algorithms (SLR and RF) to train models that aim to identify high-risk individuals, both in general suicide probability and in its four dimensions. Models were trained and then tested by 5-fold cross validation, in which both the training set and the test set were generated by stratified random sampling from the whole sample. Three classic performance metrics (Precision, Recall, F1 measure) and a specifically defined metric, "Screening Efficiency", were adopted to evaluate model effectiveness. Classification performance was generally matched between SLR and RF. Given the best performance of the classification models, we were able to retrieve over 70% of the labeled high-risk individuals in overall suicide probability as well as in the four dimensions. Screening Efficiency of most models varied from 1/4 to 1/2. Precision of the models was generally below 30%. Individuals in China with high suicide
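The precision, recall, and F1 metrics reported in the abstract above have a compact definition worth making explicit. The labels and predictions below are invented toy values, not the microblog data:

```python
def precision_recall_f1(y_true, y_pred):
    """Classic screening metrics for binary labels (1 = high-risk)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical predictions on one labeled test fold:
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 1, 1, 0, 0, 0]
p, r, f = precision_recall_f1(y_true, y_pred)
print(f"precision={p:.2f}, recall={r:.2f}, F1={f:.2f}")
```

The study's pattern of high recall (over 70%) with precision below 30% is exactly the trade-off a screening application often accepts: few missed high-risk users at the cost of many false alarms.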
Cichosz, Simon Lebech; Johansen, Mette Dencker; Hejlesen, Ole
2015-10-14
Diabetes is one of the top priorities in medical science and health care management, and an abundance of data and information is available on these patients. Whether data stem from statistical models or complex pattern recognition models, they may be fused into predictive models that combine patient information and prognostic outcome results. Such knowledge could be used in clinical decision support, disease surveillance, and public health management to improve patient care. Our aim was to review the literature and give an introduction to predictive models in screening for and the management of prevalent short- and long-term complications in diabetes. Predictive models have been developed for management of diabetes and its complications, and the number of publications on such models has been growing over the past decade. Often multiple logistic or a similar linear regression is used for prediction model development, possibly owing to its transparent functionality. Ultimately, for prediction models to prove useful, they must demonstrate impact, namely, their use must generate better patient outcomes. Although extensive effort has been put in to building these predictive models, there is a remarkable scarcity of impact studies. © 2015 Diabetes Technology Society.
We show that a conditional probability analysis that utilizes a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data. The critical step in this approach is transforming the response ...
Staphylococcus aureus is a foodborne pathogen widespread in the environment and found in various food products. This pathogen can produce enterotoxins that cause illnesses in humans. The objectives of this study were to develop a probability model of S. aureus enterotoxin production as affected by w...
Path Loss, Shadow Fading, and Line-Of-Sight Probability Models for 5G Urban Macro-Cellular Scenarios
DEFF Research Database (Denmark)
Sun, Shu; Thomas, Timothy; Rappaport, Theodore S.
2015-01-01
This paper presents key parameters including the line-of-sight (LOS) probability, large-scale path loss, and shadow fading models for the design of future fifth generation (5G) wireless communication systems in urban macro-cellular (UMa) scenarios, using the data obtained from propagation measure...
DEFF Research Database (Denmark)
Vacca, Alessandro; Prato, Carlo Giacomo; Meloni, Italo
2015-01-01
is the dependency of the parameter estimates from the choice set generation technique. Bias introduced in model estimation has been corrected only for the random walk algorithm, which has problematic applicability to large-scale networks. This study proposes a correction term for the sampling probability of routes...
Panaretos, John
1989-01-01
In this paper, a probability model leading to a Yule distribution is developed in the study of surname frequency data. This distribution, suitably truncated, is fitted to actual data as an alternative to the discrete Pareto distribution, with quite satisfactory results.
Probability model of solid to liquid-like transition of a fluid suspension after a shear flow onset
Czech Academy of Sciences Publication Activity Database
Nouar, C.; Říha, Pavel
2008-01-01
Roč. 34, č. 5 (2008), s. 477-483 ISSN 0301-9322 R&D Projects: GA AV ČR IAA200600803 Institutional research plan: CEZ:AV0Z20600510 Keywords : laminar suspension flow * liquid-liquid interface * probability model Subject RIV: BK - Fluid Dynamics Impact factor: 1.497, year: 2008
Janik, Michal R; Walędziak, Maciej; Brągoszewski, Jakub; Kwiatkowski, Andrzej; Paśnik, Krzysztof
2017-04-01
Laparoscopic sleeve gastrectomy (LSG) is one of the most frequently performed bariatric procedures. Hemorrhagic complications (HC) after surgery are common and require surgical revision. Accurate estimation of the risk of postoperative HC can improve surgical decision-making process and minimize the risk of reoperation. The aim of the present study was to develop a predictive model for HC after LSG. The retrospective analysis of 522 patients after primary LSG was performed. Patients underwent surgery from January 2013 to February 2015. The primary outcome was defined as a surgical revision due to hemorrhagic complications. Multiple regression analysis was performed. The rate of hemorrhagic complications was 4 %. The mean age of patients was 41.0 (±11.6) years and mean BMI was 47.3 (±7.3) kg/m2. Of the 12 examined variables, four were associated with risk of HC. Protective factors for HC were no history of obstructive sleep apnea (odds ratio [OR] 0.22; 95 % CI 0.05-0.94) and no history of hypertension (OR 0.38; 95 % CI 0.14-1.05). The low level of expertise in bariatric surgery (OR 2.85; 95 % CI 1.08-7.53) and no staple line reinforcement (OR 3.34; 95 % CI 1.21-9.21) were associated with higher risk of HC. The result revealed the association between hemorrhagic complications and the following factors: obstructive sleep apnea, hypertension, level of expertise in bariatric surgery, and reinforcement of the staple line. The risk assessment model for hemorrhagic complications after LSG can contribute to surgical decision-making process.
Regional Permafrost Probability Modelling in the northwestern Cordillera, 59°N - 61°N, Canada
Bonnaventure, P. P.; Lewkowicz, A. G.
2010-12-01
High resolution (30 x 30 m) permafrost probability models were created for eight mountainous areas in the Yukon and northernmost British Columbia. Empirical-statistical modelling based on the Basal Temperature of Snow (BTS) method was used to develop spatial relationships. Model inputs include equivalent elevation (a variable that incorporates non-uniform temperature change with elevation), potential incoming solar radiation and slope. Probability relationships between predicted BTS and permafrost presence were developed for each area using late-summer physical observations in pits, or by using year-round ground temperature measurements. A high-resolution spatial model for the region has now been generated based on seven of the area models. Each was applied to the entire region, and their predictions were then blended based on a distance decay function from the model source area. The regional model is challenging to validate independently because there are few boreholes in the region. However, a comparison of results to a recently established inventory of rock glaciers for the Yukon suggests its validity because predicted permafrost probabilities were 0.8 or greater for almost 90% of these landforms. Furthermore, the regional model results have a similar spatial pattern to those modelled independently in the eighth area, although predicted probabilities using the regional model are generally higher. The regional model predicts that permafrost underlies about half of the non-glaciated terrain in the region, with probabilities increasing regionally from south to north and from east to west. Elevation is significant, but not always linked in a straightforward fashion because of weak or inverted trends in permafrost probability below treeline. Above treeline, however, permafrost probabilities increase and approach 1.0 in very high elevation areas throughout the study region. The regional model shows many similarities to previous Canadian permafrost maps (Heginbottom
Conceptual and Statistical Issues Regarding the Probability of Default and Modeling Default Risk
Directory of Open Access Journals (Sweden)
Emilia TITAN
2011-03-01
In today’s rapidly evolving financial markets, risk management offers different techniques for implementing an efficient system against market risk. Probability of default (PD) is an essential part of business intelligence and customer relationship management systems in financial institutions. Recent studies indicate that underestimating this important component, as well as the loss given default (LGD), might threaten the stability and smooth running of the financial markets. From the perspective of risk management, the predictive accuracy of the estimated probability of default is more valuable than the standard binary classification into credible and non-credible clients. The Basel II Accord recognizes the methods of reducing credit risk, and also PD and LGD as important components of the advanced Internal Ratings-Based (IRB) approach.
A Statistical Model for Determining the Probability of Observing Exoplanetary Radio Emissions
Garcia, R.; Knapp, M.; Winterhalter, D.; Majid, W.
2015-12-01
The idea that extrasolar planets should emit radiation in the low-frequency radio regime is a generalization of the observation of decametric and kilometric radio emissions from magnetic planets in our own solar system, yet none of these emissions have been observed. Such radio emissions result from interactions between the host star's magnetized wind and the planet's magnetosphere, which accelerate electrons along the field lines, leading to radio emissions at the electron gyrofrequency. To understand why these emissions have not yet been observed, and to guide target selection for future detection efforts, we took a statistical approach to determine the ideal location in parameter space for these hypothesized exoplanetary radio emissions to be detected. We derived probability distribution functions from current datasets for the observationally constrained parameters (such as the radius of the host star), and conducted a review of the literature to construct reasonable probability distribution functions for the unconstrained parameters (such as the magnetic field strength of the exoplanet). We then used Monte Carlo sampling to develop a synthetic population of exoplanetary systems and calculated whether the radio emissions from the systems were detectable, depending on the angle of beaming, the frequency (above the ionospheric cutoff of 10 MHz) and the flux density (above 5 mJy) of the emission. From millions of simulations we derived a probability distribution function in parameter space as a function of host star type, orbital radius and planetary or host star radius. The probability distribution function illustrates the optimal parameter values of an exoplanetary system that may make the system's radio emissions detectable to current instruments and those under development, such as the SKA. We found that detection of exoplanetary radio emissions favors planets larger than 5 Earth radii and within 1 AU of their M dwarf host.
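The Monte Carlo detectability calculation described above has a simple skeleton: draw system parameters from priors, apply the beaming, frequency, and flux cuts, and count survivors. The priors below are crude placeholders (not the paper's actual distributions), kept only to show the structure:

```python
import random

random.seed(1)

def detectable(flux_mjy, freq_mhz, beamed):
    """A trial counts as detectable if the emission is beamed toward Earth,
    lies above the ~10 MHz ionospheric cutoff, and exceeds a 5 mJy threshold."""
    return beamed and freq_mhz > 10.0 and flux_mjy > 5.0

trials = 100_000
hits = 0
for _ in range(trials):
    # Hypothetical priors, chosen for illustration only:
    flux = random.lognormvariate(0.0, 2.0)   # flux density in mJy
    freq = random.uniform(1.0, 40.0)         # emission frequency in MHz
    beamed = random.random() < 0.3           # chance the beam crosses Earth
    hits += detectable(flux, freq, beamed)

print(f"estimated detection probability: {hits / trials:.3f}")
```

In the actual study each trial would also draw a host star type, orbital radius, and planetary magnetic field, and the survivors would be binned in that parameter space rather than reduced to a single rate.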
Shiryaev, Albert N
2016-01-01
This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...
Directory of Open Access Journals (Sweden)
López Enrique
2012-11-01
Background: Osteoporotic hip fractures represent a major cause of disability, loss of quality of life and even mortality among the elderly population. Decisions on drug therapy are based on the assessment of risk factors for fracture from BMD measurements. The combination of biomechanical models with clinical studies could better estimate bone strength and support specialists in their decisions. Methods: A model to assess the probability of fracture, based on Damage and Fracture Mechanics, has been developed, evaluating the mechanical magnitudes involved in the fracture process from clinical BMD measurements. The model is intended to simulate the degenerative process in the skeleton, with the consequent loss of bone mass and hence the decrease of its mechanical resistance, which enables fracture under different traumas. Clinical studies were chosen, both in non-treatment conditions and under drug therapy, and fitted to specific patients according to their actual BMD measurements. The predictive model is applied in a FE simulation of the proximal femur. The fracture zone is determined according to the loading scenario (sideways fall, impact, accidental loads, etc.), using the mechanical properties of bone obtained from the evolutionary model at the considered time. Results: BMD evolution in untreated patients and in those under different treatments was analyzed. Evolutionary curves of fracture probability were obtained from the evolution of mechanical damage. The evolutionary curve of the untreated group of patients presented a marked increase of the fracture probability, while the curves of patients under drug treatment showed variably decreased risks, depending on the therapy type. Conclusion: The FE model allowed detailed maps of damage and fracture probability to be obtained, identifying high-risk local zones at the femoral neck and intertrochanteric and subtrochanteric areas, which are the typical locations of
Rosa, A. N. F.; Wiatr, P.; Cavdar, C.; Carvalho, S. V.; Costa, J. C. W. A.; Wosinska, L.
2015-11-01
In Elastic Optical Networks (EONs), spectrum fragmentation refers to the existence of non-aligned, small-sized blocks of free subcarrier slots in the optical spectrum. Several metrics have been proposed to quantify the level of spectrum fragmentation. Approximation methods can be used to estimate the average blocking probability and some fragmentation measures, but they are so far unable to accurately evaluate the influence of different sizes of connection requests and do not allow in-depth investigation of blocking events and their relation to fragmentation. The analytical study of the effect of fragmentation on request blocking probability is still under-explored. In this work, we introduce new definitions of blocking that differentiate between the reasons for the blocking events. We developed a framework based on Markov modeling to calculate steady-state probabilities for the different blocking events and to analyze fragmentation-related problems in elastic optical links under dynamic traffic conditions. This framework can also be used to evaluate different definitions of fragmentation in terms of their relation to the blocking probability. We investigate how different allocation request sizes contribute to fragmentation and blocking probability. Moreover, we show to what extent blocking events due to an insufficient amount of available resources become inevitable and, by comparison with the amount of blocking events due to fragmented spectrum, we draw conclusions on the possible gains achievable by system defragmentation. We also show how efficient spectrum allocation policies really are in reducing the part of fragmentation that in particular leads to actual blocking events. Simulation experiments are carried out showing a good match with our analytical results for blocking probability in a small-scale scenario. Simulated blocking probabilities for the different blocking events are provided for a larger-scale elastic optical link.
DEFF Research Database (Denmark)
Rojas-Nandayapa, Leonardo
Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think… By doing so, we will obtain a deeper insight into how events involving large values of sums of heavy-tailed random variables are likely to occur.
Directory of Open Access Journals (Sweden)
Xufeng Wang
2017-01-01
One of the keys to the success of aerial refueling for a probe-drogue aerial refueling system (PDARS) is successful docking between the probe and the drogue. The study of probe-drogue docking success probability offers important support for achieving successful docking. During the docking phase of PDARS, based on prior information and reasonable assumptions about the movements of the drogue under atmospheric disturbance, the probe-drogue docking success probability is converted to the probability of the drogue center being located in a specific area. A model of the probe-drogue docking success probability is established both with and without actuation error. The curves of the probe-drogue docking success probability against the standard deviation of the drogue center position, the maximum distance from the drogue center position to the equilibrium position, the actuation error, and the standard deviation of the actuation error are obtained through simulations. The study has referential value for docking maneuver decisions in aerial refueling for PDARS.
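The core reduction in the abstract above, turning docking success into the probability that the drogue center lies inside a capture region, can be sketched directly. This assumes an isotropic Gaussian jitter of the drogue center (a simplification of the paper's disturbance model), with illustrative values for the spread and capture radius:

```python
import random
import math

random.seed(7)

def docking_success_prob(sigma, capture_radius, trials=200_000):
    """Monte Carlo estimate of the probability that the drogue center, jittered
    by atmospheric disturbance (isotropic Gaussian, std sigma per axis), falls
    within the capture radius of the equilibrium position."""
    hits = 0
    for _ in range(trials):
        x = random.gauss(0.0, sigma)
        y = random.gauss(0.0, sigma)
        if math.hypot(x, y) <= capture_radius:
            hits += 1
    return hits / trials

# For the isotropic Gaussian case the answer is known in closed form,
#   P(r <= R) = 1 - exp(-R^2 / (2 sigma^2)),
# which lets us sanity-check the simulation.
sigma, radius = 0.15, 0.30   # meters; illustrative values only
est = docking_success_prob(sigma, radius)
exact = 1.0 - math.exp(-radius**2 / (2.0 * sigma**2))
print(f"simulated {est:.3f} vs closed form {exact:.3f}")
```

Adding actuation error, as the paper does, amounts to shifting the Gaussian's center away from the equilibrium point, which lowers the success probability for the same capture radius.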
Haber, M; An, Q; Foppa, I M; Shay, D K; Ferdinands, J M; Orenstein, W A
2015-05-01
As influenza vaccination is now widely recommended, randomized clinical trials are no longer ethical in many populations. Therefore, observational studies on patients seeking medical care for acute respiratory illnesses (ARIs) are a popular option for estimating influenza vaccine effectiveness (VE). We developed a probability model for evaluating and comparing bias and precision of estimates of VE against symptomatic influenza from two commonly used case-control study designs: the test-negative design and the traditional case-control design. We show that when vaccination does not affect the probability of developing non-influenza ARI then VE estimates from test-negative design studies are unbiased even if vaccinees and non-vaccinees have different probabilities of seeking medical care against ARI, as long as the ratio of these probabilities is the same for illnesses resulting from influenza and non-influenza infections. Our numerical results suggest that in general, estimates from the test-negative design have smaller bias compared to estimates from the traditional case-control design as long as the probability of non-influenza ARI is similar among vaccinated and unvaccinated individuals. We did not find consistent differences between the standard errors of the estimates from the two study designs.
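The test-negative estimator the authors analyze reduces, in its simplest form, to one minus an odds ratio from a 2x2 table of vaccination status against influenza test result. A minimal sketch with hypothetical counts:

```python
def ve_test_negative(flu_vax, flu_unvax, neg_vax, neg_unvax):
    # VE = 1 - OR, where OR is the odds ratio of vaccination among
    # influenza-positive patients versus test-negative ARI patients
    odds_ratio = (flu_vax * neg_unvax) / (flu_unvax * neg_vax)
    return 1.0 - odds_ratio

# hypothetical counts: 30/70 vaccinated among flu-positives,
# 60/40 vaccinated among test-negative controls
print(ve_test_negative(30, 70, 60, 40))
```

The abstract's bias result can be read off this formula: differential care-seeking cancels out of the odds ratio as long as it scales influenza and non-influenza ARI counts by the same factor.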
Ezawa, Kiyoshi
2016-09-27
Insertions and deletions (indels) account for more nucleotide differences between two related DNA sequences than substitutions do, and thus it is imperative to develop a method to reliably calculate the occurrence probabilities of sequence alignments via evolutionary processes on an entire sequence. Previously, we presented a perturbative formulation that facilitates the ab initio calculation of alignment probabilities under a continuous-time Markov model, which describes the stochastic evolution of an entire sequence via indels with quite general rate parameters. And we demonstrated that, under some conditions, the ab initio probability of an alignment can be factorized into the product of an overall factor and contributions from regions (or local alignments) delimited by gapless columns. Here, using our formulation, we attempt to approximately calculate the probabilities of local alignments under space-homogeneous cases. First, for each of all types of local pairwise alignments (PWAs) and some typical types of local multiple sequence alignments (MSAs), we numerically computed the total contribution from all parsimonious indel histories and that from all next-parsimonious histories, and compared them. Second, for some common types of local PWAs, we derived two integral equation systems that can be numerically solved to give practically exact solutions. We compared the total parsimonious contribution with the practically exact solution for each such local PWA. Third, we developed an algorithm that calculates the first-approximate MSA probability by multiplying total parsimonious contributions from all local MSAs. Then we compared the first-approximate probability of each local MSA with its absolute frequency in the MSAs created via a genuine sequence evolution simulator, Dawg. In all these analyses, the total parsimonious contributions approximated the multiplication factors fairly well, as long as gap sizes and branch lengths are at most moderate. Examination of
Kramer, Andrew A; Higgins, Thomas L; Zimmerman, Jack E
2014-03-01
To examine the accuracy of the original Mortality Probability Admission Model III, ICU Outcomes Model/National Quality Forum modification of Mortality Probability Admission Model III, and Acute Physiology and Chronic Health Evaluation IVa models for comparing observed and risk-adjusted hospital mortality predictions. Retrospective paired analyses of day 1 hospital mortality predictions using three prognostic models. Fifty-five ICUs at 38 U.S. hospitals from January 2008 to December 2012. Among 174,001 intensive care admissions, 109,926 met model inclusion criteria and 55,304 had data for mortality prediction using all three models. None. We compared patient exclusions and the discrimination, calibration, and accuracy for each model. Acute Physiology and Chronic Health Evaluation IVa excluded 10.7% of all patients, ICU Outcomes Model/National Quality Forum 20.1%, and Mortality Probability Admission Model III 24.1%. Discrimination of Acute Physiology and Chronic Health Evaluation IVa was superior with area under receiver operating curve (0.88) compared with Mortality Probability Admission Model III (0.81) and ICU Outcomes Model/National Quality Forum (0.80). Acute Physiology and Chronic Health Evaluation IVa was better calibrated (lowest Hosmer-Lemeshow statistic). The accuracy of Acute Physiology and Chronic Health Evaluation IVa was superior (adjusted Brier score = 31.0%) to that for Mortality Probability Admission Model III (16.1%) and ICU Outcomes Model/National Quality Forum (17.8%). Compared with observed mortality, Acute Physiology and Chronic Health Evaluation IVa overpredicted mortality by 1.5% and Mortality Probability Admission Model III by 3.1%; ICU Outcomes Model/National Quality Forum underpredicted mortality by 1.2%. Calibration curves showed that Acute Physiology and Chronic Health Evaluation performed well over the entire risk range, unlike the Mortality Probability Admission Model and ICU Outcomes Model/National Quality Forum models. Acute
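The discrimination and accuracy measures used in this comparison (area under the ROC curve and Brier score) can be computed directly. A minimal sketch on toy data, not the study's patients:

```python
def auc(y_true, y_prob):
    # area under the ROC curve: probability that a random positive case
    # receives a higher predicted risk than a random negative case
    pos = [p for p, y in zip(y_prob, y_true) if y == 1]
    neg = [p for p, y in zip(y_prob, y_true) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

def brier(y_true, y_prob):
    # mean squared difference between predicted probability and outcome
    return sum((p - y) ** 2 for p, y in zip(y_prob, y_true)) / len(y_true)

y = [0, 0, 1, 0, 1, 1]               # toy outcomes (1 = hospital death)
p = [0.1, 0.3, 0.8, 0.4, 0.6, 0.9]   # toy predicted mortality probabilities
print(auc(y, p), brier(y, p))
```

A higher AUC means better discrimination (as for APACHE IVa's 0.88 versus 0.81 and 0.80 above), while a lower Brier score means better overall accuracy and calibration.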
Anuwar, Muhammad Hafidz; Jaffar, Maheran Mohd
2017-08-01
This paper provides an overview of the assessment of credit risk specific to banks. In finance, risk is a term that reflects the potential of financial loss. The risk of default on a loan may increase when a company does not make a payment on that loan when the time comes. Hence, this framework analyses the KMV-Merton model to estimate the probabilities of default for Malaysian listed companies. In this way, banks can verify the ability of companies to meet their loan commitments in order to avoid bad investments and financial losses. The model has been applied to all Malaysian companies listed on Bursa Malaysia to estimate their credit default probabilities, which are compared with the ratings given by the rating agency RAM Holdings Berhad as a check against reality. Finally, as the significance of this study, a credit risk grade for Malaysian listed companies is proposed based on the KMV-Merton model.
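A minimal sketch of the default-probability calculation in the Merton framework that underlies the KMV-Merton model; the firm's inputs below are hypothetical, not from the study:

```python
import math

def merton_pd(V, D, mu, sigma, T=1.0):
    # Merton model: assets V follow a lognormal diffusion and the firm
    # defaults if assets fall below the debt level D at horizon T;
    # dd is the distance to default and Phi(-dd) the default probability
    dd = (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    return 0.5 * math.erfc(dd / math.sqrt(2))   # standard normal CDF at -dd

# hypothetical firm: assets 120, debt 100, drift 5%, asset volatility 25%
print(round(merton_pd(120.0, 100.0, 0.05, 0.25), 4))
```

In the full KMV-Merton procedure, the unobservable asset value V and volatility sigma are themselves backed out from equity data before this formula is applied; the sketch takes them as given.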
Childers, Timothy
2013-01-01
Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability (frequentist, propensity, classical, Bayesian, and objective Bayesian) and uses stimulating examples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,
A PROBABILITY MODEL FOR DROUGHT PREDICTION USING FUSION OF MARKOV CHAIN AND SAX METHODS
Directory of Open Access Journals (Sweden)
Y. Jouybari-Moghaddam
2017-09-01
Full Text Available Drought is one of the most powerful natural disasters, affecting many different aspects of the environment. Most of the time this phenomenon is immense in arid and semi-arid areas. Monitoring and predicting the severity of drought can be useful in the management of the natural disasters it causes. Many indices have been used in predicting droughts, such as SPI, VCI, and TVX. In this paper, based on three data sets (rainfall, NDVI, and land surface temperature) acquired from MODIS satellite imagery, time series of SPI, VCI, and TVX covering winter 2000 to summer 2015 for the east region of Isfahan province were created. Using these indices and a fusion of symbolic aggregate approximation and a hidden Markov chain, drought was predicted for fall 2015. For this purpose, at first each time series was transformed into a set of qualitative data based on the state of drought (5 groups) using the SAX algorithm; then the probability matrix for the future state was created using a hidden Markov chain. The fall drought severity was predicted by fusing the probability matrix and the state of drought severity in summer 2015. The prediction is based on the likelihood of each state of drought: severe drought, middle drought, normal, severe wet and middle wet. The analysis and experimental results from the proposed algorithm show that its product is acceptable and that the algorithm is appropriate and efficient for predicting drought using remote sensing data.
DotKnot: pseudoknot prediction using the probability dot plot under a refined energy model
Sperschneider, Jana; Datta, Amitava
2010-01-01
RNA pseudoknots are functional structure elements with key roles in viral and cellular processes. Prediction of a pseudoknotted minimum free energy structure is an NP-complete problem. Practical algorithms for RNA structure prediction including restricted classes of pseudoknots suffer from high runtime and poor accuracy for longer sequences. A heuristic approach is to search for promising pseudoknot candidates in a sequence and verify those. Afterwards, the detected pseudoknots can be further analysed using bioinformatics or laboratory techniques. We present a novel pseudoknot detection method called DotKnot that extracts stem regions from the secondary structure probability dot plot and assembles pseudoknot candidates in a constructive fashion. We evaluate pseudoknot free energies using novel parameters, which have recently become available. We show that the conventional probability dot plot makes a wide class of pseudoknots including those with bulged stems manageable in an explicit fashion. The energy parameters now become the limiting factor in pseudoknot prediction. DotKnot is an efficient method for long sequences, which finds pseudoknots with higher accuracy compared to other known prediction algorithms. DotKnot is accessible as a web server at http://dotknot.csse.uwa.edu.au. PMID:20123730
a Probability Model for Drought Prediction Using Fusion of Markov Chain and SAX Methods
Jouybari-Moghaddam, Y.; Saradjian, M. R.; Forati, A. M.
2017-09-01
Drought is one of the most powerful natural disasters, affecting many different aspects of the environment. Most of the time this phenomenon is immense in arid and semi-arid areas. Monitoring and predicting the severity of drought can be useful in the management of the natural disasters it causes. Many indices have been used in predicting droughts, such as SPI, VCI, and TVX. In this paper, based on three data sets (rainfall, NDVI, and land surface temperature) acquired from MODIS satellite imagery, time series of SPI, VCI, and TVX covering winter 2000 to summer 2015 for the east region of Isfahan province were created. Using these indices and a fusion of symbolic aggregate approximation and a hidden Markov chain, drought was predicted for fall 2015. For this purpose, at first each time series was transformed into a set of qualitative data based on the state of drought (5 groups) using the SAX algorithm; then the probability matrix for the future state was created using a hidden Markov chain. The fall drought severity was predicted by fusing the probability matrix and the state of drought severity in summer 2015. The prediction is based on the likelihood of each state of drought: severe drought, middle drought, normal, severe wet and middle wet. The analysis and experimental results from the proposed algorithm show that its product is acceptable and that the algorithm is appropriate and efficient for predicting drought using remote sensing data.
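The SAX-plus-Markov-chain scheme described above can be sketched as follows. The SPI values, the breakpoints, and the five state labels below are illustrative assumptions, not the study's data:

```python
import numpy as np

STATES = ["severe_drought", "mid_drought", "normal", "mid_wet", "severe_wet"]

def discretize(series, breakpoints=(-1.5, -0.5, 0.5, 1.5)):
    # SAX-style symbolization: map each index value to one of five drought states
    return [sum(v > b for b in breakpoints) for v in series]

def transition_matrix(symbols, k=5):
    # maximum-likelihood estimate of the state transition probabilities
    T = np.zeros((k, k))
    for a, b in zip(symbols, symbols[1:]):
        T[a, b] += 1
    return T / T.sum(axis=1, keepdims=True).clip(min=1)

# hypothetical seasonal SPI values from winter 2000 onwards
spi = [-1.8, -1.2, -0.3, 0.2, 0.8, 0.4, -0.6, -1.1, -1.7, -0.9, -0.2, 0.1]
sym = discretize(spi)
T = transition_matrix(sym)
forecast = T[sym[-1]]       # next-season state distribution given the summer state
print(STATES[int(forecast.argmax())])
```

The row of the transition matrix for the current (summer) state plays the role of the probability matrix fused with the observed drought severity to predict the fall state.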
Quantum Probabilities as Behavioral Probabilities
Directory of Open Access Journals (Sweden)
Vyacheslav I. Yukalov
2017-03-01
Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
Meisner, Søren; Lehur, Paul-Antoine; Moran, Brendan; Martins, Lina; Jemec, Gregor Borut Ernst
2012-01-01
Background Peristomal skin complications (PSCs) are the most common post-operative complications following creation of a stoma. Living with a stoma is a challenge, not only for the patient and their carers, but also for society as a whole. Due to methodological problems of PSC assessment, the associated health-economic burden of medium to longterm complications has been poorly described. Aim The aim of the present study was to create a model to estimate treatment costs of PSCs using the standardized assessment Ostomy Skin Tool as a reference. The resultant model was applied to a real-life global data set of stoma patients (n = 3017) to determine the prevalence and financial burden of PSCs. Methods Eleven experienced stoma care nurses were interviewed to get a global understanding of a treatment algorithm that formed the basis of the cost analysis. The estimated costs were based on a seven week treatment period. PSC costs were estimated for five underlying diagnostic categories and three levels of severity. The estimated treatment costs of severe cases of PSCs were increased 2–5 fold for the different diagnostic categories of PSCs compared with mild cases. French unit costs were applied to the global data set. Results The estimated total average cost for a seven week treatment period (including appliances and accessories) was 263€ for those with PSCs (n = 1742) compared to 215€ for those without PSCs (n = 1172). A co-variance analysis showed that leakage level had a significant impact on PSC cost from ‘rarely/never’ to ‘always/often’ p<0.00001 and from ‘rarely/never’ to ‘sometimes’ p = 0.0115. Conclusion PSCs are common and troublesome and the consequences are substantial, both for the patient and from a health economic viewpoint. PSCs should be diagnosed and treated at an early stage to prevent long term, debilitating and expensive complications. PMID:22679479
Directory of Open Access Journals (Sweden)
Søren Meisner
Full Text Available BACKGROUND: Peristomal skin complications (PSCs) are the most common post-operative complications following creation of a stoma. Living with a stoma is a challenge, not only for the patient and their carers, but also for society as a whole. Due to methodological problems of PSC assessment, the associated health-economic burden of medium- to long-term complications has been poorly described. AIM: The aim of the present study was to create a model to estimate treatment costs of PSCs using the standardized assessment Ostomy Skin Tool as a reference. The resultant model was applied to a real-life global data set of stoma patients (n = 3017) to determine the prevalence and financial burden of PSCs. METHODS: Eleven experienced stoma care nurses were interviewed to get a global understanding of a treatment algorithm that formed the basis of the cost analysis. The estimated costs were based on a seven-week treatment period. PSC costs were estimated for five underlying diagnostic categories and three levels of severity. The estimated treatment costs of severe cases of PSCs were increased 2-5 fold for the different diagnostic categories of PSCs compared with mild cases. French unit costs were applied to the global data set. RESULTS: The estimated total average cost for a seven-week treatment period (including appliances and accessories) was 263€ for those with PSCs (n = 1742) compared to 215€ for those without PSCs (n = 1172). A co-variance analysis showed that leakage level had a significant impact on PSC cost from 'rarely/never' to 'always/often' (p<0.00001) and from 'rarely/never' to 'sometimes' (p = 0.0115). CONCLUSION: PSCs are common and troublesome and the consequences are substantial, both for the patient and from a health-economic viewpoint. PSCs should be diagnosed and treated at an early stage to prevent long-term, debilitating and expensive complications.
S Varadhan, S R
2001-01-01
This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent random variables
Dubreuil, S.; Salaün, M.; Rodriguez, E.; Petitjean, F.
2018-01-01
This study investigates the construction and identification of the probability distribution of random modal parameters (natural frequencies and effective parameters) in structural dynamics. As these parameters present various types of dependence structures, the retained approach is based on pair copula construction (PCC). A literature review leads us to choose a D-Vine model for the construction of modal parameter probability distributions. Identification of this model is based on likelihood maximization, which makes it sensitive to the dimension of the distribution, namely the number of considered modes in our context. In this respect, a mode selection preprocessing step is proposed. It allows the selection of the relevant random modes for a given transfer function. The second point addressed in this study concerns the choice of the D-Vine model. Indeed, the D-Vine model is not uniquely defined. Two strategies are proposed and compared. The first one is based on the context of the study whereas the second one is purely based on statistical considerations. Finally, the proposed approaches are numerically studied and compared with respect to their capabilities, first in the identification of the probability distribution of random modal parameters and second in the estimation of the 99% quantiles of some transfer functions.
DEFF Research Database (Denmark)
Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik
2010-01-01
The construction of detailed geological models for heterogeneous settings such as clay till is important to describe transport processes, particularly with regard to potential contamination pathways. In low-permeability clay matrices transport is controlled by diffusion, but fractures and sand...... of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities...
Paladim, Daniel; Kerfriden, Pierre; Moitinho de Almeida, José; Bordas, Stéphane
2014-01-01
Homogenised constitutive laws are largely used to predict the behaviour of composite structures. Assessing the validity of such homogenised models can be done by making use of the concept of “modelling error”. First, a microscopic “faithful” -and potentially intractable- model of the structure is defined. Then, one tries to quantify the effect of the homogenisation procedure on a result that would be obtained by directly using the “faithful” model. Such an approach requires (a)...
On new cautious structural reliability models in the framework of imprecise probabilities
DEFF Research Database (Denmark)
Utkin, Lev; Kozine, Igor
2010-01-01
New imprecise structural reliability models are described in this paper. They are developed based on the imprecise Bayesian inference and are imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models. The models are applied to computing cautious structural reliability...
Chase, Thomas D.; Splawn, Keith; Christiansen, Eric L.
2007-01-01
The NASA Extravehicular Mobility Unit (EMU) micrometeoroid and orbital debris protection ability has recently been assessed against an updated, higher threat space environment model. The new environment was analyzed in conjunction with a revised EMU solid model using a NASA computer code. Results showed that the EMU exceeds the required mathematical Probability of having No Penetrations (PNP) of any suit pressure bladder over the remaining life of the program (2,700 projected hours of 2-person spacewalks). The success probability was calculated to be 0.94, versus a requirement of >0.91, for the current spacesuit's outer protective garment. In parallel to the probability assessment, potential improvements to the current spacesuit's outer protective garment were built and impact tested. A NASA light gas gun was used to launch projectiles at test items, at speeds of approximately 7 km per second. Test results showed that substantial garment improvements could be made, with mild material enhancements and moderate assembly development. The spacesuit's PNP would improve marginally with the tested enhancements, if they were available for immediate incorporation. This paper discusses the results of the model assessment process and test program. These findings add confidence to the continued use of the existing NASA EMU during International Space Station (ISS) assembly and Shuttle Operations. They provide a viable avenue for improved hypervelocity impact protection for the EMU, or for future space suits.
Bhatta, Anil; Sangani, Rajnikumar; Kolhe, Ravindra; Toque, Haroldo A; Cain, Michael; Wong, Abby; Howie, Nicole; Shinde, Rahul; Elsalanty, Mohammed; Yao, Lin; Chutkan, Norman; Hunter, Monty; Caldwell, Ruth B; Isales, Carlos; Caldwell, R William; Fulzele, Sadanand
2016-02-15
A balanced diet is crucial for healthy development and prevention of musculoskeletal related diseases. Diets high in fat content are known to cause obesity, diabetes and a number of other disease states. Our group and others have previously reported that activity of the urea cycle enzyme arginase is involved in diabetes-induced dysregulation of vascular function, due to decreases in nitric oxide formation. We hypothesized that diabetes may also elevate arginase activity in bone and bone marrow, which could lead to bone-related complications. To test this, we determined the effects of diabetes on expression and activity of arginase in bone and bone marrow stromal cells (BMSCs). We demonstrated that arginase 1 is abundantly present in the bone and BMSCs. We also demonstrated that arginase activity and expression in bone and bone marrow is up-regulated in models of diabetes induced by HFHS diet and streptozotocin (STZ). HFHS diet down-regulated expression of healthy bone metabolism markers (BMP2, COL-1, ALP, and RUNX2) and reduced bone mineral density, bone volume and trabecular thickness. However, treatment with an arginase inhibitor (ABH) prevented these bone-related complications of diabetes. An in-vitro study of BMSCs showed that high glucose treatment increased arginase activity and decreased nitric oxide production. These effects were reversed by treatment with an arginase inhibitor (ABH). Our study provides evidence that deregulation of L-arginine metabolism plays a vital role in HFHS diet-induced diabetic complications and that these complications can be prevented by treatment with arginase inhibitors. The modulation of L-arginine metabolism in disease could offer a novel therapeutic approach for osteoporosis and other musculoskeletal related diseases. Published by Elsevier Ireland Ltd.
Directory of Open Access Journals (Sweden)
Gyutae Kim
2013-01-01
Full Text Available The convolution of the transmembrane current of an excitable cell and a weighting function generates a single fiber action potential (SFAP) model by using the volume conductor theory. Here, we propose an empirical muscle IAP model with multiple Erlang probability density functions (PDFs) based on a modified Newton method. In addition, we generate SFAPs based on our IAP model and referent sources, and use the peak-to-peak ratios (PPRs) of SFAPs for model verification. Through this verification, we find that the relation between an IAP profile and the PPR of its SFAP is consistent with some previous studies, and our IAP model shows close profiles to the referent sources. Moreover, we simulate and discuss some possible ionic activities by using the Erlang PDFs in our IAP model, which might present the underlying activities of ions or their channels during an IAP.
Kim, Gyutae; Ferdjallah, Mohammed M; McKenzie, Frederic D
2013-01-01
The convolution of the transmembrane current of an excitable cell and a weighting function generates a single fiber action potential (SFAP) model by using the volume conductor theory. Here, we propose an empirical muscle IAP model with multiple Erlang probability density functions (PDFs) based on a modified Newton method. In addition, we generate SFAPs based on our IAP model and referent sources, and use the peak-to-peak ratios (PPRs) of SFAPs for model verification. Through this verification, we find that the relation between an IAP profile and the PPR of its SFAP is consistent with some previous studies, and our IAP model shows close profiles to the referent sources. Moreover, we simulate and discuss some possible ionic activities by using the Erlang PDFs in our IAP model, which might present the underlying activities of ions or their channels during an IAP.
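A minimal sketch of an IAP-like profile built as a signed mixture of Erlang densities, in the spirit of the model above; the amplitudes, shape parameters, and rates are illustrative values, not the fitted ones:

```python
import math

def erlang_pdf(t, k, lam):
    # Erlang(k, lam) density: a gamma density with integer shape k
    if t < 0:
        return 0.0
    return lam ** k * t ** (k - 1) * math.exp(-lam * t) / math.factorial(k - 1)

def iap(t, components):
    # signed mixture of Erlang densities; each (amplitude, k, lam) component
    # loosely stands in for one ionic contribution to the IAP profile
    return sum(a * erlang_pdf(t, k, lam) for a, k, lam in components)

# hypothetical depolarising (+) and repolarising (-) components
components = [(1.0, 2, 4.0), (-0.4, 5, 3.0)]
samples = [iap(0.1 * i, components) for i in range(60)]
print(round(max(samples), 3), round(min(samples), 3))
```

The fast positive component dominates early and the slower negative one later, giving the biphasic shape; fitting such amplitudes and rates to a measured profile is where the modified Newton method of the paper would come in.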
Crispin, Alexander; Klinger, Carsten; Rieger, Anna; Strahwald, Brigitte; Lehmann, Kai; Buhr, Heinz-Johannes; Mansmann, Ulrich
2017-10-01
The purpose of this study is to provide a web-based calculator predicting complication probabilities of patients undergoing colorectal cancer (CRC) surgery in Germany. Analyses were based on records of first-time CRC surgery between 2010 and February 2017, documented in the database of the Study, Documentation, and Quality Center (StuDoQ) of the Deutsche Gesellschaft für Allgemein- und Viszeralchirurgie (DGAV), a registry of CRC surgery in hospitals throughout Germany, covering demography, medical history, tumor features, comorbidity, behavioral risk factors, surgical procedures, and outcomes. Using logistic ridge regression, separate models were developed in learning samples of 6729 colon and 4381 rectum cancer patients and evaluated in validation samples of sizes 2407 and 1287. Discrimination was assessed using c statistics. Calibration was examined graphically by plotting observed versus predicted complication probabilities and numerically using Brier scores. We report validation results regarding 15 outcomes such as any major complication, surgical site infection, anastomotic leakage, bladder voiding disturbance after rectal surgery, abdominal wall dehiscence, various internistic complications, 30-day readmission, 30-day reoperation rate, and 30-day mortality. When applied to the validation samples, c statistics ranged between 0.60 for anastomosis leakage and 0.85 for mortality after rectum cancer surgery. Brier scores ranged from 0.003 to 0.127. While most models showed satisfactory discrimination and calibration, this does not preclude overly optimistic or pessimistic individual predictions. To avoid misinterpretation, one has to understand the basic principles of risk calculation and risk communication. An e-learning tool outlining the appropriate use of the risk calculator is provided.
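The ridge-penalized logistic regression underlying such a risk calculator can be sketched on synthetic data; this is a generic illustration with made-up risk factors, not the StuDoQ models:

```python
import numpy as np

def fit_ridge_logistic(X, y, lam=0.01, lr=0.1, steps=2000):
    # logistic regression with an L2 (ridge) penalty, fitted by plain
    # gradient descent; lam controls the shrinkage of the coefficients
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * (X.T @ (p - y) / len(y) + lam * w)
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # three synthetic risk factors
true_w = np.array([1.5, -1.0, 0.0])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

w = fit_ridge_logistic(X, y)
risk = 1 / (1 + np.exp(-X @ w))          # predicted complication probabilities
print(np.round(w, 2))
```

The ridge penalty shrinks coefficients toward zero, which stabilizes the many per-outcome models fitted on registry data; discrimination and calibration of the resulting `risk` values would then be checked with c statistics and Brier scores as in the abstract.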
Lessard, Sabin
2010-01-01
Recurrence equations for the number of types and the frequency of each type in a random sample drawn from a finite population undergoing discrete, nonoverlapping generations and reproducing according to the Cannings exchangeable model are deduced under the assumption of a mutation scheme with infinitely many types. The case of overlapping generations in discrete time is also considered. The equations are developed for the Wright-Fisher model and the Moran model, and extended...
Finite Genome Length Corrections for the Mean Fitness and Gene Probabilities in Evolution Models
Kirakosyan, Zara; Saakian, David B.; Hu, Chin-Kun
2011-07-01
Using the Hamilton-Jacobi equation approach to study genomes of length L, we obtain 1/L corrections for the steady-state population distributions and mean fitness functions for the horizontal gene transfer model, as well as for the diploid evolution model with general fitness landscapes. Our numerical solutions confirm the obtained analytic equations. Our method could be applied to the general case of nonlinear Markov models.
Hsieh, Chih-Sheng; Lee, Lung fei
2017-01-01
In this paper, we model network formation and network interactions under a unified framework. The key feature of our model is to allow individuals to respond to incentives stemming from interaction benefits on certain activities when they choose friends (network links), while capturing homophily in terms of unobserved characteristic variables in network formation and activities. There are two advantages of this modeling approach: first, one can evaluate whether incentives from certain interac...
The probability of connectivity in a hyperbolic model of complex networks
Bode, Michel; Fountoulakis, Nikolaos; Müller, Tobias
2016-01-01
We consider a model for complex networks that was introduced by Krioukov et al. (Phys Rev E 82 (2010) 036106). In this model, N points are chosen randomly inside a disk on the hyperbolic plane according to a distorted version of the uniform distribution and any two of them are joined by an edge if
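A minimal sketch of the Krioukov et al. hyperbolic random graph model referred to above: points are placed in a hyperbolic disk with a radially biased density and joined when their hyperbolic distance is below the disk radius. Parameter values are illustrative.

```python
import math, random

random.seed(7)
N, alpha, nu = 500, 0.8, 1.0
R = 2 * math.log(N / nu)      # radius of the hyperbolic disk in this model

def sample_point():
    # radial density proportional to sinh(alpha * r), sampled by inverse CDF;
    # the angle is uniform, giving the "distorted uniform" distribution
    u = random.random()
    r = math.acosh(1 + u * (math.cosh(alpha * R) - 1)) / alpha
    return r, random.uniform(0, 2 * math.pi)

def hyp_dist(p, q):
    # hyperbolic distance via the hyperbolic law of cosines
    (r1, t1), (r2, t2) = p, q
    dt = math.pi - abs(math.pi - abs(t1 - t2))
    c = math.cosh(r1) * math.cosh(r2) - math.sinh(r1) * math.sinh(r2) * math.cos(dt)
    return math.acosh(max(c, 1.0))

pts = [sample_point() for _ in range(N)]
edges = [(i, j) for i in range(N) for j in range(i + 1, N)
         if hyp_dist(pts[i], pts[j]) <= R]      # connect points closer than R
avg_deg = 2 * len(edges) / N
print(avg_deg)
```

Points near the disk center become high-degree hubs, which is how this geometric model reproduces the heavy-tailed degree distributions of complex networks; connectivity of the resulting graph depends on alpha, the question the paper studies.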
Taylor, Faith E.; Santangelo, Michele; Marchesini, Ivan; Malamud, Bruce D.
2013-04-01
During a landslide triggering event, the tens to thousands of landslides resulting from the trigger (e.g., earthquake, heavy rainfall) may block a number of sections of the road network, posing a risk to rescue efforts, logistics and accessibility to a region. Here, we present initial results from a semi-stochastic model we are developing to evaluate the probability of landslides intersecting a road network and the network-accessibility implications of this across a region. This was performed in the open source GRASS GIS software, where we took 'model' landslides and dropped them on a 79 km2 test area in Collazzone, Umbria, Central Italy, with a given road network (major and minor roads, 404 km in length) and already determined landslide susceptibilities. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. The number of landslide areas selected for each triggered event iteration was chosen to have an average density of 1 landslide km-2, i.e. 79 landslide areas chosen randomly for each iteration. Landslides were then 'dropped' over the region semi-stochastically: (i) random points were generated across the study region; (ii) based on the landslide susceptibility map, points were accepted/rejected based on the probability of a landslide occurring at that location. After a point was accepted, it was assigned a landslide area (AL) and length-to-width ratio. Landslide intersections with roads were then assessed, and indices such as the location, number and size of road blockages were recorded. The GRASS-GIS model was performed 1000 times in a Monte-Carlo type simulation. Initial results show that for a landslide triggering event of 1 landslide km-2 over a 79 km2 region with 404 km of road, the number of road blockages
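Two building blocks of the semi-stochastic procedure, sampling landslide areas from an inverse-gamma distribution with a power-law tail of about -2.4, and susceptibility-based rejection placement, can be sketched as follows. The grid, the susceptibility values, and the simplified two-parameter inverse-gamma (rather than the three-parameter form of the study) are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_landslide_areas(n, shape=1.4, scale=9.6e-4):
    # if G ~ Gamma(shape), then scale / G is inverse-gamma distributed:
    # power-law tail with exponent -(shape + 1) = -2.4 and a rollover
    # (mode) at scale / (shape + 1) = 4e-4 km^2, i.e. about 400 m^2
    return scale / rng.gamma(shape, 1.0, size=n)

# hypothetical susceptibility map over a 50 x 50 grid of cells
susceptibility = rng.random((50, 50)) * 0.2

def drop_landslides(n_target):
    # semi-stochastic placement: accept a random cell with probability
    # equal to its landslide susceptibility (rejection sampling)
    placed = []
    while len(placed) < n_target:
        r, c = rng.integers(50), rng.integers(50)
        if rng.random() < susceptibility[r, c]:
            placed.append((r, c, sample_landslide_areas(1)[0]))
    return placed

event = drop_landslides(79)   # ~1 landslide km^-2 over a 79 km^2 region
print(len(event))
```

Repeating `drop_landslides` many times and intersecting the placed landslide footprints with the road layer gives the Monte-Carlo distribution of road blockages described in the abstract.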
Combining 3d Volume and Mesh Models for Representing Complicated Heritage Buildings
Tsai, F.; Chang, H.; Lin, Y.-W.
2017-08-01
This study developed a simple but effective strategy to combine 3D volume and mesh models for representing complicated heritage buildings and structures. The idea is to seamlessly integrate 3D parametric or polyhedral models and mesh-based digital surfaces to generate a hybrid 3D model that can take advantage of both modeling methods. The proposed hybrid model generation framework is separated into three phases. Firstly, after acquiring or generating 3D point clouds of the target, these 3D points are partitioned into different groups. Secondly, a parametric or polyhedral model of each group is generated based on plane and surface fitting algorithms to represent the basic structure of that region. A "bare-bones" model of the target can subsequently be constructed by connecting all 3D volume element models. In the third phase, the constructed bare-bones model is used as a mask to remove points enclosed by the bare-bones model from the original point clouds. The remaining points are then connected to form 3D surface mesh patches. The boundary points of each surface patch are identified and these boundary points are projected onto the surfaces of the bare-bones model. Finally, new meshes are created to connect the projected points and original mesh boundaries to integrate the mesh surfaces with the 3D volume model. The proposed method was applied to an open-source point cloud data set and point clouds of a local historical structure. Preliminary results indicated that the reconstructed hybrid models using the proposed method can retain both fundamental 3D volume characteristics and accurate geometric appearance with fine details. The reconstructed hybrid models can also be used to represent targets in different levels of detail according to user and system requirements in different applications.
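The plane-fitting step in the second phase can be illustrated with a least-squares plane fit via SVD, a standard approach for extracting a planar element from a point group; the synthetic data and tolerance below are invented for demonstration and are not the study's implementation.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point group: returns (centroid, unit normal).

    The normal is the right singular vector of the centered points with the
    smallest singular value.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Illustrative: noisy samples of the plane z = 0
pts = np.random.default_rng(0).normal(size=(200, 3)) * [1.0, 1.0, 0.01]
c, n = fit_plane(pts)
# The recovered normal should be close to +/-(0, 0, 1)
```

A polyhedral "bare-bones" element would then be assembled from several such fitted planes.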
Directory of Open Access Journals (Sweden)
Seyed Shamseddin Alizadeh
2014-12-01
Full Text Available Background: Falls from height are one of the main causes of fatal occupational injuries. The objective of this study was to present a model for estimating the occurrence probability of falling from height. Methods: In order to make a list of factors affecting falls, we used four expert groups' judgment, a literature review and an available database. Then the validity and reliability of the designed questionnaire were determined and Bayesian networks were built. The built network, nodes and curves were quantified. For network sensitivity analysis, four types of analysis were carried out. Results: A Bayesian network for assessment of posterior probabilities of falling from height is proposed. The presented Bayesian network model shows the interrelationships among 37 causes affecting falling from height and can calculate its posterior probabilities. The most important factors affecting falling were Non-compliance with safety instructions for work at height (0.127), Lack of safety equipment for work at height (0.094) and Lack of safety instructions for work at height (0.071), respectively. Conclusion: The proposed Bayesian network was used to determine how different causes could affect falling from height at work. The findings of this study can be used to decide on falling accident prevention programs.
Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.
2009-01-01
To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.
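One plausible way to model short-circuit probability as a function of voltage is a logistic curve; the abstract does not specify the model form, and the (voltage, shorted) observations below are invented for illustration, not the experiment's data.

```python
import numpy as np

# Hypothetical (voltage, shorted?) observations -- not the paper's data.
volts = np.array([5, 5, 10, 10, 15, 15, 20, 20, 25, 25, 30, 30], dtype=float)
short = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1, 1], dtype=float)

def fit_logistic(x, y, lr=0.01, steps=20000):
    """Fit P(short | V) = sigmoid(b0 + b1 * V) by gradient descent on the
    logistic log-loss (no external dependencies)."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
        g0, g1 = np.mean(p - y), np.mean((p - y) * x)
        b0, b1 = b0 - lr * g0, b1 - lr * g1
    return b0, b1

b0, b1 = fit_logistic(volts, short)
# Estimated shorting probability at 25 V under this assumed model
p25 = 1.0 / (1.0 + np.exp(-(b0 + b1 * 25.0)))
```

The fitted slope b1 should be positive: higher voltage makes breakdown of the contact resistance, and hence a short, more likely.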
Nemeth, Noel
2013-01-01
Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials--including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.
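The randomly-oriented-flaw assumption can be illustrated by sampling uniformly distributed unit normals on the sphere and evaluating the normal traction on each flaw plane under a uniaxial load; the stress value is arbitrary and this sketch is not the report's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_unit_normals(n):
    """Uniformly distributed flaw normals: normalize isotropic Gaussian vectors."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# Normal traction on a flaw plane with normal n under uniaxial stress sigma
# applied along z: sigma_n = sigma * n_z**2 (shear-insensitive flaws).
sigma = 100.0  # MPa, illustrative
normals = random_unit_normals(100_000)
sigma_n = sigma * normals[:, 2] ** 2
# For uniform orientations E[n_z^2] = 1/3, so the mean normal stress ~ sigma/3.
```

Shear-sensitive criteria and anisotropic strength would reweight this orientation distribution, which is the effect the report quantifies.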
Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Sepehri, Mohammad Mehdi
2014-01-01
Falls from height are one of the main causes of fatal occupational injuries. The objective of this study was to present a model for estimating the occurrence probability of falling from height. In order to make a list of factors affecting falls, we used four expert groups' judgment, a literature review and an available database. Then the validity and reliability of the designed questionnaire were determined and Bayesian networks were built. The built network, nodes and curves were quantified. For network sensitivity analysis, four types of analysis were carried out. A Bayesian network for assessment of posterior probabilities of falling from height is proposed. The presented Bayesian network model shows the interrelationships among 37 causes affecting falling from height and can calculate its posterior probabilities. The most important factors affecting falling were Non-compliance with safety instructions for work at height (0.127), Lack of safety equipment for work at height (0.094) and Lack of safety instructions for work at height (0.071), respectively. The proposed Bayesian network was used to determine how different causes could affect falling from height at work. The findings of this study can be used to decide on falling accident prevention programs.
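A posterior computation of the kind described can be sketched by full enumeration on a toy network; the three causes and all conditional probability values below are invented for illustration (the paper's network has 37 causes and was quantified from expert judgment).

```python
import itertools

# Toy network (hypothetical CPTs, far smaller than the paper's model):
# Instructions -> Compliance -> Fall, Equipment -> Fall
P_instr = {True: 0.9, False: 0.1}           # safety instructions exist
P_equip = {True: 0.8, False: 0.2}           # safety equipment available
P_comp = {True: {True: 0.85, False: 0.15},  # P(compliance | instructions)
          False: {True: 0.3, False: 0.7}}

def P_fall(comp, equip):                    # P(fall = True | parents)
    return {(True, True): 0.02, (True, False): 0.15,
            (False, True): 0.20, (False, False): 0.50}[(comp, equip)]

def posterior_fall():
    """Marginal P(fall = True) by enumeration over all parent states."""
    total = 0.0
    for i, e, c in itertools.product([True, False], repeat=3):
        total += P_instr[i] * P_equip[e] * P_comp[i][c] * P_fall(c, e)
    return total

p = posterior_fall()
```

Setting evidence (e.g. clamping Equipment to False) and renormalizing would give the conditional posteriors the sensitivity analysis relies on.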
National Research Council Canada - National Science Library
2016-01-01
... are justified, and the assessment problem of the protected object vulnerability is formulated. The main advantage of the developed model is the extensive opportunities of formalization of diverse information on the security status of the object...
National Aeronautics and Space Administration
Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor)
2012-01-01
This invention develops a mathematical model to describe battery behavior during individual discharge cycles as well as over its cycle life. The basis for the form of the model has been linked to the internal processes of the battery and validated using experimental data. Effects of temperature and load current have also been incorporated into the model. Subsequently, the model has been used in a Particle Filtering framework to make predictions of remaining useful life for individual discharge cycles as well as for cycle life. The prediction performance was found to be satisfactory as measured by performance metrics customized for prognostics for a sample case. The work presented here provides initial steps towards a comprehensive health management solution for energy storage devices.
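A minimal particle-filtering sketch in this spirit is shown below, using an assumed exponential capacity-fade model rather than the invention's actual battery model; all parameters, noise levels, and the end-of-life threshold are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed capacity-fade model: C_k = C0 * exp(-lam * k) + measurement noise.
true_lam, C0, n_cycles = 0.01, 1.0, 60
obs = C0 * np.exp(-true_lam * np.arange(n_cycles)) + rng.normal(0, 0.005, n_cycles)

# Particle filter over the unknown fade rate lam
n_p = 2000
particles = rng.uniform(0.001, 0.05, n_p)   # prior over lam
weights = np.ones(n_p) / n_p
for k, c in enumerate(obs):
    pred = C0 * np.exp(-particles * k)
    weights *= np.exp(-0.5 * ((c - pred) / 0.005) ** 2)  # Gaussian likelihood
    weights /= weights.sum()
    # Resample when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < n_p / 2:
        idx = rng.choice(n_p, size=n_p, p=weights)
        particles = particles[idx] + rng.normal(0, 1e-4, n_p)  # jitter
        weights = np.ones(n_p) / n_p

lam_hat = float(np.sum(weights * particles))
# Remaining useful life: cycles until capacity halves (illustrative threshold)
rul = np.log(2.0) / lam_hat - (n_cycles - 1)
```

The patented framework additionally incorporates temperature and load-current effects and tracks the full state, not just a single fade-rate parameter.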
Ding, Aidong Adam; Hsieh, Jin-Jian; Wang, Weijing
2015-01-01
Bivariate survival analysis has wide applications. In the presence of covariates, most literature focuses on studying their effects on the marginal distributions. However covariates can also affect the association between the two variables. In this article we consider the latter issue by proposing a nonstandard local linear estimator for the concordance probability as a function of covariates. Under the Clayton copula, the conditional concordance probability has a simple one-to-one correspondence with the copula parameter for different data structures including those subject to independent or dependent censoring and dependent truncation. The proposed method can be used to study how covariates affect the Clayton association parameter without specifying marginal regression models. Asymptotic properties of the proposed estimators are derived and their finite-sample performances are examined via simulations. Finally, for illustration, we apply the proposed method to analyze a bone marrow transplant data set.
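Under the Clayton copula the (unconditional) concordance probability has a simple closed form in the copula parameter, via Kendall's tau = theta/(theta + 2) and P(concordant) = (1 + tau)/2. The sketch below, with an assumed theta, checks this by simulating independent pairs; it illustrates the copula-parameter link only, not the paper's local linear covariate estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_clayton(theta, n):
    """Sample (U, V) from a Clayton copula via the conditional-inverse method."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

theta = 2.0                           # assumed association parameter
tau = theta / (theta + 2.0)           # Kendall's tau under Clayton
concordance_prob = (1.0 + tau) / 2.0  # P(concordant pair) = (1 + tau) / 2

# Monte Carlo check: concordance frequency of independent pairs
u1, v1 = sample_clayton(theta, 20_000)
u2, v2 = sample_clayton(theta, 20_000)
emp = float(np.mean((u1 - u2) * (v1 - v2) > 0))
```

In the paper this correspondence is exploited conditionally: estimating the concordance probability given covariates identifies the Clayton parameter without marginal regression models.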
Hedell, Ronny; Stephansson, Olga; Mostad, Petter; Andersson, Mats Gunnar
2017-01-16
Efficient and correct evaluation of sampling results with respect to hypotheses about the concentration or distribution of bacteria generally requires knowledge about the performance of the detection method. To assess the sensitivity of the detection method an experiment is usually performed where the target matrix is spiked (i.e. artificially contaminated) with different concentrations of the bacteria, followed by analyses of the samples using the pre-enrichment method and the analytical detection method of interest. For safety reasons or because of economic or time limits it is not always possible to perform exactly such an experiment, with the desired number of samples. In this paper, we show how heterogeneous data from diverse sources may be combined within a single model to obtain not only estimates of detection probabilities, but also, crucially, uncertainty estimates. We indicate how such results can then be used to obtain optimal conclusions about presence of bacteria, and illustrate how strongly the sampling results speak in favour of or against contamination. In our example, we consider the case when B. cereus is used as surrogate for B. anthracis, for safety reasons. The statistical modelling of the detection probabilities and of the growth characteristics of the bacteria types is based on data from four experiments where different matrices of food were spiked with B. anthracis or B. cereus and analysed using plate counts and qPCR. We show how flexible and complex Bayesian models, together with inference tools such as OpenBUGS, can be used to merge information about detection probability curves. Two different modelling approaches, differing in whether the pre-enrichment step and the PCR detection step are modelled separately or together, are applied. The relative importance on the detection curves for various existing data sets are evaluated and illustrated. Copyright © 2016 Elsevier B.V. All rights reserved.
O'Connor, Maja; Lasgaard, Mathias; Shevlin, Mark; Guldin, Mai-Britt
2010-10-01
The aim of this study was to assess the factorial structure of complicated grief (CG) and investigate the relationship between CG and posttraumatic stress disorder (PTSD) through the assessment of models combining both constructs. The questionnaire was completed by elderly, married respondents with a history of at least one significant, interpersonal loss (145 males and 147 females, 60-81 years). Confirmatory factor analysis (CFA) supported a two-factor model of separation and traumatic distress in CG. To investigate the relationship between CG and PTSD three combined models were specified and estimated using CFA. A model where all five factors, the two factors of CG and the three factors of PTSD, as defined by the DSM-IV, were allowed to correlate provided the best fit. The results indicated a considerable overlap between the dimensions of CG and PTSD, and complicated grief is a construct that appears to be largely accounted for by the intrusive component of PTSD in particular. Copyright 2010 Elsevier Ltd. All rights reserved.
Dynamic update with probabilities
Van Benthem, Johan; Gerbrandy, Jelle; Kooi, Barteld
2009-01-01
Current dynamic-epistemic logics model different types of information change in multi-agent scenarios. We generalize these logics to a probabilistic setting, obtaining a calculus for multi-agent update with three natural slots: prior probability on states, occurrence probabilities in the relevant
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2013-01-01
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
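For the multinomial logit special case the CPGF is the log-sum-exp of the utilities, and its gradient recovers the logit choice probabilities (the softmax). A small numerical check, with utility values assumed for illustration:

```python
import numpy as np

def cpgf_logit(u):
    """Log-sum-exp: the choice-probability generating function of the MNL model."""
    m = u.max()
    return m + np.log(np.sum(np.exp(u - m)))

def choice_probs(u, eps=1e-6):
    """Numerical gradient of the CPGF -- recovers the logit choice probabilities."""
    grad = np.zeros_like(u)
    for i in range(u.size):
        d = np.zeros_like(u)
        d[i] = eps
        grad[i] = (cpgf_logit(u + d) - cpgf_logit(u - d)) / (2 * eps)
    return grad

u = np.array([1.0, 0.0, -0.5])      # assumed systematic utilities
p_grad = choice_probs(u)
p_softmax = np.exp(u) / np.exp(u).sum()
```

Other CPGFs (nested and cross-nested logit, more general MEV generators) give other ARUM choice models through exactly the same gradient relationship.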
Khan, Akhtar Nawaz
2017-11-01
Currently, analytical models are used to compute approximate blocking probabilities in opaque and all-optical WDM networks with homogeneous link capacities. Existing analytical models can also be extended to opaque WDM networking with heterogeneous link capacities due to the wavelength conversion at each switch node. However, existing analytical models cannot be utilized for all-optical WDM networking with a heterogeneous structure of link capacities due to the wavelength continuity constraint and unequal numbers of wavelength channels on different links. In this work, a mathematical model is extended for computing approximate network blocking probabilities in heterogeneous all-optical WDM networks in which the path blocking is dominated by the link along the path with the fewest wavelength channels. A wavelength assignment scheme is also proposed for dynamic traffic, termed last-fit-first wavelength assignment, in which the wavelength channel with the maximum index is assigned first to a lightpath request. Due to the heterogeneous structure of link capacities and the wavelength continuity constraint, the wavelength channels with maximum indexes are utilized for minimum-hop routes. Similarly, the wavelength channels with minimum indexes are utilized for multi-hop routes between source and destination pairs. The proposed scheme has lower blocking probability values compared to the existing heuristic for wavelength assignments. Finally, numerical results are computed in different network scenarios which are approximately equal to values obtained from simulations.
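The last-fit-first rule can be sketched as follows: under the wavelength continuity constraint, a lightpath needs the same wavelength index free on every link of its path, and the highest such index is chosen first. Link names and capacities below are invented, and this is a simplification of the paper's scheme (no routing, no dynamic departures).

```python
def last_fit_first(path_links, capacity, in_use):
    """Return the highest-index wavelength free on every link of the path,
    or None if the request must be blocked.

    capacity[l]: number of wavelength channels on link l (heterogeneous).
    in_use[l]:   set of occupied wavelength indices on link l.
    """
    w_max = min(capacity[l] for l in path_links)  # continuity caps at smallest link
    for w in range(w_max - 1, -1, -1):            # try the highest index first
        if all(w not in in_use[l] for l in path_links):
            for l in path_links:
                in_use[l].add(w)
            return w
    return None  # blocked

capacity = {"A-B": 8, "B-C": 4}
in_use = {"A-B": set(), "B-C": set()}
w_multi = last_fit_first(["A-B", "B-C"], capacity, in_use)   # limited by B-C
w_single = last_fit_first(["A-B"], capacity, in_use)         # free to use top index
```

This illustrates why high-index channels end up serving short routes: multi-hop paths are capped by their smallest link, leaving the top indices of large links to minimum-hop traffic.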
Business model risk analysis: predicting the probability of business network profitability
Johnson, Pontus; Iacob, Maria Eugenia; Valja, Margus; Magnusson, Christer; Ladhe, Tobias; van Sinderen, Marten J.; Oude Luttighuis, P.H.W.M.; Folmer, Erwin Johan Albert; Bosems, S.
In the design phase of business collaboration, it is desirable to be able to predict the profitability of the business-to-be. Therefore, techniques to assess qualities such as costs, revenues, risks, and profitability have been previously proposed. However, they do not allow the modeler to properly
Modeling of Kidney Hemodynamics: Probability-Based Topology of an Arterial Network
DEFF Research Database (Denmark)
Postnov, Dmitry D; Marsh, Donald J; Postnov, Dmitry E
2016-01-01
. We provide experimental data and a new modeling approach to clarify this problem. To resolve details of microvascular structure, we collected 3D data from more than 150 afferent arterioles in an optically cleared rat kidney. Using these results together with published micro-computed tomography (μ...
DEFF Research Database (Denmark)
Schjær-Jacobsen, Hans
2012-01-01
Two basic approaches to quantitative non-monotonic modeling of economic uncertainty are available today and have been applied to a number of real world uncertainty problems, such as investment analyses and budgeting of large infra structure projects. This paper further contributes to the understa...
Mirman, Daniel; Estes, Katharine Graf; Magnuson, James S.
2010-01-01
Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network…
Litchford, Ron J.; Jeng, San-Mou
1992-01-01
The performance of a recently introduced statistical transport model for turbulent particle dispersion is studied here for rigid particles injected into a round turbulent jet. Both uniform and isosceles triangle pdfs are used. The statistical sensitivity to parcel pdf shape is demonstrated.
Smith, O. E.; Adelfang, S. I.
1981-01-01
A model of the largest gust amplitude and gust length is presented which uses the properties of the bivariate gamma distribution. The gust amplitude and length are strongly dependent on the filter function; the amplitude increases with altitude and is larger in winter than in summer.
Directory of Open Access Journals (Sweden)
E. Chumak
2014-04-01
Full Text Available The concept model and various approaches to creating such models are analyzed in the paper. The essence of the model, which reflects the process of implementing all the components of the designed teaching methodology in their interaction, is presented. A professionally oriented education model for the probability theory and stochastic processes course for future engineers is proposed by the author. It consists of four parts: a theoretical unit; a methodological unit; a content and organization unit; and a control and effectiveness unit. The application of the methodological foundations of the theory of professionally oriented, heuristic, problem-based learning for forming intensive learning activities of students during practical classes is shown. Organizational methods, forms and tools of training, which promote the formation of the internal purposes of students, are presented in the paper. Methods of designing a system of professionally oriented tasks and applying it in practical classes are given by the author. Some ways of developing students' skills and abilities during generalization and systematization of knowledge, integrated practical exercises, laboratory works and business games are considered. Indicators of the formation levels of training activities motivation, professional motivation and self-motivation, levels of knowledge and skills in the probability theory and stochastic processes course, levels of development of professional and analytical thinking, and the level of applying some e-tools are analyzed by the author. The possibility of using measuring tools, including questionnaires, surveys, a freshman test, modular tests, exams, special engineering disciplines tests and current tests, is underlined.
On the probability distribution of stock returns in the Mike-Farmer model
Gu, G.-F.; Zhou, W.-X.
2009-02-01
Recently, Mike and Farmer have constructed a very powerful and realistic behavioral model to mimic the dynamic process of stock price formation based on the empirical regularities of order placement and cancelation in a purely order-driven market, which can successfully reproduce the whole distribution of returns (not only the well-known power-law tails), together with several other important stylized facts. There are three key ingredients in the Mike-Farmer (MF) model: the long memory of order signs characterized by the Hurst index Hs, the distribution of relative order prices x in reference to the same best price described by a Student distribution (or Tsallis' q-Gaussian), and the dynamics of order cancelation. They showed that different values of the Hurst index Hs and the degrees of freedom αx of the Student distribution can always produce power-law tails in the return distribution fr(r) with different tail exponent αr. In this paper, we study the origin of the power-law tails of the return distribution fr(r) in the MF model, based on extensive simulations with different combinations of the left part L(x) for x < 0 and the right part R(x) for x > 0 of fx(x). We find that power-law tails appear only when L(x) has a power-law tail, no matter whether R(x) has a power-law tail or not. In addition, we find that the distributions of returns in the MF model at different timescales can be well modeled by the Student distributions, whose tail exponents are close to the well-known cubic law and increase with the timescale.
National Research Council Canada - National Science Library
CUKER, A; AREPALLY, G; CROWTHER, M. A; RICE, L; DATKO, F; HOOK, K; PROPERT, K. J; KUTER, D. J; ORTEL, T. L; KONKLE, B. A; CINES, D. B
2010-01-01
.... Objectives: To develop a pre‐test clinical scoring model for HIT based on broad expert opinion that may be useful in guiding clinical decisions regarding therapy. Patients/methods: A pre...
Mahmud, Zamalia; Porter, Anne; Salikin, Masniyati; Ghani, Nor Azura Md
2015-12-01
Students' understanding of probability concepts have been investigated from various different perspectives. Competency on the other hand is often measured separately in the form of test structure. This study was set out to show that perceived understanding and competency can be calibrated and assessed together using Rasch measurement tools. Forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW have volunteered to participate in the study. Rasch measurement which is based on a probabilistic model is used to calibrate the responses from two survey instruments and investigate the interactions between them. Data were captured from the e-learning platform Moodle where students provided their responses through an online quiz. The study shows that majority of the students perceived little understanding about conditional and independent events prior to learning about it but tend to demonstrate a slightly higher competency level afterward. Based on the Rasch map, there is indication of some increase in learning and knowledge about some probability concepts at the end of the two weeks lessons on probability concepts.
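Rasch measurement models the probability of a correct (or endorsed) response from the difference between a person's ability and an item's difficulty on a common logit scale. A minimal sketch of the dichotomous Rasch model, with parameter values chosen for illustration:

```python
import math

def rasch_prob(theta, b):
    """Dichotomous Rasch model: P(person with ability theta succeeds on an
    item of difficulty b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals item difficulty, success probability is exactly 0.5,
# which is what places persons and items on the same Rasch map.
p_equal = rasch_prob(0.7, 0.7)
p_easy = rasch_prob(1.5, 0.0)   # able student, easy item
p_hard = rasch_prob(-0.5, 1.0)  # weaker student, hard item
```

Calibrating both the perceived-understanding and competency responses under this one probabilistic model is what lets the study place them on a common map and compare them.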
Directory of Open Access Journals (Sweden)
2016-01-01
Full Text Available The article is devoted to the realization of the principles of the probabilistic and linguistic approach to the formalization, presentation and subsequent processing of diverse information on the security status of important technical objects. On the basis of the process analysis of the potential infringer's overcoming of the safety system, the main restrictions are justified, and the assessment problem of the protected object vulnerability is formulated. The main advantage of the developed model is the extensive opportunities for formalization of diverse information on the security status of the object. On the basis of the developed model analysis two conclusions are drawn. The first conclusion is that the ambiguity of information formalized in the context of the developed model is mainly of a subjectively colored character, as the source of information is an expert with his knowledge and experience of the potential infringer. The second important conclusion is that although theoretically the choice of the next impact on the safe operation system of important technical objects depends on the whole background of states and influences, in practice the influences available for the «neutralization» of a functional element in the safe operation system of an important technical object are given in the form of a probabilistic and linguistic syndrome.
Directory of Open Access Journals (Sweden)
Soojeong Lee
2013-10-01
Full Text Available The maximum amplitude algorithm (MAA) is generally utilized in the estimation of the pressure values, and it uses heuristically obtained ratios of systolic and diastolic oscillometric amplitude to the mean arterial pressure (known as systolic and diastolic ratios) in order to estimate the systolic and diastolic pressures. This paper proposes a Bayesian model to estimate the systolic and diastolic ratios. These ratios are an improvement over the single fixed systolic and diastolic ratios used in the algorithms that are available in the literature. The proposed method shows lower mean difference (MD) with standard deviation (SD) compared to the MAA for both SBP and DBP consistently in all the five measurements.
Lee, Soojeong; Jeon, Gwanggil; Lee, Gangseong
2013-01-01
The maximum amplitude algorithm (MAA) is generally utilized in the estimation of the pressure values, and it uses heuristically obtained ratios of systolic and diastolic oscillometric amplitude to the mean arterial pressure (known as systolic and diastolic ratios) in order to estimate the systolic and diastolic pressures. This paper proposes a Bayesian model to estimate the systolic and diastolic ratios. These ratios are an improvement over the single fixed systolic and diastolic ratios used in the algorithms that are available in the literature. The proposed method shows lower mean difference (MD) with standard deviation (SD) compared to the MAA for both SBP and DBP consistently in all the five measurements. PMID:24152924
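The fixed-ratio MAA that the Bayesian model improves upon can be sketched on a synthetic oscillometric envelope; the characteristic ratios and the Gaussian envelope below are assumptions for illustration, not the paper's data or its estimated ratios.

```python
import numpy as np

def maa(cuff_pressure, osc_amplitude, sys_ratio=0.55, dia_ratio=0.75):
    """Maximum amplitude algorithm with fixed characteristic ratios.

    MAP is the cuff pressure at the oscillation-amplitude peak; SBP (DBP) is
    the pressure above (below) MAP where the amplitude is closest to
    sys_ratio (dia_ratio) times the maximum. Ratio values are illustrative.
    """
    k = int(np.argmax(osc_amplitude))
    a_max = osc_amplitude[k]
    map_ = cuff_pressure[k]
    # The cuff deflates, so samples before the peak are at higher pressure.
    sys_idx = int(np.argmin(np.abs(osc_amplitude[:k] - sys_ratio * a_max)))
    dia_idx = k + int(np.argmin(np.abs(osc_amplitude[k:] - dia_ratio * a_max)))
    return cuff_pressure[sys_idx], map_, cuff_pressure[dia_idx]

# Synthetic Gaussian envelope over a deflating cuff (180 -> 40 mmHg)
p = np.linspace(180, 40, 500)
amp = np.exp(-((p - 95.0) ** 2) / (2 * 20.0 ** 2))  # amplitude peak (MAP) at 95 mmHg
sbp, map_, dbp = maa(p, amp)
```

The paper's contribution is to replace the two fixed ratios with a Bayesian posterior estimate, which the sketch does not attempt.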
Jacobsen, J L; Saleur, H
2008-02-29
We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L ≫ 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy S_VB = N_c ln 2 = (4 ln 2/π²) ln L, disproving a recent conjecture that this should be related with the von Neumann entropy, and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.
Zhou, Su-Min; Das, Shiva; Wang, Zhiheng; Marks, Lawrence B
2004-09-01
The generalized equivalent uniform dose (GEUD) model uses a power-law formalism, where the outcome is related to the dose via a power law. We herein investigate the mathematical compatibility between this GEUD model and the Poisson statistics based tumor control probability (TCP) model. The GEUD and TCP formulations are combined and subjected to a compatibility constraint equation. This compatibility constraint equates tumor control probability from the original heterogeneous target dose distribution to that from the homogeneous dose from the GEUD formalism. It is shown that this constraint equation possesses a unique, analytical closed-form solution which relates radiation dose to the tumor cell survival fraction. It is further demonstrated that, when there is no positive threshold or finite critical dose in the tumor response to radiation, this relationship is not bounded within the realistic cell survival limits of 0%-100%. Thus, the GEUD and TCP formalisms are, in general, mathematically inconsistent. However, when a threshold dose or finite critical dose exists in the tumor response to radiation, there is a unique mathematical solution for the tumor cell survival fraction that allows the GEUD and TCP formalisms to coexist, provided that all portions of the tumor are confined within certain specific dose ranges.
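The constraint described above can be restated compactly; the notation below (v_i, D_i, a, N, SF) is assumed for illustration and follows common usage rather than the paper's exact symbols.

```latex
% v_i = fractional tumor volume receiving dose D_i, a = GEUD volume parameter,
% N = initial clonogen number, SF(D) = cell survival fraction at dose D.
\mathrm{GEUD} = \Bigl(\sum_i v_i D_i^{\,a}\Bigr)^{1/a},
\qquad
\mathrm{TCP}(D) = \exp\bigl(-N\,\mathrm{SF}(D)\bigr).
% Compatibility constraint: the heterogeneous dose distribution and its GEUD
% must predict the same control probability,
\prod_i \exp\bigl(-N v_i\,\mathrm{SF}(D_i)\bigr)
  = \exp\bigl(-N\,\mathrm{SF}(\mathrm{GEUD})\bigr)
\;\Longleftrightarrow\;
\sum_i v_i\,\mathrm{SF}(D_i)
  = \mathrm{SF}\Bigl(\bigl(\textstyle\sum_i v_i D_i^{\,a}\bigr)^{1/a}\Bigr).
```

The paper's finding is that the survival fraction solving this functional equation is only physically admissible (bounded in 0%-100%) when a threshold or finite critical dose exists.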
Directory of Open Access Journals (Sweden)
Mi Mi Ko
2016-01-01
Full Text Available Background. Pattern identification (PI) is the basic system for diagnosis of patients in traditional Korean medicine (TKM). The purpose of this study was to identify misclassification objects in the discriminant model of PI for improving the classification accuracy of PI for stroke. Methods. The study included 3306 patients with stroke who were admitted to 15 TKM hospitals from June 2006 to December 2012. We derive the four kinds of measure (D, R, S, and C score) based on the pattern of the profile graphs according to classification types. The proposed measures are applied to the data to evaluate how well those detect misclassification objects. Results. In 10-20% of the filtered data, the misclassification rate of the C score was highest compared to those rates of the other scores (42.60% and 41.15%, respectively). In 30% of the filtered data, the misclassification rate of the R score was highest compared to those rates of the other scores (40.32%). And, in 40-90% of the filtered data, the misclassification rate of the D score was highest compared to those rates of the other scores. Additionally, we can derive the same result of the C score from a multiple regression model with two independent variables. Conclusions. The results of this study should assist the development of diagnostic standards in TKM.
Liu, Zhao-Jie; Jia, Jian; Zhang, Yin-Guang; Tian, Wei; Jin, Xin; Hu, Yong-Cheng
2017-05-01
The purpose of this article is to evaluate the efficacy and feasibility of preoperative surgery with 3D printing-assisted internal fixation of complicated acetabular fractures. A retrospective case review was performed for the above surgical procedure. A 23-year-old man was confirmed by radiological examination to have fractures of multiple ribs, with hemopneumothorax and comminuted fractures of the left acetabulum. According to the Letournel and Judet classification, a T-shaped fracture involving the posterior wall was diagnosed. A 3D-printed pelvic model was established preoperatively from CT Digital Imaging and Communications in Medicine (DICOM) data, with which the surgical procedures were simulated in preoperative surgery to confirm the sequence of reduction and fixation as well as the position and length of the implants. Open reduction with internal fixation (ORIF) of the acetabular fracture using modified ilioinguinal and Kocher-Langenbeck approaches was performed 25 days after the injury. Plates that had been pre-bent during the preoperative surgery were positioned and screws were tightened in the directions determined in the preoperative planning following satisfactory reduction. The duration of the operation was 170 min and blood loss was 900 mL. Postoperative X-rays showed that anatomical reduction of the acetabulum was achieved and the hip joint was congruous. The position and length of the implants did not differ from those in the preoperative surgery on 3D-printed models. We believe that preoperative surgery using 3D-printed models is beneficial for confirming the reduction and fixation sequence, determining the reduction quality, shortening the operative time, minimizing preoperative difficulties, and predicting the prognosis for complicated fractures of the acetabulum. © 2017 Chinese Orthopaedic Association and John Wiley & Sons Australia, Ltd.
Rangayyan, Rangaraj M; Wu, Yunfeng
2008-01-01
Diagnostic information related to the articular cartilage surfaces of knee joints may be derived from vibro-arthrographic (VAG) signals. Although several studies have proposed many different types of parameters for the analysis and classification of VAG signals, no statistical modeling methods have been explored to represent the fundamental distinctions between normal and abnormal VAG signals. In the present work, we derive models of probability density functions (PDFs), using the Parzen-window approach, to represent the basic statistical characteristics of normal and abnormal VAG signals. The Kullback-Leibler distance (KLD) is then computed between the PDF of the signal to be classified and the PDF models for normal and abnormal VAG signals. A classification accuracy of 73.03% was obtained with a database of 89 VAG signals. The screening efficiency was derived to be 0.6724, in terms of the area under the receiver operating characteristic curve.
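The classification scheme described, estimating a PDF for each class with Parzen windows and assigning a test signal to the class with the smaller Kullback-Leibler distance, can be sketched as follows. The Gaussian kernel, the bandwidth, and the synthetic one-dimensional "features" standing in for VAG signal parameters are all illustrative assumptions:

```python
import numpy as np

def parzen_pdf(train, grid, h=0.5):
    # Parzen-window (Gaussian-kernel) density estimate evaluated on a grid
    diffs = (grid[:, None] - train[None, :]) / h
    k = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return k.mean(axis=1) / h

def kld(p, q, dx):
    # Discretized Kullback-Leibler distance D(p || q)
    p = p / (p.sum() * dx)
    q = q / (q.sum() * dx)
    eps = 1e-12
    return float(np.sum(p * np.log((p + eps) / (q + eps))) * dx)

rng = np.random.default_rng(0)
normal_feats = rng.normal(0.0, 1.0, 200)    # stand-ins for normal VAG features
abnormal_feats = rng.normal(1.5, 1.2, 200)  # stand-ins for abnormal features
grid = np.linspace(-6, 8, 400)
dx = grid[1] - grid[0]
p_norm = parzen_pdf(normal_feats, grid)
p_abn = parzen_pdf(abnormal_feats, grid)

test_sig = rng.normal(1.4, 1.1, 50)         # features of the unknown signal
p_test = parzen_pdf(test_sig, grid)
label = "abnormal" if kld(p_test, p_abn, dx) < kld(p_test, p_norm, dx) else "normal"
```

The decision rule picks whichever class model the test PDF diverges from least.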
Elizalde, E.; Gaztanaga, E.
1992-01-01
The dependence of counts in cells on the shape of the cell is studied for the large-scale galaxy distribution. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is higher for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
DEFF Research Database (Denmark)
Hinrichsen, H.H.; Schmidt, J.O.; Petereit, C.
2005-01-01
patterns on the overlap of Baltic cod larvae with their prey. A three-dimensional hydrodynamic model was used to analyse spatio-temporally resolved drift patterns of larval Baltic cod. A coefficient of overlap between modelled larval and idealized prey distributions indicated the probability of predator......-prey overlap, dependent on the hatching time of cod larvae. By performing model runs for the years 1979-1998, we investigated the intra- and interannual variability of potential spatial overlap between predator and prey. Assuming uniform prey distributions, we generally found the overlap to have decreased since...... the mid-1980s, but with the highest variability during the 1990s. Seasonally, predator-prey overlap on the Baltic cod spawning grounds was highest in summer and lowest at the end of the cod spawning season. Horizontally variable prey distributions generally resulted in decreased overlap coefficients...
Concepts of probability theory
Pfeiffer, Paul E
1979-01-01
Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or mathematics. Includes problems with answers and six appendixes. 1965 edition.
Binary logistic regression modelling: Measuring the probability of relapse cases among drug addicts
Ismail, Mohd Tahir; Alias, Siti Nor Shadila
2014-07-01
For many years, Malaysia has faced drug addiction issues. The most serious is the relapse phenomenon among treated drug addicts (addicts who have undergone the rehabilitation programme at the Narcotic Addiction Rehabilitation Centre, PUSPEN). Thus, the main objective of this study is to find the most significant factors that contribute to relapse. Binary logistic regression analysis was employed to model the relationship between the independent variables (predictors) and the dependent variable. The dependent variable is the status of the drug addict: relapse (yes, coded as 1) or not (no, coded as 0). The predictors are age, age at first drug use, family history, education level, family crisis, community support, and self-motivation. The total sample is 200; the data were provided by AADK (the National Anti-Drugs Agency). The findings of the study revealed that age and self-motivation are statistically significant predictors of relapse.
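The binary logistic regression described can be sketched with a Newton-Raphson fit on synthetic data; the predictors, their coefficients, and the sample below are invented stand-ins for the AADK data, not its actual values:

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    # Newton-Raphson (IRLS) fit of logit(P(y=1)) = b0 + X @ b[1:]
    Xd = np.column_stack([np.ones(len(X)), X])
    b = np.zeros(Xd.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xd @ b))
        W = p * (1.0 - p)                       # weights of the working response
        H = Xd.T @ (Xd * W[:, None])            # Hessian of the log-likelihood
        b = b + np.linalg.solve(H, Xd.T @ (y - p))
    return b

rng = np.random.default_rng(1)
n = 200
age = rng.normal(35, 8, n)            # hypothetical predictor: age (years)
motivation = rng.normal(0, 1, n)      # hypothetical self-motivation score
true_logit = -0.5 + 0.08 * (age - 35) - 1.2 * motivation
relapse = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

b = fit_logistic(np.column_stack([age - 35, motivation]), relapse)
odds_ratio_age = np.exp(b[1])  # multiplicative change in relapse odds per year
```

In this synthetic setup, higher self-motivation should come out with a negative coefficient (protective), mirroring the direction of effect reported in the study.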
Survival under uncertainty an introduction to probability models of social structure and evolution
Volchenkov, Dimitri
2016-01-01
This book introduces and studies a number of stochastic models of subsistence, communication, social evolution and political transition that will allow the reader to grasp the role of uncertainty as a fundamental property of our irreversible world. At the same time, it aims to bring about a more interdisciplinary and quantitative approach across very diverse fields of research in the humanities and social sciences. Through the examples treated in this work – including anthropology, demography, migration, geopolitics, management, and bioecology, among other things – evidence is gathered to show that volatile environments may change the rules of the evolutionary selection and dynamics of any social system, creating a situation of adaptive uncertainty, in particular, whenever the rate of change of the environment exceeds the rate of adaptation. Last but not least, it is hoped that this book will contribute to the understanding that inherent randomness can also be a great opportunity – for social systems an...
Varouchakis, Emmanouil; Kourgialas, Nektarios; Karatzas, George; Giannakis, Georgios; Lilli, Maria; Nikolaidis, Nikolaos
2014-05-01
Riverbank erosion affects river morphology and the local habitat, and results in riparian land loss and damage to property and infrastructure, ultimately weakening flood defences. An important issue concerning riverbank erosion is the identification of the areas vulnerable to erosion, as this allows changes to be predicted and assists with stream management and restoration. One way to predict the areas vulnerable to erosion is to determine the erosion probability by identifying the underlying relations between riverbank erosion and the geomorphological and/or hydrological variables that prevent or stimulate erosion. In this work, a statistical model for evaluating the probability of erosion based on a series of independent local variables and using logistic regression is developed. The main variables affecting erosion are the vegetation index (stability), the presence or absence of meanders, the bank material (classification), stream power, bank height, riverbank slope, riverbed slope, cross-section width and water velocities (Luppi et al. 2009). In statistics, logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable, e.g. a binary response, based on one or more predictor variables (continuous or categorical). The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function. Logistic regression measures the relationship between a categorical dependent variable and, usually, one or several continuous independent variables by converting the dependent variable to probability scores. A logistic regression is then formed, which predicts success or failure of a given binary variable (e.g. 1 = "presence of erosion" and 0 = "no erosion") for any value of the independent variables. The regression coefficients are estimated by maximum likelihood estimation. The erosion occurrence probability can be calculated in conjunction with the model deviance regarding
Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo
2017-02-01
Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
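The integrated scheme described, a Weibull envelope for the seasonal maximum pollen potential, modulated by a weather-based multiple regression, can be sketched as below. The shape/scale values and the regression coefficients are illustrative assumptions, not the fitted values of the Korean models:

```python
import numpy as np

def weibull_pdf(t, shape, scale):
    # Weibull probability density, used here as the seasonal envelope
    # of the maximum potential airborne pollen
    t = np.asarray(t, float)
    return (shape / scale) * (t / scale) ** (shape - 1) * np.exp(-(t / scale) ** shape)

days = np.arange(1, 121)  # days since season onset
potential = 5000.0 * weibull_pdf(days, shape=2.2, scale=45.0)  # grains/m^3 envelope

def daily_concentration(potential_today, temp_c, wind_ms, rain_mm):
    # Hypothetical multiple-regression modifier applied to the daily potential;
    # warmer, windier, drier days release more pollen in this toy model
    modifier = 1.0 + 0.03 * (temp_c - 15.0) + 0.02 * wind_ms - 0.05 * rain_mm
    return max(0.0, float(potential_today) * modifier)

c = daily_concentration(potential[40], temp_c=20.0, wind_ms=3.0, rain_mm=0.0)
```

Risk grades would then be assigned by thresholding the estimated concentration against the national risk criteria.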
Field, Edward H.
2015-01-01
A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
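The elastic-rebound/renewal calculation behind such probabilities, the conditional probability of rupture in the next interval given the time elapsed since the last event, can be sketched with a lognormal recurrence-interval distribution. The paper's methodology is more elaborate (along-fault averaging, magnitude-dependent aperiodicity), so the distribution family, mean interval, and aperiodicity below are illustrative only:

```python
import math

def lognormal_cdf(t, mu, sigma):
    # CDF of a lognormal recurrence-interval distribution
    if t <= 0:
        return 0.0
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_rupture_prob(t_elapsed, dt, mean_ri=150.0, aperiodicity=0.4):
    # Renewal-model conditional probability:
    # P(rupture within dt | t_elapsed since last) = (F(t+dt) - F(t)) / (1 - F(t));
    # aperiodicity plays the role of the coefficient of variation
    sigma = math.sqrt(math.log(1.0 + aperiodicity**2))
    mu = math.log(mean_ri) - 0.5 * sigma**2
    f_t = lognormal_cdf(t_elapsed, mu, sigma)
    f_t_dt = lognormal_cdf(t_elapsed + dt, mu, sigma)
    return (f_t_dt - f_t) / (1.0 - f_t)

p_early = conditional_rupture_prob(50.0, 30.0)   # early in the seismic cycle
p_late = conditional_rupture_prob(200.0, 30.0)   # long past the mean interval
```

The probability gain late in the cycle is the first-order elastic-rebound predictability the methodology is designed to capture, assuming it is there in the first place.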
Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu
2013-01-04
Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
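The core idea, scoring a peptide-spectrum match by the binomial probability of matching at least k of n theoretical fragment peaks by chance, can be sketched as follows. The random-match probability p_random and the score form are simplified assumptions, not ProVerB's exact scoring function (which additionally weights peak intensities):

```python
import math

def binom_sf(k, n, p):
    # P(X >= k) for X ~ Binomial(n, p): the chance of matching at least
    # k peaks when each theoretical peak matches at random with probability p
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

def match_score(n_theoretical, n_matched, p_random=0.05):
    # -log10 of the chance probability; larger = more confident identification
    sf = binom_sf(n_matched, n_theoretical, p_random)
    return -math.log10(max(sf, 1e-300))

good = match_score(20, 12)  # many theoretical fragment peaks matched
poor = match_score(20, 2)   # few matched peaks, plausibly by chance
```

Identifications would then be thresholded on this score so that the resulting list meets a target false discovery rate.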
Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan
2016-05-01
Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, which include associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating method, a novel approach to obtain coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which can provide lower and upper bounds for the purpose of quantifying and transmitting the uncertainty of structural parameters. The structural health monitoring data of the Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency is over 89%, with a small average absolute value of relative errors, and the CDF of the normal distribution coincides well with the measured frequencies of the Guanhe Bridge. The validated multi-scale FE model may be further used in structural damage prognosis and safety prognosis.
Directory of Open Access Journals (Sweden)
ZHAO Qunhe
2015-04-01
Full Text Available A satellite laser ranging (SLR) system calculates the distance from ground-based observatories to satellites using the round-trip travel time of a laser pulse. The position of the retro-reflectors on the satellites needs to be corrected, which helps to improve the measuring precision of satellite laser ranging. The center-of-mass (CoM) correction errors are mainly caused by the distribution effects of the retro-reflectors on the satellites. The CoM is related to the incident angle, the structural alignment of the retro-reflectors, and the ground-based position. Based on the assumption that the probability of a photon being reflected by a retro-reflector is proportional to its cross section, the cross-section area of the corner reflectors is fitted and a probabilistic model is established using the incident angle as the random variable. The CoM corrections of spherical satellites such as LAGEOS-1/2 are calculated, and different CoM values are applied in SLR precise orbit determination using long-term full-rate observation data, with the different WRMS results analyzed. Finally, for planar-array retro-reflectors, the CoMs of BeiDou navigation satellites such as BeiDou-M3 are also calculated and analyzed using one month of SLR full-rate data. The results show that the CoMs calculated on the basis of probability theory achieve comparable precision in SLR precise orbit determination.
Xie, Xin-Ping; Xie, Yu-Feng; Wang, Hong-Qiang
2017-08-23
Large-scale accumulation of omics data poses a pressing challenge for the integrative analysis of multiple data sets in bioinformatics. An open question in such integrative analysis is how to pinpoint consistent but subtle gene activity patterns across studies. Study heterogeneity needs to be addressed carefully for this goal. This paper proposes a regulation probability model-based meta-analysis, jGRP, for identifying differentially expressed genes (DEGs). The method integrates multiple transcriptomics data sets in a gene regulatory space instead of a gene expression space, which makes it easy to capture and manage data heterogeneity across studies from different laboratories or platforms. Specifically, we transform gene expression profiles into a unified gene regulation profile across studies by mathematically defining two gene regulation events between two conditions and estimating their occurrence probabilities in a sample. Finally, a novel differential expression statistic is established based on the gene regulation profiles, realizing accurate and flexible identification of DEGs in the gene regulation space. We evaluated the proposed method on simulation data and real-world cancer datasets and showed the effectiveness and efficiency of jGRP in identifying DEGs in the context of meta-analysis. Data heterogeneity largely influences the performance of meta-analysis of DEG identification. Existing meta-analysis methods were revealed to exhibit very different degrees of sensitivity to study heterogeneity. The proposed method, jGRP, can be a standalone tool due to its unified framework and controllable way of dealing with study heterogeneity.
Hou, Rui; Changyue, Jiana; He, Tingting; Mao, Tengyue; Yu, Jianwei; Lei, Bo
2013-04-01
In an optical burst switching core node, each output port is equipped with a different network interface unit that provides a specific data rate. Bursts therefore select output ports with different probabilities, in accordance with the path-length-metric-based optimal routing algorithm and the wavelength resource situation. Previous studies have ignored this issue. We establish a burst output model that considers the different service rates of the output ports and the different port-selection probabilities. We calculate the burst-blocking probability and analyze the relationship between service rate and output-port-selection probability in detail.
Lee, Si-Yong; Carle, Steven F.; Fogg, Graham E.
2007-09-01
A covariance-based model-fitting approach is often considered valid to represent the field spatial variability of hydraulic properties. This study examines the representation of geologic heterogeneity in two types of geostatistical models under the same mean and spatial covariance structure, and subsequently its effect on the hydraulic response to a pumping test, based on 3D high-resolution numerical simulation and field data. Two geostatistical simulation methods, sequential Gaussian simulation (SGS) and transition probability indicator simulation (TPROGS), were applied to create conditional realizations of alluvial fan aquifer systems in the Lawrence Livermore National Laboratory (LLNL) area. The simulated K fields were then used in a numerical groundwater flow model to simulate a pumping test performed at the LLNL site. Spatial connectivity measures of high-K materials (channel facies) captured the connectivity characteristics of each geostatistical model and revealed that the TPROGS model created an aquifer (channel) network with greater lateral connectivity. SGS realizations neglected important geologic structures associated with channel and overbank (levee) facies, even though the covariance model used to create these realizations provided excellent fits to sample covariances computed from exhaustive samplings of TPROGS realizations. The observed drawdown response in monitoring wells during the pumping test and its numerical simulation show that, in an aquifer system with a strongly connected network of high-K materials, the Gaussian approach could not reproduce the drawdown behavior found in the TPROGS case. Overall, the simulated drawdown responses demonstrate significant disagreement between TPROGS and SGS realizations. This study showed that important geologic characteristics may not be captured by a spatial covariance model, even if that model is exhaustively determined and closely fits the exponential function.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Directory of Open Access Journals (Sweden)
N. SHAHRAKI
2013-03-01
Full Text Available Water scarcity is a major problem in arid and semi-arid areas. The scarcity of water is further stressed by growing demand due to population growth in developing countries. Climate change and its effects on precipitation and water resources are another problem in these areas. Several models are widely used for modeling daily precipitation occurrence. In this study, a Markov chain model was used to study spell distribution. For this purpose, a one-day period was considered the optimum length of time. Given the assumption that the Markov chain model is the right model for daily precipitation occurrence, the choice of Markov model order was examined on a daily basis for four synoptic weather stations with different climates in Iran (Gorgan, Khorram Abad, Zahedan, Tabriz) during 1978-2009. Based on probability rules for the occurrence of sequences of dry and wet days, these data were analyzed with stochastic processes and the Markov chain method. The transition probability matrix was then calculated by the maximum likelihood method, and the probabilities of 2-5 consecutive dry and wet days were computed. The results showed that the maximum probability of consecutive dry periods and the highest climatic probability of dry days occurred in Zahedan. The probability of consecutive dry periods ranged from 73.3 to 100 percent, and the climatic probability of the occurrence of dry days ranged from 70.96 to 100 percent, with an average probability of about 90.45 percent.
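The Markov chain calculation described, estimating a transition matrix from a daily wet/dry sequence by maximum likelihood and deriving spell-persistence probabilities, can be sketched as follows. The synthetic rainfall record and its transition probabilities are illustrative, not the Iranian station data:

```python
import numpy as np

def transition_matrix(seq):
    # First-order Markov chain over daily states: 0 = dry, 1 = wet.
    # Row-normalized transition counts are the maximum-likelihood estimates.
    counts = np.zeros((2, 2))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def prob_dry_spell(P, length):
    # Probability that a dry day is followed by at least (length - 1) more dry days
    return float(P[0, 0] ** (length - 1))

rng = np.random.default_rng(42)
days = [0]
for _ in range(364):  # synthetic year: p(dry->dry) = 0.85, p(wet->dry) = 0.5
    p_dry = 0.85 if days[-1] == 0 else 0.5
    days.append(0 if rng.random() < p_dry else 1)

P = transition_matrix(days)
p3 = prob_dry_spell(P, 3)  # chance an ongoing dry spell persists to 3 days
```

With a long enough record, the estimated rows recover the generating probabilities, which is the maximum-likelihood property the study relies on.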
A Cost-Utility Model of Care for Peristomal Skin Complications.
Neil, Nancy; Inglese, Gary; Manson, Andrea; Townshend, Arden
2016-01-01
The aim of this study was to evaluate the economic and humanistic implications of using ostomy components to prevent subsequent peristomal skin complications (PSCs) in individuals who experience an initial, leakage-related PSC event. Cost-utility analysis. We developed a simple decision model to consider, from a payer's perspective, PSCs managed with and without the use of ostomy components over 1 year. The model evaluated the extent to which outcomes associated with the use of ostomy components (PSC events avoided; quality-adjusted life days gained) offset the costs associated with their use. Our base case analysis of 1000 hypothetical individuals over 1 year assumes that using ostomy components following a first PSC reduces recurrent events versus PSC management without components. In this analysis, component acquisition costs were largely offset by lower resource use for ostomy supplies (barriers; pouches) and lower clinical utilization to manage PSCs. The overall annual average resource use for individuals using components was about 6.3% ($139) higher versus individuals not using components. Each PSC event avoided yielded, on average, 8 additional quality-adjusted life days over 1 year. In our analysis, (1) acquisition costs for ostomy components were offset in whole or in part by the use of fewer ostomy supplies to manage PSCs and (2) use of ostomy components to prevent PSCs produced better outcomes (fewer repeat PSC events; more health-related quality-adjusted life days) over 1 year compared to not using components.
A Cost-Utility Model of Care for Peristomal Skin Complications
Inglese, Gary; Manson, Andrea; Townshend, Arden
2016-01-01
PURPOSE: The aim of this study was to evaluate the economic and humanistic implications of using ostomy components to prevent subsequent peristomal skin complications (PSCs) in individuals who experience an initial, leakage-related PSC event. DESIGN: Cost-utility analysis. METHODS: We developed a simple decision model to consider, from a payer's perspective, PSCs managed with and without the use of ostomy components over 1 year. The model evaluated the extent to which outcomes associated with the use of ostomy components (PSC events avoided; quality-adjusted life days gained) offset the costs associated with their use. RESULTS: Our base case analysis of 1000 hypothetical individuals over 1 year assumes that using ostomy components following a first PSC reduces recurrent events versus PSC management without components. In this analysis, component acquisition costs were largely offset by lower resource use for ostomy supplies (barriers; pouches) and lower clinical utilization to manage PSCs. The overall annual average resource use for individuals using components was about 6.3% ($139) higher versus individuals not using components. Each PSC event avoided yielded, on average, 8 additional quality-adjusted life days over 1 year. CONCLUSIONS: In our analysis, (1) acquisition costs for ostomy components were offset in whole or in part by the use of fewer ostomy supplies to manage PSCs and (2) use of ostomy components to prevent PSCs produced better outcomes (fewer repeat PSC events; more health-related quality-adjusted life days) over 1 year compared to not using components. PMID:26633166
Elements of quantum probability
Kummerer, B.; Maassen, Hans
1996-01-01
This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with finitely many states, a probabilistic model is developed. It can describe, in particular, the polarization experiments. Some examples of quantum coin tosses are discussed, closely related to V.F.R....
Lee, Tsair-Fwu; Lin, Wei-Chun; Wang, Hung-Yu; Lin, Shu-Yuan; Wu, Li-Fu; Guo, Shih-Sian; Huang, Hsiang-Jui; Ting, Hui-Min; Chao, Pei-Ju
2015-01-01
To develop the logistic and the probit models to analyse electromyographic (EMG) equivalent uniform voltage- (EUV-) response for the tenderness of tennis elbow. In total, 78 hands from 39 subjects were enrolled. In this study, surface EMG (sEMG) signal is obtained by an innovative device with electrodes over forearm region. The analytical endpoint was defined as Visual Analog Score (VAS) 3+ tenderness of tennis elbow. The logistic and the probit diseased probability (DP) models were established for the VAS score and EMG absolute voltage-time histograms (AVTH). TV50 is the threshold equivalent uniform voltage predicting a 50% risk of disease. Twenty-one out of 78 samples (27%) developed VAS 3+ tenderness of tennis elbow reported by the subject and confirmed by the physician. The fitted DP parameters were TV50 = 153.0 mV (CI: 136.3-169.7 mV), γ 50 = 0.84 (CI: 0.78-0.90) and TV50 = 155.6 mV (CI: 138.9-172.4 mV), m = 0.54 (CI: 0.49-0.59) for logistic and probit models, respectively. When the EUV ≥ 153 mV, the DP of the patient is greater than 50% and vice versa. The logistic and the probit models are valuable tools to predict the DP of VAS 3+ tenderness of tennis elbow.
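The two fitted voltage-response forms can be written out with the reported parameters. The abstract gives only the fitted values, so the functional forms below, the common TD50/γ50 logistic parameterization and the Lyman-type probit with slope parameter m, are standard assumptions from NTCP-style dose-response modelling:

```python
import math

def logistic_dp(v, tv50=153.0, gamma50=0.84):
    # Logistic diseased-probability model:
    # DP = 1 / (1 + (TV50 / V) ** (4 * gamma50))
    return 1.0 / (1.0 + (tv50 / v) ** (4.0 * gamma50))

def probit_dp(v, tv50=155.6, m=0.54):
    # Lyman-type probit model: DP = Phi((V - TV50) / (m * TV50)),
    # with Phi the standard normal CDF
    t = (v - tv50) / (m * tv50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

risk_low = logistic_dp(120.0)   # EUV below TV50 -> DP below 50%
risk_high = logistic_dp(200.0)  # EUV above TV50 -> DP above 50%
```

Both curves pass through DP = 0.5 at their respective TV50 values, matching the paper's interpretation that an EUV of about 153 mV marks a 50% risk.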
Lin, Wei-Chun; Lin, Shu-Yuan; Wu, Li-Fu; Guo, Shih-Sian; Huang, Hsiang-Jui; Chao, Pei-Ju
2015-01-01
To develop the logistic and the probit models to analyse electromyographic (EMG) equivalent uniform voltage- (EUV-) response for the tenderness of tennis elbow. In total, 78 hands from 39 subjects were enrolled. In this study, surface EMG (sEMG) signal is obtained by an innovative device with electrodes over forearm region. The analytical endpoint was defined as Visual Analog Score (VAS) 3+ tenderness of tennis elbow. The logistic and the probit diseased probability (DP) models were established for the VAS score and EMG absolute voltage-time histograms (AVTH). TV50 is the threshold equivalent uniform voltage predicting a 50% risk of disease. Twenty-one out of 78 samples (27%) developed VAS 3+ tenderness of tennis elbow reported by the subject and confirmed by the physician. The fitted DP parameters were TV50 = 153.0 mV (CI: 136.3–169.7 mV), γ 50 = 0.84 (CI: 0.78–0.90) and TV50 = 155.6 mV (CI: 138.9–172.4 mV), m = 0.54 (CI: 0.49–0.59) for logistic and probit models, respectively. When the EUV ≥ 153 mV, the DP of the patient is greater than 50% and vice versa. The logistic and the probit models are valuable tools to predict the DP of VAS 3+ tenderness of tennis elbow. PMID:26380281
Directory of Open Access Journals (Sweden)
Tsair-Fwu Lee
2015-01-01
Full Text Available To develop the logistic and the probit models to analyse electromyographic (EMG) equivalent uniform voltage- (EUV-) response for the tenderness of tennis elbow. In total, 78 hands from 39 subjects were enrolled. In this study, the surface EMG (sEMG) signal is obtained by an innovative device with electrodes over the forearm region. The analytical endpoint was defined as Visual Analog Score (VAS) 3+ tenderness of tennis elbow. The logistic and the probit diseased probability (DP) models were established for the VAS score and EMG absolute voltage-time histograms (AVTH). TV50 is the threshold equivalent uniform voltage predicting a 50% risk of disease. Twenty-one out of 78 samples (27%) developed VAS 3+ tenderness of tennis elbow reported by the subject and confirmed by the physician. The fitted DP parameters were TV50 = 153.0 mV (CI: 136.3–169.7 mV), γ50 = 0.84 (CI: 0.78–0.90) and TV50 = 155.6 mV (CI: 138.9–172.4 mV), m = 0.54 (CI: 0.49–0.59) for the logistic and probit models, respectively. When the EUV ≥ 153 mV, the DP of the patient is greater than 50% and vice versa. The logistic and the probit models are valuable tools to predict the DP of VAS 3+ tenderness of tennis elbow.
Gronewold, Andrew D; Wolpert, Robert L
2008-07-01
Most probable number (MPN) and colony-forming-unit (CFU) estimates of fecal coliform bacteria concentration are common measures of water quality in coastal shellfish harvesting and recreational waters. Estimating procedures for MPN and CFU have intrinsic variability and are subject to additional uncertainty arising from minor variations in experimental protocol. It has been observed empirically that the standard multiple-tube fermentation (MTF) decimal dilution analysis MPN procedure is more variable than the membrane filtration CFU procedure, and that MTF-derived MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the variability in, and discrepancy between, MPN and CFU measurements. We then compare our model to water quality samples analyzed using both MPN and CFU procedures, and find that the (often large) observed differences between MPN and CFU values for the same water body are well within the ranges predicted by our probabilistic model. Our results indicate that MPN and CFU intra-sample variability does not stem from human error or laboratory procedure variability, but is instead a simple consequence of the probabilistic basis for calculating the MPN. These results demonstrate how probabilistic models can be used to compare samples from different analytical procedures, and to determine whether transitions from one procedure to another are likely to cause a change in quality-based management decisions.
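The probabilistic basis of the MPN that the authors invoke can be made concrete with a short sketch. Assuming organisms are Poisson-distributed in the water sample, a tube inoculated with volume v is positive with probability 1 − exp(−cv), and the MPN is the maximum-likelihood estimate of the concentration c. The function below is an illustrative sketch under that assumption (not the authors' code); it solves the MPN score equation by bisection:

```python
import math

def mpn_estimate(volumes, n_tubes, n_positive):
    """Maximum-likelihood most-probable-number estimate (organisms per ml).

    volumes    : inoculation volume (ml) at each dilution
    n_tubes    : number of tubes at each dilution
    n_positive : number of positive tubes at each dilution
    Assumes a Poisson distribution of organisms, so a tube with volume v
    is positive with probability 1 - exp(-c*v).
    """
    def score(c):
        # Derivative of the log-likelihood with respect to c; the MLE is its root.
        s = 0.0
        for v, n, p in zip(volumes, n_tubes, n_positive):
            s += p * v * math.exp(-c * v) / (1.0 - math.exp(-c * v)) - (n - p) * v
        return s

    lo, hi = 1e-9, 1e6
    for _ in range(200):
        mid = math.sqrt(lo * hi)  # geometric midpoint, since c spans many decades
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)
```

Because the estimate is a root of a likelihood equation rather than a direct count, its sampling variability is intrinsically larger than that of a CFU plate count, which is the point the paper formalises.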
Probability of Ship on Collision Courses Based on the New PAW Using MMG Model and AIS Data
Directory of Open Access Journals (Sweden)
I Putu Sindhu Asmara
2015-03-01
Full Text Available This paper proposes an estimation method for ships on collision courses taking crash astern maneuvers based on a new potential area of water (PAW for maneuvering. A crash astern maneuver is an emergency option a ship can take when exposed to the risk of a collision with other ships that have lost control. However, lateral forces and yaw moments exerted by the reversing propeller, as well as the uncertainty of the initial speed and initial yaw rate, will move the ship out of the intended stopping position landing it in a dangerous area. A new PAW for crash astern maneuvers is thus introduced. The PAW is developed based on a probability density function of the initial yaw rate. Distributions of the yaw rates and speeds are analyzed from automatic identification system (AIS data in Madura Strait, and estimated paths of the maneuvers are simulated using a mathematical maneuvering group model.
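The spread of stopping positions that motivates the new PAW can be illustrated with a Monte Carlo sketch. The code below is a drastically simplified kinematic surrogate, not the MMG model used in the paper: speed decays linearly during the crash astern manoeuvre while the heading drifts with an initial yaw rate drawn from an assumed normal distribution (in practice the distribution would be fitted to AIS data, as the authors do):

```python
import math
import random

def stopping_points(v0, r_sigma, decel, n_samples=1000, dt=1.0, seed=42):
    """Monte Carlo spread of stopping positions under a crash astern manoeuvre.

    v0      : initial speed (m/s)
    r_sigma : std. dev. of the uncertain initial yaw rate (rad/s), assumed N(0, r_sigma)
    decel   : constant deceleration from the reversing propeller (m/s^2)
    Returns a list of (x, y) stopping points; their envelope sketches a PAW.
    """
    rng = random.Random(seed)
    points = []
    for _ in range(n_samples):
        r = rng.gauss(0.0, r_sigma)   # uncertain initial yaw rate
        x = y = psi = 0.0
        v = v0
        while v > 0:
            x += v * math.cos(psi) * dt
            y += v * math.sin(psi) * dt
            psi += r * dt             # heading drifts at the sampled yaw rate
            v -= decel * dt
        points.append((x, y))
    return points
```

With zero yaw-rate uncertainty every run stops on the original track; with a realistic spread the lateral scatter of stopping points defines the dangerous area the paper quantifies.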
Directory of Open Access Journals (Sweden)
Kristiansen Ivar
2010-11-01
Full Text Available Abstract Background Estimating the economic impact of influenza is complicated because the disease may have non-specific symptoms, and many patients with influenza are registered with other diagnoses. Furthermore, in some countries like Norway, employees can be on paid sick leave for a specified number of days without a doctor's certificate ("self-reported sick leave") and these sick leaves are not registered. Both problems result in gaps in the existing literature: costs associated with influenza-related illness and self-reported sick leave are rarely included. The aim of this study was to improve estimates of total influenza-related health-care costs and productivity losses by estimating these missing costs. Methods Using Norwegian data, the weekly numbers of influenza-attributable hospital admissions and certified sick leaves registered with other diagnoses were estimated from influenza-like illness surveillance data using quasi-Poisson regression. The number of self-reported sick leaves was estimated using a Monte-Carlo simulation model of illness recovery curves based on the number of certified sick leaves. A probabilistic sensitivity analysis was conducted on the economic outcomes. Results During the 1998/99 through 2005/06 influenza seasons, the models estimated an annual average of 2700 excess influenza-associated hospitalizations in Norway, of which 16% were registered as influenza, 51% as pneumonia and 33% were registered with other diagnoses. The direct cost of seasonal influenza totaled US$22 million annually, including costs of pharmaceuticals and outpatient services. The annual average number of working days lost was predicted at 793 000, resulting in an estimated productivity loss of US$231 million. Self-reported sick leave accounted for approximately one-third of the total indirect cost. During a pandemic, the total cost could rise to over US$800 million. Conclusions Influenza places a considerable burden on patients and society
2010-01-01
Background Estimating the economic impact of influenza is complicated because the disease may have non-specific symptoms, and many patients with influenza are registered with other diagnoses. Furthermore, in some countries like Norway, employees can be on paid sick leave for a specified number of days without a doctor's certificate ("self-reported sick leave") and these sick leaves are not registered. Both problems result in gaps in the existing literature: costs associated with influenza-related illness and self-reported sick leave are rarely included. The aim of this study was to improve estimates of total influenza-related health-care costs and productivity losses by estimating these missing costs. Methods Using Norwegian data, the weekly numbers of influenza-attributable hospital admissions and certified sick leaves registered with other diagnoses were estimated from influenza-like illness surveillance data using quasi-Poisson regression. The number of self-reported sick leaves was estimated using a Monte-Carlo simulation model of illness recovery curves based on the number of certified sick leaves. A probabilistic sensitivity analysis was conducted on the economic outcomes. Results During the 1998/99 through 2005/06 influenza seasons, the models estimated an annual average of 2700 excess influenza-associated hospitalizations in Norway, of which 16% were registered as influenza, 51% as pneumonia and 33% were registered with other diagnoses. The direct cost of seasonal influenza totaled US$22 million annually, including costs of pharmaceuticals and outpatient services. The annual average number of working days lost was predicted at 793 000, resulting in an estimated productivity loss of US$231 million. Self-reported sick leave accounted for approximately one-third of the total indirect cost. During a pandemic, the total cost could rise to over US$800 million. Conclusions Influenza places a considerable burden on patients and society with indirect costs greatly
U.S. Geological Survey, Department of the Interior — The Barrier Island and Estuarine Wetland Physical Change Assessment was created to calibrate and test probability models of barrier island estuarine shoreline...
Choice probability generating functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel
2010-01-01
This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any additive RUM (ARUM) may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
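As a concrete special case (a standard result in the discrete choice literature, not quoted from this abstract), the multinomial logit model arises from the log-sum generating function, whose gradient recovers the familiar softmax choice probabilities:

```latex
% Multinomial logit as the simplest CPGF instance: with utilities u_1,\dots,u_J,
G(u) = \log \sum_{j=1}^{J} e^{u_j},
\qquad
P_i = \frac{\partial G}{\partial u_i}
    = \frac{e^{u_i}}{\sum_{j=1}^{J} e^{u_j}} .
```

Richer CPGFs, such as those of nested and cross-nested logit, yield their choice probabilities by the same gradient operation.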
Directory of Open Access Journals (Sweden)
Yu Zou
2013-07-01
Full Text Available Reconciling competing desires to build urban models that can be simple and complicated is something of a grand challenge for urban simulation. It also prompts difficulties in many urban policy situations, such as urban sprawl, where simple, actionable ideas may need to be considered in the context of the messily complex and complicated urban processes and phenomena that work within cities. In this paper, we present a novel architecture for achieving both simple and complicated realizations of urban sprawl in simulation. Fine-scale simulations of sprawl geography are run using geographic automata to represent the geographical drivers of sprawl in intricate detail and over fine resolutions of space and time. We use Equation-Free computing to deploy population as a coarse observable of sprawl, which can be leveraged to run automata-based models as short-burst experiments within a meta-simulation framework.
Energy Technology Data Exchange (ETDEWEB)
Kikuchi, Ryota; Misaka, Takashi; Obayashi, Shigeru, E-mail: rkikuchi@edge.ifs.tohoku.ac.jp [Institute of Fluid Science, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai, Miyagi 980-8577 (Japan)
2015-10-15
An integrated method of a proper orthogonal decomposition based reduced-order model (ROM) and data assimilation is proposed for the real-time prediction of an unsteady flow field. In this paper, a particle filter (PF) and an ensemble Kalman filter (EnKF) are compared for data assimilation and the difference in the predicted flow fields is evaluated focusing on the probability density function (PDF) of the model variables. The proposed method is demonstrated using identical twin experiments of an unsteady flow field around a circular cylinder at a Reynolds number of 1000. The PF and EnKF are employed to estimate temporal coefficients of the ROM based on the observed velocity components in the wake of the circular cylinder. The prediction accuracy of ROM-PF is significantly better than that of ROM-EnKF due to the flexibility of PF for representing a PDF compared to EnKF. Furthermore, the proposed method reproduces the unsteady flow field several orders of magnitude faster than the reference numerical simulation based on the Navier–Stokes equations. (paper)
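The EnKF half of the comparison can be sketched in a few lines. The function below is an illustrative stochastic EnKF analysis step for ROM temporal coefficients, with the observation operator standing in for POD modes sampled at wake velocity probes; it is a generic textbook update, and the paper's actual ROM and filter implementations differ in detail:

```python
import numpy as np

def enkf_update(ensemble, H, y_obs, obs_std, seed=0):
    """One stochastic EnKF analysis step.

    ensemble : (n_members, n_modes) forecast POD temporal coefficients
    H        : (n_obs, n_modes) linear observation operator
    y_obs    : (n_obs,) observed velocity components
    obs_std  : observation noise standard deviation
    Returns the updated (n_members, n_modes) ensemble.
    """
    rng = np.random.default_rng(seed)
    X = ensemble.T                                   # (n_modes, n_members)
    n = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)         # observation-space anomalies
    R = (obs_std ** 2) * np.eye(len(y_obs))
    # Kalman gain from ensemble sample covariances
    K = (A @ HA.T / (n - 1)) @ np.linalg.inv(HA @ HA.T / (n - 1) + R)
    # perturbed observations, the hallmark of the stochastic EnKF variant
    Y = y_obs[:, None] + rng.normal(0.0, obs_std, size=(len(y_obs), n))
    return (X + K @ (Y - HX)).T
```

Because the update is built only from ensemble means and covariances, the EnKF implicitly assumes a Gaussian PDF; the PF drops that assumption, which is the flexibility the authors credit for its better accuracy.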
Butler, Doug; Bauman, David; Johnson-Throop, Kathy
2011-01-01
The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts to the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meets the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.
The perception of probability.
Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E
2014-01-01
We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
The July 17, 2006 Java Tsunami: Tsunami Modeling and the Probable Causes of the Extreme Run-up
Kongko, W.; Schlurmann, T.
2009-04-01
On 17 July 2006, an earthquake of magnitude Mw 7.8 off the south coast of west Java, Indonesia, generated a tsunami that affected over 300 km of the south Java coastline and killed more than 600 people. Observed tsunami heights and field measurements of run-up distributions were uniformly scattered at approximately 5 to 7 m along a 200 km coastal stretch; remarkably, a locally focused tsunami run-up height exceeding 20 m at Nusakambangan Island has been observed. Within the framework of the German Indonesia Tsunami Early Warning System (GITEWS) Project, a high-resolution near-shore bathymetrical survey using a multi-beam echo-sounder has recently been conducted. Additional geodata have been collected using the Intermap Technologies STAR-4 airborne interferometric SAR data acquisition system on a 5 m ground sample distance basis in order to establish a most sophisticated Digital Terrain Model (DTM). This paper describes the outcome of tsunami modelling approaches using high-resolution bathymetry and topography data as part of a general case study in Cilacap, Indonesia, and medium-resolution data for other areas along the coastline of south Java Island. By means of two different seismic deformation models to mimic the tsunami source generation, a numerical code based on the 2D nonlinear shallow water equations is used to simulate probable tsunami run-up scenarios. Several model tests are done and virtual points offshore, near-shore, at the coastline, as well as tsunami run-up on the coast are collected. For the purpose of validation, the model results are compared with field observations and sea level data observed at several tide gauge stations. The performance of the numerical simulations and correlations with observed field data are highlighted, and probable causes for the extreme wave heights and run-ups are outlined. References Ammon, C.J., Kanamori, K., Lay, T., and Velasco, A., 2006. The July 2006 Java Tsunami Earthquake, Geophysical Research Letters, 33(L24308). Fritz, H
2017-01-01
This paper seeks to develop a more thermodynamically sound pedagogy for students of biological transport than is currently available from either of the competing schools of linear non-equilibrium thermodynamics (LNET) or Michaelis–Menten kinetics (MMK). To this end, a minimal model of facilitated diffusion was constructed comprising four reversible steps: cis-substrate binding, cis→trans bound enzyme shuttling, trans-substrate dissociation and trans→cis free enzyme shuttling. All model parameters were subject to the second law constraint of the probability isotherm, which determined the unidirectional and net rates for each step and for the overall reaction through the law of mass action. Rapid equilibration scenarios require sensitive ‘tuning’ of the thermodynamic binding parameters to the equilibrium substrate concentration. All non-equilibrium scenarios show sigmoidal force–flux relations, with only a minority of cases having their quasi-linear portions close to equilibrium. Few cases fulfil the expectations of MMK relating reaction rates to enzyme saturation. This new approach illuminates and extends the concept of rate-limiting steps by focusing on the free energy dissipation associated with each reaction step and thereby deducing its respective relative chemical impedance. The crucial importance of an enzyme's being thermodynamically ‘tuned’ to its particular task, dependent on the cis- and trans-substrate concentrations with which it deals, is consistent with the occurrence of numerous isoforms for enzymes that transport a given substrate in physiologically different circumstances. This approach to kinetic modelling, being aligned with neither MMK nor LNET, is best described as intuitive non-equilibrium thermodynamics, and is recommended as a useful adjunct to the design and interpretation of experiments in biotransport. PMID:28680687
Casares-Magaz, Oscar; van der Heide, Uulke A; Rørvik, Jarle; Steenbergen, Peter; Muren, Ludvig Paul
2016-04-01
Standard tumour control probability (TCP) models assume uniform tumour cell density across the tumour. The aim of this study was to develop an individualised TCP model by including index-tumour regions extracted from multi-parametric magnetic resonance imaging (MRI) and cell density distributions based on apparent diffusion coefficient (ADC) maps. ADC maps in a series of 20 prostate cancer patients were applied to estimate the initial number of cells within each voxel, using three different approaches for the relation between ADC values and cell density: a linear, a binary and a sigmoid relation. All TCP models were based on linear-quadratic cell survival curves assuming α/β = 1.93 Gy (consistent with a recent meta-analysis) and α set to obtain 70% TCP when 77 Gy was delivered to the entire prostate in 35 fractions (α = 0.18 Gy(-1)). Overall, TCP curves based on ADC maps showed larger differences between individuals than those assuming uniform cell densities. The range of the dose required to reach 50% TCP across the patient cohort was 20.1 Gy, 18.7 Gy and 13.2 Gy using an MRI-based voxel density (linear, binary and sigmoid approach, respectively), compared to 4.1 Gy using a constant density. Inclusion of index-tumour information together with ADC maps-based cell density increases inter-patient tumour response differentiation for use in prostate cancer RT, resulting in TCP curves with a larger range in D50% across the cohort compared with those based on uniform cell densities. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
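The voxel-based TCP described here can be sketched with the standard Poisson formulation, TCP = exp(−Σ_v N_v·SF_v), where N_v is the voxel's initial cell number (here ADC-derived) and SF_v its linear-quadratic survival. The parameter values below follow the abstract (α = 0.18 Gy⁻¹, α/β = 1.93 Gy); the cell-density inputs are illustrative, not patient data:

```python
import math

def poisson_tcp(voxel_cells, dose_per_fx, n_fx, alpha=0.18, alpha_beta=1.93):
    """Voxel-based Poisson TCP with linear-quadratic cell survival.

    voxel_cells : initial clonogen number per voxel (e.g. ADC-derived)
    dose_per_fx : dose per fraction in each voxel (Gy)
    n_fx        : number of fractions
    SF per voxel is exp(-n*d*(alpha + beta*d)); TCP = exp(-sum_v N_v * SF_v).
    """
    beta = alpha / alpha_beta
    surviving = 0.0
    for n_cells, d in zip(voxel_cells, dose_per_fx):
        sf = math.exp(-n_fx * d * (alpha + beta * d))  # LQ survival over the course
        surviving += n_cells * sf                      # expected surviving clonogens
    return math.exp(-surviving)
```

A uniform cell density makes the sum collapse to a single term; feeding in a voxel-wise density map is exactly what widens the spread of individual TCP curves reported in the study.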
Chapman, Brian
2017-06-01
This paper seeks to develop a more thermodynamically sound pedagogy for students of biological transport than is currently available from either of the competing schools of linear non-equilibrium thermodynamics (LNET) or Michaelis-Menten kinetics (MMK). To this end, a minimal model of facilitated diffusion was constructed comprising four reversible steps: cis-substrate binding, cis→trans bound enzyme shuttling, trans-substrate dissociation and trans→cis free enzyme shuttling. All model parameters were subject to the second law constraint of the probability isotherm, which determined the unidirectional and net rates for each step and for the overall reaction through the law of mass action. Rapid equilibration scenarios require sensitive 'tuning' of the thermodynamic binding parameters to the equilibrium substrate concentration. All non-equilibrium scenarios show sigmoidal force-flux relations, with only a minority of cases having their quasi-linear portions close to equilibrium. Few cases fulfil the expectations of MMK relating reaction rates to enzyme saturation. This new approach illuminates and extends the concept of rate-limiting steps by focusing on the free energy dissipation associated with each reaction step and thereby deducing its respective relative chemical impedance. The crucial importance of an enzyme's being thermodynamically 'tuned' to its particular task, dependent on the cis- and trans-substrate concentrations with which it deals, is consistent with the occurrence of numerous isoforms for enzymes that transport a given substrate in physiologically different circumstances. This approach to kinetic modelling, being aligned with neither MMK nor LNET, is best described as intuitive non-equilibrium thermodynamics, and is recommended as a useful adjunct to the design and interpretation of experiments in biotransport.
Fonseca, Marisa; Alves, Lincoln; Aguiar, Ana Paula; Anderson, Liana; Aragão, Luiz
2017-04-01
Climate and land-use change are expected to amplify fire incidence in the Amazon. Modelling the influence of land-use and climate change scenarios on fire occurrence is therefore important to better understand their impacts on carbon emissions and ecosystem degradation in the region. Here we use the Maximum Entropy method (MaxEnt) to estimate the impact of different climate and land-use change scenarios on the relative fire probability (RFP) during the 2071-2099 period in the Brazilian Amazon with a 0.25° spatial resolution. The model was calibrated using satellite-based fire detections during the 2006-2015 period (hereafter "baseline"). The land-use change variables were obtained considering alternative pathways of clear-cut deforestation, secondary vegetation and old growth forest degradation resulting from major socioeconomic, institutional and environmental dynamics in the region. The climatic variables were generated using a regional model (ETA) nested in an earth system global model (HadGEM2-ES). A land-use "sustainability" scenario, considering that institutional and political conditions would favour an increase in forest regeneration and a decrease in old growth forest degradation and clear-cut deforestation rates, was combined with the representative concentration pathway (RCP) 4.5 climatic scenario (hereafter SUST-4.5). To assess the worst-case scenario of fire incidence, a "fragmentation" land-use scenario, representing the opposite tendency of the "sustainability" conditions, was combined with the climatic variables resulting from RCP 8.5 (FRAG-8.5). The test AUC (area under the curve) metric (0.768 ± 0.018) indicated satisfactory model performance. In the FRAG-8.5 scenario, 63% (about 2,900,000 km²) of the study region shows an RFP from 0.35 to 0.55, while in the baseline and under the SUST-4.5 scenario, 30% and 40% of the region is within this range of RFP, respectively. Conversely, in the baseline 29% of the area shows up to 0.1 RFP, but this
Indian Academy of Sciences (India)
Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 2. On Randomness and Probability: How to Mathematically Model Uncertain Events ... Author Affiliations. Rajeeva L Karandikar, Statistics and Mathematics Unit, Indian Statistical Institute, 7 S J S Sansanwal Marg, New Delhi 110 016, India.
Laze, Kuenda
2016-08-01
Modelling of land use may be improved by incorporating the results of species distribution modelling, and species distribution modelling may be upgraded if a process-based variable, such as forest cover change or accessibility of forest from human settlements, is included. This work presents the results of spatially explicit analyses of the changes in forest cover from 2000 to 2007 using the method of Geographically Weighted Regression (GWR) and of the species distribution for the protected species Lynx lynx martinoi and Ursus arctos using Generalized Linear Models (GLMs). The methodological approach separately searches for a parsimonious model of forest cover change and of species distribution for the entire territory of Albania. The findings of this work show that modelling of land change and of species distribution is indeed improved, as indicated by better corrected Akaike Information Criterion (AICc) scores in model selection. These results provide evidence of the effects of process-based variables on species distribution modelling and on its performance, and show an example of incorporating estimated probabilities of species occurrence in land change modelling.
Wei Wu; Charles Hall; Lianjun Zhang
2006-01-01
We predicted the spatial pattern of hourly probability of cloud cover in the Luquillo Experimental Forest (LEF) in North-Eastern Puerto Rico using four different models. The probability of cloud cover (defined as "the percentage of the area covered by clouds in each pixel on the map" in this paper) at any hour and any place is a function of three topographic variables...
Zhang, Jiangjiang; Li, Weixuan; Lin, Guang; Zeng, Lingzao; Wu, Laosheng
2017-03-01
In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen-Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
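The two-stage strategy is easy to demonstrate on a toy problem. The sketch below (illustrative only; the paper pairs a polynomial chaos surrogate with Karhunen-Loève/sliced-inverse-regression dimension reduction, none of which is reproduced here) evaluates a cheap surrogate everywhere, then re-runs the "expensive" model only for samples whose surrogate response falls near the failure threshold:

```python
import numpy as np

def two_stage_failure_prob(model, surrogate, samples, threshold, band):
    """Two-stage MC estimate of P[model(x) > threshold].

    Stage 1: classify every sample with the cheap surrogate.
    Stage 2: re-evaluate with the full model only the samples whose
    surrogate response lies within `band` of the threshold, correcting
    the surrogate's bias near the failure boundary.
    Returns (failure probability, number of full-model re-evaluations).
    """
    g_hat = np.array([surrogate(x) for x in samples])
    fail = g_hat > threshold
    near = np.abs(g_hat - threshold) <= band          # ambiguous samples only
    for i in np.where(near)[0]:
        fail[i] = model(samples[i]) > threshold       # expensive call
    return fail.mean(), int(near.sum())

# toy setup: the surrogate is a slightly biased version of the true model
rng = np.random.default_rng(0)
xs = rng.normal(size=10000)
model = lambda x: x ** 2
surrogate = lambda x: x ** 2 + 0.05                   # deliberate small bias
p, n_reeval = two_stage_failure_prob(model, surrogate, xs, 4.0, 0.5)
```

As long as the surrogate error is smaller than the band width, the corrected estimate matches a full MC run exactly while spending expensive evaluations on only a small fraction of the samples, which is the source of the reported speed-up.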
Probability, Nondeterminism and Concurrency
DEFF Research Database (Denmark)
Varacca, Daniele
Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular, there is no categorical distributive law between them. We introduce the powerdomain of indexed valuations, which modifies the usual probabilistic powerdomain to take more detailed account of where probabilistic choices are made. We show the existence of a distributive law between the powerdomain of indexed valuations … In the second part of the thesis we provide an operational reading of continuous valuations on certain domains (the distributive concrete domains of Kahn and Plotkin) through the model of probabilistic event structures. Event structures … reveal the computational intuition lying behind the mathematics.
Probability Aggregates in Probability Answer Set Programming
Saad, Emad
2013-01-01
Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...
Chiang, Jui-Kun; Cheng, Yu-Hsiang; Koo, Malcolm; Kao, Yee-Hsin; Chen, Ching-Yu
2010-05-01
The aim of the present study is to compare the accuracy of using laboratory data or clinical factors, or both, in predicting the probability of dying within 7 days of hospice admission in terminal cancer patients. We conducted a prospective cohort study of 727 patients with terminal cancer. Three models for predicting the probability of dying within 7 days of hospice admission were developed: (i) demographic data and laboratory data (Model 1); (ii) demographic data and clinical symptoms (Model 2); and (iii) a combination of demographic data, laboratory data and clinical symptoms (Model 3). We compared the models by using the area under the receiver operator curve using stepwise multiple logistic regression. We estimated the probability of dying within 7 days of hospice admission using the logistic function, P = exp(βx)/[1 + exp(βx)]. The highest prediction accuracy was observed in Model 3 (82.3%), followed by Model 2 (77.8%) and Model 1 (75.5%). The log[probability of dying within 7 days/(1 − probability of dying within 7 days)] = −6.52 + 0.77 × (male = 1, female = 0) + 0.59 × (cancer, liver = 1, others = 0) + 0.82 × (ECOG score) + 0.59 × (jaundice, yes = 1, no = 0) + 0.54 × (Grade 3 edema = 1, others = 0) + 0.95 × (fever, yes = 1, no = 0) + 0.07 × (respiratory rate, per minute) + 0.01 × (heart rate, per minute) − 0.92 × (intervention tube, yes = 1, no = 0) − 0.37 × (mean muscle power). We proposed a computer-assisted estimated probability formula for predicting dying within 7 days of hospice admission in terminal cancer patients.
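The published Model 3 formula can be transcribed directly. The function below simply evaluates the reported logistic coefficients; it is a transcription for illustration, not a validated clinical tool:

```python
import math

def prob_dying_within_7_days(male, liver_cancer, ecog, jaundice,
                             grade3_edema, fever, resp_rate, heart_rate,
                             intervention_tube, mean_muscle_power):
    """Model 3 log-odds transcribed from the abstract; binary inputs are 1/0,
    resp_rate and heart_rate are per minute.  Illustrative use only."""
    logit = (-6.52 + 0.77 * male + 0.59 * liver_cancer + 0.82 * ecog
             + 0.59 * jaundice + 0.54 * grade3_edema + 0.95 * fever
             + 0.07 * resp_rate + 0.01 * heart_rate
             - 0.92 * intervention_tube - 0.37 * mean_muscle_power)
    # logistic function P = exp(logit) / (1 + exp(logit))
    return math.exp(logit) / (1.0 + math.exp(logit))
```

The signs make clinical sense: fever, jaundice, edema and a higher ECOG score raise the predicted probability, while better muscle power and an intervention tube lower it.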
Xufeng Wang; Jianmin Li; Xingwei Kong; Xinmin Dong; Bo Zhang
2017-01-01
One of the keys to the success of aerial refueling for probe-drogue aerial refueling system (PDARS) is the successful docking between the probe and drogue. The study of probe-drogue docking success probability offers an important support to achieving successful docking. During the docking phase of PDARS, based on prior information and reasonable assumptions for the movements of the drogue under atmospheric disturbance, the probe-drogue docking success probability is converted to the probabili...
Collision Probability Analysis
DEFF Research Database (Denmark)
Hansen, Peter Friis; Pedersen, Preben Terndrup
1998-01-01
It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds … probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures …
Liu, Feng; Tai, An; Lee, Percy; Biswas, Tithi; Ding, George X.; El Naqa, Isaam; Grimm, Jimm; Jackson, Andrew; Kong, Feng-Ming (Spring); LaCouture, Tamara; Loo, Billy; Miften, Moyed; Solberg, Timothy; Li, X Allen
2017-01-01
Purpose To analyze pooled clinical data using different radiobiological models and to understand the relationship between biologically effective dose (BED) and tumor control probability (TCP) for stereotactic body radiotherapy (SBRT) of early-stage non-small cell lung cancer (NSCLC). Methods and Materials The clinical data of 1-, 2-, 3-, and 5-year actuarial or Kaplan-Meier TCP from 46 selected studies were collected for SBRT of NSCLC in the literature. The TCP data were separated for Stage T1 and T2 tumors where possible, otherwise collected for combined stages. BED was calculated at isocenters using six radiobiological models. For each model, the independent model parameters were determined from a fit to the TCP data using the least chi-square (χ2) method, with either one set of parameters regardless of tumor stage or two sets for T1 and T2 tumors separately. Results The fits to the clinical data yield consistently large α/β ratios of about 20 Gy for all models investigated. The regrowth model, which accounts for tumor repopulation and heterogeneity, leads to a better fit to the data; the fits of the other five models were indistinguishable from one another. Based on the fitted parameters, the models predict that T2 tumors require approximately 1 Gy of additional physical dose at the isocenter per fraction (≤5 fractions) to achieve the optimal TCP compared with T1 tumors. Conclusion This systematic analysis of a large set of published clinical data using different radiobiological models shows that local TCP for SBRT of early-stage NSCLC depends strongly on BED, with large α/β ratios of about 20 Gy. The six models predict that a BED (calculated with α/β of 20 Gy) of 90 Gy is sufficient to achieve TCP ≥ 95%. Among the models considered, the regrowth model leads to a better fit to the clinical data. PMID:27871671
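The BED values underlying this kind of analysis come from the linear-quadratic relation BED = n·d·(1 + d/(α/β)); a small sketch using the fitted α/β of about 20 Gy (the 54 Gy in 3 fractions schedule is an illustrative example, and the repopulation terms of the regrowth model are omitted):

```python
# Biologically effective dose in the plain linear-quadratic model.
# n_fractions: number of fractions; dose_per_fraction: physical dose d (Gy).
def bed(n_fractions, dose_per_fraction, alpha_beta=20.0):
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

# Illustrative schedule: 54 Gy in 3 fractions with alpha/beta ~ 20 Gy.
bed_54_in_3 = bed(3, 18.0)  # exceeds the ~90 Gy threshold quoted for TCP >= 95%
```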
Directory of Open Access Journals (Sweden)
Valentina Viego
2017-07-01
Full Text Available Background. Hypertension, diabetes and hypercholesterolemia are the most frequently diagnosed chronic diseases in Argentina. They contribute largely to the burden of chronic disease and are strongly influenced by a small number of risk factors. These risk factors are all modifiable at the population and individual level and offer major prospects for prevention. We are interested in the socioeconomic determinants of the prevalence of these three diseases. Design and methods. We estimate a 3-equation probit model, combined with 3 separate probit estimations and a probit-based Heckman correction to account for possible sample selection bias. Estimations were carried out using secondary self-reported data from the 2013 Risk Factor National Survey. Results. We find a negative association between socioeconomic status (SES) and the prevalence of hypertension, hypercholesterolemia and diabetes; the main increases concentrate in the transition from low to high SES for hypertension and diabetes. For cholesterol, the major effect takes place when an individual crosses from low to middle SES and then vanishes. Nevertheless, in Argentina SES exhibits an independent effect on chronic disease apart from effects mediated by habits and body weight. Conclusions. Public strategies to prevent chronic diseases must be specially targeted at women, the poorest households and the least educated individuals in order to be effective. Also, because the probability of having a condition related to excessive blood pressure or high levels of cholesterol or glucose in the blood does not increase proportionally with age, public campaigns promoting healthy diets, physical activity and medical checkups should be focused on young individuals to facilitate prophylaxis.
Structural Minimax Probability Machine.
Gu, Bin; Sun, Xingming; Sheng, Victor S
2017-07-01
Minimax probability machine (MPM) is an interesting discriminative classifier based on generative prior knowledge. It can directly estimate the probabilistic accuracy bound by minimizing the maximum probability of misclassification. The structural information of data is an effective way to represent prior knowledge, and has been found to be vital for designing classifiers in real-world problems. However, MPM only considers the prior probability distribution of each class with a given mean and covariance matrix, which does not efficiently exploit the structural information of data. In this paper, we use two finite mixture models to capture the structural information of the data from binary classification. For each subdistribution in a finite mixture model, only its mean and covariance matrix are assumed to be known. Based on the finite mixture models, we propose a structural MPM (SMPM). SMPM can be solved effectively by a sequence of the second-order cone programming problems. Moreover, we extend a linear model of SMPM to a nonlinear model by exploiting kernelization techniques. We also show that the SMPM can be interpreted as a large margin classifier and can be transformed to support vector machine and maxi-min margin machine under certain special conditions. Experimental results on both synthetic and real-world data sets demonstrate the effectiveness of SMPM.
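For intuition about the bound MPM optimizes, the one-dimensional case has a closed form; the sketch below computes the classical (non-structural) MPM worst-case misclassification probability, not the SMPM extension proposed in the paper, and the function name is illustrative:

```python
import math

def mpm_worst_case_error(mu1, sigma1, mu2, sigma2):
    # With each class known only through its mean and standard deviation,
    # the minimax worst-case misclassification probability in one dimension
    # is 1 / (1 + kappa^2), where kappa = |mu1 - mu2| / (sigma1 + sigma2).
    kappa = abs(mu1 - mu2) / (sigma1 + sigma2)
    return 1.0 / (1.0 + kappa ** 2)
```

Better-separated class means (relative to the spreads) give a smaller worst-case error, which is the "probabilistic accuracy bound" the abstract refers to.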
Si, Sheng; Yan, Yan; Fuller, Brian M; Liang, Stephen Y
2018-02-21
Patients with chronic SCI hospitalized for UTI can have significant morbidity. It is unclear whether SIRS criteria, SOFA score, or quick SOFA (qSOFA) score can be used to predict a complicated outcome. Retrospective cohort study. A risk prediction model was developed and internally validated using bootstrapping methodology. Urban, academic hospital in St. Louis, Missouri. 402 hospitalizations for UTI between October 1, 2010 and September 30, 2015, arising from 164 patients with chronic SCI, were included in the final analysis. Outcome/measures: An a priori composite complicated outcome defined as: 30-day hospital mortality, length of hospital stay >4 days, intensive care unit (ICU) admission, and hospital revisit within 30 days of discharge. Mean age of patients was 46.4 ± 12.3 years; 83.6% of patient-visits involved males. The primary outcome occurred in 278 (69.2%) hospitalizations. In multivariate analysis, male sex was protective (odds ratio [OR], 0.43; 95% confidence interval [CI], 0.18-0.99; P = 0.048) while Gram-positive urine culture (OR, 3.07; 95% CI, 1.05-9.01; P = 0.041), urine culture with no growth (OR, 1.69; 95% CI, 1.02-2.80; P = 0.041), and greater SOFA score (for one-point increments: OR, 1.41; 95% CI, 1.18-1.69; P < 0.05) were associated with the complicated outcome. SIRS criteria and qSOFA score were not associated with the complicated outcome. Our risk prediction model demonstrated good overall performance (Brier score, 0.19), fair discriminatory power (c-index, 0.72), and good calibration during internal validation. Clinical variables present on hospital admission with UTI may help identify SCI patients at risk for complicated outcomes and inform future clinical decision-making.
Directory of Open Access Journals (Sweden)
N.V. Grygorieva
2017-06-01
Full Text Available Background. Vertebral fractures are one of the severe complications of systemic osteoporosis; they lead to low-back pain, decrease or loss of working capacity, and increased mortality in older people. FRAX and dual-energy X-ray absorptiometry (DXA) are important methods for determining the risk of major osteoporotic fractures, including vertebral fractures. Materials and methods. We studied the parameters of the Ukrainian FRAX model in women depending on the presence of vertebral fractures. 652 patients aged 40–89 years examined at the Ukrainian Scientific Medical Center of Osteoporosis were divided into two groups: the first — 523 women without any previous fractures; the second — 129 patients with previous vertebral fractures. Bone mineral density (BMD) was assessed using DXA (Prodigy, General Electric). The 10-year probability of major osteoporotic fractures (FRAX-MOF) and hip fractures (FRAX-HF) was determined using the Ukrainian FRAX model by two methods — with body mass index (FRAX-BMI) and with BMD (FRAX-BMD). Results. According to the distribution of FRAX-BMI MOF parameters in women depending on the presence of vertebral fractures, the FRAX-BMI MOF index was less than 20 % (the limit indicated as the criterion for treatment initiation in US guidelines) in 100 and 100 % of subjects, respectively. The FRAX-BMD HF indices were less than 3 % (the limit for starting treatment in US guidelines) in 95 and 55 % of women, respectively. A significant moderate correlation was found between the indices of the two methods in all groups for both parameters of the algorithm — FRAX-MOF and FRAX-HF. Conclusions. The study of the age-specific features of FRAX in women depending on the presence of vertebral fractures showed a significant increase in the risks of both major osteoporotic and hip fractures, regardless of the technique used (with BMI or BMD), in women with vertebral fractures compared with those without any fractures. Our
Directory of Open Access Journals (Sweden)
Nils Ternès
2017-05-01
Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in randomized clinical trials. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is increasingly well established, no clear guidance yet exists on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
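The double (nested) cross-validation scheme used for internal estimation of predictive accuracy can be sketched generically; the scaffold below is a minimal stand-in in which a toy threshold classifier replaces the penalized Cox model, and all names and data are illustrative:

```python
import random

def double_cv(X, y, fit, score, params, k_outer=5, k_inner=5, seed=0):
    # Outer loop: estimate the performance of the *whole* selection procedure.
    # Inner loop: choose the tuning parameter on the training part only.
    idx = list(range(len(y)))
    random.Random(seed).shuffle(idx)
    outer = [idx[i::k_outer] for i in range(k_outer)]
    outer_scores = []
    for i, test_idx in enumerate(outer):
        train_idx = [j for f in outer[:i] + outer[i + 1:] for j in f]
        inner = [train_idx[m::k_inner] for m in range(k_inner)]
        best_p, best_s = None, float("-inf")
        for p in params:
            s = 0.0
            for m, val_idx in enumerate(inner):
                fit_idx = [j for f in inner[:m] + inner[m + 1:] for j in f]
                model = fit([X[j] for j in fit_idx], [y[j] for j in fit_idx], p)
                s += score(model, [X[j] for j in val_idx], [y[j] for j in val_idx])
            if s > best_s:
                best_p, best_s = p, s
        model = fit([X[j] for j in train_idx], [y[j] for j in train_idx], best_p)
        outer_scores.append(score(model, [X[j] for j in test_idx],
                                  [y[j] for j in test_idx]))
    return sum(outer_scores) / k_outer

# Toy demonstration: a one-parameter threshold classifier on separable data.
fit = lambda X, y, t: t
accuracy = lambda t, X, y: sum((x >= t) == bool(c) for x, c in zip(X, y)) / len(y)
X = list(range(40))
y = [0] * 20 + [1] * 20
est = double_cv(X, y, fit, accuracy, params=[5, 20, 35])
```

Because parameter selection happens strictly inside each outer training fold, the outer score is a near-unbiased estimate of how the selected model would perform on new data.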
Directory of Open Access Journals (Sweden)
H. Zhong
2013-07-01
Full Text Available The Lower Rhine Delta, a transitional area between the Rivers Rhine and Meuse and the North Sea, is at risk of flooding induced by infrequent events of storm surge or upstream flooding, or by the still rarer combination of both. A joint probability analysis of the astronomical tide, the wind-induced storm surge, the Rhine flow and the Meuse flow at the boundaries is established in order to produce the joint probability distribution of potential flood events. Three individual joint probability distributions are established, corresponding to three potential flooding causes: storm surges with normal Rhine discharges, normal sea levels with high Rhine discharges, and storm surges with high Rhine discharges. For each category, the corresponding joint probability distribution is applied to stochastically simulate a large number of scenarios. These scenarios can be used as inputs to a deterministic 1-D hydrodynamic model in order to estimate the high water level frequency curves at the transitional locations. The results present the exceedance probability of the present design water level for the economically important cities of Rotterdam and Dordrecht. The calculated exceedance probability is evaluated and compared to the governmental norm. Moreover, the impact of climate change on the high water level frequency curves is quantified for the year 2050 in order to assist decisions regarding the adaptation of the operational water management system and the flood defense system.
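Stochastic simulation of scenarios from a fitted joint distribution can be sketched with a Gaussian copula and lognormal margins; this is a simplified stand-in for the distributions estimated in the study, and every parameter value below is illustrative:

```python
import math
import random

def simulate_scenarios(n, rho=0.3, seed=1):
    # Draw (surge, discharge) pairs: correlated standard normals (a Gaussian
    # copula) mapped through lognormal margins.  rho couples storm surge to
    # river discharge; all marginal parameters are purely illustrative.
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        surge = math.exp(0.2 + 0.5 * z1)       # m (illustrative)
        discharge = math.exp(8.0 + 0.4 * z2)   # m3/s (illustrative)
        out.append((surge, discharge))
    return out

scens = simulate_scenarios(4000)
```

Each simulated pair would then feed the deterministic hydrodynamic model as a boundary-condition scenario.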
Measurement uncertainty and probability
National Research Council Canada - National Science Library
Willink, Robin
2013-01-01
... and probability models; 3.4 Inference and confidence; 3.5 Two central limit theorems; 3.6 The Monte Carlo method and process simulation; 4 The randomization of systematic errors; 4.1 The Working Group of 1980; 4.2 From classical repetition to practica...
Fenlon, Caroline; O'Grady, Luke; Doherty, Michael L; Dunnion, John; Shalloo, Laurence; Butler, Stephen T
2017-07-01
Reproductive performance in pasture-based production systems has a fundamentally important effect on economic efficiency. The individual factors affecting the probability of submission and conception are multifaceted and have been extensively researched. The present study analyzed some of these factors in relation to service-level probability of conception in seasonal-calving pasture-based dairy cows to develop a predictive model of conception. Data relating to 2,966 services from 737 cows on 2 research farms were used for model development and data from 9 commercial dairy farms were used for model testing, comprising 4,212 services from 1,471 cows. The data spanned a 15-yr period and originated from seasonal-calving pasture-based dairy herds in Ireland. The calving season for the study herds extended from January to June, with peak calving in February and March. A base mixed-effects logistic regression model was created using a stepwise model-building strategy and incorporated parity, days in milk, interservice interval, calving difficulty, and predicted transmitting abilities for calving interval and milk production traits. To attempt to further improve the predictive capability of the model, the addition of effects that were not statistically significant was considered, resulting in a final model composed of the base model with the inclusion of BCS at service. The models' predictions were evaluated using discrimination to measure their ability to correctly classify positive and negative cases. Precision, recall, F-score, and area under the receiver operating characteristic curve (AUC) were calculated. Calibration tests measured the accuracy of the predicted probabilities. These included tests of overall goodness-of-fit, bias, and calibration error. Both models performed better than using the population average probability of conception. Neither of the models showed high levels of discrimination (base model AUC 0.61, final model AUC 0.62), possibly because of the
Frič, Roman; Papčo, Martin
2010-12-01
Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently, effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1, ν1) ≤ (μ2, ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x_1, x_2, …, x_n) ∈ I^n : Σ_{i=1}^n x_i ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.
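The simplex S_n and its coordinatewise order are easy to express concretely; a small sketch (function names are illustrative):

```python
# S_n = {(x_1, ..., x_n) in [0,1]^n : x_1 + ... + x_n <= 1}
def in_simplex(point, tol=1e-12):
    return all(-tol <= x <= 1.0 + tol for x in point) and sum(point) <= 1.0 + tol

# The coordinatewise partial order used on S_n.  Note the abstract's point:
# for n = 2 this differs from the usual IF-event order, which reverses
# the second coordinate ((mu1, nu1) <= (mu2, nu2) iff mu1 <= mu2, nu2 <= nu1).
def leq_coordinatewise(p, q):
    return all(a <= b for a, b in zip(p, q))
```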
Zemskov, Serguey V.; Jonkers, Henk M.; Vermolen, Fred J.
The present study is performed in the framework of the investigation of the potential of bacteria to act as a catalyst of the self-healing process in concrete, i.e. their ability to repair occurring cracks autonomously. Spherical clay capsules containing the healing agent (calcium lactate) are embedded in the concrete structure. Water entering a freshly formed crack releases the healing agent and activates the bacteria which will seal the crack through the process of metabolically mediated calcium carbonate precipitation. In the paper, an analytic formalism is developed for the computation of the probability that a crack hits an encapsulated particle, i.e. the probability that the self-healing process starts. Most computations are performed in closed algebraic form in the computer algebra system Mathematica which allows to perform the last step of calculations numerically with a higher accuracy.
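A drastically simplified one-dimensional version of the hit probability can be checked by Monte Carlo. The paper's closed-form Mathematica computation treats the full spherical geometry; the sketch below considers only a capsule centre at uniform depth in the cover and a crack plane at mid-depth, and all parameter values are illustrative:

```python
import random

def crack_hit_probability_mc(r, cover, n=200000, seed=0):
    # A sphere of radius r whose centre is uniform over a cover of the given
    # thickness intersects a crack plane at mid-depth iff the centre lies
    # within r of that plane.  Ignoring edge effects, the exact probability
    # is 2 * r / cover; the Monte Carlo estimate should converge to it.
    rng = random.Random(seed)
    hits = sum(abs(rng.uniform(0.0, cover) - cover / 2.0) <= r
               for _ in range(n))
    return hits / n

estimate = crack_hit_probability_mc(r=2.0, cover=100.0)  # analytic: 0.04
```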
Courey, Karim; Wright, Clara; Asfour, Shihab; Bayliss, Jon; Ludwig, Larry
2008-01-01
Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From this data, we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross sectioned and studied using a focused ion beam (FIB).
van den Akker, H.P.; Baart, J.A.; Baart, J.A.; Brand, H.S.
2017-01-01
Local anaesthesia is frequently used in dentistry and seldom leads to serious local complications. Nevertheless, it is of great importance to be aware of the causes of each local complication and – if necessary – implement correct treatment. The patient must be informed extensively and, if
Two-slit experiment: quantum and classical probabilities
Khrennikov, Andrei
2015-06-01
Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and in the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that the quantum-classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that, in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of Bayesian analysis). However, we then show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of the selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane).
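The violation of the law of total probability in the two-slit setting can be made concrete with equal-weight paths; a minimal sketch, where theta is the relative phase between the two paths and p1, p2 are the single-slit detection probabilities at a given screen point:

```python
import math

def p_classical(p1, p2):
    # Law of total probability with equiprobable slit choice.
    return 0.5 * p1 + 0.5 * p2

def p_quantum(p1, p2, theta):
    # Equal-weight superposition of the two paths: the extra cosine term
    # is the interference that breaks the classical formula.
    return 0.5 * p1 + 0.5 * p2 + math.sqrt(p1 * p2) * math.cos(theta)
```

At theta = pi/2 the interference term vanishes and the two formulas agree; elsewhere the quantum probability over- or undershoots the classical mixture.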
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Carberry, George A; Nocerino, Elisabetta; Mason, Peter J; Schwahn, Denise J; Hetzel, Scott; Turnquist, Alyssa M; Lee, Fred T; Brace, Christopher L
2017-03-01
Purpose To determine how close to the heart pulmonary microwave ablation can be performed without causing cardiac tissue injury or significant arrhythmia. Materials and Methods The study was performed with approval from the institutional animal care and use committee. Computed tomographic fluoroscopically guided microwave ablation of the lung was performed in 12 swine. Antennas were randomized to either parallel (180° ± 20°) or perpendicular (90° ± 20°) orientation relative to the heart surface and to distances of 0-10 mm from the heart. Ablations were performed at 65 W for 5 minutes or until a significant arrhythmia (asystole, heart block, bradycardia, supraventricular or ventricular tachycardia) developed. Heart tissue was evaluated with vital staining and histologic examination. Data were analyzed with mixed effects logistic regression, receiver operating characteristic curves, and the Fisher exact test. Results Thirty-four pulmonary microwave ablations were performed with the antenna a median distance of 4 mm from the heart in both perpendicular (n = 17) and parallel (n = 17) orientation. Significant arrhythmias developed during six (18%) ablations. Cardiac tissue injury occurred with 17 ablations (50%). Risk of arrhythmia and tissue injury decreased with increasing antenna distance from the heart with both antenna orientations. No cardiac complication occurred with a distance of greater than or equal to 4.4 mm from the heart. The ablation zone extended to the pleural surface adjacent to the heart in 71% of parallel and 17% of perpendicular ablations performed 5-10 mm from the heart. Conclusion Microwave lung ablations performed more than or equal to 5 mm from the heart were associated with a low risk of cardiac complications. © RSNA, 2016.
Thangaratinam, Shakila; Allotey, John; Marlin, Nadine; Dodds, Julie; Cheong-See, Fiona; von Dadelszen, Peter; Ganzevoort, Wessel; Akkermans, Joost; Kerry, Sally; Mol, Ben W; Moons, Karl G M; Riley, Richard D; Khan, Khalid S
2017-03-30
Unexpected clinical deterioration before 34 weeks gestation is an undesired course in early-onset pre-eclampsia. To safely prolong preterm gestation, accurate and timely prediction of complications is required. Women with confirmed early-onset pre-eclampsia were recruited from 53 maternity units in the UK to a large prospective cohort study (PREP-946) for the development of prognostic models for the overall risk of experiencing a complication using logistic regression (PREP-L), and for predicting the time to adverse maternal outcome using a survival model (PREP-S). External validation of the models was carried out in a multinational cohort (PIERS-634) and another cohort from the Netherlands (PETRA-216). Main outcome measures were C-statistics to summarise discrimination of the models, together with calibration plots and calibration slopes. A total of 169 mothers (18%) in the PREP dataset had adverse outcomes by 48 hours, and 633 (67%) by discharge. The C-statistics of the models for predicting complications by 48 hours and by discharge were 0.84 (95% CI, 0.81-0.87; PREP-S) and 0.82 (0.80-0.84; PREP-L), respectively. The PREP-S model included maternal age, gestation, medical history, systolic blood pressure, deep tendon reflexes, urine protein creatinine ratio, platelets, serum alanine amino transaminase, urea, creatinine, oxygen saturation and treatment with antihypertensives or magnesium sulfate. The PREP-L model included the above except deep tendon reflexes, serum alanine amino transaminase and creatinine. On validation in the external PIERS dataset, the reduced PREP-S model showed reasonable calibration (slope 0.80) and discrimination (C-statistic 0.75) for predicting adverse outcome by 48 hours. The reduced PREP-L model showed excellent calibration (slope: 0.93 PIERS, 0.90 PETRA) and discrimination (0.81 PIERS, 0.75 PETRA) for predicting risk by discharge in the two external datasets. PREP models can be used to obtain predictions of adverse maternal outcome risk, including
Joukar, Siyavash; Ghorbani-Shahrbabaki, Soodabe
2016-05-01
Some previous studies have used an animal model of paradoxical sleep deprivation to investigate the complications of sleep loss. The present study was designed to examine the effectiveness and reliability of this model for investigating and assessing some cardiovascular complications of obstructive sleep apnea syndrome. Wistar rats were divided into a control group; the Test48 and Test72 groups, which experienced paradoxical sleep deprivation for 48 and 72 h, respectively; and the Sham48 and Sham72 groups, which were exposed to the same environmental conditions as the test groups but without sleep deprivation. At the end of the experiment, blood pressure and heart rate variability were assessed. The results showed that 72 h of rapid eye movement sleep deprivation significantly increased systolic blood pressure compared to the control (p < 0.05). Paradoxical sleep deprivation may therefore be a suitable model for the induction and investigation of the hemodynamic alterations that occur in obstructive sleep apnea syndrome; however, it cannot serve as an alternative model to induce heart rate variability alterations similar to those reported in patients with obstructive sleep apnea.
Goldberg, Samuel
1960-01-01
Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.
Koo, Reginald; Jones, Martin L.
2011-01-01
Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
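One classic instance is the matching (derangement) problem: the probability that a random permutation has no fixed point tends to 1/e as n grows. A quick Monte Carlo check (names are illustrative):

```python
import math
import random

def prob_no_fixed_point(n, trials=100000, seed=0):
    # Estimate the probability that a uniformly random permutation of
    # {0, ..., n-1} is a derangement (no element stays in its own place).
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        p = list(range(n))
        rng.shuffle(p)
        if all(p[i] != i for i in range(n)):
            hits += 1
    return hits / trials

estimate = prob_no_fixed_point(10)  # close to 1/e ~ 0.3679 already at n = 10
```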
Expected utility with lower probabilities
DEFF Research Database (Denmark)
Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte
1994-01-01
An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...
Contributions to quantum probability
Energy Technology Data Exchange (ETDEWEB)
Fritz, Tobias
2010-06-25
Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do the outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated by the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic descriptions. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a
Quantum probability measures and tomographic probability densities
Amosov, G. G.; Man'ko, V. I.
2004-01-01
Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the
Agreeing Probability Measures for Comparative Probability Structures
P.P. Wakker (Peter)
1981-01-01
It is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, all of the proofs the author encountered are invalid
Directory of Open Access Journals (Sweden)
Emil Bayramov
2016-05-01
Full Text Available The main goal of this research was to detect oil spills, determine oil spill frequencies and approximate oil leak sources around the Oil Rocks Settlement and the Chilov and Pirallahi Islands in the Caspian Sea using 136 multi-temporal ENVISAT Advanced Synthetic Aperture Radar Wide Swath Medium Resolution images acquired during 2006–2010. The following oil spill frequencies were observed around the Oil Rocks Settlement and the Chilov and Pirallahi Islands: 2–10 (3471.04 sq km), 11–20 (971.66 sq km), 21–50 (692.44 sq km) and 51–128 (191.38 sq km). The most critical oil leak sources, with frequencies in the range 41–128, were observed at the Oil Rocks Settlement. The exponential regression analysis between wind speeds and oil slick areas detected from the 136 multi-temporal ENVISAT images yielded a regression coefficient of 63%. The regression model showed that larger oil spill areas were observed at lower wind speeds. The spatiotemporal patterns of currents in the Caspian Sea explain the multi-directional spatial distribution of oil spills around the Oil Rocks Settlement and the Chilov and Pirallahi Islands. The linear regression analysis between the detected oil spill frequencies and the oil contamination probability predicted by the stochastic model showed a positive trend with a regression coefficient of 30%.
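An exponential regression of the kind used for the wind-speed/slick-area relation is a linear least-squares fit on log-transformed responses; a sketch on synthetic data (the coefficients below are illustrative, not those of the study):

```python
import math

def fit_exponential(x, y):
    # Least-squares fit of y = a * exp(b * x) via linear regression of
    # log(y) on x.  A negative b reproduces the study's finding that
    # larger slick areas accompany lower wind speeds.
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(x) / n, sum(ly) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, ly))
    b = sxy / sxx
    a = math.exp(my - b * mx)
    return a, b

# Synthetic check: data generated exactly on y = 5 * exp(-0.3 * x).
xs = [0.0, 2.0, 4.0, 6.0, 8.0]
ys = [5.0 * math.exp(-0.3 * xi) for xi in xs]
a, b = fit_exponential(xs, ys)
```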
Bakker, O.J.
2015-01-01
Research questions addressed in this thesis: What is the accuracy of serum blood urea nitrogen as an early predictor of complicated pancreatitis? What is the difference in clinical outcome between patients with pancreatic parenchymal necrosis and patients with extrapancreatic necrosis without necrosis
Currier, Joseph M; Irish, Jennifer E F; Neimeyer, Robert A; Foster, Joshua D
2015-01-01
There is increasing consensus that mourners' general attachment security and ongoing sense of connectedness to the deceased figure prominently in adjustment to bereavement. However, the interplay between these variables has not been investigated thoroughly. We therefore studied 195 young adults who were bereaved by violent causes (homicide, suicide, and fatal accidents) in the previous 2 years, measuring their attachment-related insecurities (anxiety and avoidance), their specific ongoing attachment or "continuing bond" (CB) to the deceased, and their complicated grief (CG) symptomatology over the loss of this relationship. Analyses indicated that CBs were concurrently linked with greater CG symptomatology. However, other results also suggested that attachment could moderate the adaptiveness of maintaining a sense of connection to the deceased loved one. Specifically, CBs were less predictive of CG symptomatology for individuals with high anxiety and low avoidance, and most predictive of intense grieving for bereaved people whose attachment styles were more highly avoidant and minimally anxious. These findings suggest the relevance of evaluating the appropriateness of clinical techniques that emphasize or deemphasize the CB for mourners who differ in their styles of attachment. Such studies could potentially promote a better match of interventions to clients whose styles of coping are congruent with these procedures.
Hansen, F.S.
2016-01-01
Complicated rhinosinusitis: a title chosen for its multi-interpretable nature. In the Oxford dictionary ‘complicated’ is defined as ‘consisting of many interconnecting parts or elements’ and ‘involving many different and confusing aspects’ as well as ‘involving complications’ in medicine. It is the last definition that is applicable to chapter 2 which focuses on the medical complications of acute rhinosinusitis. Chapter 2.1 describes the incidence and management of orbital and intracranial co...
Stationary algorithmic probability
National Research Council Canada - National Science Library
Müller, Markus
2010-01-01
..., since their actual values depend on the choice of the universal reference computer. In this paper, we analyze a natural approach to eliminate this machine-dependence. Our method is to assign algorithmic probabilities to the different...
Models Show Subsurface Cracking May Complicate Groundwater Cleanup at Hazardous Waste Sites
Chlorinated solvents like trichloroethylene contaminate groundwater at numerous sites nationwide. This modeling study, conducted at the Air Force Institute of Technology, shows that subsurface cracks, either natural or due to the presence of the contaminant itself, may result in...
DECOFF Probabilities of Failed Operations
DEFF Research Database (Denmark)
Gintautas, Tomas
A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha-factor method defined by DNV (2011) and model performance is evaluated. The effects of weather forecast uncertainty on the output Probabilities of Failure are also analysed and reported.
Factual and cognitive probability
Chuaqui, Rolando
2012-01-01
This modification separates the two aspects of probability: probability as a part of physical theories (factual), and as a basis for statistical inference (cognitive). Factual probability is represented by probability structures as in the earlier papers, but now built independently of the language. Cognitive probability is interpreted as a form of "partial truth". The paper also contains a discussion of the Principle of Insufficient Reason and of Bayesian and classical statistical methods, in...
Evaluating probability forecasts
Lai, Tze Leung; Gross, Shulamith T.; Shen, David Bo
2011-01-01
Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability for...
Ballard, P G; Bean, N G; Ross, J V
2016-03-21
Epidemic fade-out refers to infection elimination in the trough between the first and second waves of an outbreak. The number of infectious individuals drops to a relatively low level between these waves of infection, and if elimination does not occur at this stage, then the disease is likely to become endemic. For this reason, it appears to be an ideal target for control efforts. Despite this obvious public health importance, the probability of epidemic fade-out is not well understood. Here we present new algorithms for approximating the probability of epidemic fade-out for the Markovian SIR model with demography. These algorithms are more accurate than previously published formulae, and one of them scales well to large population sizes. This method allows us to investigate the probability of epidemic fade-out as a function of the effective transmission rate, recovery rate, population turnover rate, and population size. We identify an interesting feature: the probability of epidemic fade-out is very often greatest when the basic reproduction number, R0, is approximately 2 (restricting consideration to cases where a major outbreak is possible, i.e., R0>1). The public health implication is that there may be instances where a non-lethal infection should be allowed to spread, or antiviral usage should be moderated, to maximise the chance of the infection being eliminated before it becomes endemic. Copyright © 2016 Elsevier Ltd. All rights reserved.
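As a rough illustration of the quantity being approximated above, the sketch below runs a Gillespie simulation of the Markovian SIR model with demography and counts runs in which infection is eliminated before a time horizon. This is a crude Monte Carlo proxy, not the authors' algorithms; all parameter values are invented, and the force of infection uses a fixed N for simplicity.

```python
import random

def sir_demography(beta, gamma, mu, N, I0, t_max, rng):
    """Gillespie simulation of the Markovian SIR model with demography.
    Returns the elimination time, or None if infection persists to t_max."""
    S, I = N - I0, I0
    t = 0.0
    while t < t_max:
        if I == 0:
            return t  # infection eliminated
        rates = [beta * S * I / N,  # infection
                 gamma * I,         # recovery
                 mu * N,            # birth (into S)
                 mu * S,            # death of a susceptible
                 mu * I]            # death of an infective
        total = sum(rates)
        t += rng.expovariate(total)
        r = rng.uniform(0.0, total)
        if r < rates[0]:
            S -= 1; I += 1
        elif r < rates[0] + rates[1]:
            I -= 1
        elif r < rates[0] + rates[1] + rates[2]:
            S += 1
        elif r < rates[0] + rates[1] + rates[2] + rates[3]:
            S -= 1
        else:
            I -= 1
    return None

rng = random.Random(42)
runs = 100
# R0 = beta / (gamma + mu) is roughly 2 here, near the regime the paper highlights.
fadeouts = sum(sir_demography(2.0, 1.0, 0.02, 200, 5, 30.0, rng) is not None
               for _ in range(runs))
print(f"estimated elimination probability ~ {fadeouts / runs:.2f}")
```

Note that this counts any elimination before t_max, a looser event than the paper's fade-out in the trough between the first and second waves; the paper's algorithms target that trough specifically.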
Directory of Open Access Journals (Sweden)
Oldřich Coufal
2014-01-01
Objectives. The aim of the study was to develop a clinical prediction model for assessing the probability of invasive cancer in the definitive surgical resection specimen of patients with a biopsy diagnosis of ductal carcinoma in situ (DCIS) of the breast, to facilitate decision making regarding axillary surgery. Methods. In 349 women with DCIS, predictors of invasion in the definitive resection specimen were identified. A model to predict the probability of invasion was developed and subsequently simplified to divide patients into two risk categories. The model's performance was validated on another patient population. Results. Multivariate logistic regression revealed four independent predictors of invasion: (i) suspected (micro)invasion in the biopsy specimen; (ii) visibility of the lesion on ultrasonography; (iii) size of the lesion on mammography >30 mm; (iv) clinical palpability of the lesion. The actual frequency of invasion in the high-risk patient group in the test and validation populations was 52.6% and 48.3%, respectively; in the low-risk group it was 16.8% and 7.1%, respectively. Conclusion. The model proved to have good performance. In patients with a low probability of invasion, an axillary procedure can be omitted without a substantial risk of additional surgery.
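At prediction time, a multivariate logistic model of this kind reduces to a logistic function of a weighted sum of the four binary predictors. The sketch below uses invented coefficients, purely to show the mechanics; the published coefficients are not reproduced here.

```python
import math

# Hypothetical coefficients for a four-predictor logistic model of the
# kind described in the abstract. These numbers are invented for
# illustration and are NOT the published model.
COEF = {
    "intercept": -2.2,
    "microinvasion_suspected": 1.6,
    "visible_on_ultrasound": 0.9,
    "mammographic_size_gt_30mm": 0.8,
    "palpable": 0.7,
}

def invasion_probability(**predictors):
    """Probability of invasive cancer in the resection specimen,
    given binary predictor values."""
    z = COEF["intercept"] + sum(COEF[name] * int(value)
                                for name, value in predictors.items())
    return 1.0 / (1.0 + math.exp(-z))

low = invasion_probability(microinvasion_suspected=False, visible_on_ultrasound=False,
                           mammographic_size_gt_30mm=False, palpable=False)
high = invasion_probability(microinvasion_suspected=True, visible_on_ultrasound=True,
                            mammographic_size_gt_30mm=True, palpable=True)
print(f"low-risk profile: {low:.2f}, high-risk profile: {high:.2f}")
```

Dichotomizing the resulting probability at a chosen threshold is what yields the two risk categories described in the abstract.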
Vasilevskis, Eduard E; Kuzniewicz, Michael W; Cason, Brian A; Lane, Rondall K; Dean, Mitzi L; Clay, Ted; Rennie, Deborah J; Vittinghoff, Eric; Dudley, R Adams
2009-07-01
To develop and compare ICU length-of-stay (LOS) risk-adjustment models using three commonly used mortality or LOS prediction models. Between 2001 and 2004, we performed a retrospective, observational study of 11,295 ICU patients from 35 hospitals in the California Intensive Care Outcomes Project. We compared the accuracy of the following three LOS models: a recalibrated acute physiology and chronic health evaluation (APACHE) IV-LOS model; and models developed using risk factors in the mortality probability model III at zero hours (MPM(0)) and the simplified acute physiology score (SAPS) II mortality prediction model. We evaluated models by calculating the following: (1) grouped coefficients of determination; (2) differences between observed and predicted LOS across subgroups; and (3) intraclass correlations of observed/expected LOS ratios between models. The grouped coefficients of determination were APACHE IV with coefficients recalibrated to the LOS values of the study cohort (APACHE IVrecal) [R(2) = 0.422], mortality probability model III at zero hours (MPM(0) III) [R(2) = 0.279], and simplified acute physiology score (SAPS II) [R(2) = 0.008]. For each decile of predicted ICU LOS, the mean predicted LOS vs the observed LOS was significantly different (p APACHE IVrecal, MPM(0) III, and SAPS II, respectively. Plots of the predicted vs the observed LOS ratios of the hospitals revealed a threefold variation in LOS among hospitals with high model correlations. APACHE IV and MPM(0) III were more accurate than SAPS II for the prediction of ICU LOS. APACHE IV is the most accurate and best calibrated model. Although it is less accurate, MPM(0) III may be a reasonable option if the data collection burden or the treatment effect bias is a consideration.
Ravyn, Dana; Ravyn, Vipa; Lowney, Rob; Ferraris, Victor
2014-01-01
Investments in continuing medical education (CME) exceed $2 billion annually, but few studies report the economic impact of CME activities. Analysis of patient-level economic outcomes data is often not feasible. Accordingly, we developed a model to illustrate estimation of the potential economic impact associated with CME activity outcomes. Outcomes impact analysis demonstrated how costs averted from a CME symposium that promoted prevention of bleeding-related complications (BRC) and reoperation for bleeding (RFB) in cardiac and thoracic operations could be estimated. Model parameter estimates were from published studies of costs associated with BRC and RFB. Operative volume estimates came from the Society of Thoracic Surgeons workforce data. The base case predicted 3 in 10 participants preventing one BRC or RFB in 2% or 1.5% of annual operations, respectively. Probabilistic sensitivity analysis (PSA) evaluated the effect of parameter uncertainty. 92% of participants (n = 133) self-reported commitment to change, a validated measure of behavior change. For BRC, estimates for costs averted were $1,502,769 (95% confidence interval [CI], $869,860-$2,359,068) for cardiac operations and $2,715,246 (95% CI, $1,590,308-$4,217,092) for thoracic operations. For RFB, the savings estimates were $2,233,988 (95% CI, $1,223,901-$3,648,719). Our economic model demonstrates that application of CME-related learning to prevent bleeding complications may yield substantial cost savings. Model prediction of averted costs associated with CME allows estimation of the economic impact on outcomes in the absence of patient-level outcomes data related to CME activities. © 2014 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.
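The probabilistic sensitivity analysis described above can be sketched as a Monte Carlo simulation: uncertain inputs are drawn from assumed distributions and the averted-cost output is summarised by its mean and a 95% interval. Every distribution and number below is hypothetical, not taken from the study.

```python
import random

rng = random.Random(7)

def averted_costs(rng):
    """One Monte Carlo draw of annual costs averted by practice change.
    All parameter distributions are illustrative assumptions."""
    cost_per_event = rng.lognormvariate(9.0, 0.3)  # cost of one complication ($)
    ops_per_surgeon = rng.gauss(150, 20)           # annual operations per surgeon
    adopters = 40                                  # participants changing practice
    prevention_rate = rng.uniform(0.01, 0.02)      # fraction of ops spared an event
    return adopters * ops_per_surgeon * prevention_rate * cost_per_event

draws = sorted(averted_costs(rng) for _ in range(10_000))
mean = sum(draws) / len(draws)
lo, hi = draws[249], draws[9749]  # 2.5th and 97.5th percentiles
print(f"mean ${mean:,.0f} (95% interval ${lo:,.0f} to ${hi:,.0f})")
```

The interval width shows how parameter uncertainty propagates into the savings estimate, which is the point of a PSA when patient-level outcome data are unavailable.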
Energy Technology Data Exchange (ETDEWEB)
Bevelhimer, Mark S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Environmental Sciences Division; Colby, Jonathan [Verdant Power, Inc., New York, NY (United States); Adonizio, Mary Ann [Verdant Power, Inc., New York, NY (United States); Tomichek, Christine [Kleinschmidt Associates, Pittsfield, ME (United States); Scherelis, Constantin C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Environmental Sciences Division
2016-07-31
One of the most important biological questions facing the marine and hydrokinetic (MHK) energy industry is whether fish and marine mammals that encounter MHK devices are likely to be struck by moving components. For hydrokinetic (HK) devices, i.e., those that generate energy from flowing water, this concern is greatest for large organisms because their increased length increases the probability that they will be struck as they pass through the area of blade sweep and because their increased mass means that the force absorbed if struck is greater and potentially more damaging (Amaral et al. 2015). Key to answering this question is understanding whether aquatic organisms change their swimming behavior as they encounter a device in a way that decreases their likelihood of being struck and possibly injured by the device. Whether near-field or far-field behavior results in general avoidance of or attraction to HK devices is a significant factor in the possible risk of physical contact with rotating turbine blades (Cada and Bevelhimer 2011).
Wenmackers, S.; Vanpoucke, D. E. P.; Douven, I.
We present a model for studying communities of epistemically interacting agents who update their belief states by averaging (in a specified way) the belief states of other agents in the community. The agents in our model have a rich belief state, involving multiple independent issues which are
Probabilities for Solar Siblings
Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.
2015-02-01
We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.
van Dussen, Laura; Biegstraaten, Marieke; Dijkgraaf, Marcel Gw; Hollak, Carla Em
2014-01-01
Long-term complications and associated conditions of type 1 Gaucher Disease (GD) can include splenectomy, bone complications, pulmonary hypertension, Parkinson disease and malignancies. Enzyme replacement therapy (ERT) reverses cytopenia and reduces organomegaly. To study the effects of ERT on
DEFF Research Database (Denmark)
Jepsen, J. U.; Baveco, J. M.; Topping, C. J.
2004-01-01
Spatially explicit simulation models of varying degrees of complexity are increasingly used in landscape and species management and conservation. The choice as to which type of model to employ in a particular situation is, however, far too often governed by logistic constraints and the personal... and demographics (IBPM)). The IBPM was analysed in two versions (IBPM_st and IBPM_dyn). Both assumed spatial heterogeneity of the matrix, but IBPM_dyn in addition included temporal matrix dynamics. The models were developed with a shared minimum objective, namely to predict the dynamics of individuals...
Directory of Open Access Journals (Sweden)
Pitágoras Báskara Justino
2003-06-01
Para comparar diversas técnicas de irradiação para o câncer de esôfago, foi utilizado sistema de planejamento tridimensional. Em um paciente com carcinoma espinocelular de esôfago médio, foram estudadas as seguintes técnicas de tratamento: dois campos ântero-posteriores e dois campos látero-laterais paralelos e opostos, três campos em "Y" e em "T" e quatro campos em "X". Foram obtidos os histogramas dose-volume, considerando como órgãos de risco medula espinhal e pulmões. Os resultados foram analisados de acordo com as recomendações da Normal Tissue Complication Probability (NTCP) e Tumor Control Probability (TCP). Quanto às doses de irradiação em pulmão, a melhor opção foi a técnica em dois campos ântero-posteriores paralelos e opostos. A medula foi mais poupada quando se utilizaram campos látero-laterais. Sugerimos a combinação de pelo menos duas técnicas de tratamento: ântero-posterior e as técnicas com campos em "Y", "T" ou látero-laterais, para o balanceamento das doses em pulmões e medula espinhal. Ou, ainda, a utilização de técnicas de três campos durante todo o tratamento. Radiotherapy techniques for esophageal cancer were compared using a three-dimensional planning system. We studied the following treatment techniques used for a patient with squamous cell carcinoma of the middle third of the esophagus: two antero-posterior and two latero-lateral parallel opposed fields, three fields ("Y" and "T"), and four fields ("X"). Dose-volume histograms were obtained considering spinal cord and lungs as organs at risk. Analysis was performed comparing doses in these organs as recommended by the Normal Tissue Complication Probability (NTCP) and Tumor Control Probability (TCP). When only the lungs were considered, the best technique was two antero-posterior parallel opposed fields. The spinal cord was best protected using latero-lateral fields. We suggest the combination of at least two treatment techniques: ântero
Valero, Antonio; Pasquali, Frédérique; De Cesare, Alessandra; Manfreda, Gerardo
2014-08-01
Current sampling plans assume a random distribution of microorganisms in food. However, food-borne pathogens are estimated to be heterogeneously distributed in powdered foods. This spatial distribution, together with very low contamination levels, raises concern about the efficiency of current sampling plans for the detection of food-borne pathogens such as Cronobacter and Salmonella in powdered foods such as powdered infant formula or powdered eggs. An alternative approach based on a Poisson distribution of the contaminated part of the lot (the Habraken approach) was used to evaluate the probability of falsely accepting a contaminated lot of powdered food when different sampling strategies were simulated, considering variables such as lot size, sample size, microbial concentration in the contaminated part of the lot, and proportion of the lot contaminated. The simulation results suggest that a sample size of 100 g or more requires the lowest number of samples to be tested, in comparison with sample sizes of 10 or 1 g. Moreover, the number of samples to be tested decreases greatly if the microbial concentration is 1 CFU/g instead of 0.1 CFU/g, or if the proportion of contamination is 0.05 instead of 0.01. Mean contaminations higher than 1 CFU/g or proportions higher than 0.05 did not affect the number of samples. The Habraken approach represents a useful tool for risk management in designing a fit-for-purpose sampling plan for the detection of low levels of food-borne pathogens in heterogeneously contaminated powdered food. It must be noted, however, that although such sampling plans are effective in detecting pathogens, they are difficult to apply because of the huge number of samples that need to be tested. Sampling does not seem to be an effective measure to control pathogens in powdered food. Copyright © 2014 Elsevier B.V. All rights reserved.
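Under a Poisson assumption of this kind, the probability of falsely accepting a contaminated lot can be sketched as below. The "all samples negative means accept" rule and all parameter values are simplifying assumptions for illustration, not the paper's full Habraken calculation.

```python
import math

def p_accept(n_samples, sample_mass_g, conc_cfu_per_g, contaminated_fraction):
    """Probability that all n_samples test negative, so a contaminated lot
    is falsely accepted. A sample either comes from the clean part of the
    lot, or from the contaminated part, where cell counts are Poisson
    with mean conc_cfu_per_g * sample_mass_g."""
    p_negative = ((1.0 - contaminated_fraction)
                  + contaminated_fraction * math.exp(-conc_cfu_per_g * sample_mass_g))
    return p_negative ** n_samples

# Larger sample masses detect contamination with fewer tests:
for mass in (1, 10, 100):
    print(f"{mass:>3} g samples: P(false accept) = "
          f"{p_accept(10, mass, 0.1, 0.05):.3f}")
```

The monotone drop in acceptance probability with sample mass mirrors the abstract's finding that 100 g samples need the fewest tests at low concentrations.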
Ballestra, Luca Vincenzo; Pacelli, Graziella; Radi, Davide
2016-12-01
We propose a numerical method to compute the first-passage probability density function in a time-changed Brownian model. In particular, we derive an integral representation of such a density function in which the integrand functions must be obtained solving a system of Volterra equations of the first kind. In addition, we develop an ad-hoc numerical procedure to regularize and solve this system of integral equations. The proposed method is tested on three application problems of interest in mathematical finance, namely the calculation of the survival probability of an indebted firm, the pricing of a single-knock-out put option and the pricing of a double-knock-out put option. The results obtained reveal that the novel approach is extremely accurate and fast, and performs significantly better than the finite difference method.
... that may increase the risk of developing complicated grief include: An unexpected or violent death, such as death from a car accident, or the murder or suicide of a loved one Death of a child Close or dependent relationship to the deceased person Social isolation or loss ...
Nguyen, Truong-Huy; El Outayek, Sarah; Lim, Sun Hee; Nguyen, Van-Thanh-Van
2017-10-01
Many probability distributions have been developed to model annual maximum rainfall series (AMS). However, there is no general agreement as to which distribution should be used, owing to the lack of a suitable evaluation method. This paper therefore presents a general procedure for systematically assessing the performance of ten commonly used probability distributions in rainfall frequency analyses, based on their descriptive as well as predictive abilities. The assessment procedure relies on an extensive set of graphical and numerical performance criteria to identify the most suitable models, i.e. those that provide the most accurate and most robust extreme rainfall estimates. The proposed systematic assessment approach has been shown to be more efficient and more robust than the traditional model selection method based on only a limited set of goodness-of-fit criteria. To test the feasibility of the proposed procedure, an illustrative application was carried out using 5-min, 1-h, and 24-h annual maximum rainfall data from a network of 21 raingages located in the Ontario region of Canada. Results indicated that the GEV, GNO, and PE3 models were the best models for describing the distribution of daily and sub-daily annual maximum rainfalls in this region. The GEV distribution, however, was preferred to the GNO and PE3 because it rests on a more solid theoretical basis for representing the distribution of extreme random variables.
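As a minimal illustration of this kind of rainfall frequency analysis, the sketch below fits the Gumbel distribution (the zero-shape special case of the GEV) to a synthetic annual-maximum series by the method of moments and computes return levels. Real studies such as this one compare several candidate distributions, typically fitted by L-moments or maximum likelihood; the data here are synthetic.

```python
import math
import numpy as np

# Synthetic 60-year annual-maximum rainfall series (mm), Gumbel-distributed
rng = np.random.default_rng(1)
ams = rng.gumbel(loc=40.0, scale=10.0, size=60)

# Method-of-moments estimators for the Gumbel distribution:
#   scale = s * sqrt(6) / pi,  loc = mean - gamma * scale
scale = ams.std(ddof=1) * math.sqrt(6.0) / math.pi
loc = ams.mean() - 0.5772156649 * scale  # Euler-Mascheroni constant

def return_level(T):
    """Rainfall depth exceeded on average once every T years
    (Gumbel quantile at non-exceedance probability 1 - 1/T)."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

print(f"loc={loc:.1f} mm, scale={scale:.1f} mm, "
      f"100-yr return level={return_level(100):.1f} mm")
```

Comparing such fitted quantiles against plotting positions of the observed maxima is the descriptive-ability side of the assessment the paper formalizes.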
Ciporen, Jeremy N; Lucke-Wold, Brandon; Mendez, Gustavo; Cameron, William E; McCartney, Shirley
2017-02-01
Endoscopic surgical treatment of pituitary tumors, lateral invading tumors, or aneurysms requires surgeons to operate adjacent to the cavernous sinus. During these endoscopic endonasal procedures, the carotid artery is vulnerable to surgical injury at its genu. The objective of this simulation model was to evaluate trainees regarding management of a potentially life-threatening vascular injury. Cadaveric heads were prepared in accordance with the Oregon Health & Science University body donation program. An endoscopic endonasal approach was used, and a perfusion pump with a catheter was placed in the ipsilateral common carotid artery at its origin in the neck. Learners used a muscle graft to establish vascular control and were evaluated over 3 training sessions. Simulation assessment, blood loss during sessions, and performance metric data were collected for learners. Vascular control was obtained at a mean arterial pressure of 65 mm Hg using a muscle graft correctly positioned at the arteriotomy site. Learners improved over the course of training, with senior residents (n = 4) performing better across all simulation categories (situation awareness, decision making, communications and teamwork, and leadership); the largest mean difference was in communication and teamwork. Additionally, learner performance concerning blood loss improved between sessions (t = 3.667, P management that transcend this model. Copyright © 2016 Elsevier Inc. All rights reserved.
Haemodynamic resistance model of monochorionic twin pregnancies complicated by acardiac twinning
Energy Technology Data Exchange (ETDEWEB)
Umur, Asli [Laser Center and Department of Obstetrics and Gynecology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Gemert, Martin J C van [Laser Center and Department of Obstetrics and Gynecology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Wijngaard, Jeroen P H M van den [Laser Center and Department of Obstetrics and Gynecology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Ross, Michael G [Department of Obstetrics and Gynecology, Harbor University of California-Los Angeles Medical Center, Torrance, CA 9050 (United States); Nikkels, Peter G J [Department of Pathology, University Medical Center, Utrecht (Netherlands)
2004-07-21
An acardiac twin is a severely malformed monochorionic twin fetus that lacks most organs, particularly a heart. It grows during pregnancy, because it is perfused by its developmentally normal co-twin (called the pump twin) via a set of placental arterioarterial and venovenous anastomoses. The pump twin dies intrauterine or neonatally in about 50% of the cases due to congestive heart failure, polyhydramnios and prematurity. Because the pathophysiology of this pregnancy is currently incompletely understood, we modified our previous haemodynamic model of monochorionic twins connected by placental vascular anastomoses to include the analysis of acardiac twin pregnancies. We incorporated the fetoplacental circulation as a resistance circuit and used the fetal umbilical flow that perfuses the body to define fetal growth, rather than the placental flow as done previously. Using this modified model, we predicted that the pump twin has excess blood volume and increased mean arterial blood pressure compared to those in the acardiac twin. Placental perfusion of the acardiac twin is significantly reduced compared to normal, as a consequence of an increased venous pressure, possibly implying reduced acardiac placental growth. In conclusion, the haemodynamic analysis may contribute to an increased knowledge of the pathophysiologic consequences of an acardiac body mass for the pump twin. (note)
Hoyal Cuthill, Jennifer
2015-02-07
This study models the probability of incompatibility versus compatibility for binary or unordered multistate phylogenetic characters, by treating the allocation of taxa to character states as a classical occupancy problem in probability. It is shown that, under this model, the number of character states has a non-linear effect on the probability of character incompatibility, which is also affected by the number of taxa. Effects on homoplasy from the number of character states are further explored using evolutionary computer simulations. The results indicate that the character state space affects both the known levels of homoplasy (recorded during simulated evolution) and those inferred from parsimony analysis of the resulting character data, with particular relevance for morphological phylogenetic analyses which generally use the parsimony method. When the evolvable state space is large (more potential states per character) there is a reduction in the known occurrence of homoplasy (as reported previously). However, this is not always reflected in the levels of homoplasy detected in a parsimony analysis, because higher numbers of states per character can lead to an increase in the probability of character incompatibility (as well as the maximum homoplasy measurable with some indices). As a result, inferred trends in homoplasy can differ markedly from the underlying trend (that recorded during evolutionary simulation). In such cases, inferred homoplasy can be entirely misleading with regard to tree quality (with higher levels of homoplasy inferred for better quality trees). When rates of evolution are low, commonly used indices such as the number of extra steps (H) and the consistency index (CI) provide relatively good measures of homoplasy. However, at higher rates, estimates may be improved by using the retention index (RI), and particularly by accounting for homoplasy measured among randomised character data using the homoplasy excess ratio (HER). Copyright
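The classical occupancy framing mentioned above can be illustrated with a short inclusion-exclusion computation: the probability that a uniform random allocation of taxa uses every available character state. This is a textbook occupancy result shown for intuition, not the paper's incompatibility formula.

```python
from math import comb

def p_all_states_used(n_taxa, k_states):
    """Classical occupancy problem: probability that a uniform random
    allocation of n_taxa to k_states leaves no state empty
    (inclusion-exclusion over the possible sets of empty states)."""
    return sum((-1) ** j * comb(k_states, j) * ((k_states - j) / k_states) ** n_taxa
               for j in range(k_states + 1))

# With a fixed number of taxa, more states per character make it less
# likely that every state is actually represented in the sample.
probs = [p_all_states_used(10, k) for k in (2, 3, 4, 5)]
print([round(p, 3) for p in probs])
```

The non-linear decay of these probabilities with the number of states echoes the paper's point that state space interacts non-linearly with the chance of character incompatibility.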
Directory of Open Access Journals (Sweden)
Michael L Mann
The costly interactions between humans and wildfires throughout California demonstrate the need to understand the relationships between them, especially in the face of a changing climate and expanding human communities. Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires, with previously published estimates of increases ranging from nine to fifty-three percent by the end of the century. Our goal is to assess the role of climatic and anthropogenic influences on the state's fire regimes from 1975 to 2050. We develop an empirical model that integrates estimates of biophysical indicators relevant to plant communities and anthropogenic influences at each forecast time step. Historically, we find that anthropogenic influences account for up to fifty percent of the explanatory power of the model. We also find that the total area burned is likely to increase, with burned area expected to grow by 2.2 and 5.0 percent by 2050 under climatic bookends (the PCM and GFDL climate models, respectively). Our two climate models show considerable agreement, but due to potential shifts in rainfall patterns, substantial uncertainty remains for the semiarid inland deserts and coastal areas of the south. Given the strength of human-related variables in some regions, however, it is clear that comprehensive projections of future fire activity should include both anthropogenic and biophysical influences. Previous findings of substantially increased numbers of fires and burned area for California may be tied to omitted-variable bias from the exclusion of human influences. The omission of anthropogenic variables in our model would overstate the importance of climatic ones by at least 24%. As such, the failure to include anthropogenic effects in many models likely overstates the response of wildfire to climatic change.
Mann, Michael L; Batllori, Enric; Moritz, Max A; Waller, Eric K; Berck, Peter; Flint, Alan L; Flint, Lorraine E; Dolfi, Emmalee
2016-01-01
The costly interactions between humans and wildfires throughout California demonstrate the need to understand the relationships between them, especially in the face of a changing climate and expanding human communities. Although a number of statistical and process-based wildfire models exist for California, there is enormous uncertainty about the location and number of future fires, with previously published estimates of increases ranging from nine to fifty-three percent by the end of the century. Our goal is to assess the role of climate and anthropogenic influences on the state's fire regimes from 1975 to 2050. We develop an empirical model that integrates estimates of biophysical indicators relevant to plant communities and anthropogenic influences at each forecast time step. Historically, we find that anthropogenic influences account for up to fifty percent of explanatory power in the model. We also find that the total area burned is likely to increase, with burned area expected to increase by 2.2 and 5.0 percent by 2050 under climatic bookends (PCM and GFDL climate models, respectively). Our two climate models show considerable agreement, but due to potential shifts in rainfall patterns, substantial uncertainty remains for the semiarid inland deserts and coastal areas of the south. Given the strength of human-related variables in some regions, however, it is clear that comprehensive projections of future fire activity should include both anthropogenic and biophysical influences. Previous findings of substantially increased numbers of fires and burned area for California may be tied to omitted variable bias from the exclusion of human influences. The omission of anthropogenic variables in our model would overstate the importance of climatic ones by at least 24%. As such, the failure to include anthropogenic effects in many models likely overstates the response of wildfire to climatic change.
Zehentmayr, Franz; Söhn, Matthias; Exeli, Ann-Katrin; Wurstbauer, Karl; Tröller, Almut; Deutschmann, Heinz; Fastner, Gerd; Fussl, Christoph; Steininger, Philipp; Kranzinger, Manfred; Belka, Claus; Studnicka, Michael; Sedlmayer, Felix
2015-05-28
One of the primary dose-limiting toxicities during thoracic irradiation is acute esophagitis (AE). The aim of this study is to investigate dosimetric and clinical predictors for AE grade ≥ 2 in patients treated with accelerated radiotherapy for locally advanced non-small cell lung cancer (NSCLC). 66 NSCLC patients were included in the present analysis: 4 stage II, 44 stage IIIA and 18 stage IIIB. All patients received induction chemotherapy followed by dose differentiated accelerated radiotherapy (DART-bid). Depending on size (mean of three perpendicular diameters), tumors were binned into four dose groups, with tumors larger than 6 cm receiving 90 Gy. Patients were treated with a 3D target-splitting technique. In order to estimate the normal tissue complication probability (NTCP), two Lyman models and the cutoff-logistic regression model were fitted to the data with AE grade ≥ 2 as statistical endpoint. Inter-model comparison was performed with the corrected Akaike information criterion (AICc), which relates the model's quality of fit (likelihood value) to its complexity (i.e., number of variables in the model), corrected for the number of patients in the dataset. Toxicity was documented prospectively according to RTOG. The median follow-up was 686 days (range 84-2921 days); 23/66 patients (35%) experienced AE grade ≥ 2. The actuarial local control rates were 72.6% and 59.4% at 2 and 3 years; regional control was 91% at both time points. The Lyman-MED model (D50 = 32.8 Gy, m = 0.48) and the cutoff dose model (Dc = 38 Gy) provided the most efficient fit to the current dataset. On multivariate analysis, V38 (the volume of the esophagus that receives 38 Gy or more; 95% CI 28.2-57.3) was the most significant predictor of AE grade ≥ 2 (HR = 1.05, CI 1.01-1.09, p = 0.007). Following high-dose accelerated radiotherapy, the rate of AE grade ≥ 2 is slightly lower than reported for concomitant radio-chemotherapy with the additional benefit of markedly
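The Lyman model and the AICc comparison named in this abstract can be written out in a few lines. The sketch below plugs in the reported Lyman-MED parameters (D50 = 32.8 Gy, m = 0.48) and treats the dose argument as a single scalar summary dose, which is a simplification of the full fitting procedure:

```python
from math import erf, sqrt

def lyman_ntcp(dose, d50, m):
    """Lyman probit model: NTCP = Phi((D - D50) / (m * D50))."""
    t = (dose - d50) / (m * d50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

def aicc(log_likelihood, k, n):
    """Corrected Akaike information criterion: AIC plus a small-sample
    penalty, for a model with k parameters fitted to n patients."""
    return 2 * k - 2 * log_likelihood + (2 * k * (k + 1)) / (n - k - 1)

# Reported Lyman-MED fit: D50 = 32.8 Gy, m = 0.48
print(round(lyman_ntcp(32.8, 32.8, 0.48), 2))  # 0.5 by construction at D50
```

The AICc penalty term shows why, at n = 66 patients, each extra model variable must buy a substantial likelihood gain to be worth keeping.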
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Efficient probability sequence
Regnier, Eva
2014-01-01
A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These results suggest tests for efficiency and ...
Efficient probability sequences
Regnier, Eva
2014-01-01
DRMI working paper. A probability sequence is an ordered set of probability forecasts for the same event. Although single-period probabilistic forecasts and methods for evaluating them have been extensively analyzed, we are not aware of any prior work on evaluating probability sequences. This paper proposes an efficiency condition for probability sequences and shows properties of efficient forecasting systems, including memorylessness and increasing discrimination. These res...
Philosophical theories of probability
Gillies, Donald
2000-01-01
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
Krieger, Martin
2000-04-01
The first derivation or solution of a hard problem in mathematical physics may well be lengthy, messy, and complicated, seemingly driven by the needs of calculation rather than physics. Eventually, more perspicuous solutions are found, and the original paper's insights and sources of difficulty become much clearer in retrospect, the length and messiness actually incorporating all sorts of interesting physics. I review the history of Onsager's 1944 solution to the Ising model of ferromagnetism, and Dyson and Lenard's 1967 proof of the stability of matter, and the history of subsequent clarifying work through alternative derivations and analyses of the original work, for the most part by others. There are perhaps two lessons: the technical details of the mathematical physics reveal the essential physics of these many-body systems; and the original papers repay reading by research physicists, even if their interest is not at all historical.
Energy Technology Data Exchange (ETDEWEB)
Filipy, R.E.; Borst, F.J.; Cross, F.T.; Park, J.F.; Moss, O.R.; Roswell, R.L.; Stevens, D.L.
1980-05-01
A mathematical model was constructed for the purpose of predicting the fraction of human population which would die within 1 year of an accidental exposure to airborne radionuclides. The model is based on data from laboratory experiments with rats, dogs and baboons, and from human epidemiological data. Doses from external, whole-body irradiation and from inhaled, alpha- and beta-emitting radionuclides are calculated for several organs. The probabilities of death from radiation pneumonitis and from bone marrow irradiation are predicted from doses accumulated within 30 days of exposure to the radioactive aerosol. The model is compared with existing similar models under hypothetical exposure conditions. Suggestions for further experiments with inhaled radionuclides are included. 25 refs., 16 figs., 13 tabs.
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
Subjective probabilities play a central role in many economic decisions, and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must...
Estimating Subjective Probabilities
DEFF Research Database (Denmark)
Andersen, Steffen; Fountain, John; Harrison, Glenn W.
2014-01-01
Subjective probabilities play a central role in many economic decisions and act as an immediate confound of inferences about behavior, unless controlled for. Several procedures to recover subjective probabilities have been proposed, but in order to recover the correct latent probability one must ...
Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia
We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.
Mourik, van S.; Braak, ter C.J.F.; Stigter, J.D.; Molenaar, J.
2014-01-01
Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate
A Model of Left Ventricular Dysfunction Complicated by CAWS Arteritis in DBA/2 Mice
Directory of Open Access Journals (Sweden)
Naoto Hirata
2012-01-01
It was reported previously that a Candida albicans water-soluble fraction (CAWS), including a mannoprotein and β-glucan complex, has strong potency in inducing fatal necrotizing arteritis in DBA/2 mice. In this study, histopathological changes and cardiac function were investigated in this system. One mg/day of CAWS was given to DBA/2 mice via peritoneal injection for five days. The CAWS-treated DBA/2 mice developed aortitis and died at an incidence of 100% within several weeks. Histological findings included stenosis in the left ventricular outflow tract (LVOT) and severe inflammatory changes of the aortic valve with fibrinoid necrosis. Cardiomegaly was observed, and heart weight increased 1.62-fold (p < 0.01). Echocardiography revealed a severe reduction in contractility and dilatation of the cavity in the left ventricle (LV): LV fractional shortening (LVFS) decreased from 71% to 38% (p < 0.01), and the LV end-diastolic diameter (LVDd) increased from 2.21 mm to 3.26 mm (p < 0.01). The titer of BNP mRNA increased in the CAWS-treated group. Severe inflammatory changes resulting from CAWS brought about lethal LV dysfunction by aortic valve deformation with LVOT stenosis. This system is proposed as an easy and useful experimental model of heart failure because CAWS arteritis can be induced by CAWS injection alone.
The Metabolic Syndrome and Microvascular Complications in a Murine Model of Type 2 Diabetes
Hur, Junguk; Dauch, Jacqueline R.; Hinder, Lucy M.; Hayes, John M.; Backus, Carey; Pennathur, Subramaniam; Kretzler, Matthias; Brosius, Frank C.
2015-01-01
To define the components of the metabolic syndrome that contribute to diabetic polyneuropathy (DPN) in type 2 diabetes mellitus (T2DM), we treated the BKS db/db mouse, an established murine model of T2DM and the metabolic syndrome, with the thiazolidinedione class drug pioglitazone. Pioglitazone treatment of BKS db/db mice produced a significant weight gain, restored glycemic control, and normalized measures of serum oxidative stress and triglycerides but had no effect on LDLs or total cholesterol. Moreover, although pioglitazone treatment normalized renal function, it had no effect on measures of large myelinated nerve fibers, specifically sural or sciatic nerve conduction velocities, but significantly improved measures of small unmyelinated nerve fiber architecture and function. Analyses of gene expression arrays of large myelinated sciatic nerves from pioglitazone-treated animals revealed an unanticipated increase in genes related to adipogenesis, adipokine signaling, and lipoprotein signaling, which likely contributed to the blunted therapeutic response. Similar analyses of dorsal root ganglion neurons revealed a salutary effect of pioglitazone on pathways related to defense and cytokine production. These data suggest differential susceptibility of small and large nerve fibers to specific metabolic impairments associated with T2DM and provide the basis for discussion of new treatment paradigms for individuals with T2DM and DPN. PMID:25979075
Measure, integral and probability
Capiński, Marek
2004-01-01
Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.
Smith, O. E.; Adelfang, S. I.; Tubbs, J. D.
1982-01-01
A five-parameter bivariate gamma distribution (BGD) having two shape parameters, two location parameters, and a correlation parameter is investigated. This general BGD is expressed as a double series and as a single series of the modified Bessel function. It reduces to the known special case for equal shape parameters. Practical functions for computer evaluation of the general BGD and of special cases are presented. Applications to wind gust modeling for the ascent flight of the space shuttle are illustrated.
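Correlated gamma pairs of the broad kind this abstract studies can be simulated without the series expansion. The sketch below uses the standard trivariate-reduction construction, which is an illustrative stand-in, not the paper's Bessel-series BGD, and omits the location parameters:

```python
import random

def correlated_gamma_pairs(shared, s1, s2, n, seed=3):
    """Trivariate reduction: X = G0 + G1 and Y = G0 + G2 share the gamma
    component G0, giving a correlated bivariate gamma pair (unit scale) with
    correlation shared / sqrt((shared + s1) * (shared + s2))."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        g0 = rng.gammavariate(shared, 1.0)
        pairs.append((g0 + rng.gammavariate(s1, 1.0),
                      g0 + rng.gammavariate(s2, 1.0)))
    return pairs

# Shape 2 shared, shape 1 idiosyncratic on each margin: correlation 2/3
pairs = correlated_gamma_pairs(2.0, 1.0, 1.0, 5000)
```

Each margin here is gamma with shape `shared + s_i`, so the construction covers unequal shape parameters, the case the paper's general series addresses analytically.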
Directory of Open Access Journals (Sweden)
Simon van Mourik
2014-06-01
Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and the subsequent uncertainty analysis using such a sample is supplied as Supplemental Information.
van Mourik, Simon; Ter Braak, Cajo; Stigter, Hans; Molenaar, Jaap
2014-01-01
Multi-parameter models in systems biology are typically 'sloppy': some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and the subsequent uncertainty analysis using such a sample, is supplied as Supplemental Information.
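The per-prediction uncertainty analysis this abstract advocates can be sketched with a toy stand-in for the MCMC sample. The two-parameter model and its 'sloppy' posterior below are invented for illustration: the data constrain the sum a + b tightly but leave the difference a - b nearly free, so one prediction is precise while another is not:

```python
import random
import statistics

random.seed(0)
# Stand-in for an MCMC posterior sample of a 'sloppy' two-parameter model:
# the sum a + b is pinned down to 2 +/- 0.01, the difference a - b is not.
ensemble = [(a, 2.0 - a + random.gauss(0.0, 0.01))
            for a in (random.gauss(1.0, 0.5) for _ in range(5000))]

def prediction_sd(pred):
    """Push the whole parameter ensemble through one prediction and report
    its spread -- uncertainty is assessed per prediction, not per parameter."""
    return statistics.stdev(pred(a, b) for (a, b) in ensemble)

tight = prediction_sd(lambda a, b: a + b)   # well constrained by the data
loose = prediction_sd(lambda a, b: a - b)   # barely constrained at all
print(tight < 0.1 < loose)
```

Both predictions use the same uncertain parameters, yet their uncertainties differ by two orders of magnitude, which is exactly why a single parameter-level error bar is misleading.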
Oxygen boundary crossing probabilities.
Busch, N A; Silver, I A
1987-01-01
The probability that an oxygen particle will reach a time dependent boundary is required in oxygen transport studies involving solution methods based on probability considerations. A Volterra integral equation is presented, the solution of which gives directly the boundary crossing probability density function. The boundary crossing probability is the probability that the oxygen particle will reach a boundary within a specified time interval. When the motion of the oxygen particle may be described as strongly Markovian, then the Volterra integral equation can be rewritten as a generalized Abel equation, the solution of which has been widely studied.
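The boundary crossing probability described here can also be estimated by direct simulation, which gives a useful check on the integral-equation solution. The sketch below is a Monte Carlo stand-in, not the Volterra/Abel method of the abstract, and models the particle as a simple driftless diffusion:

```python
import math
import random

def crossing_probability(boundary, sigma=1.0, horizon=1.0,
                         n_steps=200, n_paths=5000, seed=42):
    """Monte Carlo estimate of the probability that a diffusing particle
    started at 0 reaches the time-dependent boundary b(t) within
    [0, horizon]. Discretization lets paths step over the boundary unseen,
    so the estimate is slightly biased low."""
    rng = random.Random(seed)
    dt = horizon / n_steps
    hits = 0
    for _ in range(n_paths):
        x = 0.0
        for i in range(1, n_steps + 1):
            x += sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            if x >= boundary(i * dt):
                hits += 1
                break
    return hits / n_paths

# Constant boundary at 1.0: the exact answer is 2 * (1 - Phi(1)) ~ 0.317
print(crossing_probability(lambda t: 1.0))
```

Passing a function `boundary(t)` rather than a constant is what accommodates the time-dependent boundaries the abstract is concerned with.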
In All Probability, Probability is not All
Helman, Danny
2004-01-01
The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.
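The distinction the abstract draws between probability and expectation is easy to make concrete. In the sketch below the jackpot size and number-popularity figures are hypothetical, but the arithmetic is the point: the winning probability is fixed, while the expected value moves with prize sharing:

```python
from fractions import Fraction
from math import comb

# 6-of-49 lottery: the jackpot probability is fixed at 1 / C(49, 6), so no
# choice of numbers changes the chance of winning.
p_jackpot = Fraction(1, comb(49, 6))

def ticket_ev(jackpot, expected_sharers, price):
    """Expected value of one ticket, jackpot tier only (lower tiers omitted)."""
    return float(p_jackpot * Fraction(jackpot, expected_sharers)) - price

# Picking unpopular numbers lowers the expected number of co-winners,
# raising the expectation even though the probability is unchanged.
print(ticket_ev(10_000_000, 4, 1.0) < ticket_ev(10_000_000, 1, 1.0))
```

This is the "room for strategy" the abstract points to: strategy operates on the expectation term, not on the probability term.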
Lammers, Rianne J M; Hendriks, Jan C M; Rodriguez Faba, O; Witjes, Wim P J; Palou, Joan; Witjes, J Alfred
2016-02-01
To develop a model to predict recurrence for patients with intermediate-risk (IR) non-muscle-invasive bladder cancer (NMIBC) treated with intravesical chemotherapy, which can be challenging because of the heterogeneous characteristics of these patients. Data from three Dutch trials were combined. Patients treated with intravesical chemotherapy with characteristics according to the IR definition of the EAU guideline 2013 were included. Uni- and multivariable Cox regression with selection methods were used to identify predictors of recurrence at 1, 2, and 5 years. An easy-to-read table for recurrence probabilities was developed. An external validation was done using data from Spanish patients. A total of 724 patients were available for analyses, of which 305 were primary patients. Recurrences occurred in 413 patients (57%). History of recurrences, history of intravesical treatment, grade 2, multiple tumors, and adjuvant treatment with epirubicin were relevant predictors for recurrence-free survival with hazard ratios of 1.48, 1.38, 1.22, 1.56, and 1.27, respectively. A table for recurrence probabilities was developed using these five predictors. Based on the probability of recurrence, three risk groups were identified. Patients in each of the separate risk groups should be scheduled for less or more aggressive treatment. The model showed sufficient discrimination and good predictive accuracy. External validation showed good validity. In our model, we identified five relevant predictors for recurrence-free survival in IR-NMIBC patients treated with intravesical chemotherapy. These recurrence predictors allow the urologists to stratify patients in risk groups for recurrence that could help in deciding for an individualized treatment approach.
Energy Technology Data Exchange (ETDEWEB)
Galiano, G.; Grau, A.
1994-07-01
An intelligent computer program has been developed to obtain the mathematical formulae to compute the probabilities and reduced energies of the different atomic rearrangement pathways following electron-capture decay. Creation and annihilation operators for Auger and X-ray processes have been introduced. Taking into account the symmetries associated with each process, 262 different pathways were obtained. This model allows us to obtain the influence of M-electron capture on the counting efficiency when the atomic number of the nuclide is high. (Author)
Annambhotla, Pallavi D; Gurbaxani, Brian M; Kuehnert, Matthew J; Basavaraju, Sridhar V
2017-04-01
In 2013, guidelines were released for reducing the risk of viral bloodborne pathogen transmission through organ transplantation. Eleven criteria were described that result in a donor being designated at increased infectious risk. Human immunodeficiency virus (HIV) and hepatitis C virus (HCV) transmission risk from an increased-risk donor (IRD), despite negative nucleic acid testing (NAT), likely varies based on behavior type and timing. We developed a Monte Carlo risk model to quantify the probability of HIV among IRDs. The model included NAT performance, viral load dynamics, and the per-act risk of acquiring HIV by each behavior. The model also quantifies the probability of HCV among IRDs by non-medical intravenous drug use (IVDU). Highest risk is among donors with a history of unprotected, receptive anal male-to-male intercourse with a partner of unknown HIV status (MSM), followed by sex with an HIV-infected partner, IVDU, and sex with a commercial sex worker. With NAT screening, the estimated risk of undetected HIV remains small even at 1 day following a risk behavior. The estimated risk for HCV transmission through IVDU is likewise small and decreases more quickly with time owing to the faster viral growth dynamics of HCV compared with HIV. These findings may allow for improved organ allocation, utilization, and recipient informed consent. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
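The core of such a Monte Carlo risk model is the joint event "infection occurred AND NAT is still negative at donation". The sketch below is a heavily simplified stand-in: the per-act risk value and the uniform 5-10 day eclipse period are hypothetical placeholders, where the actual model uses behavior-specific risks and viral-load growth dynamics:

```python
import random

def undetected_risk(per_act_risk, days_since_behavior,
                    n_trials=200_000, seed=7):
    """Monte Carlo sketch: probability that a single risk act both
    transmitted infection AND the NAT eclipse period (drawn per trial from a
    hypothetical 5-10 day range) has not ended by the time of donation."""
    rng = random.Random(seed)
    misses = 0
    for _ in range(n_trials):
        if rng.random() >= per_act_risk:
            continue                      # the act did not transmit
        eclipse = rng.uniform(5.0, 10.0)  # days until NAT turns positive
        if days_since_behavior < eclipse:
            misses += 1                   # infected but still NAT-negative
    return misses / n_trials

# Risk shrinks as more time elapses between the behavior and donation
print(undetected_risk(0.01, days_since_behavior=1) >
      undetected_risk(0.01, days_since_behavior=9))
```

Swapping the uniform eclipse draw for a pathogen-specific viral-growth model is what produces the HIV-versus-HCV difference the abstract reports: faster viral growth means a shorter window and a faster-decaying risk.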
Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong
2013-01-01
As a useful tool for geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes was developed by Kulldorff, based on the Bernoulli or the Poisson probability model. In this paper, we apply the hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis is an alternative and indirect way to identify the potential cluster, and the test statistic is the extreme value of the likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test for the test of significance. Both methods are applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation with independent benchmark data indicates that the test statistic based on the hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
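The hypergeometric building block of this scan statistic is a one-liner. The sketch below evaluates it for a single candidate window with invented toy counts; a full scan would take the extreme value over all candidate windows and assess significance by a Monte Carlo test, as the abstract describes:

```python
from math import comb

def hypergeom_likelihood(k, n, K, N):
    """P(k cases in a scanning window of n subjects | K cases among N total)
    under the null of no clustering -- the hypergeometric pmf."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Toy data: 50 cases among 1000 subjects; one window of 100 subjects holds
# 30 cases, while the null expects about n * K / N = 5.
N, K, n = 1000, 50, 100
print(hypergeom_likelihood(30, n, K, N) < hypergeom_likelihood(5, n, K, N))
```

A window whose case count sits far from its null expectation gets a tiny null likelihood, which is what flags it as a potential cluster.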
Singh, D D; Saikrishnan, K; Kumar, Prashant; Surolia, A; Sekar, K; Vijayan, M
2005-10-01
The crystal structure of a complex of methyl-alpha-D-mannoside with banana lectin from Musa paradisiaca reveals two primary binding sites in the lectin, unlike in other lectins with beta-prism I fold which essentially consists of three Greek key motifs. It has been suggested that the fold evolved through successive gene duplication and fusion of an ancestral Greek key motif. In other lectins, all from dicots, the primary binding site exists on one of the three motifs in the three-fold symmetric molecule. Banana is a monocot, and the three motifs have not diverged enough to obliterate sequence similarity among them. Two Greek key motifs in it carry one primary binding site each. A common secondary binding site exists on the third Greek key. Modelling shows that both the primary sites can support 1-2, 1-3, and 1-6 linked mannosides with the second residue interacting in each case primarily with the secondary binding site. Modelling also readily leads to a bound branched mannopentose with the nonreducing ends of the two branches anchored at the two primary binding sites, providing a structural explanation for the lectin's specificity for branched alpha-mannans. A comparison of the dimeric banana lectin with other beta-prism I fold lectins, provides interesting insights into the variability in their quaternary structure.
Predicting Cumulative Incidence Probability by Direct Binomial Regression
DEFF Research Database (Denmark)
Scheike, Thomas H.; Zhang, Mei-Jie
Binomial modelling; cumulative incidence probability; cause-specific hazards; subdistribution hazard
Curreri, Peter A.
2010-01-01
Two contemporary issues foretell a shift from our historical Earth-based industrial economy and habitation to a solar system based society. The first is the limits to Earth's carrying capacity, that is, the maximum number of people that the Earth can support before a catastrophic impact to the health of the planet and human species occurs. The simple example of carrying capacity is that of a bacterial colony in a Petri dish with a limited amount of nutrient. The colony experiences exponential population growth until the carrying capacity is reached, after which catastrophic depopulation often results. Estimates of the Earth's carrying capacity vary between 14 and 40 billion people. Although at current population growth rates we may have over a century before we reach Earth's carrying limit, our influence on climate and resources on the planetary scale is becoming scientifically established. The second issue is the exponential growth of knowledge and technological power. The exponential growth of technology interacts with the exponential growth of population in a manner that is unique to a highly intelligent species. Thus, the predicted consequences (world famines, etc.) of the limits to growth have been largely avoided due to technological advances. However, at the mid twentieth century a critical coincidence occurred in these two trends: humanity obtained the technological ability to extinguish life on the planetary scale (by nuclear, chemical, or biological means) and attained the ability to expand human life beyond Earth. This paper examines an optimized O'Neill/Glaser model (O'Neill 1975; Curreri 2007; Detweiler and Curreri 2008) for the economic human population of space. Critical to this model is the utilization of extraterrestrial resources, solar power, and space-based labor. A simple statistical analysis is then performed which predicts the robustness of a single-planet-based technological society versus that of a multiple-world (independent habitats) society.
DEFF Research Database (Denmark)
Casares Magaz, Oscar; Van der Heide, Uulke A; Rørvik, Jarle
2016-01-01
Materials and methods: ADC maps in a series of 20 prostate cancer patients were applied to estimate the initial number of cells within each voxel, using three different approaches for the relation between ADC values and cell density: a linear, a binary and a sigmoid relation. All TCP models were based on linear-quadratic cell survival curves assuming α/β = 1.93 Gy (consistent with a recent meta-analysis) and α set to obtain 70% TCP when 77 Gy was delivered to the entire prostate in 35 fractions (α = 0.18 Gy⁻¹). Results: Overall, TCP curves based on apparent diffusion coefficient (ADC) maps-based cell density distributions showed larger differences between individuals than those assuming uniform cell densities. The range of the dose required to reach 50% TCP across the patient cohort was 20.1 Gy, 18.7 Gy and 13.2 Gy using an MRI-based voxel density (linear, binary and sigmoid approach, respectively).
Christensen, Jette; Stryhn, Henrik; Vallières, André; El Allaki, Farouk
2011-05-01
In 2008, Canada designed and implemented the Canadian Notifiable Avian Influenza Surveillance System (CanNAISS) with six surveillance activities in a phased-in approach. CanNAISS was a surveillance system because it had more than one surveillance activity or component in 2008: passive surveillance; pre-slaughter surveillance; and voluntary enhanced notifiable avian influenza surveillance. Our objectives were to give a short overview of two active surveillance components in CanNAISS; describe the CanNAISS scenario tree model and its application to estimation of probability of populations being free of NAI virus infection and sample size determination. Our data from the pre-slaughter surveillance component included diagnostic test results from 6296 serum samples representing 601 commercial chicken and turkey farms collected from 25 August 2008 to 29 January 2009. In addition, we included data from a sub-population of farms with high biosecurity standards: 36,164 samples from 55 farms sampled repeatedly over the 24 months study period from January 2007 to December 2008. All submissions were negative for Notifiable Avian Influenza (NAI) virus infection. We developed the CanNAISS scenario tree model, so that it will estimate the surveillance component sensitivity and the probability of a population being free of NAI at the 0.01 farm-level and 0.3 within-farm-level prevalences. We propose that a general model, such as the CanNAISS scenario tree model, may have a broader application than more detailed models that require disease specific input parameters, such as relative risk estimates. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
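The freedom-from-infection arithmetic behind a scenario tree model like CanNAISS can be sketched in a few lines. The design prevalences (1% of farms, 30% within a farm) come from the abstract; the samples-per-farm count, test sensitivity, and prior are hypothetical placeholders:

```python
def herd_sensitivity(n_samples, within_prev, test_se):
    """P(at least one positive sample | the farm is infected at within_prev)."""
    p_detect_one = within_prev * test_se
    return 1.0 - (1.0 - p_detect_one) ** n_samples

def system_sensitivity(n_farms, farm_prev, herd_se):
    """P(surveillance finds at least one infected farm | farm_prev infected)."""
    return 1.0 - (1.0 - farm_prev * herd_se) ** n_farms

hse = herd_sensitivity(n_samples=10, within_prev=0.3, test_se=0.95)
sse = system_sensitivity(n_farms=601, farm_prev=0.01, herd_se=hse)

# Posterior probability of freedom after all-negative surveillance,
# for a (hypothetical) prior P(infected) = 0.5
prior = 0.5
p_free = (1 - prior) / (1 - prior * sse)
print(round(hse, 3), round(sse, 3), round(p_free, 3))
```

Inverting `system_sensitivity` for a target probability of freedom is the sample size determination the abstract mentions: solve for the number of farms (or samples) needed to push the component sensitivity high enough.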
Holbrook, Christopher M.; Johnson, Nicholas S.; Steibel, Juan P.; Twohey, Michael B.; Binder, Thomas R.; Krueger, Charles C.; Jones, Michael L.
2014-01-01
Improved methods are needed to evaluate barriers and traps for control and assessment of invasive sea lamprey (Petromyzon marinus) in the Great Lakes. A Bayesian state-space model provided reach-specific probabilities of movement, including trap capture and dam passage, for 148 acoustic tagged invasive sea lamprey in the lower Cheboygan River, Michigan, a tributary to Lake Huron. Reach-specific movement probabilities were combined to obtain estimates of spatial distribution and abundance needed to evaluate a barrier and trap complex for sea lamprey control and assessment. Of an estimated 21 828 – 29 300 adult sea lampreys in the river, 0%–2%, or 0–514 untagged lampreys, could have passed upstream of the dam, and 46%–61% were caught in the trap. Although no tagged lampreys passed above the dam (0/148), our sample size was not sufficient to consider the lock and dam a complete barrier to sea lamprey. Results also showed that existing traps are in good locations because 83%–96% of the population was vulnerable to existing traps. However, only 52%–69% of lampreys vulnerable to traps were caught, suggesting that traps can be improved. The approach used in this study was a novel use of Bayesian state-space models that may have broader applications, including evaluation of barriers for other invasive species (e.g., Asian carp (Hypophthalmichthys spp.)) and fish passage structures for other diadromous fishes.
Choice Probability Generating Functions
DEFF Research Database (Denmark)
Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel
This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
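The gradient property of a CPGF is easy to verify numerically in the best-known special case: for the logit model, the CPGF is the log-sum-exp of the utilities, and its gradient is the softmax choice probabilities. A minimal check (the utility values are arbitrary illustrations):

```python
from math import exp, log

def cpgf(v):
    """CPGF of the logit ARUM: G(v) = log(sum_j exp(v_j))."""
    return log(sum(exp(vj) for vj in v))

def grad_cpgf(v, eps=1e-6):
    """Central-difference gradient of G -- recovers the choice probabilities."""
    g = []
    for j in range(len(v)):
        vp, vm = list(v), list(v)
        vp[j] += eps
        vm[j] -= eps
        g.append((cpgf(vp) - cpgf(vm)) / (2 * eps))
    return g

v = [1.0, 0.0, -1.0]
probs = grad_cpgf(v)
print([round(p, 3) for p in probs])  # softmax of v; the entries sum to 1
```

Other CPGFs (nested logit, multivariate extreme value) follow the same pattern: one scalar generating function whose partial derivatives are the alternatives' choice probabilities.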
Directory of Open Access Journals (Sweden)
Laktineh Imad
2010-04-01
Full Text Available This course constitutes a brief introduction to probability applications in high energy physics. First, the mathematical tools related to the different probability concepts are introduced. The probability distributions that are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally, some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
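Two of the topics above, numerical generation of probability distributions and the central limit theorem, can be illustrated together: inverse-transform sampling turns uniform variates into exponential ones, and standardized means of many such draws are approximately standard normal. A small sketch (the sample sizes and seed are arbitrary choices, not from the lecture):

```python
import math
import random

random.seed(0)

def exponential_inverse_cdf(u, rate=1.0):
    """Inverse-transform sampling: if U ~ Uniform(0, 1), then
    -ln(1 - U) / rate is Exponential(rate) distributed."""
    return -math.log(1.0 - u) / rate

# Standardized means of n exponential draws; by the central limit theorem
# these are approximately standard normal for large n.
n, reps, rate = 200, 2000, 1.0
means = []
for _ in range(reps):
    total = sum(exponential_inverse_cdf(random.random(), rate) for _ in range(n))
    # Exponential(rate) has mean 1/rate and standard deviation 1/rate.
    means.append((total / n - 1.0 / rate) / ((1.0 / rate) / math.sqrt(n)))

inside = sum(1 for z in means if abs(z) <= 1.96) / reps
print(f"fraction within +/-1.96: {inside:.3f}")  # close to the normal value 0.95
```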
Florescu, Ionut
2013-01-01
THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Directory of Open Access Journals (Sweden)
Helli Merica
Full Text Available Little attention has gone into linking the dynamic structure of non-rapid-eye-movement (NREM) sleep, defined as the pattern of time-course power in all frequency bands across an entire episode, to its neuronal substrates. Using the spectral power time-courses in the sleep electroencephalogram (EEG), we showed in the typical first episode several moves towards-and-away from deep sleep, each having an identical pattern linking the major frequency bands beta, sigma and delta. The neuronal transition probability model (NTP), in fitting the data well, successfully explained the pattern as resulting from stochastic transitions of the firing rates of the thalamically projecting brainstem-activating neurons, alternating between two steady dynamic states (towards and away from deep sleep), each initiated by a so-far unidentified flip-flop. The aims here are to identify this flip-flop and to demonstrate that the model fits all NREM episodes well, not just the first. Using published data on suprachiasmatic nucleus (SCN) activity, we show that the SCN has the information required to provide a threshold-triggered flip-flop for TIMING the towards-and-away alternations, information provided by sleep-relevant feedback to the SCN. NTP then determines the PATTERN of spectral power within each dynamic state. NTP was fitted to individual NREM episodes 1-4, using data from 30 healthy subjects aged 20-30 years, and the quality of fit for each episode was measured. We show that the model fits all NREM episodes well and that the best-fit probability set is effectively the same in fitting all subject data. The significant model-data agreement, the constant probability parameter and the proposed role of the SCN add considerable strength to the model. With it we link, for the first time, findings at the cellular level and detailed time-course data at the EEG level, giving a coherent picture of NREM dynamics over the entire night and over hierarchic brain levels all the way from the SCN
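The towards-and-away alternation described above can be caricatured as a two-state Markov chain whose flip-flop is a fixed per-step switching probability. The sketch below uses purely illustrative probabilities, not the fitted NTP parameters:

```python
import random

random.seed(1)

# Hypothetical two-state caricature: state 0 = "towards deep sleep",
# state 1 = "away from deep sleep". The per-step flip probabilities are
# illustrative values, not fitted NTP parameters.
P_SWITCH = {0: 0.02, 1: 0.05}

def simulate(steps):
    state, path = 0, []
    for _ in range(steps):
        path.append(state)
        if random.random() < P_SWITCH[state]:
            state = 1 - state        # the "flip-flop": switch dynamic state
    return path

path = simulate(10_000)
frac_towards = path.count(0) / len(path)
# Stationary occupancy of state 0 is 0.05 / (0.02 + 0.05), about 0.71.
print(f"time in 'towards' state: {frac_towards:.3f}")
```

The simulated occupancy hovers near the analytic stationary value, illustrating how a constant probability set can generate the alternating dynamic states the model describes.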
Early bowel complications following radiotherapy of uterine cervix carcinoma
Energy Technology Data Exchange (ETDEWEB)
Kim, Won Dong [College of Medicine, Chungbuk National Univ., Cheongju (Korea, Republic of)
1999-06-01
This study evaluated early bowel complications in cervix cancer patients following external radiotherapy (ERT) and high dose rate intracavitary radiation (HDR ICR). Factors affecting the risk of developing early bowel complications and its incidence are analyzed and discussed. The study is a retrospective review of 66 patients who received radiotherapy at Chungbuk National University Hospital from April 1994 to December 1998. The patients underwent 41.4 or 50.4 Gy ERT according to FIGO stage and tumor size, then the A point dose was boosted to 71.4 or 74.4 Gy using a remotely controlled afterloading Buchler HDR ICR. The EORTC/RTOG morbidity criteria were used to grade early bowel complications, which are valid from day 1, the commencement of therapy, through day 90. The actuarial incidence and severity of complications were investigated, and clinical pretreatment factors relevant to complications were identified through univariate (Wilcoxon) and multivariate (Cox proportional hazard model) analysis. Of the 66 patients, 30 patients (46%) developed early bowel complications; 25 patients (38%) with grade 1 or 2, 4 patients (6%) with grade 3 and 1 patient (2%) with grade 4. The complications usually began to occur 3 weeks after the commencement of radiotherapy. The actuarial incidence of early bowel complications was 41% at 10 weeks. Early bowel complications were significantly associated with old age and a history of previous abdominopelvic surgery. All three patients who had a protracted overall treatment time (about 2 weeks) due to severe bowel complications suffered pelvic recurrences. Forty-six percent of patients experienced early bowel complications, most of which were grade 1 or 2 and relieved spontaneously or by medication. Patients with old age or previous surgery have a high probability of early complications and may be less compliant with planned radiotherapy, so more careful precaution is necessary for these patients.
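The "actuarial incidence of 41% at 10 weeks" quoted above comes from a life-table calculation: cumulative incidence is one minus the product of the per-interval probabilities of remaining complication-free. A minimal sketch with hypothetical weekly counts (not the paper's data):

```python
def actuarial_incidence(events, at_risk):
    """Life-table (actuarial) cumulative incidence.

    events[i]  - complications first occurring in interval i
    at_risk[i] - patients under observation at the start of interval i
    Returns the cumulative incidence at the end of each interval.
    """
    surv, cumulative = 1.0, []
    for d, n in zip(events, at_risk):
        surv *= 1.0 - d / n     # conditional probability of staying event-free
        cumulative.append(1.0 - surv)
    return cumulative

# Hypothetical weekly counts for 66 patients over 10 weeks (illustrative only).
events  = [0, 0, 5, 8, 6, 4, 3, 2, 1, 1]
at_risk = [66, 66, 66, 61, 53, 47, 43, 40, 38, 37]
inc = actuarial_incidence(events, at_risk)
print(f"cumulative incidence at week 10: {inc[-1]:.2f}")  # 30/66, i.e. 0.45
```

With no censoring, as in this toy example, the life-table estimate reduces to the crude proportion 30/66; censoring is exactly what makes the actuarial estimate differ from the raw percentage.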
Directory of Open Access Journals (Sweden)
Yadira Chinique de Armas
Full Text Available The general lack of well-preserved juvenile skeletal remains from Caribbean archaeological sites has, in the past, prevented evaluations of juvenile dietary changes. Canímar Abajo (Cuba), with a large number of well-preserved juvenile and adult skeletal remains, provided a unique opportunity to fully assess juvenile paleodiets from an ancient Caribbean population. Ages for the start and the end of weaning and possible food sources used for weaning were inferred by combining the results of two Bayesian probability models that help to reduce some of the uncertainties inherent to bone collagen isotope based paleodiet reconstructions. Bone collagen (31 juveniles, 18 adult females) was used for carbon and nitrogen isotope analyses. The isotope results were assessed using two Bayesian probability models: Weaning Ages Reconstruction with Nitrogen isotopes and Stable Isotope Analyses in R. Breast milk seems to have been the most important protein source until two years of age with some supplementary food such as tropical fruits and root cultigens likely introduced earlier. After two, juvenile diets were likely continuously supplemented by starch rich foods such as root cultigens and legumes. By the age of three, the model results suggest that the weaning process was completed. Additional indications suggest that animal marine/riverine protein and maize, while part of the Canímar Abajo female diets, were likely not used to supplement juvenile diets. The combined use of both models here provided a more complete assessment of the weaning process for an ancient Caribbean population, indicating not only the start and end ages of weaning but also the relative importance of different food sources for different age juveniles.
Chinique de Armas, Yadira; Roksandic, Mirjana; Nikitović, Dejana; Rodríguez Suárez, Roberto; Smith, David; Kanik, Nadine; García Jordá, Dailys; Buhay, William M
2017-01-01
The general lack of well-preserved juvenile skeletal remains from Caribbean archaeological sites has, in the past, prevented evaluations of juvenile dietary changes. Canímar Abajo (Cuba), with a large number of well-preserved juvenile and adult skeletal remains, provided a unique opportunity to fully assess juvenile paleodiets from an ancient Caribbean population. Ages for the start and the end of weaning and possible food sources used for weaning were inferred by combining the results of two Bayesian probability models that help to reduce some of the uncertainties inherent to bone collagen isotope based paleodiet reconstructions. Bone collagen (31 juveniles, 18 adult females) was used for carbon and nitrogen isotope analyses. The isotope results were assessed using two Bayesian probability models: Weaning Ages Reconstruction with Nitrogen isotopes and Stable Isotope Analyses in R. Breast milk seems to have been the most important protein source until two years of age with some supplementary food such as tropical fruits and root cultigens likely introduced earlier. After two, juvenile diets were likely continuously supplemented by starch rich foods such as root cultigens and legumes. By the age of three, the model results suggest that the weaning process was completed. Additional indications suggest that animal marine/riverine protein and maize, while part of the Canímar Abajo female diets, were likely not used to supplement juvenile diets. The combined use of both models here provided a more complete assessment of the weaning process for an ancient Caribbean population, indicating not only the start and end ages of weaning but also the relative importance of different food sources for different age juveniles.
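Grid approximation gives a feel for how an isotope-based Bayesian inference of this kind works. The toy sketch below infers a breast-milk fraction from a single delta-15N value, assuming a normal likelihood and a uniform prior; the enrichment, noise, and maternal baseline values are illustrative assumptions, not the models or data used in the study:

```python
import math

# Toy grid-approximation posterior for the breast-milk fraction f of an
# infant's protein intake, using an assumed ~2.5 permil trophic enrichment
# of delta-15N during nursing. All constants are illustrative.
MATERNAL_D15N = 9.0   # assumed maternal bone-collagen delta-15N (permil)
ENRICHMENT = 2.5      # assumed nursing enrichment (permil)
SIGMA = 0.4           # assumed measurement/model noise (permil)

def posterior_milk_fraction(observed_d15n, grid_size=101):
    """Posterior over f in [0, 1] given one observed infant delta-15N,
    under infant_d15N ~ Normal(MATERNAL_D15N + f * ENRICHMENT, SIGMA)
    with a uniform prior on f."""
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    like = [math.exp(-0.5 * ((observed_d15n - (MATERNAL_D15N + f * ENRICHMENT)) / SIGMA) ** 2)
            for f in grid]
    z = sum(like)
    return grid, [l / z for l in like]

grid, post = posterior_milk_fraction(10.8)   # hypothetical infant value (permil)
f_mean = sum(f * p for f, p in zip(grid, post))
print(f"posterior mean milk fraction: {f_mean:.2f}")
```

A sequence of such posteriors across age classes is, in spirit, how isotope models locate the start and end of weaning: the inferred milk fraction declines from near one to near zero.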
Probability, Statistics, and Computational Science
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient infe...
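As an example of the hidden Markov machinery mentioned above, the forward algorithm computes the likelihood of an observation sequence in time linear in its length by marginalizing over state paths. A minimal sketch with toy parameters (the model values are invented for illustration):

```python
def forward_likelihood(obs, init, trans, emit):
    """Likelihood P(obs) for a discrete HMM via the forward algorithm.

    init[s]     - initial state probabilities
    trans[s][t] - transition probability from state s to state t
    emit[s][o]  - probability that state s emits symbol o
    """
    n_states = len(init)
    alpha = [init[s] * emit[s][obs[0]] for s in range(n_states)]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * trans[s][t] for s in range(n_states)) * emit[t][o]
                 for t in range(n_states)]
    return sum(alpha)

# Toy two-state example with two observable symbols, 0 and 1.
init  = [0.5, 0.5]
trans = [[0.9, 0.1], [0.2, 0.8]]
emit  = [[0.7, 0.3], [0.4, 0.6]]
print(forward_likelihood([0, 1, 0], init, trans, emit))
```

For a sequence of length T over N states the forward recursion costs O(T * N^2), versus O(N^T) for brute-force enumeration of paths, which is why it is the workhorse of HMM applications in genomics.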
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski), classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables), and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category, and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerate (pure) state to a non-degenerate one, a quantum phenomenon; dually, an observable can map a crisp random event to a genuine fuzzy random event, a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide, and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
Difficulties related to Probabilities
Rosinger, Elemer Elad
2010-01-01
Probability theory is often used as if it had the same ontological status as, for instance, Euclidean geometry or Peano arithmetic. In this regard, several highly questionable aspects of probability theory are mentioned, which have earlier been presented in two arXiv papers.
Indian Academy of Sciences (India)
casinos and gambling houses? How does one interpret a statement like "there is a 30 per cent chance of rain tonight" - a statement we often hear on the news? Such questions arise in the mind of every student when she/he is taught probability as part of mathematics. Many students who go on to study probability and ...
Elements of quantum probability
Kummerer, B.; Maassen, H.
1996-01-01
This is an introductory article presenting some basic ideas of quantum probability. From a discussion of simple experiments with polarized light and a card game we deduce the necessity of extending the body of classical probability theory. For a class of systems, containing classical systems with
Freund, John E
1993-01-01
Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.
Frič, Roman; Papčo, Martin
2017-06-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski), classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables), and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category, and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerate (pure) state to a non-degenerate one, a quantum phenomenon; dually, an observable can map a crisp random event to a genuine fuzzy random event, a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide, and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.