WorldWideScience

Sample records for regression cure model

  1. Semiparametric accelerated failure time cure rate mixture models with competing risks.

    Science.gov (United States)

    Choi, Sangbum; Zhu, Liang; Huang, Xuelin

    2018-01-15

    Modern medical treatments have substantially improved survival rates for many chronic diseases and have generated considerable interest in developing cure fraction models for survival data with a non-ignorable cured proportion. Statistical analysis of such data may be further complicated by competing risks that involve multiple types of endpoints. Regression analysis of competing risks is typically undertaken via a proportional hazards model adapted on cause-specific hazard or subdistribution hazard. In this article, we propose an alternative approach that treats competing events as distinct outcomes in a mixture. We consider semiparametric accelerated failure time models for the cause-conditional survival function that are combined through a multinomial logistic model within the cure-mixture modeling framework. The cure-mixture approach to competing risks provides a means to determine the overall effect of a treatment and insights into how this treatment modifies the components of the mixture in the presence of a cure fraction. The regression and nonparametric parameters are estimated by a nonparametric kernel-based maximum likelihood estimation method. Variance estimation is achieved through resampling methods for the kernel-smoothed likelihood function. Simulation studies show that the procedures work well in practical settings. Application to a sarcoma study demonstrates the use of the proposed method for competing risk data with a cure fraction. Copyright © 2017 John Wiley & Sons, Ltd.

  2. A nonparametric mixture model for cure rate estimation.

    Science.gov (United States)

    Peng, Y; Dear, K B

    2000-03-01

    Nonparametric methods have attracted less attention than their parametric counterparts for cure rate analysis. In this paper, we study a general nonparametric mixture model. The proportional hazards assumption is employed in modeling the effect of covariates on the failure time of patients who are not cured. The EM algorithm, the marginal likelihood approach, and multiple imputations are employed to estimate parameters of interest in the model. This model extends models and improves estimation methods proposed by other researchers. It also extends Cox's proportional hazards regression model by allowing a proportion of event-free patients and investigating covariate effects on that proportion. The model and its estimation method are investigated by simulations. An application to breast cancer data, including comparisons with previous analyses using a parametric model and an existing nonparametric model by other researchers, confirms the conclusions from the parametric model but not those from the existing nonparametric model.
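
    A minimal sketch of the mixture cure structure that the EM algorithm targets, assuming a toy parametric special case (exponential latency, no covariates) rather than the paper's nonparametric proportional hazards latency; the closed-form M-step updates below belong to this simplified version only:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulate data from a mixture cure model: a fraction pi is cured (never fails),
    # the rest fail with exponential rate lam; administrative censoring at t = 8.
    n, pi_true, lam_true, cens = 2000, 0.3, 0.4, 8.0
    cured = rng.random(n) < pi_true
    latent = np.where(cured, np.inf, rng.exponential(1 / lam_true, n))
    time = np.minimum(latent, cens)
    event = (latent <= cens).astype(float)

    # EM algorithm for (pi, lam); censored subjects have a latent cure status.
    pi, lam = 0.5, 1.0
    for _ in range(500):
        # E-step: probability of being *uncured* given the observed data.
        s_u = np.exp(-lam * time)                      # latency survival at observed time
        w = np.where(event == 1, 1.0, (1 - pi) * s_u / (pi + (1 - pi) * s_u))
        # M-step: closed-form updates for this simple parametric special case.
        pi = 1 - w.mean()
        lam = event.sum() / np.sum(w * time)

    print(f"estimated cure fraction: {pi:.3f} (true {pi_true})")
    print(f"estimated hazard rate:   {lam:.3f} (true {lam_true})")
    ```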

  3. Relaxed Poisson cure rate models.

    Science.gov (United States)

    Rodrigues, Josemar; Cordeiro, Gauss M; Cancho, Vicente G; Balakrishnan, N

    2016-03-01

    The purpose of this article is to make the standard promotion cure rate model (Yakovlev and Tsodikov, ) more flexible by assuming that the number of lesions or altered cells after a treatment follows a fractional Poisson distribution (Laskin, ). It is proved that the well-known Mittag-Leffler relaxation function (Berberan-Santos, ) is a simple way to obtain a new cure rate model that is a compromise between the promotion and geometric cure rate models allowing for superdispersion. So, the relaxed cure rate model developed here can be considered as a natural and less restrictive extension of the popular Poisson cure rate model at the cost of an additional parameter, but a competitor to negative-binomial cure rate models (Rodrigues et al., ). Some mathematical properties of a proper relaxed Poisson density are explored. A simulation study and an illustration of the proposed cure rate model from the Bayesian point of view are finally presented. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Determinant of flexible Parametric Estimation of Mixture Cure ...

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    2015-12-01

    Dec 1, 2015 ... Suitability of four parametric mixture cure models were considered namely; Log .... regression analysis which relies on the ... The parameter of mixture cure fraction model was ..... Stochastic Models of Tumor Latency and Their.

  5. A flexible cure rate model with dependent censoring and a known cure threshold.

    Science.gov (United States)

    Bernhardt, Paul W

    2016-11-10

    We propose a flexible cure rate model that accommodates different censoring distributions for the cured and uncured groups and also allows for some individuals to be observed as cured when their survival time exceeds a known threshold. We model the survival times for the uncured group using an accelerated failure time model with errors distributed according to the seminonparametric distribution, potentially truncated at a known threshold. We suggest a straightforward extension of the usual expectation-maximization algorithm approach for obtaining estimates in cure rate models to accommodate the cure threshold and dependent censoring. We additionally suggest a likelihood ratio test for testing for the presence of dependent censoring in the proposed cure rate model. We show through numerical studies that our model has desirable properties and leads to approximately unbiased parameter estimates in a variety of scenarios. To demonstrate how our method performs in practice, we analyze data from a bone marrow transplantation study and a liver transplant study. Copyright © 2016 John Wiley & Sons, Ltd.

  6. Using cure models for analyzing the influence of pathogens on salmon survival

    Science.gov (United States)

    Ray, Adam R; Perry, Russell W.; Som, Nicholas A.; Bartholomew, Jerri L

    2014-01-01

    Parasites and pathogens influence the size and stability of wildlife populations, yet many population models ignore the population-level effects of pathogens. Standard survival analysis methods (e.g., accelerated failure time models) are used to assess how survival rates are influenced by disease. However, they assume that each individual is equally susceptible and will eventually experience the event of interest; this assumption is not typically satisfied with regard to pathogens of wildlife populations. In contrast, mixture cure models, which comprise logistic regression and survival analysis components, allow for different covariates to be entered into each part of the model and provide better predictions of survival when a fraction of the population is expected to survive a disease outbreak. We fitted mixture cure models to the host–pathogen dynamics of Chinook Salmon Oncorhynchus tshawytscha and Coho Salmon O. kisutch and the myxozoan parasite Ceratomyxa shasta. Total parasite concentration, water temperature, and discharge were used as covariates to predict the observed parasite-induced mortality in juvenile salmonids collected as part of a long-term monitoring program in the Klamath River, California. The mixture cure models predicted the observed total mortality well, but some of the variability in observed mortality rates was not captured by the models. Parasite concentration and water temperature were positively associated with total mortality and the mortality rate of both Chinook Salmon and Coho Salmon. Discharge was positively associated with total mortality for both species but only affected the mortality rate for Coho Salmon. The mixture cure models provide insights into how daily survival rates change over time in Chinook Salmon and Coho Salmon after they become infected with C. shasta.
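
    For readers unfamiliar with the two-part structure, the population survival curve implied by a mixture cure model is S(t|x) = pi(x) + (1 - pi(x)) * S_u(t|x), with a logistic model for the surviving ("cured") fraction pi(x) and a latency model S_u for susceptible fish, and different covariates may enter each part. The sketch below assumes an exponential proportional-hazards latency and made-up coefficients purely for illustration; it is not the fitted Klamath River model.

    ```python
    import numpy as np

    def mixture_cure_survival(t, x, gamma, beta, base_rate=0.1):
        """Population survival S(t|x) = pi(x) + (1 - pi(x)) * S_u(t|x).

        pi(x):   logistic model for the fraction that survives the outbreak ("cured").
        S_u(t|x): survival of susceptible fish; an exponential proportional-hazards
                  latency is used here purely for illustration.
        gamma, beta: coefficient vectors for the incidence and latency parts (they can
                  differ, mirroring the ability to enter different covariates in each part).
        """
        x = np.asarray(x, dtype=float)
        pi = 1.0 / (1.0 + np.exp(-(x @ gamma)))        # probability of surviving the outbreak
        hazard = base_rate * np.exp(x @ beta)          # covariate-scaled mortality rate
        s_u = np.exp(-hazard * t)                      # latency survival of susceptible fish
        return pi + (1.0 - pi) * s_u

    # Hypothetical covariates: [intercept, parasite concentration, water temperature]
    x = [1.0, 2.0, 1.5]
    gamma = np.array([2.0, -0.8, -0.5])                # higher dose/temperature -> fewer survivors
    beta = np.array([0.0, 0.4, 0.3])                   # ... and faster mortality among the susceptible
    for t in (5, 20, 60):
        print(f"S({t:>2} days | x) = {mixture_cure_survival(t, x, gamma, beta):.3f}")
    ```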

  7. Modeling the curing process of thermosetting resin matrix composites

    Science.gov (United States)

    Loos, A. C.

    1986-01-01

    A model is presented for simulating the curing process of a thermosetting resin matrix composite. The model relates the cure temperature, the cure pressure, and the properties of the prepreg to the thermal, chemical, and rheological processes occurring in the composite during cure. The results calculated with the computer code developed on the basis of the model were compared with the experimental data obtained from autoclave-cured composite laminates. Good agreement between the two sets of results was obtained.

  8. The time-dependent "cure-death" model investigating two equally important endpoints simultaneously in trials treating high-risk patients with resistant pathogens.

    Science.gov (United States)

    Sommer, Harriet; Wolkewitz, Martin; Schumacher, Martin

    2017-07-01

    A variety of primary endpoints are used in clinical trials treating patients with severe infectious diseases, and existing guidelines do not provide a consistent recommendation. We propose to study simultaneously two primary endpoints, cure and death, in a comprehensive multistate cure-death model as starting point for a treatment comparison. This technique enables us to study the temporal dynamic of the patient-relevant probability to be cured and alive. We describe and compare traditional and innovative methods suitable for a treatment comparison based on this model. Traditional analyses using risk differences focus on one prespecified timepoint only. A restricted logrank-based test of treatment effect is sensitive to ordered categories of responses and integrates information on duration of response. The pseudo-value regression provides a direct regression model for examination of treatment effect via difference in transition probabilities. Applied to a topical real data example and simulation scenarios, we demonstrate advantages and limitations and provide an insight into how these methods can handle different kinds of treatment imbalances. The cure-death model provides a suitable framework to gain a better understanding of how a new treatment influences the time-dynamic cure and death process. This might help the future planning of randomised clinical trials, sample size calculations, and data analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  9. Cure modeling in real-time prediction: How much does it help?

    Science.gov (United States)

    Ying, Gui-Shuang; Zhang, Qiang; Lan, Yu; Li, Yimei; Heitjan, Daniel F

    2017-08-01

    Various parametric and nonparametric modeling approaches exist for real-time prediction in time-to-event clinical trials. Recently, Chen (2016 BMC Biomedical Research Methodology 16) proposed a prediction method based on parametric cure-mixture modeling, intending to cover those situations where it appears that a non-negligible fraction of subjects is cured. In this article we apply a Weibull cure-mixture model to create predictions, demonstrating the approach in RTOG 0129, a randomized trial in head-and-neck cancer. We compare the ultimate realized data in RTOG 0129 to interim predictions from a Weibull cure-mixture model, a standard Weibull model without a cure component, and a nonparametric model based on the Bayesian bootstrap. The standard Weibull model predicted that events would occur earlier than the Weibull cure-mixture model, but the difference was unremarkable until late in the trial when evidence for a cure became clear. Nonparametric predictions often gave undefined predictions or infinite prediction intervals, particularly at early stages of the trial. Simulations suggest that cure modeling can yield better-calibrated prediction intervals when there is a cured component, or the appearance of a cured component, but at a substantial cost in the average width of the intervals. Copyright © 2017 Elsevier Inc. All rights reserved.
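
    The real-time prediction step can be illustrated with a small simulation: for each subject still event-free at an interim time, update the cure probability, draw a residual failure time from the truncated Weibull (or infinity if "cured"), and read off when the target event count would be reached. The parameter values below are invented for illustration and are not the RTOG 0129 estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def predict_event_times(t0, pi, shape, scale, n_draws=1000):
        """Draw event times for one subject still event-free at time t0 under a
        Weibull cure-mixture model; cured subjects never fail (np.inf).
        pi, shape, scale are illustrative values, not fitted trial estimates."""
        s_w = np.exp(-(t0 / scale) ** shape)                   # Weibull survival at t0
        p_cured_now = pi / (pi + (1 - pi) * s_w)               # updated cure probability
        cured = rng.random(n_draws) < p_cured_now
        u = rng.random(n_draws)
        t = scale * ((t0 / scale) ** shape - np.log(u)) ** (1 / shape)   # Weibull truncated at t0
        return np.where(cured, np.inf, t)

    # Suppose 100 subjects are still at risk at month 12 and 40 more events are needed
    # to trigger the final analysis; predict when the 40th event occurs.
    draws = np.stack([predict_event_times(12.0, pi=0.35, shape=1.3, scale=30.0)
                      for _ in range(100)])                    # shape: (subjects, simulations)
    landmark = np.sort(draws, axis=0)[39, :]                   # time of the 40th event per simulation
    finite = landmark[np.isfinite(landmark)]
    print(f"median predicted time of 40th event: {np.median(finite):.1f} months "
          f"(undefined in {np.mean(~np.isfinite(landmark)):.0%} of simulations)")
    ```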

  10. Linear regression metamodeling as a tool to summarize and present simulation model results.

    Science.gov (United States)

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
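
    The metamodeling recipe itself is short: run the probabilistic sensitivity analysis, standardize the sampled inputs, and regress the simulated outcome on them; the intercept then approximates the base-case outcome and each coefficient summarizes that parameter's influence. The toy decision model and parameter distributions below are invented for illustration and are not the cancer cure model used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000                                           # PSA iterations

    # Toy probabilistic sensitivity analysis: three uncertain inputs of a hypothetical
    # cost-effectiveness model (not the simplified cancer cure model in the paper).
    p_cure    = rng.beta(20, 80, n)                      # probability of cure
    cost_tx   = rng.gamma(100, 50, n)                    # treatment cost
    qaly_cure = rng.normal(10, 1, n)                     # QALYs gained if cured
    net_benefit = 50_000 * p_cure * qaly_cure - cost_tx  # model "output" at WTP = $50k/QALY

    # Standardize inputs, then fit the linear regression metamodel by least squares.
    X = np.column_stack([p_cure, cost_tx, qaly_cure])
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    design = np.column_stack([np.ones(n), Z])
    coef, *_ = np.linalg.lstsq(design, net_benefit, rcond=None)

    labels = ["intercept (~ base-case net benefit)", "p_cure", "cost_tx", "qaly_cure"]
    for name, c in zip(labels, coef):
        print(f"{name:38s} {c:12.0f}")
    ```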

  11. Variable selection for mixture and promotion time cure rate models.

    Science.gov (United States)

    Masud, Abdullah; Tu, Wanzhu; Yu, Zhangsheng

    2016-11-16

    Failure-time data with cured patients are common in clinical studies. Data from these studies are typically analyzed with cure rate models. Variable selection methods have not been well developed for cure rate models. In this research, we propose two least absolute shrinkage and selection operator (LASSO)-based methods for variable selection in mixture and promotion time cure models with parametric or nonparametric baseline hazards. We conduct an extensive simulation study to assess the operating characteristics of the proposed methods. We illustrate the use of the methods using data from a study of childhood wheezing. © The Author(s) 2016.

  12. Curing of Thick Thermoset Composite Laminates: Multiphysics Modeling and Experiments

    Science.gov (United States)

    Anandan, S.; Dhaliwal, G. S.; Huo, Z.; Chandrashekhara, K.; Apetre, N.; Iyyer, N.

    2017-11-01

    Fiber reinforced polymer composites are used in high-performance aerospace applications as they are resistant to fatigue, corrosion free and possess high specific strength. The mechanical properties of these composite components depend on the degree of cure and residual stresses developed during the curing process. While these parameters are difficult to determine experimentally in large and complex parts, they can be simulated using numerical models in a cost-effective manner. These simulations can be used to develop cure cycles and change processing parameters to obtain high-quality parts. In the current work, a numerical model was built in Comsol MultiPhysics to simulate the cure behavior of a carbon/epoxy prepreg system (IM7/Cycom 5320-1). A thermal spike was observed in thick laminates when the recommended cure cycle was used. The cure cycle was modified to reduce the thermal spike and maintain the degree of cure at the laminate center. A parametric study was performed to evaluate the effect of air flow in the oven, post cure cycles and cure temperatures on the thermal spike and the resultant degree of cure in the laminate.

  13. Promotion time cure rate model with nonparametric form of covariate effects.

    Science.gov (United States)

    Chen, Tianlei; Du, Pang

    2018-05-10

    Survival data with a cured portion are commonly seen in clinical trials. Motivated from a biological interpretation of cancer metastasis, promotion time cure model is a popular alternative to the mixture cure rate model for analyzing such data. The existing promotion cure models all assume a restrictive parametric form of covariate effects, which can be incorrectly specified especially at the exploratory stage. In this paper, we propose a nonparametric approach to modeling the covariate effects under the framework of promotion time cure model. The covariate effect function is estimated by smoothing splines via the optimization of a penalized profile likelihood. Point-wise interval estimates are also derived from the Bayesian interpretation of the penalized profile likelihood. Asymptotic convergence rates are established for the proposed estimates. Simulations show excellent performance of the proposed nonparametric method, which is then applied to a melanoma study. Copyright © 2018 John Wiley & Sons, Ltd.

  14. Talking Cure Models: A Framework of Analysis

    Directory of Open Access Journals (Sweden)

    Christopher Marx

    2017-09-01

    Full Text Available Psychotherapy is commonly described as a “talking cure,” a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what “talking” means and what “cure” implies in the respective context. Accordingly, a clarification of the basic theoretical structure of “talking cure models,” i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background the present paper suggests a theoretical framework of analysis which distinguishes four basic components of “talking cure models”: (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of “talking cure models.” To demonstrate the applicability and utility of the framework, five distinct “talking cure models” which spell out the details of curative “talking” processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more

  15. Predictive Modeling of Fast-Curing Thermosets in Nozzle-Based Extrusion

    Science.gov (United States)

    Xie, Jingjin; Randolph, Robert; Simmons, Gary; Hull, Patrick V.; Mazzeo, Aaron D.

    2017-01-01

    This work presents an approach to modeling the dynamic spreading and curing behavior of thermosets in nozzle-based extrusions. Thermosets cover a wide range of materials, some of which permit low-temperature processing with subsequent high-temperature and high-strength working properties. Extruding thermosets may overcome the limited working temperatures and strengths of conventional thermoplastic materials used in additive manufacturing. This project aims to produce technology for the fabrication of thermoset-based structures leveraging advances made in nozzle-based extrusion, such as fused deposition modeling (FDM), material jetting, and direct writing. Understanding the synergistic interactions between spreading and fast curing of extruded thermosetting materials will provide essential insights for applications that require accurate dimensional controls, such as additive manufacturing [1], [2] and centrifugal coating/forming [3]. Two types of thermally curing thermosets -- one being a soft silicone (Ecoflex 0050) and the other being a toughened epoxy (G/Flex) -- served as the test materials in this work to obtain models for cure kinetics and viscosity. The developed models align with extensive measurements made with differential scanning calorimetry (DSC) and rheology. DSC monitors the change in the heat of reaction, which reflects the rate and degree of cure at different crosslinking stages. Rheology measures the change in complex viscosity, shear moduli, yield stress, and other properties dictated by chemical composition. By combining DSC and rheological measurements, it is possible to establish a set of models profiling the cure kinetics and chemorheology without prior knowledge of chemical composition, which is usually necessary for sophisticated mechanistic modeling. In this work, we conducted both isothermal and dynamic measurements with both DSC and rheology. With the developed models, numerical simulations yielded predictions of diameter and height of

  16. Modified Regression Correlation Coefficient for Poisson Regression Model

    Science.gov (United States)

    Kaengthong, Nattacha; Domthong, Uthumporn

    2017-09-01

    This study considers indicators of the predictive power of the generalized linear model (GLM), which are widely used but often subject to restrictions. We are interested in the regression correlation coefficient for a Poisson regression model. This is a measure of predictive power defined by the relationship between the dependent variable (Y) and the expected value of the dependent variable given the independent variables [E(Y|X)] for the Poisson regression model; the dependent variable is distributed as Poisson. The purpose of this research was to modify the regression correlation coefficient for the Poisson regression model. We also compare the proposed modified regression correlation coefficient with the traditional regression correlation coefficient in the case of two or more independent variables and in the presence of multicollinearity among the independent variables. The results show that the proposed regression correlation coefficient is better than the traditional regression correlation coefficient in terms of bias and root mean square error (RMSE).
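
    As a point of reference, the traditional regression correlation coefficient is simply the correlation between Y and the fitted values E(Y|X) from the Poisson GLM, as in the short sketch below (synthetic data with two nearly collinear predictors; the paper's modified coefficient is not reproduced here).

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 500
    x1 = rng.normal(size=n)
    x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)        # nearly collinear with x1
    y = rng.poisson(np.exp(0.5 + 0.7 * x1 - 0.3 * x2))

    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mu = fit.fittedvalues                           # estimated E(Y | X)

    # "Traditional" regression correlation coefficient: corr(Y, E(Y|X)).
    r = np.corrcoef(y, mu)[0, 1]
    print(f"regression correlation coefficient: {r:.3f}")
    ```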

  17. Post-cure depth of cure of bulk fill dental resin-composites.

    Science.gov (United States)

    Alrahlah, A; Silikas, N; Watts, D C

    2014-02-01

    To determine the post-cure depth of cure of bulk fill resin composites using Vickers hardness (VHN) profiles. Five bulk fill composite materials were examined: Tetric EvoCeram(®) Bulk Fill, X-tra base, Venus(®) Bulk Fill, Filtek™ Bulk Fill, SonicFill™. Three specimens of each material type were prepared in stainless steel molds which contained a slot of dimensions (15 mm × 4 mm × 2 mm), and a top plate. The molds were irradiated from one end. All specimens were stored at 37°C for 24 h before measurement. The Vickers hardness was measured as a function of depth of material, at 0.3 mm intervals. Data were analysed by one-way ANOVA using Tukey post hoc tests (α=0.05). The maximum VHN ranged from 37.8 to 77.4, whilst the VHN at 80% of max. VHN ranged from 30.4 to 61.9. The depth corresponding to 80% of max. VHN ranged from 4.14 to 5.03 mm. One-way ANOVA showed statistically significant differences between materials for all parameters tested. SonicFill exhibited the highest VHN and […] Fill the lowest (p≤0.001). SonicFill and Tetric EvoCeram Bulk Fill had the greatest depth of cure (5.03 and 4.47 mm, respectively), which was significantly different from X-tra base, Venus Bulk Fill and Filtek Bulk Fill (p≤0.016). Linear regression confirmed a positive relationship between max. VHN and filler loading (r² = 0.94). Bulk fill resin composites can be cured to an acceptable post-cure depth, according to the manufacturers' claims. SonicFill and Tetric EvoCeram Bulk Fill had the greatest depth of cure among the composites examined. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  18. Process Modelling of Curing Process-Induced Internal Stress and Deformation of Composite Laminate Structure with Elastic and Viscoelastic Models

    Science.gov (United States)

    Li, Dongna; Li, Xudong; Dai, Jianfeng

    2018-06-01

    In this paper, two kinds of transient models, a viscoelastic model and a linear elastic model, are established to analyze the curing deformation of thermosetting resin composites and are solved with the COMSOL Multiphysics software. The two models consider the complicated coupling between physical and chemical changes during the curing process of the composites and the time-variant character of the material performance parameters. Subsequently, the two proposed models are each implemented in a three-dimensional composite laminate structure, and a simple and convenient local coordinate system method is used to calculate the development of residual stresses, curing shrinkage and curing deformation for the composite laminate. The results show that the temperature, degree of curing (DOC) and residual stresses during the curing process are consistent with studies in the literature, so the curing shrinkage and curing deformation obtained on this basis have referential value. Comparison of the two sets of numerical results indicates that the residual stress and deformation calculated with the viscoelastic model are closer to the reference values than those from the linear elastic model.

  19. Effect of cure cycle on curing process and hardness for epoxy resin

    Directory of Open Access Journals (Sweden)

    2009-09-01

    Full Text Available A 3-dimensional finite element model is developed to simulate and analyze the temperature and degree-of-cure fields in an epoxy casting part during the cure process. The model, built on the general-purpose finite element software ABAQUS, is verified against a literature example and experimental data. The numerical results show good agreement with the literature example and the measured data, and are more accurate than the simulation reported in the literature. After the model is validated, the influence of the ramps of the temperature cure cycle on the temperature and degree-of-cure gradients is investigated. Moreover, the effect of the non-uniform temperature and degree-of-cure fields within the epoxy casting part on hardness is demonstrated. The present model provides an accurate and novel method that allows further insight into the cure process of epoxy resin.

  20. Consumer satisfaction with dry-cured ham in five European countries.

    Science.gov (United States)

    Resano, H; Pérez-Cueto, F J A; Sanjuán, A I; de Barcellos, M D; Grunert, K G; Verbeke, W

    2011-04-01

    The objective is to investigate consumer satisfaction with dry-cured ham in five European countries. A logistic regression model has been fitted using data collected through a cross-sectional web-based survey carried out in Belgium, Germany, Denmark, Poland and Greece during January 2008 (n=2437, of which 2156 were dry-cured ham consumers). Satisfaction was evaluated as overall satisfaction, as well as specific satisfaction with healthfulness, price, convenience and taste. The findings show that the main determinant of overall satisfaction is taste satisfaction; hence, producers are recommended to focus on matching sensory acceptability of dry-cured ham. No significant between-country differences were found, reflecting the wide availability of this product in all countries. Consumer characteristics influenced their level of satisfaction. Men, older consumers (age > 52 years) and frequent consumers of dry-cured ham were more likely to be satisfied with dry-cured ham. Consumers trusted the butcher's advice and preferred purchasing dry-cured ham at a butcher shop rather than in a supermarket. © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.

  1. Compounds from silicones alter enzyme activity in curing barnacle glue and model enzymes.

    Science.gov (United States)

    Rittschof, Daniel; Orihuela, Beatriz; Harder, Tilmann; Stafslien, Shane; Chisholm, Bret; Dickinson, Gary H

    2011-02-17

    Attachment strength of fouling organisms on silicone coatings is low. We hypothesized that low attachment strength on silicones is, in part, due to the interaction of surface available components with natural glues. Components could alter curing of glues through bulk changes or specifically through altered enzyme activity. GC-MS analysis of silicone coatings showed surface-available siloxanes when the coatings were gently rubbed with a cotton swab for 15 seconds or given a 30 second rinse with methanol. Mixtures of compounds were found on 2 commercial and 8 model silicone coatings. The hypothesis that silicone components alter glue curing enzymes was tested with curing barnacle glue and with commercial enzymes. In our model, barnacle glue curing involves trypsin-like serine protease(s), which activate enzymes and structural proteins, and a transglutaminase which cross-links glue proteins. Transglutaminase activity was significantly altered upon exposure of curing glue from individual barnacles to silicone eluates. Activity of purified trypsin and, to a greater extent, transglutaminase was significantly altered by relevant concentrations of silicone polymer constituents. Surface-associated silicone compounds can disrupt glue curing and alter enzyme properties. Altered curing of natural glues has potential in fouling management.

  2. Modeling the intracellular pathogen-immune interaction with cure rate

    Science.gov (United States)

    Dubey, Balram; Dubey, Preeti; Dubey, Uma S.

    2016-09-01

    Many common and emergent infectious diseases like Influenza, SARS, Hepatitis, Ebola etc. are caused by viral pathogens. These infections can be controlled or prevented by understanding the dynamics of pathogen-immune interaction in vivo. In this paper, the interaction of pathogens with uninfected and infected cells in the presence or absence of an immune response is considered in four different cases. In the first case, the model considers the saturated nonlinear infection rate and linear cure rate without absorption of pathogens into uninfected cells and without immune response. The next model considers the effect of absorption of pathogens into uninfected cells while all other terms are the same as in the first case. The third model incorporates innate immune response, humoral immune response and Cytotoxic T lymphocyte (CTL) mediated immune response with cure rate and without absorption of pathogens into uninfected cells. The last model is an extension of the third model in which the effect of absorption of pathogens into uninfected cells has been considered. Positivity and boundedness of solutions are established to ensure the well-posedness of the problem. It has been found that all four models have two equilibria, namely, a pathogen-free equilibrium point and a pathogen-present equilibrium point. In each case, stability analysis of each equilibrium point is investigated. The pathogen-free equilibrium is globally asymptotically stable when the basic reproduction number is less than or equal to unity. This implies that control or prevention of infection is independent of the initial concentration of uninfected cells, infected cells, pathogens and immune responses in the body. The proposed models show that introduction of immune response and cure rate strongly affects the stability behavior of the system. Further, on computing the basic reproduction number, it is found to be minimum for the fourth model vis-a-vis the other models. The analytical findings of each model have been exemplified by

  3. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific...... pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study...... survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection...

  4. An exponential chemorheological model for viscosity dependence on degree-of-cure of a polyfurfuryl alcohol resin during the post-gel curing stage

    DEFF Research Database (Denmark)

    Dominguez, J.C.; Oliet, M.; Alonso, María Virginia

    2016-01-01

    of modeling the evolution of the complex viscosity using a widely used chemorheological model such as the Arrhenius model for each tested temperature, the change of the complex viscosity as a function of the degree-of-cure was predicted using a new exponential type model. In this model, the logarithm...... of the normalized degree-of-cure is used to predict the behavior of the logarithm of the normalized complex viscosity. The model shows good quality of fitting with the experimental data for 4 and 6 wt % amounts of catalyst. For the 2 wt % amount of catalyst, scattered data leads to a slightly lower quality...

  5. Exploring consumer satisfaction with dry-cured ham in five European countries

    DEFF Research Database (Denmark)

    Resano, Helena; Perez-Cueto, Federico J. A.; Sanjuán, Ana

    2011-01-01

    This paper's objective is to investigate consumer satisfaction with dry-cured ham in five European countries. A logistic regression model has been fitted using data collected through a cross-sectional web-based survey carried out in Belgium, Germany, Denmark, Poland and Greece during January 2008 (n=2437, of which 2156 were dry-cured ham consumers). Satisfaction was evaluated as overall satisfaction, as well as specific satisfaction with healthfulness, price, convenience and taste. The findings show that the main determinant of overall satisfaction is taste satisfaction; hence, producers are recommended to focus on matching sensory acceptability of dry-cured ham. No significant between-country differences were found, reflecting the wide availability of this product in all countries. Consumer characteristics influenced their level of satisfaction. Men, older (age >52 years) and frequent consumers...

  6. Regression modeling of ground-water flow

    Science.gov (United States)

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  7. Quantitative genetics of Taura syndrome resistance in Pacific (Penaeus vannamei): A cure model approach

    DEFF Research Database (Denmark)

    Ødegård, Jørgen; Gitterle, Thomas; Madsen, Per

    2011-01-01

    cure survival model using Gibbs sampling, treating susceptibility and endurance as separate genetic traits. Results: Overall mortality at the end of test was 28%, while 38% of the population was considered susceptible to the disease. The estimated underlying heritability was high for susceptibility (0....... However, genetic evaluation of susceptibility based on the cure model showed clear associations with standard genetic evaluations that ignore the cure fraction for these data. Using the current testing design, genetic variation in observed survival time and absolute survival at the end of test were most...

  8. Regression Models for Market-Shares

    DEFF Research Database (Denmark)

    Birch, Kristina; Olsen, Jørgen Kai; Tjur, Tue

    2005-01-01

    On the background of a data set of weekly sales and prices for three brands of coffee, this paper discusses various regression models and their relation to the multiplicative competitive-interaction model (the MCI model, see Cooper 1988, 1993) for market-shares. Emphasis is put on the interpretation of the parameters in relation to models for the total sales based on discrete choice models. Key words and phrases: MCI model, discrete choice model, market-shares, price elasticity, regression model.

  9. Mathematical modelling of simultaneous solvent evaporation and chemical curing in thermoset coatings: A parameter study

    DEFF Research Database (Denmark)

    Kiil, Søren

    2011-01-01

    A mathematical model, describing the curing behaviour of a two-component, solvent-based, thermoset coating, is used to conduct a parameter study. The model includes curing reactions, solvent intra-film diffusion and evaporation, film gelation, vitrification, and crosslinking. A case study with a ...

  10. A Seemingly Unrelated Poisson Regression Model

    OpenAIRE

    King, Gary

    1989-01-01

    This article introduces a new estimator for the analysis of two contemporaneously correlated endogenous event count variables. This seemingly unrelated Poisson regression model (SUPREME) estimator combines the efficiencies created by single equation Poisson regression model estimators and insights from "seemingly unrelated" linear regression models.

  11. Two-component mixture cure rate model with spline estimated nonparametric components.

    Science.gov (United States)

    Wang, Lu; Du, Pang; Liang, Hua

    2012-09-01

    In some survival analysis of medical studies, there are often long-term survivors who can be considered as permanently cured. The goals in these studies are to estimate the noncured probability of the whole population and the hazard rate of the susceptible subpopulation. When covariates are present as often happens in practice, to understand covariate effects on the noncured probability and hazard rate is of equal importance. The existing methods are limited to parametric and semiparametric models. We propose a two-component mixture cure rate model with nonparametric forms for both the cure probability and the hazard rate function. Identifiability of the model is guaranteed by an additive assumption that allows no time-covariate interactions in the logarithm of hazard rate. Estimation is carried out by an expectation-maximization algorithm on maximizing a penalized likelihood. For inferential purpose, we apply the Louis formula to obtain point-wise confidence intervals for noncured probability and hazard rate. Asymptotic convergence rates of our function estimates are established. We then evaluate the proposed method by extensive simulations. We analyze the survival data from a melanoma study and find interesting patterns for this study. © 2011, The International Biometric Society.

  12. Panel Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    González, Andrés; Terasvirta, Timo; Dijk, Dick van

    We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...

  13. Expectation Maximization Algorithm for Box-Cox Transformation Cure Rate Model and Assessment of Model Misspecification Under Weibull Lifetimes.

    Science.gov (United States)

    Pal, Suvra; Balakrishnan, Narayanaswamy

    2018-05-01

    In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model assuming the lifetimes to follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of cure rate. Finally, we analyze a well-known data on melanoma with the model and the inferential method developed here.
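
    For orientation, a common way of writing the Box-Cox transformation cure rate family, which links the promotion time (alpha approaching 0) and mixture (alpha = 1) cure models through an index parameter, is shown below; the exact parametrization used by the authors may differ, and F(t) here denotes the (Weibull) latency distribution function.

    ```latex
    % Population survival and cure fraction in one common parametrization of the
    % Box-Cox transformation cure rate family (illustrative; authors' form may differ).
    S_{\mathrm{pop}}(t) =
    \begin{cases}
      \bigl\{1 - \alpha\,\theta\,F(t)\bigr\}^{1/\alpha}, & 0 < \alpha \le 1,\\[4pt]
      \exp\bigl\{-\theta\,F(t)\bigr\},                   & \alpha = 0,
    \end{cases}
    \qquad
    p_0 = S_{\mathrm{pop}}(\infty) =
    \begin{cases}
      (1 - \alpha\,\theta)^{1/\alpha}, & 0 < \alpha \le 1,\\[4pt]
      e^{-\theta},                     & \alpha = 0.
    \end{cases}
    ```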

  14. Humanized mice: models for evaluating NeuroHIV and cure strategies.

    Science.gov (United States)

    Honeycutt, Jenna B; Garcia, J Victor

    2018-04-01

    While the human immunodeficiency virus (HIV) epidemic was initially characterized by a high prevalence of severe and widespread neurological pathologies, the development of better treatments to suppress viremia over years and even decades has mitigated many of the severe neurological pathologies previously observed. Despite effective treatment, mild neurocognitive impairment and premature cognitive aging are observed in HIV-infected individuals, suggesting a changing but ongoing role of HIV infection in the central nervous system (CNS). Although current therapies are effective in suppressing viremia, they are not curative and patients must remain on life-long treatment or risk recrudescence of virus. Important for the development and evaluation of a cure for HIV will be animal models that recapitulate critical aspects of infection in vivo. In the following, we seek to summarize some of the recent developments in humanized mouse models and their usefulness in modeling HIV infection of the CNS and HIV cure strategies.

  15. Composite Cure Process Modeling and Simulations using COMPRO(Registered Trademark) and Validation of Residual Strains using Fiber Optics Sensors

    Science.gov (United States)

    Sreekantamurthy, Thammaiah; Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.

    2016-01-01

    Composite cure process induced residual strains and warping deformations in composite components present significant challenges in the manufacturing of advanced composite structure. As a part of the Manufacturing Process and Simulation initiative of the NASA Advanced Composite Project (ACP), research is being conducted on the composite cure process by developing an understanding of the fundamental mechanisms by which the process induced factors influence the residual responses. In this regard, analytical studies have been conducted on the cure process modeling of composite structural parts with varied physical, thermal, and resin flow process characteristics. The cure process simulation results were analyzed to interpret the cure response predictions based on the underlying physics incorporated into the modeling tool. In the cure-kinetic analysis, the model predictions on the degree of cure, resin viscosity and modulus were interpreted with reference to the temperature distribution in the composite panel part and tool setup during autoclave or hot-press curing cycles. In the fiber-bed compaction simulation, the pore pressure and resin flow velocity in the porous media models, and the compaction strain responses under applied pressure were studied to interpret the fiber volume fraction distribution predictions. In the structural simulation, the effect of temperature on the resin and ply modulus, and thermal coefficient changes during curing on predicted mechanical strains and chemical cure shrinkage strains were studied to understand the residual strains and stress response predictions. In addition to computational analysis, experimental studies were conducted to measure strains during the curing of laminated panels by means of optical fiber Bragg grating sensors (FBGs) embedded in the resin impregnated panels. The residual strain measurements from laboratory tests were then compared with the analytical model predictions. The paper describes the cure process

  16. Interpretation of commonly used statistical regression models.

    Science.gov (United States)

    Kasza, Jessica; Wolfe, Rory

    2014-01-01

    A review of some regression models commonly used in respiratory health applications is provided in this article. Simple linear regression, multiple linear regression, logistic regression and ordinal logistic regression are considered. The focus of this article is on the interpretation of the regression coefficients of each model, which are illustrated through the application of these models to a respiratory health research study. © 2013 The Authors. Respirology © 2013 Asian Pacific Society of Respirology.
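
    A compact illustration of the interpretations reviewed in the article: in a linear model the coefficient is the change in the mean outcome per unit of the covariate, while in a logistic model the exponentiated coefficient is an odds ratio. The data and variable names below (pack-years, FEV1, COPD status) are simulated placeholders, not data from the cited study.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 400
    pack_years = rng.gamma(2.0, 10.0, n)                                    # hypothetical exposure
    fev1 = 3.5 - 0.02 * pack_years + rng.normal(0, 0.4, n)                  # continuous outcome (litres)
    copd = rng.binomial(1, 1 / (1 + np.exp(-(-3.0 + 0.05 * pack_years))))   # binary outcome

    X = sm.add_constant(pack_years)

    lin = sm.OLS(fev1, X).fit()
    print(f"linear:   each extra pack-year changes mean FEV1 by {lin.params[1]:+.3f} L")

    logit = sm.Logit(copd, X).fit(disp=0)
    print(f"logistic: odds ratio per pack-year = {np.exp(logit.params[1]):.3f}")
    ```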

  17. Mathematical Model For Autoclave Curing Of Unsaturated Polyester Based Composite Materials

    Directory of Open Access Journals (Sweden)

    Adnan A. Abdul Razak

    2013-05-01

    Full Text Available The heat transfer process involved in the autoclave curing of fiber-reinforced thermosetting composites is investigated numerically. A model has been developed for predicting the temperature and the extent of reaction across the laminate thickness during the autoclave curing process of an unsaturated polyester based composite. The governing equation for one-dimensional heat transfer, accounting for the heat generation due to the exothermic cure reaction in the composite, is used. It was found that the temperature at the center of the laminate increases up to the externally imposed temperature because of the thermal conductivity of the resin and fiber. The heat generated by the exothermic reaction of the resin is not adequately removed; the increase in the temperature at the center increases the resin's reaction rate, which in turn generates more heat.
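
    The governing balance described above (transient one-dimensional conduction plus an exothermic heat-generation term driven by an Arrhenius nth-order cure rate) can be prototyped in a few lines with an explicit finite-difference scheme. All material constants in the sketch below are illustrative placeholders rather than the unsaturated polyester system of the paper.

    ```python
    import numpy as np

    # Minimal explicit finite-difference sketch of the 1-D energy balance with an
    # exothermic cure source,
    #     rho*cp * dT/dt = k * d2T/dx2 + rho * H * dalpha/dt,
    # closed with an nth-order cure-kinetics rate dalpha/dt = A*exp(-E/(R*T))*(1-alpha)**n.
    # All material values below are illustrative placeholders, not the paper's polyester data.
    L, nx = 0.02, 41                        # laminate thickness [m], grid points
    dx = L / (nx - 1)
    k, rho, cp = 0.2, 1200.0, 1200.0        # conductivity, density, specific heat
    H = 1.5e5                               # heat of reaction per kg of material [J/kg]
    A, E, n, R = 1.0e5, 6.0e4, 1.5, 8.314   # Arrhenius prefactor, activation energy, order

    dt, t_end = 0.5, 7200.0                 # dt below the explicit stability limit dx^2*rho*cp/(2*k) ~ 0.9 s
    T = np.full(nx, 300.0)                  # initial temperature [K]
    alpha = np.zeros(nx)                    # degree of cure

    for step in range(int(t_end / dt)):
        t = step * dt
        T_wall = min(300.0 + (2.0 / 60.0) * t, 400.0)          # 2 K/min ramp to 400 K, then hold
        rate = A * np.exp(-E / (R * T)) * (1.0 - alpha) ** n    # cure rate at each node
        lap = np.zeros(nx)
        lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2    # interior second derivative
        T = T + dt * (k * lap / (rho * cp) + H * rate / cp)     # explicit update
        alpha = np.minimum(alpha + dt * rate, 1.0)
        T[0] = T[-1] = T_wall                                   # imposed autoclave temperature

    print(f"centre temperature: {T[nx // 2]:.1f} K, centre degree of cure: {alpha[nx // 2]:.2f}")
    ```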

  18. Light-Cured Self-Etch Adhesives Undergo Hydroxyapatite-Triggered Self-Cure

    Science.gov (United States)

    Liu, Y.; Bai, X.; Liu, Y.W.; Wang, Y.

    2015-01-01

    Light cure is a popular mode of curing for dental adhesives. However, it suffers from inadequate light delivery when the restoration site is less accessible, in which case a self-cure mechanism is desirable to salvage any compromised polymerization. We previously reported a novel self-cure system mediated by ethyl 4-(dimethylamino)-benzoate (4E) and hydroxyapatite (HAp). The present work aims to investigate if such self-cure phenomenon takes place in adhesives that underwent prior inadequate light cure and to elucidate if HAp released from the dental etching process is sufficient to trigger it. Model self-etch adhesives were formulated with various components, including bis[2-methacryloyloxy)ethyl]-phosphate (2MP) as acidic monomer and trimethylbenzoyl-diphenylphosphine oxide (TPO) as photoinitiator. In vitro evolution of degree of conversion (DC) of HAp-incorporated adhesives was monitored by infrared spectroscopy during light irradiation and dark storage. Selected adhesives were allowed to etch and extract HAp from enamel, light-cured in situ, and stored in the dark, after which Raman line mapping was used to obtain spatially resolved DC across the enamel-resin interface. Results showed that TPO+4E adhesives reached DC similar to TPO-only counterparts upon completion of light irradiation but underwent another round of initiation that boosted DC to ~100% regardless of HAp level or prior light exposure. When applied to enamel, TPO-only adhesives had ~80% DC in resin, which gradually descended to ~50% in enamel, whereas TPO+4E adhesives consistently scored ~80% DC across the enamel-resin interface. These observations suggest that polymerization of adhesives that underwent insufficient light cure is salvaged by the novel self-cure mechanism, and such salvaging effect can be triggered by HAp released from dental substrate during the etching process. PMID:26635279

  19. Light-Cured Self-Etch Adhesives Undergo Hydroxyapatite-Triggered Self-Cure.

    Science.gov (United States)

    Liu, Y; Bai, X; Liu, Y W; Wang, Y

    2016-03-01

    Light cure is a popular mode of curing for dental adhesives. However, it suffers from inadequate light delivery when the restoration site is less accessible, in which case a self-cure mechanism is desirable to salvage any compromised polymerization. We previously reported a novel self-cure system mediated by ethyl 4-(dimethylamino)-benzoate (4E) and hydroxyapatite (HAp). The present work aims to investigate if such self-cure phenomenon takes place in adhesives that underwent prior inadequate light cure and to elucidate if HAp released from the dental etching process is sufficient to trigger it. Model self-etch adhesives were formulated with various components, including bis[2-methacryloyloxy)ethyl]-phosphate (2MP) as acidic monomer and trimethylbenzoyl-diphenylphosphine oxide (TPO) as photoinitiator. In vitro evolution of degree of conversion (DC) of HAp-incorporated adhesives was monitored by infrared spectroscopy during light irradiation and dark storage. Selected adhesives were allowed to etch and extract HAp from enamel, light-cured in situ, and stored in the dark, after which Raman line mapping was used to obtain spatially resolved DC across the enamel-resin interface. Results showed that TPO+4E adhesives reached DC similar to TPO-only counterparts upon completion of light irradiation but underwent another round of initiation that boosted DC to ~100% regardless of HAp level or prior light exposure. When applied to enamel, TPO-only adhesives had ~80% DC in resin, which gradually descended to ~50% in enamel, whereas TPO+4E adhesives consistently scored ~80% DC across the enamel-resin interface. These observations suggest that polymerization of adhesives that underwent insufficient light cure is salvaged by the novel self-cure mechanism, and such salvaging effect can be triggered by HAp released from dental substrate during the etching process. © International & American Associations for Dental Research 2015.

  20. Accounting for Cured Patients in Cost-Effectiveness Analysis.

    Science.gov (United States)

    Othus, Megan; Bansal, Aasthaa; Koepl, Lisel; Wagner, Samuel; Ramsey, Scott

    2017-04-01

    Economic evaluations often measure an intervention effect with mean overall survival (OS). Emerging types of cancer treatments offer the possibility of being "cured" in that patients can become long-term survivors whose risk of death is the same as that of a disease-free person. Describing cured and noncured patients with one shared mean value may provide a biased assessment of a therapy with a cured proportion. The purpose of this article is to explain how to incorporate the heterogeneity from cured patients into health economic evaluation. We analyzed clinical trial data from patients with advanced melanoma treated with ipilimumab (Ipi; n = 137) versus glycoprotein 100 (gp100; n = 136) with statistical methodology for mixture cure models. Both cured and noncured patients were subject to background mortality not related to cancer. When ignoring cured proportions, we found that patients treated with Ipi had an estimated mean OS that was 8 months longer than that of patients treated with gp100. Cure model analysis showed that the cured proportion drove this difference, with 21% cured on Ipi versus 6% cured on gp100. The mean OS among the noncured cohort patients was 10 and 9 months with Ipi and gp100, respectively. The mean OS among cured patients was 26 years on both arms. When ignoring cured proportions, we found that the incremental cost-effectiveness ratio (ICER) when comparing Ipi with gp100 was $324,000/quality-adjusted life-year (QALY) (95% confidence interval $254,000-$600,000). With a mixture cure model, the ICER when comparing Ipi with gp100 was $113,000/QALY (95% confidence interval $101,000-$154,000). This analysis supports using cure modeling in health economic evaluation in advanced melanoma. When a proportion of patients may be long-term survivors, using cure models may reduce bias in OS estimates and provide more accurate estimates of health economic measures, including QALYs and ICERs. Copyright © 2017 International Society for Pharmacoeconomics
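
    The decomposition behind these figures can be checked with back-of-the-envelope arithmetic using only the point estimates quoted above; it ignores the background-mortality adjustment, discounting, utilities and costs of the actual analysis, so the numbers are illustrative only.

    ```python
    # Mixture decomposition of mean overall survival,
    #     E[OS] = pi * E[OS | cured] + (1 - pi) * E[OS | not cured],
    # evaluated at the point estimates reported in the abstract (illustrative only;
    # the published analysis additionally adjusts for background mortality, costs,
    # discounting and quality-of-life weights).
    def mean_os_months(pi_cured, mean_cured_years, mean_noncured_months):
        return pi_cured * mean_cured_years * 12 + (1 - pi_cured) * mean_noncured_months

    ipi = mean_os_months(0.21, 26, 10)      # ipilimumab arm
    gp = mean_os_months(0.06, 26, 9)        # gp100 arm
    print(f"cure-model mean OS: Ipi {ipi / 12:.1f} y, gp100 {gp / 12:.1f} y, "
          f"difference {ipi - gp:.0f} months (vs ~8 months when the cured fraction is ignored)")
    ```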

  1. Poisson Mixture Regression Models for Heart Disease Prediction.

    Science.gov (United States)

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model for heart disease prediction among all models, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise given the available clusters. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using a Poisson mixture regression model.
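
    To make the estimation idea concrete, the sketch below fits a two-component Poisson mixture regression to simulated counts with a basic EM loop (responsibilities in the E-step, weighted Poisson fits and a mixing-weight update in the M-step). It is a toy version without the concomitant-variable or zero-inflation extensions discussed in the paper, and the data are simulated.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Simulate counts from a two-component Poisson mixture regression:
    # component k has mean exp(X @ beta_k); a subject belongs to component 1 with probability pi.
    n = 600
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_true = [np.array([0.2, 0.8]), np.array([1.6, -0.5])]
    pi_true = 0.4
    comp1 = rng.random(n) < pi_true
    y = rng.poisson(np.where(comp1, np.exp(X @ beta_true[0]), np.exp(X @ beta_true[1])))

    def neg_wloglik(beta, w):
        """Weighted Poisson negative log-likelihood (factorial term dropped)."""
        eta = X @ beta
        return -np.sum(w * (y * eta - np.exp(eta)))

    # EM algorithm: E-step computes component responsibilities, M-step refits a
    # weighted Poisson regression per component and updates the mixing weight.
    beta = [np.zeros(2), np.array([1.0, 0.0])]
    pi = 0.5
    for _ in range(100):
        log_comp = np.stack([y * (X @ b) - np.exp(X @ b) for b in beta])   # log-densities up to a constant
        log_post = np.log([pi, 1 - pi])[:, None] + log_comp
        log_post -= log_post.max(axis=0)
        resp = np.exp(log_post)
        resp /= resp.sum(axis=0)                                           # responsibilities
        beta = [minimize(neg_wloglik, beta[k], args=(resp[k],)).x for k in range(2)]
        pi = resp[0].mean()

    print(f"estimated mixing weight: {pi:.2f} (true {pi_true})")
    print("estimated coefficients:", [b.round(2) for b in beta])
    ```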

  2. Poisson Mixture Regression Models for Heart Disease Prediction

    Science.gov (United States)

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model for heart disease prediction among all models, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise given the available clusters. It is deduced that heart disease prediction can be effectively done by identifying the major risks componentwise using a Poisson mixture regression model. PMID:27999611

  3. A Comparison of Curing Process-Induced Residual Stresses and Cure Shrinkage in Micro-Scale Composite Structures with Different Constitutive Laws

    Science.gov (United States)

    Li, Dongna; Li, Xudong; Dai, Jianfeng; Xi, Shangbin

    2018-02-01

    In this paper, three kinds of constitutive laws, the elastic, the "cure hardening instantaneously linear elastic (CHILE)" and the viscoelastic law, are used to predict curing process-induced residual stress in thermoset polymer composites. A multi-physics coupled finite element analysis (FEA) model implementing the three proposed approaches is established in COMSOL Multiphysics-Version 4.3b. The evolution of thermo-physical properties with temperature and degree of cure (DOC), which improves the accuracy of the numerical simulations, and cure shrinkage are taken into account in all three models. Subsequently, the three proposed constitutive models are each implemented in a 3D micro-scale composite laminate structure. Comparison of the three sets of numerical results indicates that the elastic model generates large errors in residual stress and cure shrinkage, whereas the results calculated with the modified CHILE model are in excellent agreement with those estimated by the viscoelastic model.

  4. Determinant of flexible Parametric Estimation of Mixture Cure ...

    African Journals Online (AJOL)

    AIC, mean time to cure), variance and cure fraction (c) were used to determine the flexible Parametric Cure Fraction Model among the considered models. Gastric Cancer data from 76 patients received adjuvant CRT and 125 receiving resection (surgery) alone were used to confirm the suitability of the models. The data was ...

  5. Mixture of Regression Models with Single-Index

    OpenAIRE

    Xiang, Sijia; Yao, Weixin

    2016-01-01

    In this article, we propose a class of semiparametric mixture regression models with single-index. We argue that many recently proposed semiparametric/nonparametric mixture regression models can be considered special cases of the proposed model. However, unlike existing semiparametric mixture regression models, the new proposed model can easily incorporate multivariate predictors into the nonparametric components. Backfitting estimates and the corresponding algorithms have been proposed for...

  6. An Additive-Multiplicative Cox-Aalen Regression Model

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

    Aalen model; additive risk model; counting processes; Cox regression; survival analysis; time-varying effects

  7. The software package for solving problems of mathematical modeling of isothermal curing process

    Directory of Open Access Journals (Sweden)

    S. G. Tikhomirov

    2016-01-01

    On the basis of the general laws of sulfur vulcanization of diene rubbers, the principles of effective cross-linking using multi-component curing agents are discussed. It is noted that describing the mechanism of action of complex cross-linking systems is complicated by the diversity of interactions among the components and by the influence of each of them on the curing kinetics, which leads to various technological complications in practice and affects the quality and the technical and economic indicators of rubber goods production. Based on known theoretical approaches, a system analysis of the isothermal curing process was performed, integrating different techniques and methods into a single framework. Analysis of the vulcanization kinetics showed that the parameters of the spatial network of vulcanizates depend on many factors, the assessment of which requires special mathematical and algorithmic support. Stratification of the object identified the major subsystems. A software package for solving direct and inverse kinetic problems of the isothermal curing process was developed. The information support "Isothermal vulcanization" is a set of applications for mathematical modeling of isothermal curing, intended for both direct and inverse kinetic problems. When refining the general scheme of chemical transformations, a universal mechanism including secondary chemical reactions is used. A functional minimization algorithm with constraints on the unknown parameters is used to solve the inverse kinetic problem. A flowchart of the program is shown, and an example of solving the inverse kinetic problem with the program is presented. The software was implemented in the programming language C++. A universal dependence for determining the initial concentration of the curing agent was applied, allowing the use of a model with different properties of multicomponent
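
    As a rough illustration of the direct and inverse kinetic problems mentioned above, the sketch below assumes a simple nth-order cure model dα/dt = k(1-α)^n rather than the detailed vulcanization mechanism used in the package; all values and names are invented.

```python
# Minimal sketch of a direct and an inverse isothermal cure-kinetics problem.
# An nth-order model dα/dt = k (1 - α)^n is assumed purely for illustration;
# the package described above uses a more detailed vulcanization mechanism.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def cure_rate(t, alpha, k, n):
    return k * (1.0 - alpha) ** n

def simulate(k, n, t_eval):
    """Direct problem: integrate the kinetic ODE for the degree of cure alpha(t)."""
    sol = solve_ivp(cure_rate, (0.0, t_eval[-1]), [0.0], args=(k, n), t_eval=t_eval)
    return sol.y[0]

# Synthetic "measured" isothermal cure data
t = np.linspace(0, 60, 31)                 # minutes
alpha_obs = simulate(0.08, 1.5, t) + np.random.default_rng(1).normal(0, 0.01, t.size)

# Inverse problem: recover (k, n) by constrained least squares
def residuals(params):
    k, n = params
    return simulate(k, n, t) - alpha_obs

fit = least_squares(residuals, x0=[0.05, 1.0], bounds=([1e-6, 0.1], [10.0, 5.0]))
print("estimated k, n:", np.round(fit.x, 3))
```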

  8. [From clinical judgment to linear regression model].

    Science.gov (United States)

    Palacios-Cruz, Lino; Pérez, Marcela; Rivas-Ruiz, Rodolfo; Talavera, Juan O

    2013-01-01

    When we think about mathematical models, such as the linear regression model, we think that these terms are only used by those engaged in research, a notion that is far from the truth. Legendre described the first mathematical model in 1805, and Galton introduced the formal term in 1886. Linear regression is one of the most commonly used regression models in clinical practice. It is useful to predict or show the relationship between two or more variables as long as the dependent variable is quantitative and has a normal distribution. Stated in another way, regression is used to predict a measure based on the knowledge of at least one other variable. Linear regression has as its first objective to determine the slope or inclination of the regression line: Y = a + bx, where "a" is the intercept or regression constant and is equivalent to the value of "Y" when "X" equals 0, and "b" (also called the slope) indicates the increase or decrease that occurs when the variable "x" increases or decreases by one unit. In the regression line, "b" is called the regression coefficient. The coefficient of determination (R²) indicates the importance of the independent variables in the outcome.
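
    The quantities described above (intercept a, slope b and R²) can be computed directly; the following minimal sketch uses invented data for illustration.

```python
# Minimal sketch: ordinary least-squares fit of Y = a + b*x and the
# coefficient of determination R^2, using invented data.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])      # e.g. an exposure measure
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])      # quantitative, roughly normal outcome

b, a = np.polyfit(x, y, 1)          # slope (regression coefficient) and intercept
y_hat = a + b * x
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

print(f"intercept a = {a:.2f}, slope b = {b:.2f}, R^2 = {r2:.3f}")
```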

  9. Regression models of reactor diagnostic signals

    International Nuclear Information System (INIS)

    Vavrin, J.

    1989-01-01

    The application of an autoregression model is described as the simplest regression model of diagnostic signals in the experimental analysis of diagnostic systems, in in-service monitoring of normal and anomalous conditions and in their diagnostics. The diagnostic method using a regression-type diagnostic database and regression spectral diagnostics is described. The diagnostics of neutron noise signals from anomalous modes in the experimental fuel assembly of a reactor is described. (author)
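
    A minimal sketch of fitting an autoregression (AR) model to a signal by least squares is shown below; the synthetic signal and model order stand in for a real neutron-noise record and are assumptions for illustration.

```python
# Minimal sketch: least-squares fit of an AR(p) model to a diagnostic (noise) signal.
# The signal here is synthetic; in practice it would be a measured neutron-noise record.
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 4
true_coefs = np.array([0.5, -0.25, 0.1, 0.05])

# Simulate an AR(4) "diagnostic signal"
x = np.zeros(n)
for t in range(p, n):
    x[t] = true_coefs @ x[t - p:t][::-1] + rng.normal(scale=0.5)

# Build the lagged design matrix and solve the least-squares problem
X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])   # column k = lag k+1
y = x[p:]
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)

print("estimated AR coefficients:", np.round(coefs, 3))
print("residual variance:", round(float(np.var(y - X @ coefs)), 4))
```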

  10. Patients cured of acromegaly do not experience improvement of their skull deformities.

    Science.gov (United States)

    Rick, Jonathan W; Jahangiri, Arman; Flanigan, Patrick M; Aghi, Manish K

    2017-04-01

    Acromegaly is a rare disease that is associated with many co-morbidities. This condition also causes progressive deformity of the skull, which includes frontal bossing and cranial thickening. Surgical and/or medical management can cure this condition in many patients, but it is not understood whether patients cured of acromegaly experience regression of their skull deformities. We performed a retrospective analysis of patients treated at our dedicated pituitary center from 2009 to 2014. We reviewed all MRI images taken during the treatment of these patients and recorded measurements of eight skull dimensions. We then analyzed these measurements for changes over time. 29 patients underwent curative treatment for acromegaly within our timeframe. The mean age of this population was 45.0 years (range 19-70) and 55.2% (n = 16) were female. All of these patients were treated with transsphenoidal resection of a somatotropic pituitary adenoma. 9 (31.1%) of these patients required further medical therapy to be cured. We found statistically significant variation in the coronal width of the sella turcica after therapy, which is likely attributable to changes from transsphenoidal surgery. None of the other dimensions showed significant variation over time after cure. Patients cured of acromegaly should not expect natural regression of their skull deformities. Our study suggests that both frontal bossing and cranial thickening do not return to normal after cure.

  11. Forecasting with Dynamic Regression Models

    CERN Document Server

    Pankratz, Alan

    2012-01-01

    One of the most widely used tools in statistical forecasting, the single-equation regression model, is examined here. A companion to the author's earlier work, Forecasting with Univariate Box-Jenkins Models: Concepts and Cases, the present text pulls together recent time series ideas and gives special attention to possible intertemporal patterns, distributed lag responses of output to input series and the autocorrelation patterns of regression disturbances. It also includes six case studies.

  12. Curing reaction of bisphenol-A based benzoxazine with cyanate ester resin and the properties of the cured thermosetting resin

    Directory of Open Access Journals (Sweden)

    H. Kimura

    2011-12-01

    The curing reaction of bisphenol-A based benzoxazine with cyanate ester resin and the properties of the cured thermosetting resin were investigated. The cure behavior of benzoxazine with cyanate ester resin was monitored through a model reaction using nuclear magnetic resonance (NMR). As a result of the model reaction, the ring-opening reaction of the benzoxazine ring and the thermal self-cyclotrimerization of the cyanate ester group occurred, and the phenolic hydroxyl group generated by the ring-opening reaction of the benzoxazine ring then co-reacted with the cyanate ester group. The properties of the cured thermosetting resin were evaluated in terms of mechanical properties, electrical resistivity, water resistance and heat resistance. The cured thermosetting resin from benzoxazine and cyanate ester resin showed good heat resistance, high electrical resistivity and high water resistance, compared with the cured thermosetting resin from benzoxazine and epoxy resin.

  13. Curing kinetics of visible light curing dental resin composites investigated by dielectric analysis (DEA).

    Science.gov (United States)

    Steinhaus, Johannes; Hausnerova, Berenika; Haenel, Thomas; Großgarten, Mandy; Möginger, Bernhard

    2014-03-01

    During the curing process of light-curing dental composites, the mobility of molecules and molecule segments is reduced, leading to a significant increase of the viscosity as well as the ion viscosity. Thus, the kinetics of the curing behavior of 6 different composites was derived from dielectric analysis (DEA) using specially redesigned flat sensors with interdigit comb electrodes allowing for irradiation at the top side and measurement of the ion viscosity at the bottom side. As the ion viscosities of dental composites change 1-3 orders of magnitude during the curing process, DEA provides a sensitive approach to evaluate their curing behavior, especially in the phase of undisturbed chain growth. In order to determine quantitative kinetic parameters, a kinetic model is presented and examined for the evaluation of the ion viscosity curves. From the obtained results it is seen that DEA might be employed in the investigation of the primary curing process, the quality assurance of ingredients, as well as the control of processing stability of light-curing dental composites. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  14. Comparing depth-dependent curing radiant exposure and time of curing of regular and flow bulk-fill composites

    Directory of Open Access Journals (Sweden)

    Jose Augusto RODRIGUES

    2017-08-01

    The effect of restoration depth on the curing time of a conventional and two bulk-fill composite resins by measuring microhardness and the respective radiosity of the bottom surface of the specimen was investigated. 1-, 3- and 5-mm thick washers were filled with Surefil SDR Flow–U (SDR), Tetric EvoCeram Bulk Fill-IVA (TEC) or Esthet-X HD–B1 (EHD), and cured with Bluephase® G2 for 40s. Additional 1-mm washers were filled with SDR, TEC or EHD, placed above the light sensor of MARC®, stacked with pre-cured 1-, 3- or 5-mm washer of respective material, and cured for 2.5~60s to mimic 2-, 4- and 6-mm thick composite curing. The sensor measured the radiosity (EB) at the bottom of specimen stacks. Vickers hardness (VH) was measured immediately at 5 locations with triplicate specimens. Nonlinear regression of VH vs EB by VH=α[1-exp(-EB/β)] with all thickness shows that the values of α, maximum hardness, are 21.6±1.0 kg/mm2 for SDR, 38.3±0.6 kg/mm2 for TEC and 45.3±2.6 kg/mm2 for EHD, and the values of β, rate parameter, are 0.40±0.06 J/cm2 for SDR, 0.77±0.04 J/cm2 for TEC and 0.58±0.09 J/cm2 for EHD. The radiosity of the bottom surface was calculated when the bottom surface of each material attained 80% of α of each material. The curing times for each material are in agreement with manufacturer recommendation for thickness. It is possible to estimate time needed to cure composite resin of known depth adequately by the radiosity and microhardness of the bottom surface.

  15. Comparing depth-dependent curing radiant exposure and time of curing of regular and flow bulk-fill composites.

    Science.gov (United States)

    Rodrigues, Jose Augusto; Tenorio, Ilana Pais; Mello, Ginger Baranhuk Rabello de; Reis, André Figueiredo; Shen, Chiayi; Roulet, Jean-François

    2017-08-21

    The effect of restoration depth on the curing time of a conventional and two bulk-fill composite resins by measuring microhardness and the respective radiosity of the bottom surface of the specimen was investigated. 1-, 3- and 5-mm thick washers were filled with Surefil SDR Flow-U (SDR), Tetric EvoCeram Bulk Fill-IVA (TEC) or Esthet-X HD-B1 (EHD), and cured with Bluephase® G2 for 40s. Additional 1-mm washers were filled with SDR, TEC or EHD, placed above the light sensor of MARC®, stacked with pre-cured 1-, 3- or 5-mm washer of respective material, and cured for 2.5~60s to mimic 2-, 4- and 6-mm thick composite curing. The sensor measured the radiosity (EB) at the bottom of specimen stacks. Vickers hardness (VH) was measured immediately at 5 locations with triplicate specimens. Nonlinear regression of VH vs EB by VH=α[1-exp(-EB/β)] with all thickness shows that the values of α, maximum hardness, are 21.6±1.0 kg/mm2 for SDR, 38.3±0.6 kg/mm2 for TEC and 45.3±2.6 kg/mm2 for EHD, and the values of β, rate parameter, are 0.40±0.06 J/cm2 for SDR, 0.77±0.04 J/cm2 for TEC and 0.58±0.09 J/cm2 for EHD. The radiosity of the bottom surface was calculated when the bottom surface of each material attained 80% of α of each material. The curing times for each material are in agreement with manufacturer recommendation for thickness. It is possible to estimate time needed to cure composite resin of known depth adequately by the radiosity and microhardness of the bottom surface.
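
    A minimal sketch of the nonlinear regression VH = α[1 - exp(-EB/β)] used above is given below with scipy; the synthetic data are only shaped roughly like the reported SDR values and are for illustration.

```python
# Minimal sketch: fitting VH = alpha * (1 - exp(-EB/beta)) with scipy,
# on synthetic data roughly shaped like the SDR values reported above.
import numpy as np
from scipy.optimize import curve_fit

def hardness(eb, alpha, beta):
    return alpha * (1.0 - np.exp(-eb / beta))

rng = np.random.default_rng(2)
eb = np.linspace(0.05, 3.0, 25)                                  # radiant exposure at the bottom, J/cm^2
vh = hardness(eb, 21.6, 0.40) + rng.normal(0, 0.5, eb.size)      # Vickers hardness, kg/mm^2

(alpha_hat, beta_hat), cov = curve_fit(hardness, eb, vh, p0=[20.0, 0.5])
print(f"alpha = {alpha_hat:.1f} kg/mm^2, beta = {beta_hat:.2f} J/cm^2")

# Radiant exposure needed for the bottom surface to reach 80% of the maximum hardness
eb_80 = -beta_hat * np.log(1 - 0.8)
print(f"EB for 80% of alpha: {eb_80:.2f} J/cm^2")
```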

  16. Categorical regression dose-response modeling

    Science.gov (United States)

    The goal of this training is to provide participants with training on the use of the U.S. EPA’s Categorical Regression software (CatReg) and its application to risk assessment. Categorical regression fits mathematical models to toxicity data that have been assigned ord...

  17. Light curing through glass ceramics: effect of curing mode on micromechanical properties of dual-curing resin cements.

    Science.gov (United States)

    Flury, Simon; Lussi, Adrian; Hickel, Reinhard; Ilie, Nicoleta

    2014-04-01

    The aim of this study was to investigate the micromechanical properties of five dual-curing resin cements after different curing modes, including light curing through glass ceramic materials. Vickers hardness (VH) and indentation modulus (YHU) of Panavia F2.0, RelyX Unicem 2 Automix, SpeedCEM, BisCem, and BeautiCem SA were measured after 1 week of storage (37 °C, 100 % humidity). The resin cements were tested following self-curing or light curing with the second-generation light-emitting diode (LED) curing unit Elipar FreeLight 2 in Standard Mode (1,545 mW/cm²) or with the third-generation LED curing unit VALO in High Power Mode (1,869 mW/cm²) or in XtraPower Mode (3,505 mW/cm²). Light curing was performed directly or through glass ceramic discs of 1.5 or 3 mm thickness of IPS Empress CAD or IPS e.max CAD. VH and YHU were analysed with Kruskal-Wallis tests followed by pairwise Wilcoxon rank sum tests (α = 0.05). RelyX Unicem 2 Automix resulted in the highest VH and YHU, followed by BeautiCem SA, BisCem, SpeedCEM, and finally Panavia F2.0. Self-curing of RelyX Unicem 2 Automix and SpeedCEM lowered VH and YHU compared to light curing, whereas self-curing of Panavia F2.0, BisCem, and BeautiCem SA led to similar or significantly higher VH and YHU compared to light curing. Generally, direct light curing resulted in similar or lower VH and YHU compared to light curing through 1.5-mm-thick ceramic discs. Light curing through 3-mm-thick discs of IPS e.max CAD generally reduced VH and YHU for all resin cements except SpeedCEM, which was the least affected by light curing through ceramic discs. The resin cements responded heterogeneously to changes in curing mode. The applied irradiances and light curing times adequately cured the resin cements even through 1.5-mm-thick ceramic discs. When light curing resin cements through thick glass ceramic restorations, clinicians should consider prolonging the light curing times even with LED curing units providing high

  18. Modeling Chronic Dacryocystitis in Rabbits by Nasolacrimal Duct Obstruction with Self-Curing Resin

    Directory of Open Access Journals (Sweden)

    Kai Hou

    2017-01-01

    We established a chronic dacryocystitis model by injecting 0.05, 0.1, or 0.15 ml of self-curing resin via the lacrimal punctum in rabbits. Animals were randomized into four groups (n=11 animals/group). The control group received 0.15 ml normal saline. Within three months postinjection, epiphora and eye discharge were observed. On the 90th day, following lacrimal passage irrigation, CT dacryocystography was performed to detect changes in the lacrimal image, and hematoxylin and eosin staining was performed to identify pathological changes of the lacrimal sac. Three months postinjection, the rabbits in the control group and those that received 0.05 and 0.1 ml of self-curing resin failed to develop chronic dacryocystitis. However, 8/11 (72.7%) of the rabbits that received 0.15 ml of self-curing resin were symptomatic and showed complete reflux on lacrimal passage irrigation, indicating obstruction of the nasolacrimal duct. CT dacryocystography showed that the obstruction was present only in the animals with chronic dacryocystitis. Pathological examination of chronic dacryocystitis also revealed significant inflammatory changes, such as thickening of the mucous epithelium, irregular papillary proliferation, and submucosal fibrous deposition. Local injection of 0.15 ml of self-curing resin can induce permanent obstruction of the nasolacrimal duct in rabbits and establish a model of chronic dacryocystitis.

  19. Mixed-effects regression models in linguistics

    CERN Document Server

    Heylen, Kris; Geeraerts, Dirk

    2018-01-01

    When data consist of grouped observations or clusters, and there is a risk that measurements within the same group are not independent, group-specific random effects can be added to a regression model in order to account for such within-group associations. Regression models that contain such group-specific random effects are called mixed-effects regression models, or simply mixed models. Mixed models are a versatile tool that can handle both balanced and unbalanced datasets and that can also be applied when several layers of grouping are present in the data; these layers can either be nested or crossed.  In linguistics, as in many other fields, the use of mixed models has gained ground rapidly over the last decade. This methodological evolution enables us to build more sophisticated and arguably more realistic models, but, due to its technical complexity, also introduces new challenges. This volume brings together a number of promising new evolutions in the use of mixed models in linguistics, but also addres...
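
    A minimal sketch of a mixed-effects regression with group-specific random intercepts is given below using statsmodels; the grouping variable and data are invented for illustration.

```python
# Minimal sketch: mixed-effects regression with a by-group (e.g. by-speaker) random
# intercept, using statsmodels. The data and variable names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_groups, n_per = 30, 20
group = np.repeat(np.arange(n_groups), n_per)
group_effect = rng.normal(0, 0.8, n_groups)[group]      # within-group association
x = rng.normal(size=n_groups * n_per)
y = 1.0 + 0.5 * x + group_effect + rng.normal(0, 1.0, n_groups * n_per)

data = pd.DataFrame({"y": y, "x": x, "speaker": group})

# Random intercept for each speaker; re_formula="~x" would add a random slope as well.
model = smf.mixedlm("y ~ x", data, groups=data["speaker"])
result = model.fit()
print(result.summary())
```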

  20. Moderation analysis using a two-level regression model.

    Science.gov (United States)

    Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott

    2014-10-01

    Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
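
    For reference, the baseline MMR model described above (the criterion regressed on the predictor, the moderator and their product, estimated by least squares) can be written in a few lines; the sketch below does not implement the proposed two-level NML estimator, and the data are invented.

```python
# Minimal sketch of moderated multiple regression (MMR) by least squares:
# the criterion is regressed on the predictor, the moderator and their product term.
# This illustrates the baseline model only, not the two-level NML estimator proposed above.
import numpy as np

rng = np.random.default_rng(4)
n = 400
x = rng.normal(size=n)              # predictor (e.g. years of education)
m = rng.normal(size=n)              # moderator (e.g. a group characteristic)
y = 1.0 + 0.4 * x + 0.2 * m + 0.3 * x * m + rng.normal(size=n)

X = np.column_stack([np.ones(n), x, m, x * m])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept, x, m, x*m:", np.round(coefs, 3))
# A non-zero coefficient on the x*m product term indicates a moderation effect.
```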

  1. Variable selection and model choice in geoadditive regression models.

    Science.gov (United States)

    Kneib, Thomas; Hothorn, Torsten; Tutz, Gerhard

    2009-06-01

    Model choice and variable selection are issues of major concern in practical regression analyses, arising in many biometric applications such as habitat suitability analyses, where the aim is to identify the influence of potentially many environmental conditions on certain species. We describe regression models for breeding bird communities that facilitate both model choice and variable selection, by a boosting algorithm that works within a class of geoadditive regression models comprising spatial effects, nonparametric effects of continuous covariates, interaction surfaces, and varying coefficients. The major modeling components are penalized splines and their bivariate tensor product extensions. All smooth model terms are represented as the sum of a parametric component and a smooth component with one degree of freedom to obtain a fair comparison between the model terms. A generic representation of the geoadditive model allows us to devise a general boosting algorithm that automatically performs model choice and variable selection.

  2. The MIDAS Touch: Mixed Data Sampling Regression Models

    OpenAIRE

    Ghysels, Eric; Santa-Clara, Pedro; Valkanov, Rossen

    2004-01-01

    We introduce Mixed Data Sampling (henceforth MIDAS) regression models. The regressions involve time series data sampled at different frequencies. Technically speaking MIDAS models specify conditional expectations as a distributed lag of regressors recorded at some higher sampling frequencies. We examine the asymptotic properties of MIDAS regression estimation and compare it with traditional distributed lag models. MIDAS regressions have wide applicability in macroeconomics and finance.

  3. Introduction to the use of regression models in epidemiology.

    Science.gov (United States)

    Bender, Ralf

    2009-01-01

    Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed in dependence on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrating examples from cancer research.
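
    As an illustration of one of the models listed above, the following sketch fits a logistic regression for a binary outcome with adjustment for a confounder and reports adjusted odds ratios; the data are synthetic and the variable names are invented.

```python
# Minimal sketch: logistic regression for a binary outcome (e.g. disease yes/no)
# with an exposure and a confounder, reporting adjusted odds ratios. Data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 1000
age = rng.normal(50, 10, n)                       # potential confounder
exposed = rng.binomial(1, 0.4, n)                 # exposure of interest
logit = -6.0 + 0.08 * age + 0.7 * exposed
disease = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([exposed, age]))
fit = sm.Logit(disease, X).fit(disp=False)
odds_ratios = np.exp(fit.params[1:])              # adjusted ORs for exposure and age
print("adjusted odds ratios (exposure, age):", np.round(odds_ratios, 2))
```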

  4. Real estate value prediction using multivariate regression models

    Science.gov (United States)

    Manjula, R.; Jain, Shubham; Srivastava, Sharad; Rajiv Kher, Pranav

    2017-11-01

    The real estate market is one of the most competitive in terms of pricing, and prices tend to vary significantly based on many factors; hence it is one of the prime fields in which to apply machine learning concepts to optimize and predict prices with high accuracy. Therefore, in this paper we present various important features to use while predicting housing prices with good accuracy. We describe regression models using various features to achieve a lower residual sum of squares error. When using features in a regression model, some feature engineering is required for better prediction. Often a set of features (multiple regression) or polynomial regression (applying various powers of the features) is used to obtain a better model fit. Since these models are expected to be susceptible to overfitting, ridge regression is used to reduce it. This paper thus points to the best application of regression models, in addition to other techniques, to optimize the result.
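
    A minimal sketch of the combination described above (polynomial features with ridge regression to curb overfitting) is given below with scikit-learn; the features and prices are invented for illustration.

```python
# Minimal sketch: polynomial features plus ridge regression to curb overfitting
# when predicting prices. Features and values are invented for illustration.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 300
X = np.column_stack([rng.uniform(40, 200, n),      # living area
                     rng.integers(1, 6, n),        # number of rooms
                     rng.uniform(0, 30, n)])       # distance to city centre
price = 50 + 2.5 * X[:, 0] + 15 * X[:, 1] - 3 * X[:, 2] + rng.normal(0, 20, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
model = make_pipeline(PolynomialFeatures(degree=2), StandardScaler(), Ridge(alpha=1.0))
model.fit(X_tr, y_tr)
print("test R^2:", round(model.score(X_te, y_te), 3))
```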

  5. CURING EFFICIENCY OF DUAL-CURE RESIN CEMENT UNDER ZIRCONIA WITH TWO DIFFERENT LIGHT CURING UNITS

    Directory of Open Access Journals (Sweden)

    Pınar GÜLTEKİN

    2015-04-01

    Purpose: Adequate polymerization is a crucial factor in obtaining optimal physical properties and satisfying clinical performance from composite resin materials. The aim of this study was to evaluate the polymerization efficiency of a dual-cure resin cement cured with two different light curing units under zirconia structures of differing thicknesses. Materials and Methods: 4 zirconia framework discs of 4 mm diameter and 0.5 mm, 1 mm and 1.5 mm thickness were prepared using a computer-aided design system. One of the 0.5 mm-thick substructures was left mono-layered whereas the others were layered with feldspathic porcelain of the same thickness, and ceramic samples of 4 different thicknesses (0.5, 1, 1.5 and 2.0 mm) were prepared. For each group (n=12), resin cement was light cured in polytetrafluoroethylene molds using a Light Emitting Diode (LED) or Quartz-Tungsten Halogen (QTH) light curing unit under each of the 4 zirconia-based discs (n=96). The depth of cure (in mm) and the Vickers Hardness Number values (VHN) were evaluated for each specimen. Results: The use of the LED curing unit produced a greater depth of cure compared to QTH under ceramic discs of 0.5 and 1 mm thickness (p<0.05). At 100 μm and 300 μm depth, the LED unit produced significantly greater VHN values compared to the QTH unit (p<0.05). At 500 μm depth, the difference between the VHN values of the LED and QTH groups was not statistically significant. Conclusion: Light curing may not result in adequate resin cement polymerization under thick zirconia structures. LED light sources should be preferred over QTH for curing dual-cure resin cements, especially for those under thicker zirconia restorations.

  6. Systematic review, meta-analysis, and meta-regression: Successful second-line treatment for Helicobacter pylori.

    Science.gov (United States)

    Muñoz, Neus; Sánchez-Delgado, Jordi; Baylina, Mireia; Puig, Ignasi; López-Góngora, Sheila; Suarez, David; Calvet, Xavier

    2018-06-01

    Multiple Helicobacter pylori second-line schedules have been described as potentially useful. It remains unclear, however, which are the best combinations and which features of second-line treatments are related to better cure rates. The aim of this study was to determine which second-line treatments achieve excellent (>90%) cure rates by performing a systematic review and, when possible, a meta-analysis. A meta-regression was planned to determine the characteristics of treatments achieving excellent cure rates. A systematic review of studies evaluating second-line Helicobacter pylori treatment was carried out in multiple databases. A formal meta-analysis was performed when an adequate number of comparative studies was found, using RevMan5.3. A meta-regression for evaluating factors predicting cure rates >90% was performed using Stata Statistical Software. The systematic review identified 115 eligible studies, including 203 evaluable treatment arms. The results were extremely heterogeneous, with 61 treatment arms (30%) achieving optimal (>90%) cure rates. The meta-analysis favored quadruple therapies over triple (83.2% vs 76.1%, OR: 0.59, 95% CI: 0.38-0.93; P = .02) and 14-day quadruple treatments over 7-day treatments (91.2% vs 81.5%, OR: 0.42, 95% CI: 0.24-0.73; P = .002), although the differences were significant only in the per-protocol analysis. The meta-regression did not find any particular characteristics of the studies to be associated with excellent cure rates. Second-line Helicobacter pylori treatments achieving >90% cure rates are extremely heterogeneous. Quadruple therapy and 14-day treatments seem better than triple therapies and 7-day ones. No single characteristic of the treatments was related to excellent cure rates. Future approaches suitable for infectious diseases, which take antibiotic resistance into account, are needed to design rescue treatments that consistently achieve excellent cure rates. © 2018 John Wiley & Sons Ltd.

  7. Model-based Quantile Regression for Discrete Data

    KAUST Repository

    Padellini, Tullia

    2018-04-10

    Quantile regression is a class of methods devoted to the modelling of conditional quantiles. In a Bayesian framework quantile regression has typically been carried out exploiting the Asymmetric Laplace Distribution as a working likelihood. Despite the fact that this leads to a proper posterior for the regression coefficients, the resulting posterior variance is nevertheless affected by an unidentifiable parameter, hence any inferential procedure besides point estimation is unreliable. We propose a model-based approach for quantile regression that considers quantiles of the generating distribution directly, and thus allows for proper uncertainty quantification. We then create a link between quantile regression and generalised linear models by mapping the quantiles to the parameter of the response variable, and we exploit it to fit the model with R-INLA. We also extend it to the case of discrete responses, where there is no 1-to-1 relationship between quantiles and the distribution's parameter, by introducing continuous generalisations of the most common discrete variables (Poisson, Binomial and Negative Binomial) to be exploited in the fitting.

  8. Variable importance in latent variable regression models

    NARCIS (Netherlands)

    Kvalheim, O.M.; Arneberg, R.; Bleie, O.; Rajalahti, T.; Smilde, A.K.; Westerhuis, J.A.

    2014-01-01

    The quality and practical usefulness of a regression model are a function of both interpretability and prediction performance. This work presents some new graphical tools for improved interpretation of latent variable regression models that can also assist in improved algorithms for variable

  9. DSC cure kinetics of an unsaturated polyester resin using empirical kinetic model

    International Nuclear Information System (INIS)

    Abdullah, I.

    2015-01-01

    In this paper, the kinetics of curing of an unsaturated polyester resin initiated with benzoyl peroxide was studied. In the case of unsaturated polyester (UP) resin, an isothermal test alone could not correctly predict the curing time of the UP resin. Therefore, isothermal kinetic analysis through isoconversional adjustment was used to correctly predict the curing time and temperature of the UP resin. Isothermal kinetic analysis through isoconversional adjustment indicated that 97% of the UP resin cures in 33 min at 120 degree C. Curing of the UP resin by microwaves was also studied, and it was found that 67% of the UP resin cures in 1 min at 120 degree C. The crosslinking reaction of UP resin is so fast at 120 degree C that it becomes impossible to correctly predict the curing time of the UP resin using an isothermal test, and the burial of C=C bonds in microgels makes it impossible to be fully cured by microwaves at 120 degree C. The rheological behaviour of the unsaturated polyester resin was also studied to observe the change in viscosity with respect to time and temperature. (author)

  10. Spontaneous regression of metastases from malignant melanoma: a case report

    DEFF Research Database (Denmark)

    Kalialis, Louise V; Drzewiecki, Krzysztof T; Mohammadi, Mahin

    2008-01-01

    A case of a 61-year-old male with widespread metastatic melanoma is presented 5 years after complete spontaneous cure. Spontaneous regression occurred in cutaneous, pulmonary, hepatic and cerebral metastases. A review of the literature reveals seven cases of regression of cerebral metastases; thi...

  11. A simple approach to power and sample size calculations in logistic regression and Cox regression models.

    Science.gov (United States)

    Vaeth, Michael; Skovlund, Eva

    2004-06-15

    For a given regression problem it is possible to identify a suitably defined equivalent two-sample problem such that the power or sample size obtained for the two-sample problem also applies to the regression problem. For a standard linear regression model the equivalent two-sample problem is easily identified, but for generalized linear models and for Cox regression models the situation is more complicated. An approximately equivalent two-sample problem may, however, also be identified here. In particular, we show that for logistic regression and Cox regression models the equivalent two-sample problem is obtained by selecting two equally sized samples for which the parameters differ by a value equal to the slope times twice the standard deviation of the independent variable and further requiring that the overall expected number of events is unchanged. In a simulation study we examine the validity of this approach to power calculations in logistic regression and Cox regression models. Several different covariate distributions are considered for selected values of the overall response probability and a range of alternatives. For the Cox regression model we consider both constant and non-constant hazard rates. The results show that in general the approach is remarkably accurate even in relatively small samples. Some discrepancies are, however, found in small samples with few events and a highly skewed covariate distribution. Comparison with results based on alternative methods for logistic regression models with a single continuous covariate indicates that the proposed method is at least as good as its competitors. The method is easy to implement and therefore provides a simple way to extend the range of problems that can be covered by the usual formulas for power and sample size determination. Copyright 2004 John Wiley & Sons, Ltd.
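
    The sketch below is one reading of the recipe described above for logistic regression with a single continuous covariate: two equal-sized groups whose log-odds differ by the slope times twice the SD of the covariate, with the overall event probability held fixed, followed by a standard two-proportion power approximation. It is illustrative only, not the authors' code.

```python
# Minimal sketch of an "equivalent two-sample" power calculation for logistic regression
# with one continuous covariate, following the recipe described above (assumed reading):
# two equal-sized groups whose log-odds differ by slope * 2 * SD(x), with the overall
# event probability held fixed, then a normal-approximation two-proportion power formula.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def power_logistic(beta, sd_x, p_overall, n_total, alpha=0.05):
    delta = beta * 2.0 * sd_x                              # log-odds difference between groups
    logit = lambda p: np.log(p / (1 - p))
    expit = lambda z: 1 / (1 + np.exp(-z))

    # Choose p1 so the two groups differ by delta on the logit scale while the
    # overall event probability (expected number of events) stays at p_overall.
    p1 = brentq(lambda p: 0.5 * (p + expit(logit(p) + delta)) - p_overall, 1e-9, 1 - 1e-9)
    p2 = expit(logit(p1) + delta)

    n = n_total / 2.0                                      # per-group size
    se = np.sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)
    z_alpha = norm.ppf(1 - alpha / 2)
    return norm.cdf(abs(p2 - p1) / se - z_alpha)

# Example: slope 0.5 per unit of x, SD(x) = 1, 30% overall event rate, 200 subjects.
print(round(power_logistic(beta=0.5, sd_x=1.0, p_overall=0.3, n_total=200), 3))
```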

  12. Pre-cure freezing affects proteolysis in dry-cured hams.

    Science.gov (United States)

    Bañón, S; Cayuela, J M; Granados, M V; Garrido, M D

    1999-01-01

    Several parameters (sodium chloride, moisture, intramuscular fat, total nitrogen, non-protein nitrogen, white precipitates, free tyrosine, L* a* b* values and acceptability) related to proteolysis during curing were compared in dry-cured hams manufactured from refrigerated and frozen/thawed raw material. Pre-cure freezing significantly (p<0.05) increased the proteolysis levels in the cured meat, although it does not significantly affect the sensory quality of the dry-cured ham.

  13. Regression modeling methods, theory, and computation with SAS

    CERN Document Server

    Panik, Michael

    2009-01-01

    Regression Modeling: Methods, Theory, and Computation with SAS provides an introduction to a diverse assortment of regression techniques using SAS to solve a wide variety of regression problems. The author fully documents the SAS programs and thoroughly explains the output produced by the programs.The text presents the popular ordinary least squares (OLS) approach before introducing many alternative regression methods. It covers nonparametric regression, logistic regression (including Poisson regression), Bayesian regression, robust regression, fuzzy regression, random coefficients regression,

  14. Regression Models For Multivariate Count Data.

    Science.gov (United States)

    Zhang, Yiwen; Zhou, Hua; Zhou, Jin; Sun, Wei

    2017-01-01

    Data with multivariate count responses frequently occur in modern applications. The commonly used multinomial-logit model is limiting due to its restrictive mean-variance structure. For instance, analyzing count data from the recent RNA-seq technology by the multinomial-logit model leads to serious errors in hypothesis testing. The ubiquity of over-dispersion and complicated correlation structures among multivariate counts calls for more flexible regression models. In this article, we study some generalized linear models that incorporate various correlation structures among the counts. Current literature lacks a treatment of these models, partly due to the fact that they do not belong to the natural exponential family. We study the estimation, testing, and variable selection for these models in a unifying framework. The regression models are compared on both synthetic and real RNA-seq data.

  15. Cure of skin cancer. Surgical cure of skin cancer

    International Nuclear Information System (INIS)

    Zikiryakhodjaev, D.Z.; Sanginov, D.R.

    2001-01-01

    In this chapter the authors studied the cure of skin cancer, in particular the surgical cure of skin cancer. They noted that surgical treatment remains one of the primary and most important methods in the treatment of skin cancer.

  16. Nonparametric Mixture of Regression Models.

    Science.gov (United States)

    Huang, Mian; Li, Runze; Wang, Shaoli

    2013-07-01

    Motivated by an analysis of US house price index data, we propose nonparametric finite mixture of regression models. We study the identifiability issue of the proposed models, and develop an estimation procedure by employing kernel regression. We further systematically study the sampling properties of the proposed estimators, and establish their asymptotic normality. A modified EM algorithm is proposed to carry out the estimation procedure. We show that our algorithm preserves the ascent property of the EM algorithm in an asymptotic sense. Monte Carlo simulations are conducted to examine the finite sample performance of the proposed estimation procedure. An empirical analysis of the US house price index data is illustrated for the proposed methodology.

  17. Cure Cycle Optimization of Rapidly Cured Out-Of-Autoclave Composites

    Science.gov (United States)

    Dong, Anqi; Zhao, Yan; Zhao, Xinqing; Yu, Qiyong

    2018-01-01

    Out-of-autoclave prepreg typically needs a long cure cycle to guarantee good properties as a result of the low processing pressure applied. It is essential to reduce the manufacturing time, achieve real cost reduction, and take full advantage of the out-of-autoclave process. The focus of this paper is to reduce the cure cycle time and production cost while maintaining high laminate quality. A rapidly cured out-of-autoclave resin and the corresponding prepreg were independently developed. To determine a suitable rapid cure procedure for the developed prepreg, the effects of heating rate, initial cure temperature, dwelling time, and post-cure time on the final laminate quality were evaluated and the factors were then optimized. As a result, a rapid cure procedure was determined. The results showed that resin infiltration could be completed by the end of the initial cure stage and no obvious voids could be seen in the laminate at this time. The laminate could achieve good internal quality using the optimized cure procedure. The mechanical test results showed that the laminates had a fiber volume fraction of 59–60% with a final glass transition temperature of 205 °C and excellent mechanical strength, especially the flexural properties. PMID:29534048

  18. Self-curing concrete with different self-curing agents

    Science.gov (United States)

    Gopala krishna sastry, K. V. S.; manoj kumar, Putturu

    2018-03-01

    Concrete is recognised globally as a versatile construction material. The properties of concrete depend, to a great extent, on the hydration of cement and the microstructure of the hydrated cement. A congenial atmosphere aids the hydration of cement, and hence curing of concrete becomes essential until a major portion of the hydration process is completed. But in areas of water scarcity and in concreting works at considerable heights, curing is problematic. The self-curing or internal curing technique overcomes these problems. It supplies additional moisture for more than sufficient hydration of the cement and diminishes self-desiccation. Self-curing agents substantially help in the conservation of water in concrete by bringing down evaporation during the hydration of concrete. The present study focuses on the impact of self-curing agents such as polyethylene glycol (PEG), polyvinyl alcohol (PVA) and super absorbent polymer (SAP) on a concrete mix of M25 grade (reference mix). The effect of these agents on strength properties of concrete, such as compressive strength, split tensile strength and flexural strength, was observed on a comparative basis, which revealed that PEG 4000 was the most effective among all the agents.

  19. Gaussian Process Regression Model in Spatial Logistic Regression

    Science.gov (United States)

    Sofro, A.; Oktaviarina, A.

    2018-01-01

    Spatial analysis has developed very quickly in the last decade. One of the favorite approaches is based on the neighbourhood of the region. Unfortunately, there are some limitations such as difficulty in prediction. Therefore, we offer Gaussian process regression (GPR) to accommodate the issue. In this paper, we will focus on spatial modeling with GPR for binomial data with logit link function. The performance of the model will be investigated. We will discuss the inference of how to estimate the parameters and hyper-parameters and to predict as well. Furthermore, simulation studies will be explained in the last section.

  20. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five years of data (1998-2002) of average humidity, rainfall, and maximum and minimum temperatures, respectively. Relationships for regression analysis time series (RATS) were developed for determining the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). The correlations for multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five years of rainfall and humidity data, respectively, which showed that the variances in the rainfall data were not homogeneous while those in humidity were homogeneous. Our results on regression and regression analysis time series show the best fit for prediction modeling on climatic data of Quetta, Pakistan. (author)

  1. Robust mislabel logistic regression without modeling mislabel probabilities.

    Science.gov (United States)

    Hung, Hung; Jou, Zhi-Yu; Huang, Su-Yun

    2018-03-01

    Logistic regression is among the most widely used statistical methods for linear discriminant analysis. In many applications, we only observe possibly mislabeled responses. Fitting a conventional logistic regression can then lead to biased estimation. One common resolution is to fit a mislabel logistic regression model, which takes into consideration of mislabeled responses. Another common method is to adopt a robust M-estimation by down-weighting suspected instances. In this work, we propose a new robust mislabel logistic regression based on γ-divergence. Our proposal possesses two advantageous features: (1) It does not need to model the mislabel probabilities. (2) The minimum γ-divergence estimation leads to a weighted estimating equation without the need to include any bias correction term, that is, it is automatically bias-corrected. These features make the proposed γ-logistic regression more robust in model fitting and more intuitive for model interpretation through a simple weighting scheme. Our method is also easy to implement, and two types of algorithms are included. Simulation studies and the Pima data application are presented to demonstrate the performance of γ-logistic regression. © 2017, The International Biometric Society.

  2. Mixed Frequency Data Sampling Regression Models: The R Package midasr

    Directory of Open Access Journals (Sweden)

    Eric Ghysels

    2016-08-01

    When modeling economic relationships it is increasingly common to encounter data sampled at different frequencies. We introduce the R package midasr which enables estimating regression models with variables sampled at different frequencies within a MIDAS regression framework put forward in work by Ghysels, Santa-Clara, and Valkanov (2002). In this article we define a general autoregressive MIDAS regression model with multiple variables of different frequencies and show how it can be specified using the familiar R formula interface and estimated using various optimization methods chosen by the researcher. We discuss how to check the validity of the estimated model both in terms of numerical convergence and statistical adequacy of a chosen regression specification, how to perform model selection based on an information criterion, how to assess the forecasting accuracy of the MIDAS regression model and how to obtain a forecast aggregation of different MIDAS regression models. We illustrate the capabilities of the package with a simulated MIDAS regression model and give two empirical examples of the application of MIDAS regression.

  3. Impact of multicollinearity on small sample hydrologic regression models

    Science.gov (United States)

    Kroll, Charles N.; Song, Peter

    2013-06-01

    Often hydrologic regression models are developed with ordinary least squares (OLS) procedures. The use of OLS with highly correlated explanatory variables produces multicollinearity, which creates highly sensitive parameter estimators with inflated variances and improper model selection. It is not clear how to best address multicollinearity in hydrologic regression models. Here a Monte Carlo simulation is developed to compare four techniques to address multicollinearity: OLS, OLS with variance inflation factor screening (VIF), principal component regression (PCR), and partial least squares regression (PLS). The performance of these four techniques was observed for varying sample sizes, correlation coefficients between the explanatory variables, and model error variances consistent with hydrologic regional regression models. The negative effects of multicollinearity are magnified at smaller sample sizes, higher correlations between the variables, and larger model error variances (smaller R2). The Monte Carlo simulation indicates that if the true model is known, multicollinearity is present, and the estimation and statistical testing of regression parameters are of interest, then PCR or PLS should be employed. If the model is unknown, or if the interest is solely on model predictions, is it recommended that OLS be employed since using more complicated techniques did not produce any improvement in model performance. A leave-one-out cross-validation case study was also performed using low-streamflow data sets from the eastern United States. Results indicate that OLS with stepwise selection generally produces models across study regions with varying levels of multicollinearity that are as good as biased regression techniques such as PCR and PLS.
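
    A minimal sketch of the comparison described above (OLS versus PCR and PLS on collinear predictors) is given below with scikit-learn; the synthetic data only mimic the multicollinearity setting, not the hydrologic data sets.

```python
# Minimal sketch comparing OLS, principal component regression (PCR) and partial
# least squares (PLS) on collinear predictors. Data are synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 60                                           # small sample, as in regional regression
z = rng.normal(size=n)
X = np.column_stack([z + rng.normal(0, 0.1, n),  # three highly correlated explanatory variables
                     z + rng.normal(0, 0.1, n),
                     z + rng.normal(0, 0.1, n)])
y = 2.0 * z + rng.normal(0, 1.0, n)

models = {
    "OLS": LinearRegression(),
    "PCR": make_pipeline(PCA(n_components=1), LinearRegression()),
    "PLS": PLSRegression(n_components=1),
}
for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {score:.3f}")
```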

  4. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculusRegression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  5. Testing homogeneity in Weibull-regression models.

    Science.gov (United States)

    Bolfarine, Heleno; Valença, Dione M

    2005-10-01

    In survival studies with families or geographical units it may be of interest testing whether such groups are homogeneous for given explanatory variables. In this paper we consider score type tests for group homogeneity based on a mixing model in which the group effect is modelled as a random variable. As opposed to hazard-based frailty models, this model presents survival times that conditioned on the random effect, has an accelerated failure time representation. The test statistics requires only estimation of the conventional regression model without the random effect and does not require specifying the distribution of the random effect. The tests are derived for a Weibull regression model and in the uncensored situation, a closed form is obtained for the test statistic. A simulation study is used for comparing the power of the tests. The proposed tests are applied to real data sets with censored data.

  6. Model-based Quantile Regression for Discrete Data

    KAUST Repository

    Padellini, Tullia; Rue, Haavard

    2018-01-01

    Quantile regression is a class of methods voted to the modelling of conditional quantiles. In a Bayesian framework quantile regression has typically been carried out exploiting the Asymmetric Laplace Distribution as a working likelihood. Despite

  7. Detection of epistatic effects with logic regression and a classical linear regression model.

    Science.gov (United States)

    Malina, Magdalena; Ickstadt, Katja; Schwender, Holger; Posch, Martin; Bogdan, Małgorzata

    2014-02-01

    To locate multiple interacting quantitative trait loci (QTL) influencing a trait of interest within experimental populations, methods such as Cockerham's model are usually applied. Within this framework, interactions are understood as the part of the joint effect of several genes which cannot be explained as the sum of their additive effects. However, if a change in the phenotype (such as disease) is caused by Boolean combinations of genotypes of several QTLs, this Cockerham's approach is often not capable of identifying them properly. To detect such interactions more efficiently, we propose a logic regression framework. Even though a larger number of models has to be considered with the logic regression approach (requiring more stringent multiple testing correction), the efficient representation of higher order logic interactions in logic regression models leads to a significant increase of power to detect such interactions as compared to a Cockerham's approach. The increase in power is demonstrated analytically for a simple two-way interaction model and illustrated in more complex settings with a simulation study and real data analysis.

  8. An advanced BLT-humanized mouse model for extended HIV-1 cure studies.

    Science.gov (United States)

    Lavender, Kerry J; Pace, Craig; Sutter, Kathrin; Messer, Ronald J; Pouncey, Dakota L; Cummins, Nathan W; Natesampillai, Sekar; Zheng, Jim; Goldsmith, Joshua; Widera, Marek; Van Dis, Erik S; Phillips, Katie; Race, Brent; Dittmer, Ulf; Kukolj, George; Hasenkrug, Kim J

    2018-01-02

    Although bone marrow, liver, thymus (BLT)-humanized mice provide a robust model for HIV-1 infection and enable evaluation of cure strategies dependent on endogenous immune responses, most mice develop graft versus host disease (GVHD), limiting their utility for extended HIV cure studies. This study aimed to (1) evaluate the GVHD-resistant C57 black 6 (C57BL/6) recombination activating gene 2 (Rag2)γcCD47 triple knockout (TKO)-BLT mouse as a model in which to establish HIV-1 latency, (2) determine whether TKO-BLT mice could be maintained on antiretroviral therapy (ART) for extended periods of time, and (3) assess the rapidity of viral rebound following therapy interruption. TKO-BLT mice were HIV-1 infected, treated with various ART regimens over extended periods of time and assayed for viral rebound following therapy interruption. Daily subcutaneous injection and oral ART-mediated suppression of HIV-1 infection was tested at various doses in TKO-BLT mice. Mice were monitored for suppression of viremia and cellular HIV-1 RNA and DNA prior to and following therapy interruption. Mice remained healthy for 45 weeks posthumanization and could be treated with ART for up to 18 weeks. Viremia was suppressed to less than 200 copies/ml in the majority of mice with significant reductions in cellular HIV-1 RNA and DNA. Treatment interruption resulted in rapid viral recrudescence. HIV-1 latency can be maintained in TKO-BLT mice over extended periods on ART and rapid viral rebound occurs following therapy removal. The additional 15-18 weeks of healthy longevity compared with other BLT models provides sufficient time to examine the decay kinetics of the latent reservoir as well as observe delays in recrudescence in HIV-1 cure studies.

  9. AN APPLICATION OF FUNCTIONAL MULTIVARIATE REGRESSION MODEL TO MULTICLASS CLASSIFICATION

    OpenAIRE

    Krzyśko, Mirosław; Smaga, Łukasz

    2017-01-01

    In this paper, the scale response functional multivariate regression model is considered. By using the basis functions representation of functional predictors and regression coefficients, this model is rewritten as a multivariate regression model. This representation of the functional multivariate regression model is used for multiclass classification for multivariate functional data. Computational experiments performed on real labelled data sets demonstrate the effectiveness of the proposed ...

  10. Parameters Estimation of Geographically Weighted Ordinal Logistic Regression (GWOLR) Model

    Science.gov (United States)

    Zuhdi, Shaifudin; Retno Sari Saputro, Dewi; Widyaningsih, Purnami

    2017-06-01

    A regression model represents the relationship between independent variables and a dependent variable. In logistic regression, the dependent variable is categorical and the model is expressed in terms of odds. When the dependent variable has ordered levels, the logistic regression model is ordinal. The GWOLR model is an ordinal logistic regression model influenced by the geographical location of the observation site. Parameter estimation in the model is needed to determine values for the population based on a sample. The purpose of this research is parameter estimation of the GWOLR model using R software. The parameter estimation uses data on the number of dengue fever patients in Semarang City. The observation units are 144 villages in Semarang City. The results of the research give a local GWOLR model for each village and the probability of each category of the number of dengue fever patients.

  11. Alternative regression models to assess increase in childhood BMI.

    Science.gov (United States)

    Beyerlein, Andreas; Fahrmeir, Ludwig; Mansmann, Ulrich; Toschke, André M

    2008-09-08

    Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Different regression approaches to predict childhood BMI by goodness-of-fit measures and means of interpretation were compared including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 children participating in the school entry health examination in Bavaria, Germany, from 2001 to 2002. TV watching, meal frequency, breastfeeding, smoking in pregnancy, maternal obesity, parental social class and weight gain in the first 2 years of life were considered as risk factors for obesity. GAMLSS showed a much better fit regarding the estimation of risk factors effects on transformed and untransformed BMI data than common GLMs with respect to the generalized Akaike information criterion. In comparison with GAMLSS, quantile regression allowed for additional interpretation of prespecified distribution quantiles, such as quantiles referring to overweight or obesity. The variables TV watching, maternal BMI and weight gain in the first 2 years were directly, and meal frequency was inversely significantly associated with body composition in any model type examined. In contrast, smoking in pregnancy was not directly, and breastfeeding and parental social class were not inversely significantly associated with body composition in GLM models, but in GAMLSS and partly in quantile regression models. Risk factor specific BMI percentile curves could be estimated from GAMLSS and quantile regression models. GAMLSS and quantile regression seem to be more appropriate than common GLMs for risk factor modeling of BMI data.
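
    A minimal sketch contrasting an ordinary least-squares fit of the mean with a quantile regression fit of an upper quantile is given below with statsmodels; the BMI-like data and effect sizes are invented, and GAMLSS itself is not implemented here.

```python
# Minimal sketch: quantile regression for an upper BMI quantile (here the 90th percentile)
# versus ordinary least squares for the mean. Data and effect sizes are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 1000
tv_hours = rng.uniform(0, 4, n)                           # daily TV watching
# Skewed outcome: TV watching shifts the upper tail more than the centre.
bmi = 15.5 + 0.3 * tv_hours + rng.gamma(shape=2.0, scale=0.6 + 0.2 * tv_hours)

X = sm.add_constant(tv_hours)
ols_fit = sm.OLS(bmi, X).fit()
q90_fit = sm.QuantReg(bmi, X).fit(q=0.9)
print("OLS slope (mean BMI):         ", round(ols_fit.params[1], 3))
print("Quantile slope (90th pct BMI):", round(q90_fit.params[1], 3))
```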

  12. Alternative regression models to assess increase in childhood BMI

    OpenAIRE

    Beyerlein, Andreas; Fahrmeir, Ludwig; Mansmann, Ulrich; Toschke, André M

    2008-01-01

    Abstract Background Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Methods Different regression approaches to predict childhood BMI by goodness-of-fit measures and means of interpretation were compared including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 childre...

  13. Thermal Efficiency Degradation Diagnosis Method Using Regression Model

    International Nuclear Information System (INIS)

    Jee, Chang Hyun; Heo, Gyun Young; Jang, Seok Won; Lee, In Cheol

    2011-01-01

    This paper proposes an idea for thermal efficiency degradation diagnosis in turbine cycles, which is based on turbine cycle simulation under abnormal conditions and a linear regression model. The correlation between the inputs representing degradation conditions (normally unmeasured but intrinsic states) and the simulation outputs (normally measured but superficial states) was analyzed with the linear regression model. The regression model can then be inverted to estimate the intrinsic state associated with a superficial state observed from a power plant. The diagnosis method proposed herein is divided into three processes: 1) simulations of degradation conditions to obtain the measured states (referred to as the what-if method), 2) development of the linear model correlating intrinsic and superficial states, and 3) determination of an intrinsic state using the superficial states of the current plant and the linear regression model (referred to as the inverse what-if method). The what-if method generates the outputs for inputs covering various root causes and/or boundary conditions, whereas the inverse what-if method calculates the inverse matrix for the given superficial states, that is, the component degradation modes. The method suggested in this paper was validated using the turbine cycle model for an operating power plant.
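
    A minimal numerical sketch of the what-if / inverse what-if idea: fit a linear map from intrinsic degradation states to measured (superficial) states on simulated runs, then invert it with a pseudo-inverse to recover an intrinsic state from observations. All matrices and dimensions below are illustrative assumptions, not the paper's turbine-cycle simulator.

        import numpy as np

        # "What-if" stage (assumed available): simulated superficial states Y for sampled
        # intrinsic degradation states X, e.g. produced by a turbine-cycle simulator.
        rng = np.random.default_rng(0)
        n_runs, n_intrinsic, n_measured = 200, 3, 8
        X = rng.uniform(0.0, 1.0, size=(n_runs, n_intrinsic))      # degradation magnitudes
        A_true = rng.normal(size=(n_intrinsic, n_measured))        # unknown plant response
        Y = X @ A_true + rng.normal(scale=0.01, size=(n_runs, n_measured))

        # Fit the linear regression Y ~ X A (one column of A per measured state).
        A_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

        # "Inverse what-if" stage: given superficial states observed at the plant,
        # recover the intrinsic degradation state via the pseudo-inverse of A_hat.
        y_obs = np.array([0.3, 0.1, 0.6]) @ A_true                 # stand-in for plant data
        x_est = y_obs @ np.linalg.pinv(A_hat)
        print("estimated degradation state:", np.round(x_est, 3))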

  14. EVALUATION OF DIELECTRIC CURING MONITORING INVESTIGATING LIGHT-CURING DENTAL FILLING COMPOSITES

    Directory of Open Access Journals (Sweden)

    Johannes Steinhaus

    2011-05-01

    Full Text Available The aim of this study is the evaluation of a dielectric analysis (DEA) method for monitoring the curing behaviour of a light-curing dental filling material in real time. The evaluation seeks to extract the influence of light intensity on the photo-curing process of dental composite filling materials. The intensity change is obtained by measuring the curing process at different sample depths. It could be shown that increasing sample thickness, and therefore exponentially decreasing light intensity, causes a proportional decrease in the initial curing rate. Nevertheless, the results give rise to the assumption that lower illumination intensities over a long period cause higher overall conversion, and thus better mechanical properties. This would allow for predictions of the impact of different curing rates on the final mechanical properties.

  15. Random regression models for detection of gene by environment interaction

    Directory of Open Access Journals (Sweden)

    Meuwissen Theo HE

    2007-02-01

    Full Text Available Abstract Two random regression models, where the effect of a putative QTL was regressed on an environmental gradient, are described. The first model estimates the correlation between intercept and slope of the random regression, while the other model restricts this correlation to 1 or -1, which is expected under a bi-allelic QTL model. The random regression models were compared to a model assuming no gene by environment interactions. The comparison was done with regard to the models' ability to detect QTL, to position them accurately and to detect possible QTL by environment interactions. A simulation study based on a granddaughter design was conducted, and QTL were assumed, either by assigning an effect independent of the environment or as a linear function of a simulated environmental gradient. It was concluded that the random regression models were suitable for detection of QTL effects, in the presence and absence of interactions with environmental gradients. Fixing the correlation between intercept and slope of the random regression had a positive effect on power when the QTL effects re-ranked between environments.

  16. The application of cure models in the presence of competing risks: a tool for improved risk communication in population-based cancer patient survival.

    Science.gov (United States)

    Eloranta, Sandra; Lambert, Paul C; Andersson, Therese M-L; Björkholm, Magnus; Dickman, Paul W

    2014-09-01

    Quantifying cancer patient survival from the perspective of cure is clinically relevant. However, most cure models estimate cure assuming no competing causes of death. We use a relative survival framework to demonstrate how flexible parametric cure models can be used in combination with competing-risks theory to incorporate noncancer deaths. Under a model that incorporates statistical cure, we present the probabilities that cancer patients (1) have died from their cancer, (2) have died from other causes, (3) will eventually die from their cancer, or (4) will eventually die from other causes, all as a function of time since diagnosis. We further demonstrate how conditional probabilities can be used to update the prognosis among survivors (eg, at 1 or 5 years after diagnosis) by summarizing the proportion of patients who will not die from their cancer. The proposed method is applied to Swedish population-based data for persons diagnosed with melanoma, colon cancer, or acute myeloid leukemia between 1973 and 2007.
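
    The paper works with flexible parametric cure models in a relative survival framework; as a much simpler illustration of the underlying cure-fraction idea, the sketch below fits a two-parameter mixture cure model (a cure probability plus an exponential latency distribution) to simulated right-censored data by maximum likelihood. It is not the authors' competing-risks method.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        # Simulated right-censored data: a cured fraction never experiences the event.
        rng = np.random.default_rng(1)
        n = 2000
        cured = rng.random(n) < 0.4                      # true cure fraction 40%
        latency = rng.exponential(scale=5.0, size=n)     # event times for the uncured
        censor = rng.uniform(0, 15, size=n)
        time = np.where(cured, censor, np.minimum(latency, censor))
        event = (~cured) & (latency <= censor)

        def neg_loglik(params):
            logit_pi, log_rate = params
            pi = expit(logit_pi)                         # cure probability
            rate = np.exp(log_rate)                      # hazard of the uncured
            log_f = np.log(rate) - rate * time           # log density for observed events
            log_s = -rate * time                         # log survival of the uncured
            ll = np.where(event,
                          np.log(1 - pi) + log_f,
                          np.log(pi + (1 - pi) * np.exp(log_s)))
            return -ll.sum()

        fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
        print(f"estimated cure fraction = {expit(fit.x[0]):.3f}, "
              f"latency rate = {np.exp(fit.x[1]):.3f}")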

  17. Alternative regression models to assess increase in childhood BMI

    Directory of Open Access Journals (Sweden)

    Mansmann Ulrich

    2008-09-01

    Full Text Available Abstract Background Body mass index (BMI) data usually have skewed distributions, for which common statistical modeling approaches such as simple linear or logistic regression have limitations. Methods Different regression approaches to predict childhood BMI by goodness-of-fit measures and means of interpretation were compared including generalized linear models (GLMs), quantile regression and Generalized Additive Models for Location, Scale and Shape (GAMLSS). We analyzed data of 4967 children participating in the school entry health examination in Bavaria, Germany, from 2001 to 2002. TV watching, meal frequency, breastfeeding, smoking in pregnancy, maternal obesity, parental social class and weight gain in the first 2 years of life were considered as risk factors for obesity. Results GAMLSS showed a much better fit regarding the estimation of risk factor effects on transformed and untransformed BMI data than common GLMs with respect to the generalized Akaike information criterion. In comparison with GAMLSS, quantile regression allowed for additional interpretation of prespecified distribution quantiles, such as quantiles referring to overweight or obesity. The variables TV watching, maternal BMI and weight gain in the first 2 years were directly, and meal frequency was inversely, significantly associated with body composition in any model type examined. In contrast, smoking in pregnancy was not directly, and breastfeeding and parental social class were not inversely, significantly associated with body composition in GLM models, but in GAMLSS and partly in quantile regression models. Risk factor specific BMI percentile curves could be estimated from GAMLSS and quantile regression models. Conclusion GAMLSS and quantile regression seem to be more appropriate than common GLMs for risk factor modeling of BMI data.

  18. The microcomputer scientific software series 2: general linear model--regression.

    Science.gov (United States)

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...

  19. Wavelet regression model in forecasting crude oil price

    Science.gov (United States)

    Hamid, Mohd Helmie; Shabri, Ani

    2017-05-01

    This study presents the performance of the wavelet multiple linear regression (WMLR) technique in daily crude oil forecasting. The WMLR model was developed by integrating the discrete wavelet transform (DWT) and the multiple linear regression (MLR) model. The original time series was decomposed into sub-time series of different scales by wavelet theory. Correlation analysis was conducted to assist in the selection of optimal decomposed components as inputs for the WMLR model. The daily WTI crude oil price series was used in this study to test the prediction capability of the proposed model. The forecasting performance of the WMLR model was also compared with regular multiple linear regression (MLR), autoregressive integrated moving average (ARIMA) and generalized autoregressive conditional heteroscedasticity (GARCH) models using root mean square error (RMSE) and mean absolute error (MAE). Based on the experimental results, it appears that the WMLR model performs better than the other forecasting techniques tested in this study.
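
    A rough sketch of the WMLR idea under stated assumptions: decompose the series with a discrete wavelet transform (PyWavelets), reconstruct the sub-series, and regress the next-day price on the current sub-series values with ordinary multiple linear regression. The file name, wavelet choice and decomposition level are placeholders, not the study's settings.

        import numpy as np
        import pywt
        from sklearn.linear_model import LinearRegression

        prices = np.loadtxt("wti_daily.csv")            # hypothetical daily WTI series

        # Decompose into approximation + detail coefficients via the DWT.
        wavelet, level = "db4", 2
        coeffs = pywt.wavedec(prices, wavelet, level=level)

        def component(coeffs, keep, wavelet, n):
            """Reconstruct the sub-series carried by one coefficient array only."""
            parts = [c if i == keep else np.zeros_like(c) for i, c in enumerate(coeffs)]
            return pywt.waverec(parts, wavelet)[:n]

        subseries = [component(coeffs, i, wavelet, len(prices)) for i in range(len(coeffs))]

        # One-step-ahead WMLR: regress tomorrow's price on today's sub-series values.
        X = np.column_stack([s[:-1] for s in subseries])
        y = prices[1:]
        model = LinearRegression().fit(X, y)
        pred = model.predict(X)
        rmse = np.sqrt(np.mean((y - pred) ** 2))
        mae = np.mean(np.abs(y - pred))
        print(f"in-sample RMSE={rmse:.3f}  MAE={mae:.3f}")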

  20. SureCure(R) - a new material to reduce curing time and improve curing reproducibility of lead-acid batteries

    Energy Technology Data Exchange (ETDEWEB)

    Boden, David P.; Loosemore, Daniel V.; Botts, G. Dean [Hammond Lead Products Division, Hammond Group Inc., 2323 165th Street, Hammond, IN 46320 (United States)

    2006-08-25

    This paper introduces a technology that considerably reduces the time to cure the positive plates of lead-acid batteries. In each of several full-scale trials at automotive and industrial battery manufacturers, the simple replacement of 1wt.% of leady oxide with finely-divided tetrabasic lead sulfate (SureCure(TM) by Hammond Group Inc.) is shown to accelerate significantly the conversion of tribasic lead sulfate (3BS) to tetrabasic lead sulfate (4BS) in the curing process while improving crystal structure and reproducibility. Shorter curing times result in reduced labour and energy costs, as well as reduced fixed (curing chambers and plant footprint) and working (plate inventory) capital investment. (author)

  1. Spontaneous regression of metastases from malignant melanoma: a case report

    DEFF Research Database (Denmark)

    Kalialis, Louise V; Drzewiecki, Krzysztof T; Mohammadi, Mahin

    2008-01-01

    A case of a 61-year-old male with widespread metastatic melanoma is presented 5 years after complete spontaneous cure. Spontaneous regression occurred in cutaneous, pulmonary, hepatic and cerebral metastases. A review of the literature reveals seven cases of regression of cerebral metastases...; this report is the first to document complete spontaneous regression of cerebral metastases from malignant melanoma by means of computed tomography scans. Spontaneous regression is defined as the partial or complete disappearance of a malignant tumour in the absence of all treatment or in the presence...

  2. Predicting and Modelling of Survival Data when Cox's Regression Model does not hold

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2002-01-01

    Aalen model; additive risk model; counting processes; competing risk; Cox regression; flexible modeling; goodness of fit; prediction of survival; survival analysis; time-varying effects

  3. Spatial stochastic regression modelling of urban land use

    International Nuclear Information System (INIS)

    Arshad, S H M; Jaafar, J; Abiden, M Z Z; Latif, Z A; Rasam, A R A

    2014-01-01

    Urbanization is very closely linked to industrialization, commercialization and overall economic growth and development. It brings innumerable benefits to the quantity and quality of the urban environment and lifestyle but, on the other hand, contributes to unbounded development, urban sprawl, overcrowding and a decreasing standard of living. Regulation and observation of urban development activities is therefore crucial. An understanding of the urban systems that promote urban growth is also essential for policy making, formulating development strategies and preparing development plans. This study compares two different stochastic regression modeling techniques for spatial structure models of urban growth in the same study area. Both techniques utilize the same datasets and their results are analyzed. The work starts by producing an urban growth model using two stochastic regression modeling techniques, namely Ordinary Least Squares (OLS) and Geographically Weighted Regression (GWR). Comparing the two techniques, GWR appears to be the more significant stochastic regression model: it gives a smaller AICc (Akaike's Information Corrected Criterion) value and its output is more spatially explainable.

  4. Physics constrained nonlinear regression models for time series

    International Nuclear Information System (INIS)

    Majda, Andrew J; Harlim, John

    2013-01-01

    A central issue in contemporary science is the development of data driven statistical nonlinear dynamical models for time series of partial observations of nature or a complex physical model. It has been established recently that ad hoc quadratic multi-level regression (MLR) models can have finite-time blow up of statistical solutions and/or pathological behaviour of their invariant measure. Here a new class of physics constrained multi-level quadratic regression models are introduced, analysed and applied to build reduced stochastic models from data of nonlinear systems. These models have the advantages of incorporating memory effects in time as well as the nonlinear noise from energy conserving nonlinear interactions. The mathematical guidelines for the performance and behaviour of these physics constrained MLR models as well as filtering algorithms for their implementation are developed here. Data driven applications of these new multi-level nonlinear regression models are developed for test models involving a nonlinear oscillator with memory effects and the difficult test case of the truncated Burgers–Hopf model. These new physics constrained quadratic MLR models are proposed here as process models for Bayesian estimation through Markov chain Monte Carlo algorithms of low frequency behaviour in complex physical data. (paper)

  5. Logistic regression modelling: procedures and pitfalls in developing and interpreting prediction models

    Directory of Open Access Journals (Sweden)

    Nataša Šarlija

    2017-01-01

    Full Text Available This study sheds light on the most common issues related to applying logistic regression in prediction models for company growth. The purpose of the paper is (1) to provide a detailed demonstration of the steps in developing a growth prediction model based on logistic regression analysis, (2) to discuss common pitfalls and methodological errors in developing a model, and (3) to provide solutions and possible ways of overcoming these issues. Special attention is devoted to the question of satisfying logistic regression assumptions, selecting and defining dependent and independent variables, using classification tables and ROC curves for reporting model strength, interpreting odds ratios as effect measures and evaluating performance of the prediction model. Development of a logistic regression model in this paper focuses on a prediction model of company growth. The analysis is based on predominantly financial data from a sample of 1471 small and medium-sized Croatian companies active between 2009 and 2014. The financial data is presented in the form of financial ratios divided into nine main groups depicting the following areas of business: liquidity, leverage, activity, profitability, research and development, investing and export. The growth prediction model indicates aspects of a business critical for achieving high growth. In that respect, the contribution of this paper is twofold. First, methodological, in terms of pointing out pitfalls and potential solutions in logistic regression modelling, and secondly, theoretical, in terms of identifying factors responsible for high growth of small and medium-sized companies.
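
    To make the reporting tools mentioned above concrete, the sketch below fits a logistic regression on simulated firm-level data and produces odds ratios, a classification table and the ROC curve/AUC with scikit-learn; the simulated features are stand-ins for the financial ratios used in the study.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import confusion_matrix, roc_auc_score, roc_curve
        from sklearn.model_selection import train_test_split

        # Stand-in for firm-level financial ratios and a binary high-growth label.
        X, y = make_classification(n_samples=1471, n_features=9, n_informative=5, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

        # Odds ratios as effect measures (exponentiated coefficients).
        odds_ratios = np.exp(model.coef_).ravel()
        print("odds ratios:", np.round(odds_ratios, 2))

        # Classification table and ROC curve / AUC for reporting model strength.
        prob = model.predict_proba(X_te)[:, 1]
        print("classification table:\n", confusion_matrix(y_te, prob > 0.5))
        print("AUC:", round(roc_auc_score(y_te, prob), 3))
        fpr, tpr, _ = roc_curve(y_te, prob)   # points of the ROC curve, ready to plot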

  6. Influence of curing protocol on selected properties of light-curing polymers

    DEFF Research Database (Denmark)

    Dewaele, Magali; Asmussen, Erik; Peutzfeldt, Anne

    2009-01-01

    The purpose of this study was to investigate the effect of light-curing protocol on degree of conversion (DC), volume contraction (C), elastic modulus (E), and glass transition temperature (T(g)) as measured on a model polymer. It was a further aim to correlate the measured values with each other....

  7. Grafting and curing

    International Nuclear Information System (INIS)

    Garnett, J.L.; Loo-Teck Ng; Visay Viengkhou

    1998-01-01

    Progress in radiation grafting and curing is briefly reviewed. The two processes are shown to be mechanistically related. The parameters influencing yields are examined particularly for grafting. For ionising radiation grafting systems (EB and gamma ray) these include solvents, substrate and monomer structure, dose and dose-rate, temperature and more recently role of additives. In addition, for UV grafting, the significance of photoinitiators is discussed. Current applications of radiation grafting and curing are outlined. The recent development of photoinitiator free grafting and curing is examined as well as the potential for the new excimer laser sources. The future application of both grafting and curing is considered, especially the significance of the occurrence of concurrent grafting during cure and its relevance in environmental considerations

  8. Regression Models for Repairable Systems

    Czech Academy of Sciences Publication Activity Database

    Novák, Petr

    2015-01-01

    Roč. 17, č. 4 (2015), s. 963-972 ISSN 1387-5841 Institutional support: RVO:67985556 Keywords : Reliability analysis * Repair models * Regression Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.782, year: 2015 http://library.utia.cas.cz/separaty/2015/SI/novak-0450902.pdf

  9. Simulation of Injection Molding Process Including Mold Filling and Compound Curing

    Directory of Open Access Journals (Sweden)

    Mohamad Reza Erfanian

    2012-12-01

    Full Text Available The present work reports and discusses the results of a 3D simulation of the injection molding process of a rubber compound that includes the mold filling stage and material curing, using computer code developed in the "UDF" part of the Fluent 6.3 CAE software. The data obtained from a rheometer (MDR 2000) are used to characterize the rubber material in order to find the parameters of the cure model. Because of the non-Newtonian behavior of rubber, a non-Newtonian viscosity model was used in this work and the viscosity parameters were determined by viscometry with an RPA. After calculation of the physical and curing properties, the vulcanization process was simulated for a complex rubber article with non-uniform thickness by solving the continuity, momentum, energy and curing process equations. The predicted filling and curing times for a complex 3D rubber part are compared with experimentally measured data, which confirmed the accuracy and applicability of the method.

  10. Non-destructive analysis of sensory traits of dry-cured loins by MRI-computer vision techniques and data mining.

    Science.gov (United States)

    Caballero, Daniel; Antequera, Teresa; Caro, Andrés; Ávila, María Del Mar; G Rodríguez, Pablo; Perez-Palacios, Trinidad

    2017-07-01

    Magnetic resonance imaging (MRI) combined with computer vision techniques has been proposed as an alternative or complementary technique to determine the quality parameters of food in a non-destructive way. The aim of this work was to analyze the sensory attributes of dry-cured loins using this technique. For that, different MRI acquisition sequences (spin echo, gradient echo and turbo 3D), algorithms for MRI analysis (GLCM, NGLDM, GLRLM and GLCM-NGLDM-GLRLM) and predictive data mining techniques (multiple linear regression and isotonic regression) were tested. The correlation coefficient (R) and mean absolute error (MAE) were used to validate the prediction results. The combination of spin echo, GLCM and isotonic regression produced the most accurate results. In addition, the MRI data from dry-cured loins seem to be more suitable than the data from fresh loins. The application of predictive data mining techniques to computational texture features from the MRI data of loins enables the determination of the sensory traits of dry-cured loins in a non-destructive way. © 2016 Society of Chemical Industry.
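
    A simplified sketch of the texture-feature-plus-data-mining pipeline, assuming scikit-image for GLCM features and scikit-learn for isotonic regression; the images and sensory scores are randomly generated placeholders, and isotonic regression is applied to a single feature since it is a univariate method. This is an illustration of the approach, not the authors' implementation.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops
        from sklearn.isotonic import IsotonicRegression

        def glcm_features(img):
            """Grey-level co-occurrence texture features from one MR image slice."""
            glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            return [float(graycoprops(glcm, p).mean())
                    for p in ("contrast", "homogeneity", "energy", "correlation")]

        # Stand-ins for MRI slices of dry-cured loins and a panel sensory score per sample.
        rng = np.random.default_rng(0)
        images = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(30)]
        sensory_score = rng.uniform(1, 9, size=30)

        features = np.array([glcm_features(im) for im in images])

        # Univariate isotonic regression of the score on one texture feature (contrast).
        iso = IsotonicRegression(out_of_bounds="clip").fit(features[:, 0], sensory_score)
        pred = iso.predict(features[:, 0])
        print("MAE:", round(float(np.mean(np.abs(pred - sensory_score))), 3))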

  11. Geographically weighted regression model on poverty indicator

    Science.gov (United States)

    Slamet, I.; Nugroho, N. F. T. A.; Muslich

    2017-12-01

    In this research, we applied geographically weighted regression (GWR) to analyze poverty in Central Java, using a Gaussian kernel as the weighting function. GWR uses the diagonal matrix obtained by evaluating the Gaussian kernel function as the weight matrix in the regression model. The kernel weights are used to handle spatial effects in the data so that a model can be obtained for each location. The purpose of this paper is to model poverty percentage data in Central Java province using GWR with a Gaussian kernel weighting function and to determine the influencing factors in each regency/city in Central Java province. Based on the research, we obtained a geographically weighted regression model with a Gaussian kernel weighting function for the poverty percentage data in Central Java province. We found that the percentage of the population working as farmers, the population growth rate, the percentage of households with regular sanitation, and BPJS beneficiaries are the variables that affect the percentage of poverty in Central Java province. The determination coefficient R2 is 68.64%. There are two categories of districts, which are influenced by different significant factors.
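
    A compact sketch of GWR with a Gaussian kernel: each location gets its own weighted least squares fit, with weights decaying with distance from that location. The coordinates, covariates, number of units and bandwidth below are invented stand-ins, not the Central Java data.

        import numpy as np

        def gwr_fit(coords, X, y, bandwidth):
            """Gaussian-kernel geographically weighted regression: one local fit per location."""
            n = len(y)
            Xd = np.column_stack([np.ones(n), X])          # add intercept
            betas = np.empty((n, Xd.shape[1]))
            for i in range(n):
                d = np.linalg.norm(coords - coords[i], axis=1)
                w = np.exp(-0.5 * (d / bandwidth) ** 2)    # Gaussian kernel weights
                W = np.diag(w)
                betas[i] = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
            return betas

        # Stand-ins for the regencies/cities: coordinates, covariates, poverty percentage.
        rng = np.random.default_rng(0)
        n_units = 35
        coords = rng.uniform(0, 100, size=(n_units, 2))
        X = rng.normal(size=(n_units, 4))   # e.g. % farmers, growth rate, sanitation, BPJS coverage
        y = 10 + X @ np.array([1.5, 0.8, -1.2, -0.5]) + rng.normal(scale=1.0, size=n_units)

        local_betas = gwr_fit(coords, X, y, bandwidth=30.0)
        print("local coefficients of the first unit:", np.round(local_betas[0], 2))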

  12. Monitoring cure of composite resins using frequency dependent electromagnetic sensing techniques

    Science.gov (United States)

    Kranbuehl, D. E.; Hoff, M. S.; Loos, A. C.; Freeman, W. T., Jr.; Eichinger, D. A.

    1988-01-01

    A nondestructive in situ measurement technique has been developed for monitoring and measuring the cure processing properties of composite resins. Frequency dependent electromagnetic sensors (FDEMS) were used to directly measure resin viscosity during cure. The effects of the cure cycle and resin aging on the viscosity during cure were investigated using the sensor. Viscosity measurements obtained using the sensor are compared with the viscosities calculated by the Loos-Springer cure process model. Good overall agreement was obtained except for the aged resin samples.

  13. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    Full Text Available In a regression analysis, a sample-selection bias arises when a dependent variable is partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt) process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR) model as a special case. Then, this special MaxEnt process regression model, i.e., the GPR model, is generalized to obtain a robust sample-selection Gaussian process regression (RSGPR) model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used in the paper to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model in terms of sample-selection bias correction, robustness to non-normality, and prediction is demonstrated through simulation results that attest to its good finite-sample performance.

  14. On concurvity in nonlinear and nonparametric regression models

    Directory of Open Access Journals (Sweden)

    Sonia Amodio

    2014-12-01

    Full Text Available When data are affected by multicollinearity in the linear regression framework, then concurvity will be present in fitting a generalized additive model (GAM). The term concurvity describes nonlinear dependencies among the predictor variables. As collinearity results in inflated variance of the estimated regression coefficients in the linear regression model, the presence of concurvity leads to instability of the estimated coefficients in GAMs. Even if the backfitting algorithm will always converge to a solution, in the case of concurvity the final solution of the backfitting procedure in fitting a GAM is influenced by the starting functions. While exact concurvity is highly unlikely, approximate concurvity, the analogue of multicollinearity, is of practical concern as it can lead to upwardly biased estimates of the parameters and to underestimation of their standard errors, increasing the risk of committing type I error. We compare the existing approaches to detect concurvity, pointing out their advantages and drawbacks, using simulated and real data sets. As a result, this paper provides a general criterion to detect concurvity in nonlinear and nonparametric regression models.

  15. Effect of rheological parameters on curing rate during NBR injection molding

    Science.gov (United States)

    Kyas, Kamil; Stanek, Michal; Manas, David; Skrobak, Adam

    2013-04-01

    In this work, the non-isothermal injection molding process for an NBR rubber mixture was modeled using the finite element method, considering the Isayev-Deng cure kinetics model and a generalized Newtonian model with Carreau-WLF viscosity, in order to understand the effect of volume flow rate, index of non-Newtonian behavior and relaxation time on the temperature profile and curing rate. It was found that, for a specific geometry and processing conditions, an increase in relaxation time or in the index of non-Newtonian behavior increases the curing rate due to viscous dissipation taking place at the flow domain walls.
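
    For orientation, the Isayev-Deng (Deng-Isayev) isothermal cure law is commonly written as alpha(t) = k t^n / (1 + k t^n) with an Arrhenius rate constant k; the sketch below evaluates that form at a few mold temperatures. The parameter values are purely illustrative, not fitted to the NBR compound of this study.

        import numpy as np

        # Deng-Isayev isothermal cure law: alpha(t) = k t^n / (1 + k t^n), k = k0 exp(-E/(R T)).
        # Parameter values below are illustrative assumptions only.
        k0, E, n = 2.0e7, 8.0e4, 1.8          # 1/s^n, J/mol, dimensionless
        R = 8.314

        def cure_state(t, T):
            k = k0 * np.exp(-E / (R * T))
            return k * t**n / (1.0 + k * t**n)

        t = np.linspace(1.0, 600.0, 600)      # seconds
        for T in (433.0, 443.0, 453.0):       # assumed mold temperatures in K
            alpha = cure_state(t, T)
            rate = np.gradient(alpha, t)      # instantaneous curing rate
            print(f"T={T:.0f} K: alpha at 300 s = {cure_state(300.0, T):.2f}, "
                  f"peak cure rate = {rate.max():.4f} 1/s")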

  16. Semiparametric Mixtures of Regressions with Single-index for Model Based Clustering

    OpenAIRE

    Xiang, Sijia; Yao, Weixin

    2017-01-01

    In this article, we propose two classes of semiparametric mixture regression models with single-index for model based clustering. Unlike many semiparametric/nonparametric mixture regression models that can only be applied to low dimensional predictors, the new semiparametric models can easily incorporate high dimensional predictors into the nonparametric components. The proposed models are very general, and many of the recently proposed semiparametric/nonparametric mixture regression models a...

  17. Short-term electricity prices forecasting based on support vector regression and Auto-regressive integrated moving average modeling

    International Nuclear Information System (INIS)

    Che Jinxing; Wang Jianzhou

    2010-01-01

    In this paper, we present the use of different mathematical models to forecast electricity prices under deregulated power markets. A successful electricity price prediction tool can help both power producers and consumers plan their bidding strategies. Since the support vector regression (SVR) model with the ε-insensitive loss function tolerates residuals within the boundary values of the ε-tube, we propose a hybrid model that combines both SVR and autoregressive integrated moving average (ARIMA) models to take advantage of the unique strengths of SVR and ARIMA models in nonlinear and linear modeling, which is called SVRARIMA. A nonlinear analysis of the time series indicates the convenience of nonlinear modeling, and the SVR is applied to capture the nonlinear patterns. ARIMA models have been successfully applied to the residual regression estimation problem. The experimental results demonstrate that the proposed model outperforms the existing neural-network approaches, the traditional ARIMA models and other hybrid models based on the root mean square error and mean absolute percentage error.
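
    One common way to combine the two components (not necessarily the exact SVRARIMA scheme of the paper) is to let SVR capture the nonlinear signal from lagged prices and then model its residuals with ARIMA; a sketch under those assumptions, with a placeholder data file:

        import numpy as np
        from sklearn.svm import SVR
        from statsmodels.tsa.arima.model import ARIMA

        prices = np.loadtxt("electricity_prices.csv")    # hypothetical hourly price series

        # Lagged-price features for the nonlinear SVR component.
        lags = 24
        X = np.column_stack([prices[i:len(prices) - lags + i] for i in range(lags)])
        y = prices[lags:]

        svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
        svr_fit = svr.predict(X)

        # Model the remaining (roughly linear) structure in the residuals with ARIMA.
        resid = y - svr_fit
        arima = ARIMA(resid, order=(1, 0, 1)).fit()

        # One-step-ahead hybrid forecast: SVR prediction on the latest lags + residual forecast.
        next_svr = svr.predict(prices[-lags:].reshape(1, -1))[0]
        next_resid = arima.forecast(steps=1)[0]
        print("hybrid forecast:", next_svr + next_resid)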

  18. Modeling oil production based on symbolic regression

    International Nuclear Information System (INIS)

    Yang, Guangfei; Li, Xianneng; Wang, Jianliang; Lian, Lian; Ma, Tieju

    2015-01-01

    Numerous models have been proposed to forecast the future trends of oil production and almost all of them are based on some predefined assumptions with various uncertainties. In this study, we propose a novel data-driven approach that uses symbolic regression to model oil production. We validate our approach on both synthetic and real data, and the results prove that symbolic regression could effectively identify the true models beneath the oil production data and also make reliable predictions. Symbolic regression indicates that world oil production will peak in 2021, which broadly agrees with other techniques used by researchers. Our results also show that the rate of decline after the peak is almost half the rate of increase before the peak, and it takes nearly 12 years to drop 4% from the peak. These predictions are more optimistic than those in several other reports, and the smoother decline will provide the world, especially the developing countries, with more time to orchestrate mitigation plans. -- Highlights: •A data-driven approach has been shown to be effective at modeling the oil production. •The Hubbert model could be discovered automatically from data. •The peak of world oil production is predicted to appear in 2021. •The decline rate after peak is half of the increase rate before peak. •Oil production projected to decline 4% post-peak

  19. Model performance analysis and model validation in logistic regression

    Directory of Open Access Journals (Sweden)

    Rosa Arboretti Giancristofaro

    2007-10-01

    Full Text Available In this paper a new model validation procedure for a logistic regression model is presented. At first, we illustrate a brief review of different techniques of model validation. Next, we define a number of properties required for a model to be considered "good", and a number of quantitative performance measures. Lastly, we describe a methodology for the assessment of the performance of a given model by using an example taken from a management study.

  20. Model building strategy for logistic regression: purposeful selection.

    Science.gov (United States)

    Zhang, Zhongheng

    2016-03-01

    Logistic regression is one of the most commonly used models to account for confounders in the medical literature. The article introduces how to perform the purposeful selection model building strategy with R. I stress the use of the likelihood ratio test to see whether deleting a variable will have a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjustment of the remaining covariates. Interaction should be checked to disentangle complex relationships between covariates and their synergistic effect on the response variable. The model should be checked for goodness-of-fit (GOF); in other words, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for the logistic regression model.
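
    The article works in R; a minimal Python analogue of the likelihood ratio test step, using statsmodels on simulated data, is sketched below. The variable names and the candidate covariate being deleted are hypothetical.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from scipy import stats

        # Hypothetical data frame with a binary outcome and candidate covariates.
        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({"age": rng.normal(60, 10, n),
                           "lactate": rng.normal(2, 1, n),
                           "male": rng.integers(0, 2, n)})
        logit_p = -8 + 0.1 * df.age + 0.5 * df.lactate
        df["death"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        full = smf.logit("death ~ age + lactate + male", data=df).fit(disp=0)
        reduced = smf.logit("death ~ age + lactate", data=df).fit(disp=0)

        # Likelihood ratio test: does deleting 'male' significantly worsen model fit?
        lr_stat = 2 * (full.llf - reduced.llf)
        p_value = stats.chi2.sf(lr_stat, df=1)
        print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.3f}")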

  1. The APT model as reduced-rank regression

    NARCIS (Netherlands)

    Bekker, P.A.; Dobbelstein, P.; Wansbeek, T.J.

    Integrating the two steps of an arbitrage pricing theory (APT) model leads to a reduced-rank regression (RRR) model. So the results on RRR can be used to estimate APT models, making estimation very simple. We give a succinct derivation of estimation of RRR, derive the asymptotic variance of RRR

  2. Modelling subject-specific childhood growth using linear mixed-effect models with cubic regression splines.

    Science.gov (United States)

    Grajeda, Laura M; Ivanescu, Andrada; Saito, Mayuko; Crainiceanu, Ciprian; Jaganath, Devan; Gilman, Robert H; Crabtree, Jean E; Kelleher, Dermott; Cabrera, Lilia; Cama, Vitaliano; Checkley, William

    2016-01-01

    Childhood growth is a cornerstone of pediatric research. Statistical models need to consider individual trajectories to adequately describe growth outcomes. Specifically, well-defined longitudinal models are essential to characterize both population and subject-specific growth. Linear mixed-effect models with cubic regression splines can account for the nonlinearity of growth curves and provide reasonable estimators of population and subject-specific growth, velocity and acceleration. We provide a stepwise approach that builds from simple to complex models, and accounts for the intrinsic complexity of the data. We start with standard cubic splines regression models and build up to a model that includes subject-specific random intercepts and slopes and residual autocorrelation. We then compared cubic regression splines vis-à-vis linear piecewise splines, and with varying number of knots and positions. Statistical code is provided to ensure reproducibility and improve dissemination of methods. Models are applied to longitudinal height measurements in a cohort of 215 Peruvian children followed from birth until their fourth year of life. Unexplained variability, as measured by the variance of the regression model, was reduced from 7.34 when using ordinary least squares to 0.81 when using linear mixed-effect models with random slopes and a first-order continuous autoregressive error term. There was substantial heterogeneity in both the intercept and the slope across children, and the residual autocorrelation was adequately described by a first-order continuous autoregressive error term, as evidenced by the variogram of the residuals and by a lack of association among residuals. The final model provides a parametric linear regression equation for both estimation and prediction of population- and individual-level growth in height. We show that cubic regression splines are superior to linear regression splines for the case of a small number of knots in both estimation and prediction with the full linear mixed effect model (AIC 19,352 vs. 19...

  3. Influence diagnostics in meta-regression model.

    Science.gov (United States)

    Shi, Lei; Zuo, ShanShan; Yu, Dalei; Zhou, Xiaohua

    2017-09-01

    This paper studies influence diagnostics in the meta-regression model, including case deletion diagnostics and local influence analysis. We derive the subset deletion formulae for the estimation of the regression coefficient and heterogeneity variance and obtain the corresponding influence measures. The DerSimonian and Laird estimation and maximum likelihood estimation methods in meta-regression are considered, respectively, to derive the results. Internal and external residual and leverage measures are defined. Local influence analysis based on the case-weights perturbation scheme, response perturbation scheme, covariate perturbation scheme, and within-variance perturbation scheme is explored. We introduce a method that simultaneously perturbs responses, covariates, and within-variance to obtain the local influence measure, which has the advantage of being able to compare the influence magnitude of influential studies across different perturbations. An example is used to illustrate the proposed methodology. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Logistic Regression Modeling of Diminishing Manufacturing Sources for Integrated Circuits

    National Research Council Canada - National Science Library

    Gravier, Michael

    1999-01-01

    .... The research identified logistic regression as a powerful tool for analysis of DMSMS and further developed twenty models attempting to identify the "best" way to model and predict DMSMS using logistic regression...

  5. Radiation curing - a personal perspective

    International Nuclear Information System (INIS)

    Pappas, S.P.

    1992-01-01

    This chapter briefly introduces radiation curing from the personal perspective of the author. Topics covered in this chapter include characteristic features of radiation curing, photoinitiated polymerization -- ultraviolet (UV) curing, and general principles of electron beam (EB) curing. 57 refs., 2 tabs

  6. The Effects of Time Lag and Cure Rate on the Global Dynamics of HIV-1 Model

    Directory of Open Access Journals (Sweden)

    Nigar Ali

    2017-01-01

    Full Text Available In this research article, a new mathematical model of delay differential equations is developed which describes the interaction among CD4 T cells, human immunodeficiency virus (HIV), and recombinant virus, with a cure rate. The model has two distributed intracellular delays. These delays denote the time needed for the infection of a cell. The dynamics of the model are completely described by the basic reproduction numbers, represented by R0, R1, and R2. It is shown that if R0<1, then the infection-free equilibrium is locally as well as globally stable. Similarly, it is proved that the recombinant-absent equilibrium is locally as well as globally asymptotically stable under suitable conditions on these reproduction numbers. The time lag and the cure rate have a positive role in the reduction of infected cells and the increase of uninfected cells, due to which the infection is reduced.

  7. Analysis of dental caries using generalized linear and count regression models

    Directory of Open Access Journals (Sweden)

    Javali M. Phil

    2013-11-01

    Full Text Available Generalized linear models (GLMs) are a generalization of linear regression models, which allow regression models to be fitted to response data in all the sciences, especially the medical and dental sciences, that follow a general exponential family. They are a flexible and widely used class of models that can accommodate various response variables. Count data are frequently characterized by overdispersion and excess zeros. Zero-inflated count models provide a parsimonious yet powerful way to model this type of situation. Such models assume that the data are a mixture of two separate data-generating processes: one generates only zeros, and the other is either a Poisson or a negative binomial data-generating process. Zero-inflated count regression models such as the zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) regression models have been used to handle dental caries count data with many zeros. We present an evaluation framework for the suitability of applying the GLM, Poisson, NB, ZIP and ZINB models to a dental caries data set where the count data may exhibit evidence of many zeros and overdispersion. Estimation of the model parameters using the method of maximum likelihood is provided. Based on the Vuong test statistic and the goodness-of-fit measure for the dental caries data, the NB and ZINB regression models perform better than the other count regression models.
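
    A small sketch of this model comparison on simulated zero-inflated counts, using the zero-inflated count models available in statsmodels and comparing fits by AIC; the covariates are placeholders and the Vuong test of the paper is not reproduced here.

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import (ZeroInflatedPoisson,
                                                      ZeroInflatedNegativeBinomialP)

        # Simulated caries counts with excess zeros (stand-in for DMFT-style data).
        rng = np.random.default_rng(0)
        n_obs = 400
        x = rng.normal(size=(n_obs, 2))            # e.g. brushing frequency, sugar intake
        X = sm.add_constant(x)
        mu = np.exp(0.5 + 0.4 * x[:, 0] - 0.3 * x[:, 1])
        y = rng.poisson(mu)
        y[rng.random(n_obs) < 0.35] = 0            # structural zeros

        poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        negbin = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()
        zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n_obs, 1))).fit(
            method="bfgs", maxiter=500, disp=0)
        zinb_fit = ZeroInflatedNegativeBinomialP(y, X, exog_infl=np.ones((n_obs, 1))).fit(
            method="bfgs", maxiter=500, disp=0)

        for name, res in [("Poisson", poisson), ("NB", negbin),
                          ("ZIP", zip_fit), ("ZINB", zinb_fit)]:
            print(f"{name:7s} AIC = {res.aic:.1f}")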

  8. Multi-step cure kinetic model of ultra-thin glass fiber epoxy prepreg exhibiting both autocatalytic and diffusion-controlled regimes under isothermal and dynamic-heating conditions

    Science.gov (United States)

    Kim, Ye Chan; Min, Hyunsung; Hong, Sungyong; Wang, Mei; Sun, Hanna; Park, In-Kyung; Choi, Hyouk Ryeol; Koo, Ja Choon; Moon, Hyungpil; Kim, Kwang J.; Suhr, Jonghwan; Nam, Jae-Do

    2017-08-01

    As packaging technologies demand a reduced substrate assembly area, thin composite laminate substrates require very high performance in material properties such as the coefficient of thermal expansion (CTE) and stiffness. Accordingly, thermosetting resin systems, which consist of multiple fillers, monomers and/or catalysts in thermoset-based glass fiber prepregs, are extremely complicated and closely associated with rheological properties, which depend on the temperature cycles used for cure. For the process control of these complex systems, it is usually required to obtain a reliable kinetic model that can be used for complex thermal cycles, which usually include both isothermal and dynamic-heating segments. In this study, an ultra-thin prepreg with highly loaded silica beads and glass fibers in an epoxy/amine resin system was investigated as a model system by isothermal/dynamic heating experiments. The maximum degree of cure was obtained as a function of temperature. The curing kinetics of the model prepreg system exhibited a multi-step reaction and a limited conversion as a function of isothermal curing temperature, which are often observed in epoxy cure systems because of the rate-determining diffusion of polymer chain growth. The modified kinetic equation accurately described the isothermal behavior and the beginning of the dynamic-heating behavior by integrating the obtained maximum degree of cure into the kinetic model development.
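
    As a generic illustration of an autocatalytic cure law with a diffusion-controlled cap on conversion (in the spirit of the modified kinetics described above, but not the authors' fitted model), the sketch below integrates a Kamal-Sourour-type rate equation multiplied by a simple diffusion factor that enforces a temperature-dependent maximum degree of cure. All parameter values are invented for illustration.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Kamal-Sourour autocatalytic rate law with a simple diffusion-control factor
        # that caps conversion at alpha_max(T). Parameters are illustrative only.
        A1, E1 = 1.0e5, 7.0e4      # 1/s, J/mol
        A2, E2 = 5.0e5, 6.0e4
        m, n, R = 0.5, 1.5, 8.314

        def alpha_max(T):
            # assumed linear rise of attainable conversion with cure temperature
            return np.clip(0.6 + 0.004 * (T - 400.0), 0.0, 1.0)

        def dalpha_dt(t, a, T):
            k1 = A1 * np.exp(-E1 / (R * T))
            k2 = A2 * np.exp(-E2 / (R * T))
            diffusion = max(alpha_max(T) - a[0], 0.0) / alpha_max(T)
            return [(k1 + k2 * a[0] ** m) * (1.0 - a[0]) ** n * diffusion]

        for T in (420.0, 440.0, 460.0):        # isothermal cure temperatures (K)
            sol = solve_ivp(dalpha_dt, (0.0, 3600.0), [1e-6], args=(T,), max_step=5.0)
            print(f"T={T:.0f} K: conversion after 1 h = {sol.y[0, -1]:.2f}")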

  9. AIRLINE ACTIVITY FORECASTING BY REGRESSION MODELS

    Directory of Open Access Journals (Sweden)

    Н. Білак

    2012-04-01

    Full Text Available Linear and nonlinear regression models are proposed that take into account the trend equation and seasonality indices in order to analyze and reconstruct passenger traffic volumes over past periods and to predict them for future years, together with an algorithm for constructing these models based on statistical analysis over the years. The resulting model is a first step toward the synthesis of more complex models, which will enable forecasting of airline passenger traffic (and income levels) with higher accuracy and timeliness.

  10. [Application of detecting and taking overdispersion into account in Poisson regression model].

    Science.gov (United States)

    Bouche, G; Lepage, B; Migeot, V; Ingrand, P

    2009-08-01

    Researchers often use the Poisson regression model to analyze count data. Overdispersion can occur when a Poisson regression model is used, resulting in an underestimation of the variance of the regression model parameters. Our objective was to take overdispersion into account and assess its impact with an illustration based on the data of a study investigating the relationship between use of the Internet to seek health information and number of primary care consultations. Three methods, overdispersed Poisson, a robust estimator, and negative binomial regression, were performed to take overdispersion into account in explaining variation in the number (Y) of primary care consultations. We tested overdispersion in the Poisson regression model using the ratio of the sum of Pearson residuals over the number of degrees of freedom (chi(2)/df). We then fitted the three models and compared parameter estimation to the estimations given by the Poisson regression model. Variance of the number of primary care consultations (Var[Y]=21.03) was greater than the mean (E[Y]=5.93) and the chi(2)/df ratio was 3.26, which confirmed overdispersion. Standard errors of the parameters varied greatly between the Poisson regression model and the three other regression models. Interpretation of estimates from two variables (using the Internet to seek health information and single parent family) would have changed according to the model retained, with significance levels of 0.06 and 0.002 (Poisson), 0.29 and 0.09 (overdispersed Poisson), 0.29 and 0.13 (use of a robust estimator) and 0.45 and 0.13 (negative binomial), respectively. Different methods exist to solve the problem of underestimating variance in the Poisson regression model when overdispersion is present. The negative binomial regression model seems to be particularly accurate because of its theoretical distribution; in addition, this regression is easy to perform with ordinary statistical software packages.
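
    A Python sketch of the workflow described above: fit a Poisson GLM, check the Pearson chi2/df ratio for overdispersion, and compare the standard errors under scaled (quasi-Poisson-style), robust and negative binomial alternatives in statsmodels; the simulated data merely mimic an overdispersed consultation count and are not the study data.

        import numpy as np
        import statsmodels.api as sm

        # Simulated overdispersed counts of primary care consultations.
        rng = np.random.default_rng(0)
        n_obs = 1000
        internet = rng.integers(0, 2, n_obs)             # sought health information online
        single_parent = rng.integers(0, 2, n_obs)
        X = sm.add_constant(np.column_stack([internet, single_parent]))
        mu = np.exp(1.5 + 0.12 * internet + 0.2 * single_parent)
        y = rng.negative_binomial(2, 2 / (2 + mu))       # variance > mean

        poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        ratio = poisson.pearson_chi2 / poisson.df_resid  # >> 1 signals overdispersion
        print("Pearson chi2 / df =", round(ratio, 2))

        # Remedies: scaled (quasi-Poisson-style) SEs, robust sandwich SEs, negative binomial.
        quasi = sm.GLM(y, X, family=sm.families.Poisson()).fit(scale="X2")
        robust = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
        negbin = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()
        for name, res in [("Poisson", poisson), ("scaled", quasi),
                          ("robust", robust), ("negbin", negbin)]:
            print(f"{name:8s} SE(internet) = {res.bse[1]:.3f}")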

  11. Variable Selection for Regression Models of Percentile Flows

    Science.gov (United States)

    Fouad, G.

    2017-12-01

    Percentile flows describe the flow magnitude equaled or exceeded for a given percent of time, and are widely used in water resource management. However, these statistics are normally unavailable since most basins are ungauged. Percentile flows of ungauged basins are often predicted using regression models based on readily observable basin characteristics, such as mean elevation. The number of these independent variables is too large to evaluate all possible models. A subset of models is typically evaluated using automatic procedures, like stepwise regression. This ignores a large variety of methods from the field of feature (variable) selection and physical understanding of percentile flows. A study of 918 basins in the United States was conducted to compare an automatic regression procedure to the following variable selection methods: (1) principal component analysis, (2) correlation analysis, (3) random forests, (4) genetic programming, (5) Bayesian networks, and (6) physical understanding. The automatic regression procedure only performed better than principal component analysis. Poor performance of the regression procedure was due to a commonly used filter for multicollinearity, which rejected the strongest models because they had cross-correlated independent variables. Multicollinearity did not decrease model performance in validation because of a representative set of calibration basins. Variable selection methods based strictly on predictive power (numbers 2-5 from above) performed similarly, likely indicating a limit to the predictive power of the variables. Similar performance was also reached using variables selected based on physical understanding, a finding that substantiates recent calls to emphasize physical understanding in modeling for predictions in ungauged basins. The strongest variables highlighted the importance of geology and land cover, whereas widely used topographic variables were the weakest predictors. Variables suffered from a high

  12. Radiation curing of polymers

    International Nuclear Information System (INIS)

    Randell, D.R.

    1987-01-01

    The contents of this book are: Areas of Application of UV Curing; Areas of Application of EB Curing; Laser Curing of Acrylic Coatings; A User's View of the Application of Radiation Curable Materials; Radiation Curable Offset Inks: A Technical and Marketing Overview; and UV Curable Screen Printing Inks

  13. [Evaluation of estimation of prevalence ratio using bayesian log-binomial regression model].

    Science.gov (United States)

    Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L

    2017-03-10

    To evaluate the estimation of the prevalence ratio (PR) using a Bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea in their infants by using a Bayesian log-binomial regression model in the Openbugs software. The results showed that caregivers' recognition of infants' risk signs of diarrhea was significantly associated with a 13% increase in medical care-seeking. Meanwhile, we compared the differences in the point and interval estimation of the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea, and the convergence of three models (model 1: not adjusting for covariates; model 2: adjusting for duration of caregivers' education; model 3: adjusting for distance between village and township and child month-age, based on model 2), between the Bayesian log-binomial regression model and the conventional log-binomial regression model. All three Bayesian log-binomial regression models converged and the estimated PRs were 1.130 (95%CI: 1.005-1.265), 1.128 (95%CI: 1.001-1.264) and 1.132 (95%CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged, with PRs of 1.130 (95%CI: 1.055-1.206) and 1.126 (95%CI: 1.051-1.203), respectively, but model 3 failed to converge, so the COPY method was used to estimate the PR, which was 1.125 (95%CI: 1.051-1.200). In addition, the point and interval estimates of the PRs from the three Bayesian log-binomial regression models differed slightly from those of the conventional log-binomial regression model, but they had good consistency in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with fewer convergence problems and has more advantages in application compared with the conventional log-binomial regression model.

  14. Geographically Weighted Logistic Regression Applied to Credit Scoring Models

    Directory of Open Access Journals (Sweden)

    Pedro Henrique Melo Albuquerque

    Full Text Available Abstract This study used real data from a Brazilian financial institution on transactions involving Consumer Direct Credit (CDC), granted to clients residing in the Distrito Federal (DF), to construct credit scoring models via Logistic Regression and Geographically Weighted Logistic Regression (GWLR) techniques. The aims were: to verify whether the factors that influence credit risk differ according to the borrower’s geographic location; to compare the set of models estimated via GWLR with the global model estimated via Logistic Regression, in terms of predictive power and financial losses for the institution; and to verify the viability of using the GWLR technique to develop credit scoring models. The metrics used to compare the models developed via the two techniques were the AICc informational criterion, the accuracy of the models, the percentage of false positives, the sum of the value of false positive debt, and the expected monetary value of portfolio default compared with the monetary value of defaults observed. The models estimated for each region in the DF were distinct in their variables and coefficients (parameters), with it being concluded that credit risk was influenced differently in each region in the study. The Logistic Regression and GWLR methodologies presented very close results, in terms of predictive power and financial losses for the institution, and the study demonstrated viability in using the GWLR technique to develop credit scoring models for the target population in the study.

  15. Modeling maximum daily temperature using a varying coefficient regression model

    Science.gov (United States)

    Han Li; Xinwei Deng; Dong-Yum Kim; Eric P. Smith

    2014-01-01

    Relationships between stream water and air temperatures are often modeled using linear or nonlinear regression methods. Despite a strong relationship between water and air temperatures and a variety of models that are effective for data summarized on a weekly basis, such models did not yield consistently good predictions for summaries such as daily maximum temperature...

  16. The Cure Violence model: violence reduction in San Pedro Sula (Honduras

    Directory of Open Access Journals (Sweden)

    Charles Ransford

    2017-09-01

    Full Text Available Developed in the United States, the Cure Violence model is a programme of epidemic control that reduces violence through changes to norms and behaviour. This article primarily examines the issue of violence in Honduras and, in particular, in San Pedro Sula, which was for years the city with the highest homicide rates in the world. To tackle this situation, in 2013 an adapted version of the programme began to be implemented in certain areas of the city. After describing the adaptation of the model to the context of the Honduran city, its results are analysed in two periods of 2014 and 2015 (compared to 2013 and 2014, respectively): a significant reduction in shootings and a minor fall in the homicide figures stand out.

  17. Structured Additive Regression Models: An R Interface to BayesX

    Directory of Open Access Journals (Sweden)

    Nikolaus Umlauf

    2015-02-01

    Full Text Available Structured additive regression (STAR) models provide a flexible framework for modeling possible nonlinear effects of covariates: They contain the well-established frameworks of generalized linear models and generalized additive models as special cases but also allow a wider class of effects, e.g., for geographical or spatio-temporal data, allowing for specification of complex and realistic models. BayesX is a standalone software package for fitting a general class of STAR models. Based on a comprehensive open-source regression toolbox written in C++, BayesX uses Bayesian inference for estimating STAR models based on Markov chain Monte Carlo simulation techniques, a mixed model representation of STAR models, or stepwise regression techniques combining penalized least squares estimation with model selection. BayesX not only covers models for responses from univariate exponential families, but also models from less-standard regression situations such as models for multi-categorical responses with either ordered or unordered categories, continuous time survival data, or continuous time multi-state models. This paper presents a new fully interactive R interface to BayesX: the R package R2BayesX. With the new package, STAR models can be conveniently specified using R's formula language (with some extended terms), fitted using the BayesX binary, represented in R with objects of suitable classes, and finally printed/summarized/plotted. This makes BayesX much more accessible to users familiar with R and adds extensive graphics capabilities for visualizing fitted STAR models. Furthermore, R2BayesX complements the already impressive capabilities for semiparametric regression in R with a comprehensive toolbox comprising in particular more complex response types and alternative inferential procedures such as simulation-based Bayesian inference.

  18. Multiple regression and beyond an introduction to multiple regression and structural equation modeling

    CERN Document Server

    Keith, Timothy Z

    2014-01-01

    Multiple Regression and Beyond offers a conceptually oriented introduction to multiple regression (MR) analysis and structural equation modeling (SEM), along with analyses that flow naturally from those methods. By focusing on the concepts and purposes of MR and related methods, rather than the derivation and calculation of formulae, this book introduces material to students more clearly, and in a less threatening way. In addition to illuminating content necessary for coursework, the accessibility of this approach means students are more likely to be able to conduct research using MR or SEM--and more likely to use the methods wisely. Covers both MR and SEM, while explaining their relevance to one another Also includes path analysis, confirmatory factor analysis, and latent growth modeling Figures and tables throughout provide examples and illustrate key concepts and techniques For additional resources, please visit: http://tzkeith.com/.

  19. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Linear regression crash prediction models : issues and proposed solutions.

    Science.gov (United States)

    2010-05-01

    The paper develops a linear regression model approach that can be applied to crash data to predict vehicle crashes. The proposed approach involves novel data aggregation to satisfy linear regression assumptions; namely error structure normality ...

  1. Identification of Influential Points in a Linear Regression Model

    Directory of Open Access Journals (Sweden)

    Jan Grosz

    2011-03-01

    Full Text Available The article deals with the detection and identification of influential points in the linear regression model. Three methods of detection of outliers and leverage points are described. These procedures can also be used for one-sample (independent) datasets. This paper briefly describes theoretical aspects of several robust methods as well. Robust statistics is a powerful tool to increase the reliability and accuracy of statistical modelling and data analysis. A simulation model of the simple linear regression is presented.
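
    For readers who want to reproduce this kind of screening, base R already exposes the standard leverage and influence diagnostics; a minimal sketch with a hypothetical data frame d containing a response y and a predictor x:

      fit  <- lm(y ~ x, data = d)
      lev  <- hatvalues(fit)          # leverage of each observation
      stud <- rstudent(fit)           # externally studentized residuals (outlier screening)
      cd   <- cooks.distance(fit)     # influence of each observation on the fitted coefficients
      which(lev > 2 * mean(lev))      # a common rule of thumb for flagging leverage points
      which(cd  > 4 / nrow(d))        # a common rule of thumb for flagging influential points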

  2. Additive effects in radiation grafting and curing

    International Nuclear Information System (INIS)

    Viengkhou, V.; Ng, L.

    1996-01-01

    Full text: Detailed studies on the accelerative effect of novel additives in radiation grafting and curing using acrylated monomer/oligomer systems have been performed in the presence of ionising radiation and UV as sources. Methyl methacrylate (MMA) is used as a typical monomer for these grafting studies in the presence of the additives with model backbone polymers, cellulose and polypropylene. Additives which have been found to accelerate these grafting processes are: mineral acid, occlusion compounds like urea, thermal initiators and photoinitiators, as well as multifunctional monomers such as multifunctional acrylates. The results from irradiation with gamma rays have also been compared with irradiation from a 90 W UV lamp. The role of the above additives in accelerating the analogous process of radiation curing has been investigated. Acrylated urethanes, epoxies and polyesters are used as oligomers together with acrylated monomers in this work, with UV lamps of 300 watts/inch as the radiation source. In the UV curing process, bonding between film and substrate is usually due to physical forces. In the present work the presence of additives is shown to influence the occurrence of concurrent grafting during cure, thus affecting the nature of the bonding of the cured film. The conditions under which concurrent grafting with UV can occur will be examined. A mechanism for the accelerative effect of these additives in both grafting and curing processes has been proposed involving radiation effects and partitioning phenomena.

  3. Radiation curing in the eighties

    International Nuclear Information System (INIS)

    Vrancken, A.

    1984-01-01

    The subject is discussed under the headings: introduction; what is radiation curing; history; radiation curable resins (with properties of products); ultraviolet and electron beam curing; photoinitiation and the ultraviolet light curing process; electron beam curing (initiation; electron beam accelerators); end uses (graphic arts; wood finishing; paper upgrading; adhesives; metal finishing; electronic chemical; floor coatings). (U.K.)

  4. Cure Kinetics of Benzoxazine/Cycloaliphatic Epoxy Resin by Differential Scanning Calorimetry

    Science.gov (United States)

    Gouni, Sreeja Reddy

    Understanding the curing kinetics of a thermoset resin is of significant importance in developing and optimizing curing cycles in various industrial manufacturing processes. This can assist in improving the quality of the final product and minimizing the manufacturing-associated costs. One approach towards developing such an understanding is to formulate kinetic models that can be used to optimize curing time and temperature to reach a full cure state or to determine the time to apply pressure in an autoclave process. Various phenomenological reaction models have been used in the literature to successfully predict the kinetic behavior of a thermoset system. The current research work was designed to investigate the cure kinetics of the Bisphenol-A based Benzoxazine (BZ-a) and Cycloaliphatic epoxy resin (CER) system under isothermal and nonisothermal conditions by Differential Scanning Calorimetry (DSC). The cure characteristics of BZ-a/CER copolymer systems with 75/25 wt% and 50/50 wt% have been studied and compared to that of pure benzoxazine under nonisothermal conditions. The DSC thermograms exhibited by these BZ-a/CER copolymer systems showed a single exothermic peak, indicating that the reactions between benzoxazine-benzoxazine monomers and benzoxazine-cycloaliphatic epoxy resin were interactive and occurred simultaneously. The Kissinger method and isoconversional methods including Ozawa-Flynn-Wall and Friedman were employed to obtain the activation energy values and determine the nature of the reaction. The cure behavior and the kinetic parameters were determined by adopting a single-step autocatalytic model based on the Kamal and Sourour phenomenological reaction model. The model was found to suitably describe the cure kinetics of the copolymer system prior to the diffusion-controlled reaction. Analyzing and understanding the thermoset resin system under isothermal conditions is also important since it is the most common practice in the industry. The BZ-a/CER copolymer system with
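
    For reference, the autocatalytic (Kamal-Sourour) rate expression and the Kissinger relation mentioned above are usually written in their standard forms (given here for orientation, not reproduced from the thesis) as

      \frac{d\alpha}{dt} = \left(k_1 + k_2\,\alpha^{m}\right)(1-\alpha)^{n}, \qquad k_i = A_i \exp\!\left(-\frac{E_{a,i}}{RT}\right)

      \ln\!\left(\frac{\beta}{T_p^{2}}\right) = \ln\!\left(\frac{AR}{E_a}\right) - \frac{E_a}{R\,T_p}

    where \alpha is the degree of cure, m and n are reaction orders, \beta is the heating rate, T_p is the exothermic peak temperature, and E_a is the activation energy obtained from the slope of the Kissinger plot.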

  5. Development and Validation of a Constitutive Model for Dental Composites during the Curing Process

    Science.gov (United States)

    Wickham Kolstad, Lauren

    Debonding is a critical failure mode of dental composites used for dental restorations. Debonding of a dental composite can be assessed by comparing the shrinkage stress of the composite to the debonding strength of the adhesive that bonds it to the tooth surface. It is difficult to measure shrinkage stress experimentally. In this study, finite element analysis is used to predict the stress in the composite during cure. A new constitutive law is presented that will allow composite developers to evaluate composite shrinkage stress at early stages in the material development. Shrinkage stress and shrinkage strain experimental data were gathered for three dental resins, Z250, Z350, and P90. Experimental data were used to develop a constitutive model for the Young's modulus of the dental composite as a function of time during cure. A Maxwell model, spring and dashpot in series, was used to simulate the composite. The compliance of the shrinkage stress device was also taken into account by including a spring in series with the Maxwell model. A coefficient of thermal expansion was also determined for internal loading of the composite by dividing shrinkage strain by time. Three FEA models are presented. A spring-disk model validates that the constitutive law is self-consistent. A quarter cuspal deflection model uses separate experimental data to verify that the constitutive law is valid. Finally, an axisymmetric tooth model is used to predict interfacial stresses in the composite. These stresses are compared to the debonding strength to check if the composite debonds. The new constitutive model accurately predicted cuspal deflection data. Predictions for interfacial bond stress in the tooth model compare favorably with debonding characteristics observed in practice for dental resins.

  6. The art of regression modeling in road safety

    CERN Document Server

    Hauer, Ezra

    2015-01-01

    This unique book explains how to fashion useful regression models from commonly available data to erect models essential for evidence-based road safety management and research. Composed from techniques and best practices presented over many years of lectures and workshops, The Art of Regression Modeling in Road Safety illustrates that fruitful modeling cannot be done without substantive knowledge about the modeled phenomenon. Class-tested in courses and workshops across North America, the book is ideal for professionals, researchers, university professors, and graduate students with an interest in, or responsibilities related to, road safety. This book also: · Presents for the first time a powerful analytical tool for road safety researchers and practitioners · Includes problems and solutions in each chapter as well as data and spreadsheets for running models and PowerPoint presentation slides · Features pedagogy well-suited for graduate courses and workshops including problems, solutions, and PowerPoint p...

  7. Support Vector Regression Model Based on Empirical Mode Decomposition and Auto Regression for Electric Load Forecasting

    Directory of Open Access Journals (Sweden)

    Hong-Juan Li

    2013-04-01

    Full Text Available Electric load forecasting is an important issue for a power utility, associated with the management of daily operations such as energy transfer scheduling, unit commitment, and load dispatch. Inspired by the strong non-linear learning capability of support vector regression (SVR), this paper presents an SVR model hybridized with the empirical mode decomposition (EMD) method and auto regression (AR) for electric load forecasting. The electric load data of the New South Wales (Australia) market are employed for comparing the forecasting performances of different forecasting models. The results confirm the validity of the idea that the proposed model can simultaneously provide forecasting with good accuracy and interpretability.

  8. Robust geographically weighted regression of modeling the Air Polluter Standard Index (APSI)

    Science.gov (United States)

    Warsito, Budi; Yasin, Hasbi; Ispriyanti, Dwi; Hoyyi, Abdul

    2018-05-01

    The Geographically Weighted Regression (GWR) model has been widely applied in many practical fields for exploring spatial heterogeneity of a regression model. However, this method is inherently not robust to outliers. Outliers commonly exist in data sets and may lead to a distorted estimate of the underlying regression model. One solution for handling outliers in the regression model is to use robust estimation; the resulting model is called Robust Geographically Weighted Regression (RGWR). This research aims to aid the government in the policy-making process related to air pollution mitigation by developing a standard index model for air polluters (Air Polluter Standard Index - APSI) based on the RGWR approach. In this research, we also consider seven variables that are directly related to the air pollution level, which are the traffic velocity, the population density, the business center aspect, the air humidity, the wind velocity, the air temperature, and the area size of the urban forest. The best model is determined by the smallest AIC value. There are significant differences between global regression and RGWR in this case, but basic GWR using the Gaussian kernel is the best model for modeling the APSI because it has the smallest AIC.
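
    As an illustration of the basic (non-robust) GWR machinery that RGWR builds on (my sketch, not the authors' code), each location receives its own weighted least-squares fit, with Gaussian kernel weights that decay with distance from that location; the data frame d with columns lon, lat, apsi and road_density is hypothetical, and a robust variant would replace the local lm() fit with a resistant estimator:

      gwr_at <- function(i, d, bandwidth) {
        dist <- sqrt((d$lon - d$lon[i])^2 + (d$lat - d$lat[i])^2)   # distance to focal location i
        w <- exp(-0.5 * (dist / bandwidth)^2)                       # Gaussian kernel weights
        coef(lm(apsi ~ road_density, data = d, weights = w))        # local regression coefficients
      }
      local_coefs <- t(sapply(seq_len(nrow(d)), gwr_at, d = d, bandwidth = 0.1))
      head(local_coefs)   # location-specific intercepts and slopes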

  9. Curing Characterisation of Spruce Tannin-based Foams using the Advanced Isoconversional Method

    Directory of Open Access Journals (Sweden)

    Matjaž Čop

    2014-06-01

    Full Text Available The curing kinetics of foam prepared from the tannin of spruce tree bark was investigated using differential scanning calorimetry (DSC) and the advanced isoconversional method. An analysis of the formulations with differing amounts of components (furfuryl alcohol, glycerol, tannin, and a catalyst) showed that curing was delayed with increasing proportions of glycerol or tannins. An optimum amount of the catalyst constituent was also found during the study. The curing of the foam system was accelerated with increasing temperatures. Finally, the advanced isoconversional method, based on the model-free kinetic algorithm developed by Vyazovkin, appeared to be an appropriate model for the characterisation of the curing kinetics of tannin-based foams.

  10. Comparison of the heat generation of light curing units.

    Science.gov (United States)

    Bagis, Bora; Bagis, Yildirim; Ertas, Ertan; Ustaomer, Seda

    2008-02-01

    The aim of this study was to evaluate the heat generation of three different types of light curing units. Temperature increases were recorded at a distance of 1 mm between a thermocouple and the tip of three different types of light curing units including one quartz-tungsten halogen (QTH), one plasma arc (PAC), and one light emitting diode (LED) unit. An experimental model was designed to fix the 1 mm distance between the tip of the light curing units and the thermocouple wire. Temperature changes were recorded at 10-second intervals up to 40 seconds (10, 20, 30, and 40 seconds). Temperature measurements were repeated three times for every light curing unit after a one-hour standby period. Statistical analysis of the results was performed using the analysis of variance (ANOVA) and the Bonferroni Test. The highest temperature rises (54.4 ± 1.65 °C) occurred during activation of the PAC light curing unit for every test period, which may cause damage to the pulp.

  11. Flexible competing risks regression modeling and goodness-of-fit

    DEFF Research Database (Denmark)

    Scheike, Thomas; Zhang, Mei-Jie

    2008-01-01

    In this paper we consider different approaches for estimation and assessment of covariate effects for the cumulative incidence curve in the competing risks model. The classic approach is to model all cause-specific hazards and then estimate the cumulative incidence curve based on these cause...... models that is easy to fit and contains the Fine-Gray model as a special case. One advantage of this approach is that our regression modeling allows for non-proportional hazards. This leads to a new simple goodness-of-fit procedure for the proportional subdistribution hazards assumption that is very easy...... of the flexible regression models to analyze competing risks data when non-proportionality is present in the data....

  12. Curing agent for polyepoxides and epoxy resins and composites cured therewith. [preventing carbon fiber release

    Science.gov (United States)

    Serafini, T. T.; Delvigs, P.; Vannucci, R. D. (Inventor)

    1981-01-01

    A curing agent for a polyepoxide is described which contains a divalent aryl radical such as phenylene and a tetravalent aryl radical such as a tetravalent benzene radical. An epoxide is cured by admixture with the curing agent. The cured epoxy product retains the usual properties of cured epoxides and, in addition, has a higher char residue after burning, on the order of 45% by weight. The higher char residue is of value in preventing release to the atmosphere of carbon fibers from carbon fiber-epoxy resin composites in the event of burning of the composite.

  13. Cured meat, vegetables, and bean-curd foods in relation to childhood acute leukemia risk: a population based case-control study.

    Science.gov (United States)

    Liu, Chen-Yu; Hsu, Yi-Hsiang; Wu, Ming-Tsang; Pan, Pi-Chen; Ho, Chi-Kung; Su, Li; Xu, Xin; Li, Yi; Christiani, David C

    2009-01-13

    Consumption of cured/smoked meat and fish leads to the formation of carcinogenic N-nitroso compounds in the acidic stomach. This study investigated whether consumed cured/smoked meat and fish, the major dietary resource for exposure to nitrites and nitrosamines, is associated with childhood acute leukemia. A population-based case-control study of Han Chinese between 2 and 20 years old was conducted in southern Taiwan. 145 acute leukemia cases and 370 age- and sex-matched controls were recruited between 1997 and 2005. Dietary data were obtained from a questionnaire. Multiple logistic regression models were used in data analyses. Consumption of cured/smoked meat and fish more than once a week was associated with an increased risk of acute leukemia (OR = 1.74; 95% CI: 1.15-2.64). Conversely, higher intake of vegetables (OR = 0.55; 95% CI: 0.37-0.83) and bean-curd (OR = 0.55; 95% CI: 0.34-0.89) was associated with a reduced risk. No statistically significant association was observed between leukemia risk and the consumption of pickled vegetables, fruits, and tea. Dietary exposure to cured/smoked meat and fish may be associated with leukemia risk through their contents of nitrites and nitrosamines among children and adolescents, and intake of vegetables and soy-bean curd may be protective.

  14. Critical parameters for electron beam curing of cationic epoxies and property comparison of electron beam cured cationic epoxies versus thermal cured resins and composites

    International Nuclear Information System (INIS)

    Janke, C.J.; Norris, R.E.; Yarborough, K.; Lopata, V.J.

    1997-01-01

    Electron beam curing of composites is a nonthermal, nonautoclave curing process offering the following advantages compared to conventional thermal curing: substantially reduced manufacturing costs and curing times; improvements in part quality and performance; reduced environmental and health concerns; and improvements in material handling. In 1994 a Cooperative Research and Development Agreement (CRADA), sponsored by the Department of Energy Defense Programs and 10 industrial partners, was established to advance electron beam curing of composites. The CRADA has successfully developed hundreds of new toughened and untoughened resins, offering unlimited formulation and processing flexibility. Several patent applications have been filed for this work. Composites made from these easily processable, low shrinkage material match the performance of thermal cured composites and exhibit: low void contents comparable to autoclave cured composites (less than 1%); superb low water absorption values in the same range as cyanate esters (less than 1%); glass transition temperatures rivaling those of polyimides (greater than 390 C); mechanical properties comparable to high performance, autoclave cured composites; and excellent property retention after cryogenic and thermal cycling. These materials have been used to manufacture many composite parts using various fabrication processes including hand lay-up, tow placement, filament winding, resin transfer molding and vacuum assisted resin transfer molding

  15. Curing mechanism of flexible aqueous polymeric coatings.

    Science.gov (United States)

    Irfan, Muhammad; Ahmed, Abid Riaz; Kolter, Karl; Bodmeier, Roland; Dashevskiy, Andriy

    2017-06-01

    The objective of this study was to explain curing phenomena for pellets coated with a flexible polymeric coating based on poly(vinyl acetate) (Kollicoat® SR 30D) with regard to the effect of starter cores, thickness of drug layer, adhesion of coating to drug-layered-cores as well as coating properties. In addition, appropriate approaches to eliminate the curing effect were identified. Sugar or MCC cores were layered with the model drugs carbamazepine, theophylline, propranolol HCl, tramadol HCl and metoprolol HCl using HPMC (5 or 25% w/w, based on drug) as a binder. Drug-layered pellets were coated with Kollicoat® SR 30D in a fluidized bed coater using TEC (10% w/w) as plasticizer and talc (35-100% w/w) as anti-tacking agent. Drug release, pellet properties (morphology, water uptake-weight loss and osmolality) and adhesion of the coating to the drug layer were investigated as a function of curing at 60°C or 60°C/75% RH for 24h. The film formation of the aqueous dispersion of Kollicoat® SR 30D was complete, and therefore, a strong curing effect (decrease in drug release) at elevated temperature and humidity (60°C/75% RH) could not be explained by the well-known hydroplasticization and the further gradual coalescence of the colloidal polymer particles. According to the provided mechanistic explanation, the observed curing effect was associated with (1) high flexibility of coating, (2) adhesion between coating and drug layer, (3) water retaining properties of the drug layer, and (4) osmotically active cores. Unwanted curing effects could be minimized/eliminated by the addition of talc or/and pore-forming water soluble polymers in the coating, increasing binder amount or applying an intermediate coating, by increasing the thickness of drug layer or using non-osmotic cores. A new insight into curing phenomena mainly associated with the adhesion between drug layer and coating was provided. Appropriate approaches to avoid unwanted curing effect were identified

  16. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    Science.gov (United States)

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using publicly available benchmark data sets.

  17. Modelling fourier regression for time series data- a case study: modelling inflation in foods sector in Indonesia

    Science.gov (United States)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression imposes strict assumptions, whereas the nonparametric regression model does not require assumptions about the model form. Time series data are observations of a variable recorded over time, so if time series data are to be modeled by regression, the response and predictor variables must be determined first. The response variable in a time series is the value at time t (yt), while the predictor variables are significant lags. In nonparametric regression modeling, one developing approach is the Fourier series approach. One advantage of the nonparametric regression approach using Fourier series is its ability to handle data with a trigonometric (periodic) pattern. Modeling with a Fourier series requires a parameter K; the number of terms K can be determined using the Generalized Cross Validation method. In modeling inflation for the transportation, communication, and financial services sector, the Fourier series approach yields an optimal K of 120 parameters with an R-square of 99%, whereas modeling with multiple linear regression yields an R-square of 90%.
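
    To make the construction concrete (a sketch under assumed notation, not the authors' code): the response is the series at time t, the predictor is its first lag, and the lag enters through a truncated Fourier basis whose number of terms K would in practice be chosen by Generalized Cross Validation:

      ## hypothetical series y; response is y at time t, predictor is the first lag
      yt <- y[-1]
      x  <- y[-length(y)]
      xs <- 2 * pi * (x - min(x)) / (max(x) - min(x))    # rescale the lag to [0, 2*pi]
      K  <- 3                                            # number of Fourier pairs (chosen by GCV in practice)
      B  <- do.call(cbind, lapply(1:K, function(k) cbind(cos(k * xs), sin(k * xs))))
      fit <- lm(yt ~ x + B)                              # linear term in the lag plus trigonometric terms
      summary(fit)$r.squared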

  18. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

    Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability, to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.

  19. General regression and representation model for classification.

    Directory of Open Access Journals (Sweden)

    Jianjun Qian

    Full Text Available Recently, the regularized coding-based classification methods (e.g., SRC and CRC) have shown great potential for pattern classification. However, most existing coding methods assume that the representation residuals are uncorrelated. In real-world applications, this assumption does not hold. In this paper, we take account of the correlations of the representation residuals and develop a general regression and representation model (GRR) for classification. GRR not only has the advantages of CRC, but also makes full use of the prior information (e.g., the correlations between representation residuals and representation coefficients) and the specific information (weight matrix of image pixels) to enhance the classification performance. GRR uses generalized Tikhonov regularization and K Nearest Neighbors to learn the prior information from the training data. Meanwhile, the specific information is obtained by using an iterative algorithm to update the feature (or image pixel) weights of the test sample. With the proposed model as a platform, we design two classifiers: the basic general regression and representation classifier (B-GRR) and the robust general regression and representation classifier (R-GRR). The experimental results demonstrate the performance advantages of the proposed methods over state-of-the-art algorithms.
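
    For orientation (my sketch of the ridge-regularized coding baseline that GRR generalizes, not the authors' GRR algorithm), a CRC-style classifier codes the test sample over the whole training dictionary with Tikhonov regularization and assigns the class with the smallest reconstruction residual; X (columns are training samples), labels, and ynew are hypothetical inputs:

      crc_classify <- function(X, labels, ynew, lambda = 0.01) {
        ## ridge-regularized coding of the test sample over all training samples
        a <- solve(crossprod(X) + lambda * diag(ncol(X)), crossprod(X, ynew))
        ## class-wise reconstruction residuals
        res <- sapply(unique(labels), function(cl) {
          idx <- labels == cl
          sqrt(sum((ynew - X[, idx, drop = FALSE] %*% a[idx])^2))
        })
        names(res) <- as.character(unique(labels))
        names(which.min(res))                 # predicted class: smallest residual
      }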

  20. Residual Stress Developed During the Cure of Thermosetting Polymers: Optimizing Cure Schedule to Minimize Stress.

    Energy Technology Data Exchange (ETDEWEB)

    Kropka, Jamie Michael; Stavig, Mark E.; Jaramillo, Rex

    2016-06-01

    When thermosetting polymers are used to bond or encapsulate electrical, mechanical or optical assemblies, residual stress, which often affects the performance and/or reliability of these devices, develops within the structure. The Thin-Disk-on-Cylinder structural response test is demonstrated as a powerful tool to design epoxy encapsulant cure schedules to reduce residual stress, even when all the details of the material evolution during cure are not explicitly known. The test's ability to (1) distinguish between cohesive and adhesive failure modes and (2) demonstrate methodologies to eliminate failure and reduce residual stress, makes choices of cure schedules that optimize stress in the encapsulant unambiguous. For the 828/DEA/GMB material in the Thin-Disk-on-Cylinder geometry, the stress associated with cure is significant and outweighs that associated with cool down from the final cure temperature to room temperature (for measured lid strain, |S_cure| > |S_thermal|). The difference between the final cure temperature and the temperature at which the material gels, Tf - Tgel, was demonstrated to be a primary factor in determining the residual stress associated with cure. Increasing Tf - Tgel leads to a reduction in cure stress that is described as being associated with balancing some of the 828/DEA/GMB cure shrinkage with thermal expansion. The ability to tune residual stress associated with cure by controlling Tf - Tgel would be anticipated to translate to other thermosetting encapsulation materials, but the times and temperatures appropriate for a given material may vary widely.

  1. Effect of curing mode on the hardness of dual-cured composite resin core build-up materials

    Directory of Open Access Journals (Sweden)

    César Augusto Galvão Arrais

    2010-06-01

    Full Text Available This study evaluated the Knoop Hardness (KHN) values of two dual-cured composite resin core build-up materials and one resin cement exposed to different curing conditions. Two dual-cured core build-up composite resins (LuxaCore®-Dual, DMG; and FluoroCore®2, Dentsply Caulk) and one dual-cured resin cement (Rely X ARC, 3M ESPE) were used in the present study. The composite materials were placed into a cylindrical matrix (2 mm in height and 3 mm in diameter), and the specimens thus produced were either light-activated for 40 s (Optilux 501, Demetron Kerr) or were allowed to self-cure for 10 min in the dark (n = 5). All specimens were then stored in humidity at 37°C for 24 h in the dark and were subjected to KHN analysis. The results were submitted to 2-way ANOVA and Tukey's post-hoc test at a pre-set alpha of 5%. All the light-activated groups exhibited higher KHN values than the self-cured ones (p = 0.00001), regardless of product. Among the self-cured groups, both composite resin core build-up materials showed higher KHN values than the dual-cured resin cement (p = 0.00001). LuxaCore®-Dual exhibited higher KHN values than FluoroCore®2 (p = 0.00001) when they were allowed to self-cure, while no significant differences in KHN values were observed among the light-activated products. The results suggest that dual-cured composite resin core build-up materials may be more reliable than dual-cured resin cements when curing light is not available.

  2. Polyurethane curing kinetics for polymer bonded explosives: HTPB/IPDI binder

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sangmook; Hong, In-Kwon [Dankook University, Yongin (Korea, Republic of); Choi, Chong Han; Lee, Jae Wook [Sogang University, Seoul (Korea, Republic of)

    2015-08-15

    The kinetics of polyurethane reaction and the effect of catalysts on the curing behavior were studied. The mixtures of hydroxyl terminated polybutadiene and isophorone diisocyanate with different reaction catalysts were dynamically cured in a differential scanning calorimeter. The activation energies were evaluated by the Kissinger and the Ozawa methods. The Chang plot was also used to determine reaction order and rate constant. The results showed that the activation energies were influenced remarkably by the choice of catalysts. The degree of cure and the cure time at given temperatures were calculated by direct integration of modified auto-catalytic kinetic model. It would give valuable information like pot-life estimation during manufacturing polymer-bonded explosives.

  3. Accelerated dry curing of hams.

    Science.gov (United States)

    Marriott, N G; Kelly, R F; Shaffer, C K; Graham, P P; Boling, J W

    1985-01-01

    Uncured pork legs from the right side of 18 carcasses were treated with a Ross Tenderizer and the left side were controls. All 36 samples were dry-cured for 40, 56 or 70 days and evaluated for appearance traits, cure penetration characteristics, microbial load, Kramer Shear force and taste attributes. The tenderization treatment had no effect (P > 0·05) on visual color or cure penetration rate, weight loss before curing, percentage moisture, nitrate level, nitrite level, total plate count, anaerobic counts, psychrotrophic counts, objective and subjective tenderness measurements or juiciness. However, the higher values of salt suggested a possible acceleration of the dry cure penetration process among the tenderized samples. Cure time had no effect (P > 0·05) on percentage moisture, percentage salt, nitrate content, nitrite content, shear force and juiciness. Results suggest a limited effect of the mechanical tenderization process on certain traits related to dry curing and that total process time should be at least 70 days if color stability during cooking is desired. Copyright © 1985. Published by Elsevier Ltd.

  4. A test for the parameters of multiple linear regression models ...

    African Journals Online (AJOL)

    A test for the parameters of multiple linear regression models is developed for conducting tests simultaneously on all the parameters of multiple linear regression models. The test is robust relative to the assumptions of homogeneity of variances and absence of serial correlation of the classical F-test. Under certain null and ...

  5. Predicting recycling behaviour: Comparison of a linear regression model and a fuzzy logic model.

    Science.gov (United States)

    Vesely, Stepan; Klöckner, Christian A; Dohnal, Mirko

    2016-03-01

    In this paper we demonstrate that fuzzy logic can provide a better tool for predicting recycling behaviour than the customarily used linear regression. To show this, we take a set of empirical data on recycling behaviour (N=664), which we randomly divide into two halves. The first half is used to estimate a linear regression model of recycling behaviour, and to develop a fuzzy logic model of recycling behaviour. As the first comparison, the fit of both models to the data included in estimation of the models (N=332) is evaluated. As the second comparison, predictive accuracy of both models for "new" cases (hold-out data not included in building the models, N=332) is assessed. In both cases, the fuzzy logic model significantly outperforms the regression model in terms of fit. To conclude, when accurate predictions of recycling and possibly other environmental behaviours are needed, fuzzy logic modelling seems to be a promising technique. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Anisotropic Dielectric Properties of Carbon Fiber Reinforced Polymer Composites during Microwave Curing

    Science.gov (United States)

    Zhang, Linglin; Li, Yingguang; Zhou, Jing

    2018-01-01

    Microwave curing technology is a promising alternative to conventional autoclave curing technology for high-efficiency and energy-saving processing of polymer composites. Dielectric properties of composites are key parameters related to the energy conversion efficiency during the microwave curing process. However, existing methods of dielectric measurement cannot be applied to the microwave curing process. This paper presents an offline test method to solve this problem. Firstly, a kinetics model of the polymer composites under microwave curing was established based on differential scanning calorimetry to describe the whole curing process. Then several specially designed samples at different characteristic cure degrees were prepared and used to reflect the dielectric properties of the composite during microwave curing. The method was demonstrated to be feasible in terms of both test accuracy and efficiency through extensive experimental research. Based on this method, the anisotropic complex permittivity of a carbon fiber/epoxy composite during microwave curing was accurately determined. Statistical results indicated that both the dielectric constant and dielectric loss of the composite increased at the initial curing stage, peaked at the maximum reaction rate point and finally decreased during the microwave curing process. The corresponding mechanism has also been systematically investigated in this work.

  7. Logistic regression models

    CERN Document Server

    Hilbe, Joseph M

    2009-01-01

    This book really does cover everything you ever wanted to know about logistic regression … with updates available on the author's website. Hilbe, a former national athletics champion, philosopher, and expert in astronomy, is a master at explaining statistical concepts and methods. Readers familiar with his other expository work will know what to expect-great clarity.The book provides considerable detail about all facets of logistic regression. No step of an argument is omitted so that the book will meet the needs of the reader who likes to see everything spelt out, while a person familiar with some of the topics has the option to skip "obvious" sections. The material has been thoroughly road-tested through classroom and web-based teaching. … The focus is on helping the reader to learn and understand logistic regression. The audience is not just students meeting the topic for the first time, but also experienced users. I believe the book really does meet the author's goal … .-Annette J. Dobson, Biometric...

  8. Development situation of radiation curing materials

    International Nuclear Information System (INIS)

    He Songhua; Luo Junyi; Liu Zhen

    2010-01-01

    Due to its fit with the '4E' principle, radiation curing technology, known as a green technology, has shown its superiority in many applications. It has developed rapidly in China and abroad in recent years, especially ultraviolet/electron beam (UV/EB) radiation curing technology. In order to give researchers a general understanding of radiation curing materials and their development, this paper briefly introduces the related radiation sources, chemical systems, curing mechanisms and applications, and compares the common features and differences of ultraviolet curing and electron beam curing. A brief account of the development of radiation-curable materials in China and an outlook on the development of these materials are also given. Finally, we propose that the development of radiation curing technology will promote the development of radiation curing materials and benefit humanity. (authors)

  9. Linking Simple Economic Theory Models and the Cointegrated Vector AutoRegressive Model

    DEFF Research Database (Denmark)

    Møller, Niels Framroze

    This paper attempts to clarify the connection between simple economic theory models and the approach of the Cointegrated Vector Auto-Regressive model (CVAR). By considering (stylized) examples of simple static equilibrium models, it is illustrated in detail how the theoretical model and its structure translate into the CVAR framework. In addition, it is demonstrated how other controversial hypotheses such as Rational Expectations can be formulated directly as restrictions on the CVAR parameters. A simple example of a "Neoclassical synthetic" AS-AD model is also formulated. Finally, the partial-general equilibrium distinction is related to the CVAR as well. Further fundamental extensions and advances to more sophisticated theory models, such as those related to dynamics and expectations (in the structural relations), are left for future papers.

  10. Multiple Response Regression for Gaussian Mixture Models with Known Labels.

    Science.gov (United States)

    Lee, Wonyul; Du, Ying; Sun, Wei; Hayes, D Neil; Liu, Yufeng

    2012-12-01

    Multiple response regression is a useful regression technique to model multiple response variables using the same set of predictor variables. Most existing methods for multiple response regression are designed for modeling homogeneous data. In many applications, however, one may have heterogeneous data where the samples are divided into multiple groups. Our motivating example is a cancer dataset where the samples belong to multiple cancer subtypes. In this paper, we consider modeling the data coming from a mixture of several Gaussian distributions with known group labels. A naive approach is to split the data into several groups according to the labels and model each group separately. Although it is simple, this approach ignores potential common structures across different groups. We propose new penalized methods to model all groups jointly in which the common and unique structures can be identified. The proposed methods estimate the regression coefficient matrix, as well as the conditional inverse covariance matrix of response variables. Asymptotic properties of the proposed methods are explored. Through numerical examples, we demonstrate that both estimation and prediction can be improved by modeling all groups jointly using the proposed methods. An application to a glioblastoma cancer dataset reveals some interesting common and unique gene relationships across different cancer subtypes.

  11. Extending the linear model with R generalized linear, mixed effects and nonparametric regression models

    CERN Document Server

    Faraway, Julian J

    2005-01-01

    Linear models are central to the practice of statistics and form the foundation of a vast range of statistical methodologies. Julian J. Faraway's critically acclaimed Linear Models with R examined regression and analysis of variance, demonstrated the different methods available, and showed in which situations each one applies. Following in those footsteps, Extending the Linear Model with R surveys the techniques that grow from the regression model, presenting three extensions to that framework: generalized linear models (GLMs), mixed effect models, and nonparametric regression models. The author's treatment is thoroughly modern and covers topics that include GLM diagnostics, generalized linear mixed models, trees, and even the use of neural networks in statistics. To demonstrate the interplay of theory and practice, throughout the book the author weaves the use of the R software environment to analyze the data of real examples, providing all of the R commands necessary to reproduce the analyses. All of the ...

  12. Optimal cure cycle design of a resin-fiber composite laminate

    Science.gov (United States)

    Hou, Jean W.; Sheen, Jeenson

    1987-01-01

    A unified computer-aided design method was studied for cure cycle design that incorporates an optimal design technique with an analytical model of a composite cure process. The preliminary results of using this proposed method for optimal cure cycle design are reported and discussed. The cure process of interest is the compression molding of a polyester which is described by a diffusion reaction system. The finite element method is employed to convert the initial boundary value problem into a set of first order differential equations which are solved simultaneously by the DE program. The equations for thermal design sensitivities are derived by using the direct differentiation method and are solved by the DE program. A recursive quadratic programming algorithm with an active set strategy called a linearization method is used to optimally design the cure cycle, subject to the given design performance requirements. The difficulty of casting the cure cycle design process into a proper mathematical form is recognized. Various optimal design problems are formulated to address these aspects. The optimal solutions of these formulations are compared and discussed.

  13. Curing behavior and thermal properties of trifunctional epoxy resin cured by 4, 4’-diaminodiphenyl sulfone

    Directory of Open Access Journals (Sweden)

    2009-08-01

    Full Text Available A novel trifunctional epoxy resin, 4-(3, 3-dihydro-7-hydroxy-2, 4, 4-trimethyl-2H-1-benzopyran-2-yl-1, 3-benzenediol glycidyl (shortened to TMBPBTH-EPOXY), was synthesized in our lab to improve thermal performance. Its curing behavior and performance were studied using 4, 4′-diaminodiphenyl sulfone (DDS) as hardener with a mass ratio of 100:41 of TMBPBTH-EPOXY to DDS. The curing activation energy was investigated by differential scanning calorimetry (DSC) and estimated to be 64.0 kJ/mol by Kissinger's method and 68.7 kJ/mol by the Flynn-Wall-Ozawa method, respectively. A thermogravimetric analyzer (TGA) was used to investigate the thermal decomposition of the cured compounds. It was found that when the curing temperature was lower than 180°C, the thermal decomposition temperature increased with increasing curing temperature and curing time. On the other hand, when the curing temperature was higher than 180°C, the thermal decomposition temperature instead decreased with increasing curing time, which might be due to over-crosslinking of TMBPBTH-EPOXY and the DDS hardener. The glass transition temperature (Tg) of the cured TMBPBTH-EPOXY/DDS compound determined by dynamic mechanical thermal analysis (DMTA) is 290.1°C.

  14. Degree of conversion and surface hardness of resin cement cured with different curing units.

    Science.gov (United States)

    Ozturk, Nilgun; Usumez, Aslihan; Usumez, Serdar; Ozturk, Bora

    2005-01-01

    The aim of this study was to evaluate the degree of conversion and Vickers surface hardness of resin cement under a simulated ceramic restoration with 3 different curing units: a conventional halogen unit, a high-intensity halogen unit, and a light-emitting diode system. A conventional halogen curing unit (Hilux 550) (40 s), a high-intensity halogen curing unit used in conventional and ramp mode (Optilux 501) (10 s and 20 s, respectively), and a light-emitting diode system (Elipar FreeLight) (20 s, 40 s) were used in this study. The dual-curing resin cement (Variolink II) was cured under a simulated ceramic restoration (diameter 5 mm, height 2 mm), and the degree of conversion and Vickers surface hardness were measured. For degree of conversion measurement, 10 specimens were prepared for each group. The absorbance peaks were recorded using the diffuse-reflection mode of Fourier transformation infrared spectroscopy. For Vickers surface hardness measurement, 10 specimens were prepared for each group. A load of 200 N was applied for 15 seconds, and 3 evaluations of each of the samples were performed. Degree of conversion achieved with Optilux 501 (20 s) was significantly higher than those of Hilux, Optilux 501 (10 s), Elipar FreeLight (20 s), and Elipar FreeLight (40 s). For Vickers surface hardness measurement, Optilux 501 (20 s) produced the highest surface hardness value. No significant differences were found among the Hilux, Optilux 501 (10 s), Elipar FreeLight (20 s), and Elipar FreeLight (40 s). The high-intensity halogen curing unit used in ramp mode (20 s) produced harder resin cement surfaces than did the conventional halogen curing unit, high-intensity halogen curing unit used in conventional mode (10 s) and light-emitting diode system (20 s, 40 s), when cured through a simulated ceramic restoration.

  15. Bayesian approach to errors-in-variables in regression models

    Science.gov (United States)

    Rozliman, Nur Aainaa; Ibrahim, Adriana Irawati Nur; Yunus, Rossita Mohammad

    2017-05-01

    In many applications and experiments, data sets are often contaminated with error or mismeasured covariates. When at least one of the covariates in a model is measured with error, Errors-in-Variables (EIV) model can be used. Measurement error, when not corrected, would cause misleading statistical inferences and analysis. Therefore, our goal is to examine the relationship of the outcome variable and the unobserved exposure variable given the observed mismeasured surrogate by applying the Bayesian formulation to the EIV model. We shall extend the flexible parametric method proposed by Hossain and Gustafson (2009) to another nonlinear regression model which is the Poisson regression model. We shall then illustrate the application of this approach via a simulation study using Markov chain Monte Carlo sampling methods.

  16. Excimer UV curing in printing

    International Nuclear Information System (INIS)

    Mehnert, R.

    1999-01-01

    It is the aim of this study to investigate the potential of 308 nm excimer UV curing in web and sheet fed offset printing and to discuss its present status. Using real-time FTIR-ATR and stationary or pulsed monochromatic (313 nm) irradiation, chemical and physical factors affecting the curing speed of printing inks, such as the nature and concentration of photo-initiators, reactivity of the ink binding system, ink thickness and pigmentation, irradiance in the curing plane, oxygen concentration and nitrogen inerting, multiple pulse exposure, the photochemical dark reaction and temperature dependence, were studied. The results were used to select optimum conditions for excimer UV curing with respect to ink reactivity, nitrogen inerting and UV exposure, and to build an excimer UV curing unit consisting of two 50 W/cm 308 nm excimer lamps, power supply, cooling and inerting unit. The excimer UV curing devices were tested under realistic conditions on a web offset press Zirkon Supra Forte and a sheet fed press Heidelberg GTO 52. Maximum curing speeds of 300 m/min in web offset and 8000 sheets per hour in sheet fed offset were obtained.

  17. Electron curing of surface coatings

    International Nuclear Information System (INIS)

    Nablo, S.V.

    1974-01-01

    The technical development of electron curing of surface coatings has received great impetus since 1970 from dramatic changes in the economics of the conventional thermal process. The most important of these changes are reviewed, including: the Clean Air Act, increasing cost and restrictive allocation of energy, decreased availability and increased costs of solvents, and competitive pressure for higher line productivity. The principles of free-radical initiated curing as they pertain to industrial coatings are reviewed. Although such electron initiated processes have been under active development for at least two decades, high volume production applications on an industrial scale have only recently appeared. These installations are surveyed with emphasis on the developments in machinery and coatings which have made this possible. The most significant economic advantages of electron curing are presented. In particular, the ability of electron curing to eliminate substrate damage and to eliminate the curing station (oven) as the pacing element for most industrial surface coating curing applications is discussed. Examples of several new processes of particular interest in the textile industry are reviewed, including the curing of transfer cast urethane films, flock adhesives, and graftable surface finishes.

  18. Composite cements benefit from light-curing.

    Science.gov (United States)

    Lührs, Anne-Katrin; De Munck, Jan; Geurtsen, Werner; Van Meerbeek, Bart

    2014-03-01

    To investigate the effect of curing of composite cements and a new ceramic silanization pre-treatment on the micro-tensile bond strength (μTBS). Feldspathic ceramic blocks were luted onto dentin using either Optibond XTR/Nexus 3 (XTR/NX3; Kerr), the silane-incorporated 'universal' adhesive Scotchbond Universal/RelyX Ultimate (SBU/RXU; 3M ESPE), or ED Primer II/Panavia F2.0 (ED/PAF; Kuraray Noritake). Besides 'composite cement', experimental variables were 'curing mode' ('AA': complete auto-cure at 21°C; 'AA*': complete auto-cure at 37°C; 'LA': light-curing of adhesive and auto-cure of cement; 'LL': complete light-curing) and 'ceramic surface pre-treatment' ('HF/S/HB': hydrofluoric acid ('HF': IPS Ceramic Etching Gel, Ivoclar-Vivadent), silanization ('S': Monobond Plus, Ivoclar-Vivadent) and application of an adhesive resin ('HB': Heliobond, Ivoclar-Vivadent); 'HF/SBU': 'HF' and application of the 'universal' adhesive Scotchbond Universal ('SBU'; 3M ESPE, only for SBU/RXU)). After water storage (7 days at 37°C), ceramic-dentin sticks were subjected to μTBS testing. Regarding the 'composite cement', the significantly lowest μTBSs were measured for ED/PAF. Regarding 'curing mode', the significantly highest μTBS was recorded when at least the adhesive was light-cured ('LA' and 'LL'). Complete auto-cure ('AA') revealed the significantly lowest μTBS. The higher auto-curing temperature ('AA*') increased the μTBS only for ED/PAF. Regarding 'ceramic surface pre-treatment', only for 'LA' the μTBS was significantly higher for 'HF/S/HB' than for 'HF/SBU'. Complete auto-cure led to inferior μTBS than when either the adhesive (on dentin) or both adhesive and composite cement were light-cured. The use of a silane-incorporated adhesive did not decrease luting effectiveness when also the composite cement was light-cured. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  19. Modeling Fire Occurrence at the City Scale: A Comparison between Geographically Weighted Regression and Global Linear Regression.

    Science.gov (United States)

    Song, Chao; Kwan, Mei-Po; Zhu, Jiping

    2017-04-08

    An increasing number of fires are occurring with the rapid development of cities, resulting in increased risk for human beings and the environment. This study compares geographically weighted regression-based models, including geographically weighted regression (GWR) and geographically and temporally weighted regression (GTWR), which integrates spatial and temporal effects, with global linear regression models (LM) for modeling fire risk at the city scale. The results show that the road density and the spatial distribution of enterprises have the strongest influences on fire risk, which implies that we should focus on areas where roads and enterprises are densely clustered. In addition, locations with a large number of enterprises have fewer fire ignition records, probably because of strict management and prevention measures. A changing number of significant variables across space indicates that heterogeneity mainly exists in the northern and eastern rural and suburban areas of Hefei city, where human-related facilities or road construction are only clustered in the city sub-centers. GTWR can capture small changes in the spatiotemporal heterogeneity of the variables while GWR and LM cannot. An approach that integrates space and time enables us to better understand the dynamic changes in fire risk. Thus governments can use the results to manage fire safety at the city scale.

  20. Curing efficiency of three light emitting diode units at different curing profiles

    Directory of Open Access Journals (Sweden)

    Priyanka Verma

    2016-01-01

    Conclusions: Reduction of exposure time to 6 s with high-intensity curing light seemed to be clinically acceptable and should be recommended. Curing of metal brackets with single exposure from buccal side showed lower shear bond strength values.

  1. A novel calorimetry technique for monitoring electron beam curing of polymer resins

    International Nuclear Information System (INIS)

    Chen, J.H.; Johnston, A.; Petrescue, L.; Hojjati, M.

    2006-01-01

    This paper describes the development of a calorimetry-based technique for monitoring of the curing of electron beam (EB) curable resins, including design of the calorimeter hardware and the development of an analytical model for calculating resin cure rates and radiation dose. Factors affecting the performance of the calorimeter were investigated. Experimental trials monitoring the curing of epoxy resin were conducted under single pass and multiple passes of EB irradiation. Results show that the developed calorimeter is a simple, inexpensive and reasonably accurate technique for monitoring the EB curing of cationic epoxies

  2. Direction of Effects in Multiple Linear Regression Models.

    Science.gov (United States)

    Wiedermann, Wolfgang; von Eye, Alexander

    2015-01-01

    Previous studies analyzed asymmetric properties of the Pearson correlation coefficient using higher than second order moments. These asymmetric properties can be used to determine the direction of dependence in a linear regression setting (i.e., establish which of two variables is more likely to be on the outcome side) within the framework of cross-sectional observational data. Extant approaches are restricted to the bivariate regression case. The present contribution extends the direction of dependence methodology to a multiple linear regression setting by analyzing distributional properties of residuals of competing multiple regression models. It is shown that, under certain conditions, the third central moments of estimated regression residuals can be used to decide upon direction of effects. In addition, three different approaches for statistical inference are discussed: a combined D'Agostino normality test, a skewness difference test, and a bootstrap difference test. Type I error and power of the procedures are assessed using Monte Carlo simulations, and an empirical example is provided for illustrative purposes. In the discussion, issues concerning the quality of psychological data, possible extensions of the proposed methods to the fourth central moment of regression residuals, and potential applications are addressed.
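
    A minimal illustration of the residual-moment idea (my sketch, not the authors' implementation) fits both candidate directions and compares the third central moments of their residuals; roughly speaking, under the stated conditions (non-normal predictor, approximately symmetric error) the correctly specified direction tends to leave residuals with a third moment closer to zero. The data frame d with variables x and y is hypothetical:

      third_moment <- function(r) mean((r - mean(r))^3)   # third central moment of the residuals
      m_xy <- lm(y ~ x, data = d)                         # candidate direction: x -> y
      m_yx <- lm(x ~ y, data = d)                         # candidate direction: y -> x
      c(x_to_y = third_moment(resid(m_xy)),
        y_to_x = third_moment(resid(m_yx)))               # smaller absolute value suggests the better-specified direction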

  3. A Monte Carlo simulation study comparing linear regression, beta regression, variable-dispersion beta regression and fractional logit regression at recovering average difference measures in a two sample design.

    Science.gov (United States)

    Meaney, Christopher; Moineddin, Rahim

    2014-01-24

    In biomedical research, response variables are often encountered which have bounded support on the open unit interval--(0,1). Traditionally, researchers have attempted to estimate covariate effects on these types of response data using linear regression. Alternative modelling strategies may include: beta regression, variable-dispersion beta regression, and fractional logit regression models. This study employs a Monte Carlo simulation design to compare the statistical properties of the linear regression model to that of the more novel beta regression, variable-dispersion beta regression, and fractional logit regression models. In the Monte Carlo experiment we assume a simple two sample design. We assume observations are realizations of independent draws from their respective probability models. The randomly simulated draws from the various probability models are chosen to emulate average proportion/percentage/rate differences of pre-specified magnitudes. Following simulation of the experimental data we estimate average proportion/percentage/rate differences. We compare the estimators in terms of bias, variance, type-1 error and power. Estimates of Monte Carlo error associated with these quantities are provided. If response data are beta distributed with constant dispersion parameters across the two samples, then all models are unbiased and have reasonable type-1 error rates and power profiles. If the response data in the two samples have different dispersion parameters, then the simple beta regression model is biased. When the sample size is small (N0 = N1 = 25) linear regression has superior type-1 error rates compared to the other models. Small sample type-1 error rates can be improved in beta regression models using bias correction/reduction methods. In the power experiments, variable-dispersion beta regression and fractional logit regression models have slightly elevated power compared to linear regression models. Similar results were observed if the
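
    In the spirit of the simulation described above, the following sketch simulates a two-sample design with Beta-distributed responses and estimates the group difference in means with ordinary least squares and with a hand-rolled beta regression (logit mean link, common precision). The sample sizes, parameter values and parametrization are illustrative assumptions, not the paper's design.

    ```python
    # Simulate two-sample proportion data from a Beta distribution and estimate the
    # group mean difference with (i) OLS and (ii) a hand-rolled beta regression MLE.
    import numpy as np
    from scipy import optimize, stats
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n0 = n1 = 25
    mu0, mu1, phi = 0.4, 0.5, 30.0                      # true group means and precision
    y = np.concatenate([rng.beta(mu0 * phi, (1 - mu0) * phi, n0),
                        rng.beta(mu1 * phi, (1 - mu1) * phi, n1)])
    g = np.concatenate([np.zeros(n0), np.ones(n1)])     # group indicator
    X = sm.add_constant(g)

    ols_diff = sm.OLS(y, X).fit().params[1]             # (i) linear regression estimate

    def negloglik(theta):                               # (ii) beta regression, logit mean link
        b0, b1, log_phi = theta
        mu = 1.0 / (1.0 + np.exp(-(b0 + b1 * g)))
        p = np.exp(log_phi)
        return -np.sum(stats.beta.logpdf(y, mu * p, (1 - mu) * p))

    fit = optimize.minimize(negloglik, x0=[0.0, 0.0, np.log(10.0)], method="Nelder-Mead")
    b0, b1, _ = fit.x
    beta_diff = 1 / (1 + np.exp(-(b0 + b1))) - 1 / (1 + np.exp(-b0))   # difference on mean scale

    print("true difference:", mu1 - mu0)
    print("OLS estimate:   ", ols_diff)
    print("beta regression:", beta_diff)
    ```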

  4. Tutorial on Using Regression Models with Count Outcomes Using R

    Directory of Open Access Journals (Sweden)

    A. Alexander Beaujean

    2016-02-01

    Full Text Available Education researchers often study count variables, such as times a student reached a goal, discipline referrals, and absences. Most researchers that study these variables use typical regression methods (i.e., ordinary least-squares), either with or without transforming the count variables. In either case, using typical regression for count data can produce parameter estimates that are biased, thus diminishing any inferences made from such data. As count-variable regression models are seldom taught in training programs, we present a tutorial to help educational researchers use such methods in their own research. We demonstrate analyzing and interpreting count data using Poisson, negative binomial, zero-inflated Poisson, and zero-inflated negative binomial regression models. The count regression methods are introduced through an example using the number of times students skipped class. The data for this example are freely available, and the R syntax used to run the example analyses is included in the Appendix.
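
    The tutorial itself works in R; a rough Python analogue of the same workflow with statsmodels is sketched below on synthetic "classes skipped" counts. Variable names and parameter values are made up for illustration, and the zero-inflated variants mentioned in the record are omitted here.

    ```python
    # Poisson and negative binomial GLMs for count outcomes (Python analogue of the
    # R-based tutorial), using synthetic "classes skipped" counts.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 500
    gpa = rng.normal(3.0, 0.5, n)
    X = sm.add_constant(gpa)
    lam = np.exp(2.0 - 0.8 * gpa)                        # true mean decreasing with GPA
    y = rng.poisson(lam)

    poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()

    print(poisson_fit.summary())
    # exponentiated coefficients are rate ratios, the usual interpretation for count models
    print(np.exp(poisson_fit.params))
    print(np.exp(negbin_fit.params))
    ```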

  5. Statistical approach for selection of regression model during validation of bioanalytical method

    Directory of Open Access Journals (Sweden)

    Natalija Nakov

    2014-06-01

    Full Text Available The selection of an adequate regression model is the basis for obtaining accurate and reproducible results during bioanalytical method validation. Given the wide concentration range frequently present in bioanalytical assays, heteroscedasticity of the data may be expected. Several weighted linear and quadratic regression models were evaluated during the selection of the adequate curve fit using nonparametric statistical tests: the one-sample rank test and the Wilcoxon signed rank test for two independent groups of samples. The results obtained with the one-sample rank test could not give statistical justification for the selection of linear vs. quadratic regression models because only slight differences in the error (presented through the relative residuals, RR) were obtained. The significance of the differences in the RR was estimated using the Wilcoxon signed rank test, where the linear and quadratic regression models were treated as two independent groups. The application of this simple non-parametric statistical test provides statistical confirmation of the choice of an adequate regression model.

  6. Linear regression models for quantitative assessment of left ...

    African Journals Online (AJOL)

    Changes in left ventricular structures and function have been reported in cardiomyopathies. No prediction models have been established in this environment. This study established regression models for prediction of left ventricular structures in normal subjects. A sample of normal subjects was drawn from a large urban ...

  7. Cure Schedule for Stycast 2651/Catalyst 11.

    Energy Technology Data Exchange (ETDEWEB)

    Kropka, Jamie Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McCoy, John D. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    2017-11-01

    The Henkel technical data sheet (TDS) for Stycast 2651/Catalyst 11 lists three alternate cure schedules for the material, each of which would result in a different state of reaction and different material properties. Here, a cure schedule that attains full reaction of the material is defined. The use of this cure schedule will eliminate variance in material properties due to changes in the cure state of the material, and the cure schedule will serve as the method to make material prior to characterizing properties. The following recommendation was motivated by (1) a desire to cure at a single temperature for ease of manufacture and (2) a desire to keep the cure temperature low (to minimize residual stress build-up associated with the cooldown from the cure temperature to room temperature) without excessively limiting the cure reaction due to vitrification (i.e., material glass transition temperature, Tg, exceeding cure temperature).

  8. Properties of radiation cured coatings

    International Nuclear Information System (INIS)

    Larson, E.G.; Spencer, D.S.; Boettcher, T.E.; Melbauer, M.A.; Skarjune, R.P.

    1987-01-01

    Coatings were prepared from acrylate- or methacrylate-functionalized resins to study the effect of end group functionality on the physical properties of u.v. and electron beam cured coatings. Cure response was measured by solid state NMR and gel extraction; as expected, methacrylate resins cured much more slowly. Thermal Gravimetric Analysis (TGA) revealed that acrylate coatings have greater thermal stability. Properties such as tensile strength and hardness showed little effect of end group functionality or curing method. The O2 and H2O permeabilities of the coatings were correlated with the processing conditions. (author)

  9. The situation of radiation curing

    International Nuclear Information System (INIS)

    Chen Weixiu

    1988-01-01

    Radiation curing is a branch of radiation processing. It developed significantly in the nineteen eighties, with an annual growth rate exceeding 10%. Several products are manufactured by radiation curing, such as magnetic media, release coatings, floor tile, printing plates, optical fiber, electronics, lithography and pressure-sensitive adhesives. The chemistry of radiation curing is often considered ahead. The safe handling of UV/EB curable materials, industrial regulation and patent protection for developments in radiation curing are introduced. The equipment and processes in this field have progressed recently.

  10. Reduced Rank Regression

    DEFF Research Database (Denmark)

    Johansen, Søren

    2008-01-01

    The reduced rank regression model is a multivariate regression model with a coefficient matrix with reduced rank. The reduced rank regression algorithm is an estimation procedure, which estimates the reduced rank regression model. It is related to canonical correlations and involves calculating...
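
    A minimal sketch of the classical reduced rank regression estimator is given below: compute the unrestricted OLS coefficient matrix and project it onto the leading right singular vectors of the fitted values. Identity error weighting is assumed for simplicity, so this omits the canonical-correlation machinery the record alludes to; the data are synthetic.

    ```python
    # Reduced rank regression with identity weighting: OLS coefficients projected onto
    # the leading right singular vectors of the fitted values.
    import numpy as np

    def reduced_rank_regression(X, Y, rank):
        B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)        # full-rank OLS solution
        fitted = X @ B_ols
        _, _, Vt = np.linalg.svd(fitted, full_matrices=False)
        V_r = Vt[:rank].T                                    # leading right singular vectors
        return B_ols @ V_r @ V_r.T                           # rank-constrained coefficient matrix

    rng = np.random.default_rng(4)
    X = rng.normal(size=(300, 6))
    B_true = rng.normal(size=(6, 2)) @ rng.normal(size=(2, 5))   # true rank-2 coefficients
    Y = X @ B_true + 0.1 * rng.normal(size=(300, 5))

    B_rr = reduced_rank_regression(X, Y, rank=2)
    print(np.linalg.matrix_rank(B_rr))    # 2
    ```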

  11. Poisson regression for modeling count and frequency outcomes in trauma research.

    Science.gov (United States)

    Gagnon, David R; Doron-LaMarca, Susan; Bell, Margret; O'Farrell, Timothy J; Taft, Casey T

    2008-10-01

    The authors describe how the Poisson regression method for analyzing count or frequency outcome variables can be applied in trauma studies. The outcome of interest in trauma research may represent a count of the number of incidents of behavior occurring in a given time interval, such as acts of physical aggression or substance abuse. Traditional linear regression approaches assume a normally distributed outcome variable with equal variances over the range of predictor variables, and may not be optimal for modeling count outcomes. An application of Poisson regression is presented using data from a study of intimate partner aggression among male patients in an alcohol treatment program and their female partners. Results of Poisson regression and linear regression models are compared.

  12. Electron beam curing of coatings

    International Nuclear Information System (INIS)

    Schmidt, J.; Mai, H.

    1986-01-01

    Modern low-energy electron beam processors offer the possibility of high-speed curing of coatings on paper, plastics, wood and metal. Today, electron beam curing is gaining importance due to increasing environmental problems and the rising cost of energy. An effective curing process requires low-energy electron beam processors as well as very reactive binders. Generally such binders consist of acrylic-modified unsaturated polyester resins, polyacrylates, urethane acrylates or epoxy acrylates, and vinyl monomers, mostly multifunctional acrylates. First results on the production of EBC binders based on polyester resins and vinyl monomers are presented. The aim of our investigations is to obtain binders with curing doses ≤ 50 kGy. In order to reduce the curing dose, we studied mixtures of resins and acrylates. (author)

  13. Regression Models and Fuzzy Logic Prediction of TBM Penetration Rate

    Directory of Open Access Journals (Sweden)

    Minh Vu Trieu

    2017-03-01

    Full Text Available This paper presents statistical analyses of rock engineering properties and the measured penetration rate of tunnel boring machine (TBM) based on the data of an actual project. The aim of this study is to analyze the influence of rock engineering properties including uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), rock brittleness index (BI), the distance between planes of weakness (DPW), and the alpha angle (Alpha) between the tunnel axis and the planes of weakness on the TBM rate of penetration (ROP). Four (4) statistical regression models (two linear and two nonlinear) are built to predict the ROP of TBM. Finally a fuzzy logic model is developed as an alternative method and compared to the four statistical regression models. Results show that the fuzzy logic model provides better estimations and can be applied to predict the TBM performance. The R-squared value (R2) of the fuzzy logic model scores the highest value of 0.714 over the second runner-up of 0.667 from the multiple variables nonlinear regression model.

  14. Regression Models and Fuzzy Logic Prediction of TBM Penetration Rate

    Science.gov (United States)

    Minh, Vu Trieu; Katushin, Dmitri; Antonov, Maksim; Veinthal, Renno

    2017-03-01

    This paper presents statistical analyses of rock engineering properties and the measured penetration rate of tunnel boring machine (TBM) based on the data of an actual project. The aim of this study is to analyze the influence of rock engineering properties including uniaxial compressive strength (UCS), Brazilian tensile strength (BTS), rock brittleness index (BI), the distance between planes of weakness (DPW), and the alpha angle (Alpha) between the tunnel axis and the planes of weakness on the TBM rate of penetration (ROP). Four (4) statistical regression models (two linear and two nonlinear) are built to predict the ROP of TBM. Finally a fuzzy logic model is developed as an alternative method and compared to the four statistical regression models. Results show that the fuzzy logic model provides better estimations and can be applied to predict the TBM performance. The R-squared value (R2) of the fuzzy logic model scores the highest value of 0.714 over the second runner-up of 0.667 from the multiple variables nonlinear regression model.

  15. Deep ensemble learning of sparse regression models for brain disease diagnosis.

    Science.gov (United States)

    Suk, Heung-Il; Lee, Seong-Whan; Shen, Dinggang

    2017-04-01

    Recent studies on brain imaging analysis witnessed the core roles of machine learning techniques in computer-assisted intervention for brain disease diagnosis. Of various machine-learning techniques, sparse regression models have proved their effectiveness in handling high-dimensional data but with a small number of training samples, especially in medical problems. In the meantime, deep learning methods have been making great successes by outperforming the state-of-the-art performances in various applications. In this paper, we propose a novel framework that combines the two conceptually different methods of sparse regression and deep learning for Alzheimer's disease/mild cognitive impairment diagnosis and prognosis. Specifically, we first train multiple sparse regression models, each of which is trained with different values of a regularization control parameter. Thus, our multiple sparse regression models potentially select different feature subsets from the original feature set; thereby they have different powers to predict the response values, i.e., clinical label and clinical scores in our work. By regarding the response values from our sparse regression models as target-level representations, we then build a deep convolutional neural network for clinical decision making, which thus we call 'Deep Ensemble Sparse Regression Network.' To our best knowledge, this is the first work that combines sparse regression models with deep neural network. In our experiments with the ADNI cohort, we validated the effectiveness of the proposed method by achieving the highest diagnostic accuracies in three classification tasks. We also rigorously analyzed our results and compared with the previous studies on the ADNI cohort in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.
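
    The first stage of the framework, an ensemble of sparse regression models whose predictions serve as "target-level representations" for a downstream learner, can be sketched as follows. A plain logistic regression stands in for the paper's deep convolutional network, and all data, feature counts and regularization values are synthetic assumptions.

    ```python
    # Schematic of the ensemble-of-sparse-regressors idea: several Lasso models with
    # different regularization strengths, whose predictions are stacked as features
    # for a downstream classifier (a stand-in for the deep network in the paper).
    import numpy as np
    from sklearn.linear_model import Lasso, LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    X = rng.normal(size=(400, 200))                      # high-dimensional features
    w = np.zeros(200)
    w[:10] = rng.normal(size=10)                         # only 10 informative features
    score = X @ w + rng.normal(scale=0.5, size=400)      # continuous clinical score
    label = (score > np.median(score)).astype(int)       # binary clinical label

    X_tr, X_te, s_tr, s_te, y_tr, y_te = train_test_split(X, score, label, random_state=0)

    alphas = [0.01, 0.05, 0.1, 0.5]                      # different sparsity levels
    models = [Lasso(alpha=a).fit(X_tr, s_tr) for a in alphas]

    # each model's predicted score becomes one feature of the learned representation
    rep_tr = np.column_stack([m.predict(X_tr) for m in models])
    rep_te = np.column_stack([m.predict(X_te) for m in models])

    clf = LogisticRegression().fit(rep_tr, y_tr)
    print("downstream accuracy:", clf.score(rep_te, y_te))
    ```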

  16. A computational approach to compare regression modelling strategies in prediction research.

    Science.gov (United States)

    Pajouheshnia, Romin; Pestman, Wiebe R; Teerenstra, Steven; Groenwold, Rolf H H

    2016-08-25

    It is often unclear which approach to fit, assess and adjust a model will yield the most accurate prediction model. We present an extension of an approach for comparing modelling strategies in linear regression to the setting of logistic regression and demonstrate its application in clinical prediction research. A framework for comparing logistic regression modelling strategies by their likelihoods was formulated using a wrapper approach. Five different strategies for modelling, including simple shrinkage methods, were compared in four empirical data sets to illustrate the concept of a priori strategy comparison. Simulations were performed in both randomly generated data and empirical data to investigate the influence of data characteristics on strategy performance. We applied the comparison framework in a case study setting. Optimal strategies were selected based on the results of a priori comparisons in a clinical data set and the performance of models built according to each strategy was assessed using the Brier score and calibration plots. The performance of modelling strategies was highly dependent on the characteristics of the development data in both linear and logistic regression settings. A priori comparisons in four empirical data sets found that no strategy consistently outperformed the others. The percentage of times that a model adjustment strategy outperformed a logistic model ranged from 3.9 to 94.9 %, depending on the strategy and data set. However, in our case study setting the a priori selection of optimal methods did not result in detectable improvement in model performance when assessed in an external data set. The performance of prediction modelling strategies is a data-dependent process and can be highly variable between data sets within the same clinical domain. A priori strategy comparison can be used to determine an optimal logistic regression modelling strategy for a given data set before selecting a final modelling approach.

  17. Thermal Aging Behaviors of Rubber Vulcanizates Cured with Single and Binary Cure Systems

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sung Seen; Ha, Sung Ho [Sejong University, Seoul (Korea, Republic of); Woo, Chang Su [Korea Institute of Machinery and Materials, Daejeon (Korea, Republic of)

    2006-03-15

    In general, an accelerated sulfur cure system consists of elemental sulfur, one or two cure accelerators, and cure activators. Crosslink density of a rubber vulcanizate determines the physical properties. By increasing the crosslink density, the modulus, hardness, resilience, and abrasion resistance increase, whereas the elongation at break, heat build-up, and stress relaxation decrease. Sulfur linkages are composed of monosulfide, disulfide, and polysulfides. Sulfur linkages, especially polysulfides, are dissociated by heating and this brings about decrease of the crosslink density.

  18. Strength of Geopolymer Cement Curing at Ambient Temperature by Non-Oven Curing Approaches: An Overview

    Science.gov (United States)

    Wattanachai, Pitiwat; Suwan, Teewara

    2017-06-01

    At the present day, the concept of environmentally friendly construction materials is being intensively studied to reduce the amount of greenhouse gases released. Geopolymer is a cementitious binder that can be produced by utilising pozzolanic wastes (e.g. fly ash or furnace slag) and is receiving much more attention as a low-CO2 emission material. However, to achieve excellent mechanical properties, a heat curing process needs to be applied to geopolymer cement at temperatures of around 40 to 90°C. To consume less oven-curing energy and be more convenient in practical work, geopolymer curing at ambient temperature (around 20 to 25°C) is therefore widely investigated. In this paper, a core review of factors and approaches for non-oven-cured geopolymer is summarised. The performance, in terms of strength, of each non-oven curing method is also presented and analysed. The main aim of this review paper is to gather the latest studies of ambient-temperature-cured geopolymer and to broaden the feasibility of non-oven-cured geopolymer development. In addition, to extend the directions of research work, some approaches or techniques can be combined or applied to specific properties for in-field applications and embankment stabilisation using soil-cement columns.

  19. Delbruck Prize Award: Insights into HIV Dynamics and Cure

    Science.gov (United States)

    Perelson, Alan S.

    A large effort is being made to find a means to cure HIV infection. I will present a dynamical model of a phenomenon called post-treatment control (PTC) or functional cure of HIV-infection in which some patients treated with suppressive antiviral therapy have been taken off of therapy and then spontaneously control HIV infection. The model relies on an immune response and bistability to explain PTC. I will then generalize the model to explicitly include immunotherapy with monoclonal antibodies approved for use in cancer to show that one can induce PTC with a limited number of antibody infusions and compare model predictions with experiments in SIV infected macaques given immunotherapy. Lastly, I will argue that quantitative insights derived from models of HIV infection have and will continue to play an important role in medicine.

  20. Potential of yeasts isolated from dry-cured ham to control ochratoxin A production in meat models.

    Science.gov (United States)

    Peromingo, Belén; Núñez, Félix; Rodríguez, Alicia; Alía, Alberto; Andrade, María J

    2018-03-02

    The environmental conditions reached during the ripening of dry-cured meat products favour the proliferation of moulds on their surface. Some of these moulds are hazardous to consumers because of their ability to produce ochratoxin A (OTA). Biocontrol using Debaryomyces hansenii could be a suitable strategy to prevent the growth of ochratoxigenic moulds and OTA accumulation in dry-cured meat products. The aim of this work was to evaluate the ability of two strains of D. hansenii to control the growth and OTA production of Penicillium verrucosum in a meat model under water activity (aw) values commonly reached during dry-cured meat product ripening. The presence of D. hansenii strains triggered a lengthening of the lag phase and a decrease of the growth rate of P. verrucosum in meat-based media at 0.97 and 0.92 aw. Both D. hansenii strains significantly reduced OTA production (between 85.16 and 92.63%) by P. verrucosum in the meat-based medium at 0.92 aw. Neither absorption nor detoxification of OTA by D. hansenii strains seems to be involved. However, a repression of the expression of the non-ribosomal peptide synthetase (otanpsPN) gene linked to the OTA biosynthetic pathway was observed in the presence of D. hansenii. To confirm the protective role of D. hansenii strains, they were inoculated together with P. verrucosum Pv45 in dry-fermented sausage and dry-cured ham slices. Although P. verrucosum Pv45 counts were not affected by the presence of D. hansenii in both meat matrices, a reduction of the OTA amount was observed. Therefore, the effect of D. hansenii strains on OTA accumulation should be attributed to a reduction at the transcriptional level. Consequently, native D. hansenii can be useful as a biocontrol agent in dry-cured meat products for preventing the hazard associated with the presence of OTA. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. A generalized right truncated bivariate Poisson regression model with applications to health data.

    Science.gov (United States)

    Islam, M Ataharul; Chowdhury, Rafiqul I

    2017-01-01

    A generalized right truncated bivariate Poisson regression model is proposed in this paper. Estimation and tests for goodness of fit and over- or underdispersion are illustrated for both untruncated and right truncated bivariate Poisson regression models using a marginal-conditional approach. Estimation and test procedures are illustrated for bivariate Poisson regression models with applications to Health and Retirement Study data on the number of health conditions and the number of health care services utilized. The proposed test statistics are easy to compute, and it is evident from the results that the models fit the data very well. A comparison between the right truncated and untruncated bivariate Poisson regression models using the test for nonnested models clearly shows that the truncated model performs significantly better than the untruncated model.

  2. A comparative evaluation of effect of modern-curing lights and curing modes on conventional and novel-resin monomers

    Science.gov (United States)

    Roy, Konda Karthik; Kumar, Kanumuru Pavan; John, Gijo; Sooraparaju, Sujatha Gopal; Nujella, Surya Kumari; Sowmya, Kyatham

    2018-01-01

    Aim: The aim of this study is to compare and evaluate the effect of curing lights and curing modes on nanohybrid composite resins with conventional Bis-GMA and novel tricyclodecane (TCD) monomers. Methodology: Two nanohybrid composites, IPS Empress Direct and Charisma Diamond, were used in this study. A light-emitting diode (LED) curing unit and a quartz-tungsten-halogen (QTH) curing unit were each operated in two different modes: continuous and soft start. Based on the composite resin, curing light, and mode of curing used, the samples were divided into 8 groups. After polymerization, the samples were stored for 48 h in complete darkness at 37°C and 100% humidity. The Vickers hardness (VK) of the surface was determined with a Vickers indenter by the application of 200 g for 15 s. Three VK readings were recorded for each sample on both the top and bottom surfaces. For all the specimens, the three hardness values for each surface were averaged and reported as a single value. The mean VK and hardness ratio were calculated. The depth of cure was assessed based on the hardness ratio. Results: Comparison of mean hardness values and hardness ratios was done using ANOVA with post hoc Tukey's test. Conclusion: Both QTH- and LED-curing units showed an adequate depth of cure. The soft-start curing mode in both QTH- and LED-curing lights increased microhardness more effectively than the continuous mode of curing. The TCD monomer showed higher hardness values compared with the conventional Bis-GMA-containing resin. PMID:29628651

  3. Likelihood inference for COM-Poisson cure rate model with interval-censored data and Weibull lifetimes.

    Science.gov (United States)

    Pal, Suvra; Balakrishnan, N

    2017-10-01

    In this paper, we consider a competing cause scenario and assume the number of competing causes to follow a Conway-Maxwell Poisson distribution which can capture both over and under dispersion that is usually encountered in discrete data. Assuming the population of interest having a component cure and the form of the data to be interval censored, as opposed to the usually considered right-censored data, the main contribution is in developing the steps of the expectation maximization algorithm for the determination of the maximum likelihood estimates of the model parameters of the flexible Conway-Maxwell Poisson cure rate model with Weibull lifetimes. An extensive Monte Carlo simulation study is carried out to demonstrate the performance of the proposed estimation method. Model discrimination within the Conway-Maxwell Poisson distribution is addressed using the likelihood ratio test and information-based criteria to select a suitable competing cause distribution that provides the best fit to the data. A simulation study is also carried out to demonstrate the loss in efficiency when selecting an improper competing cause distribution which justifies the use of a flexible family of distributions for the number of competing causes. Finally, the proposed methodology and the flexibility of the Conway-Maxwell Poisson distribution are illustrated with two known data sets from the literature: smoking cessation data and breast cosmesis data.
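
    The record's model (COM-Poisson competing causes, interval censoring, EM estimation) is fairly elaborate; the basic structure of a cure rate likelihood is easier to see in a much simpler mixture cure model with a logistic cure probability, Weibull lifetimes for the uncured, and right censoring, fitted by direct likelihood maximization. The sketch below uses that simplified model and synthetic data, and is not the estimator developed in the paper.

    ```python
    # Simplified mixture cure model (not the COM-Poisson / interval-censored model of the
    # record): cure probability pi, Weibull latency for the uncured, right censoring.
    # Events contribute (1 - pi) * f(t); censored cases contribute pi + (1 - pi) * S(t).
    import numpy as np
    from scipy import optimize

    rng = np.random.default_rng(6)
    n = 1000
    pi_true, shape, scale = 0.3, 1.5, 2.0
    cured = rng.random(n) < pi_true
    t_event = np.where(cured, np.inf, scale * rng.weibull(shape, n))
    t_cens = rng.uniform(0, 6, n)
    time = np.minimum(t_event, t_cens)
    event = (t_event <= t_cens).astype(float)

    def negloglik(theta):
        logit_pi, log_k, log_lam = theta
        pi = 1.0 / (1.0 + np.exp(-logit_pi))
        k, lam = np.exp(log_k), np.exp(log_lam)
        z = (time / lam) ** k
        log_f = np.log(k / lam) + (k - 1) * np.log(time / lam) - z   # Weibull log-density
        S = np.exp(-z)                                               # Weibull survival
        ll = event * (np.log1p(-pi) + log_f) + (1 - event) * np.log(pi + (1 - pi) * S)
        return -np.sum(ll)

    fit = optimize.minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
    logit_pi, log_k, log_lam = fit.x
    print("estimated cure fraction:", 1.0 / (1.0 + np.exp(-logit_pi)))
    print("estimated Weibull shape, scale:", np.exp(log_k), np.exp(log_lam))
    ```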

  4. Hierarchical Neural Regression Models for Customer Churn Prediction

    Directory of Open Access Journals (Sweden)

    Golshan Mohammadi

    2013-01-01

    Full Text Available As customers are the main assets of each industry, customer churn prediction is becoming a major task for companies to remain in competition with competitors. In the literature, the better applicability and efficiency of hierarchical data mining techniques has been reported. This paper considers three hierarchical models by combining four different data mining techniques for churn prediction, which are backpropagation artificial neural networks (ANN), self-organizing maps (SOM), alpha-cut fuzzy c-means (α-FCM), and the Cox proportional hazards regression model. The hierarchical models are ANN + ANN + Cox, SOM + ANN + Cox, and α-FCM + ANN + Cox. In particular, the first component of the models aims to cluster data in two churner and nonchurner groups and also filter out unrepresentative data or outliers. Then, the clustered data as the outputs are used to assign customers to churner and nonchurner groups by the second technique. Finally, the correctly classified data are used to create Cox proportional hazards model. To evaluate the performance of the hierarchical models, an Iranian mobile dataset is considered. The experimental results show that the hierarchical models outperform the single Cox regression baseline model in terms of prediction accuracy, Types I and II errors, RMSE, and MAD metrics. In addition, the α-FCM + ANN + Cox model significantly performs better than the two other hierarchical models.

  5. LINEAR REGRESSION MODEL ESTIMATION FOR RIGHT CENSORED DATA

    Directory of Open Access Journals (Sweden)

    Ersin Yılmaz

    2016-05-01

    Full Text Available In this study, we first define right-censored data: briefly, right censoring means that some values are known only to lie above a certain limit, which may be related to the measuring device. We then consider a linear regression model in which the response variable is right-censored and estimate it. To account for the censoring, Kaplan-Meier weights are used in the estimation of the model; with these weights the regression estimator is consistent and unbiased. A semiparametric regression method for censored data is also available and likewise gives useful results. This study may be particularly useful for health research, where censored data frequently arise in medical applications.
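
    One common way to implement this weighting idea is sketched below: estimate the censoring distribution with a Kaplan-Meier estimator (hand-rolled here), weight each uncensored observation by the inverse of the censoring survival at its observed time, and run weighted least squares. This is an inverse-probability-of-censoring-weighted (IPCW-style) estimator on synthetic data; the paper's exact weighting scheme may differ.

    ```python
    # Censored linear regression with Kaplan-Meier-based weights: KM estimate of the
    # censoring survival G(t), weight 1/G(t-) for uncensored observations, weighted OLS.
    import numpy as np

    def km_survival(times, events):
        """Kaplan-Meier survival estimate evaluated just before each input time."""
        order = np.argsort(times)
        t_sorted, e_sorted = times[order], events[order]
        n = len(times)
        at_risk = n - np.arange(n)
        factors = np.where(e_sorted == 1, 1.0 - 1.0 / at_risk, 1.0)
        surv_after = np.cumprod(factors)                          # S(t) just after each time
        surv_before = np.concatenate([[1.0], surv_after[:-1]])    # S(t-) at each time
        out = np.empty(n)
        out[order] = surv_before
        return out

    rng = np.random.default_rng(7)
    n = 500
    x = rng.normal(size=n)
    y_true = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
    c = rng.normal(loc=3.0, scale=1.5, size=n)                    # censoring times
    y_obs = np.minimum(y_true, c)
    delta = (y_true <= c).astype(float)                           # 1 = observed, 0 = censored

    G = km_survival(y_obs, 1.0 - delta)                           # KM of the censoring distribution
    w = np.where(delta == 1, 1.0 / np.clip(G, 1e-8, None), 0.0)

    X = np.column_stack([np.ones(n), x])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y_obs, rcond=None)
    print("weighted estimate of (intercept, slope):", beta)
    ```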

  6. Developing and testing a global-scale regression model to quantify mean annual streamflow

    Science.gov (United States)

    Barbarossa, Valerio; Huijbregts, Mark A. J.; Hendriks, A. Jan; Beusen, Arthur H. W.; Clavreul, Julie; King, Henry; Schipper, Aafke M.

    2017-01-01

    Quantifying mean annual flow of rivers (MAF) at ungauged sites is essential for assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF based on a dataset unprecedented in size, using observations of discharge and catchment characteristics from 1885 catchments worldwide, measuring between 2 and 10^6 km2. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area and catchment averaged mean annual precipitation and air temperature, slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error (RMSE) values were lower (0.29-0.38 compared to 0.49-0.57) and the modified index of agreement (d) was higher (0.80-0.83 compared to 0.72-0.75). Our regression model can be applied globally to estimate MAF at any point of the river network, thus providing a feasible alternative to spatially explicit process-based global hydrological models.

  7. Correlation between the state of cure of thermosetting resins and their properties

    International Nuclear Information System (INIS)

    Haffane, N.; Benameur, T.; Granger, R.; Vergnaud, J.M.

    1996-01-01

    Thermosetting resins, in the same way as other polymers, are increasingly used for coating metal sheets in order to impart various interesting properties. An important problem arises with the cure of the thermoset, the process of cure being complex, involving heat transfer by conduction and convection as well as the heat generated by the cure reaction. The kinetics of the heat evolved from the overall cure reaction is determined through calorimetry experiments in scanning mode. The state of cure at time t is expressed by the heat generated by reaction up to time t as a fraction of the total heat generated. A numerical model taking all the facts into account is able to evaluate the profile of the state of cure developed through the thickness of the thermoset. The state of cure, which derives from a theoretical point of view, is correlated with some properties of interest for the coating, such as the hardness and the resistance to liquids. The resistance to water and ethanol is evaluated by determining the kinetics of absorption, which is controlled by diffusion. copyright 1996 American Institute of Physics

  8. Effect of cementitious permanent formwork on moisture field of internal-cured concrete under drying

    Science.gov (United States)

    Wang, Jiahe; Zhang, Jun; Ding, Xiaoping; Zhang, Jiajia

    2018-02-01

    Drying shrinkage of concrete may still be the main source of cracking in concrete structures, even though the autogenous shrinkage of concrete can be effectively reduced by using internal curing. In the present paper, the effect of internal curing with pre-soaked lightweight aggregate and engineered cementitious composite permanent formwork (ECC-PF) on the moisture distribution in three kinds of concrete in a drying environment is investigated through both experiments and theoretical modeling. The test results show that the combined use of ECC-PF and internal curing maintains the humidity at a relatively high level, not only far from the drying surface but also close to it. The developed model captures the characteristics of the moisture distribution in concrete under drying, and the impacts of internal curing and ECC-PF are reflected as well. The model can be used for the design of concrete structures with combined use of internal curing and permanent formwork.

  9. A land use regression model for ambient ultrafine particles in Montreal, Canada: A comparison of linear regression and a machine learning approach.

    Science.gov (United States)

    Weichenthal, Scott; Ryswyk, Keith Van; Goldstein, Alon; Bagg, Scott; Shekkarizfard, Maryam; Hatzopoulou, Marianne

    2016-04-01

    Existing evidence suggests that ambient ultrafine particles (UFPs) may have adverse health effects. We developed a land use regression model for UFPs in Montreal, Canada using mobile monitoring data collected from 414 road segments during the summer and winter months between 2011 and 2012. Two different approaches were examined for model development including standard multivariable linear regression and a machine learning approach (kernel-based regularized least squares (KRLS)) that learns the functional form of covariate impacts on ambient UFP concentrations from the data. The final models included parameters for population density, ambient temperature and wind speed, land use parameters (park space and open space), length of local roads and rail, and estimated annual average NOx emissions from traffic. The final multivariable linear regression model explained 62% of the spatial variation in ambient UFP concentrations whereas the KRLS model explained 79% of the variance. The KRLS model performed slightly better than the linear regression model when evaluated using an external dataset (R(2)=0.58 vs. 0.55) or a cross-validation procedure (R(2)=0.67 vs. 0.60). In general, our findings suggest that the KRLS approach may offer modest improvements in predictive performance compared to standard multivariable linear regression models used to estimate spatial variations in ambient UFPs. However, differences in predictive performance were not statistically significant when evaluated using the cross-validation procedure. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  10. Adaptive regression for modeling nonlinear relationships

    CERN Document Server

    Knafl, George J

    2016-01-01

    This book presents methods for investigating whether relationships are linear or nonlinear and for adaptively fitting appropriate models when they are nonlinear. Data analysts will learn how to incorporate nonlinearity in one or more predictor variables into regression models for different types of outcome variables. Such nonlinear dependence is often not considered in applied research, yet nonlinear relationships are common and so need to be addressed. A standard linear analysis can produce misleading conclusions, while a nonlinear analysis can provide novel insights into data, not otherwise possible. A variety of examples of the benefits of modeling nonlinear relationships are presented throughout the book. Methods are covered using what are called fractional polynomials based on real-valued power transformations of primary predictor variables combined with model selection based on likelihood cross-validation. The book covers how to formulate and conduct such adaptive fractional polynomial modeling in the s...

  11. Confidence bands for inverse regression models

    International Nuclear Information System (INIS)

    Birke, Melanie; Bissantz, Nicolai; Holzmann, Hajo

    2010-01-01

    We construct uniform confidence bands for the regression function in inverse, homoscedastic regression models with convolution-type operators. Here, the convolution is between two non-periodic functions on the whole real line rather than between two periodic functions on a compact interval, since the former situation arguably arises more often in applications. First, following Bickel and Rosenblatt (1973 Ann. Stat. 1 1071–95) we construct asymptotic confidence bands which are based on strong approximations and on a limit theorem for the supremum of a stationary Gaussian process. Further, we propose bootstrap confidence bands based on the residual bootstrap and prove consistency of the bootstrap procedure. A simulation study shows that the bootstrap confidence bands perform reasonably well for moderate sample sizes. Finally, we apply our method to data from a gel electrophoresis experiment with genetically engineered neuronal receptor subunits incubated with rat brain extract

  12. Electricity consumption forecasting in Italy using linear regression models

    Energy Technology Data Exchange (ETDEWEB)

    Bianco, Vincenzo; Manca, Oronzio; Nardini, Sergio [DIAM, Seconda Universita degli Studi di Napoli, Via Roma 29, 81031 Aversa (CE) (Italy)

    2009-09-15

    The influence of economic and demographic variables on the annual electricity consumption in Italy has been investigated with the intention to develop a long-term consumption forecasting model. The time period considered for the historical data is from 1970 to 2007. Different regression models were developed, using historical electricity consumption, gross domestic product (GDP), gross domestic product per capita (GDP per capita) and population. A first part of the paper considers the estimation of GDP, price and GDP per capita elasticities of domestic and non-domestic electricity consumption. The domestic and non-domestic short run price elasticities are found to be both approximately equal to -0.06, while long run elasticities are equal to -0.24 and -0.09, respectively. On the contrary, the elasticities of GDP and GDP per capita present higher values. In the second part of the paper, different regression models, based on co-integrated or stationary data, are presented. Different statistical tests are employed to check the validity of the proposed models. A comparison with national forecasts, based on complex econometric models, such as Markal-Time, was performed, showing that the developed regressions are congruent with the official projections, with deviations of ±1% for the best case and ±11% for the worst. These deviations are to be considered acceptable in relation to the time span taken into account. (author)

  13. Electricity consumption forecasting in Italy using linear regression models

    International Nuclear Information System (INIS)

    Bianco, Vincenzo; Manca, Oronzio; Nardini, Sergio

    2009-01-01

    The influence of economic and demographic variables on the annual electricity consumption in Italy has been investigated with the intention to develop a long-term consumption forecasting model. The time period considered for the historical data is from 1970 to 2007. Different regression models were developed, using historical electricity consumption, gross domestic product (GDP), gross domestic product per capita (GDP per capita) and population. A first part of the paper considers the estimation of GDP, price and GDP per capita elasticities of domestic and non-domestic electricity consumption. The domestic and non-domestic short run price elasticities are found to be both approximately equal to -0.06, while long run elasticities are equal to -0.24 and -0.09, respectively. On the contrary, the elasticities of GDP and GDP per capita present higher values. In the second part of the paper, different regression models, based on co-integrated or stationary data, are presented. Different statistical tests are employed to check the validity of the proposed models. A comparison with national forecasts, based on complex econometric models, such as Markal-Time, was performed, showing that the developed regressions are congruent with the official projections, with deviations of ±1% for the best case and ±11% for the worst. These deviations are to be considered acceptable in relation to the time span taken into account. (author)

  14. Improving sub-pixel imperviousness change prediction by ensembling heterogeneous non-linear regression models

    Directory of Open Access Journals (Sweden)

    Drzewiecki Wojciech

    2016-12-01

    Full Text Available In this work nine non-linear regression models were compared for sub-pixel impervious surface area mapping from Landsat images. The comparison was done in three study areas both for accuracy of imperviousness coverage evaluation in individual points in time and accuracy of imperviousness change assessment. The performance of individual machine learning algorithms (Cubist, Random Forest, stochastic gradient boosting of regression trees, k-nearest neighbors regression, random k-nearest neighbors regression, Multivariate Adaptive Regression Splines, averaged neural networks, and support vector machines with polynomial and radial kernels) was also compared with the performance of heterogeneous model ensembles constructed from the best models trained using particular techniques.

  15. Modelling infant mortality rate in Central Java, Indonesia use generalized poisson regression method

    Science.gov (United States)

    Prahutama, Alan; Sudarno

    2018-05-01

    The infant mortality rate is the number of deaths under one year of age occurring among the live births in a given geographical area during a given year, per 1,000 live births in the same area during the same year. This problem needs to be addressed because it is an important element of a country's economic development, and a high infant mortality rate disrupts the stability of a country as it relates to the sustainability of its population. One regression model that can be used to analyze the relationship between a discrete dependent variable Y and independent variables X is the Poisson regression model. Regression models recently used for discrete dependent variables include, among others, Poisson regression, negative binomial regression and generalized Poisson regression. In this research, generalized Poisson regression gives a better AIC value than Poisson regression. The most significant variable is the number of health facilities (X1), while the variable with the greatest influence on the infant mortality rate is average breastfeeding (X9).

  16. Degree of conversion of resin-based materials cured with dual-peak or single-peak LED light-curing units.

    Science.gov (United States)

    Lucey, Siobhan M; Santini, Ario; Roebuck, Elizabeth M

    2015-03-01

    There is a lack of data on polymerization of resin-based materials (RBMs) used in paediatric dentistry, using dual-peak light-emitting diode (LED) light-curing units (LCUs). To evaluate the degree of conversion (DC) of RBMs cured with dual-peak or single-peak LED LCUs. Samples of Vit-l-escence (Ultradent) and Herculite XRV Ultra (Kerr) and fissure sealants Delton Clear and Delton Opaque (Dentsply) were prepared (n = 3 per group) and cured with either one of two dual-peak LCUs (bluephase® G2; Ivoclar Vivadent or Valo; Ultradent) or a single-peak LCU (bluephase®; Ivoclar Vivadent). High-performance liquid chromatography and nuclear magnetic resonance spectroscopy were used to confirm the presence or absence of initiators other than camphorquinone. The DC was determined using micro-Raman spectroscopy. Data were analysed using general linear model ANOVA; α = 0.05. With Herculite XRV Ultra, the single-peak LCU gave higher DC values than either of the two dual-peak LCUs (P < 0.05). Both fissure sealants showed higher DC compared with the two RBMs (P < 0.05); the DC at the bottom of the clear sealant was greater than that of the opaque sealant (P < 0.05). 2,4,6-trimethylbenzoyldiphenylphosphine oxide (Lucirin® TPO) was found only in Vit-l-escence. Dual-peak LED LCUs may not be best suited for curing non-Lucirin® TPO-containing materials. A clear sealant showed a better cure throughout the material and may be more appropriate than opaque versions in deep fissures. © 2014 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Augmented Beta rectangular regression models: A Bayesian perspective.

    Science.gov (United States)

    Wang, Jue; Luo, Sheng

    2016-01-01

    Mixed effects Beta regression models based on Beta distributions have been widely used to analyze longitudinal percentage or proportional data ranging between zero and one. However, Beta distributions are not flexible to extreme outliers or excessive events around tail areas, and they do not account for the presence of the boundary values zeros and ones because these values are not in the support of the Beta distributions. To address these issues, we propose a mixed effects model using Beta rectangular distribution and augment it with the probabilities of zero and one. We conduct extensive simulation studies to assess the performance of mixed effects models based on both the Beta and Beta rectangular distributions under various scenarios. The simulation studies suggest that the regression models based on Beta rectangular distributions improve the accuracy of parameter estimates in the presence of outliers and heavy tails. The proposed models are applied to the motivating Neuroprotection Exploratory Trials in Parkinson's Disease (PD) Long-term Study-1 (LS-1 study, n = 1741), developed by The National Institute of Neurological Disorders and Stroke Exploratory Trials in Parkinson's Disease (NINDS NET-PD) network. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Bayesian semiparametric regression models to characterize molecular evolution

    Directory of Open Access Journals (Sweden)

    Datta Saheli

    2012-10-01

    Full Text Available Abstract Background Statistical models and methods that associate changes in the physicochemical properties of amino acids with natural selection at the molecular level typically do not take into account the correlations between such properties. We propose a Bayesian hierarchical regression model with a generalization of the Dirichlet process prior on the distribution of the regression coefficients that describes the relationship between the changes in amino acid distances and natural selection in protein-coding DNA sequence alignments. Results The Bayesian semiparametric approach is illustrated with simulated data and the abalone lysin sperm data. Our method identifies groups of properties which, for this particular dataset, have a similar effect on evolution. The model also provides nonparametric site-specific estimates for the strength of conservation of these properties. Conclusions The model described here is distinguished by its ability to handle a large number of amino acid properties simultaneously, while taking into account that such data can be correlated. The multi-level clustering ability of the model allows for appealing interpretations of the results in terms of properties that are roughly equivalent from the standpoint of molecular evolution.

  19. Regression analysis of a chemical reaction fouling model

    International Nuclear Information System (INIS)

    Vasak, F.; Epstein, N.

    1996-01-01

    A previously reported mathematical model for the initial chemical reaction fouling of a heated tube is critically examined in the light of the experimental data for which it was developed. A regression analysis of the model with respect to that data shows that the reference point upon which the two adjustable parameters of the model were originally based was well chosen, albeit fortuitously. (author). 3 refs., 2 tabs., 2 figs

  20. Correlation-regression model for physico-chemical quality of ...

    African Journals Online (AJOL)

    areas, suggesting that groundwater quality in urban areas is closely related with land use ... the groundwater, with a correlation and regression model, is also presented.

  1. Dielectric Cure Monitoring of Thermosetting Matrix Composites

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Geun [Agency for Defense Development, Daejeon (Korea, Republic of); Lee, Dae Gil [KAIST, Daejeon (Korea, Republic of)

    2003-10-15

    Cure monitoring can be used to improve the quality and productivity of thermosetting resin matrix composite products during their manufacturing process. In this work, the sensitivity of dielectrometry was improved by adequately separating the effects of the sensor and external factors on the measured signal. A new algorithm to obtain the degree of cure during dielectric cure monitoring of glass/polyester and glass/epoxy composites was developed by employing a function of both temperature and dissipation factor, in which five cure monitoring parameters were used to calculate the degree of cure. The decreasing pattern of the dissipation factor was compared with the relationships between the degree of cure and the resin viscosity. The developed algorithm might be employed for the in situ cure monitoring of thermosetting resin composites.

  2. Dielectric Cure Monitoring of Thermosetting Matrix Composites

    International Nuclear Information System (INIS)

    Kim, Hyoung Geun; Lee, Dae Gil

    2003-01-01

    Cure monitoring can be used to improve the quality and productivity of thermosetting resin matrix composite products during their manufacturing process. In this work, the sensitivity of dielectrometry was improved by adequately separating the effects of the sensor and external factors on the measured signal. A new algorithm to obtain the degree of cure during dielectric cure monitoring of glass/polyester and glass/epoxy composites was developed by employing a function of both temperature and dissipation factor, in which five cure monitoring parameters were used to calculate the degree of cure. The decreasing pattern of the dissipation factor was compared with the relationships between the degree of cure and the resin viscosity. The developed algorithm might be employed for the in situ cure monitoring of thermosetting resin composites.

  3. Multiple linear regression and regression with time series error models in forecasting PM10 concentrations in Peninsular Malaysia.

    Science.gov (United States)

    Ng, Kar Yong; Awang, Norhashidah

    2018-01-06

    Frequent haze occurrences in Malaysia have made the management of PM10 (particulate matter with aerodynamic diameter less than 10 μm) pollution a critical task. This requires knowledge of the factors associated with PM10 variation and good forecasts of PM10 concentrations. Hence, this paper demonstrates the prediction of 1-day-ahead daily average PM10 concentrations based on predictor variables including meteorological parameters and gaseous pollutants. Three different models were built: a multiple linear regression (MLR) model with lagged predictor variables (MLR1), an MLR model with lagged predictor variables and PM10 concentrations (MLR2), and a regression with time series error (RTSE) model. The findings revealed that humidity, temperature, wind speed, wind direction, carbon monoxide and ozone were the main factors explaining the PM10 variation in Peninsular Malaysia. Comparison among the three models showed that the MLR2 model was on the same level as the RTSE model in terms of forecasting accuracy, while the MLR1 model was the worst.
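
    The "regression with time series error" idea, a linear regression whose residuals follow an ARMA process, can be sketched with statsmodels' SARIMAX using exogenous regressors. The variables below are synthetic stand-ins for meteorological predictors, not the study's PM10 data, and the AR(1) error structure is an illustrative assumption.

    ```python
    # Regression with AR(1) time-series errors via SARIMAX with exogenous regressors,
    # on synthetic stand-ins for the meteorological predictors.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(8)
    n = 365
    wind = rng.normal(size=n)
    temp = rng.normal(size=n)
    err = np.zeros(n)
    for t in range(1, n):                          # AR(1) errors
        err[t] = 0.7 * err[t - 1] + rng.normal(scale=1.0)
    pm10 = 50.0 - 3.0 * wind + 1.5 * temp + err

    exog = pd.DataFrame({"wind": wind, "temp": temp})
    model = SARIMAX(pm10, exog=exog, order=(1, 0, 0))
    result = model.fit(disp=False)
    print(result.params)                           # regression coefficients + AR(1) coefficient

    # 1-day-ahead forecast given next-day predictor values
    # (here the last observed row is reused purely for illustration)
    forecast = result.forecast(steps=1, exog=exog.iloc[[-1]])
    print(forecast)
    ```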

  4. Multivariate Frequency-Severity Regression Models in Insurance

    Directory of Open Access Journals (Sweden)

    Edward W. Frees

    2016-02-01

    Full Text Available In insurance and related industries including healthcare, it is common to have several outcome measures that the analyst wishes to understand using explanatory variables. For example, in automobile insurance, an accident may result in payments for damage to one's own vehicle, damage to another party's vehicle, or personal injury. It is also common to be interested in the frequency of accidents in addition to the severity of the claim amounts. This paper synthesizes and extends the literature on multivariate frequency-severity regression modeling with a focus on insurance industry applications. Regression models for understanding the distribution of each outcome continue to be developed yet there now exists a solid body of literature for the marginal outcomes. This paper contributes to this body of literature by focusing on the use of a copula for modeling the dependence among these outcomes; a major advantage of this tool is that it preserves the body of work established for marginal models. We illustrate this approach using data from the Wisconsin Local Government Property Insurance Fund. This fund offers insurance protection for (i) property; (ii) motor vehicle; and (iii) contractors' equipment claims. In addition to several claim types and frequency-severity components, outcomes can be further categorized by time and space, requiring complex dependency modeling. We find significant dependencies for these data; specifically, we find that dependencies among lines are stronger than the dependencies between the frequency and average severity within each line.

  5. Methodology to predict long-term cancer survival from short-term data using Tobacco Cancer Risk and Absolute Cancer Cure models

    International Nuclear Information System (INIS)

    Mould, R F; Lederman, M; Tai, P; Wong, J K M

    2002-01-01

    Three parametric statistical models have been fully validated for cancer of the larynx for the prediction of long-term 15, 20 and 25 year cancer-specific survival fractions when short-term follow-up data were available for just 1-2 years after the end of treatment of the last patient. In all groups of cases the treatment period was only 5 years. Three disease stage groups were studied, T1N0, T2N0 and T3N0. The models are the Standard Lognormal (SLN) first proposed by Boag (1949 J. R. Stat. Soc. Series B 11 15-53) but only ever fully validated for cancer of the cervix, Mould and Boag (1975 Br. J. Cancer 32 529-50), and two new models which have been termed Tobacco Cancer Risk (TCR) and Absolute Cancer Cure (ACC). In each, the frequency distribution of survival times of defined groups of cancer deaths is lognormally distributed: larynx only (SLN), larynx and lung (TCR) and all cancers (ACC). The models each have three unknown parameters, but it was possible to estimate a value for the lognormal parameter S a priori. By reducing to two unknown parameters, the model stability has been improved. The material used to validate the methodology consisted of case histories of 965 patients, all treated during the period 1944-1968 by Dr Manuel Lederman of the Royal Marsden Hospital, London, with follow-up to 1988. This provided a follow-up range of 20-44 years and enabled predicted long-term survival fractions to be compared with the actual survival fractions, calculated by the Kaplan and Meier (1958 J. Am. Stat. Assoc. 53 457-82) method. The TCR and ACC models are better than the SLN model and, for a maximum short-term follow-up of 6 years, the 20 and 25 year survival fractions could be predicted. The numbers of follow-up years saved are therefore 14 and 19 years, respectively. Clinical trial results using the TCR and ACC models can thus be analysed much earlier than currently possible. Absolute cure from cancer was also studied, using not only the prediction models which
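
    Once a cure fraction C and lognormal parameters (mu, sigma) for the times to cancer death have been estimated, the long-term cancer-specific survival follows directly from S(t) = C + (1 - C) * (1 - Phi((ln t - mu) / sigma)). The short sketch below evaluates this prediction; the parameter values are illustrative assumptions, not the fitted values of the larynx series.

    ```python
    # Long-term survival prediction from a fitted lognormal cure model:
    # S(t) = C + (1 - C) * (1 - Phi((ln t - mu) / sigma)); parameter values are illustrative.
    import numpy as np
    from scipy.stats import norm

    C, mu, sigma = 0.55, np.log(2.5), 0.8      # cure fraction and lognormal parameters (years)

    def survival(t):
        return C + (1.0 - C) * norm.sf((np.log(t) - mu) / sigma)

    for t in (15, 20, 25):
        print(f"predicted {t}-year cancer-specific survival: {survival(t):.3f}")
    ```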

  6. Quantile Regression Methods

    DEFF Research Database (Denmark)

    Fitzenberger, Bernd; Wilke, Ralf Andreas

    2015-01-01

    Quantile regression is emerging as a popular statistical approach, which complements the estimation of conditional mean models. While the latter only focuses on one aspect of the conditional distribution of the dependent variable, the mean, quantile regression provides more detailed insights by modeling conditional quantiles. Quantile regression can therefore detect whether the partial effect of a regressor on the conditional quantiles is the same for all quantiles or differs across quantiles. Quantile regression can provide evidence for a statistical relationship between two variables even if the mean regression model does not. We provide a short informal introduction into the principle of quantile regression which includes an illustrative application from empirical labor market research. This is followed by briefly sketching the underlying statistical model for linear quantile regression based...
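
    A minimal illustration of the contrast drawn above, using statsmodels on simulated heteroscedastic data (not the labor market application mentioned in the abstract): the quantile-specific slopes differ across quantiles, while OLS reports a single mean effect.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 500)
y = 2.0 + 0.5 * x + rng.normal(0, 1 + 0.3 * x)   # noise grows with x
X = sm.add_constant(x)

ols_fit = sm.OLS(y, X).fit()
for q in (0.1, 0.5, 0.9):
    qr_fit = sm.QuantReg(y, X).fit(q=q)
    print(f"q={q}: slope={qr_fit.params[1]:.3f}")
print("OLS slope:", round(ols_fit.params[1], 3))
```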

  7. Application of random regression models to the genetic evaluation ...

    African Journals Online (AJOL)

    The model included fixed regression on AM (range from 30 to 138 mo) and the effect of herd-measurement date concatenation. Random parts of the model were RRM coefficients for additive and permanent environmental effects, while residual effects were modelled to account for heterogeneity of variance by AY. Estimates ...

  8. EB/UV curing market in Malaysia

    Energy Technology Data Exchange (ETDEWEB)

    Dahlan, Khairul Zaman; Nik Salleh, Nik Ghazali; Mahmood, Mohd Hilmi [Malaysian Inst. for Nuclear Technology Res. (MINT), Bangi (Malaysia)

    1999-07-01

    Radiation curing of coatings of wood based products is expanding and being used for curing of coatings of table tops, parquet, wood panel, furniture, curtain railing, etc. UV curing of over print varnish is still the main application of UV curing in printing industry. However, curing of printing ink has also been extended in the printing of CD and VCD in addition to other printing such as paper, magazine, label on bottles, metal-can, etc. In the electronic industry, the manufacturer of printed circuit board is still the main consumer of UV curable resins. On the other hand, low energy electron beam machine is used mainly for cross-linking of heat shrink films.

  9. EB/UV curing market in Malaysia

    International Nuclear Information System (INIS)

    Khairul Zaman Dahlan; Nik Ghazali Nik Salleh; Mohd Hilmi Mahmood

    1999-01-01

    Radiation curing of coatings of wood based products is expanding and being used for curing of coatings of table tops, parquet, wood panel, furniture, curtain railing, etc. UV curing of over print varnish is still the main application of UV curing in printing industry. However, curing of printing ink has also been extended in the printing of CD and VCD in addition to other printing such as paper, magazine, label on bottles, metal-can, etc. In the electronic industry, the manufacturer of printed circuit board is still the main consumer of UV curable resins. On the other hand, low energy electron beam machine is used mainly for cross-linking of heat shrink films

  10. Multitask Quantile Regression under the Transnormal Model.

    Science.gov (United States)

    Fan, Jianqing; Xue, Lingzhou; Zou, Hui

    2016-01-01

    We consider estimating multi-task quantile regression under the transnormal model, with a focus on the high-dimensional setting. We derive a surprisingly simple closed-form solution through rank-based covariance regularization. In particular, we propose the rank-based ℓ1 penalization with positive definite constraints for estimating sparse covariance matrices, and the rank-based banded Cholesky decomposition regularization for estimating banded precision matrices. By taking advantage of the alternating direction method of multipliers, a nearest correlation matrix projection is introduced that inherits the sampling properties of the unprojected one. Our work combines the strengths of quantile regression and rank-based covariance regularization to simultaneously deal with nonlinearity and nonnormality for high-dimensional regression. Furthermore, the proposed method strikes a good balance between robustness and efficiency, achieves the "oracle"-like convergence rate, and provides the provable prediction interval under the high-dimensional setting. The finite-sample performance of the proposed method is also examined. The performance of our proposed rank-based method is demonstrated in a real application to analyze the protein mass spectroscopy data.

  11. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
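
    One commonly used approximation, shown here only as a sketch and not necessarily the authors' method, is to take the spread of the individual tree predictions as a rough proxy for prediction uncertainty:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Collect each tree's prediction for a few points and summarize the spread
tree_preds = np.stack([tree.predict(X[:5]) for tree in rf.estimators_])
print("ensemble mean:", tree_preds.mean(axis=0))
print("between-tree std (uncertainty proxy):", tree_preds.std(axis=0))
```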

  12. Profile-driven regression for modeling and runtime optimization of mobile networks

    DEFF Research Database (Denmark)

    McClary, Dan; Syrotiuk, Violet; Kulahci, Murat

    2010-01-01

    Computer networks often display nonlinear behavior when examined over a wide range of operating conditions. There are few strategies available for modeling such behavior and optimizing such systems as they run. Profile-driven regression is developed and applied to modeling and runtime optimization of throughput in a mobile ad hoc network, a self-organizing collection of mobile wireless nodes without any fixed infrastructure. The intermediate models generated in profile-driven regression are used to fit an overall model of throughput, and are also used to optimize controllable factors at runtime. Unlike...

  13. Cure Behavior and Thermal Properties of Diepoxidized Cardanol Resin Cured by Electron Beam Process

    International Nuclear Information System (INIS)

    Cho, Donghwan; Cheon, Jinsil

    2013-01-01

    Thermal curing of epoxy resin requires high temperature, a time-consuming process and the volatilization of hardener. It is known that electron beam curing of epoxy resin is a fast process that occurs at low or room temperature, which helps reduce residual mechanical stresses in thermosetting polymers. Diepoxidized cardanol (DEC) can be synthesized by an enzymatic method from cashew nut shell liquid (CNSL), which constitutes nearly one-third of the total nut weight. A large amount of CNSL can be formed as a byproduct of the mechanical processes used to render the cashew kernel edible, and its total production approaches one million tons annually; it is bio-degradable and can replace industrial thermosetting plastics. It is expected that DEC, which consists of two epoxide groups and a long alkyl chain, may be cured like an epoxy resin; two types of onium salts (cationic initiators) were used as photo-initiators. The experimental variables of this study are the type and concentration of photo-initiators and the electron beam dosage. In this study, the effects of initiator type and concentration on the cure behavior and the thermal properties of DEC resin processed by using electron beam technology were studied using FT-IR, TGA, TMA, DSC, and DMA. Figure 1 shows the FT-IR results, illustrating the change in chemical structure between pure DEC and electron beam cured DEC. The characteristic absorption peak of the epoxide group appeared at 850 cm⁻¹. Its shape and height were reduced when the sample was irradiated with electron beam. From this result, the epoxide groups in DEC were opened by the electron beam and cured. The electron-beam-cured DEC was then investigated for the formation of a 3-dimensional network.

  14. Cure Behavior and Thermal Properties of Diepoxidized Cardanol Resin Cured by Electron Beam Process

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Donghwan; Cheon, Jinsil [Kumoh National Institute of Technology, Gumi (Korea, Republic of)

    2013-07-01

    Thermal curing of epoxy resin requires high temperature, a time-consuming process and the volatilization of hardener. It is known that electron beam curing of epoxy resin is a fast process that occurs at low or room temperature, which helps reduce residual mechanical stresses in thermosetting polymers. Diepoxidized cardanol (DEC) can be synthesized by an enzymatic method from cashew nut shell liquid (CNSL), which constitutes nearly one-third of the total nut weight. A large amount of CNSL can be formed as a byproduct of the mechanical processes used to render the cashew kernel edible, and its total production approaches one million tons annually; it is bio-degradable and can replace industrial thermosetting plastics. It is expected that DEC, which consists of two epoxide groups and a long alkyl chain, may be cured like an epoxy resin; two types of onium salts (cationic initiators) were used as photo-initiators. The experimental variables of this study are the type and concentration of photo-initiators and the electron beam dosage. In this study, the effects of initiator type and concentration on the cure behavior and the thermal properties of DEC resin processed by using electron beam technology were studied using FT-IR, TGA, TMA, DSC, and DMA. Figure 1 shows the FT-IR results, illustrating the change in chemical structure between pure DEC and electron beam cured DEC. The characteristic absorption peak of the epoxide group appeared at 850 cm⁻¹. Its shape and height were reduced when the sample was irradiated with electron beam. From this result, the epoxide groups in DEC were opened by the electron beam and cured. The electron-beam-cured DEC was then investigated for the formation of a 3-dimensional network.

  15. Accounting for measurement error in log regression models with applications to accelerated testing.

    Science.gov (United States)

    Richardson, Robert; Tolley, H Dennis; Evenson, William E; Lunt, Barry M

    2018-01-01

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.

  16. Accounting for measurement error in log regression models with applications to accelerated testing.

    Directory of Open Access Journals (Sweden)

    Robert Richardson

    In regression settings, parameter estimates will be biased when the explanatory variables are measured with error. This bias can significantly affect modeling goals. In particular, accelerated lifetime testing involves an extrapolation of the fitted model, and a small amount of bias in parameter estimates may result in a significant increase in the bias of the extrapolated predictions. Additionally, bias may arise when the stochastic component of a log regression model is assumed to be multiplicative when the actual underlying stochastic component is additive. To account for these possible sources of bias, a log regression model with measurement error and additive error is approximated by a weighted regression model which can be estimated using Iteratively Re-weighted Least Squares. Using the reduced Eyring equation in an accelerated testing setting, the model is compared to previously accepted approaches to modeling accelerated testing data with both simulations and real data.
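
    A generic sketch of the iteratively re-weighted least squares idea mentioned above; the variance function used for the weights (variance proportional to the squared mean) is an illustrative assumption, not the weighting derived in the paper:

```python
import numpy as np

def irls(X, y, n_iter=20):
    # Start from ordinary least squares, then alternate between computing
    # weights from the current fit and re-solving the weighted normal equations.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        mu = X @ beta
        w = 1.0 / np.maximum(mu**2, 1e-8)          # illustrative variance model
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta

rng = np.random.default_rng(2)
x = rng.uniform(1, 5, 200)
X = np.column_stack([np.ones_like(x), x])
y = (X @ np.array([1.0, 0.8])) * (1 + 0.2 * rng.normal(size=200))  # multiplicative-type error
print("IRLS estimates:", irls(X, y))
```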

  17. OPLS statistical model versus linear regression to assess sonographic predictors of stroke prognosis.

    Science.gov (United States)

    Vajargah, Kianoush Fathi; Sadeghi-Bazargani, Homayoun; Mehdizadeh-Esfanjani, Robab; Savadi-Oskouei, Daryoush; Farhoudi, Mehdi

    2012-01-01

    The objective of the present study was to assess the comparable applicability of the orthogonal projections to latent structures (OPLS) statistical model vs traditional linear regression in order to investigate the role of transcranial Doppler (TCD) sonography in predicting ischemic stroke prognosis. The study was conducted on 116 ischemic stroke patients admitted to a specialty neurology ward. The Unified Neurological Stroke Scale was used once for clinical evaluation on the first week of admission and again six months later. All data were primarily analyzed using simple linear regression and later considered for multivariate analysis using PLS/OPLS models through the SIMCA P+12 statistical software package. The linear regression analysis results used for the identification of TCD predictors of stroke prognosis were confirmed through the OPLS modeling technique. Moreover, in comparison to linear regression, the OPLS model appeared to have higher sensitivity in detecting the predictors of ischemic stroke prognosis and detected several more predictors. Applying the OPLS model made it possible to use both single TCD measures/indicators and arbitrarily dichotomized measures of TCD single vessel involvement as well as the overall TCD result. In conclusion, the authors recommend PLS/OPLS methods as complementary rather than alternative to the available classical regression models such as linear regression.
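
    OPLS itself is not available in scikit-learn; the closely related PLS regression gives a feel for the latent-structure approach described above. The data below are simulated stand-ins for the TCD indicators and the six-month stroke scale score, not the study data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X = rng.normal(size=(116, 8))                # 8 hypothetical TCD indicators
y = X @ rng.normal(size=8) + rng.normal(size=116)

pls = PLSRegression(n_components=2).fit(X, y)
print("R^2 on training data:", pls.score(X, y))
print("X loadings (which indicators drive the latent components):")
print(np.round(pls.x_loadings_, 2))
```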

  18. A Technique of Fuzzy C-Mean in Multiple Linear Regression Model toward Paddy Yield

    Science.gov (United States)

    Syazwan Wahab, Nur; Saifullah Rusiman, Mohd; Mohamad, Mahathir; Amira Azmi, Nur; Che Him, Norziha; Ghazali Kamardan, M.; Ali, Maselan

    2018-04-01

    In this paper, we propose a hybrid model which is a combination of a multiple linear regression model and the fuzzy c-means method. This research involved the relationship between 20 variates of the top soil that were analyzed prior to planting of paddy yields at standard fertilizer rates. Data used were from the multi-location trials for rice carried out by MARDI at major paddy granaries in Peninsular Malaysia during the period from 2009 to 2012. Missing observations were estimated using mean estimation techniques. The data were analyzed using a multiple linear regression model and a combination of the multiple linear regression model and the fuzzy c-means method. Analysis of normality and multicollinearity indicates that the data are normally scattered without multicollinearity among independent variables. The fuzzy c-means analysis clusters the paddy yield into two clusters before the multiple linear regression model is applied. The comparison between the two methods indicates that the hybrid of the multiple linear regression model and the fuzzy c-means method outperforms the multiple linear regression model, with a lower mean square error.
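
    A hedged sketch of the hybrid idea (simulated covariates and yields, not the MARDI trial data): cluster the observations with a small hand-rolled fuzzy c-means, harden the memberships, and fit an ordinary least-squares model within each cluster.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100, seed=0):
    # Alternate between centre updates (weighted by memberships^m) and
    # membership updates from distances to the centres.
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=len(X))            # membership matrix (n, c)
    for _ in range(n_iter):
        centers = (u ** m).T @ X / (u ** m).sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
    return u

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 3))                                         # hypothetical soil covariates
y = 4.0 + X @ np.array([0.5, -0.2, 0.3]) + rng.normal(0, 0.3, 300)    # hypothetical yield

labels = fuzzy_cmeans(X, c=2).argmax(axis=1)                          # harden memberships
for k in range(2):
    mask = labels == k
    Xk = np.column_stack([np.ones(mask.sum()), X[mask]])
    beta = np.linalg.lstsq(Xk, y[mask], rcond=None)[0]
    mse = np.mean((y[mask] - Xk @ beta) ** 2)
    print(f"cluster {k}: coefficients {np.round(beta, 2)}, MSE {mse:.4f}")
```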

  19. Direct modeling of regression effects for transition probabilities in the progressive illness-death model

    DEFF Research Database (Denmark)

    Azarang, Leyla; Scheike, Thomas; de Uña-Álvarez, Jacobo

    2017-01-01

    In this work, we present direct regression analysis for the transition probabilities in the possibly non-Markov progressive illness–death model. The method is based on binomial regression, where the response is the indicator of the occupancy for the given state along time. Randomly weighted score...

  20. SPSS macros to compare any two fitted values from a regression model.

    Science.gov (United States)

    Weaver, Bruce; Dubois, Sacha

    2012-12-01

    In regression models with first-order terms only, the coefficient for a given variable is typically interpreted as the change in the fitted value of Y for a one-unit increase in that variable, with all other variables held constant. Therefore, each regression coefficient represents the difference between two fitted values of Y. But the coefficients represent only a fraction of the possible fitted value comparisons that might be of interest to researchers. For many fitted value comparisons that are not captured by any of the regression coefficients, common statistical software packages do not provide the standard errors needed to compute confidence intervals or carry out statistical tests-particularly in more complex models that include interactions, polynomial terms, or regression splines. We describe two SPSS macros that implement a matrix algebra method for comparing any two fitted values from a regression model. The !OLScomp and !MLEcomp macros are for use with models fitted via ordinary least squares and maximum likelihood estimation, respectively. The output from the macros includes the standard error of the difference between the two fitted values, a 95% confidence interval for the difference, and a corresponding statistical test with its p-value.
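
    The matrix-algebra idea behind the macros can be sketched in a few lines (shown here in Python rather than SPSS): the difference between two fitted values is c'beta with c = x1 - x2, and its standard error is sqrt(c' Cov(beta) c).

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 200)
X = sm.add_constant(np.column_stack([x, x**2]))          # model with a polynomial term
y = 1 + 0.5 * x - 0.03 * x**2 + rng.normal(size=200)
fit = sm.OLS(y, X).fit()

x1 = np.array([1.0, 2.0, 4.0])    # design row for the fitted value at x = 2
x2 = np.array([1.0, 7.0, 49.0])   # design row for the fitted value at x = 7
c = x1 - x2
diff = c @ fit.params
se = np.sqrt(c @ fit.cov_params() @ c)
ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, fit.df_resid) * se
print(f"difference {diff:.3f}, SE {se:.3f}, 95% CI {np.round(ci, 3)}")
```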

  1. 7 CFR 30.12 - Fire-cure.

    Science.gov (United States)

    2010-01-01

    7 CFR 30.12 (2010), Classification of Leaf Tobacco Covering Classes, Types and Groups of Grades: Fire-cure. To cure tobacco under artificial atmospheric conditions by the use of open fires, the smoke and...

  2. Parameter estimation and statistical test of geographically weighted bivariate Poisson inverse Gaussian regression models

    Science.gov (United States)

    Amalia, Junita; Purhadi, Otok, Bambang Widjanarko

    2017-11-01

    The Poisson distribution is a discrete distribution for count data as the random variable, and it has one parameter that defines both the mean and the variance. Poisson regression assumes that the mean and variance are the same (equidispersion). Nonetheless, in some cases the count data do not satisfy this assumption because the variance exceeds the mean (over-dispersion). Ignoring over-dispersion causes the standard errors to be underestimated and leads to incorrect decisions in statistical tests. Paired count data are correlated and follow a bivariate Poisson distribution. When over-dispersion is present, simple bivariate Poisson regression is not sufficient for modeling paired count data. The Bivariate Poisson Inverse Gaussian Regression (BPIGR) model is a mixed Poisson regression for modeling paired count data with over-dispersion. The BPIGR model produces a global model for all locations. On the other hand, each location has different geographic, social, cultural and economic conditions, so Geographically Weighted Regression (GWR) is needed. The weighting function of each location in GWR generates a different local model. The Geographically Weighted Bivariate Poisson Inverse Gaussian Regression (GWBPIGR) model is used to handle over-dispersion and to generate local models. Parameter estimation of the GWBPIGR model is obtained by the Maximum Likelihood Estimation (MLE) method. Meanwhile, hypothesis testing of the GWBPIGR model is carried out by the Maximum Likelihood Ratio Test (MLRT) method.

  3. Can We Use Regression Modeling to Quantify Mean Annual Streamflow at a Global-Scale?

    Science.gov (United States)

    Barbarossa, V.; Huijbregts, M. A. J.; Hendriks, J. A.; Beusen, A.; Clavreul, J.; King, H.; Schipper, A.

    2016-12-01

    Quantifying the mean annual flow of rivers (MAF) at ungauged sites is essential for a number of applications, including assessments of global water supply, ecosystem integrity and water footprints. MAF can be quantified with spatially explicit process-based models, which might be overly time-consuming and data-intensive for this purpose, or with empirical regression models that predict MAF based on climate and catchment characteristics. Yet, regression models have mostly been developed at a regional scale and the extent to which they can be extrapolated to other regions is not known. In this study, we developed a global-scale regression model for MAF using observations of discharge and catchment characteristics from 1,885 catchments worldwide, ranging from 2 to 10⁶ km² in size. In addition, we compared the performance of the regression model with the predictive ability of the spatially explicit global hydrological model PCR-GLOBWB [van Beek et al., 2011] by comparing results from both models to independent measurements. We obtained a regression model explaining 89% of the variance in MAF based on catchment area, mean annual precipitation and air temperature, average slope and elevation. The regression model performed better than PCR-GLOBWB for the prediction of MAF, as root-mean-square error values were lower (0.29 - 0.38 compared to 0.49 - 0.57) and the modified index of agreement was higher (0.80 - 0.83 compared to 0.72 - 0.75). Our regression model can be applied globally at any point of the river network, provided that the input parameters are within the range of values employed in the calibration of the model. The performance is reduced for water scarce regions and further research should focus on improving such an aspect for regression-based global hydrological models.
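
    Purely as an illustration of the model form (simulated catchments, invented coefficients, hypothetical variable names; not the fitted model of the study), a log-log regression of MAF on the predictors named above looks like this:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 200
area = rng.lognormal(5, 2, n)            # km2
precip = rng.uniform(300, 2500, n)       # mm/yr
temp = rng.uniform(-2, 28, n)            # deg C
slope = rng.uniform(0.005, 0.3, n)
elev = rng.uniform(10, 3000, n)          # m
log_maf = -8 + 0.95 * np.log(area) + 1.1 * np.log(precip) - 0.02 * temp + rng.normal(0, 0.4, n)

X = sm.add_constant(np.column_stack([np.log(area), np.log(precip), temp, slope, np.log(elev)]))
fit = sm.OLS(log_maf, X).fit()
print(fit.params, fit.rsquared)
```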

  4. Modeling Governance KB with CATPCA to Overcome Multicollinearity in the Logistic Regression

    Science.gov (United States)

    Khikmah, L.; Wijayanto, H.; Syafitri, U. D.

    2017-04-01

    A problem often encountered in logistic regression modeling is multicollinearity. Multicollinearity between explanatory variables results in biased parameter estimates, and it also results in classification errors. In general, stepwise regression is used to overcome multicollinearity in regression. Another method to overcome multicollinearity, which involves all variables for prediction, is Principal Component Analysis (PCA). However, classical PCA is only for numeric data; when the data are categorical, one method to solve the problem is Categorical Principal Component Analysis (CATPCA). The data used in this research were part of the Demographic and Population Survey Indonesia (IDHS) 2012 data. This research focuses on the characteristics of women using contraceptive methods. Classification results were evaluated using Area Under Curve (AUC) values; the higher the AUC value, the better. Based on AUC values, the classification of the contraceptive method using the stepwise method (58.66%) is better than the logistic regression model (57.39%) and CATPCA (57.39%). Evaluation of the results of logistic regression using sensitivity shows the opposite, where the CATPCA method (99.79%) is better than the logistic regression method (92.43%) and stepwise (92.05%). Since this study focuses on the major class classification (using a contraceptive method), the selected model is CATPCA because it can raise the accuracy of the major class model.

  5. Time series modeling by a regression approach based on a latent process.

    Science.gov (United States)

    Chamroukhi, Faicel; Samé, Allou; Govaert, Gérard; Aknin, Patrice

    2009-01-01

    Time series are used in many domains including finance, engineering, economics and bioinformatics generally to represent the change of a measurement over time. Modeling techniques may then be used to give a synthetic representation of such data. A new approach for time series modeling is proposed in this paper. It consists of a regression model incorporating a discrete hidden logistic process allowing for activating smoothly or abruptly different polynomial regression models. The model parameters are estimated by the maximum likelihood method performed by a dedicated Expectation Maximization (EM) algorithm. The M step of the EM algorithm uses a multi-class Iterative Reweighted Least-Squares (IRLS) algorithm to estimate the hidden process parameters. To evaluate the proposed approach, an experimental study on simulated data and real world data was performed using two alternative approaches: a heteroskedastic piecewise regression model using a global optimization algorithm based on dynamic programming, and a Hidden Markov Regression Model whose parameters are estimated by the Baum-Welch algorithm. Finally, in the context of the remote monitoring of components of the French railway infrastructure, and more particularly the switch mechanism, the proposed approach has been applied to modeling and classifying time series representing the condition measurements acquired during switch operations.

  6. Regression to Causality : Regression-style presentation influences causal attribution

    DEFF Research Database (Denmark)

    Bordacconi, Mats Joe; Larsen, Martin Vinæs

    2014-01-01

    Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally. Our experiment implies that scholars using regression models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results more likely.

  7. Radiation curing

    International Nuclear Information System (INIS)

    Wendrinsky, J.

    1987-04-01

    In the beginning of the seventies the two types of radiation sources applied in industrial processes, electron radiation and UV, had been given rather optimistic forecasts. While UV could succeed in the field of panel and film coating, electron radiation curing seems to gain success in quite new fields of manufacturing. The listing of the suggested applications of radiation curing and a comparison of both advantages and disadvantages of this technology are followed by a number of case studies emphasizing the features of these processes and giving some exemplary calculations. The data used for the calculations should provide an easy calculation of individual manufacturing costs if special production parameters, investment or energy costs are employed. (Author)

  8. Dynamics and optimal control of a non-linear epidemic model with relapse and cure

    Science.gov (United States)

    Lahrouz, A.; El Mahjour, H.; Settati, A.; Bernoussi, A.

    2018-04-01

    In this work, we introduce the basic reproduction number R0 for a general epidemic model with graded cure, relapse and a nonlinear incidence rate in a non-constant population size. We established that the disease-free equilibrium state Ef is globally asymptotically exponentially stable if R0 < 1. When R0 > 1, we proved that the system model has at least one endemic state Ee. Then, by means of an appropriate Lyapunov function, we showed that Ee is unique and globally asymptotically stable under some acceptable biological conditions. On the other hand, we use two types of control to reduce the number of infectious individuals. The optimality system is formulated and solved numerically using a Gauss-Seidel-like implicit finite-difference method.
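
    The paper's model is richer than this, but a stripped-down S-I-R system with a cure (recovery) rate and a relapse flow from the recovered back to the infectious class illustrates how R0 organizes the threshold behaviour; all parameter values below are invented:

```python
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma, delta, mu, Lam = 0.4, 0.1, 0.02, 0.01, 10.0   # transmission, cure, relapse, death, recruitment

def rhs(t, y):
    S, I, R = y
    N = S + I + R
    dS = Lam - beta * S * I / N - mu * S
    dI = beta * S * I / N + delta * R - (gamma + mu) * I
    dR = gamma * I - (delta + mu) * R
    return [dS, dI, dR]

sol = solve_ivp(rhs, (0, 400), [990.0, 10.0, 0.0])
print("final infectious fraction:", sol.y[1, -1] / sol.y[:, -1].sum())

# R0 for this simplified relapse model only: expected infections per visit to I,
# summed over the geometric number of returns from R to I.
p_return = (gamma / (gamma + mu)) * (delta / (delta + mu))
R0 = (beta / (gamma + mu)) / (1 - p_return)
print("R0 for this simplified system:", round(R0, 2))
```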

  9. DSC and curing kinetics study of epoxy grouting diluted with furfural-acetone slurry

    Science.gov (United States)

    Yin, H.; Sun, D. W.; Li, B.; Liu, Y. T.; Ran, Q. P.; Liu, J. P.

    2016-07-01

    The use of furfural-acetone slurry as an active diluent of Bisphenol-A epoxy resin (DGEBA) groutings has been studied by dynamic and non-isothermal DSC for the first time. The curing kinetics were investigated by non-isothermal differential scanning calorimetry at different heating rates. The activation energy (Ea) was calculated based on the Kissinger and Ozawa methods, and the results showed that Ea increased from 58.87 to 71.13 kJ/mol after the diluents were added. The furfural-acetone epoxy matrix could cure completely at the theoretical curing temperature of 365.8 K and a curing time of 139 min, which were determined by the kinetic model parameters.
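
    The Kissinger calculation referred to above reduces to a straight-line fit of ln(beta/Tp^2) against 1/Tp, whose slope is -Ea/R; the heating rates and peak temperatures below are invented for illustration, not the paper's measurements:

```python
import numpy as np
from scipy import stats

R = 8.314  # J/(mol K)
heating_rates = np.array([5.0, 10.0, 15.0, 20.0])     # K/min
peak_temps = np.array([420.0, 431.0, 438.0, 443.0])   # K, hypothetical DSC peak temperatures

slope, intercept, *_ = stats.linregress(1.0 / peak_temps, np.log(heating_rates / peak_temps**2))
Ea = -slope * R
print(f"apparent activation energy: {Ea / 1000:.1f} kJ/mol")
```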

  10. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat.

    Science.gov (United States)

    Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne

    2012-12-01

    In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.

  11. Adjusting for Confounding in Early Postlaunch Settings: Going Beyond Logistic Regression Models.

    Science.gov (United States)

    Schmidt, Amand F; Klungel, Olaf H; Groenwold, Rolf H H

    2016-01-01

    Postlaunch data on medical treatments can be analyzed to explore adverse events or relative effectiveness in real-life settings. These analyses are often complicated by the number of potential confounders and the possibility of model misspecification. We conducted a simulation study to compare the performance of logistic regression, propensity score, disease risk score, and stabilized inverse probability weighting methods to adjust for confounding. Model misspecification was induced in the independent derivation dataset. We evaluated performance using relative bias and confidence interval coverage of the true effect, among other metrics. At low events per coefficient (1.0 and 0.5), the logistic regression estimates had a large relative bias (greater than -100%). Bias of the disease risk score estimates was at most 13.48% and 18.83%. For the propensity score model, this was 8.74% and >100%, respectively. At events per coefficient of 1.0 and 0.5, inverse probability weighting frequently failed or reduced to a crude regression, resulting in biases of -8.49% and 24.55%. Coverage of logistic regression estimates became less than the nominal level at events per coefficient ≤5. For the disease risk score, inverse probability weighting, and propensity score, coverage became less than nominal at events per coefficient ≤2.5, ≤1.0, and ≤1.0, respectively. Bias of misspecified disease risk score models was 16.55%. In settings with low events/exposed subjects per coefficient, disease risk score methods can be useful alternatives to logistic regression models, especially when propensity score models cannot be used. Despite better performance of disease risk score methods than logistic regression and propensity score models in small events per coefficient settings, bias and coverage still deviated from nominal.
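
    A minimal sketch of one of the compared methods, stabilized inverse probability weighting, on simulated data (not the paper's simulation design): fit a propensity model, form stabilized weights, and take a weighted difference in outcomes.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000
X = rng.normal(size=(n, 4))                                  # confounders
p_treat = 1 / (1 + np.exp(-(X @ np.array([0.5, -0.3, 0.2, 0.1]))))
treat = rng.binomial(1, p_treat)
y = rng.binomial(1, 1 / (1 + np.exp(-(-1 + 0.8 * treat + X @ np.array([0.4, 0.4, -0.2, 0.1])))))

ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]
w = np.where(treat == 1, treat.mean() / ps, (1 - treat.mean()) / (1 - ps))   # stabilized weights
risk1 = np.average(y[treat == 1], weights=w[treat == 1])
risk0 = np.average(y[treat == 0], weights=w[treat == 0])
print("IPW risk difference:", round(risk1 - risk0, 3))
```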

  12. Regression Model to Predict Global Solar Irradiance in Malaysia

    Directory of Open Access Journals (Sweden)

    Hairuniza Ahmed Kutty

    2015-01-01

    A novel regression model is developed to estimate the monthly global solar irradiance in Malaysia. The model is developed based on different available meteorological parameters, including temperature, cloud cover, rain precipitate, relative humidity, wind speed, pressure, and gust speed, by implementing regression analysis. This paper reports on the details of the analysis of the effect of each prediction parameter to identify the parameters that are relevant to estimating global solar irradiance. In addition, the proposed model is compared in terms of the root mean square error (RMSE), mean bias error (MBE), and the coefficient of determination (R2) with other models available from literature studies. Seven models based on single parameters (PM1 to PM7) and five multiple-parameter models (PM7 to PM12) are proposed. The new models perform well, with RMSE ranging from 0.429% to 1.774%, R2 ranging from 0.942 to 0.992, and MBE ranging from −0.1571% to 0.6025%. In general, cloud cover significantly affects the estimation of global solar irradiance. However, cloud cover in Malaysia lacks sufficient influence when included into multiple-parameter models although it performs fairly well in single-parameter prediction models.
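
    The three evaluation metrics quoted above are straightforward to compute; the toy measured/predicted values below are invented, and the paper reports the errors as percentages of a reference value rather than in the absolute units used here:

```python
import numpy as np

measured = np.array([4.8, 5.1, 5.6, 5.9, 5.4, 4.9])    # e.g. kWh/m2/day, toy values
predicted = np.array([4.9, 5.0, 5.5, 6.1, 5.3, 5.0])

rmse = np.sqrt(np.mean((predicted - measured) ** 2))
mbe = np.mean(predicted - measured)                    # mean bias error
ss_res = np.sum((measured - predicted) ** 2)
ss_tot = np.sum((measured - measured.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"RMSE={rmse:.3f}, MBE={mbe:.3f}, R2={r2:.3f}")
```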

  13. Curing mode affects bond strength of adhesively luted composite CAD/CAM restorations to dentin.

    Science.gov (United States)

    Lührs, Anne-Katrin; Pongprueksa, Pong; De Munck, Jan; Geurtsen, Werner; Van Meerbeek, Bart

    2014-03-01

    To determine the effect of curing mode and restoration-surface pre-treatment on the micro-tensile bond strength (μTBS) to dentin. Sandblasted CAD/CAM composite blocks (LAVA Ultimate, 3M ESPE) were cemented to bur-cut dentin using either the etch & rinse composite cement Nexus 3 ('NX3', Kerr) with Optibond XTR ('XTR', Kerr), or the self-etch composite cement RelyX Ultimate ('RXU', 3M ESPE) with Scotchbond Universal ('SBU', 3M ESPE). All experimental groups included different 'curing modes' (light-curing of adhesive and cement ('LL'), light-curing of adhesive and auto-cure of cement ('LA'), co-cure of adhesive through light-curing of cement ('AL'), or complete auto-cure ('AA')) and different 'restoration-surface pre-treatments' of the composite block (NX3: either a silane primer (Kerr), or the XTR adhesive; RXU: either silane primer (RelyX Ceramic Primer, 3M ESPE) and SBU, or solely SBU). After water-storage (7 days, 37°C), the μTBS was measured. Additionally, the degree of conversion (DC) of both cements was measured after 10min and after 1 week, either auto-cured (21°C/37°C) or light-cured (directly/through 3-mm CAD/CAM composite). The linear mixed-effects model (α=0.05) revealed a significant influence of the factors 'curing mode' and 'composite cement', and a less significant effect of the factor 'restoration-surface pre-treatment'. Light-curing 'LL' revealed the highest μTBS, which decreased significantly for all other curing modes. For curing modes 'AA' and 'AL', the lowest μTBS and a high percentage of pre-testing failures were reported. Overall, DC increased with light-curing and incubation time. The curing mode is decisive for the bonding effectiveness of adhesively luted composite CAD/CAM restorations to dentin. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  14. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrates the broad usage of the polynomial function and deduces its parameters with the ordinary least squares estimate. Then a significance test method for the polynomial regression function is derived, considering the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram in accordance with the authors' real work. (authors)
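
    A compact sketch of the workflow described above, with invented decay-heat data: fit the polynomial by least squares and test the overall significance of the regression function with an F test.

```python
import numpy as np
from scipy import stats

t = np.linspace(0, 10, 20)                                             # time
y = 5.0 - 0.8 * t + 0.03 * t**2 + np.random.default_rng(8).normal(0, 0.1, 20)

p = 2                                                                  # polynomial degree
coef = np.polyfit(t, y, deg=p)
y_hat = np.polyval(coef, t)

n = len(y)
ssr = np.sum((y_hat - y.mean()) ** 2)                                  # regression sum of squares
sse = np.sum((y - y_hat) ** 2)                                         # residual sum of squares
F = (ssr / p) / (sse / (n - p - 1))
p_value = stats.f.sf(F, p, n - p - 1)
print(f"F = {F:.2f}, p-value = {p_value:.3g}")
```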

  15. Curing kinetics, mechanism and chemorheological behavior of methanol etherified amino/novolac epoxy systems

    Directory of Open Access Journals (Sweden)

    S. F. Zhao

    2014-02-01

    The curing kinetics and mechanism of epoxy novolac resin (DEN) and modified epoxy novolac resin (MDEN) with methanol etherified amino resin were studied by means of differential scanning calorimetry (DSC), Fourier transform infrared (FT-IR) spectroscopy and chemorheological analysis. Their kinetic parameters and models of the curing were examined utilizing isoconversional methods, the Flynn-Wall-Ozawa and Friedman methods. For the DEN mixture, the average activation energy (Ea) was 71.05 kJ/mol and an autocatalytic model was established to describe the curing reaction. The MDEN mixture exhibited three dominant curing processes, termed reaction 1, reaction 2 and reaction 3; their Ea were 70.05, 106.55 and 101.91 kJ/mol, respectively. Besides, the Ea of reaction 1 was similar to that of the DEN mixture, while the Ea of reactions 2 and 3 corresponded to that of the etherification reaction between hydroxyl and epoxide groups. Moreover, these three dominant reactions were nth order in nature. Furthermore, their curing mechanisms were proposed from the results of DSC and FT-IR. The chemorheological behavior was also investigated to obtain better plastic products by optimizing the processing schedules.

  16. Computer Simulation of Cure Process of an Axisymmetric Rubber Article Reinforced by Metal Plates Using Extended ABAQUS Code

    Directory of Open Access Journals (Sweden)

    M.H.R. Ghoreishy

    2013-01-01

    A finite element model is developed for simulation of the curing process of a thick axisymmetric rubber article reinforced by metal plates during the molding and cooling stages. The model consists of the heat transfer equation and a newly developed kinetics model for the determination of the state of cure in the rubber. The latter is based on a modification of the well-known Kamal-Sourour model. The thermal contact of the rubber with metallic surfaces (inserts and molds) and the variation of the thermal properties (conductivity and specific heat) with temperature and state-of-cure are taken into consideration. The ABAQUS code is used in conjunction with an in-house developed user subroutine to solve the governing equations. By comparing the temperature profile and the variation of the state-of-cure with experimentally measured data, the accuracy and applicability of the model are confirmed. It is also shown that this model can be successfully used for the optimization of the curing process, which gives rise to a reduction of the molding time.
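
    For orientation, the standard (unmodified) Kamal-Sourour rate equation, d(alpha)/dt = (k1 + k2*alpha^m)(1 - alpha)^n with Arrhenius rate constants, can be integrated directly; the parameter values below are illustrative, not those identified in the paper, and the paper's modified form and thermal coupling are not reproduced here:

```python
import numpy as np
from scipy.integrate import solve_ivp

A1, E1, A2, E2 = 1e5, 6.0e4, 5e5, 5.5e4   # pre-exponentials (1/s) and activation energies (J/mol)
m, n, Rgas = 0.6, 1.4, 8.314

def dalpha_dt(t, alpha, T):
    k1 = A1 * np.exp(-E1 / (Rgas * T))
    k2 = A2 * np.exp(-E2 / (Rgas * T))
    return (k1 + k2 * alpha**m) * (1 - alpha)**n

T_mold = 433.0                            # K, isothermal cure assumed for simplicity
sol = solve_ivp(dalpha_dt, (0, 1800), [1e-3], args=(T_mold,), max_step=5.0)
print("state of cure after 30 min:", float(sol.y[0, -1]))
```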

  17. Logistic regression for risk factor modelling in stuttering research.

    Science.gov (United States)

    Reed, Phil; Wu, Yaqionq

    2013-06-01

    To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. Efficient estimation of an additive quantile regression model

    NARCIS (Netherlands)

    Cheng, Y.; de Gooijer, J.G.; Zerom, D.

    2011-01-01

    In this paper, two non-parametric estimators are proposed for estimating the components of an additive quantile regression model. The first estimator is a computationally convenient approach which can be viewed as a more viable alternative to existing kernel-based approaches. The second estimator

  19. Advanced statistics: linear regression, part I: simple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.

  20. Advanced statistics: linear regression, part II: multiple linear regression.

    Science.gov (United States)

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.

  1. Novel techniques for concrete curing

    DEFF Research Database (Denmark)

    Kovler, Konstantin; Jensen, Ole Mejlhede

    2005-01-01

    It is known that some high-strength/high-performance concretes (HSC/HPC) are prone to cracking at an early age unless special precautions are taken. The paper deals with the methods of curing as one of the main strategies to ensure good performance of concrete. Curing by both external (conventional) and internal methods is reviewed and analyzed, among other methods of mitigating shrinkage and cracking of concrete. The focus is on the mitigation of autogenous shrinkage of low water to binder ratio (w/b) concrete by means of internal curing. The concepts of internal curing are based on using lightweight aggregate, superabsorbent polymers or water-soluble chemicals, which reduce water evaporation (so called "internal sealing"). These concepts have been intensively researched in the 90s, but still are not widespread among contractors and concrete suppliers. The differences between conventional methods...

  2. Crime Modeling using Spatial Regression Approach

    Science.gov (United States)

    Saleh Ahmar, Ansari; Adiatma; Kasim Aidid, M.

    2018-01-01

    Acts of criminality in Indonesia increase in both variety and quantity every year: murder, rape, assault, vandalism, theft, fraud, fencing, and other cases that make people feel unsafe. The societal risk of exposure to crime is measured by the number of cases reported to the police; the more reports made to the police institution, the higher the crime level in the region. This research models criminality in South Sulawesi, Indonesia, with the dependent variable being the societal risk of exposure to crime. Modeling is carried out with an areal approach using the Spatial Autoregressive (SAR) and Spatial Error Model (SEM) methods. The independent variables used are the population density, the number of poor population, GDP per capita, unemployment and the human development index (HDI). The spatial regression analysis shows that there are no spatial dependencies, in either lag or error form, in South Sulawesi.

  3. C-CURE

    Data.gov (United States)

    US Agency for International Development — C-CURE system manages certain aspects of the access control system, including collecting employee and contractor names and photographs. The Office of Security uses...

  4. Determining factors influencing survival of breast cancer by fuzzy logistic regression model.

    Science.gov (United States)

    Nikbakht, Roya; Bahrampour, Abbas

    2017-01-01

    A fuzzy logistic regression model can be used for determining influential factors of disease. This study explores the important predictive survival factors for breast cancer patients. We used breast cancer data collected by the cancer registry of Kerman University of Medical Sciences during the period 2000-2007. Variables such as morphology, grade, age, and treatments (surgery, radiotherapy, and chemotherapy) were applied in the fuzzy logistic regression model. Performance of the model was determined in terms of the mean degree of membership (MDM). The study results showed that almost 41% of patients were in the neoplasm and malignant group and more than two-thirds of them were still alive after a 5-year follow-up. Based on the fuzzy logistic model, the most important factors influencing survival were chemotherapy, morphology, and radiotherapy, respectively. Furthermore, the MDM criterion shows that the fuzzy logistic regression has a good fit on the data (MDM = 0.86). The fuzzy logistic regression model showed that chemotherapy is more important than radiotherapy in the survival of patients with breast cancer. In addition, another ability of this model is calculating possibilistic odds of survival in cancer patients. The results of this study can be applied in clinical research, and we recommend using this model in various research areas, since few studies have applied fuzzy logistic models so far.

  5. Efficient estimation of an additive quantile regression model

    NARCIS (Netherlands)

    Cheng, Y.; de Gooijer, J.G.; Zerom, D.

    2009-01-01

    In this paper two kernel-based nonparametric estimators are proposed for estimating the components of an additive quantile regression model. The first estimator is a computationally convenient approach which can be viewed as a viable alternative to the method of De Gooijer and Zerom (2003). By

  6. Efficient estimation of an additive quantile regression model

    NARCIS (Netherlands)

    Cheng, Y.; de Gooijer, J.G.; Zerom, D.

    2010-01-01

    In this paper two kernel-based nonparametric estimators are proposed for estimating the components of an additive quantile regression model. The first estimator is a computationally convenient approach which can be viewed as a viable alternative to the method of De Gooijer and Zerom (2003). By

  7. Boosted beta regression.

    Directory of Open Access Journals (Sweden)

    Matthias Schmid

    Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fit a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established - yet unstable - approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures.

  8. Using the classical linear regression model in analysis of the dependences of conveyor belt life

    Directory of Open Access Journals (Sweden)

    Miriam Andrejiová

    2013-12-01

    The paper deals with the classical linear regression model of the dependence of conveyor belt life on some selected parameters: thickness of the paint layer, width and length of the belt, conveyor speed and quantity of transported material. The first part of the article is about regression model design, point and interval estimation of parameters, verification of the statistical significance of the model, and the parameters of the proposed regression model. The second part of the article deals with identification of influential and extreme values that can have an impact on the estimation of regression model parameters. The third part focuses on the assumptions of the classical regression model, i.e. on verification of the assumptions of independence, normality and homoscedasticity of residuals.
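
    A sketch of the same workflow in Python (hypothetical belt-life data and column order, not the study's measurements): fit the classical linear regression and then check residual normality and homoscedasticity.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from scipy import stats

rng = np.random.default_rng(9)
n = 120
X = np.column_stack([
    rng.uniform(5, 12, n),       # paint layer thickness
    rng.uniform(800, 1600, n),   # belt width
    rng.uniform(50, 400, n),     # belt length
    rng.uniform(1, 5, n),        # conveyor speed
    rng.uniform(100, 900, n),    # quantity of transported material
])
y = 200 + X @ np.array([8.0, 0.05, -0.1, -6.0, -0.02]) + rng.normal(0, 15, n)

fit = sm.OLS(y, sm.add_constant(X)).fit()
print(fit.params)
shapiro_p = stats.shapiro(fit.resid)[1]                      # normality of residuals
bp_p = het_breuschpagan(fit.resid, fit.model.exog)[1]        # homoscedasticity
print(f"Shapiro p={shapiro_p:.3f}, Breusch-Pagan p={bp_p:.3f}")
```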

  9. Hsp104 Overexpression Cures Saccharomyces cerevisiae [PSI+] by Causing Dissolution of the Prion Seeds

    Science.gov (United States)

    Park, Yang-Nim; Zhao, Xiaohong; Yim, Yang-In; Todor, Horia; Ellerbrock, Robyn; Reidy, Michael; Eisenberg, Evan; Masison, Daniel C.

    2014-01-01

    The [PSI+] yeast prion is formed when Sup35 misfolds into amyloid aggregates. [PSI+], like other yeast prions, is dependent on the molecular chaperone Hsp104, which severs the prion seeds so that they pass on as the yeast cells divide. Surprisingly, however, overexpression of Hsp104 also cures [PSI+]. Several models have been proposed to explain this effect: inhibition of severing, asymmetric segregation of the seeds between mother and daughter cells, and dissolution of the prion seeds. First, we found that neither the kinetics of curing nor the heterogeneity in the distribution of the green fluorescent protein (GFP)-labeled Sup35 foci in partially cured yeast cells is compatible with Hsp104 overexpression curing [PSI+] by inhibiting severing. Second, we ruled out the asymmetric segregation model by showing that the extent of curing was essentially the same in mother and daughter cells and that the fluorescent foci did not distribute asymmetrically, but rather, there was marked loss of foci in both mother and daughter cells. These results suggest that Hsp104 overexpression cures [PSI+] by dissolution of the prion seeds in a two-step process. First, trimming of the prion seeds by Hsp104 reduces their size, and second, their amyloid core is eliminated, most likely by proteolysis. PMID:24632242

  10. Hsp104 overexpression cures Saccharomyces cerevisiae [PSI+] by causing dissolution of the prion seeds.

    Science.gov (United States)

    Park, Yang-Nim; Zhao, Xiaohong; Yim, Yang-In; Todor, Horia; Ellerbrock, Robyn; Reidy, Michael; Eisenberg, Evan; Masison, Daniel C; Greene, Lois E

    2014-05-01

    The [PSI(+)] yeast prion is formed when Sup35 misfolds into amyloid aggregates. [PSI(+)], like other yeast prions, is dependent on the molecular chaperone Hsp104, which severs the prion seeds so that they pass on as the yeast cells divide. Surprisingly, however, overexpression of Hsp104 also cures [PSI(+)]. Several models have been proposed to explain this effect: inhibition of severing, asymmetric segregation of the seeds between mother and daughter cells, and dissolution of the prion seeds. First, we found that neither the kinetics of curing nor the heterogeneity in the distribution of the green fluorescent protein (GFP)-labeled Sup35 foci in partially cured yeast cells is compatible with Hsp104 overexpression curing [PSI(+)] by inhibiting severing. Second, we ruled out the asymmetric segregation model by showing that the extent of curing was essentially the same in mother and daughter cells and that the fluorescent foci did not distribute asymmetrically, but rather, there was marked loss of foci in both mother and daughter cells. These results suggest that Hsp104 overexpression cures [PSI(+)] by dissolution of the prion seeds in a two-step process. First, trimming of the prion seeds by Hsp104 reduces their size, and second, their amyloid core is eliminated, most likely by proteolysis.

  11. Modeling the frequency of opposing left-turn conflicts at signalized intersections using generalized linear regression models.

    Science.gov (United States)

    Zhang, Xin; Liu, Pan; Chen, Yuguang; Bai, Lu; Wang, Wei

    2014-01-01

    The primary objective of this study was to identify whether the frequency of traffic conflicts at signalized intersections can be modeled. The opposing left-turn conflicts were selected for the development of conflict predictive models. Using data collected at 30 approaches at 20 signalized intersections, the underlying distributions of the conflicts under different traffic conditions were examined. Different conflict-predictive models were developed to relate the frequency of opposing left-turn conflicts to various explanatory variables. The models considered include a linear regression model, a negative binomial model, and separate models developed for four traffic scenarios. The prediction performance of the different models was compared. The frequency of traffic conflicts follows a negative binomial distribution. The linear regression model is not appropriate for the conflict frequency data. In addition, drivers behaved differently under different traffic conditions. Accordingly, the effects of conflicting traffic volumes on conflict frequency vary across different traffic conditions. The occurrences of traffic conflicts at signalized intersections can be modeled using generalized linear regression models. The use of conflict predictive models has potential to expand the uses of surrogate safety measures in safety estimation and evaluation.
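
    A minimal sketch of the negative binomial specification preferred above, on simulated approach-level data rather than the 30 observed approaches:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
n = 200
left_turn_vol = rng.uniform(50, 400, n)      # veh/h, hypothetical
opposing_vol = rng.uniform(200, 1200, n)     # veh/h, hypothetical
mu = np.exp(-4 + 0.004 * left_turn_vol + 0.002 * opposing_vol)
conflicts = rng.negative_binomial(2, 2 / (2 + mu))    # overdispersed counts with mean mu

X = sm.add_constant(np.column_stack([left_turn_vol, opposing_vol]))
fit = sm.NegativeBinomial(conflicts, X).fit(disp=False)
print(fit.params)
```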

  12. Use of empirical likelihood to calibrate auxiliary information in partly linear monotone regression models.

    Science.gov (United States)

    Chen, Baojiang; Qin, Jing

    2014-05-10

    In statistical analysis, a regression model is needed if one is interested in finding the relationship between a response variable and covariates. When the response depends on the covariate, it may also depend on a function of this covariate. If one has no knowledge of this functional form but expects it to be monotonically increasing or decreasing, then the isotonic regression model is preferable. Estimation of parameters for isotonic regression models is based on the pool-adjacent-violators algorithm (PAVA), where the monotonicity constraints are built in. With missing data, people often employ the augmented estimating method to improve estimation efficiency by incorporating auxiliary information through a working regression model. However, under the framework of the isotonic regression model, the PAVA does not work as the monotonicity constraints are violated. In this paper, we develop an empirical likelihood-based method for the isotonic regression model to incorporate the auxiliary information. Because the monotonicity constraints still hold, the PAVA can be used for parameter estimation. Simulation studies demonstrate that the proposed method can yield more efficient estimates, and in some situations, the efficiency improvement is substantial. We apply this method to a dementia study. Copyright © 2013 John Wiley & Sons, Ltd.
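
    The pool-adjacent-violators step mentioned above is available off the shelf; the sketch below shows a plain PAVA fit on simulated monotone data and does not attempt the paper's empirical likelihood calibration with auxiliary information:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(11)
x = np.sort(rng.uniform(0, 10, 100))
y = np.log1p(x) + rng.normal(0, 0.15, 100)     # monotone signal plus noise

iso = IsotonicRegression(increasing=True).fit(x, y)
print("fitted values at x = 1, 5, 9:", iso.predict([1.0, 5.0, 9.0]))
```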

  13. Gamma and electron beam curing of polymers and composites

    International Nuclear Information System (INIS)

    Saunders, C.B.; Dickson, L.W.; Singh, A.

    1987-01-01

    Radiation polymerization has helped us understand polymer chemistry, and is also playing an increasing role in practical applications. Radiation curing currently holds about 5% of the total market for curing of polymers and composites, and the radiation curing market is growing at ≥20% per year. Advantages of radiation curing over thermal or chemical curing methods include: improved control of the curing rate, reduced curing times, curing at ambient temperatures, curing without the need for chemical initiators, and complete (100%) curing with minimal toxic chemical emissions. Radiation treatment may also be used to effect crosslinking and grafting of polymer and composite materials; the major advantage in these cases is the ability to process products in their final shape. Cable insulation, automotive and aircraft components, and improved construction materials are some of the current and near-future industrial applications of radiation curing and crosslinking. 19 refs

  14. Cure Schedule for Stycast 2651/Catalyst 9.

    Energy Technology Data Exchange (ETDEWEB)

    Kropka, Jamie Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McCoy, John D. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    2017-11-01

    The Emerson & Cuming technical data sheet (TDS) for Stycast 2651/Catalyst 9 lists three alternate cure schedules for the material, each of which would result in a different state of reaction and different material properties. Here, a cure schedule that attains full reaction of the material is defined. The use of this cure schedule will eliminate variance in material properties due to changes in the cure state of the material, and the cure schedule will serve as the method to make material prior to characterizing properties. The following recommendation uses one of the schedules within the TDS and adds a “post cure” to obtain full reaction.

  15. UV/EB curing market in Indonesia

    International Nuclear Information System (INIS)

    Hilmy, N.; Danu, S.

    1999-01-01

    The main applications of UV curing of surface coatings in Indonesia are in the fancy plywood, furniture and wood flooring industries. Other applications include papers, printing inks/labelling, printed circuit boards (PCB) and dental materials. At present, EB curing of coatings is still at the pilot plant scale due to the high cost of production. A limited number of EB curing applications using low-energy electron beam machines exist for wood panels, ceramics and marbles. This paper describes the market and the problems faced by the largest users of radiation curing systems, such as the secondary-process plywood, furniture and paper industries

  16. Predictive market segmentation model: An application of logistic regression model and CHAID procedure

    Directory of Open Access Journals (Sweden)

    Soldić-Aleksić Jasna

    2009-01-01

    Full Text Available Market segmentation is one of the key concepts of modern marketing. Its main goal is to create groups (segments) of customers that have similar characteristics, needs, wishes and/or similar behavior regarding the purchase of a concrete product/service. Companies can create a specific marketing plan for each of these segments and therefore gain short- or long-term competitive advantage on the market. Depending on the concrete marketing goal, different segmentation schemes and techniques may be applied. This paper presents a predictive market segmentation model based on the application of a logistic regression model and CHAID analysis. The logistic regression model was used for variable selection, identifying which of the initial pool of eleven variables are statistically significant for explaining the dependent variable. The selected variables were afterwards included in the CHAID procedure that generated the predictive market segmentation model. The model results are presented on a concrete empirical example in the following form: summary model results, CHAID tree, gain chart, index chart, and risk and classification tables.
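
    A minimal sketch of the two-stage idea follows, on simulated customer data: a logistic regression screens candidate predictors by significance, and a shallow decision tree then defines segments. A CART tree is used here as a stand-in because CHAID itself is not available in scikit-learn; all variable names and thresholds are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 500
X = pd.DataFrame({
    "age": rng.integers(18, 70, n),
    "income": rng.normal(50, 15, n),
    "visits": rng.poisson(3, n),
    "noise": rng.normal(0, 1, n),
})
buy = (0.04 * X["age"] + 0.03 * X["income"] + 0.3 * X["visits"]
       + rng.normal(0, 1, n) > 4.2).astype(int)

# Step 1: logistic regression to screen for statistically significant predictors.
logit = sm.Logit(buy, sm.add_constant(X)).fit(disp=False)
selected = [v for v in X.columns if logit.pvalues[v] < 0.05]

# Step 2: grow a shallow tree on the selected variables to define segments.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=30, random_state=0)
tree.fit(X[selected], buy)
print(selected)
print(export_text(tree, feature_names=selected))
```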

  17. Analysis of Multivariate Experimental Data Using A Simplified Regression Model Search Algorithm

    Science.gov (United States)

    Ulbrich, Norbert Manfred

    2013-01-01

    A new regression model search algorithm was developed in 2011 that may be used to analyze both general multivariate experimental data sets and wind tunnel strain-gage balance calibration data. The new algorithm is a simplified version of a more complex search algorithm that was originally developed at the NASA Ames Balance Calibration Laboratory. The new algorithm has the advantage that it needs only about one tenth of the original algorithm's CPU time for the completion of a search. In addition, extensive testing showed that the prediction accuracy of math models obtained from the simplified algorithm is similar to the prediction accuracy of math models obtained from the original algorithm. The simplified algorithm, however, cannot guarantee that search constraints related to a set of statistical quality requirements are always satisfied in the optimized regression models. Therefore, the simplified search algorithm is not intended to replace the original search algorithm. Instead, it may be used to generate an alternate optimized regression model of experimental data whenever the application of the original search algorithm either fails or requires too much CPU time. Data from a machine calibration of NASA's MK40 force balance is used to illustrate the application of the new regression model search algorithm.

  18. Evaluation of accuracy of linear regression models in predicting urban stormwater discharge characteristics.

    Science.gov (United States)

    Madarang, Krish J; Kang, Joo-Hyon

    2014-06-01

    Stormwater runoff has been identified as a source of pollution for the environment, especially for receiving waters. In order to quantify and manage the impacts of stormwater runoff on the environment, predictive and mathematical models have been developed. Predictive tools such as regression models have been widely used to predict stormwater discharge characteristics. Storm event characteristics, such as antecedent dry days (ADD), have been related to response variables, such as pollutant loads and concentrations. However, whether ADD should be treated as an important variable in predicting stormwater discharge characteristics has been a controversial issue across studies. In this study, we examined the accuracy of general linear regression models in predicting discharge characteristics of roadway runoff. A total of 17 storm events were monitored in two highway segments, located in Gwangju, Korea. Data from the monitoring were used to calibrate the United States Environmental Protection Agency's Storm Water Management Model (SWMM). The calibrated SWMM was simulated for 55 storm events, and the results for total suspended solid (TSS) discharge loads and event mean concentrations (EMC) were extracted. From these data, linear regression models were developed. R² and p-values of the regression of ADD for both TSS loads and EMCs were investigated. Results showed that pollutant loads were better predicted than pollutant EMC in the multiple regression models. Regression may not provide the true effect of site-specific characteristics, due to uncertainty in the data. Copyright © 2014 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.

  19. Improving sub-pixel imperviousness change prediction by ensembling heterogeneous non-linear regression models

    Science.gov (United States)

    Drzewiecki, Wojciech

    2016-12-01

    In this work nine non-linear regression models were compared for sub-pixel impervious surface area mapping from Landsat images. The comparison was carried out in three study areas, both for the accuracy of imperviousness coverage evaluation at individual points in time and for the accuracy of imperviousness change assessment. The performance of individual machine learning algorithms (Cubist, Random Forest, stochastic gradient boosting of regression trees, k-nearest neighbors regression, random k-nearest neighbors regression, Multivariate Adaptive Regression Splines, averaged neural networks, and support vector machines with polynomial and radial kernels) was also compared with the performance of heterogeneous model ensembles constructed from the best models trained using particular techniques. The results showed that, in the case of sub-pixel evaluation, the most accurate prediction of change is not necessarily based on the most accurate individual assessments. Among single methods, the Cubist algorithm may be advised for Landsat-based mapping of imperviousness for single dates, whereas Random Forest may be endorsed when the most reliable evaluation of imperviousness change is the primary goal: it gave lower accuracies for individual assessments, but better prediction of change due to more correlated errors of the individual predictions. Heterogeneous model ensembles performed at least as well as the best individual models for individual time-point assessments, and for imperviousness change assessment the ensembles always outperformed single-model approaches. This means that it is possible to improve the accuracy of sub-pixel imperviousness change assessment using ensembles of heterogeneous non-linear regression models.
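
    The sketch below illustrates the basic heterogeneous-ensemble idea on synthetic data: several different regressor families are trained and their predictions are simply averaged. Cubist has no scikit-learn implementation, so the ensemble mixes other model families; this is a sketch of the mechanism, not the authors' exact procedure.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for the sub-pixel imperviousness data.
X, y = make_regression(n_samples=400, n_features=8, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = [
    RandomForestRegressor(n_estimators=200, random_state=0),
    GradientBoostingRegressor(random_state=0),
    KNeighborsRegressor(n_neighbors=7),
    Ridge(alpha=1.0),
]
preds = [m.fit(X_tr, y_tr).predict(X_te) for m in models]

# Heterogeneous ensemble: a plain average of the individual predictions.
ensemble = np.mean(preds, axis=0)
for m, p in zip(models, preds):
    print(type(m).__name__, round(mean_squared_error(y_te, p) ** 0.5, 1))
print("Ensemble", round(mean_squared_error(y_te, ensemble) ** 0.5, 1))
```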

  20. Hydration kinetics modeling of Portland cement considering the effects of curing temperature and applied pressure

    International Nuclear Information System (INIS)

    Lin Feng; Meyer, Christian

    2009-01-01

    A hydration kinetics model for Portland cement is formulated based on thermodynamics of multiphase porous media. The mechanism of cement hydration is discussed based on literature review. The model is then developed considering the effects of chemical composition and fineness of cement, water-cement ratio, curing temperature and applied pressure. The ultimate degree of hydration of Portland cement is also analyzed and a corresponding formula is established. The model is calibrated against the experimental data for eight different Portland cements. Simple relations between the model parameters and cement composition are obtained and used to predict hydration kinetics. The model is used to reproduce experimental results on hydration kinetics, adiabatic temperature rise, and chemical shrinkage of different cement pastes. The comparisons between the model reproductions and the different experimental results demonstrate the applicability of the proposed model, especially for cement hydration at elevated temperature and high pressure.

  1. Radiation curing of polymers II

    International Nuclear Information System (INIS)

    Randell, D.R.

    1991-01-01

    During the last decade radiation cured polymers have continued to grow in importance, not only by expansion within existing coatings applications but also by extension into new fields of application such as ceramics, ink-jet inks and fibres. To provide a further update on the rapidly growing science and technology of radiation curing, the Third International Symposium was held. Apart from providing an update on the application, chemistry and control aspects of radiation curing, the aim of the meeting was also to provide the newcomer with a basic insight into radiation curing applications. Accordingly, the proceedings contained in this special publication, which follow closely the format of the meeting, have five sections covering background/trends, applications, initiator chemistry, substrate chemistry, and analytical, physical-chemical and health and safety aspects. There are twenty-five papers in all, three of which are indexed separately. (Author)

  2. Effects of Chemical Curing Temperature and Time on the Properties of Liquefied Wood based As-cured Precursors and Carbon Fibers

    Directory of Open Access Journals (Sweden)

    Junbo Shang

    2015-09-01

    Full Text Available Liquefied wood based as-cured precursors and carbon fibers were prepared by different chemical curing processes to investigate the effects of curing temperature and time on the thermostability and microstructure of the precursors, as well as on the tensile strength of the resulting carbon fibers. The primary fibers can be converted into high-performance precursors by heating directly at the target curing temperature. As the curing temperature and duration increased, the number of methylene bonds in the precursors increased, enhancing the cross-linking among molecular chains and thereby improving the thermostability of the precursors. Carbon fibers prepared from as-cured precursors (curing temperature 95 °C, curing time 3 h) had the minimum average interlayer spacing (d002) and also showed the highest tensile strength, almost 800 MPa, which classifies them as fibers of general grade.

  3. Regularized multivariate regression models with skew-t error distributions

    KAUST Repository

    Chen, Lianfu; Pourahmadi, Mohsen; Maadooliat, Mehdi

    2014-01-01

    We consider regularization of the parameters in multivariate linear regression models with the errors having a multivariate skew-t distribution. An iterative penalized likelihood procedure is proposed for constructing sparse estimators of both

  4. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Hanin, Leonid; Zaider, Marco

    2014-01-01

    The probability of a cure in radiation therapy (RT), viewed as the probability of eventual extinction of all cancer cells, is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or a similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)

  5. Investigation of synthesis, thermal properties and curing kinetics of fluorene diamine-based benzoxazine by using two curing kinetic methods

    International Nuclear Information System (INIS)

    He, Xuan-yu; Wang, Jun; Ramdani, Noureddine; Liu, Wen-bin; Liu, Li-jia; Yang, Lei

    2013-01-01

    Highlights: • A novel diamine-based benzoxazine monomer containing aryl ether and bulky fluorene groups (BEF-p) is synthesized. • Kinetic parameters can be calculated by the Starink-LSR method and the direct LSR method. • The cure reaction could be successfully described with the autocatalytic model. • Poly(BEF-p) exhibits a high Tg and superior thermal stability. • Aryl ether linkages had little influence on the thermal stability. - Abstract: A novel diamine-based benzoxazine monomer containing aryl ether and bulky fluorene groups (BEF-p) was prepared from the reaction of 9,9-bis-[4-(p-aminophenoxy)-phenyl]fluorene with paraformaldehyde and phenol. The chemical structure of the monomer was confirmed by Fourier-transform infrared (FTIR) spectroscopy and ¹H and ¹³C nuclear magnetic resonance spectroscopy (¹H and ¹³C NMR). The polymerization behavior of the monomer was analyzed by differential scanning calorimetry (DSC) and FTIR. The curing kinetics was studied by non-isothermal DSC, and the kinetic parameters were determined. The autocatalytic model based on two kinetic methods (the Starink-LSR method and the direct LSR method) showed good agreement with experimental results. The thermal and mechanical properties of poly(BEF-p) were evaluated with DSC, dynamic mechanical thermal analysis (DMTA), and thermogravimetric analysis (TGA). The results showed that the cured polymer exhibited a higher glass transition temperature (Tg) and better thermal stability than diaminodiphenylmethane-based benzoxazine (P-ddm), though slightly lower than those of fluorene diamine-phenol-based polybenzoxazine (poly(BF-p))

  6. Evaluation of a multiple linear regression model and SARIMA model in forecasting heat demand for district heating system

    International Nuclear Information System (INIS)

    Fang, Tingting; Lahdelma, Risto

    2016-01-01

    Highlights: • A social factor is considered in the linear regression models in addition to weather variables. • All coefficients of the linear regression models are optimized simultaneously. • SARIMA combined with linear regression is used to forecast the heat demand. • The accuracy of both the linear regression and time series models is evaluated. - Abstract: Forecasting heat demand is necessary for production and operation planning of district heating (DH) systems. In this study we first propose a simple regression model in which the hourly outdoor temperature and wind speed forecast the heat demand. A weekly rhythm of heat consumption, as a social component, is added to the model to significantly improve the accuracy. The other type of model is the seasonal autoregressive integrated moving average (SARIMA) model with exogenous variables, which takes weather factors as exogenous inputs and the historical heat consumption data as the dependent variable. One outstanding advantage of this model is that it pursues high accuracy for both long-term and short-term forecasts by considering both exogenous factors and the time series. The forecasting performance of both the linear regression models and the time series model is evaluated based on real-life heat demand data for the city of Espoo in Finland, using out-of-sample tests for the last 20 full weeks of the year. The results indicate that the proposed linear regression model (T168h), using a 168-h demand pattern with midweek holidays classified as Saturdays or Sundays, gives the highest accuracy and strong robustness among all the tested models for the tested forecasting horizon and corresponding data. Considering the parsimony of the input, the ease of use and the high accuracy, the proposed T168h model is the best in practice. The heat demand forecasting model can also be developed for individual buildings if automated meter reading customer measurements are available. This would allow forecasting the heat demand based on more accurate heat consumption
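
    As an illustration of the regression part of the approach, the sketch below fits hourly heat demand on outdoor temperature, wind speed and an hour-of-week categorical term that plays the role of the 168-hour social rhythm in the proposed T168h model. The data are simulated placeholders, not the Espoo measurements, and the holiday reclassification is omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic hourly data standing in for real DH measurements.
rng = np.random.default_rng(2)
idx = pd.date_range("2015-01-01", periods=24 * 7 * 8, freq="h")
temp = 5 + 10 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 2, len(idx))
wind = np.abs(rng.normal(4, 2, len(idx)))
hour_of_week = idx.dayofweek * 24 + idx.hour
demand = (120 - 3.0 * temp + 1.5 * wind
          + 5 * np.cos(2 * np.pi * hour_of_week / 168)
          + rng.normal(0, 3, len(idx)))

df = pd.DataFrame({"demand": demand, "temp": temp, "wind": wind,
                   "how": hour_of_week.astype(str)})

# Weather terms plus a categorical hour-of-week term (the weekly social rhythm).
fit = smf.ols("demand ~ temp + wind + C(how)", data=df).fit()
print(fit.rsquared)
```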

  7. The irradiation curing of coatings

    International Nuclear Information System (INIS)

    Autio, T.

    1974-01-01

    The electron beam irradiation curing of coatings has been technically feasible for over a decade. A brief description of the process is presented. The progress in this field has been astonishingly slow in comparison with the use of UV lamps as radiation source. The primary reason for this has been the great advantage in terms of capital cost of the UV curing lines and their ready adaptability to low or high production rates. A literature survey is given concerning basic and applied research in the electron curing area, patents, economics and existing installations around the world. (author)

  8. Anti-proliferative therapy for HIV cure: a compound interest approach.

    Science.gov (United States)

    Reeves, Daniel B; Duke, Elizabeth R; Hughes, Sean M; Prlic, Martin; Hladik, Florian; Schiffer, Joshua T

    2017-06-21

    In the era of antiretroviral therapy (ART), HIV-1 infection is no longer tantamount to early death. Yet the benefits of treatment are available only to those who can access, afford, and tolerate taking daily pills. True cure is challenged by HIV latency, the ability of chromosomally integrated virus to persist within memory CD4+ T cells in a non-replicative state and activate when ART is discontinued. Using a mathematical model of HIV dynamics, we demonstrate that treatment strategies offering modest but continual enhancement of reservoir clearance rates result in faster cure than abrupt, one-time reductions in reservoir size. We frame this concept in terms of compounding interest: small changes in interest rate drastically improve returns over time. On ART, latent cell proliferation rates are orders of magnitude larger than activation and new infection rates. Contingent on subtypes of cells that may make up the reservoir and their respective proliferation rates, our model predicts that coupling clinically available, anti-proliferative therapies with ART could result in functional cure within 2-10 years rather than several decades on ART alone.
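
    The compound-interest argument can be illustrated with a back-of-the-envelope reservoir decay calculation; the numbers below (10^6 cells, a roughly 44-month half-life, "cure" when fewer than one cell remains) are illustrative assumptions for a simple exponential decay, not parameters or outputs of the paper's model.

```python
import numpy as np

# Illustrative comparison: ART alone, a one-time 10-fold reduction of the
# reservoir, and a modest 2-fold increase in the clearance rate.
half_life_months = 44.0                    # assumed reservoir half-life on ART
k = np.log(2) / half_life_months           # baseline clearance rate, per month
cure_threshold = 1.0                       # nominal "cure" when < 1 cell remains
N0 = 1e6                                   # assumed initial reservoir size

def months_to_cure(initial_size, rate):
    # Time for exponential decay to fall below the cure threshold.
    return np.log(initial_size / cure_threshold) / rate

print("ART alone:           %.0f years" % (months_to_cure(N0, k) / 12))
print("One-time 10x purge:  %.0f years" % (months_to_cure(N0 / 10, k) / 12))
print("2x faster clearance: %.0f years" % (months_to_cure(N0, 2 * k) / 12))
```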

  9. Effect of Relining Methods (Cold & Heat Cure) on the Accuracy of the Posterior Palatal Seal

    Directory of Open Access Journals (Sweden)

    Nafiseh AsadzadehOghadaee

    2013-01-01

    Full Text Available Introduction: The posterior palatal area is the most important area for retention of maxillary dentures and must be considered carefully during and after relining. The purpose of this in vitro study was to compare the posterior palatal seal in complete dentures relined with two different methods. Materials & Methods: An average-size edentulous maxillary acrylic arch without undercuts was selected for this in vitro study. Ten alginate impressions were made of this model and poured with a type IV gypsum product to prepare the casts of the control group; 10 definitive bases were then created, one for each cast. For the experimental groups, a relief wax layer with a thickness of 2 mm was placed in the post-dam area for the relining processes, and 20 alginate impressions were made of the model. Clear heat-cured acrylic bases were fabricated on the definitive bases. The experimental bases were divided into 2 groups of 10: the first group was relined with heat-cured acrylic resin and the second with cold-cured acrylic resin. All of the bases were kept in distilled water for two weeks and each was then seated on its definitive base. One code was assigned to each model. The gap between the acrylic bases and the arch in the posterior area was measured at five points (a, b, c, d, e: the midline, two points at the hamular notches, and two points between the midline and the hamular notches) by two examiners at two different times (during two weeks) under a light microscope at ×60 magnification. The data were analyzed by Tukey and Kruskal-Wallis tests. Results: There was a statistically significant difference in the amount of gap at point A between the control group (bases without relining) and the experimental groups (P=0.047). At point D there was no significant difference between the experimental groups, but a significant difference was detected between the control group and the bases relined with cold-cured acrylic resin (P<0.05). Conclusion: The results of this laboratory study

  10. Validation of regression models for nitrate concentrations in the upper groundwater in sandy soils

    International Nuclear Information System (INIS)

    Sonneveld, M.P.W.; Brus, D.J.; Roelsma, J.

    2010-01-01

    For Dutch sandy regions, linear regression models have been developed that predict nitrate concentrations in the upper groundwater on the basis of residual nitrate contents in the soil in autumn. The objective of our study was to validate these regression models for one particular sandy region dominated by dairy farming. No data from this area were used for calibrating the regression models. The model was validated by additional probability sampling. This sample was used to estimate errors in (1) the predicted areal fractions where the EU standard of 50 mg l⁻¹ is exceeded for farms with low N surpluses (ALT) and farms with higher N surpluses (REF); and (2) the predicted cumulative frequency distributions of nitrate concentration for both groups of farms. Both the errors in the predicted areal fractions and the errors in the predicted cumulative frequency distributions indicate that the regression models are invalid for the sandy soils of this study area. - This study indicates that linear regression models that predict nitrate concentrations in the upper groundwater using residual soil N contents should be applied with care.

  11. Bivariate least squares linear regression: Towards a unified analytic formalism. I. Functional models

    Science.gov (United States)

    Caimmi, R.

    2011-08-01

    Concerning bivariate least squares linear regression, the classical approach pursued for functional models in earlier attempts (York, 1966, 1969) is reviewed using a new formalism in terms of deviation (matrix) traces which, for unweighted data, reduce to usual quantities leaving aside an unessential (but dimensional) multiplicative factor. Within the framework of classical error models, the dependent variable relates to the independent variable according to the usual additive model. The classes of linear models considered are regression lines in the general case of correlated errors in X and in Y for weighted data, and in the opposite limiting situations of (i) uncorrelated errors in X and in Y, and (ii) completely correlated errors in X and in Y. The special case of (C) generalized orthogonal regression is considered in detail together with well known subcases, namely: (Y) errors in X negligible (ideally null) with respect to errors in Y; (X) errors in Y negligible (ideally null) with respect to errors in X; (O) genuine orthogonal regression; (R) reduced major-axis regression. In the limit of unweighted data, the results determined for functional models are compared with their counterparts related to extreme structural models, i.e. the instrumental scatter is negligible (ideally null) with respect to the intrinsic scatter (Isobe et al., 1990; Feigelson and Babu, 1992). While regression line slope and intercept estimators for functional and structural models necessarily coincide, the contrary holds for related variance estimators even if the residuals obey a Gaussian distribution, with the exception of Y models. An example of astronomical application is considered, concerning the [O/H]-[Fe/H] empirical relations deduced from five samples related to different stars and/or different methods of oxygen abundance determination. For selected samples and assigned methods, different regression models yield consistent results within the errors (∓ σ) for both
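
    For orientation, the sketch below computes the slope of the Y-on-X, orthogonal and reduced major-axis regression lines for unweighted simulated data with comparable errors in both variables, using the standard moment formulas; it does not reproduce the paper's trace-based formalism or the weighted and correlated-error cases.

```python
import numpy as np

# Toy data with scatter in both variables, the situation where the choice
# among (Y), (O) and (R) regression lines actually matters.
rng = np.random.default_rng(3)
x_true = np.linspace(0, 10, 200)
y_true = 1.0 + 0.8 * x_true
x = x_true + rng.normal(0, 0.5, x_true.size)
y = y_true + rng.normal(0, 0.5, x_true.size)

sxx = np.var(x, ddof=1)
syy = np.var(y, ddof=1)
sxy = np.cov(x, y, ddof=1)[0, 1]

slope_ols = sxy / sxx                                                   # (Y) errors only in Y
slope_orth = ((syy - sxx) + np.hypot(syy - sxx, 2 * sxy)) / (2 * sxy)   # (O) orthogonal
slope_rma = np.sign(sxy) * np.sqrt(syy / sxx)                           # (R) reduced major axis

print(slope_ols, slope_orth, slope_rma)
```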

  12. Modeling and prediction of flotation performance using support vector regression

    Directory of Open Access Journals (Sweden)

    Despotović Vladimir

    2017-01-01

    Full Text Available Continuous efforts have been made in recent years to improve the process of paper recycling, as it is of critical importance for saving wood, water and energy resources. Flotation deinking is considered to be one of the key methods for the separation of ink particles from the cellulose fibres. Attempts to model the flotation deinking process have often resulted in complex models that are difficult to implement and use. In this paper a model for the prediction of flotation performance based on Support Vector Regression (SVR) is presented. Representative data samples were created in the laboratory, under a variety of practical control variables for the flotation deinking process, including different reagents, pH values and flotation residence times. A predictive model was created and trained on these data samples, and the flotation performance was assessed, showing that Support Vector Regression is a promising method even when the dataset used for training the model is limited.
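
    A minimal sketch of an SVR model of this kind is given below, using scikit-learn on synthetic flotation-like records; the input variables echo those named in the abstract (reagent dose, pH, residence time) but the values and the functional form are invented for illustration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical flotation records; deinking efficiency is the response.
rng = np.random.default_rng(4)
n = 120
X = np.column_stack([
    rng.uniform(0.1, 1.0, n),    # reagent dose (illustrative units)
    rng.uniform(7.0, 10.5, n),   # pH
    rng.uniform(2.0, 12.0, n),   # residence time, min
])
y = 40 + 25 * X[:, 0] + 3 * (X[:, 1] - 9) ** 2 + 2.5 * X[:, 2] + rng.normal(0, 2, n)

# Epsilon-SVR with an RBF kernel; scaling the inputs first is essential for SVR.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=50.0, epsilon=0.5))
print(cross_val_score(model, X, y, cv=5, scoring="r2").mean())
```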

  13. Study on the heat-resistant EB curing composites

    International Nuclear Information System (INIS)

    Bao Jianwen; Li Yang; Li Fengmei

    2000-01-01

    There are many advantages to the EB curing process for composites. Heat-resistant EB-cured composites could substitute for the polyimide composites used in aeronautical engines. The effects of catalyst and dose on the cured resin were investigated. The heat resistance of the resin cured by EB was evaluated by dynamic mechanical thermal analysis (DMTA). The experimental results show that the mechanical properties of the composites cured by EB could meet the needs of aeronautical engines at 250 °C. (author)

  14. About the cure kinetics in natural rubber/styrene Butadiene rubber blends at 433 K

    International Nuclear Information System (INIS)

    Mansilla, M.A.; Marzocca, A.J.

    2012-01-01

    Vulcanized blends of elastomers are employed in several goods, mainly to improve physical properties and reduce costs. One of the most widely used blends of this kind is that composed of natural rubber (NR) and styrene butadiene rubber (SBR). The cure kinetics of these blends depend mainly on the compound formulation and the cure temperature and time. The preparation method of the blends can influence the mechanical properties of the vulcanized compounds. In this work the cure kinetics at 433 K of NR/SBR blends vulcanized with the sulfur/TBBS (N-t-butyl-2-benzothiazole sulfenamide) system are analyzed in samples prepared by mechanical mixing and by solution blending. The two methods produce elastomer domains of NR and SBR, which present different microstructures due to the cure level attained during vulcanization. The cure kinetics are studied by means of rheometer tests and the model proposed by Kamal and Sourour. The analysis of the cure rate is presented and is related to the structure obtained during the vulcanization process.
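
    The Kamal-Sourour model referred to above describes the isothermal cure rate as dα/dt = (k1 + k2·α^m)(1 − α)^n. The sketch below integrates this equation for illustrative rate constants (not the values fitted in the paper) to obtain the degree of cure as a function of time at a fixed cure temperature.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Kamal-Sourour autocatalytic cure model: dalpha/dt = (k1 + k2*alpha**m) * (1 - alpha)**n
# Rate constants and exponents below are illustrative placeholders.
k1, k2, m, n = 0.005, 0.15, 0.9, 1.4   # per minute, at the cure temperature

def cure_rate(t, alpha):
    a = np.clip(alpha[0], 0.0, 1.0)
    return [(k1 + k2 * a**m) * (1.0 - a) ** n]

sol = solve_ivp(cure_rate, (0.0, 120.0), [1e-6], dense_output=True)
for t in (10, 30, 60, 120):            # minutes
    print(t, float(sol.sol(t)[0]))     # degree of cure alpha(t)
```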

  15. About the cure kinetics in natural rubber/styrene Butadiene rubber blends at 433 K

    Energy Technology Data Exchange (ETDEWEB)

    Mansilla, M.A., E-mail: mmansilla@df.uba.ar [Laboratorio de Polimeros y Materiales Compuestos, Departamento de Fisica, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Ciudad Universitaria, Pabellon 1, C1428EGA Buenos Aires (Argentina); Marzocca, A.J. [Laboratorio de Polimeros y Materiales Compuestos, Departamento de Fisica, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Ciudad Universitaria, Pabellon 1, C1428EGA Buenos Aires (Argentina)

    2012-08-15

    Vulcanized blends of elastomers are employed in several goods, mainly to improve physical properties and reduce costs. One of the most widely used blends of this kind is that composed of natural rubber (NR) and styrene butadiene rubber (SBR). The cure kinetics of these blends depend mainly on the compound formulation and the cure temperature and time. The preparation method of the blends can influence the mechanical properties of the vulcanized compounds. In this work the cure kinetics at 433 K of NR/SBR blends vulcanized with the sulfur/TBBS (N-t-butyl-2-benzothiazole sulfenamide) system are analyzed in samples prepared by mechanical mixing and by solution blending. The two methods produce elastomer domains of NR and SBR, which present different microstructures due to the cure level attained during vulcanization. The cure kinetics are studied by means of rheometer tests and the model proposed by Kamal and Sourour. The analysis of the cure rate is presented and is related to the structure obtained during the vulcanization process.

  16. Forecasting daily meteorological time series using ARIMA and regression models

    Science.gov (United States)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used the Box-Jenkins and Holt-Winters methodology: the seasonal autoregressive integrated moving-average model, the autoregressive integrated moving-average model with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.

  17. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  18. Cross-validation pitfalls when selecting and assessing regression and classification models.

    Science.gov (United States)

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
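
    A compact sketch of the nested scheme described above, using scikit-learn: an inner grid-search V-fold cross-validation tunes the parameters, and an outer cross-validation assesses the tuned model on data it never saw during tuning. The dataset and parameter grid are placeholders, and the paper's protocol additionally repeats both loops over many random splits to reduce the variance of the estimate.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

X, y = load_diabetes(return_X_y=True)

# Inner loop: grid-search V-fold cross-validation for parameter tuning.
inner = GridSearchCV(
    Ridge(),
    param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]},
    cv=KFold(n_splits=5, shuffle=True, random_state=0),
)

# Outer loop: the tuned model is re-fit and scored on held-out folds, giving a
# far less optimistic error estimate than reporting the inner-loop score.
outer_scores = cross_val_score(inner, X, y,
                               cv=KFold(n_splits=5, shuffle=True, random_state=1))
print(outer_scores.mean())
```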

  19. An emerging alternative to thermal curing: Electron curing of fiber-reinforced composites

    International Nuclear Information System (INIS)

    Singh, A.; Saunders, C.B.; Lopata, V.J.; Kremers, W.; Chung, M.

    1995-01-01

    Electron curing of fiber-reinforced composites to produce materials with good mechanical properties has been demonstrated by the authors' work, and by Aerospatiale. The attractions of this technology are the technical and processing advantages offered over thermal curing, and the projected cost benefits. Though the work so far has focused on the higher value composites for the aircraft and aerospace industries, the technology can also be used to produce composites for the higher volume industries, such as transportation and automotive

  20. Radiation cured acrylonitrile--butadiene elastomers

    International Nuclear Information System (INIS)

    Eldred, R.J.

    1976-01-01

    In accordance with a preferred embodiment of this invention, the ultimate elongation of an electron beam radiation cured acrylonitrile-butadiene elastomer is significantly increased by the incorporation of a preferred noncrosslinking monomer, glycidyl methacrylate, in combination with the conventional crosslinking monomer, trimethylolpropanetrimethacrylate, prior to the radiation curing process

  1. Techniques for internal water curing of concrete

    DEFF Research Database (Denmark)

    Jensen, Ole Mejlhede; Pietro, Lura

    2003-01-01

    This paper gives an overview of different techniques for incorporation of internal curing water in concrete. Internal curing can be used to mitigate self-desiccation and self-desiccation shrinkage. Some concretes may need 50 kg/m³ of internal curing water for this purpose. The price of the internal...

  2. A LATENT CLASS POISSON REGRESSION-MODEL FOR HETEROGENEOUS COUNT DATA

    NARCIS (Netherlands)

    WEDEL, M; DESARBO, WS; BULT; RAMASWAMY

    1993-01-01

    In this paper an approach is developed that accommodates heterogeneity in Poisson regression models for count data. The model developed assumes that heterogeneity arises from a distribution of both the intercept and the coefficients of the explanatory variables. We assume that the mixing

  3. Continuous validation of ASTEC containment models and regression testing

    International Nuclear Information System (INIS)

    Nowack, Holger; Reinke, Nils; Sonnenkalb, Martin

    2014-01-01

    The focus of the ASTEC (Accident Source Term Evaluation Code) development at GRS is primarily on the containment module CPA (Containment Part of ASTEC), whose modelling is to a large extent based on the GRS containment code COCOSYS (COntainment COde SYStem). Validation is usually understood as the approval of the modelling capabilities by calculations of appropriate experiments done by external users different from the code developers. During the development process of ASTEC CPA, bugs and unintended side effects may occur, which leads to changes in the results of the initially conducted validation. Due to the involvement of a considerable number of developers in the coding of ASTEC modules, validation of the code alone, even if executed repeatedly, is not sufficient. Therefore, a regression testing procedure has been implemented in order to ensure that the initially obtained validation results are still valid with succeeding code versions. Within the regression testing procedure, calculations of experiments and plant sequences are performed with the same input deck but applying two different code versions. For every test-case the up-to-date code version is compared to the preceding one on the basis of physical parameters deemed to be characteristic for the test-case under consideration. In the case of post-calculations of experiments also a comparison to experimental data is carried out. Three validation cases from the regression testing procedure are presented within this paper. The very good post-calculation of the HDR E11.1 experiment shows the high quality modelling of thermal-hydraulics in ASTEC CPA. Aerosol behaviour is validated on the BMC VANAM M3 experiment, and the results show also a very good agreement with experimental data. Finally, iodine behaviour is checked in the validation test-case of the THAI IOD-11 experiment. Within this test-case, the comparison of the ASTEC versions V2.0r1 and V2.0r2 shows how an error was detected by the regression testing

  4. Linear Multivariable Regression Models for Prediction of Eddy Dissipation Rate from Available Meteorological Data

    Science.gov (United States)

    MCKissick, Burnell T. (Technical Monitor); Plassman, Gerald E.; Mall, Gerald H.; Quagliano, John R.

    2005-01-01

    Linear multivariable regression models for predicting day and night Eddy Dissipation Rate (EDR) from available meteorological data sources are defined and validated. Model definition is based on a combination of 1997-2000 Dallas/Fort Worth (DFW) data sources, EDR from Aircraft Vortex Spacing System (AVOSS) deployment data, and regression variables primarily from corresponding Automated Surface Observation System (ASOS) data. Model validation is accomplished through EDR predictions on a similar combination of 1994-1995 Memphis (MEM) AVOSS and ASOS data. Model forms include an intercept plus a single term of fixed optimal power for each of these regression variables; 30-minute forward averaged mean and variance of near-surface wind speed and temperature, variance of wind direction, and a discrete cloud cover metric. Distinct day and night models, regressing on EDR and the natural log of EDR respectively, yield best performance and avoid model discontinuity over day/night data boundaries.

  5. A cure for HIV: is it in sight?

    Science.gov (United States)

    Pace, Matthew; Frater, John

    2014-07-01

    HIV is a devastating disease affecting millions of people worldwide despite the advent of successful antiretroviral therapy (ART). However, ART does not result in a cure and has to be taken for life. Accordingly, researchers are turning towards cure efforts, particularly in the light of two patients whose HIV has been seemingly eradicated. Numerous approaches and strategies have been considered for curing HIV, but no scalable and safe solution has yet been reached. With newly discovered difficulties in measuring the HIV reservoir, the main barrier to a cure, the only true test of cure is to stop ART and see whether the virus becomes detectable. However, it is possible that this treatment interruption may be associated with certain risks for patients. Here, we compare the current major approaches and recent advances for curing HIV, as well as discuss ways of evaluating HIV cure and the safety concerns involved.

  6. Prediction of Mind-Wandering with Electroencephalogram and Non-linear Regression Modeling.

    Science.gov (United States)

    Kawashima, Issaku; Kumano, Hiroaki

    2017-01-01

    Mind-wandering (MW), task-unrelated thought, has been examined by researchers in an increasing number of articles using models to predict whether subjects are in MW, using numerous physiological variables. However, these models are not applicable in general situations. Moreover, they output only binary classification. The current study suggests that the combination of electroencephalogram (EEG) variables and non-linear regression modeling can be a good indicator of MW intensity. We recorded EEGs of 50 subjects during the performance of a Sustained Attention to Response Task, including a thought sampling probe that inquired the focus of attention. We calculated the power and coherence value and prepared 35 patterns of variable combinations and applied Support Vector machine Regression (SVR) to them. Finally, we chose four SVR models: two of them non-linear models and the others linear models; two of the four models are composed of a limited number of electrodes to satisfy model usefulness. Examination using the held-out data indicated that all models had robust predictive precision and provided significantly better estimations than a linear regression model using single electrode EEG variables. Furthermore, in limited electrode condition, non-linear SVR model showed significantly better precision than linear SVR model. The method proposed in this study helps investigations into MW in various little-examined situations. Further, by measuring MW with a high temporal resolution EEG, unclear aspects of MW, such as time series variation, are expected to be revealed. Furthermore, our suggestion that a few electrodes can also predict MW contributes to the development of neuro-feedback studies.

  7. Prediction of Mind-Wandering with Electroencephalogram and Non-linear Regression Modeling

    Directory of Open Access Journals (Sweden)

    Issaku Kawashima

    2017-07-01

    Full Text Available Mind-wandering (MW), task-unrelated thought, has been examined by researchers in an increasing number of articles using models to predict whether subjects are in MW, using numerous physiological variables. However, these models are not applicable in general situations. Moreover, they output only binary classification. The current study suggests that the combination of electroencephalogram (EEG) variables and non-linear regression modeling can be a good indicator of MW intensity. We recorded EEGs of 50 subjects during the performance of a Sustained Attention to Response Task, including a thought sampling probe that inquired the focus of attention. We calculated the power and coherence value and prepared 35 patterns of variable combinations and applied Support Vector machine Regression (SVR) to them. Finally, we chose four SVR models: two of them non-linear models and the others linear models; two of the four models are composed of a limited number of electrodes to satisfy model usefulness. Examination using the held-out data indicated that all models had robust predictive precision and provided significantly better estimations than a linear regression model using single electrode EEG variables. Furthermore, in limited electrode condition, non-linear SVR model showed significantly better precision than linear SVR model. The method proposed in this study helps investigations into MW in various little-examined situations. Further, by measuring MW with a high temporal resolution EEG, unclear aspects of MW, such as time series variation, are expected to be revealed. Furthermore, our suggestion that a few electrodes can also predict MW contributes to the development of neuro-feedback studies.

  8. Reconstruction of missing daily streamflow data using dynamic regression models

    Science.gov (United States)

    Tencaliec, Patricia; Favre, Anne-Catherine; Prieur, Clémentine; Mathevet, Thibault

    2015-12-01

    River discharge is one of the most important quantities in hydrology. It provides fundamental records for water resources management and climate change monitoring. Even very short data gaps in this information can lead to extremely different analysis outputs. Therefore, reconstructing missing data in incomplete data sets is an important step for the performance of environmental models, engineering, and research applications, and it presents a great challenge. The objective of this paper is to introduce an effective technique for reconstructing missing daily discharge data when one has access to only daily streamflow data. The proposed procedure uses a combination of regression and autoregressive integrated moving average (ARIMA) models, called a dynamic regression model. This model uses the linear relationship between neighboring, correlated stations and then adjusts the residual term by fitting an ARIMA structure. Application of the model to eight daily streamflow series for the Durance river watershed showed that the model yields reliable estimates for the missing data in the time series. Simulation studies were also conducted to evaluate the performance of the procedure.
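
    One common way to express such a dynamic regression model (a regression on a neighbouring station with ARIMA errors) is through a state-space ARIMAX fit; the sketch below uses statsmodels on simulated data with an artificial gap, and is not the authors' exact implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in: discharge at a target gauge regressed on a correlated
# neighbouring gauge, with AR(1) errors.
rng = np.random.default_rng(5)
n = 400
neighbour = 50 + 10 * np.sin(np.arange(n) / 30) + rng.normal(0, 2, n)
noise = np.zeros(n)
for t in range(1, n):
    noise[t] = 0.7 * noise[t - 1] + rng.normal(0, 1)
target = pd.Series(5 + 0.8 * neighbour + noise)
target.iloc[150:165] = np.nan            # an artificial gap to reconstruct

# SARIMAX with an exogenous regressor is one way to write a dynamic regression
# model; its Kalman filter handles the missing observations directly.
fit = sm.tsa.SARIMAX(target, exog=neighbour, order=(1, 0, 0)).fit(disp=False)
reconstructed = fit.predict(start=150, end=164)   # in-sample predictions fill the gap
print(reconstructed.head())
```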

  9. Effect of various infection-control methods for light-cure units on the cure of composite resins.

    Science.gov (United States)

    Chong, S L; Lam, Y K; Lee, F K; Ramalingam, L; Yeo, A C; Lim, C C

    1998-01-01

    This study (1) compared the curing-light intensity with various barrier infection-control methods used to prevent cross contamination, (2) compared the Knoop hardness value of cured composite resin when various barrier control methods were used, and (3) correlated the hardness of the composite resin with the light-intensity output when different infection-control methods were used. The light-cure unit tips were covered with barriers, such as cellophane wrap, plastic gloves, Steri-shields, and finger cots. The control group had no barrier. Composite resins were then cured for each of the five groups, and their Knoop hardness values recorded. The results showed that there was significant statistical difference in the light-intensity output among the five groups. However, there was no significant statistical difference in the Knoop hardness values among any of the groups. There was also no correlation between the Knoop hardness value of the composite resin with the light-intensity output and the different infection-control methods. Therefore, any of the five infection-control methods could be used as barriers for preventing cross-contamination of the light-cure unit tip, for the light-intensity output for all five groups exceeded the recommended value of 300 W/m2. However, to allow a greater margin of error in clinical situations, the authors recommend that the plastic glove or the cellophane wrap be used to wrap the light-cure tip, since these barriers allowed the highest light-intensity output.

  10. Detection of Outliers in Regression Model for Medical Data

    Directory of Open Access Journals (Sweden)

    Stephen Raj S

    2017-07-01

    Full Text Available In regression analysis, an outlier is an observation for which the residual is large in magnitude compared to other observations in the data set. The detection of outliers and influential points is an important step of the regression analysis. Outlier detection methods have been used to detect and remove anomalous values from data. In this paper, we detect the presence of outliers in simple linear regression models for a medical data set. Chatterjee and Hadi mentioned that the ordinary residuals are not appropriate for diagnostic purposes; a transformed version of them is preferable. First, we investigate the presence of outliers based on existing procedures using residuals and standardized residuals. Next, we use a new approach based on standardized scores for detecting outliers without the use of predicted values. The performance of the new approach was verified with real-life data.
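
    For comparison with the residual-based procedures discussed above, the sketch below flags outliers in a simple linear regression using studentized residuals from statsmodels; the data and the planted outlier are synthetic, and the paper's standardized-scores approach is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import OLSInfluence

# Toy "medical" data with one planted outlier.
rng = np.random.default_rng(6)
x = rng.normal(50, 10, 60)                 # e.g. age (hypothetical)
y = 80 + 0.6 * x + rng.normal(0, 3, 60)    # e.g. a clinical response (hypothetical)
y[10] += 25                                # the anomalous observation

fit = sm.OLS(y, sm.add_constant(x)).fit()
infl = OLSInfluence(fit)

# Externally studentized residuals; |r| greater than about 2-3 is the usual flag.
student = infl.resid_studentized_external
print(np.where(np.abs(student) > 3)[0])
```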

  11. Effect of light dispersion of LED curing lights on resin composite polymerization.

    Science.gov (United States)

    Vandewalle, Kraig S; Roberts, Howard W; Andrus, Jeffrey L; Dunn, William J

    2005-01-01

    higher DC ratios with the hybrid resin composite. No differences were found among lights with the microfill at 1 mm. At 5 mm, SmartLite iQ, FLASHlite 1001, LEDemetron 1, and UltraLume LED 5 produced significantly higher DC ratios with the hybrid resin composite, whereas LEDemetron 1 and SmartLite iQ produced significantly higher DC ratios with the microfill resin composite. The UltraLume LED 5, Allegro, and Optilux 501 had significant reductions in mean DC ratios at curing distances of 1 and 5 mm with both resin composite types. For dispersion of light, significant differences were found in Top Hat factor and divergence angle (p < .001). SmartLite iQ had overall the highest Top Hat factor and lowest divergence angle of tested lights. A linear regression analysis relating pooled DC with pooled Top Hat factors and divergence angles found a very good correlation (r2 = .86) between dispersion of light over distance and the ability to polymerize resin composite. The latest generation of LED curing lights provides DC ratios similar to or better than the halogen curing light at a curing distance of 5 mm. Dispersion of light plays a significant role in the DC of resin composite. To maximize curing effectiveness, light guides should be maintained in close proximity to the surface of the light-activated restorative material.

  12. Estimasi Model Seemingly Unrelated Regression (SUR dengan Metode Generalized Least Square (GLS

    Directory of Open Access Journals (Sweden)

    Ade Widyaningsih

    2015-04-01

    Full Text Available Regression analysis is a statistical tool that is used to determine the relationship between two or more quantitative variables so that one variable can be predicted from the others. A method that can be used to obtain good estimates in regression analysis is the ordinary least squares (OLS) method. The least squares method is used to estimate the parameters of one or more regression equations, but relationships among the errors in the responses of the different equations are not allowed for. One way to overcome this problem is the Seemingly Unrelated Regression (SUR) model, in which parameters are estimated using Generalized Least Squares (GLS). In this study, the author applies the SUR model with the GLS method to world gasoline demand data. The author finds that SUR using GLS is better than OLS because SUR produces smaller errors than OLS.
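
    The sketch below spells out the two-step feasible GLS estimator behind SUR on a simulated two-equation system: equation-by-equation OLS residuals give an estimate of the cross-equation error covariance, which is then used in a stacked GLS step. It is a didactic sketch, not the gasoline-demand analysis of the paper.

```python
import numpy as np
from scipy.linalg import block_diag

rng = np.random.default_rng(7)
n = 200
x1 = np.column_stack([np.ones(n), rng.normal(size=n)])
x2 = np.column_stack([np.ones(n), rng.normal(size=n)])
e = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)  # correlated errors
y1 = x1 @ np.array([1.0, 2.0]) + e[:, 0]
y2 = x2 @ np.array([-0.5, 1.5]) + e[:, 1]

# Step 1: equation-by-equation OLS and the cross-equation error covariance.
b1 = np.linalg.lstsq(x1, y1, rcond=None)[0]
b2 = np.linalg.lstsq(x2, y2, rcond=None)[0]
resid = np.column_stack([y1 - x1 @ b1, y2 - x2 @ b2])
sigma = resid.T @ resid / n

# Step 2: stacked GLS with covariance Sigma kron I_n (feasible GLS / SUR).
X = block_diag(x1, x2)
Y = np.concatenate([y1, y2])
omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
beta_sur = np.linalg.solve(X.T @ omega_inv @ X, X.T @ omega_inv @ Y)
print(beta_sur)
```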

  13. Estimasi Model Seemingly Unrelated Regression (SUR dengan Metode Generalized Least Square (GLS

    Directory of Open Access Journals (Sweden)

    Ade Widyaningsih

    2014-06-01

    Full Text Available Regression analysis is a statistical tool that is used to determine the relationship between two or more quantitative variables so that one variable can be predicted from the others. A method that can be used to obtain good estimates in regression analysis is the ordinary least squares (OLS) method. The least squares method is used to estimate the parameters of one or more regression equations, but relationships among the errors in the responses of the different equations are not allowed for. One way to overcome this problem is the Seemingly Unrelated Regression (SUR) model, in which parameters are estimated using Generalized Least Squares (GLS). In this study, the author applies the SUR model with the GLS method to world gasoline demand data. The author finds that SUR using GLS is better than OLS because SUR produces smaller errors than OLS.

  14. Na-MRI quantification of sodium movements in pork during brine curing as related to meat pH

    DEFF Research Database (Denmark)

    Vestergaard, Christian Sylvest; Risum, Jørgen; Adler-Nissen, Jens

    A model study of measuring diffusion during curing with ²³Na images, Na profiles, apparent diffusion coefficients and T1-weighted images is presented.

  15. On pseudo-values for regression analysis in competing risks models

    DEFF Research Database (Denmark)

    Graw, F; Gerds, Thomas Alexander; Schumacher, M

    2009-01-01

    For regression on state and transition probabilities in multi-state models Andersen et al. (Biometrika 90:15-27, 2003) propose a technique based on jackknife pseudo-values. In this article we analyze the pseudo-values suggested for competing risks models and prove some conjectures regarding their...

  16. Modeling and prediction of Turkey's electricity consumption using Support Vector Regression

    International Nuclear Information System (INIS)

    Kavaklioglu, Kadir

    2011-01-01

    Support Vector Regression (SVR) methodology is used to model and predict Turkey's electricity consumption. Among various SVR formalisms, ε-SVR method was used since the training pattern set was relatively small. Electricity consumption is modeled as a function of socio-economic indicators such as population, Gross National Product, imports and exports. In order to facilitate future predictions of electricity consumption, a separate SVR model was created for each of the input variables using their current and past values; and these models were combined to yield consumption prediction values. A grid search for the model parameters was performed to find the best ε-SVR model for each variable based on Root Mean Square Error. Electricity consumption of Turkey is predicted until 2026 using data from 1975 to 2006. The results show that electricity consumption can be modeled using Support Vector Regression and the models can be used to predict future electricity consumption. (author)

  17. Cluster regression model and level fluctuation features of Van Lake, Turkey

    Directory of Open Access Journals (Sweden)

    Z. Şen

    1999-02-01

    Full Text Available Lake water levels change under the influence of natural and/or anthropogenic environmental conditions. Among these influences are climate change, greenhouse effects and ozone layer depletion, which are reflected in the hydrological cycle over the lake drainage basins. Lake levels are among the most significant hydrological variables influenced by different atmospheric and environmental conditions. Consequently, lake level time series in many parts of the world include nonstationarity components such as shifts in the mean value and apparent or hidden periodicities. On the other hand, many lake level modeling techniques rely on a stationarity assumption. The main purpose of this work is to develop a cluster regression model for dealing with nonstationarity, especially in the form of shifting means. The basis of this model is the combination of transition probability and the classical regression technique. Both parts of the model are applied to monthly level fluctuations of Lake Van in eastern Turkey. It is observed that the cluster regression procedure preserves the statistical properties and transition probabilities, which are indistinguishable from those of the original data.

    Key words. Hydrology (hydrologic budget; stochastic processes) · Meteorology and atmospheric dynamics (ocean-atmosphere interactions)

  18. Cluster regression model and level fluctuation features of Van Lake, Turkey

    Directory of Open Access Journals (Sweden)

    Z. Şen

    Full Text Available Lake water levels change under the influence of natural and/or anthropogenic environmental conditions. Among these influences are climate change, greenhouse effects and ozone layer depletion, which are reflected in the hydrological cycle over the lake drainage basins. Lake levels are among the most significant hydrological variables influenced by different atmospheric and environmental conditions. Consequently, lake level time series in many parts of the world include nonstationarity components such as shifts in the mean value and apparent or hidden periodicities. On the other hand, many lake level modeling techniques rely on a stationarity assumption. The main purpose of this work is to develop a cluster regression model for dealing with nonstationarity, especially in the form of shifting means. The basis of this model is the combination of transition probability and the classical regression technique. Both parts of the model are applied to monthly level fluctuations of Lake Van in eastern Turkey. It is observed that the cluster regression procedure preserves the statistical properties and transition probabilities, which are indistinguishable from those of the original data.

    Key words. Hydrology (hydrologic budget; stochastic processes) · Meteorology and atmospheric dynamics (ocean-atmosphere interactions)

  19. Improved model of the retardance in citric acid coated ferrofluids using stepwise regression

    Science.gov (United States)

    Lin, J. F.; Qiu, X. R.

    2017-06-01

    Citric acid (CA) coated Fe3O4 ferrofluids (FFs) have been developed for biomedical applications. The magneto-optical retardance of the CA coated FFs was measured with a Stokes polarimeter. In earlier work, optimization and multiple regression of the retardance were carried out with the Taguchi method and Microsoft Excel, and the F value of that regression model was large enough; however, the Excel-based modeling was not systematic. Here, stepwise regression was instead adopted to model the retardance of the CA coated FFs. From the results of stepwise regression in MATLAB, the developed model had high predictive ability, with an F value of 2.55897e+7 and a correlation coefficient of one. The average absolute error of the predicted retardances relative to the measured retardances was just 0.0044%. Using the genetic algorithm (GA) in MATLAB, the optimized parameter combination was determined as [4.709 0.12 39.998 70.006], corresponding to the pH of the suspension, the molar ratio of CA to Fe3O4, the CA volume, and the coating temperature. The maximum retardance was found to be 31.712°, close to that obtained by the evolutionary solver in Excel, with a relative error of -0.013%. Above all, the stepwise regression method was successfully used to model the retardance of the CA coated FFs, and the maximum global retardance was determined by the use of GA.
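
    The abstract describes stepwise regression followed by a genetic-algorithm search; the sketch below shows only a forward stepwise selection step, in Python rather than MATLAB, with hypothetical factor names (pH, ratio, volume, temp) and a synthetic response standing in for the measured retardance.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 60
# Hypothetical factors standing in for pH, CA/Fe3O4 molar ratio, CA volume
# and coating temperature; the measured retardance data are not public here.
X = pd.DataFrame(rng.normal(size=(n, 4)), columns=["pH", "ratio", "volume", "temp"])
y = 10 + 3 * X["pH"] - 2 * X["ratio"] + 0.5 * X["pH"] * X["temp"] + rng.normal(scale=0.5, size=n)
X["pH_x_temp"] = X["pH"] * X["temp"]   # candidate interaction term

def forward_stepwise(X, y, alpha_in=0.05):
    """Add, one at a time, the candidate term with the smallest p-value below alpha_in."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvals = {}
        for c in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [c]])).fit()
            pvals[c] = model.pvalues[c]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha_in:
            break
        selected.append(best)
        remaining.remove(best)
    return sm.OLS(y, sm.add_constant(X[selected])).fit(), selected

fit, terms = forward_stepwise(X, y)
print("selected terms:", terms)
print("F =", fit.fvalue, " R^2 =", fit.rsquared)
```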

  20. Mechanical properties of self-curing concrete (SCUC)

    Directory of Open Access Journals (Sweden)

    Magda I. Mousa

    2015-12-01

    Full Text Available The mechanical properties of concrete containing self-curing agents are investigated in this paper. In this study, two materials were selected as self-curing agents, used in different amounts, and the addition of silica fume was also studied. The self-curing agents were pre-soaked lightweight aggregate (Leca), at 0.0%, 10%, 15% and 20% of the sand volume, or polyethylene glycol (Ch.), at 1%, 2% and 3% by weight of cement. To carry out this study, cement contents of 300, 400 and 500 kg/m3, water/cement ratios of 0.5, 0.4 and 0.3, and silica fume additions of 0.0% and 15% by weight of cement were used in the concrete mixes. The mechanical properties were evaluated while the concrete specimens were subjected to an air curing regime (in the laboratory environment at 25 °C and 65% R.H.) during the experiment. The results show that the use of self-curing agents in concrete effectively improves the mechanical properties. The concrete that used polyethylene glycol as the self-curing agent attained higher values of the mechanical properties than the concrete with saturated Leca. In all cases, either 2% Ch. or 15% Leca was the optimum ratio compared with the other ratios. Higher cement content and/or lower water/cement ratio leads to more efficient performance of the self-curing agents in concrete. Incorporation of silica fume into the self-curing concrete mixture enhanced all mechanical properties, not only due to its pozzolanic reaction, but also due to its ability to retain water inside the concrete.

  1. Accounting for spatial effects in land use regression for urban air pollution modeling.

    Science.gov (United States)

    Bertazzon, Stefania; Johnson, Markey; Eccles, Kristin; Kaplan, Gilaad G

    2015-01-01

    In order to accurately assess air pollution risks, health studies require spatially resolved pollution concentrations. Land-use regression (LUR) models estimate ambient concentrations at a fine spatial scale. However, spatial effects such as spatial non-stationarity and spatial autocorrelation can reduce the accuracy of LUR estimates by increasing regression errors and uncertainty; and statistical methods for resolving these effects--e.g., spatially autoregressive (SAR) and geographically weighted regression (GWR) models--may be difficult to apply simultaneously. We used an alternate approach to address spatial non-stationarity and spatial autocorrelation in LUR models for nitrogen dioxide. Traditional models were re-specified to include a variable capturing wind speed and direction, and re-fit as GWR models. Mean R² values for the resulting GWR-wind models (summer: 0.86, winter: 0.73) showed a 10-20% improvement over traditional LUR models. GWR-wind models effectively addressed both spatial effects and produced meaningful predictive models. These results suggest a useful method for improving spatially explicit models. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
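
    A minimal sketch of the geographically weighted regression idea referred to above: one weighted least-squares fit per monitoring site, with Gaussian distance weights and a wind term included as a covariate. The coordinates, predictors and bandwidth are synthetic assumptions, not the study's NO2 data or its calibrated GWR-wind specification.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300
coords = rng.uniform(0, 10, size=(n, 2))   # hypothetical monitoring-site coordinates
traffic = rng.uniform(size=n)              # hypothetical LUR predictor
wind = rng.uniform(size=n)                 # wind speed/direction term
# Spatially varying coefficient on traffic to mimic non-stationarity.
beta_traffic = 2.0 + 0.3 * coords[:, 0]
no2 = 5 + beta_traffic * traffic - 1.5 * wind + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), traffic, wind])

def gwr_coefficients(X, y, coords, bandwidth=2.0):
    """Fit one weighted least-squares model per location (Gaussian kernel weights)."""
    betas = np.empty((len(y), X.shape[1]))
    for i in range(len(y)):
        d2 = np.sum((coords - coords[i]) ** 2, axis=1)
        w = np.exp(-d2 / (2 * bandwidth ** 2))
        W = np.diag(w)
        betas[i] = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return betas

betas = gwr_coefficients(X, no2, coords)
print("local traffic coefficients, min/max:", betas[:, 1].min(), betas[:, 1].max())
```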

  2. Regression analysis understanding and building business and economic models using Excel

    CERN Document Server

    Wilson, J Holton

    2012-01-01

    The technique of regression analysis is used so often in business and economics today that an understanding of its use is necessary for almost everyone engaged in the field. This book will teach you the essential elements of building and understanding regression models in a business/economic context in an intuitive manner. The authors take a non-theoretical treatment that is accessible even if you have a limited statistical background. It is specifically designed to teach the correct use of regression, while advising you of its limitations and teaching about common pitfalls. This book describe

  3. Analysis and simulation of heat transfer in human tooth during the curing of orthodontic appliance and food ingestion

    Directory of Open Access Journals (Sweden)

    J Velazquez-Lopez

    2016-01-01

    Full Text Available The aim of this study was to analyze and simulate heat transfer in the human tooth during the curing of fixed orthodontic appliances and during food intake. A representative in vivo mathematical model of a layered thermographic profile was developed during the LED curing of a Gemini bracket with a 0.022 in slot (conventional ligating system) and Transbond XT adhesive. Characterization of the layered thermal response made it possible to identify whether, during the LED curing process carried out according to the manufacturer's specification for the light curing unit, the adhesive can induce pulpal necrosis. The thermographic profile model was used as the basis for simulating various conditions, such as food intake, because in vivo metrology is limited by the impossibility of correct apparatus positioning and by the physiological function of the oral cavity, which is exposed to uncontrollable temperature changes. The measurements were carried out with a T-440 thermographic camera during LED curing of the bracket, using an LED curing light (Elipar S10) placed at 3 ± 1 mm for 5 s at each mesial and distal surface. The thermography outcomes were analyzed in the FLIR Tools software, Microsoft Excel 2013 and SPSS 22. To adjust the error of the mathematical model, in vitro studies were performed on third molars in order to carry out extreme temperature exposure tests with the LED curing unit without jeopardizing human tooth vitality, as would be the case in in vivo experimentation. Bracket curing under the manufacturer's conditions reached temperatures of 39°C in vivo and 47°C in the in vitro tests, which does not jeopardize human tooth vitality according to previous research, provided that the precise curing protocol established by the manufacturer of the LED curing light is followed.

  4. Curing behaviors and properties of an extrinsic toughened epoxy/anhydride system and an intrinsic toughened epoxy/anhydride system

    International Nuclear Information System (INIS)

    Fan, Mengjin; Liu, Jialin; Li, Xiangyuan; Cheng, Jue; Zhang, Junying

    2013-01-01

    Highlights: ► Two curing systems (ETRS and ITRS) with similar chemical composition were prepared. ► The curing kinetics of the ETRS and the novel ITRS were comparatively studied. ► Crosslinking density can affect the kinetic schemes of the two curing systems. ► Their mechanical properties and thermal stabilities were also comparatively studied. ► Crosslinking density may play an influential role in mechanical properties. - Abstract: The curing kinetics of an extrinsic toughened epoxy (a mixture of diglycidyl ether of bisphenol-A and 1,4-butanediol epoxy resin, DGEBA/DGEBD) and an intrinsic toughened epoxy (ethoxylated bisphenol-A epoxy resin with two oxyethylene units, DGEBAEO-2), using hexahydrophthalic anhydride (HHPA) as curing agent and tris-(dimethylaminomethyl) phenol (DMP-30) as accelerator, were comparatively studied by non-isothermal DSC with the model-fitting Málek approach and the model-free advanced isoconversional method of Vyazovkin. The dynamic mechanical properties and thermal stabilities of the cured materials were investigated by DMTA and TGA, respectively. The results showed that the Šesták–Berggren model can generally simulate the reaction rates of these two systems well. The activation energy of DGEBA/DGEBD/HHPA/DMP-30 at high fractional conversion increased much more than that of DGEBAEO-2/HHPA/DMP-30, indicating that the increased steric hindrance mainly affected the reaction kinetic scheme of DGEBA/DGEBD/HHPA/DMP-30. The Tg and storage moduli of cured DGEBAEO-2/HHPA/DMP-30 were lower than those of cured DGEBA/DGEBD/HHPA/DMP-30 according to DMTA, while TGA showed that the thermal stabilities of these two cured systems were similar.

  5. Curing behaviors and properties of an extrinsic toughened epoxy/anhydride system and an intrinsic toughened epoxy/anhydride system

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Mengjin; Liu, Jialin; Li, Xiangyuan [Key Laboratory of Carbon Fiber and Functional Polymers, Ministry of Education, Beijing University of Chemical Technology, Beijing 100029 (China); Cheng, Jue, E-mail: chengjue@mail.buct.edu.cn [Key Laboratory of Carbon Fiber and Functional Polymers, Ministry of Education, Beijing University of Chemical Technology, Beijing 100029 (China); Zhang, Junying, E-mail: zjybuct@gmail.com [Key Laboratory of Carbon Fiber and Functional Polymers, Ministry of Education, Beijing University of Chemical Technology, Beijing 100029 (China)

    2013-02-20

    Highlights: ► Two curing systems (ETRS and ITRS) with similar chemical composition were prepared. ► The curing kinetics of the ETRS and the novel ITRS were comparatively studied. ► Crosslinking density can affect the kinetic schemes of the two curing systems. ► Their mechanical properties and thermal stabilities were also comparatively studied. ► Crosslinking density may play an influential role in mechanical properties. - Abstract: The curing kinetics of an extrinsic toughened epoxy (a mixture of diglycidyl ether of bisphenol-A and 1,4-butanediol epoxy resin, DGEBA/DGEBD) and an intrinsic toughened epoxy (ethoxylated bisphenol-A epoxy resin with two oxyethylene units, DGEBAEO-2), using hexahydrophthalic anhydride (HHPA) as curing agent and tris-(dimethylaminomethyl) phenol (DMP-30) as accelerator, were comparatively studied by non-isothermal DSC with the model-fitting Málek approach and the model-free advanced isoconversional method of Vyazovkin. The dynamic mechanical properties and thermal stabilities of the cured materials were investigated by DMTA and TGA, respectively. The results showed that the Šesták–Berggren model can generally simulate the reaction rates of these two systems well. The activation energy of DGEBA/DGEBD/HHPA/DMP-30 at high fractional conversion increased much more than that of DGEBAEO-2/HHPA/DMP-30, indicating that the increased steric hindrance mainly affected the reaction kinetic scheme of DGEBA/DGEBD/HHPA/DMP-30. The Tg and storage moduli of cured DGEBAEO-2/HHPA/DMP-30 were lower than those of cured DGEBA/DGEBD/HHPA/DMP-30 according to DMTA, while TGA showed that the thermal stabilities of these two cured systems were similar.

  6. The prediction of intelligence in preschool children using alternative models to regression.

    Science.gov (United States)

    Finch, W Holmes; Chang, Mei; Davis, Andrew S; Holden, Jocelyn E; Rothlisberg, Barbara A; McIntosh, David E

    2011-12-01

    Statistical prediction of an outcome variable using multiple independent variables is a common practice in the social and behavioral sciences. For example, neuropsychologists are sometimes called upon to provide predictions of preinjury cognitive functioning for individuals who have suffered a traumatic brain injury. Typically, these predictions are made using standard multiple linear regression models with several demographic variables (e.g., gender, ethnicity, education level) as predictors. Prior research has shown conflicting evidence regarding the ability of such models to provide accurate predictions of outcome variables such as full-scale intelligence (FSIQ) test scores. The present study had two goals: (1) to demonstrate the utility of a set of alternative prediction methods that have been applied extensively in the natural sciences and business but have not been frequently explored in the social sciences and (2) to develop models that can be used to predict premorbid cognitive functioning in preschool children. Predictions of Stanford-Binet 5 FSIQ scores for preschool-aged children are used to compare the performance of a multiple regression model with several of these alternative methods. Results demonstrate that classification and regression trees provided more accurate predictions of FSIQ scores than the more traditional regression approach. Implications of these results are discussed.
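
    To make the comparison concrete, here is a small sketch contrasting cross-validated prediction error for ordinary linear regression and a regression tree (CART) on synthetic data; the predictors and the simulated FSIQ scores are hypothetical stand-ins for the demographic variables and Stanford-Binet 5 data used in the study.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 500
# Hypothetical demographic predictors (e.g., coded gender, ethnicity,
# parental education); the Stanford-Binet 5 data themselves are not public.
X = rng.normal(size=(n, 3))
fsiq = 100 + 6 * X[:, 0] + np.where(X[:, 1] > 0, 5, -5) + rng.normal(scale=10, size=n)

for name, model in [("linear regression", LinearRegression()),
                    ("regression tree", DecisionTreeRegressor(max_depth=4, random_state=0))]:
    rmse = -cross_val_score(model, X, fsiq, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: cross-validated RMSE = {rmse:.1f}")
```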

  7. A primer for biomedical scientists on how to execute model II linear regression analysis.

    Science.gov (United States)

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific purpose computer programs; and (iii) general purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Using specific purpose computer programs, the freeware computer program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. When using general purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis and I give step-by-step instructions in the Supplementary Information as to how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
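
    A brief sketch of ordinary least products (geometric mean) regression with bootstrapped 95% confidence intervals, consistent with the point above that spreadsheet-derived OLP confidence intervals are unreliable and resampling is preferable; the data are simulated with error in x, and the bootstrap settings are illustrative assumptions.

```python
import numpy as np

def olp_fit(x, y):
    """Ordinary least products (geometric mean) regression: slope = sign(r) * sd(y)/sd(x)."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    intercept = np.mean(y) - slope * np.mean(x)
    return slope, intercept

def bootstrap_ci(x, y, n_boot=2000, seed=0):
    """Percentile bootstrap confidence intervals for the OLP slope and intercept."""
    rng = np.random.default_rng(seed)
    slopes, intercepts = [], []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), len(x))
        s, a = olp_fit(x[idx], y[idx])
        slopes.append(s)
        intercepts.append(a)
    return np.percentile(slopes, [2.5, 97.5]), np.percentile(intercepts, [2.5, 97.5])

rng = np.random.default_rng(5)
x_true = rng.uniform(0, 10, 80)
x = x_true + rng.normal(scale=0.5, size=80)   # x measured with error (Model II setting)
y = 2.0 + 1.3 * x_true + rng.normal(scale=0.5, size=80)
slope, intercept = olp_fit(x, y)
print("OLP slope, intercept:", slope, intercept)
print("95% CI (slope), 95% CI (intercept):", *bootstrap_ci(x, y))
```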

  8. Biocontrol of Listeria monocytogenes in a meat model using a combination of a bacteriocinogenic strain with curing additives.

    Science.gov (United States)

    Orihuel, Alejandra; Bonacina, Julieta; Vildoza, María José; Bru, Elena; Vignolo, Graciela; Saavedra, Lucila; Fadda, Silvina

    2018-05-01

    The aim of this work was to evaluate the effect of meat curing agents on the bioprotective activity of the bacteriocinogenic strain Enterococcus (E.) mundtii CRL35 against Listeria (L.) monocytogenes during meat fermentation. The ability of E. mundtii CRL35 to grow, acidify and produce bacteriocin in situ was assayed in a meat model system in the presence of curing additives (CA). E. mundtii CRL35 showed optimal growth and acidification rates in the presence of CA. More importantly, the highest bacteriocin titer was achieved in the presence of these food agents. In addition, the CA produced a statistically significant enhancement of the enterocin CRL35 activity. This positive effect was demonstrated in vitro in a meat-based culture medium, by time-kill kinetics, and finally by using a beaker sausage model in a challenge experiment with the pathogenic L. monocytogenes FBUNT strain. E. mundtii CRL35 was found to be a promising strain for use as a safety adjunct culture in the meat industry and as a novel functional supplement for sausage fermentation, ensuring the hygiene and quality of the final product. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Radiation curing - twenty five years on

    International Nuclear Information System (INIS)

    Garnett, J.L.

    1995-01-01

    Progress in UV/EB curing during the past twenty-five years is briefly reviewed. During this time, developments in unique polymer chemistry, novel equipment design and the introduction of relevant educational programmes have enabled radiation curing to become an established technology with specific strengths in certain industries. Possible reasons for the emergence of the technology in these niche markets are discussed. Despite the worldwide recession, radiation curing is shown to be expanding at 5% per annum, with the prospect of higher growth with improving economic conditions. (Author)

  10. Applied logistic regression

    CERN Document Server

    Hosmer, David W; Sturdivant, Rodney X

    2013-01-01

     A new edition of the definitive guide to logistic regression modeling for health science and other applications This thoroughly expanded Third Edition provides an easily accessible introduction to the logistic regression (LR) model and highlights the power of this model by examining the relationship between a dichotomous outcome and a set of covariables. Applied Logistic Regression, Third Edition emphasizes applications in the health sciences and handpicks topics that best suit the use of modern statistical software. The book provides readers with state-of-

  11. Silicone rubber curing by high intensity infrared radiation

    International Nuclear Information System (INIS)

    Huang, T.; Tsai, J.; Cherng, C.; Chen, J.

    1994-01-01

    A high-intensity (12 kW) and compact (80 cm) infrared heating oven for the fast curing (12 seconds) of tube-like silicone rubber is reported. Quality inspection by DSC and DMA and results from a pilot-scale curing oven all suggest that infrared heating provides a better route to vulcanization than conventional hot-air heating with respect to curing time, quality, cost, and space requirements. copyright 1995 American Institute of Physics

  12. Evaluation of Logistic Regression and Multivariate Adaptive Regression Spline Models for Groundwater Potential Mapping Using R and GIS

    Directory of Open Access Journals (Sweden)

    Soyoung Park

    2017-07-01

    Full Text Available This study mapped and analyzed groundwater potential using two different models, logistic regression (LR) and multivariate adaptive regression splines (MARS), and compared the results. A spatial database was constructed for groundwater well data and groundwater influence factors. Groundwater well data with a high potential yield of ≥70 m3/d were extracted, and 859 locations (70%) were used for model training, whereas the other 365 locations (30%) were used for model validation. We analyzed 16 groundwater influence factors including altitude, slope degree, slope aspect, plan curvature, profile curvature, topographic wetness index, stream power index, sediment transport index, distance from drainage, drainage density, lithology, distance from fault, fault density, distance from lineament, lineament density, and land cover. Groundwater potential maps (GPMs) were constructed using the LR and MARS models and tested using a receiver operating characteristics curve. Based on this analysis, the area under the curve (AUC) for the success rate curve of the GPMs created using the MARS and LR models was 0.867 and 0.838, and the AUC for the prediction rate curve was 0.836 and 0.801, respectively. This implies that the MARS model is useful and effective for groundwater potential analysis in the study area.
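
    The logistic regression half of the comparison above can be sketched as follows, with a 70/30 training/validation split and AUC computed for both subsets (the success-rate and prediction-rate measures in the abstract). The influence factors here are random placeholders; a MARS fit would additionally require a third-party package and is not shown.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 1224                               # 859 training + 365 validation locations
# Hypothetical influence factors standing in for altitude, slope, TWI, etc.
X = rng.normal(size=(n, 6))
p = 1 / (1 + np.exp(-(0.8 * X[:, 0] - 1.1 * X[:, 2] + 0.5 * X[:, 4])))
high_yield = rng.binomial(1, p)        # 1 = well with yield >= 70 m3/d

X_tr, X_va, y_tr, y_va = train_test_split(X, high_yield, test_size=0.3, random_state=0)
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("success-rate AUC:", roc_auc_score(y_tr, lr.predict_proba(X_tr)[:, 1]))
print("prediction-rate AUC:", roc_auc_score(y_va, lr.predict_proba(X_va)[:, 1]))
```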

  13. Sample size calculation to externally validate scoring systems based on logistic regression models.

    Directory of Open Access Journals (Sweden)

    Antonio Palazón-Bru

    Full Text Available A sample size containing at least 100 events and 100 non-events has been suggested for validating a predictive model, regardless of the model being validated, even though certain factors (discrimination, parameterization and incidence) can influence the calibration of the predictive model. Scoring systems based on binary logistic regression models are a specific type of predictive model. The aim of this study was to develop an algorithm to determine the sample size for validating a scoring system based on a binary logistic regression model and to apply it to a case study. The algorithm was based on bootstrap samples in which the area under the ROC curve, the observed event probabilities through smooth curves, and a measure of the lack of calibration (the estimated calibration index) were calculated. To illustrate its use for interested researchers, the algorithm was applied to a scoring system, based on a binary logistic regression model, for determining mortality in intensive care units. In the case study provided, the algorithm yielded a sample size with 69 events, which is lower than the value suggested in the literature. An algorithm is provided for finding the appropriate sample size to validate scoring systems based on binary logistic regression models. This could be applied to determine the sample size in other similar cases.
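
    A simplified, hypothetical version of the bootstrap idea described above: for a candidate validation sample size, resample the data and examine the spread of the AUC and of a crude calibration summary. The linear predictor, event model and calibration measure are stand-ins (the published estimated calibration index is not reproduced here), so this only illustrates the mechanics, not the authors' algorithm.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

def simulate_validation(n, n_boot=500, intercept=-2.2, slope=1.0):
    """Bootstrap AUC and a simple calibration summary for a candidate sample size n.

    The 'scoring system' is a hypothetical linear predictor; in practice the
    published score and the event rate of the target population would be used.
    """
    lp = rng.normal(size=n)                          # linear predictor (score)
    p = 1 / (1 + np.exp(-(intercept + slope * lp)))  # assumed true event probability
    events = rng.binomial(1, p)
    aucs, calib_slopes = [], []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        if events[idx].min() == events[idx].max():   # skip degenerate resamples
            continue
        aucs.append(roc_auc_score(events[idx], lp[idx]))
        # crude calibration check: slope of observed outcome on predicted probability
        calib_slopes.append(np.polyfit(p[idx], events[idx], 1)[0])
    return np.percentile(aucs, [2.5, 97.5]), np.percentile(calib_slopes, [2.5, 97.5])

for n in (200, 500, 1000):
    auc_ci, cal_ci = simulate_validation(n)
    print(f"n={n}: AUC 95% CI {auc_ci.round(3)}, calibration slope 95% CI {cal_ci.round(2)}")
```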

  14. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    Science.gov (United States)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistical method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphical presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a CNC turning machine, a Colchester Master Tornado T4, under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed based on this relationship, and the results of the regression model show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.

  15. Effect of curing modes of dual-curing core systems on microtensile bond strength to dentin and formation of an acid-base resistant zone.

    Science.gov (United States)

    Li, Na; Takagaki, Tomohiro; Sadr, Alireza; Waidyasekera, Kanchana; Ikeda, Masaomi; Chen, Jihua; Nikaido, Toru; Tagami, Junji

    2011-12-01

    To evaluate the microtensile bond strength (μTBS) and acid-base resistant zone (ABRZ) of two dual-curing core systems bonded to dentin using four curing modes. Sixty-four caries-free human molars were randomly divided into two groups according to the two dual-curing resin core systems: (1) Clearfil DC Core Automix; (2) Estelite Core Quick. For each core system, four different curing modes were applied to the adhesive and the core resin: (1) dual-cured and dual-cured (DD); (2) chemically cured and dual-cured (CD); (3) dual-cured and chemically cured (DC); (4) chemically cured and chemically cured (CC). The specimens were sectioned into sticks (n = 20 for each group) for the microtensile bond test. μTBS data were analyzed using two-way ANOVA and the Dunnett T3 test. Failure patterns were examined with scanning electron microscopy (SEM) to determine the proportion of each mode. Dentin sandwiches were produced and subjected to an acid-base challenge. After argon-ion etching, the ultrastructure of the ABRZ was observed using SEM. For Clearfil DC Core Automix, the μTBS values in MPa were as follows: DD: 29.1 ± 5.4, CD: 21.6 ± 5.6, DC: 17.9 ± 2.8, CC: 11.5 ± 3.2. For Estelite Core Quick, they were: DD: 48.9 ± 5.7, CD: 20.5 ± 4.7, DC: 41.4 ± 8.3, CC: 19.1 ± 6.0. The bond strength was affected by both material and curing mode, and the interaction of the two factors was significant (p < 0.001). Within both systems, there were significant differences among groups, and the DD group showed the highest μTBS (p < 0.05). ABRZ morphology was not affected by curing mode, but it was highly dependent on the adhesive material. The curing mode of dual-curing core systems affects bond strength to dentin, but has no significant effect on the formation of the ABRZ.

  16. Experimental Study on the Curing Effect of Dredged Sediments with Three Types of Curing Agents

    Directory of Open Access Journals (Sweden)

    Yan Lei-Ming

    2016-01-01

    Full Text Available Sediment solidification technology is widely used to dispose of dredged sediment. In this study, three types of curing agents, JY, ZL and FJ, were used to solidify dredged sediment from shallows in Nantong. The results showed that the optimal additive amounts of these three curing agents were 140 g JY, 16 g ZL and 2.0 g FJ per 1000 g of dredged sediment, respectively, and the corresponding 28-day unconfined compressive strength (UCS) values were up to 2.48 MPa, 2.96 MPa and 3.00 MPa. JY shows an obvious early-strength effect, whereas that of FJ is less pronounced, but the later-stage strength of the sediment solidified by FJ is relatively higher.

  17. Using the Logistic Regression model in supporting decisions of establishing marketing strategies

    Directory of Open Access Journals (Sweden)

    Cristinel CONSTANTIN

    2015-12-01

    Full Text Available This paper presents an instrumental study of the use of the logistic regression model for data analysis in marketing research. Decision makers inside different organisations need relevant information to support their decisions regarding marketing strategies. The data provided by marketing research can be processed in various ways, but multivariate data analysis models can enhance the utility of the information. Among these models is the logistic regression model, which is used for dichotomous variables. Our research explains the utility of this model and the interpretation of the resulting information in order to help practitioners and researchers use it in their future investigations.

  18. Vector regression introduced

    Directory of Open Access Journals (Sweden)

    Mok Tik

    2014-06-01

    Full Text Available This study formulates regression of vector data that will enable statistical analysis of various geodetic phenomena such as polar motion, ocean currents, typhoon/hurricane tracking, crustal deformations, and precursory earthquake signals. The observed vector variable of an event (dependent vector variable) is expressed as a function of a number of hypothesized phenomena realized also as vector variables (independent vector variables) and/or scalar variables that are likely to impact the dependent vector variable. The proposed representation has the unique property of solving the coefficients of independent vector variables (explanatory variables) also as vectors, hence it supersedes multivariate multiple regression models, in which the unknown coefficients are scalar quantities. For the solution, complex numbers are used to represent vector information, and the method of least squares is deployed to estimate the vector model parameters after transforming the complex vector regression model into a real vector regression model through isomorphism. Various operational statistics for testing the predictive significance of the estimated vector parameter coefficients are also derived. A simple numerical example demonstrates the use of the proposed vector regression analysis in modeling typhoon paths.
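
    The complex-number representation mentioned above can be sketched directly with NumPy, since its least-squares solver accepts complex matrices; the vector variables below are synthetic, and the model is the basic complex linear form rather than the full formulation with isomorphism to a real vector regression.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 150

# Independent vector variables encoded as complex numbers (east + i*north).
u = rng.normal(size=n) + 1j * rng.normal(size=n)
v = rng.normal(size=n) + 1j * rng.normal(size=n)

# True complex coefficients rotate and scale the explanatory vectors.
b0, b1, b2 = 0.5 + 0.2j, 1.2 - 0.8j, -0.3 + 0.6j
noise = 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))
w = b0 + b1 * u + b2 * v + noise            # dependent vector variable

# Least-squares solution of the complex linear model w = X b.
X = np.column_stack([np.ones(n, dtype=complex), u, v])
b_hat, *_ = np.linalg.lstsq(X, w, rcond=None)
print("estimated complex coefficients:", np.round(b_hat, 3))
print("coefficient magnitudes (scaling):", np.abs(b_hat).round(3))
print("coefficient angles in degrees (rotation):", np.degrees(np.angle(b_hat)).round(1))
```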

  19. Multiple regression models for energy use in air-conditioned office buildings in different climates

    International Nuclear Information System (INIS)

    Lam, Joseph C.; Wan, Kevin K.W.; Liu Dalong; Tsang, C.L.

    2010-01-01

    An attempt was made to develop multiple regression models for office buildings in the five major climates in China - severe cold, cold, hot summer and cold winter, mild, and hot summer and warm winter. A total of 12 key building design variables were identified through parametric and sensitivity analysis and considered as inputs in the regression models. The coefficient of determination R² varies from 0.89 in Harbin to 0.97 in Kunming, indicating that 89-97% of the variations in annual building energy use can be explained by changes in the 12 parameters. A pseudo-random number generator based on three simple multiplicative congruential generators was employed to generate random designs for evaluation of the regression models. The differences between regression-predicted and DOE-simulated annual building energy use are largely within 10%. It is envisaged that the regression models developed can be used to estimate the likely energy savings/penalty during the initial design stage, when different building schemes and design concepts are being considered.

  20. CICAAR - Convolutive ICA with an Auto-Regressive Inverse Model

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Hansen, Lars Kai

    2004-01-01

    We invoke an auto-regressive IIR inverse model for convolutive ICA and derive expressions for the likelihood and its gradient. We argue that optimization will give a stable inverse. When there are more sensors than sources, the mixing model parameters are estimated in a second step by least squares estimation. We demonstrate the method on synthetic data and finally separate speech and music in a real room recording.

  1. Effect of curing methods, packaging and gamma irradiation on the weight loss and dry matter percent of garlic during curing and storage

    International Nuclear Information System (INIS)

    Mahmoud, A.A.; El-Oksh, I.I.; Farag, S.E.A.

    1988-01-01

    The Egyptian garlic plants showed a higher percentage of weight loss at 17 or 27 days of curing than the Chinese plants. A curing period of 17 days seemed satisfactory for the Egyptian cultivar, whereas 27 days seemed to be sufficient for the Chinese garlic. No significant differences in weight loss percentage were observed between the common and shaded curing methods. The Chinese garlic contained a higher dry matter percentage than the Egyptian cultivar. Shade-cured plants of the two cultivars contained a higher dry matter percentage than those subjected to the common curing method. Irradiation of garlic bulbs, the shaded curing method and sack packaging decreased, in general, the weight loss during storage in comparison with the other treatments.

  2. Two levels ARIMAX and regression models for forecasting time series data with calendar variation effects

    Science.gov (United States)

    Suhartono, Lee, Muhammad Hisyam; Prastyo, Dedy Dwi

    2015-12-01

    The aim of this research is to develop a calendar variation model for forecasting retail sales data with the Eid ul-Fitr effect. The proposed model is based on two methods, namely two-level ARIMAX and regression methods. The two-level ARIMAX and regression models are built by using ARIMAX for the first level and regression for the second level. Monthly men's jeans and women's trousers sales in a retail company for the period January 2002 to September 2009 are used as a case study. In general, the two-level calendar variation model yields two component models, namely a first model that reconstructs the sales pattern that has already occurred, and a second model that forecasts the increase in sales due to Eid ul-Fitr, which affects sales in the same and the previous month. The results show that the proposed two-level calendar variation model based on the ARIMAX and regression methods yields better forecasts than the seasonal ARIMA model and neural networks.
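
    A compressed sketch of the calendar-variation idea, using a single ARIMAX fit (statsmodels SARIMAX with Eid-month dummies as exogenous regressors) rather than the full two-level procedure; the Eid ul-Fitr month positions and the sales series are simulated placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(9)
idx = pd.date_range("2002-01-01", "2009-09-01", freq="MS")
n = len(idx)

# Hypothetical Eid ul-Fitr month indicator (the true dates shift about 11 days per
# year); eid_prev flags the month before, which also carries increased sales.
eid = pd.Series(0, index=idx)
eid.iloc[np.arange(10, n, 12)] = 1          # placeholder positions, not real dates
eid_prev = eid.shift(-1, fill_value=0)

sales = (100 + 0.5 * np.arange(n) + 40 * eid.values + 15 * eid_prev.values
         + rng.normal(scale=5, size=n))
y = pd.Series(sales, index=idx)

# ARIMAX: ARIMA errors plus calendar-variation regressors as exogenous inputs.
exog = pd.concat([eid.rename("eid"), eid_prev.rename("eid_prev")], axis=1)
model = sm.tsa.SARIMAX(y, exog=exog, order=(1, 1, 1)).fit(disp=False)
print(model.params[["eid", "eid_prev"]])    # estimated Eid-month effects
```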

  3. Proceedings of workshop on surface finishing by radiation curing technology: radiation curing for better finishing

    International Nuclear Information System (INIS)

    1993-01-01

    This book compiles the papers presented at this workshop. The papers discussed are: 1. Introduction to radiation curing, 2. Radiation sources - ultraviolet and electron beams, 3. UV/EB curing of surface coatings - wood and nonwood substrates, 4. Development of EPOLA (epoxidised palm oil products acrylate) and its application, 5. Development of radiation-curable resins based on natural rubber

  4. Proceedings of workshop on surface finishing by radiation curing technology: radiation curing for better finishing

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This book compiles the papers presented at this workshop. The papers discussed are: 1. Introduction to radiation curing, 2. Radiation sources - ultraviolet and electron beams, 3. UV/EB curing of surface coatings - wood and nonwood substrates, 4. Development of EPOLA (epoxidised palm oil products acrylate) and its application, 5. Development of radiation-curable resins based on natural rubber.

  5. Testing and Modeling Fuel Regression Rate in a Miniature Hybrid Burner

    Directory of Open Access Journals (Sweden)

    Luciano Fanton

    2012-01-01

    Full Text Available Ballistic characterization of an extended group of innovative HTPB-based solid fuel formulations for hybrid rocket propulsion was performed in a lab-scale burner. An optical time-resolved technique was used to assess the quasisteady regression history of single perforation, cylindrical samples. The effects of metalized additives and radiant heat transfer on the regression rate of such formulations were assessed. Under the investigated operating conditions and based on phenomenological models from the literature, analyses of the collected experimental data show an appreciable influence of the radiant heat flux from burnt gases and soot for both unloaded and loaded fuel formulations. Pure HTPB regression rate data are satisfactorily reproduced, while the impressive initial regression rates of metalized formulations require further assessment.
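
    The phenomenological regression-rate treatment mentioned above is commonly summarised by a power law in oxidizer mass flux; the following sketch fits such a law by log-log least squares to hypothetical data points, not the paper's HTPB measurements.

```python
import numpy as np

# Hypothetical (oxidizer mass flux, regression rate) pairs; the paper's HTPB
# measurements are not reproduced here.
g_ox = np.array([50., 80., 120., 160., 220., 300.])      # kg/(m^2 s)
r_dot = np.array([0.55, 0.75, 0.98, 1.18, 1.45, 1.80])   # mm/s

# Classical power law r = a * Gox^n, linearised as ln r = ln a + n ln Gox.
n_exp, ln_a = np.polyfit(np.log(g_ox), np.log(r_dot), 1)
a = np.exp(ln_a)
print(f"fitted power law: r = {a:.4f} * Gox^{n_exp:.3f}")
print("predicted r at Gox = 200:", a * 200 ** n_exp, "mm/s")
```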

  6. Physical properties of self-curing concrete (SCUC)

    Directory of Open Access Journals (Sweden)

    Magda I. Mousa

    2015-08-01

    The results show that the use of the self-curing agent (Ch.) in concrete effectively improves the physical properties compared with conventional concrete. On the other hand, up to 15% saturated Leca was effective, while 20% saturated Leca was effective for permeability and mass loss but adversely affected the sorptivity and volumetric water absorption. The self-curing agent Ch. was more effective than the self-curing agent Leca. In all cases, both 2% Ch. and 15% Leca were the optimum values. Higher cement content and/or a lower water-cement ratio leads to more effective results of the self-curing agents in concrete. Incorporation of silica fume into the concrete mixtures enhances all physical properties.

  7. Significance of grafting in radiation curing reactions. Comparison of ionising radiation and UV systems

    International Nuclear Information System (INIS)

    Zilic, E.; Ng, L.; Viengkhou, V.; Garnett, J.L.

    1998-01-01

    Full text: Radiation curing is now an accepted commercial technology in which both ionising radiation (electron beam) and ultraviolet light (UV) sources are used. Grafting is essentially the copolymerisation of a monomer/oligomer to a backbone polymer, whereas curing is the rapid polymerisation of a monomer/oligomer mixture onto the surface of the substrate. There is theoretically no time scale associated with grafting processes, which can occur in minutes or hours, whereas curing reactions are usually very rapid, occurring within a fraction of a second. An important difference between grafting and curing is the nature of the bonding occurring in each process. In grafting, covalent carbon-carbon bonds are formed, whereas in curing, bonding usually involves weaker Van der Waals or London dispersion forces. The bonding properties of the systems are important in determining their commercial use. Thus the possibility that concurrent grafting could occur during curing is important, since if present, grafting would not only minimise delamination of the coated product but could also, in some circumstances, make recycling of the finished product difficult, especially if it were cellulosic. Hence the conditions under which concurrent grafting occurs during radiation curing are important. In the present paper, this problem has been studied by examining the effect that the components used in radiation curing exert on a typical reaction. Instead of electron beam sources, the spent fuel element facility at Lucas Heights is used to simulate such ionising radiation sources. The model system utilised is the grafting of a typical methacrylate to cellulose. This is the generic chemistry used in curing systems. The effects of typical additives from curing systems, including polyfunctional monomers and oligomers, on the grafting reactions have been studied. The ionising radiation results have been compared with analogous data from UV experiments. The significance

  8. Two-step variable selection in quantile regression models

    Directory of Open Access Journals (Sweden)

    FAN Yali

    2015-06-01

    Full Text Available We propose a two-step variable selection procedure for high-dimensional quantile regression, in which the dimension of the covariates, pn, is much larger than the sample size n. In the first step, we apply an ℓ1 penalty, and we demonstrate that the first-step penalized estimator with the LASSO penalty can reduce the model from ultra-high dimensional to a model whose size has the same order as that of the true model, and the selected model can cover the true model. The second step excludes the remaining irrelevant covariates by applying the adaptive LASSO penalty to the reduced model obtained from the first step. Under some regularity conditions, we show that our procedure enjoys model selection consistency. We conduct a simulation study and a real data analysis to evaluate the finite sample performance of the proposed approach.
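
    A rough sketch of the two-step idea under stated assumptions: scikit-learn's L1-penalised QuantileRegressor is used for the first-step screening, and an adaptive-LASSO-style reweighting of the retained columns approximates the second step. The penalty levels and data are illustrative, and this is not the authors' estimator or asymptotic setting.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(10)
n, p = 100, 200                      # more covariates than observations
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_t(df=3, size=n)   # heavy-tailed noise

tau = 0.5                            # quantile of interest

# Step 1: L1-penalised quantile regression screens the ultra-high-dimensional model.
step1 = QuantileRegressor(quantile=tau, alpha=0.05, solver="highs").fit(X, y)
keep = np.flatnonzero(np.abs(step1.coef_) > 1e-8)

# Step 2: adaptive-LASSO-style refit on the reduced model, weighting each kept
# column by |beta_hat| from step 1 (larger first-step coefficients are penalised less).
w = np.abs(step1.coef_[keep])
step2 = QuantileRegressor(quantile=tau, alpha=0.05, solver="highs").fit(X[:, keep] * w, y)
beta_final = np.zeros(p)
beta_final[keep] = step2.coef_ * w

print("variables kept after step 1:", keep)
print("nonzero after step 2:", np.flatnonzero(np.abs(beta_final) > 1e-8))
```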

  9. Regression Models for Predicting Force Coefficients of Aerofoils

    Directory of Open Access Journals (Sweden)

    Mohammed ABDUL AKBAR

    2015-09-01

    Full Text Available Renewable sources of energy are attractive and advantageous in a lot of different ways. Among the renewable energy sources, wind energy is the fastest growing type. Among wind energy converters, vertical axis wind turbines (VAWTs) have received renewed interest in the past decade due to some of the advantages they possess over their horizontal axis counterparts. VAWTs have evolved into complex 3-D shapes. A key component in predicting the output of VAWTs through analytical studies is obtaining the values of the lift and drag coefficients, which are functions of the shape of the aerofoil, the angle of attack of the wind and the Reynolds number of the flow. Sandia National Laboratories have carried out extensive experiments on aerofoils for Reynolds numbers in the range of those experienced by VAWTs. The volume of experimental data thus obtained is huge. The current paper discusses three regression analysis models developed wherein lift and drag coefficients can be found using simple formulas without having to deal with the bulk of the data. Drag coefficients and lift coefficients were successfully estimated by regression models with R² values as high as 0.98.

  10. The Relationship between Economic Growth and Money Laundering – a Linear Regression Model

    Directory of Open Access Journals (Sweden)

    Daniel Rece

    2009-09-01

    Full Text Available This study provides an overview of the relationship between economic growth and money laundering modeled by a least squares function. The report statistically analyzes data collected from the USA, Russia, Romania and eleven other European countries, rendering a linear regression model. The study illustrates that 23.7% of the total variance in the regressand (the level of money laundering) is "explained" by the linear regression model. In our opinion, this model will provide critical auxiliary judgment and decision support for anti-money laundering service systems.

  11. Regression analysis of informative current status data with the additive hazards model.

    Science.gov (United States)

    Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo

    2015-04-01

    This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models if the censoring is noninformative, and also there exists a large literature on parametric analysis of informative current status data in the context of tumorgenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented and in the method, the copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.

  12. Poisson regression approach for modeling fatal injury rates amongst Malaysian workers

    International Nuclear Information System (INIS)

    Kamarulzaman Ibrahim; Heng Khai Theng

    2005-01-01

    Many safety studies are based on the analysis of injury surveillance data. The injury surveillance data gathered for the analysis include information on the number of employees at risk of injury in each of several strata, where the strata are defined in terms of a series of important predictor variables. Further insight into the relationship between fatal injury rates and predictor variables may be obtained by the Poisson regression approach. Poisson regression is widely used in analyzing count data. In this study, Poisson regression is used to model the relationship between fatal injury rates and predictor variables, which are year (1995-2002), gender, recording system and industry type. Data for the analysis were obtained from PERKESO and Jabatan Perangkaan Malaysia. It is found that the assumption that the data follow a Poisson distribution is violated. After correction for the problem of overdispersion, the predictor variables found to be significant in the model are gender, system of recording, industry type, and two interaction effects (the interaction between recording system and industry type and that between year and industry type).
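
    A minimal sketch of the modelling route described above: a Poisson GLM with the number of workers as exposure, a Pearson chi-square check for overdispersion, and a negative binomial refit. The strata, rates and extra variation below are simulated assumptions, since the PERKESO data are not public, and the negative binomial dispersion parameter is fixed for simplicity rather than estimated.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
# Hypothetical strata; the PERKESO injury counts themselves are not public here.
df = pd.DataFrame({
    "year": rng.integers(1995, 2003, 400),
    "gender": rng.choice(["male", "female"], 400),
    "industry": rng.choice(["manufacturing", "construction", "services"], 400),
    "workers": rng.integers(500, 5000, 400),        # employees at risk in the stratum
})
rate = 0.002 * np.where(df.industry == "construction", 2.5, 1.0) \
             * np.where(df.gender == "male", 1.6, 1.0)
# Gamma multiplier injects extra-Poisson variation (overdispersion).
df["deaths"] = rng.poisson(rate * df.workers * rng.gamma(5, 0.2, 400))

pois = smf.glm("deaths ~ C(year) + gender + industry", data=df,
               family=sm.families.Poisson(), exposure=df.workers).fit()
print("Pearson chi2 / df:", pois.pearson_chi2 / pois.df_resid)   # >> 1 suggests overdispersion

negbin = smf.glm("deaths ~ C(year) + gender + industry", data=df,
                 family=sm.families.NegativeBinomial(alpha=1.0),
                 exposure=df.workers).fit()
print(negbin.summary().tables[1])
```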

  13. Accounting for Zero Inflation of Mussel Parasite Counts Using Discrete Regression Models

    Directory of Open Access Journals (Sweden)

    Emel Çankaya

    2017-06-01

    Full Text Available In many ecological applications, the absence of a species is inevitable, due either to detection faults in samples or to conditions uninhabitable for its existence, resulting in a high number of zero counts or abundances. The usual practice for modelling such data is regression modelling of log(abundance+1), and it is well known that the resulting model is inadequate for prediction purposes. New discrete models accounting for zero abundances, namely zero-inflated regression (ZIP and ZINB), Hurdle-Poisson (HP) and Hurdle-Negative Binomial (HNB), amongst others, are widely preferred to the classical regression models. Because mussels are one of the economically most important aquatic products of Turkey, the purpose of this study is to examine the performances of these four models in determining the significant biotic and abiotic factors affecting the occurrence of the Nematopsis legeri parasite harming the Mediterranean mussel (Mytilus galloprovincialis L.). The data collected from three coastal regions of Sinop city in Turkey showed that more than 50% of the parasite counts are zero on average, and model comparisons were based on information criteria. The results showed that the probability of the occurrence of this parasite is best described here by the ZINB or HNB models, and the influential factors of the models were found to correspond to the ecological differences among the regions.
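
    To illustrate the zero-inflation comparison, the sketch below simulates counts with structural zeros and compares a plain Poisson fit with a zero-inflated Poisson fit by AIC, using statsmodels; the covariates are hypothetical, and the hurdle variants discussed in the abstract are not shown.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(12)
n = 400
# Hypothetical biotic/abiotic covariates (e.g., shell length, temperature).
X = sm.add_constant(rng.normal(size=(n, 2)))

# Simulate zero-inflated counts: a structural-zero process plus a Poisson process.
p_zero = 1 / (1 + np.exp(-(0.4 - 1.0 * X[:, 1])))   # probability of a structural zero
mu = np.exp(0.8 + 0.6 * X[:, 2])
counts = np.where(rng.uniform(size=n) < p_zero, 0, rng.poisson(mu))
print("fraction of zeros:", np.mean(counts == 0))

pois = sm.Poisson(counts, X).fit(disp=False)
zip_ = ZeroInflatedPoisson(counts, X, exog_infl=X, inflation="logit").fit(disp=False, maxiter=200)
print("AIC Poisson:", pois.aic, " AIC ZIP:", zip_.aic)   # lower AIC indicates the better fit
```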

  14. Prediction of unwanted pregnancies using logistic regression, probit regression and discriminant analysis.

    Science.gov (United States)

    Ebrahimzadeh, Farzad; Hajizadeh, Ebrahim; Vahabi, Nasim; Almasian, Mohammad; Bakhteyar, Katayoon

    2015-01-01

    Unwanted pregnancy, that is, pregnancy not intended by at least one of the parents, has undesirable consequences for the family and society. In the present study, three classification models were used and compared to predict unwanted pregnancies in an urban population. In this cross-sectional study, 887 pregnant mothers attending health centers in Khorramabad, Iran, in 2012 were selected by stratified and cluster sampling; the relevant variables were measured, and logistic regression, discriminant analysis and probit regression models, implemented in SPSS software version 21, were used for the prediction of unwanted pregnancy. To compare these models, indicators such as sensitivity, specificity, the area under the ROC curve, and the percentage of correct predictions were used. The prevalence of unwanted pregnancies was 25.3%. The logistic and probit regression models indicated that parity, pregnancy spacing, contraceptive methods, household income and the number of living male children were related to unwanted pregnancy. The performance of the models based on the area under the ROC curve was 0.735, 0.733, and 0.680 for logistic regression, probit regression, and linear discriminant analysis, respectively. Given the relatively high prevalence of unwanted pregnancies in Khorramabad, it seems necessary to revise family planning programs. Despite the similar accuracy of the models, if the researcher is interested in the interpretability of the results, the use of the logistic regression model is recommended.

  15. Transpiration of glasshouse rose crops: evaluation of regression models

    NARCIS (Netherlands)

    Baas, R.; Rijssel, van E.

    2006-01-01

    Regression models of transpiration (T) based on global radiation inside the greenhouse (G), with or without energy input from heating pipes (Eh) and/or vapor pressure deficit (VPD) were parameterized. Therefore, data on T, G, temperatures from air, canopy and heating pipes, and VPD from both a

  16. Electron beam curing of coating

    International Nuclear Information System (INIS)

    Fujioka, S.; Fujikawa, Z.

    1974-01-01

    The electron beam curing (EBC) method, in which a hardened coating film is obtained by polymerizing and cross-linking paint with an electron beam, has finally reached the stage of industrialization. About seven advantages of the EBC method are enumerated, such as short curing time, high efficiency of energy consumption, and homogeneous curing, but its limitations include the requirement for isolation from air, which necessitates the injection of an inert gas, and a considerable initial investment. In the electron accelerators employed in the EBC method, the accelerating voltage is 250 to 750 kV and the tube current is several tens of mA to 200 mA. As an example of EBC applications, the EBC ''Erio'' steel sheet was developed through the cooperative research of Nippon Steel Corp., Dai-Nippon Printing Co. and Toray Industries, Inc. It is a high-grade pre-coated metal product made from galvanized steel sheets; the flat sheets with cured coating are sold, and final products are fabricated by the users, who work the sheets into various shapes. It seems necessary to develop paints that make it possible to raise the added value by adopting the EBC method. (Wakatsuki, Y.)

  17. Accelerated production of dry cured hams.

    Science.gov (United States)

    Marriott, N G; Graham, P P; Shaffer, C K; Phelps, S K

    1987-01-01

    Ten uncured legs from the right side of the sampled pork carcasses (Study A) were vacuum tumbled with the cure adjuncts for 30 min (T) and 10 counterparts from the left side were tumbled 30 min, rested 30 min and tumbled an additional 30 min (TRT). Evaluations were conducted at 40 and 70 days after cure application for color, taste attributes, percentage moisture, percentage salt and NO(3)(-) and NO(2)(-) content. Study B was the same except that 18 legs were boned, tumbled and cured for 40, 56 and 70 days. The TRT samples (Study A) at 40 days sustained less color fading (P 0.05) existed among the uncooked hams. Increased cure time enhanced moisture loss and salt content (Study A) and color retention during cookery (Study B). The TRT samples had increased moisture loss and salt content (Study A). Copyright © 1987. Published by Elsevier Ltd.

  18. Rubber curing chemistry governing the orientation of layered silicate

    Directory of Open Access Journals (Sweden)

    2007-11-01

    Full Text Available The effect of curing systems on the orientation and the dispersion of layered silicates in an acrylonitrile butadiene rubber nanocomposite is reported. Significant differences in the X-ray diffraction patterns between peroxide curing and sulfur curing were observed. Intense X-ray scattering values in the XRD experiments on the peroxide-cured vulcanizates indicate an orientation of the layers in a preferred direction, as evinced by transmission electron micrographs. However, the sulfur-cured vulcanizates show no preferential orientation of the silicate particles. Nevertheless, a closer inspection of transmission electron microscopy (TEM) images of the peroxide- and sulfur-cured samples shows exfoliated silicate layers in the acrylonitrile butadiene rubber (NBR) matrix. It was revealed in the present study that the use of an excess amount of stearic acid in the formulation of the sulfur curing package leads to an almost exfoliated-type X-ray scattering pattern.

  19. Curing kinetics of alkyd/melamine resin mixtures

    Directory of Open Access Journals (Sweden)

    Jovičić Mirjana C.

    2009-01-01

    Full Text Available Alkyd resins are the most popular and useful synthetic resins applied as binders in protective coatings. Frequently they are not used alone but are modified with other synthetic resins in the manufacture of the coatings. An alkyd/melamine resin mixture is the usual composition for the preparation of the coating called 'baking enamel', and it is cured through the functional groups of the resins at high temperatures. In this paper, the curing kinetics of alkyd resins based on castor oil and dehydrated castor oil with melamine resin has been studied by the DSC method with programmed heating and in isothermal mode. The results determined from the dynamic DSC curves were mathematically transformed using the Ozawa isoconversional method to obtain the isothermal data. These results, degree of curing versus time, are in good agreement with those determined by the isothermal DSC experiments. By applying the Ozawa method it is possible to calculate the isothermal kinetic parameters for the curing of the alkyd/melamine resin mixtures using only calorimetric data obtained by dynamic DSC runs. Depending on the alkyd resin type and the ratio in the mixtures, the activation energies of the curing process of the resin mixtures range from 51.3 to 114 kJ mol-1. The rate constant of curing increases with increasing content of melamine resin in the mixture and with curing temperature. The reaction order varies from 1.12 to 1.37 for the dehydrated castor oil alkyd/melamine resin mixtures and from 1.74 to 2.03 for the mixtures with the alkyd based on castor oil. Based on the results obtained, we propose that dehydrated castor oil alkyd/melamine resin mixtures can be used in practice (curing temperatures from 120 to 160°C).
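
    The Ozawa isoconversional step mentioned above can be sketched in a few lines: at a fixed conversion, log10 of the heating rate is regressed on the reciprocal of the temperature at which that conversion is reached, and the apparent activation energy follows from the slope. The heating rates and temperatures below are invented for illustration, not the paper's DSC data.

```python
import numpy as np

R = 8.314  # J/(mol K)

# Hypothetical DSC data: temperatures (K) at which 50% conversion is reached
# for several heating rates beta (K/min); the paper's DSC curves are not reproduced.
beta = np.array([5.0, 10.0, 15.0, 20.0])          # heating rates
T_alpha = np.array([438.0, 448.0, 454.0, 459.0])  # T at alpha = 0.5

# Ozawa(-Flynn-Wall): log10(beta) = const - 0.4567 * Ea / (R * T_alpha)
slope, _ = np.polyfit(1.0 / T_alpha, np.log10(beta), 1)
Ea = -slope * R / 0.4567
print(f"apparent activation energy at alpha = 0.5: {Ea / 1000:.1f} kJ/mol")
```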

  20. Convergent Time-Varying Regression Models for Data Streams: Tracking Concept Drift by the Recursive Parzen-Based Generalized Regression Neural Networks.

    Science.gov (United States)

    Duda, Piotr; Jaworski, Maciej; Rutkowski, Leszek

    2018-03-01

    One of the greatest challenges in data mining is related to processing and analysis of massive data streams. Contrary to traditional static data mining problems, data streams require that each element is processed only once, the amount of allocated memory is constant and the models incorporate changes of the investigated streams. A vast majority of available methods have been developed for data stream classification and only a few of them attempted to solve regression problems, using various heuristic approaches. In this paper, we develop mathematically justified regression models working in a time-varying environment. More specifically, we study incremental versions of generalized regression neural networks, called IGRNNs, and we prove their tracking properties - weak (in probability) and strong (with probability one) convergence assuming various concept drift scenarios. First, we present the IGRNNs, based on the Parzen kernels, for modeling stationary systems under nonstationary noise. Next, we extend our approach to modeling time-varying systems under nonstationary noise. We present several types of concept drifts to be handled by our approach in such a way that weak and strong convergence holds under certain conditions. In a series of simulations, we compare our method with commonly used heuristic approaches, based on forgetting mechanisms or sliding windows, to deal with concept drift. Finally, we apply our concept in a real-life scenario, solving the problem of currency exchange rate prediction.
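
    As a rough, simplified illustration of a Parzen-kernel regression estimator that adapts to concept drift, the sketch below implements an incremental Nadaraya-Watson (GRNN-style) estimator with an exponential forgetting factor; this is an assumption-laden stand-in for the IGRNN procedure and carries none of the paper's convergence guarantees.

```python
import numpy as np

class IncrementalKernelRegressor:
    """Simplified GRNN-style (Nadaraya-Watson) estimator updated one sample at a time.

    An exponential forgetting factor down-weights old samples so the estimate can
    track concept drift; this is a stand-in for, not a reproduction of, the IGRNN.
    """
    def __init__(self, bandwidth=0.3, forgetting=0.995):
        self.h, self.lam = bandwidth, forgetting
        self.xs, self.ys, self.ws = [], [], []

    def update(self, x, y):
        self.ws = [w * self.lam for w in self.ws]   # decay weights of old observations
        self.xs.append(x)
        self.ys.append(y)
        self.ws.append(1.0)

    def predict(self, x):
        xs, ys, ws = map(np.asarray, (self.xs, self.ys, self.ws))
        k = ws * np.exp(-0.5 * ((x - xs) / self.h) ** 2)   # Gaussian (Parzen) kernel
        return float(np.sum(k * ys) / np.sum(k))

rng = np.random.default_rng(13)
model = IncrementalKernelRegressor()
for t in range(2000):
    x = rng.uniform(-1, 1)
    drift = 0.0 if t < 1000 else 2.0                 # abrupt concept drift at t = 1000
    y = np.sin(3 * x) + drift + rng.normal(scale=0.1)
    model.update(x, y)
print("estimate at x=0 after drift:", model.predict(0.0))   # tracks the shifted level near 2.0
```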

  1. How does duration of curing affect the radiopacity of dental materials?

    Energy Technology Data Exchange (ETDEWEB)

    Bejeh Mir, Arash Poorsattar [School of Dentistry, Babol University of Medical Sciences, Babol (Iran, Islamic Republic of); Bejeh Mir, Morvarid Poorsattar [Private Practice of Orthodontics, Montreal (Canada)

    2012-06-15

Clinicians commonly encounter cases in which it is difficult to determine whether adjacent radiopacities are normal or pathologic. The ideal radiopacity of composite resin is equal to or higher than that of the same thickness of aluminum. We aimed to investigate the possible effects of different curing times on the post-24-hour radiopacity of composite resins on digital radiographs. One mm thick samples of Filtek P60 and Clearfil resin composites were prepared and cured with three regimens of continuous 400 mW/cm² irradiance for 10, 20 and 30 seconds. Along with a 12-step aluminum step wedge, digital radiographs were captured and the radiopacities were transformed to the equivalent aluminum thicknesses. Data were compared by a general linear model and repeated-measures ANOVA. Overall, the calculated equivalent aluminum thicknesses of composite resins were increased significantly by doubling and tripling the curing times (F(2,8)=8.94, p=0.002). Notably, Bonferroni post-hoc tests confirmed that the radiopacity of the cured Filtek P60 was significantly higher at 30 seconds compared with 10 seconds (p=0.04). Although the higher radiopacity was observed by increasing the time, other comparisons showed no statistical significance (p>0.05). These results supported the hypothesis that the radiopacity of resin composites might be related to the duration of light curing. In addition to the current standards for radiopacity of digital images, defining a standard protocol for curing of dental materials should be considered, and it is suggested that they should be added to the current requirements for dental material.
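    The conversion from radiographic gray value to equivalent aluminum thickness via a step wedge can be sketched as a simple monotone interpolation; the gray values below are invented for illustration and are not the study's measurements.

```python
import numpy as np

# Mean gray values measured over each step of a 12-step aluminum wedge
# (1 mm increments) on the same digital radiograph -- illustrative numbers.
al_thickness = np.arange(1, 13)                         # mm
wedge_gray = np.array([52, 70, 86, 100, 112, 123, 133,
                       142, 150, 157, 163, 168], float)

def equivalent_al(gray_value):
    """Interpolate a specimen's mean gray value onto the wedge calibration."""
    return np.interp(gray_value, wedge_gray, al_thickness)

print(equivalent_al(118.0))   # e.g. a 1 mm composite specimen's gray value
```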

  2. Development of a fast curing tissue adhesive for meniscus tear repair.

    Science.gov (United States)

    Bochyńska, Agnieszka Izabela; Hannink, Gerjon; Janssen, Dennis; Buma, Pieter; Grijpma, Dirk W

    2017-01-01

    Isocyanate-terminated adhesive amphiphilic block copolymers are attractive materials to treat meniscus tears due to their tuneable mechanical properties and good adhesive characteristics. However, a drawback of this class of materials is their relatively long curing time. In this study, we evaluate the use of an amine cross-linker and addition of catalysts as two strategies to accelerate the curing rates of a recently developed biodegradable reactive isocyanate-terminated hyper-branched adhesive block copolymer prepared from polyethylene glycol (PEG), trimethylene carbonate, citric acid and hexamethylene diisocyanate. The curing kinetics of the hyper-branched adhesive alone and in combination with different concentrations of spermidine solutions, and after addition of 2,2-dimorpholinodiethylether (DMDEE) or 1,4-diazabicyclo [2.2.2] octane (DABCO) were determined using FTIR. Additionally, lap-shear adhesion tests using all compositions at various time points were performed. The two most promising compositions of the fast curing adhesives were evaluated in a meniscus bucket handle lesion model and their performance was compared with that of fibrin glue. The results showed that addition of both spermidine and catalysts to the adhesive copolymer can accelerate the curing rate and that firm adhesion can already be achieved after 2 h. The adhesive strength to meniscus tissue of 3.2-3.7 N was considerably higher for the newly developed compositions than for fibrin glue (0.3 N). The proposed combination of an adhesive component and a cross-linking component or catalyst is a promising way to accelerate curing rates of isocyanate-terminated tissue adhesives.

  3. How does duration of curing affect the radiopacity of dental materials?

    International Nuclear Information System (INIS)

    Bejeh Mir, Arash Poorsattar; Bejeh Mir, Morvarid Poorsattar

    2012-01-01

Clinicians commonly encounter cases in which it is difficult to determine whether adjacent radiopacities are normal or pathologic. The ideal radiopacity of composite resin is equal to or higher than that of the same thickness of aluminum. We aimed to investigate the possible effects of different curing times on the post-24-hour radiopacity of composite resins on digital radiographs. One mm thick samples of Filtek P60 and Clearfil resin composites were prepared and cured with three regimens of continuous 400 mW/cm² irradiance for 10, 20 and 30 seconds. Along with a 12-step aluminum step wedge, digital radiographs were captured and the radiopacities were transformed to the equivalent aluminum thicknesses. Data were compared by a general linear model and repeated-measures ANOVA. Overall, the calculated equivalent aluminum thicknesses of composite resins were increased significantly by doubling and tripling the curing times (F(2,8)=8.94, p=0.002). Notably, Bonferroni post-hoc tests confirmed that the radiopacity of the cured Filtek P60 was significantly higher at 30 seconds compared with 10 seconds (p=0.04). Although the higher radiopacity was observed by increasing the time, other comparisons showed no statistical significance (p>0.05). These results supported the hypothesis that the radiopacity of resin composites might be related to the duration of light curing. In addition to the current standards for radiopacity of digital images, defining a standard protocol for curing of dental materials should be considered, and it is suggested that they should be added to the current requirements for dental material.

  4. Predicting Antitumor Activity of Peptides by Consensus of Regression Models Trained on a Small Data Sample

    Directory of Open Access Journals (Sweden)

    Ivanka Jerić

    2011-11-01

Full Text Available Predicting antitumor activity of compounds using regression models trained on a small number of compounds with measured biological activity is an ill-posed inverse problem. Yet, it occurs very often within the academic community. To counteract, to some extent, the overfitting problems caused by small training data, we propose to use a consensus of six regression models for prediction of biological activity of a virtual library of compounds. The QSAR descriptors of 22 compounds related to the opioid growth factor (OGF, Tyr-Gly-Gly-Phe-Met) with known antitumor activity were used to train the regression models: the feed-forward artificial neural network, the k-nearest neighbor, sparseness-constrained linear regression, and the linear and nonlinear (with polynomial and Gaussian kernels) support vector machines. The regression models were applied to a virtual library of 429 compounds, which resulted in six lists with candidate compounds ranked by predicted antitumor activity. The highly ranked candidate compounds were synthesized, characterized and tested for antiproliferative activity. Some of the prepared peptides showed more pronounced activity compared with the native OGF; however, they were less active than highly ranked compounds selected previously by the radial basis function support vector machine (RBF SVM) regression model. The ill-posedness of the related inverse problem causes unstable behavior of trained regression models on test data. These results point to the high complexity of prediction based on regression models trained on a small data sample.
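    A consensus of heterogeneous regression models can be sketched as below: each model ranks the virtual library and the ranks are averaged. The models, descriptors and data are placeholders, not the paper's QSAR setup, and the model list only approximates the six regressors named in the abstract.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X_train = rng.normal(size=(22, 10))          # 22 compounds, 10 descriptors
y_train = rng.normal(size=22)                # measured activity (placeholder)
X_library = rng.normal(size=(429, 10))       # virtual library descriptors

models = [
    LinearRegression(),
    Lasso(alpha=0.1),                        # sparseness-constrained linear
    KNeighborsRegressor(n_neighbors=3),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
    SVR(kernel="poly", degree=2),
    SVR(kernel="rbf"),
]

# Rank the library by each model's prediction, then average the ranks.
ranks = []
for m in models:
    pred = m.fit(X_train, y_train).predict(X_library)
    ranks.append(np.argsort(np.argsort(-pred)))   # rank 0 = most active
consensus = np.mean(ranks, axis=0)
top10 = np.argsort(consensus)[:10]
print("Top-ranked library compounds:", top10)
```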

  5. A simulation study on Bayesian Ridge regression models for several collinearity levels

    Science.gov (United States)

    Efendi, Achmad; Effrihan

    2017-12-01

When analyzing data with a multiple regression model, if collinearities are present, one or several predictor variables are usually omitted from the model. Sometimes, however, there are reasons, for instance medical or economic ones, why all predictors are important and should be included in the model. Ridge regression is commonly used in such research to cope with collinearity. In this approach, weights for the predictor variables are used in estimating the parameters, and estimation can follow the likelihood concept. A Bayesian version of the estimation is a further alternative. This method has not matched the likelihood approach in popularity because of difficulties such as computation; nevertheless, with recent improvements in computational methodology, this caveat should no longer be a problem. This paper discusses a simulation study for evaluating the characteristics of Bayesian ridge regression parameter estimates. Several simulation settings based on a variety of collinearity levels and sample sizes are considered. The results show that the Bayesian method gives better performance for relatively small sample sizes, while for the other settings it performs similarly to the likelihood method.
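    A minimal simulation in the spirit of the study, assuming scikit-learn: collinear predictors are generated and Bayesian ridge is compared with ordinary least squares at a small sample size (the collinearity level, sample size and true coefficients are arbitrary choices, not the paper's settings).

```python
import numpy as np
from sklearn.linear_model import BayesianRidge, LinearRegression

rng = np.random.default_rng(42)
n, rho = 20, 0.95                       # small sample, strong collinearity
z = rng.normal(size=n)
X = np.column_stack([z + np.sqrt(1 - rho) * rng.normal(size=n)
                     for _ in range(4)])
beta_true = np.array([1.0, -1.0, 0.5, 0.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Compare how far each estimator's coefficients are from the truth.
for model in (LinearRegression(), BayesianRidge()):
    fit = model.fit(X, y)
    err = np.sum((fit.coef_ - beta_true) ** 2)
    print(type(model).__name__, "squared coefficient error:", round(err, 3))
```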

  6. Monitoring Prepregs As They Cure

    Science.gov (United States)

    Young, P. R.; Gleason, J. R.; Chang, A. C.

    1986-01-01

Quality IR spectra obtained in a dynamic heating environment. New technique obtains quality infrared spectra on graphite-fiber-reinforced, polymeric-matrix-resin prepregs as they cure. Technique resulted from modification of the diffuse reflectance/Fourier transform infrared (DR/FTIR) technique previously used to analyze environmentally exposed cured graphite composites. Technique contributes to better understanding of prepreg chemistry/temperature relationships and development of more efficient processing cycles for advanced materials.

  7. Catalyzed Synthesis and Characterization of a Novel Lignin-Based Curing Agent for the Curing of High-Performance Epoxy Resin

    Directory of Open Access Journals (Sweden)

    Saeid Nikafshar

    2017-07-01

    Full Text Available In this study, lignin, an aromatic compound from the forestry industry, was used as a renewable material to synthesize a new aromatic amine curing agent for epoxy resin. Firstly, lignin was separated from black liquor and hydroxyl groups were converted to tosyl groups as leaving groups. Then, primary amination was conducted using an ammonia solution at high pressure and temperature, in the presence of a nano-alumina-based catalyst. The structure of the nanocatalyst was confirmed by FT-IR, ICP, SEM, and XPS analyses. According to the FT-IR spectra, a demethylation reaction, the substitution of hydroxyl groups with tosyl groups, and then an amination reaction were successfully performed on lignin, which was further confirmed by the 13C NMR and CHNS analyses. The active hydrogen equivalent of aminated lignin was determined and three samples with 9.9 wt %, 12.9 wt %, and 15.9 wt % of aminated lignin, as curing agents, were prepared for curing the diglycidyl ether of bisphenol A (DGEBA. The thermal characteristics of the curing process of these epoxy samples were determined by DSC and TGA analyses. Moreover, the mechanical performance of the cured epoxy systems, e.g., the tensile strength and Izod impact strength, were measured, showing that in the presence of 12.9 wt % aminated lignin, the mechanical properties of the aminated lignin-epoxy system exhibited the best performance, which was competitive, compared to the epoxy systems cured by commercial aromatic curing agents.

  8. Translating Genomic Discoveries to Cure Ultrahypermutant ...

    International Development Research Centre (IDRC) Digital Library (Canada)

Translating Genomic Discoveries to Cure Ultrahypermutant Mismatch Repair Deficient Brain Tumours. Malignant brain tumours are the most common cause of death among children with cancer, but there is no known cure. This project will advance research in this important field.

  9. Photovoltaic Array Condition Monitoring Based on Online Regression of Performance Model

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Sera, Dezso; Kerekes, Tamas

    2013-01-01

    regression modeling, from PV array production, plane-of-array irradiance, and module temperature measurements, acquired during an initial learning phase of the system. After the model has been parameterized automatically, the condition monitoring system enters the normal operation phase, where...

  10. Extended cox regression model: The choice of timefunction

    Science.gov (United States)

    Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu

    2017-07-01

The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely applied models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the key assumption of the CRM. The extended CRM provides a test of the PH assumption by including a time-dependent covariate, and serves as an alternative model in case of non-proportional hazards. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
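    One common way to fit an extended Cox model with a time-by-covariate interaction is to expand the data into a counting-process (start, stop) format and include x·g(t). The sketch below assumes the lifelines package is available and uses g(t) = log(t) purely as an example time function; the data are simulated, not one of the study's data sets.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
n = 200
x = rng.binomial(1, 0.5, n)
time = rng.exponential(10, n).round(1) + 0.1
event = rng.binomial(1, 0.8, n)

# Split each subject's follow-up at (illustrative) cut points so that the
# time-dependent term x * g(t) can vary within subjects.
rows = []
for i in range(n):
    cuts = np.unique(np.append(np.arange(0, time[i], 2.0), time[i]))
    for start, stop in zip(cuts[:-1], cuts[1:]):
        rows.append({"id": i, "start": start, "stop": stop,
                     "x": x[i],
                     "x_logt": x[i] * np.log(stop),      # g(t) = log(t)
                     "event": int(event[i] and stop == time[i])})
long_df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # a significant x_logt term suggests non-proportionality
```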

  11. Plasmid-cured Chlamydia caviae activates TLR2-dependent signaling and retains virulence in the guinea pig model of genital tract infection.

    Science.gov (United States)

    Frazer, Lauren C; Darville, Toni; Chandra-Kuntal, Kumar; Andrews, Charles W; Zurenski, Matthew; Mintus, Margaret; AbdelRahman, Yasser M; Belland, Robert J; Ingalls, Robin R; O'Connell, Catherine M

    2012-01-01

    Loss of the conserved "cryptic" plasmid from C. trachomatis and C. muridarum is pleiotropic, resulting in reduced innate inflammatory activation via TLR2, glycogen accumulation and infectivity. The more genetically distant C. caviae GPIC is a natural pathogen of guinea pigs and induces upper genital tract pathology when inoculated intravaginally, modeling human disease. To examine the contribution of pCpGP1 to C. caviae pathogenesis, a cured derivative of GPIC, strain CC13, was derived and evaluated in vitro and in vivo. Transcriptional profiling of CC13 revealed only partial conservation of previously identified plasmid-responsive chromosomal loci (PRCL) in C. caviae. However, 2-deoxyglucose (2DG) treatment of GPIC and CC13 resulted in reduced transcription of all identified PRCL, including glgA, indicating the presence of a plasmid-independent glucose response in this species. In contrast to plasmid-cured C. muridarum and C. trachomatis, plasmid-cured C. caviae strain CC13 signaled via TLR2 in vitro and elicited cytokine production in vivo similar to wild-type C. caviae. Furthermore, inflammatory pathology induced by infection of guinea pigs with CC13 was similar to that induced by GPIC, although we observed more rapid resolution of CC13 infection in estrogen-treated guinea pigs. These data indicate that either the plasmid is not involved in expression or regulation of virulence in C. caviae or that redundant effectors prevent these phenotypic changes from being observed in C. caviae plasmid-cured strains.

  12. Plasmid-cured Chlamydia caviae activates TLR2-dependent signaling and retains virulence in the guinea pig model of genital tract infection.

    Directory of Open Access Journals (Sweden)

    Lauren C Frazer

Full Text Available Loss of the conserved "cryptic" plasmid from C. trachomatis and C. muridarum is pleiotropic, resulting in reduced innate inflammatory activation via TLR2, glycogen accumulation and infectivity. The more genetically distant C. caviae GPIC is a natural pathogen of guinea pigs and induces upper genital tract pathology when inoculated intravaginally, modeling human disease. To examine the contribution of pCpGP1 to C. caviae pathogenesis, a cured derivative of GPIC, strain CC13, was derived and evaluated in vitro and in vivo. Transcriptional profiling of CC13 revealed only partial conservation of previously identified plasmid-responsive chromosomal loci (PRCL) in C. caviae. However, 2-deoxyglucose (2DG) treatment of GPIC and CC13 resulted in reduced transcription of all identified PRCL, including glgA, indicating the presence of a plasmid-independent glucose response in this species. In contrast to plasmid-cured C. muridarum and C. trachomatis, plasmid-cured C. caviae strain CC13 signaled via TLR2 in vitro and elicited cytokine production in vivo similar to wild-type C. caviae. Furthermore, inflammatory pathology induced by infection of guinea pigs with CC13 was similar to that induced by GPIC, although we observed more rapid resolution of CC13 infection in estrogen-treated guinea pigs. These data indicate that either the plasmid is not involved in expression or regulation of virulence in C. caviae or that redundant effectors prevent these phenotypic changes from being observed in C. caviae plasmid-cured strains.

  13. INVESTIGATION OF E-MAIL TRAFFIC BY USING ZERO-INFLATED REGRESSION MODELS

    Directory of Open Access Journals (Sweden)

    Yılmaz KAYA

    2012-06-01

Full Text Available In count data, the number of zero values observed may be greater than anticipated, and such data sets should be analyzed by regression methods that take the excess zeros into account. Zero-Inflated Poisson (ZIP), Zero-Inflated Negative Binomial (ZINB), Poisson Hurdle (PH) and Negative Binomial Hurdle (NBH) regression are the most common approaches for modeling dependent variables with more zero values than expected. In the present study, the e-mail traffic of Yüzüncü Yıl University in the 2009 spring semester was investigated. ZIP, ZINB, PH and NBH regression methods were applied to the data set because more zero counts (78.9%) were found than expected. The ZINB and NBH regressions, which account for both zero inflation and overdispersion, gave more accurate results because the e-mail counts showed both features. ZINB was determined to be the best model according to the Vuong statistic and information criteria.
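    A sketch of fitting zero-inflated count models, assuming a recent statsmodels release; the counts below are simulated, not the university's e-mail data, and model comparison here uses AIC rather than the Vuong test.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)

# Simulate zero-inflated Poisson counts: structural zeros with probability 0.6.
lam = np.exp(0.3 + 0.5 * x)
y = np.where(rng.uniform(size=n) < 0.6, 0, rng.poisson(lam))

# ZINB may emit convergence warnings on ZIP-like data; the fit still returns.
zip_res = sm.ZeroInflatedPoisson(y, X, exog_infl=X, inflation="logit").fit(disp=0)
zinb_res = sm.ZeroInflatedNegativeBinomialP(y, X, exog_infl=X,
                                            inflation="logit").fit(disp=0)
print("ZIP  AIC:", round(zip_res.aic, 1))
print("ZINB AIC:", round(zinb_res.aic, 1))
```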

  14. Focused information criterion and model averaging based on weighted composite quantile regression

    KAUST Repository

    Xu, Ganggang; Wang, Suojin; Huang, Jianhua Z.

    2013-01-01

    We study the focused information criterion and frequentist model averaging and their application to post-model-selection inference for weighted composite quantile regression (WCQR) in the context of the additive partial linear models. With the non

  15. Influence of curing time, overlay material and thickness on three light-curing composites used for luting indirect composite restorations.

    Science.gov (United States)

    D'Arcangelo, Camillo; De Angelis, Francesco; Vadini, Mirco; Carluccio, Fabio; Vitalone, Laura Merla; D'Amario, Maurizio

    2012-08-01

To assess the microhardness of three resin composites employed in the adhesive luting of indirect composite restorations and examine the influence of the overlay material and thickness as well as the curing time on polymerization rate. Three commercially available resin composites were selected: Enamel Plus HRI (Micerium) (ENA), Saremco ELS (Saremco Dental) (SAR), Esthet-X HD (Dentsply/DeTrey) (EST-X). Post-polymerized cylinders of 6 different thicknesses were produced and used as overlays: 2 mm, 3 mm, 3.5 mm, 4 mm, 5 mm, and 6 mm. Two-mm-thick disks were produced and employed as underlays. A standardized amount of composite paste was placed between the underlay and the overlay surfaces which were maintained at a fixed distance of 0.5 mm. Light curing of the luting composite layer was performed through the overlays for 40, 80, or 120 s. For each specimen, the composite to be cured, the cured overlay, and the underlay were made out of the same batch of resin composite. All specimens were assigned to three experimental groups on the basis of the resin composite used, and to subgroups on the basis of the overlay thickness and the curing time, resulting in 54 experimental subgroups (n = 5). Forty-five additional specimens, 15 for each material under investigation, were produced and subjected to 40, 80, or 120 s of light curing using a microscope glass as an overlay; they were assigned to 9 control subgroups (n = 5). Three Vickers hardness (VH) indentations were performed on each specimen. Means and standard deviations were calculated. Data were statistically analyzed using 3-way ANOVA. Within the same material, VH values lower than 55% of control were not considered acceptable. The used material, the overlay thickness, and the curing time significantly influenced VH values. In the ENA group, acceptable hardness values were achieved with 3.5-mm or thinner overlays after 120 or 80 s curing time (VH 41.75 and 39.32, respectively), and with 2-mm overlays after 40 s (VH 54

  16. Radiation cured coating containing glitter particles and process therefor

    International Nuclear Information System (INIS)

    Sachs, P.R.; Sears, J.W.

    1992-01-01

Radiation curable coatings for use on a variety of substrates and curable by exposure to ionizing irradiation or ultraviolet light are well known. The use of urethane type coatings cured with ultraviolet light to provide protective wear layers for wall or floor tile is for instance described in U.S. Pat. No. 4,180,615. U.S. Pat. No. 3,918,393 describes a method for obtaining a non-glossy coating on various substrates by curing radiation sensitive material with ionizing irradiation or ultraviolet light in two stages. In this process the coating is partially cured in an oxygen-containing atmosphere and the curing is completed in an inert atmosphere. U.S. Pat. No. 4,122,225 discloses a method and apparatus for coating tile which involves the application of one coat of radiation curable material to an entire substrate followed by partial curing and the subsequent application and curing of a second coat of radiation curable material only on high areas of the substrate which are subject to greater than average wear. Use of pigment in radiation cured coatings on products such as floor covering which are subject to wear during use has presented substantial difficulties. Incorporation of pigment, especially enough pigment to make the coating opaque, makes the coating hard to cure and substantially reduces the thicknesses of coating which can be cured relative to a clear coating cured under the same conditions.

  17. Accelerated Cure Project for Multiple Sclerosis

    Science.gov (United States)

... questions and enable an era of optimized MS treatment. The Accelerated Cure Project for MS is a non-profit, 501(c)(3) tax-exempt organization whose mission is to accelerate efforts toward a cure for multiple sclerosis by rapidly advancing research that determines its causes ...

  18. 7 CFR 29.3002 - Air-cured.

    Science.gov (United States)

    2010-01-01

7 CFR 29.3002 (Agriculture Regulations of the Department of Agriculture, Agricultural Marketing Service: Standards, Inspections, Marketing...): Air-cured. Air-cured tobacco should not carry the odor of smoke or fumes resulting from the application of artificial...

  19. Model-based bootstrapping when correcting for measurement error with application to logistic regression.

    Science.gov (United States)

    Buonaccorsi, John P; Romeo, Giovanni; Thoresen, Magne

    2018-03-01

    When fitting regression models, measurement error in any of the predictors typically leads to biased coefficients and incorrect inferences. A plethora of methods have been proposed to correct for this. Obtaining standard errors and confidence intervals using the corrected estimators can be challenging and, in addition, there is concern about remaining bias in the corrected estimators. The bootstrap, which is one option to address these problems, has received limited attention in this context. It has usually been employed by simply resampling observations, which, while suitable in some situations, is not always formally justified. In addition, the simple bootstrap does not allow for estimating bias in non-linear models, including logistic regression. Model-based bootstrapping, which can potentially estimate bias in addition to being robust to the original sampling or whether the measurement error variance is constant or not, has received limited attention. However, it faces challenges that are not present in handling regression models with no measurement error. This article develops new methods for model-based bootstrapping when correcting for measurement error in logistic regression with replicate measures. The methodology is illustrated using two examples, and a series of simulations are carried out to assess and compare the simple and model-based bootstrap methods, as well as other standard methods. While not always perfect, the model-based approaches offer some distinct improvements over the other methods. © 2017, The International Biometric Society.
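    The following sketch is not the authors' procedure. It illustrates, on simulated data, one simplified way to combine a regression-calibration correction for a covariate measured with replicate error with a model-based (parametric) bootstrap that regenerates outcomes and replicate measures from the fitted models, yielding a standard error and a bias estimate for the corrected slope.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n, B = 400, 200
beta0, beta1 = -0.5, 1.0

def fit_corrected(w1, w2, y):
    """Regression-calibration logistic fit using two replicate measures."""
    wbar = (w1 + w2) / 2.0
    sigma_u2 = np.mean((w1 - w2) ** 2) / 2.0        # measurement error var.
    mu_x, var_x = wbar.mean(), wbar.var() - sigma_u2 / 2.0
    x_hat = mu_x + (var_x / (var_x + sigma_u2 / 2.0)) * (wbar - mu_x)
    res = sm.Logit(y, sm.add_constant(x_hat)).fit(disp=0)
    return res.params, wbar, sigma_u2

# Observed data (simulated here only to make the sketch self-contained)
x = rng.normal(size=n)
w1, w2 = x + 0.5 * rng.normal(size=n), x + 0.5 * rng.normal(size=n)
y = rng.binomial(1, 1 / (1 + np.exp(-(beta0 + beta1 * x))))
est, wbar, sigma_u2 = fit_corrected(w1, w2, y)

# Model-based bootstrap: regenerate y and the replicate measures from the
# fitted outcome model and the estimated measurement-error model. Resampling
# wbar as a stand-in for the true covariate is a crude simplification.
boot = []
for _ in range(B):
    x_b = rng.choice(wbar, size=n, replace=True)
    w1_b = x_b + np.sqrt(sigma_u2) * rng.normal(size=n)
    w2_b = x_b + np.sqrt(sigma_u2) * rng.normal(size=n)
    p_b = 1 / (1 + np.exp(-(est[0] + est[1] * x_b)))
    y_b = rng.binomial(1, p_b)
    boot.append(fit_corrected(w1_b, w2_b, y_b)[0])
boot = np.array(boot)
print("corrected slope:", est[1].round(3),
      "bootstrap SE:", boot[:, 1].std().round(3),
      "bootstrap bias:", (boot[:, 1].mean() - est[1]).round(3))
```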

  20. Microwave and thermal curing of an epoxy resin for microelectronic applications

    Energy Technology Data Exchange (ETDEWEB)

    Johnston, K. [Institute of Chemical Sciences, School of Engineering and Physical Sciences, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Pavuluri, S.K.; Leonard, M.T.; Desmulliez, M.P.Y. [MIcroSystems Engineering Centre (MISEC), Institute of Signals, Sensors and Systems, School of Engineering and Physical Sciences, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Arrighi, V., E-mail: v.arrighi@hw.ac.uk [Institute of Chemical Sciences, School of Engineering and Physical Sciences, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom)

    2015-09-20

Graphical abstract: - Highlights: • Thermal and microwave curing of a commercial epoxy resin EO1080 are compared. • Microwave curing increases cure rate and does not adversely affect properties. • The curing of EO1080 is generally autocatalytic but deviates at high conversion. • Microwave radiation has a more complex effect on curing kinetics. - Abstract: Microwave curing of thermosetting polymers has a number of advantages to natural or thermal oven curing and is considered a cost-effective alternative. Here we present a detailed study of a commercially available epoxy resin, EO1080. Samples that are thermally cured are compared to curing using a recently developed modular microwave processing system. For commercial purposes it is crucial to demonstrate that microwave curing does not adversely affect the thermal and chemical properties of the material. Therefore, the kinetics of cure and various post cure properties of the resin are investigated. Attenuated Total Reflectance Fourier-Transform Infrared (ATR-FTIR) analysis shows no significant difference between the conventionally and microwave cured samples. Differential scanning calorimetry (DSC) is used to monitor the kinetics of the curing reaction, as well as determine the thermal and ageing properties of the material. As expected, the rate of curing is higher when using microwave energy and we attempt to quantify differences compared to conventional thermal curing. No change in glass transition temperature (Tg) is observed. For the first time, enthalpy relaxation measurements performed on conventional and microwave cured samples are reported and these indicate similar ageing properties at any given temperature under Tg.
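    Autocatalytic cure behaviour of the kind mentioned in the highlights is often described by a Kamal-type rate law, dα/dt = (k1 + k2·α^m)(1 − α)^n. The sketch below fits this form to (α, dα/dt) pairs such as those obtained from an isothermal DSC run; the data are synthetic, not the EO1080 measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def kamal_rate(alpha, k1, k2, m, n):
    """Autocatalytic (Kamal) cure rate law: da/dt = (k1 + k2*a^m)*(1-a)^n."""
    return (k1 + k2 * alpha**m) * (1.0 - alpha)**n

# Synthetic "measured" (alpha, dalpha/dt) pairs standing in for DSC data.
rng = np.random.default_rng(5)
alpha = np.linspace(0.02, 0.95, 60)
rate = kamal_rate(alpha, 0.002, 0.03, 1.1, 1.6) * (1 + 0.03 * rng.normal(size=60))

popt, _ = curve_fit(kamal_rate, alpha, rate,
                    p0=[1e-3, 1e-2, 1.0, 1.5],
                    bounds=(0, [1.0, 1.0, 3.0, 3.0]))
k1, k2, m, n = popt
print(f"k1={k1:.4f}, k2={k2:.4f}, m={m:.2f}, n={n:.2f}")
```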

  1. Microwave and thermal curing of an epoxy resin for microelectronic applications

    International Nuclear Information System (INIS)

    Johnston, K.; Pavuluri, S.K.; Leonard, M.T.; Desmulliez, M.P.Y.; Arrighi, V.

    2015-01-01

Graphical abstract: - Highlights: • Thermal and microwave curing of a commercial epoxy resin EO1080 are compared. • Microwave curing increases cure rate and does not adversely affect properties. • The curing of EO1080 is generally autocatalytic but deviates at high conversion. • Microwave radiation has a more complex effect on curing kinetics. - Abstract: Microwave curing of thermosetting polymers has a number of advantages to natural or thermal oven curing and is considered a cost-effective alternative. Here we present a detailed study of a commercially available epoxy resin, EO1080. Samples that are thermally cured are compared to curing using a recently developed modular microwave processing system. For commercial purposes it is crucial to demonstrate that microwave curing does not adversely affect the thermal and chemical properties of the material. Therefore, the kinetics of cure and various post cure properties of the resin are investigated. Attenuated Total Reflectance Fourier-Transform Infrared (ATR-FTIR) analysis shows no significant difference between the conventionally and microwave cured samples. Differential scanning calorimetry (DSC) is used to monitor the kinetics of the curing reaction, as well as determine the thermal and ageing properties of the material. As expected, the rate of curing is higher when using microwave energy and we attempt to quantify differences compared to conventional thermal curing. No change in glass transition temperature (Tg) is observed. For the first time, enthalpy relaxation measurements performed on conventional and microwave cured samples are reported and these indicate similar ageing properties at any given temperature under Tg.

  2. Financing cures in the United States.

    Science.gov (United States)

    Basu, Anirban

    2015-02-01

True cures in health care are rare but likely not for long. The high price tag that accompanies a cure, along with its rapid uptake, creates challenges in the financing of cures by public and private payers. In the US, the disaggregated nature of the health insurance system adds to this challenge as patients frequently churn across multiple health plans. This creates a 'free-rider' problem, where no one health plan has the incentive to invest in a cure since the returns will be scattered over many health plans. Here, a new health currency is proposed as a generalized version of a social impact bond that has the potential to solve this free-rider problem, as it can be traded not only between public and private payers but also within the private sector. An ensuing debate as to whether and how to develop such a currency can serve the US health care system well.

  3. Ordinal regression models to describe tourist satisfaction with Sintra's world heritage

    Science.gov (United States)

    Mouriño, Helena

    2013-10-01

In Tourism Research, ordinal regression models are becoming a very powerful tool in modelling the relationship between an ordinal response variable and a set of explanatory variables. In August and September 2010, we conducted a pioneering Tourist Survey in Sintra, Portugal. The data were obtained by face-to-face interviews at the entrances of the Palaces and Parks of Sintra. The work developed in this paper focuses on two main points: tourists' perception of the entrance fees, and overall level of satisfaction with this heritage site. For attaining these goals, ordinal regression models were developed. We concluded that tourists' nationality was the only significant variable to describe the perception of the admission fees. Also, Sintra's image among tourists depends not only on their nationality, but also on previous knowledge about Sintra's World Heritage status.
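    A sketch of an ordinal (proportional-odds) regression of this kind, assuming statsmodels' OrderedModel (available in recent versions); the satisfaction scores, nationality groups and prior-knowledge indicator below are invented for illustration.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(11)
n = 300
nationality = rng.integers(0, 3, n)               # 3 illustrative groups
prior_knowledge = rng.binomial(1, 0.4, n)
latent = 0.8 * prior_knowledge - 0.3 * (nationality == 2) + rng.logistic(size=n)
satisfaction = pd.cut(latent, bins=[-np.inf, -1, 0, 1, np.inf],
                      labels=["low", "fair", "good", "excellent"], ordered=True)

# Dummy-code nationality; keep the binary prior-knowledge indicator numeric.
X = pd.get_dummies(pd.DataFrame({"nationality": nationality.astype(str),
                                 "prior_knowledge": prior_knowledge}),
                   drop_first=True).astype(float)
model = OrderedModel(satisfaction, X, distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```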

  4. Combination of supervised and semi-supervised regression models for improved unbiased estimation

    DEFF Research Database (Denmark)

    Arenas-Garía, Jeronimo; Moriana-Varo, Carlos; Larsen, Jan

    2010-01-01

In this paper we investigate the steady-state performance of semisupervised regression models adjusted using a modified RLS-like algorithm, identifying the situations where the new algorithm is expected to outperform standard RLS. By using an adaptive combination of the supervised and semisupervised...

  5. Random regression models for daily feed intake in Danish Duroc pigs

    DEFF Research Database (Denmark)

    Strathe, Anders Bjerring; Mark, Thomas; Jensen, Just

    The objective of this study was to develop random regression models and estimate covariance functions for daily feed intake (DFI) in Danish Duroc pigs. A total of 476201 DFI records were available on 6542 Duroc boars between 70 to 160 days of age. The data originated from the National test station......-year-season, permanent, and animal genetic effects. The functional form was based on Legendre polynomials. A total of 64 models for random regressions were initially ranked by BIC to identify the approximate order for the Legendre polynomials using AI-REML. The parsimonious model included Legendre polynomials of 2nd...... order for genetic and permanent environmental curves and a heterogeneous residual variance, allowing the daily residual variance to change along the age trajectory due to scale effects. The parameters of the model were estimated in a Bayesian framework, using the RJMC module of the DMU package, where...
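    The Legendre-polynomial part of such random regression models can be illustrated with a small sketch: ages are rescaled to [-1, 1], a 2nd-order Legendre design matrix is built, and a fixed feed-intake curve is fitted by least squares. The DFI values are simulated, and the actual variance-component estimation requires specialised software such as DMU, which is not reproduced here.

```python
import numpy as np
from numpy.polynomial.legendre import legvander

rng = np.random.default_rng(2)
age = rng.integers(70, 161, size=500).astype(float)       # days of age
dfi = 1.0 + 0.02 * age + rng.normal(scale=0.3, size=500)  # simulated DFI, kg

# Rescale age to [-1, 1] and build a 2nd-order Legendre design matrix.
t = 2.0 * (age - age.min()) / (age.max() - age.min()) - 1.0
Phi = legvander(t, 2)              # columns: P0(t), P1(t), P2(t)

coef, *_ = np.linalg.lstsq(Phi, dfi, rcond=None)
print("Legendre regression coefficients:", coef.round(3))
```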

  6. Photoacoustic monitoring of inhomogeneous curing processes in polystyrene emulsions

    International Nuclear Information System (INIS)

    Vargas-Luna, M.; Gutierrez-Juarez, G.; Rodriguez-Vizcaino, J.M.; Varela-Nsjera, J.B.; Rodriguez-Palencia, J.M.; Bernal-Alvarado, J.; Sosa, M.; Alvarado-Gil, J.J.

    2002-01-01

The time evolution of the inhomogeneous curing process of polystyrene emulsions is studied using a variant of the conventional photoacoustic (PA) technique. The thermal effusivity, as a function of time, is determined in order to monitor the sintering process of a styrene emulsion in different steps of the manufacturing procedure. PA measurements of thermal effusivity show a sigmoidal growth as a function of time during the curing process. The parameterization of these curves permits the determination of the characteristic curing time and velocity of the process. A decrease in the curing time and an increase in the curing velocity are observed for the final steps of the manufacturing process. The feasibility of our approach and its potential for the characterization of other curing processes are discussed. (author)
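    The sigmoidal parameterisation described above can be sketched as a logistic fit of thermal effusivity versus curing time, from which a characteristic curing time (the inflection point) and a curing velocity (the maximum slope) are read off; the data and the particular logistic form are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(t, e0, de, t_c, tau):
    """Logistic growth: effusivity rises from e0 by de around time t_c."""
    return e0 + de / (1.0 + np.exp(-(t - t_c) / tau))

rng = np.random.default_rng(8)
t = np.linspace(0, 120, 40)                                   # minutes
e = sigmoid(t, 500.0, 300.0, 60.0, 10.0) + 5.0 * rng.normal(size=t.size)

popt, _ = curve_fit(sigmoid, t, e, p0=[400.0, 400.0, 50.0, 5.0])
e0, de, t_c, tau = popt
# Maximum slope of a logistic curve is de / (4 * tau).
print(f"characteristic curing time ~ {t_c:.1f} min,",
      f"curing velocity ~ {de / (4 * tau):.1f} (effusivity units)/min")
```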

  7. 7 CFR 29.1019 - Flue-cured.

    Science.gov (United States)

    2010-01-01

7 CFR 29.1019 (Agriculture Regulations of the Department of Agriculture, Agricultural Marketing Service: Standards, Inspections, Marketing...): Flue-cured. ... tobacco; or tobacco cured by some other process which accomplishes the same results. [42 FR 21092, Apr. 25...

  8. Longitudinal beta regression models for analyzing health-related quality of life scores over time

    Directory of Open Access Journals (Sweden)

    Hunger Matthias

    2012-09-01

Full Text Available Abstract. Background: Health-related quality of life (HRQL) has become an increasingly important outcome parameter in clinical trials and epidemiological research. HRQL scores are typically bounded at both ends of the scale and often highly skewed. Several regression techniques have been proposed to model such data in cross-sectional studies, however, methods applicable in longitudinal research are less well researched. This study examined the use of beta regression models for analyzing longitudinal HRQL data using two empirical examples with distributional features typically encountered in practice. Methods: We used SF-6D utility data from a German older age cohort study and stroke-specific HRQL data from a randomized controlled trial. We described the conceptual differences between mixed and marginal beta regression models and compared both models to the commonly used linear mixed model in terms of overall fit and predictive accuracy. Results: At any measurement time, the beta distribution fitted the SF-6D utility data and stroke-specific HRQL data better than the normal distribution. The mixed beta model showed better likelihood-based fit statistics than the linear mixed model and respected the boundedness of the outcome variable. However, it tended to underestimate the true mean at the upper part of the distribution. Adjusted group means from marginal beta model and linear mixed model were nearly identical but differences could be observed with respect to standard errors. Conclusions: Understanding the conceptual differences between mixed and marginal beta regression models is important for their proper use in the analysis of longitudinal HRQL data. Beta regression fits the typical distribution of HRQL data better than linear mixed models, however, if focus is on estimating group mean scores rather than making individual predictions, the two methods might not differ substantially.

  9. CURING OF POLYMERIC COMPOSITES USING MICROWAVE RESIN TRANSFER MOULDING (RTM

    Directory of Open Access Journals (Sweden)

    R. YUSOFF

    2007-08-01

Full Text Available The main objective of this work is to compare the difference between microwave heating and conventional thermal heating in fabricating carbon/epoxy composites. Two types of epoxy resin systems were used as matrices, LY5052-HY5052 and DGEBA-HY917-DY073. All composite samples were fabricated using the resin transfer moulding (RTM) technique. The curing of the LY5052-HY5052-carbon and the DGEBA-HY917-DY073-carbon composite systems were carried out at 100 °C and 120 °C, respectively. Microwave heating showed better temperature control than conventional heating; however, the heating rate of the microwave cured samples was slower than that of the conventionally cured samples. This was attributed to the lower power (250 W) used when heating with microwaves compared to 2000 W used in conventional heating. Study of thermal characteristics as curing progressed showed that the polymerisation reaction occurred at a faster rate during microwave curing than in conventional curing for both the DGEBA and the LY/HY5052 carbon composite systems. The actual cure cycle was reduced from 60 minutes to 40 minutes when using microwaves for curing DGEBA-carbon composites. As for LY/HY5052-carbon composites, the actual cure cycle was reduced from 3 hours to 40 minutes. Both conventional and microwave heating yielded similar glass transition temperatures (120 °C for DGEBA systems and 130 °C for LY/HY5052 systems). Microwave cured composites had higher void contents than conventionally cured composites (2.2-2.8% and 1.8-2.4% for DGEBA and LY/HY5052 microwave cured composites, respectively, compared to 0.2-0.4% for both DGEBA and LY/HY5052 thermally cured composites). C-scan traces showed that all composites, regardless of methods of curing, had minimal defects.

  10. Curing critical links in oscillator networks as power flow models

    International Nuclear Information System (INIS)

    Rohden, Martin; Meyer-Ortmanns, Hildegard; Witthaut, Dirk; Timme, Marc

    2017-01-01

    Modern societies crucially depend on the robust supply with electric energy so that blackouts of power grids can have far reaching consequences. Typically, large scale blackouts take place after a cascade of failures: the failure of a single infrastructure component, such as a critical transmission line, results in several subsequent failures that spread across large parts of the network. Improving the robustness of a network to prevent such secondary failures is thus key for assuring a reliable power supply. In this article we analyze the nonlocal rerouting of power flows after transmission line failures for a simplified AC power grid model and compare different strategies to improve network robustness. We identify critical links in the grid and compute alternative pathways to quantify the grid’s redundant capacity and to find bottlenecks along the pathways. Different strategies are developed and tested to increase transmission capacities to restore stability with respect to transmission line failures. We show that local and nonlocal strategies typically perform alike: one can equally well cure critical links by providing backup capacities locally or by extending the capacities of bottleneck links at remote locations. (paper)

  11. A Gompertz regression model for fern spores germination

    Directory of Open Access Journals (Sweden)

    Gabriel y Galán, Jose María

    2015-06-01

Full Text Available Germination is one of the most important biological processes for both seed and spore plants, also for fungi. At present, mathematical models of germination have been developed in fungi, bryophytes and several plant species. However, ferns are the only group whose germination has never been modelled. In this work we develop a regression model of the germination of fern spores. We have found that for Blechnum serrulatum, Blechnum yungense, Cheilanthes pilosa, Niphidium macbridei and Polypodium feuillei the Gompertz growth model describes cumulative germination satisfactorily. An important result is that the regression parameters are independent of fern species and the model is not affected by intraspecific variation. Our results show that the Gompertz curve represents a general germination model for all the non-green spore leptosporangiate ferns; the paper also includes a discussion of the physiological and ecological meaning of the model.
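    A minimal fit of a Gompertz growth curve to cumulative germination data, assuming scipy; the germination percentages below are invented and do not come from the species studied in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, b, k):
    """Cumulative germination: A * exp(-b * exp(-k * t))."""
    return A * np.exp(-b * np.exp(-k * t))

days = np.array([0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20], float)
germ = np.array([0, 1, 4, 15, 35, 55, 70, 80, 85, 88, 89], float)  # % germinated

popt, _ = curve_fit(gompertz, days, germ, p0=[90.0, 5.0, 0.3])
print("A=%.1f%%, b=%.2f, k=%.2f per day" % tuple(popt))
```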

  12. Generic global regression models for growth prediction of Salmonella in ground pork and pork cuts

    DEFF Research Database (Denmark)

    Buschhardt, Tasja; Hansen, Tina Beck; Bahl, Martin Iain

    2017-01-01

    Introduction and Objectives Models for the prediction of bacterial growth in fresh pork are primarily developed using two-step regression (i.e. primary models followed by secondary models). These models are also generally based on experiments in liquids or ground meat and neglect surface growth....... It has been shown that one-step global regressions can result in more accurate models and that bacterial growth on intact surfaces can substantially differ from growth in liquid culture. Material and Methods We used a global-regression approach to develop predictive models for the growth of Salmonella....... One part of obtained logtransformed cell counts was used for model development and another for model validation. The Ratkowsky square root model and the relative lag time (RLT) model were integrated into the logistic model with delay. Fitted parameter estimates were compared to investigate the effect...
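    The Ratkowsky square-root component of such growth models relates the maximum specific growth rate to temperature as sqrt(mu_max) = b·(T − Tmin). A quick fit under assumed data might look like the sketch below; the temperatures and growth rates are illustrative, not the study's Salmonella measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def ratkowsky(T, b, Tmin):
    """Square-root model: sqrt(mu_max) = b * (T - Tmin), valid for T > Tmin."""
    return b * (T - Tmin)

T = np.array([10, 15, 20, 25, 30, 35], float)            # storage temp., degC
mu_max = np.array([0.05, 0.14, 0.28, 0.45, 0.66, 0.92])  # 1/h, illustrative

popt, _ = curve_fit(ratkowsky, T, np.sqrt(mu_max), p0=[0.03, 5.0])
b, Tmin = popt
print(f"b = {b:.3f}, estimated Tmin = {Tmin:.1f} degC")
```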

  13. Application of multilinear regression analysis in modeling of soil ...

    African Journals Online (AJOL)

The application of a Multi-Linear Regression Analysis (MLRA) model for predicting soil properties in Calabar South offers a technical guide and solution to foundation design problems in the area. Forty-five soil samples were collected from fifteen different boreholes at different depths, and 270 tests were carried out for CBR, ...

  14. Radiation curing--new technology of green industries facing 21st century

    International Nuclear Information System (INIS)

    Wang Jianguo; Teng Renrui

    2000-01-01

The development of radiation curing is briefly reviewed, and the mechanisms of UV curing and EB curing, as well as the equipment and materials used in radiation curing, are introduced. Compared with conventional curing, radiation curing has the advantages of energy saving, high efficiency and little pollution. It is a new technology for green industries facing the 21st century

  15. EMD-regression for modelling multi-scale relationships, and application to weather-related cardiovascular mortality

    Science.gov (United States)

    Masselot, Pierre; Chebana, Fateh; Bélanger, Diane; St-Hilaire, André; Abdous, Belkacem; Gosselin, Pierre; Ouarda, Taha B. M. J.

    2018-01-01

In a number of environmental studies, relationships between natural processes are often assessed through regression analyses, using time series data. Such data are often multi-scale and non-stationary, leading to a poor accuracy of the resulting regression models and therefore to results with moderate reliability. To deal with this issue, the present paper introduces the EMD-regression methodology, which consists of applying the empirical mode decomposition (EMD) algorithm to the data series and then using the resulting components in regression models. The proposed methodology presents a number of advantages. First, it accounts for the non-stationarity of the data series. Second, this approach acts as a scan for the relationship between a response variable and the predictors at different time scales, providing new insights about this relationship. To illustrate the proposed methodology it is applied to study the relationship between weather and cardiovascular mortality in Montreal, Canada. The results provide new knowledge concerning the studied relationship. For instance, they show that humidity can cause excess mortality at the monthly time scale, which is a scale not visible in classical models. A comparison is also conducted with state-of-the-art methods, namely generalized additive models and distributed lag models, both widely used in weather-related health studies. The comparison shows that EMD-regression achieves better prediction performances and provides more details than classical models concerning the relationship.

  16. Overview of UV and EB curing

    International Nuclear Information System (INIS)

    Garnett, J.L.

    2000-01-01

    Full text: UV and EB are complementary techniques in radiation curing. In the proposed paper, a brief review of both fields will be given. This will include principles of the process, the chemistry of the systems including monomers/oligomers/polymers used, additives required where necessary such as photoinitiators for UV, flow aids, adhesion promoters and the like. The types of equipment used in such processes will also be discussed including low energy electron beam utilisation and excimer curing. The advantages and disadvantages of both techniques will be examined. Mechanistic aspects of both curing systems will be discussed. Applications of the technology including developments in the banknote printing field will be summarised

  17. The Application of Classical and Neural Regression Models for the Valuation of Residential Real Estate

    Directory of Open Access Journals (Sweden)

    Mach Łukasz

    2017-06-01

Full Text Available The research process aimed at building regression models that help in the valuation of residential real estate is presented in the following article. Two widely used computational tools, i.e. classical multiple regression and artificial neural network regression models, were used to build the models. The aim of the research is to assess the practical usefulness of these tools and to compare them. The data used in the analyses refer to the secondary transactional residential real estate market.
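    A sketch of the comparison the article describes, assuming a recent scikit-learn; the apartment features, prices and error metric are placeholders rather than the article's data and evaluation protocol.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 400
area = rng.uniform(30, 120, n)          # m^2
rooms = rng.integers(1, 6, n)
floor = rng.integers(0, 11, n)
price = 3000 * area + 15000 * rooms - 2000 * floor + rng.normal(0, 20000, n)
X = np.column_stack([area, rooms, floor])

X_tr, X_te, y_tr, y_te = train_test_split(X, price, random_state=0)
models = {
    "classical multiple regression": LinearRegression(),
    "neural regression (MLP)": make_pipeline(
        StandardScaler(), MLPRegressor(hidden_layer_sizes=(16, 8),
                                       max_iter=5000, random_state=0)),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(name, "MAPE:", round(mean_absolute_percentage_error(y_te, pred), 3))
```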

  18. A review of a priori regression models for warfarin maintenance dose prediction.

    Directory of Open Access Journals (Sweden)

    Ben Francis

    Full Text Available A number of a priori warfarin dosing algorithms, derived using linear regression methods, have been proposed. Although these dosing algorithms may have been validated using patients derived from the same centre, rarely have they been validated using a patient cohort recruited from another centre. In order to undertake external validation, two cohorts were utilised. One cohort formed by patients from a prospective trial and the second formed by patients in the control arm of the EU-PACT trial. Of these, 641 patients were identified as having attained stable dosing and formed the dataset used for validation. Predicted maintenance doses from six criterion fulfilling regression models were then compared to individual patient stable warfarin dose. Predictive ability was assessed with reference to several statistics including the R-square and mean absolute error. The six regression models explained different amounts of variability in the stable maintenance warfarin dose requirements of the patients in the two validation cohorts; adjusted R-squared values ranged from 24.2% to 68.6%. An overview of the summary statistics demonstrated that no one dosing algorithm could be considered optimal. The larger validation cohort from the prospective trial produced more consistent statistics across the six dosing algorithms. The study found that all the regression models performed worse in the validation cohort when compared to the derivation cohort. Further, there was little difference between regression models that contained pharmacogenetic coefficients and algorithms containing just non-pharmacogenetic coefficients. The inconsistency of results between the validation cohorts suggests that unaccounted population specific factors cause variability in dosing algorithm performance. Better methods for dosing that take into account inter- and intra-individual variability, at the initiation and maintenance phases of warfarin treatment, are needed.
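    External validation of a dosing algorithm of this kind boils down to comparing predicted with observed stable doses; a sketch of the adjusted R-squared and mean absolute error computation is shown below, with placeholder predictions standing in for a real dosing algorithm.

```python
import numpy as np

def validation_stats(observed, predicted, n_predictors):
    """Adjusted R-squared and mean absolute error for an external cohort."""
    observed, predicted = np.asarray(observed), np.asarray(predicted)
    n = observed.size
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)
    mae = np.mean(np.abs(observed - predicted))
    return adj_r2, mae

rng = np.random.default_rng(6)
observed_dose = rng.gamma(shape=5.0, scale=1.0, size=641)       # mg/day
predicted_dose = observed_dose * rng.normal(1.0, 0.25, 641)     # placeholder
print(validation_stats(observed_dose, predicted_dose, n_predictors=6))
```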

  19. A review of a priori regression models for warfarin maintenance dose prediction.

    Science.gov (United States)

    Francis, Ben; Lane, Steven; Pirmohamed, Munir; Jorgensen, Andrea

    2014-01-01

    A number of a priori warfarin dosing algorithms, derived using linear regression methods, have been proposed. Although these dosing algorithms may have been validated using patients derived from the same centre, rarely have they been validated using a patient cohort recruited from another centre. In order to undertake external validation, two cohorts were utilised. One cohort formed by patients from a prospective trial and the second formed by patients in the control arm of the EU-PACT trial. Of these, 641 patients were identified as having attained stable dosing and formed the dataset used for validation. Predicted maintenance doses from six criterion fulfilling regression models were then compared to individual patient stable warfarin dose. Predictive ability was assessed with reference to several statistics including the R-square and mean absolute error. The six regression models explained different amounts of variability in the stable maintenance warfarin dose requirements of the patients in the two validation cohorts; adjusted R-squared values ranged from 24.2% to 68.6%. An overview of the summary statistics demonstrated that no one dosing algorithm could be considered optimal. The larger validation cohort from the prospective trial produced more consistent statistics across the six dosing algorithms. The study found that all the regression models performed worse in the validation cohort when compared to the derivation cohort. Further, there was little difference between regression models that contained pharmacogenetic coefficients and algorithms containing just non-pharmacogenetic coefficients. The inconsistency of results between the validation cohorts suggests that unaccounted population specific factors cause variability in dosing algorithm performance. Better methods for dosing that take into account inter- and intra-individual variability, at the initiation and maintenance phases of warfarin treatment, are needed.

  20. Modeling of chemical exergy of agricultural biomass using improved general regression neural network

    International Nuclear Information System (INIS)

    Huang, Y.W.; Chen, M.Q.; Li, Y.; Guo, J.

    2016-01-01

A comprehensive evaluation of the energy potential contained in agricultural biomass is a vital step toward its energy utilization. The chemical exergy of typical agricultural biomass was evaluated based on the second law of thermodynamics. The chemical exergy was significantly influenced by the C and O elements rather than by H. The standard entropy of the samples was also examined based on their elemental compositions. Two predictive models of the chemical exergy were developed: a general regression neural network model based upon the elemental composition, and a linear model based upon the higher heating value. An auto-refinement algorithm was first developed to improve the performance of the regression neural network model. The general regression neural network model with K-fold cross-validation had a better ability to predict the chemical exergy than the linear model, with lower prediction errors (±1.5%). - Highlights: • Chemical exergies of agricultural biomass were evaluated based upon fifty samples. • Values for the standard entropy of agricultural biomass samples were calculated. • A linear relationship between chemical exergy and HHV of samples was detected. • An improved GRNN prediction model for the chemical exergy of biomass was developed.
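    A general regression neural network is essentially Nadaraya-Watson kernel regression over the training samples. The sketch below evaluates a few smoothing parameters by K-fold cross-validation on placeholder elemental-composition data; it does not reproduce the paper's auto-refinement algorithm or its fifty-sample data set.

```python
import numpy as np
from sklearn.model_selection import KFold

def grnn_predict(X_train, y_train, X_test, sigma):
    """GRNN / Nadaraya-Watson prediction with a Gaussian kernel."""
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma**2))
    return (w @ y_train) / np.maximum(w.sum(axis=1), 1e-12)

rng = np.random.default_rng(9)
X = rng.uniform(size=(50, 3))            # e.g. scaled C, H, O mass fractions
y = 5.0 + 20.0 * X[:, 0] + 5.0 * X[:, 2] + rng.normal(0, 0.5, 50)  # exergy, MJ/kg

for sigma in (0.05, 0.1, 0.2, 0.4):
    errs = []
    for tr, te in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        pred = grnn_predict(X[tr], y[tr], X[te], sigma)
        errs.append(np.mean(np.abs(pred - y[te]) / np.abs(y[te])))
    print(f"sigma={sigma}: mean relative error {np.mean(errs):.3%}")
```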

  1. High Power UV LED Industrial Curing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Karlicek, Robert, F., Jr; Sargent, Robert

    2012-05-14

UV curing is a green technology that is largely underutilized because UV radiation sources like Hg Lamps are unreliable and difficult to use. High Power UV LEDs are now efficient enough to replace Hg Lamps, and offer significantly improved performance relative to Hg Lamps. In this study, a modular, scalable high power UV LED curing system was designed and tested, performing well in industrial coating evaluations. In order to achieve mechanical form factors similar to commercial Hg Lamp systems, a new patent pending design was employed enabling high irradiance at long working distances. While high power UV LEDs are currently only available at longer UVA wavelengths, rapid progress on UVC LEDs and the development of new formulations designed specifically for use with UV LED sources will converge to drive more rapid adoption of UV curing technology. An assessment of the environmental impact of replacing Hg Lamp systems with UV LED systems was performed. Since UV curing is used in only a small portion of the industrial printing, painting and coating markets, the ease of use of UV LED systems should increase the use of UV curing technology. Even a small penetration of the significant number of industrial applications still using oven curing and drying will lead to significant reductions in energy consumption and in the emission of greenhouse gases and solvents.

  2. New robust statistical procedures for the polytomous logistic regression models.

    Science.gov (United States)

    Castilla, Elena; Ghosh, Abhik; Martin, Nirian; Pardo, Leandro

    2018-05-17

    This article derives a new family of estimators, namely the minimum density power divergence estimators, as a robust generalization of the maximum likelihood estimator for the polytomous logistic regression model. Based on these estimators, a family of Wald-type test statistics for linear hypotheses is introduced. Robustness properties of both the proposed estimators and the test statistics are theoretically studied through the classical influence function analysis. Appropriate real life examples are presented to justify the requirement of suitable robust statistical procedures in place of the likelihood based inference for the polytomous logistic regression model. The validity of the theoretical results established in the article are further confirmed empirically through suitable simulation studies. Finally, an approach for the data-driven selection of the robustness tuning parameter is proposed with empirical justifications. © 2018, The International Biometric Society.

  3. Electricity demand loads modeling using AutoRegressive Moving Average (ARMA) models

    Energy Technology Data Exchange (ETDEWEB)

    Pappas, S.S. [Department of Information and Communication Systems Engineering, University of the Aegean, Karlovassi, 83 200 Samos (Greece); Ekonomou, L.; Chatzarakis, G.E. [Department of Electrical Engineering Educators, ASPETE - School of Pedagogical and Technological Education, N. Heraklion, 141 21 Athens (Greece); Karamousantas, D.C. [Technological Educational Institute of Kalamata, Antikalamos, 24100 Kalamata (Greece); Katsikas, S.K. [Department of Technology Education and Digital Systems, University of Piraeus, 150 Androutsou Srt., 18 532 Piraeus (Greece); Liatsis, P. [Division of Electrical Electronic and Information Engineering, School of Engineering and Mathematical Sciences, Information and Biomedical Engineering Centre, City University, Northampton Square, London EC1V 0HB (United Kingdom)

    2008-09-15

    This study addresses the problem of modeling the electricity demand loads in Greece. The provided actual load data is deseasonalized and an AutoRegressive Moving Average (ARMA) model is fitted on the data off-line, using the Akaike Corrected Information Criterion (AICC). The developed model fits the data in a successful manner. Difficulties occur when the provided data includes noise or errors and also when an on-line/adaptive modeling is required. In both cases and under the assumption that the provided data can be represented by an ARMA model, simultaneous order and parameter estimation of ARMA models under the presence of noise are performed. The produced results indicate that the proposed method, which is based on the multi-model partitioning theory, successfully tackles the studied problem. For validation purposes the produced results are compared with three other established order selection criteria, namely AICC, Akaike's Information Criterion (AIC) and Schwarz's Bayesian Information Criterion (BIC). The developed model could be useful in studies concerning electricity consumption and electricity price forecasts. (author)
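
    Not the authors' code: a minimal Python sketch of the off-line step, fitting candidate ARMA(p,q) models to a synthetic deseasonalized load series and selecting the order by an information criterion; the paper uses AICC, while plain AIC is used here for simplicity.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(1)
      n = 500
      y, e = np.zeros(n), rng.normal(size=n)
      for t in range(2, n):                               # simulate an ARMA(2,1) stand-in for the load series
          y[t] = 0.6 * y[t-1] - 0.2 * y[t-2] + e[t] + 0.4 * e[t-1]

      best = None
      for p in range(4):
          for q in range(4):
              try:
                  res = ARIMA(y, order=(p, 0, q)).fit()
              except Exception:
                  continue
              if best is None or res.aic < best[0]:
                  best = (res.aic, p, q)
      print(f"selected ARMA({best[1]},{best[2]}), AIC = {best[0]:.1f}")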

  4. Enhanced inhibition of Aspergillus niger on sedge (Lepironia articulata) treated with heat-cured lime oil.

    Science.gov (United States)

    Matan, N; Matan, N; Ketsa, S

    2013-08-01

    This study aimed to examine the effect of heat curing (30-100°C) on the antifungal activity of lime oil and its components (limonene, p-cymene, β-pinene and α-pinene) at concentrations ranging from 100 to 300 μl ml⁻¹ against Aspergillus niger in microbiological medium, and to optimize heat curing of lime oil for efficient mould control on sedge (Lepironia articulata). The broth dilution method was employed to determine the minimum inhibitory concentration of lime oil, which was 90 μl ml⁻¹ with heat curing at 70°C. Limonene, the main component of lime oil, was the agent responsible for the observed temperature dependence of lime oil activity. Response surface methodology was used to construct a mathematical model describing the period of zero mould growth on sedge as a function of heat curing temperature and lime oil concentration. Heat curing of 90 μl ml⁻¹ lime oil at 70°C extended the period of zero mould growth on sedge to 18 weeks under moist conditions. Heat curing at 70°C best enhanced the antifungal activity of lime oil against A. niger both in medium and on sedge. Heat curing of lime oil has potential to be used to enhance the antifungal safety of sedge products. © 2013 The Society for Applied Microbiology.

  5. Analysis of the influence of quantile regression model on mainland tourists' service satisfaction performance.

    Science.gov (United States)

    Wang, Wen-Cheng; Cho, Wen-Chien; Chen, Yin-Jen

    2014-01-01

    It is estimated that mainland Chinese tourists travelling to Taiwan can bring annual revenues of 400 billion NTD to the Taiwan economy. Thus, how the Taiwanese Government formulates relevant measures to satisfy both sides is the focus of most concern. Taiwan must improve the facilities and service quality of its tourism industry so as to attract more mainland tourists. This paper conducted a questionnaire survey of mainland tourists and used grey relational analysis in grey mathematics to analyze the satisfaction performance of all satisfaction question items. The first eight satisfaction items were used as independent variables, and the overall satisfaction performance was used as a dependent variable for quantile regression model analysis to discuss the relationship between the dependent variable under different quantiles and independent variables. Finally, this study further discussed the predictive accuracy of the least mean regression model and each quantile regression model, as a reference for research personnel. The analysis results showed that other variables could also affect the overall satisfaction performance of mainland tourists, in addition to occupation and age. The overall predictive accuracy of quantile regression model Q0.25 was higher than that of the other three models.
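
    A minimal sketch (synthetic data, not the survey data) of the modelling step described above: quantile regression models at several quantiles are fitted alongside an ordinary least squares model so their estimates can be compared.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      n = 200
      items = rng.normal(size=(n, 8))                      # eight satisfaction items (stand-ins)
      overall = items @ np.linspace(0.5, 0.1, 8) + rng.normal(scale=0.5, size=n)

      X = sm.add_constant(items)
      print("OLS  :", np.round(sm.OLS(overall, X).fit().params, 3))
      for q in (0.25, 0.5, 0.75):
          print(f"Q{q}:", np.round(sm.QuantReg(overall, X).fit(q=q).params, 3))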

  6. Analysis of the Influence of Quantile Regression Model on Mainland Tourists' Service Satisfaction Performance

    Science.gov (United States)

    Wang, Wen-Cheng; Cho, Wen-Chien; Chen, Yin-Jen

    2014-01-01

    It is estimated that mainland Chinese tourists travelling to Taiwan can bring annual revenues of 400 billion NTD to the Taiwan economy. Thus, how the Taiwanese Government formulates relevant measures to satisfy both sides is the focus of most concern. Taiwan must improve the facilities and service quality of its tourism industry so as to attract more mainland tourists. This paper conducted a questionnaire survey of mainland tourists and used grey relational analysis in grey mathematics to analyze the satisfaction performance of all satisfaction question items. The first eight satisfaction items were used as independent variables, and the overall satisfaction performance was used as a dependent variable for quantile regression model analysis to discuss the relationship between the dependent variable under different quantiles and independent variables. Finally, this study further discussed the predictive accuracy of the least mean regression model and each quantile regression model, as a reference for research personnel. The analysis results showed that other variables could also affect the overall satisfaction performance of mainland tourists, in addition to occupation and age. The overall predictive accuracy of quantile regression model Q0.25 was higher than that of the other three models. PMID:24574916

  7. Analysis of the Influence of Quantile Regression Model on Mainland Tourists’ Service Satisfaction Performance

    Directory of Open Access Journals (Sweden)

    Wen-Cheng Wang

    2014-01-01

    Full Text Available It is estimated that mainland Chinese tourists travelling to Taiwan can bring annual revenues of 400 billion NTD to the Taiwan economy. Thus, how the Taiwanese Government formulates relevant measures to satisfy both sides is the focus of most concern. Taiwan must improve the facilities and service quality of its tourism industry so as to attract more mainland tourists. This paper conducted a questionnaire survey of mainland tourists and used grey relational analysis in grey mathematics to analyze the satisfaction performance of all satisfaction question items. The first eight satisfaction items were used as independent variables, and the overall satisfaction performance was used as a dependent variable for quantile regression model analysis to discuss the relationship between the dependent variable under different quantiles and independent variables. Finally, this study further discussed the predictive accuracy of the least mean regression model and each quantile regression model, as a reference for research personnel. The analysis results showed that other variables could also affect the overall satisfaction performance of mainland tourists, in addition to occupation and age. The overall predictive accuracy of quantile regression model Q0.25 was higher than that of the other three models.

  8. A binary logistic regression model with complex sampling design of ...

    African Journals Online (AJOL)

    2017-09-03

    Bi-variable and multi-variable binary logistic regression models with a complex sampling design were fitted. Data were entered into STATA-12 and analyzed using SPSS-21.

  9. truncSP: An R Package for Estimation of Semi-Parametric Truncated Linear Regression Models

    Directory of Open Access Journals (Sweden)

    Maria Karlsson

    2014-05-01

    Full Text Available Problems with truncated data occur in many areas, complicating estimation and inference. Regarding linear regression models, the ordinary least squares estimator is inconsistent and biased for these types of data and is therefore unsuitable for use. Alternative estimators, designed for the estimation of truncated regression models, have been developed. This paper presents the R package truncSP. The package contains functions for the estimation of semi-parametric truncated linear regression models using three different estimators: the symmetrically trimmed least squares, quadratic mode, and left truncated estimators, all of which have been shown to have good asymptotic and finite sample properties. The package also provides functions for the analysis of the estimated models. Data from the environmental sciences are used to illustrate the functions in the package.

  10. Illness perceptions of leprosy-cured individuals in Surinam with residual disfigurements - "I am cured, but still I am ill".

    Science.gov (United States)

    van Haaren, Mark Ac; Reyme, Melinda; Lawrence, Maggie; Menke, Jack; Kaptein, Ad A

    2017-06-01

    Objective Leprosy has rarely been the subject of health psychology research despite its substantial impact. Our aim was to explore illness perceptions in patients and their health care providers in Surinam. The Common Sense Model (CSM) was the guiding theoretical model. Design Patients with biomedically cured leprosy and their health care providers completed the B-IPQ and took part in semi-structured interviews. The literature on illness perceptions in patients with leprosy was reviewed. Main outcome measures Patients' B-IPQ scores were compared with samples of patients with other (chronic) illnesses, and with health care providers completing the questionnaire as if they were visibly disfigured patients. Quotations from the semi-structured interviews were used to contextualise the illness perceptions. Results Patients' B-IPQ scores reflected the chronic nature of leprosy and were comparable with those with other chronic illnesses. Health care providers perceived leprosy to have a greater negative impact than did the patients. Perceived understanding of causes differed considerably between patients and health care providers. Conclusion Leprosy continues to be experienced as an illness with major psychological and social consequences such as stigmatisation, even after biomedical cure. Interventions that target patients, health care providers, and society at large may help reduce perceived shame and stigma. The CSM is a helpful theoretical model in studying this population.

  11. Applied linear regression

    CERN Document Server

    Weisberg, Sanford

    2013-01-01

    Praise for the Third Edition: "...this is an excellent book which could easily be used as a course text..." -International Statistical Institute The Fourth Edition of Applied Linear Regression provides a thorough update of the basic theory and methodology of linear regression modeling. Demonstrating the practical applications of linear regression analysis techniques, the Fourth Edition uses interesting, real-world exercises and examples. Stressing central concepts such as model building, understanding parameters, assessing fit and reliability, and drawing conclusions, the new edition illus

  12. Proceedings of national executive management seminar on surface finishing by radiation curing technology: radiation curing for better finishing

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

    This book compiles the papers presented at this seminar. The papers discussed are: 1. Incentives for investment in the manufacturing sector (in Malaysia); 2. Trends and prospects of surface finishing by radiation curing technology in Malaysia; 3. Industrial applications of radiation curing.

  13. Proceedings of national executive management seminar on surface finishing by radiation curing technology: radiation curing for better finishing

    International Nuclear Information System (INIS)

    1993-01-01

    This book compiles the papers presented at this seminar. The papers discussed are: 1. Incentives for investment in the manufacturing sector (in Malaysia); 2. Trends and prospects of surface finishing by radiation curing technology in Malaysia; 3. Industrial applications of radiation curing.

  14. Testing for constant nonparametric effects in general semiparametric regression models with interactions

    KAUST Repository

    Wei, Jiawei; Carroll, Raymond J.; Maity, Arnab

    2011-01-01

    We consider the problem of testing for a constant nonparametric effect in a general semi-parametric regression model when there is the potential for interaction between the parametrically and nonparametrically modeled variables. The work

  15. Building interpretable predictive models for pediatric hospital readmission using Tree-Lasso logistic regression.

    Science.gov (United States)

    Jovanovic, Milos; Radovanovic, Sandro; Vukicevic, Milan; Van Poucke, Sven; Delibasic, Boris

    2016-09-01

    Quantification and early identification of unplanned readmission risk have the potential to improve the quality of care during hospitalization and after discharge. However, high dimensionality, sparsity, and class imbalance of electronic health data and the complexity of risk quantification, challenge the development of accurate predictive models. Predictive models require a certain level of interpretability in order to be applicable in real settings and create actionable insights. This paper aims to develop accurate and interpretable predictive models for readmission in a general pediatric patient population, by integrating a data-driven model (sparse logistic regression) and domain knowledge based on the international classification of diseases 9th-revision clinical modification (ICD-9-CM) hierarchy of diseases. Additionally, we propose a way to quantify the interpretability of a model and inspect the stability of alternative solutions. The analysis was conducted on >66,000 pediatric hospital discharge records from California, State Inpatient Databases, Healthcare Cost and Utilization Project between 2009 and 2011. We incorporated domain knowledge based on the ICD-9-CM hierarchy in a data driven, Tree-Lasso regularized logistic regression model, providing the framework for model interpretation. This approach was compared with traditional Lasso logistic regression resulting in models that are easier to interpret by fewer high-level diagnoses, with comparable prediction accuracy. The results revealed that the use of a Tree-Lasso model was as competitive in terms of accuracy (measured by area under the receiver operating characteristic curve-AUC) as the traditional Lasso logistic regression, but integration with the ICD-9-CM hierarchy of diseases provided more interpretable models in terms of high-level diagnoses. Additionally, interpretations of models are in accordance with existing medical understanding of pediatric readmission. Best performing models have
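
    A hedged sketch in Python: the Tree-Lasso penalty used in the paper, which groups coefficients along the ICD-9-CM hierarchy, is not available in scikit-learn, so an ordinary L1-regularized (Lasso) logistic regression, the baseline the authors compare against, is shown on synthetic diagnosis indicators.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(3)
      n, p = 2000, 200
      X = (rng.random((n, p)) < 0.05).astype(float)        # sparse binary diagnosis indicators (synthetic)
      beta = np.zeros(p)
      beta[:10] = 1.5                                      # only ten informative diagnoses
      prob = 1 / (1 + np.exp(-(X @ beta - 1.0)))
      y = rng.binomial(1, prob)                            # readmission outcome (synthetic)

      Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(Xtr, ytr)
      auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])
      print(f"AUC = {auc:.3f}, nonzero coefficients = {int(np.sum(clf.coef_ != 0))}")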

  16. Multiple regression analysis in modelling of carbon dioxide emissions by energy consumption use in Malaysia

    Science.gov (United States)

    Keat, Sim Chong; Chun, Beh Boon; San, Lim Hwee; Jafri, Mohd Zubir Mat

    2015-04-01

    Climate change due to carbon dioxide (CO2) emissions is one of the most complex challenges threatening our planet. The issue is a major international concern, attributed primarily to the combustion of different fossil fuels. In this paper, a regression model is used to analyze the causal relationship between CO2 emissions and energy consumption in Malaysia, using time series data for the period 1980-2010. The equations were developed using a regression model based on the eight major sources that contribute to CO2 emissions: non-energy use, Liquefied Petroleum Gas (LPG), diesel, kerosene, refinery gas, Aviation Turbine Fuel (ATF) and Aviation Gasoline (AV Gas), fuel oil and motor petrol. Part of the data (1980-2000) was used to fit the regression model and the remainder (2001-2010) to validate it. Comparison of the model predictions with the measured data showed a high coefficient of determination (R2 = 0.9544), indicating the model's accuracy and efficiency. These results are accurate and can be used to give early warning to the population to comply with air quality standards.
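
    A minimal sketch of the described fit/validate split on synthetic data (not the Malaysian dataset): a multiple linear regression of CO2 emissions on fuel-consumption series is calibrated on 1980-2000 and checked on 2001-2010.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(4)
      years = np.arange(1980, 2011)
      fuels = rng.uniform(1, 10, size=(len(years), 8))     # eight energy sources (synthetic series)
      co2 = fuels @ np.array([2.5, 1.2, 3.0, 0.8, 1.5, 2.0, 2.8, 1.0]) \
            + rng.normal(scale=2.0, size=len(years))

      calib = years <= 2000                                # calibration period 1980-2000
      model = LinearRegression().fit(fuels[calib], co2[calib])
      pred = model.predict(fuels[~calib])                  # validation period 2001-2010
      print(f"validation R^2 = {r2_score(co2[~calib], pred):.4f}")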

  17. Better Autologistic Regression

    Directory of Open Access Journals (Sweden)

    Mark A. Wolters

    2017-11-01

    Full Text Available Autologistic regression is an important probability model for dichotomous random variables observed along with covariate information. It has been used in various fields for analyzing binary data possessing spatial or network structure. The model can be viewed as an extension of the autologistic model (also known as the Ising model, quadratic exponential binary distribution, or Boltzmann machine) to include covariates. It can also be viewed as an extension of logistic regression to handle responses that are not independent. Not all authors use exactly the same form of the autologistic regression model. Variations of the model differ in two respects. First, the variable coding—the two numbers used to represent the two possible states of the variables—might differ. Common coding choices are (zero, one) and (minus one, plus one). Second, the model might appear in either of two algebraic forms: a standard form, or a recently proposed centered form. Little attention has been paid to the effect of these differences, and the literature shows ambiguity about their importance. It is shown here that changes to either coding or centering in fact produce distinct, non-nested probability models. Theoretical results, numerical studies, and analysis of an ecological data set all show that the differences among the models can be large and practically significant. Understanding the nature of the differences and making appropriate modeling choices can lead to significantly improved autologistic regression analyses. The results strongly suggest that the standard model with plus/minus coding, which we call the symmetric autologistic model, is the most natural choice among the autologistic variants.

  18. Alternative Methods of Regression

    CERN Document Server

    Birkes, David

    2011-01-01

    Of related interest: Nonlinear Regression Analysis and its Applications, Douglas M. Bates and Donald G. Watts. "...an extraordinary presentation of concepts and methods concerning the use and analysis of nonlinear regression models... highly recommend[ed]... for anyone needing to use and/or understand issues concerning the analysis of nonlinear regression models." --Technometrics This book provides a balance between theory and practice supported by extensive displays of instructive geometrical constructs. Numerous in-depth case studies illustrate the use of nonlinear regression analysis--with all data s

  19. Time-adaptive quantile regression

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg; Madsen, Henrik

    2008-01-01

    An algorithm for time-adaptive quantile regression is presented. The algorithm is based on the simplex algorithm, and the linear optimization formulation of the quantile regression problem is given. The observations have been split to allow a direct use of the simplex algorithm. The simplex method and an updating procedure are combined into a new algorithm for time-adaptive quantile regression, which generates new solutions on the basis of the old solution, leading to savings in computation time. The suggested algorithm is tested against a static quantile regression model on a data set with wind power production, where the models combine splines and quantile regression. The comparison indicates superior performance for the time-adaptive quantile regression in all the performance parameters considered.

  20. Construction of risk prediction model of type 2 diabetes mellitus based on logistic regression

    Directory of Open Access Journals (Sweden)

    Li Jian

    2017-01-01

    Full Text Available Objective: to construct a multi-factor prediction model for the individual risk of T2DM, and to explore new ideas for early warning, prevention and personalized health services for T2DM. Methods: logistic regression techniques were used to screen the risk factors for T2DM and construct the risk prediction model. Results: Male risk prediction model: logit(P) = 0.735 × BMI − 0.671 × vegetables + 0.838 × age + 0.296 × diastolic pressure − 2.287 × physical activity − 0.009 × sleep + 0.214 × smoking. Female risk prediction model: logit(P) = 1.979 × BMI − 0.292 × vegetables + 1.355 × age + 0.522 × diastolic pressure − 2.287 × physical activity − 0.010 × sleep. The area under the ROC curve was 0.83 for males (sensitivity 0.72, specificity 0.86) and 0.84 for females (sensitivity 0.75, specificity 0.90). Conclusion: The model data come from a nested case-control study; the risk prediction model was established using mature logistic regression techniques and shows high predictive sensitivity, specificity and stability.
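
    A hedged illustration of how the published male equation would be applied in practice: the linear predictor is passed through the logistic function to obtain a probability. The input values and covariate codings used below are assumptions for demonstration only.

      import math

      def t2dm_risk_male(bmi, vegetables, age, diastolic, activity, sleep, smoking):
          """Male equation from the abstract: logit(P) = 0.735*BMI - 0.671*veg + 0.838*age
          + 0.296*DBP - 2.287*activity - 0.009*sleep + 0.214*smoking."""
          logit = (0.735 * bmi - 0.671 * vegetables + 0.838 * age
                   + 0.296 * diastolic - 2.287 * activity
                   - 0.009 * sleep + 0.214 * smoking)
          return 1.0 / (1.0 + math.exp(-logit))

      # Example call with hypothetical coded covariate values:
      print(f"predicted risk = {t2dm_risk_male(1, 0, 2, 1, 0, 1, 1):.3f}")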

  1. Dielectric monitoring of carbon nanotube network formation in curing thermosetting nanocomposites

    Energy Technology Data Exchange (ETDEWEB)

    Battisti, A; Skordos, A A; Partridge, I K, E-mail: a.battisti@cranfield.ac.u [Composites Centre, School of Applied Sciences, Cranfield University, Cranfield, Bedfordshire, MK43 0AL (United Kingdom)

    2009-08-07

    This paper focuses on monitoring of carbon nanotube (CNT) network development during the cure of unsaturated polyester nanocomposites by means of electrical impedance spectroscopy. A phenomenological model of the dielectric response is developed using equivalent circuit analysis. The model comprises two parallel RC elements connected in series, each of them giving rise to a semicircular arc in impedance complex plane plots. An established inverse modelling methodology is utilized for the estimation of the parameters of the corresponding equivalent circuit. This allows a quantification of the evolution of two separate processes corresponding to the two parallel RC elements. The high frequency process, which is attributed to CNT aggregates, shows a monotonic decrease in characteristic time during the cure. In contrast, the low frequency process, which corresponds to inter-aggregate phenomena, shows a more complex behaviour explained by the interplay between conductive network development and the cross-linking of the polymer.
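
    A minimal sketch of the equivalent-circuit model described above, two parallel RC elements connected in series, each contributing one semicircular arc in the complex impedance plane; the parameter values are arbitrary illustrations, not fitted values from the study.

      import numpy as np

      def impedance_two_rc(freq, r1, c1, r2, c2):
          """Complex impedance of two parallel RC elements connected in series."""
          w = 2 * np.pi * freq
          z1 = r1 / (1 + 1j * w * r1 * c1)    # high-frequency process (CNT aggregates)
          z2 = r2 / (1 + 1j * w * r2 * c2)    # low-frequency, inter-aggregate process
          return z1 + z2

      freq = np.logspace(-2, 6, 200)           # Hz
      z = impedance_two_rc(freq, r1=1e3, c1=1e-9, r2=1e5, c2=1e-6)
      print("tau_high = %.1e s, tau_low = %.1e s" % (1e3 * 1e-9, 1e5 * 1e-6))  # characteristic times R*C
      print("Re(Z) range:", z.real.min(), "to", z.real.max())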

  2. Dielectric monitoring of carbon nanotube network formation in curing thermosetting nanocomposites

    Science.gov (United States)

    Battisti, A.; Skordos, A. A.; Partridge, I. K.

    2009-08-01

    This paper focuses on monitoring of carbon nanotube (CNT) network development during the cure of unsaturated polyester nanocomposites by means of electrical impedance spectroscopy. A phenomenological model of the dielectric response is developed using equivalent circuit analysis. The model comprises two parallel RC elements connected in series, each of them giving rise to a semicircular arc in impedance complex plane plots. An established inverse modelling methodology is utilized for the estimation of the parameters of the corresponding equivalent circuit. This allows a quantification of the evolution of two separate processes corresponding to the two parallel RC elements. The high frequency process, which is attributed to CNT aggregates, shows a monotonic decrease in characteristic time during the cure. In contrast, the low frequency process, which corresponds to inter-aggregate phenomena, shows a more complex behaviour explained by the interplay between conductive network development and the cross-linking of the polymer.

  3. Dielectric monitoring of carbon nanotube network formation in curing thermosetting nanocomposites

    International Nuclear Information System (INIS)

    Battisti, A; Skordos, A A; Partridge, I K

    2009-01-01

    This paper focuses on monitoring of carbon nanotube (CNT) network development during the cure of unsaturated polyester nanocomposites by means of electrical impedance spectroscopy. A phenomenological model of the dielectric response is developed using equivalent circuit analysis. The model comprises two parallel RC elements connected in series, each of them giving rise to a semicircular arc in impedance complex plane plots. An established inverse modelling methodology is utilized for the estimation of the parameters of the corresponding equivalent circuit. This allows a quantification of the evolution of two separate processes corresponding to the two parallel RC elements. The high frequency process, which is attributed to CNT aggregates, shows a monotonic decrease in characteristic time during the cure. In contrast, the low frequency process, which corresponds to inter-aggregate phenomena, shows a more complex behaviour explained by the interplay between conductive network development and the cross-linking of the polymer.

  4. THE REGRESSION MODEL OF IRAN LIBRARIES ORGANIZATIONAL CLIMATE

    OpenAIRE

    Jahani, Mohammad Ali; Yaminfirooz, Mousa; Siamian, Hasan

    2015-01-01

    Background: The purpose of this study was to draw a regression model of the organizational climate of the central libraries of Iran's universities. Methods: This study is applied research. The statistical population consisted of 96 employees of the central libraries of Iran's public universities, selected among the 117 universities affiliated to the Ministry of Health by a stratified sampling method (510 people). A localized ClimateQual questionnaire was used as the research tool. For pr...

  5. Radiation sources EB and UV curing machines

    Energy Technology Data Exchange (ETDEWEB)

    Sasaki, Takashi [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment

    1994-12-31

    This paper describes electron beam processors and related technologies for curing applications, to assist industrial personnel who are trying to understand and evaluate the applicability and benefits of radiation curing for their products. 4 tabs., 10 figs.

  6. Radiation sources EB and UV curing machines

    International Nuclear Information System (INIS)

    Takashi Sasaki

    1993-01-01

    This paper describes electron beam processors and related technologies for curing applications, to assist industrial personnel who are trying to understand and evaluate the applicability and benefits of radiation curing for their products. 4 tabs., 10 figs.

  7. What is radiation curing

    International Nuclear Information System (INIS)

    Kinstle, J.F.

    1975-01-01

    Radiation curing is a highly interdisciplinary and sophisticated field. Successful interplay between chemists and engineers of various disciplines is required. Throughout the research-development-applications cycle, two disciplines for which hybridization is extremely important are radiation chemistry and polymer chemistry. The molecular level effects caused by absorbed radiation depend strongly on the type and intensity of the radiation. Efficient utilization of the radiation to effect desired transformations in a monomer and/or polymer system, and maximization of final properties, depend on well-planned polymer synthesis and system formulation. The elementary basis of these two disciplines and the manner in which they necessarily coalesce in the field of radiation curing are reviewed

  8. UV-cured methacrylic-silica hybrids: Effect of oxygen inhibition on photo-curing kinetics

    International Nuclear Information System (INIS)

    Corcione, C. Esposito; Striani, R.; Frigione, M.

    2014-01-01

    Highlights: • The kinetic behavior of a novel photopolymerizable organic–inorganic hybrid system was studied as a function of the composition and of the reaction atmosphere. • The UV-curing reaction of the hybrid mixture was found to be fast and complete. • The combined presence of a thiol monomer and nanostructured silica reduces the effect of oxygen inhibition on the radical photopolymerization. - Abstract: The kinetic behavior of innovative photopolymerizable UV-cured methacrylic–silica hybrid formulations, previously developed, was studied and compared to that of a reference control system. The organic–inorganic (O–I) hybrids proposed in this study are obtained from organic precursors with a high siloxane content mixed with tetraethoxysilane (TEOS) in such a way as to produce co-continuous silica nano-domains dispersed within a cross-linked organic phase, as a result of hydrolysis and condensation reactions. The kinetics of the radical photopolymerization mechanism induced by UV radiation, in the presence of a suitable photoinitiator, were studied by calorimetric, FTIR and Raman spectroscopic analyses, varying the composition of the mixtures and the reaction atmosphere. The well-known effect of oxygen on the kinetic mechanism of the free radical photopolymerization of the methacrylic–siloxane based monomers was found to be strongly reduced in the hybrid system, especially when a proper thiol was used. The experimental calorimetric data were fitted using a simple kinetic model for radical photopolymerization reactions, obtaining good agreement between the experimental data and the theoretical model. From the comparison of the kinetic constants calculated for the control and hybrid systems, it was possible to assess the effect of the composition, as well as of the atmosphere used during the photopolymerization process, on the kinetics of the photopolymerization reaction.
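
    Not the authors' model code: a minimal sketch of fitting a simple nth-order rate law, d(alpha)/dt = k(1 - alpha)^n, to synthetic photo-DSC conversion-rate data, as one plausible reading of the "simple kinetic model for radical photopolymerization reactions" mentioned above.

      import numpy as np
      from scipy.optimize import curve_fit

      def rate_model(alpha, k, n):
          """nth-order kinetic model: conversion rate as a function of conversion alpha."""
          return k * (1.0 - alpha) ** n

      rng = np.random.default_rng(5)
      alpha = np.linspace(0.0, 0.95, 60)
      true_k, true_n = 0.12, 1.5
      rate = rate_model(alpha, true_k, true_n) * (1 + rng.normal(scale=0.03, size=alpha.size))

      (k_fit, n_fit), _ = curve_fit(rate_model, alpha, rate, p0=(0.1, 1.0))
      print(f"fitted k = {k_fit:.3f} 1/s, n = {n_fit:.2f}")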

  9. Harmonic regression of Landsat time series for modeling attributes from national forest inventory data

    Science.gov (United States)

    Wilson, Barry T.; Knight, Joseph F.; McRoberts, Ronald E.

    2018-03-01

    Imagery from the Landsat Program has been used frequently as a source of auxiliary data for modeling land cover, as well as a variety of attributes associated with tree cover. With ready access to all scenes in the archive since 2008 due to the USGS Landsat Data Policy, new approaches to deriving such auxiliary data from dense Landsat time series are required. Several methods have previously been developed for use with finer temporal resolution imagery (e.g. AVHRR and MODIS), including image compositing and harmonic regression using Fourier series. The manuscript presents a study, using Minnesota, USA during the years 2009-2013 as the study area and timeframe. The study examined the relative predictive power of land cover models, in particular those related to tree cover, using predictor variables based solely on composite imagery versus those using estimated harmonic regression coefficients. The study used two common non-parametric modeling approaches (i.e. k-nearest neighbors and random forests) for fitting classification and regression models of multiple attributes measured on USFS Forest Inventory and Analysis plots using all available Landsat imagery for the study area and timeframe. The estimated Fourier coefficients developed by harmonic regression of tasseled cap transformation time series data were shown to be correlated with land cover, including tree cover. Regression models using estimated Fourier coefficients as predictor variables showed a two- to threefold increase in explained variance for a small set of continuous response variables, relative to comparable models using monthly image composites. Similarly, the overall accuracies of classification models using the estimated Fourier coefficients were approximately 10-20 percentage points higher than the models using the image composites, with corresponding individual class accuracies between six and 45 percentage points higher.
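
    A minimal sketch of harmonic regression on a single (synthetic) per-pixel time series: the signal is regressed on annual and semi-annual Fourier terms of the day of year, and the estimated coefficients are the kind of predictor variables described above.

      import numpy as np

      rng = np.random.default_rng(6)
      doy = np.sort(rng.integers(1, 366, size=80))         # acquisition days of year
      t = 2 * np.pi * doy / 365.25
      signal = 0.3 + 0.2 * np.cos(t - 1.0) + rng.normal(scale=0.02, size=doy.size)

      # Design matrix: intercept, annual and semi-annual harmonics
      X = np.column_stack([np.ones_like(t), np.cos(t), np.sin(t),
                           np.cos(2 * t), np.sin(2 * t)])
      coef, *_ = np.linalg.lstsq(X, signal, rcond=None)
      print("estimated Fourier coefficients:", np.round(coef, 3))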

  10. An appraisal of convergence failures in the application of logistic regression model in published manuscripts.

    Science.gov (United States)

    Yusuf, O B; Bamgboye, E A; Afolabi, R F; Shodimu, M A

    2014-09-01

    The logistic regression model is widely used in health research for descriptive and predictive purposes. Unfortunately, researchers are sometimes not aware that the underlying principles of the technique fail when the maximum likelihood algorithm does not converge. Young researchers, particularly postgraduate students, may not know why separation, whether quasi-complete or complete, occurs, how to identify it, and how to fix it. This study was designed to critically evaluate convergence issues in articles employing logistic regression analysis published in the African Journal of Medicine and Medical Sciences between 2004 and 2013. Problems of quasi-complete and complete separation were described and illustrated with the National Demographic and Health Survey dataset. A critical evaluation of articles that employed logistic regression was conducted. A total of 581 articles was reviewed, of which 40 (6.9%) used binary logistic regression. Twenty-four (60.0%) stated the use of a logistic regression model in the methodology, while none of the articles assessed model fit. Only 3 (12.5%) properly described the procedures. Of the 40 that used the logistic regression model, the problem of convergence occurred in 6 (15.0%) of the articles. Logistic regression tends to be poorly reported in studies published between 2004 and 2013. Our findings show that the procedure may not be well understood by researchers, since very few described the process in their reports, and researchers may be totally unaware of the problem of convergence or how to deal with it.
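
    A hedged numerical illustration of complete separation: a predictor that perfectly determines the outcome makes the maximum likelihood estimate non-existent, so the fit either fails or returns exploding coefficients and standard errors. The diagnostic shown, non-overlapping class-conditional ranges, is one simple check.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 100
      x = rng.normal(size=n)
      y = (x > 0).astype(int)                  # outcome determined exactly by x: complete separation

      # Simple diagnostic: do the class-conditional ranges of x overlap?
      print("ranges overlap:", x[y == 1].min() < x[y == 0].max())

      try:
          res = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
          # If the fit 'succeeds', the coefficient and standard error blow up.
          print("params:", res.params, "std errors:", res.bse)
      except Exception as exc:                 # older statsmodels versions raise PerfectSeparationError
          print("fit failed:", exc)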

  11. Modeling of the Monthly Rainfall-Runoff Process Through Regressions

    Directory of Open Access Journals (Sweden)

    Campos-Aranda Daniel Francisco

    2014-10-01

    Full Text Available In assessing the water resources of a river, modeling of the rainfall-runoff process (RRP) allows missing runoff data to be deduced and the runoff record to be extended, since the available information on precipitation is generally longer. It also enables the estimation of inflows to reservoirs whose construction led to the suppression of the gauging station. The simplest mathematical model that can be set up for the RRP is a linear regression, or curve, on a monthly basis. Such a model is described in detail and is calibrated with the simultaneous record of monthly rainfall and runoff at the Ballesmi hydrometric station, which covers 35 years. Since the runoff at this station has an important contribution from spring discharge, the record is first corrected by removing that contribution. For this purpose a procedure was developed based either on monthly average regional runoff coefficients or on a nearby, similar watershed; in this case the Tancuilín gauging station was used. Both stations belong to Partial Hydrologic Region No. 26 (Lower Rio Panuco) and are located within the state of San Luis Potosí, México. The study indicates that the monthly regression model, due to its conceptual approach, faithfully reproduces monthly average runoff volumes and approximates the dispersion well, as shown by the calculated means and standard deviations.

  12. Genetic evaluation of European quails by random regression models

    Directory of Open Access Journals (Sweden)

    Flaviana Miranda Gonçalves

    2012-09-01

    Full Text Available The objective of this study was to compare different random regression models, defined from different classes of heterogeneity of variance combined with different Legendre polynomial orders, for the estimation of (co)variances in quails. The data came from 28,076 observations of 4,507 female meat quails of the LF1 lineage. Quail body weights were determined at birth and at 1, 14, 21, 28, 35 and 42 days of age. Six different classes of residual variance were fitted with Legendre polynomial functions (orders ranging from 2 to 6) to determine which model had the best fit to describe the (co)variance structures as a function of time. According to the evaluated criteria (AIC, BIC and LRT), the model with six classes of residual variances and a sixth-order Legendre polynomial was the best fit. The estimated additive genetic variance increased from birth to 28 days of age, and dropped slightly from 35 to 42 days. The heritability estimates decreased along the growth curve and changed from 0.51 (1 day) to 0.16 (42 days). Animal genetic and permanent environmental correlation estimates between weights and age classes were always high and positive, except for birth weight. The sixth-order Legendre polynomial, along with the residual variance divided into six classes, was the best fit for the growth rate curve of meat quails; therefore, they should be considered in breeding evaluation processes using random regression models.
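
    A minimal sketch of the Legendre covariates used in such random regression models: ages are rescaled to [-1, 1] and a Legendre design matrix of the chosen order is built. This shows only the basis-construction step, not the full (co)variance estimation.

      import numpy as np
      from numpy.polynomial.legendre import legvander

      ages = np.array([1, 14, 21, 28, 35, 42], dtype=float)      # days of age
      a_min, a_max = ages.min(), ages.max()
      std_age = -1.0 + 2.0 * (ages - a_min) / (a_max - a_min)    # rescale ages to [-1, 1]

      order = 6                                                  # six basis functions P0 ... P5
      Phi = legvander(std_age, order - 1)                        # one row per age, one column per polynomial
      print(np.round(Phi, 3))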

  13. Excimer Laser Curing Of Polymer Coatings

    Science.gov (United States)

    Klick, David; Akerman, M. Alfred; Paul, George L.; Supurovic, Darko; Tsuda, Haruki

    1988-12-01

    The use of the excimer laser as a source of energy for photo-assisted curing of industrial polymeric coatings was investigated. Presently, UV lamps are sometimes used to excite a photoinitiating molecule mixed with the starting monomers and oligomers of a coating. The resulting polymeric chain reaction multiplies the effect of the initial photons, making economical use of the light source. The high cost of laser photons may thus be justifiable if lasers provide advantages over lamps. A series of visibly transparent 7 μm coatings (a typical thickness for 'slick' magazine coatings) with various photoinitiators, monomers, and oligomers was illuminated with excimer laser light of various wavelengths, fluences, and pulse repetition rates. For the optimum parameters, it was found that the laser had large advantages in curing speed over existing UV lamp processes, due to its monochromaticity. Pigmented coatings (20 μm TiO2 mixtures typical of appliance or automotive finishes) are not easily cured with UV lamps due to the inability of light to penetrate the absorbing and scattering pigmented layer. However, economically-viable cure rates were achieved with certain photoinitiators using a tunable excimer-pumped dye laser. A prototype of such a laser suitable for factory use was built and used to cure these coatings. Results are scaled to a factory situation, and costs are calculated to show the advantages of the laser method over currently used processes.

  14. Significance tests to determine the direction of effects in linear regression models.

    Science.gov (United States)

    Wiedermann, Wolfgang; Hagmann, Michael; von Eye, Alexander

    2015-02-01

    Previous studies have discussed asymmetric interpretations of the Pearson correlation coefficient and have shown that higher moments can be used to decide on the direction of dependence in the bivariate linear regression setting. The current study extends this approach by illustrating that the third moment of regression residuals may also be used to derive conclusions concerning the direction of effects. Assuming non-normally distributed variables, it is shown that the distribution of residuals of the correctly specified regression model (e.g., Y is regressed on X) is more symmetric than the distribution of residuals of the competing model (i.e., X is regressed on Y). Based on this result, 4 one-sample tests are discussed which can be used to decide which variable is more likely to be the response and which one is more likely to be the explanatory variable. A fifth significance test is proposed based on the differences of skewness estimates, which leads to a more direct test of a hypothesis that is compatible with direction of dependence. A Monte Carlo simulation study was performed to examine the behaviour of the procedures under various degrees of associations, sample sizes, and distributional properties of the underlying population. An empirical example is given which illustrates the application of the tests in practice. © 2014 The British Psychological Society.
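
    A minimal numerical illustration of the idea above (synthetic data): with a non-normally distributed predictor, the residuals of the correctly specified regression (y on x) tend to be more symmetric, i.e. have smaller absolute skewness, than the residuals of the reversed regression (x on y).

      import numpy as np
      from scipy.stats import skew, linregress

      rng = np.random.default_rng(8)
      n = 5000
      x = rng.exponential(size=n)                  # skewed explanatory variable
      y = 0.7 * x + rng.normal(scale=1.0, size=n)  # true direction of effect: x -> y

      def resid_skew(a, b):
          """Skewness of residuals from regressing b on a."""
          fit = linregress(a, b)
          return skew(b - (fit.intercept + fit.slope * a))

      print("skewness of residuals, y on x:", round(resid_skew(x, y), 3))
      print("skewness of residuals, x on y:", round(resid_skew(y, x), 3))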

  15. Weighted functional linear regression models for gene-based association analysis.

    Science.gov (United States)

    Belonogova, Nadezhda M; Svishcheva, Gulnara R; Wilson, James F; Campbell, Harry; Axenovich, Tatiana I

    2018-01-01

    Functional linear regression models are effectively used in gene-based association analysis of complex traits. These models combine information about individual genetic variants, taking into account their positions and reducing the influence of noise and/or observation errors. To increase the power of methods, where several differently informative components are combined, weights are introduced to give the advantage to more informative components. Allele-specific weights have been introduced to collapsing and kernel-based approaches to gene-based association analysis. Here we have for the first time introduced weights to functional linear regression models adapted for both independent and family samples. Using data simulated on the basis of GAW17 genotypes and weights defined by allele frequencies via the beta distribution, we demonstrated that type I errors correspond to declared values and that increasing the weights of causal variants allows the power of functional linear models to be increased. We applied the new method to real data on blood pressure from the ORCADES sample. Five of the six known genes were detected by the weighted functional models. Moreover, we found an association between diastolic blood pressure and the VMP1 gene (P = 8.18×10-6) when we used a weighted functional model. For this gene, the unweighted functional and weighted kernel-based models had P = 0.004 and 0.006, respectively. The new method has been implemented in the program package FREGAT, which is freely available at https://cran.r-project.org/web/packages/FREGAT/index.html.

  16. A model study on color and related structural properties of cured porcine batters

    NARCIS (Netherlands)

    Palombo, R.

    1990-01-01

    Color, determined by tristimulus colorimeters, and related structural properties, i.e., microstructure, surface rheology, and bulk rheology, of cured porcine meat batters were studied.

    Effects of various processing factors (such as temperature, air pressure during chopping, and

  17. Improving the Prediction of Total Surgical Procedure Time Using Linear Regression Modeling

    Directory of Open Access Journals (Sweden)

    Eric R. Edelman

    2017-06-01

    Full Text Available For efficient utilization of operating rooms (ORs, accurate schedules of assigned block time and sequences of patient cases need to be made. The quality of these planning tools is dependent on the accurate prediction of total procedure time (TPT per case. In this paper, we attempt to improve the accuracy of TPT predictions by using linear regression models based on estimated surgeon-controlled time (eSCT and other variables relevant to TPT. We extracted data from a Dutch benchmarking database of all surgeries performed in six academic hospitals in The Netherlands from 2012 till 2016. The final dataset consisted of 79,983 records, describing 199,772 h of total OR time. Potential predictors of TPT that were included in the subsequent analysis were eSCT, patient age, type of operation, American Society of Anesthesiologists (ASA physical status classification, and type of anesthesia used. First, we computed the predicted TPT based on a previously described fixed ratio model for each record, multiplying eSCT by 1.33. This number is based on the research performed by van Veen-Berkx et al., which showed that 33% of SCT is generally a good approximation of anesthesia-controlled time (ACT. We then systematically tested all possible linear regression models to predict TPT using eSCT in combination with the other available independent variables. In addition, all regression models were again tested without eSCT as a predictor to predict ACT separately (which leads to TPT by adding SCT. TPT was most accurately predicted using a linear regression model based on the independent variables eSCT, type of operation, ASA classification, and type of anesthesia. This model performed significantly better than the fixed ratio model and the method of predicting ACT separately. Making use of these more accurate predictions in planning and sequencing algorithms may enable an increase in utilization of ORs, leading to significant financial and productivity related

  18. Improving the Prediction of Total Surgical Procedure Time Using Linear Regression Modeling.

    Science.gov (United States)

    Edelman, Eric R; van Kuijk, Sander M J; Hamaekers, Ankie E W; de Korte, Marcel J M; van Merode, Godefridus G; Buhre, Wolfgang F F A

    2017-01-01

    For efficient utilization of operating rooms (ORs), accurate schedules of assigned block time and sequences of patient cases need to be made. The quality of these planning tools is dependent on the accurate prediction of total procedure time (TPT) per case. In this paper, we attempt to improve the accuracy of TPT predictions by using linear regression models based on estimated surgeon-controlled time (eSCT) and other variables relevant to TPT. We extracted data from a Dutch benchmarking database of all surgeries performed in six academic hospitals in The Netherlands from 2012 till 2016. The final dataset consisted of 79,983 records, describing 199,772 h of total OR time. Potential predictors of TPT that were included in the subsequent analysis were eSCT, patient age, type of operation, American Society of Anesthesiologists (ASA) physical status classification, and type of anesthesia used. First, we computed the predicted TPT based on a previously described fixed ratio model for each record, multiplying eSCT by 1.33. This number is based on the research performed by van Veen-Berkx et al., which showed that 33% of SCT is generally a good approximation of anesthesia-controlled time (ACT). We then systematically tested all possible linear regression models to predict TPT using eSCT in combination with the other available independent variables. In addition, all regression models were again tested without eSCT as a predictor to predict ACT separately (which leads to TPT by adding SCT). TPT was most accurately predicted using a linear regression model based on the independent variables eSCT, type of operation, ASA classification, and type of anesthesia. This model performed significantly better than the fixed ratio model and the method of predicting ACT separately. Making use of these more accurate predictions in planning and sequencing algorithms may enable an increase in utilization of ORs, leading to significant financial and productivity related benefits.
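
    A hedged sketch comparing the fixed-ratio prediction (TPT of roughly 1.33 x eSCT) with a simple linear regression of TPT on eSCT plus one categorical factor; the data, coefficients and the ASA coding below are synthetic stand-ins, not the benchmarking database.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.metrics import mean_absolute_error

      rng = np.random.default_rng(9)
      n = 1000
      esct = rng.uniform(30, 300, size=n)                  # estimated surgeon-controlled time (min)
      asa = rng.integers(1, 4, size=n)                     # ASA class 1-3 (illustrative coding)
      tpt = 1.25 * esct + 15 * asa + rng.normal(scale=20, size=n)

      fixed_ratio_pred = 1.33 * esct                       # fixed-ratio prediction
      X = np.column_stack([esct, asa])
      reg_pred = LinearRegression().fit(X, tpt).predict(X) # regression prediction

      print("MAE fixed ratio:", round(mean_absolute_error(tpt, fixed_ratio_pred), 1))
      print("MAE regression :", round(mean_absolute_error(tpt, reg_pred), 1))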

  19. Bulk-Fill Composites: Effectiveness of Cure With Poly- and Monowave Curing Lights and Modes.

    Science.gov (United States)

    Gan, J K; Yap, A U; Cheong, J W; Arista, N; Tan, Cbk

    This study compared the effectiveness of cure of bulk-fill composites using polywave light-emitting diode (LED; with various curing modes), monowave LED, and conventional halogen curing lights. The bulk-fill composites evaluated were Tetric N-Ceram bulk-fill (TNC), which contained a novel germanium photoinitiator (Ivocerin), and Smart Dentin Replacement (SDR). The composites were placed into black polyvinyl molds with cylindrical recesses of 4-mm height and 3-mm diameter and photopolymerized as follows: Bluephase N Polywave High (NH), 1200 mW/cm² (10 seconds); Bluephase N Polywave Low (NL), 650 mW/cm² (18.5 seconds); Bluephase N Polywave soft-start (NS), 0-650 mW/cm² (5 seconds) → 1200 mW/cm² (10 seconds); Bluephase N Monowave (NM), 800 mW/cm² (15 seconds); QHL75 (QH), 550 mW/cm² (21.8 seconds). Total energy output was fixed at 12,000 mJ/cm² for all lights/modes, with the exception of NS. The cured specimens were stored in a light-proof container at 37°C for 24 hours, and hardness (Knoop Hardness Number) of the top and bottom surfaces of the specimens was determined using a Knoop microhardness tester (n=6). Hardness data and bottom-to-top hardness ratios were subjected to statistical analysis using one-way analysis of variance/Scheffe's post hoc test at a significance level of 0.05. Hardness ratios ranged from 38.43% ± 5.19% to 49.25% ± 6.38% for TNC and 50.67% ± 1.54% to 67.62% ± 6.96% for SDR. For both bulk-fill composites, the highest hardness ratios were obtained with NM and the lowest hardness ratios with NL. While no significant difference in hardness ratios was observed between curing lights/modes for TNC, the hardness ratio obtained with NM was significantly higher than the hardness ratio obtained for NL for SDR.

  20. A Fourier transform Raman spectroscopy analysis of the degree of conversion of a universal hybrid resin composite cured with light-emitting diode curing units.

    Science.gov (United States)

    Lindberg, Anders; Emami, Nazanin; van Dijken, Jan W V

    2005-01-01

    The degree of conversion (DC) of a universal hybrid resin composite, cured with LED curing units of low and high power densities and with a 510 mW/cm2 quartz tungsten halogen unit, was investigated with Fourier Transform Raman spectroscopy. Three curing depths (0, 2, 4 mm) and 0 and 7 mm light guide tip - resin composite (LT-RC) distances were tested. The DC of the LED units varied between 52.3% and 59.8% at the top surface and between 46.4% and 57.0% at 4 mm depth. The DC of specimens cured with a 0 mm LT-RC distance at 4 mm depth varied between 50.8% and 57.0%, and with a 7 mm distance between 46.4% and 55.4%. The low power density LED unit showed a significantly lower DC for both distances at all depth levels compared to the other curing units (p < 0.05). Significant differences between the other curing units were only found at the 4 mm depth level cured from a 7 mm distance (p < 0.05). It can be concluded that the improved LED curing units could cure the studied resin composite to the same DC as the control unit.

  1. Linearity and Misspecification Tests for Vector Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    Teräsvirta, Timo; Yang, Yukai

    The purpose of the paper is to derive Lagrange multiplier and Lagrange multiplier type specification and misspecification tests for vector smooth transition regression models. We report results from simulation studies in which the size and power properties of the proposed asymptotic tests in small...

  2. Cure kinetics and chemorheology of EPDM/graphene oxide nanocomposites

    Energy Technology Data Exchange (ETDEWEB)

    Allahbakhsh, Ahmad [Department of Polymer Engineering, Islamic Azad University, South Tehran Branch, 17776-13651 Tehran (Iran, Islamic Republic of); Mazinani, Saeedeh, E-mail: s.mazinani@aut.ac.ir [Amirkabir Nanotechnology Research Institute (ANTRI), Amirkabir University of Technology, Tehran (Iran, Islamic Republic of); Kalaee, Mohammad Reza [Department of Polymer Engineering, Islamic Azad University, South Tehran Branch, 17776-13651 Tehran (Iran, Islamic Republic of); Sharif, Farhad [Department of Polymer Engineering and Color Technology, Amirkabir University of Technology, Tehran (Iran, Islamic Republic of)

    2013-07-10

    Graphical abstract: - Highlights: • Graphene oxide content and dispersion as effective parameters on cure kinetics. • Graphene oxide as an effective controlling factor of crosslink density. • Interaction of graphene oxide with the curing system (ZnO) during the curing process. - Abstract: In this study, the effect of graphene oxide on the cure behavior of an ethylene–propylene–diene rubber (EPDM) nanocomposite is studied. In this regard, the cure kinetics of the nanocomposite are studied employing different empirical methods. The required activation energy of the nth-order cure process increases by about 160 kJ/mol at 5 phr graphene oxide loading compared with 1 phr loading. However, the required activation energy is significantly reduced by the incorporation of graphene oxide in the nanocomposites compared with the neat EPDM sample. Furthermore, the effect of graphene oxide on the structural properties of the nanocomposites during the cure process is studied using X-ray diffraction, scanning electron microscopy and Fourier transform infrared spectrometry. As the results show, graphene oxide markedly affects the structure of zinc oxide during the vulcanization process. This behavior is probably related to the high tendency of zinc oxide to react with the oxidized surface of graphene oxide.

  3. UV curing of a liquid based bismaleimide-containing polymer system

    Directory of Open Access Journals (Sweden)

    2007-06-01

    Full Text Available A new liquid formulation of a commercial bismaleimide and n-acryloylmorpholine was prepared that could be UV cured as an alternative to the traditional thermal cure methods presently used for BMI in industry. UV curing was shown to be an efficient method which promoted the reaction rate significantly and was able to achieve this at low temperatures (30–50°C). A free radical polymerization approach has been used to explain the cure mechanism and cure kinetics, using data elucidated from DPC and FTIR. The cured thin film was shown to achieve very high thermal stability (~400°C), with the BMI shown to retard the thermal degradation temperature and rate.

  4. Regularized multivariate regression models with skew-t error distributions

    KAUST Repository

    Chen, Lianfu

    2014-06-01

    We consider regularization of the parameters in multivariate linear regression models with the errors having a multivariate skew-t distribution. An iterative penalized likelihood procedure is proposed for constructing sparse estimators of both the regression coefficient and inverse scale matrices simultaneously. The sparsity is introduced through penalizing the negative log-likelihood by adding L1-penalties on the entries of the two matrices. Taking advantage of the hierarchical representation of skew-t distributions, and using the expectation conditional maximization (ECM) algorithm, we reduce the problem to penalized normal likelihood and develop a procedure to minimize the ensuing objective function. Using a simulation study the performance of the method is assessed, and the methodology is illustrated using a real data set with a 24-dimensional response vector. © 2014 Elsevier B.V.

  5. Radiation curing: Science and technology

    International Nuclear Information System (INIS)

    Pappas, S.P.

    1992-01-01

    The science and technology of radiation curing have progressed substantially within the last 20 years. Nevertheless, radiation-curable compositions typically command relatively small shares in many of their competitive markets. This situation signifies that potential advantages of radiation curing are not generally perceived to overcome their limitations. An important objective of this book is to address this issue, within the scope of the subjects offered, by providing the present state of knowledge and by identifying the directions and challenges for future studies. The first chapter introduces radiation curing. Chapter 2 offers the first systematic presentation of inorganic and organometallic photoinitiators. Chapters 3 and 4 present the analytical techniques of photocalorimetry and real-time infrared spectroscopy, respectively. Recent advances in resin technology are offered in Chapters 5 and 6, which constitute the first comprehensive accounts of (meth)acrylated silicones and vinyl ethers, respectively. Radiation-curable coatings, printing inks, and adhesives are discussed in Chapters 7-9, respectively. Chapter 10 offers a discussion on photopolymer imaging systems

  6. Techniques and materials for internal water curing of concrete

    DEFF Research Database (Denmark)

    Jensen, Ole Mejlhede; Lura, Pietro

    2006-01-01

    This paper gives an overview of different techniques for incorporation of internal curing water in concrete. Internal water curing can be used to mitigate self-desiccation and self-desiccation shrinkage. Some concretes may need 50 kg/m3 of internal curing water for this purpose. The price...

  7. Quantile regression theory and applications

    CERN Document Server

    Davino, Cristina; Vistocco, Domenico

    2013-01-01

    A guide to the implementation and interpretation of quantile regression models. This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of the book is to provide the reader with a comprehensive description of the main issues concerning quantile regression, including basic modeling, geometrical interpretation, estimation and inference, as well as model validity and diagnostic tools. Each methodological aspect is explored and
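
    A minimal illustration of fitting conditional quantiles, assuming the statsmodels formula interface; the data and the chosen quantile levels are invented for the example.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Heteroscedastic toy data: the upper quantiles fan out more than the median.
    rng = np.random.default_rng(1)
    df = pd.DataFrame({"x": rng.uniform(0, 10, 500)})
    df["y"] = 2.0 + 0.5 * df["x"] + rng.normal(scale=(1 + 0.3 * df["x"]).to_numpy())

    for q in (0.5, 0.9):                         # median and 90th conditional quantile
        fit = smf.quantreg("y ~ x", df).fit(q=q)
        print(q, fit.params.round(3).to_dict())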

  8. Regression models in the determination of the absorbed dose with extrapolation chamber for ophthalmological applicators

    International Nuclear Information System (INIS)

    Alvarez R, J.T.; Morales P, R.

    1992-06-01

    The absorbed dose to soft-tissue-equivalent material imparted by ophthalmological applicators (90Sr/90Y, 1850 MBq) is determined using an extrapolation chamber with variable electrode separation. When the slope of the extrapolation curve is estimated with a simple linear regression model, the dose values are underestimated by 17.7 to 20.4 percent relative to the estimates obtained with a second-degree polynomial regression model; at the same time, the standard error improves by up to 50% for the quadratic model. Finally, the global uncertainty of the dose is presented, taking into account the reproducibility of the experimental arrangement. It is concluded that, in experimental arrangements where the source is in contact with the extrapolation chamber, the linear regression model should be replaced by the quadratic regression model when determining the slope of the extrapolation curve, for more exact and accurate measurements of the absorbed dose. (Author)
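
    The difference between the two slope estimates can be illustrated with a short sketch; the electrode separations and chamber readings below are purely illustrative and are not the measurements of the record.

    import numpy as np

    # Illustrative extrapolation-curve data: chamber reading vs electrode separation (mm).
    depth = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
    reading = np.array([0.52, 1.07, 1.66, 2.29, 2.96, 3.67])

    lin = np.polyfit(depth, reading, 1)          # [slope, intercept]
    quad = np.polyfit(depth, reading, 2)         # [a, b, c]; slope at zero depth is b

    print("linear slope           :", round(lin[0], 3))
    print("quadratic slope at d=0 :", round(quad[1], 3))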

  9. ON THE EFFECTS OF THE PRESENCE AND METHODS OF THE ELIMINATION HETEROSCEDASTICITY AND AUTOCORRELATION IN THE REGRESSION MODEL

    Directory of Open Access Journals (Sweden)

    Nina L. Timofeeva

    2014-01-01

    The article presents the methodological and technical bases for the creation of regression models that adequately reflect reality. The focus is on methods for removing residual autocorrelation in models. Algorithms for eliminating heteroscedasticity and autocorrelation of the regression model residuals are given: the reweighted least squares method and the Cochrane-Orcutt procedure. A model of "pure" regression is built, and a standardized form of the regression equation is used to compare the effects of different explanatory variables on the dependent variable when those variables are expressed in different units. A scheme of techniques for mitigating heteroscedasticity and autocorrelation in regression models specific to the social and cultural sphere is developed.
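
    A minimal sketch of one Cochrane-Orcutt iteration on simulated data: estimate the AR(1) coefficient from the OLS residuals, quasi-difference the data, and refit. All numbers are illustrative.

    import numpy as np

    # Simulate a regression with AR(1) errors (rho = 0.7).
    rng = np.random.default_rng(2)
    n = 300
    x = rng.normal(size=n)
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = 0.7 * e[t - 1] + rng.normal(scale=0.5)
    y = 1.0 + 2.0 * x + e

    # Step 1: OLS fit and estimate of rho from the residuals.
    X = np.column_stack([np.ones(n), x])
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_ols
    rho = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)

    # Step 2: quasi-difference the data and refit by OLS.
    y_star = y[1:] - rho * y[:-1]
    X_star = X[1:] - rho * X[:-1]
    beta_co = np.linalg.lstsq(X_star, y_star, rcond=None)[0]
    print("estimated rho:", round(rho, 3), "coefficients:", np.round(beta_co, 3))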

  10. A generalized multivariate regression model for modelling ocean wave heights

    Science.gov (United States)

    Wang, X. L.; Feng, Y.; Swail, V. R.

    2012-04-01

    In this study, a generalized multivariate linear regression model is developed to represent the relationship between 6-hourly ocean significant wave heights (Hs) and the corresponding 6-hourly mean sea level pressure (MSLP) fields. The model is calibrated using the ERA-Interim reanalysis of Hs and MSLP fields for 1981-2000, and is validated using the ERA-Interim reanalysis for 2001-2010 and the ERA40 reanalysis of Hs and MSLP for 1958-2001. The performance of the fitted model is evaluated in terms of the Peirce skill score, frequency bias index, and correlation skill score. Because wave heights are not normally distributed, they are subjected to a data-adaptive Box-Cox transformation before being used in the model fitting. Also, since 6-hourly data are being modelled, lag-1 autocorrelation must be and is accounted for. The models with and without Box-Cox transformation, and with and without accounting for autocorrelation, are inter-compared in terms of their prediction skills. The fitted MSLP-Hs relationship is then used to reconstruct historical wave height climate from the 6-hourly MSLP fields taken from the Twentieth Century Reanalysis (20CR, Compo et al. 2011), and to project possible future wave height climates using CMIP5 model simulations of MSLP fields. The reconstructed and projected wave heights, both seasonal means and maxima, are subject to a trend analysis that allows for non-linear (polynomial) trends.
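
    The Box-Cox step mentioned above can be sketched as follows, assuming scipy's boxcox for the data-adaptive transformation; the predictor is a single invented stand-in for the MSLP field, not the reanalysis data.

    import numpy as np
    from scipy import stats

    # Positive, right-skewed response driven by a single stand-in predictor.
    rng = np.random.default_rng(3)
    predictor = rng.normal(size=400)
    hs = np.exp(0.8 + 0.5 * predictor + rng.normal(scale=0.4, size=400))

    hs_bc, lam = stats.boxcox(hs)                       # data-adaptive lambda
    X = np.column_stack([np.ones(400), predictor])
    beta = np.linalg.lstsq(X, hs_bc, rcond=None)[0]
    print("Box-Cox lambda:", round(lam, 3), "coefficients:", np.round(beta, 3))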

  11. Factors affecting dry-cured ham consumer acceptability.

    Science.gov (United States)

    Morales, R; Guerrero, L; Aguiar, A P S; Guàrdia, M D; Gou, P

    2013-11-01

    The objectives of the present study were (1) to compare the relative importance of price, processing time, texture and intramuscular fat in purchase intention of dry-cured ham through conjoint analysis, (2) to evaluate the effect of dry-cured ham appearance on consumer expectations, and (3) to describe the consumer sensory preferences of dry-cured ham using external preference mapping. Texture and processing time influenced the consumer preferences in conjoint analysis. Red colour intensity, colour uniformity, external fat and white film presence/absence influenced consumer expectations. The consumer disliked hams with bitter and metallic flavour and with excessive saltiness and piquantness. Differences between expected and experienced acceptability were found, which indicates that the visual preference of consumers does not allow them to select a dry-cured ham that satisfies their sensory preferences of flavour and texture. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Steam-cured stabilised soil blocks for masonry construction

    Energy Technology Data Exchange (ETDEWEB)

    Venkatarama Reddy, B.V. [Indian Inst. of Science, Bangalore (India). Dept. of Civil Engineering; Lokras, S.S. [Indian Inst. of Science, Bangalore (India). ASTRA

    1998-12-01

    Energy-efficient, economical and durable building materials are essential for sustainable construction practices. The paper deals with production and properties of energy-efficient steam-cured stabilised soil blocks used for masonry construction. Problems of mixing expansive soil and lime, and production of blocks using soil-lime mixtures have been discussed briefly. Details of steam curing of stabilised soil blocks and properties of such blocks are given. A comparison of energy content of steam-cured soil blocks and burnt bricks is presented. It has been shown that energy-efficient steam cured soil blocks (consuming 35% less thermal energy compared to burnt clay bricks) having high compressive strength can be easily produced in a decentralised manner. (orig.)

  13. The use of logistic regression in modelling the distributions of bird ...

    African Journals Online (AJOL)

    The method of logistic regression was used to model the observed geographical distribution patterns of bird species in Swaziland in relation to a set of environmental variables. Reporting rates derived from bird atlas data are used as an index of population densities. This is justified in part by the success of the modelling ...

  14. Modeling Tetanus Neonatorum case using the regression of negative binomial and zero-inflated negative binomial

    Science.gov (United States)

    Amaliana, Luthfatul; Sa'adah, Umu; Wayan Surya Wardhani, Ni

    2017-12-01

    Tetanus Neonatorum is an infectious disease that can be prevented by immunization. The number of Tetanus Neonatorum cases in East Java Province was the highest in Indonesia up to 2015. The Tetanus Neonatorum data show overdispersion and a fairly large proportion of zeros. Negative Binomial (NB) regression is an alternative method when overdispersion occurs in Poisson regression. However, data with both overdispersion and zero-inflation are more appropriately analyzed with Zero-Inflated Negative Binomial (ZINB) regression. The purposes of this study are: (1) to model Tetanus Neonatorum cases in East Java Province, with a 71.05 percent proportion of zeros, using NB and ZINB regression, and (2) to obtain the best model. The results indicate that ZINB is better than NB regression, with a smaller AIC.
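
    A hedged sketch of the NB-versus-ZINB comparison on simulated counts, assuming the NegativeBinomial and ZeroInflatedNegativeBinomialP classes of statsmodels; it is not the Tetanus Neonatorum analysis itself, and all values are invented.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

    # Simulated overdispersed counts with extra (structural) zeros.
    rng = np.random.default_rng(4)
    n_obs = 500
    x = rng.normal(size=n_obs)
    X = sm.add_constant(x)
    mu = np.exp(0.3 + 0.6 * x)
    y = rng.negative_binomial(1.5, 1.5 / (1.5 + mu))
    y[rng.random(n_obs) < 0.5] = 0

    nb = sm.NegativeBinomial(y, X).fit(disp=False)
    zinb = ZeroInflatedNegativeBinomialP(y, X, exog_infl=np.ones((n_obs, 1))).fit(disp=False)
    print("AIC  NB:", round(nb.aic, 1), "  ZINB:", round(zinb.aic, 1))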

  15. UV/EB curing in Australia

    International Nuclear Information System (INIS)

    Woods, R.; Garnett, J.; Loo Teck Ng

    1999-01-01

    Progress in UV/EB curing in Australia is reviewed. Generally the technology is used by those industries where curing is well developed in Europe and North America; however, the scale is an order of magnitude lower due to the smaller market size. The Asian economic crisis does not appear to have affected expansion of the technology in Australia. EB continues to be successfully used in the packaging and foam fields, whilst in UV, security devices, particularly banknotes, which are steadily expanding especially in export markets, have been studied.

  16. Modeling Information Content Via Dirichlet-Multinomial Regression Analysis.

    Science.gov (United States)

    Ferrari, Alberto

    2017-01-01

    Shannon entropy is being increasingly used in biomedical research as an index of complexity and information content in sequences of symbols, e.g. languages, amino acid sequences, DNA methylation patterns and animal vocalizations. Yet, distributional properties of information entropy as a random variable have seldom been the object of study, leading to researchers mainly using linear models or simulation-based analytical approach to assess differences in information content, when entropy is measured repeatedly in different experimental conditions. Here a method to perform inference on entropy in such conditions is proposed. Building on results coming from studies in the field of Bayesian entropy estimation, a symmetric Dirichlet-multinomial regression model, able to deal efficiently with the issue of mean entropy estimation, is formulated. Through a simulation study the model is shown to outperform linear modeling in a vast range of scenarios and to have promising statistical properties. As a practical example, the method is applied to a data set coming from a real experiment on animal communication.

  17. Predicting 30-day Hospital Readmission with Publicly Available Administrative Database. A Conditional Logistic Regression Modeling Approach.

    Science.gov (United States)

    Zhu, K; Lou, Z; Zhou, J; Ballester, N; Kong, N; Parikh, P

    2015-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". Hospital readmissions raise healthcare costs and cause significant distress to providers and patients. It is, therefore, of great interest to healthcare organizations to predict which patients are at risk of being readmitted to their hospitals. However, current logistic regression based risk prediction models have limited prediction power when applied to hospital administrative data. Meanwhile, although decision trees and random forests have been applied, they tend to be too complex to be understood by hospital practitioners. The objective was to explore the use of conditional logistic regression to increase the prediction accuracy. We analyzed an HCUP statewide inpatient discharge record dataset, which includes patient demographics, clinical and care utilization data from California. We extracted records of heart failure Medicare beneficiaries who had inpatient experience during an 11-month period. We corrected the data imbalance issue with under-sampling. In our study, we first applied standard logistic regression and decision trees to obtain influential variables and derive practically meaningful decision rules. We then stratified the original data set accordingly and applied logistic regression on each data stratum. We further explored the effect of interacting variables in the logistic regression modeling. We conducted cross validation to assess the overall prediction performance of conditional logistic regression (CLR) and compared it with standard classification models. The developed CLR models outperformed several standard classification models (e.g., straightforward logistic regression, stepwise logistic regression, random forest, support vector machine). For example, the best CLR model improved the classification accuracy by nearly 20% over the straightforward logistic regression model. Furthermore, the developed CLR models tend to achieve better sensitivity of

  18. Muscle individual phospholipid classes throughout the processing of dry-cured ham: influence of pre-cure freezing.

    Science.gov (United States)

    Pérez-Palacios, Trinidad; Ruiz, Jorge; Dewettinck, Koen; Le, Thien Trung; Antequera, Teresa

    2010-03-01

    This paper aims to study the profile of phospholipid (PL) classes of Iberian ham throughout its processing and the changes undergone due to the pre-cure freezing treatment. The general profile of each PL class did not vary during the ripening stage. Phosphatidylcholine (PC) showed the highest proportion, followed by phosphatidylethanolamine (PE), with phosphatidylserine (PS) and phosphatidylinositol (PI) being the minor PL classes. The four PL classes were highly hydrolysed during the salting stage and their degradation continued during the rest of the processing. Pre-cure freezing of Iberian ham influenced the levels of the four PL classes at the initial stage, all of them being higher in refrigerated (R) than in pre-cure frozen (F) hams. Moreover, the pattern of hydrolysis was not the same in these two groups. Copyright 2009 Elsevier Ltd. All rights reserved.

  19. Curing the queue

    NARCIS (Netherlands)

    Zonderland, Maartje Elisabeth

    2012-01-01

    In this dissertation we study several problems related to the management of healthcare and the cure of disease. In each chapter a hospital capacity distribution problem is analyzed using techniques from operations research, also known as mathematical decision theory. The problems considered are

  20. Notes on power of normality tests of error terms in regression models

    International Nuclear Information System (INIS)

    Střelec, Luboš

    2015-01-01

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. Consequently, normally distributed stochastic errors are necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.

  1. Notes on power of normality tests of error terms in regression models

    Energy Technology Data Exchange (ETDEWEB)

    Střelec, Luboš [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, Brno, 61300 (Czech Republic)

    2015-03-10

    Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. Consequently, normally distributed stochastic errors are necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
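
    As a simple companion to the two records above, the sketch below applies two classical normality tests from scipy to the residuals of an ordinary least-squares fit with deliberately heavy-tailed errors; it does not implement the RT class of robust tests discussed in the contribution.

    import numpy as np
    from scipy import stats

    # OLS fit with deliberately heavy-tailed (non-normal) errors.
    rng = np.random.default_rng(5)
    x = rng.uniform(0, 1, 200)
    y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=200)

    X = np.column_stack([np.ones_like(x), x])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

    print("Shapiro-Wilk:", stats.shapiro(resid))
    print("Jarque-Bera :", stats.jarque_bera(resid))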

  2. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to handle various types of data and complex relationships. A rich variety of advanced and recent statistical modelling techniques is available, mostly in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We have previously made an interface in the form of an e-tutorial for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modeling easier to apply and easier to compare in order to find the most appropriate model for the data.

  3. The Transmuted Geometric-Weibull distribution: Properties, Characterizations and Regression Models

    Directory of Open Access Journals (Sweden)

    Zohdy M Nofal

    2017-06-01

    We propose a new lifetime model called the transmuted geometric-Weibull distribution. Some of its structural properties, including ordinary and incomplete moments, quantile and generating functions, probability weighted moments, Rényi and q-entropies and order statistics, are derived. The maximum likelihood method is used to estimate the model parameters, and its performance is assessed by means of a Monte Carlo simulation study. A new location-scale regression model is introduced based on the proposed distribution. The new distribution is applied to two real data sets to illustrate its flexibility. Empirical results indicate that the proposed distribution can be an alternative model to other lifetime models available in the literature for modeling real data in many areas.

  4. Credit Scoring Problem Based on Regression Analysis

    OpenAIRE

    Khassawneh, Bashar Suhil Jad Allah

    2014-01-01

    ABSTRACT: This thesis provides an explanatory introduction to the regression models of data mining and contains basic definitions of key terms in the linear, multiple and logistic regression models. The aim of this study is to illustrate fitting models for the credit scoring problem using simple linear, multiple linear and logistic regression models, and to analyze the fitted model functions with statistical tools. Keywords: Data mining, linear regression, logistic regression...

  5. Conditional Monte Carlo randomization tests for regression models.

    Science.gov (United States)

    Parhat, Parwen; Rosenberger, William F; Diao, Guoqing

    2014-08-15

    We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
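
    A minimal Monte Carlo randomization test for a treatment coefficient in a linear model, on simulated data; re-randomization is approximated here by permuting the treatment labels, which corresponds to complete randomization rather than the blocked or biased-coin designs discussed in the record.

    import numpy as np

    rng = np.random.default_rng(6)
    n = 120
    covariate = rng.normal(size=n)
    treatment = rng.integers(0, 2, size=n)               # complete randomization
    y = 0.5 * covariate + 0.4 * treatment + rng.normal(size=n)

    def treatment_coef(treat):
        X = np.column_stack([np.ones(n), covariate, treat])
        return np.linalg.lstsq(X, y, rcond=None)[0][2]

    observed = treatment_coef(treatment)
    null = np.array([treatment_coef(rng.permutation(treatment)) for _ in range(2000)])
    p_value = np.mean(np.abs(null) >= abs(observed))
    print("observed effect:", round(observed, 3), "randomization p-value:", round(p_value, 4))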

  6. Use of multiple linear regression and logistic regression models to investigate changes in birthweight for term singleton infants in Scotland.

    Science.gov (United States)

    Bonellie, Sandra R

    2012-10-01

    To illustrate the use of regression and logistic regression models to investigate changes over time in the size of babies, particularly in relation to social deprivation, age of the mother and smoking. Mean birthweight has been found to be increasing in many countries in recent years, but there is still a group of babies who are born with low birthweights. Population-based retrospective cohort study. Multiple linear regression and logistic regression models are used to analyse data on term singleton births from Scottish hospitals between 1994-2003. Mothers who smoke are shown to give birth to lighter babies on average, approximately 0.57 standard deviations lower (95% confidence interval 0.55-0.58) when adjusted for sex and parity. These mothers are also more likely to have babies that are of low birthweight (odds ratio 3.46, 95% confidence interval 3.30-3.63) compared with non-smokers. Low birthweight is 30% more likely where the mother lives in the most deprived areas compared with the least deprived (odds ratio 1.30, 95% confidence interval 1.21-1.40). Smoking during pregnancy is shown to have a detrimental effect on the size of infants at birth. This effect explains some, though not all, of the observed socioeconomic differences in birthweight. It also explains much of the observed birthweight differences by age of the mother. Identifying mothers at greater risk of having a low birthweight baby has important implications for the care and advice this group receives. © 2012 Blackwell Publishing Ltd.

  7. Scalable Bayesian nonparametric regression via a Plackett-Luce model for conditional ranks

    Science.gov (United States)

    Gray-Davies, Tristan; Holmes, Chris C.; Caron, François

    2018-01-01

    We present a novel Bayesian nonparametric regression model for covariates X and continuous response variable Y ∈ ℝ. The model is parametrized in terms of marginal distributions for Y and X and a regression function which tunes the stochastic ordering of the conditional distributions F (y|x). By adopting an approximate composite likelihood approach, we show that the resulting posterior inference can be decoupled for the separate components of the model. This procedure can scale to very large datasets and allows for the use of standard, existing, software from Bayesian nonparametric density estimation and Plackett-Luce ranking estimation to be applied. As an illustration, we show an application of our approach to a US Census dataset, with over 1,300,000 data points and more than 100 covariates. PMID:29623150

  8. Recursive Algorithm For Linear Regression

    Science.gov (United States)

    Varanasi, S. V.

    1988-01-01

    Order of model determined easily. Linear-regression algorithm includes recursive equations for the coefficients of the model of increased order. Algorithm eliminates duplicative calculations and facilitates the search for the minimum order of a linear-regression model that fits a set of data satisfactorily.
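
    The record's recursive coefficient equations are not reproduced here; the sketch below only illustrates the goal of searching for the minimum adequate model order, by fitting polynomials of increasing degree and keeping the lowest degree that still gives a clear improvement on held-out data. All data and thresholds are invented.

    import numpy as np

    rng = np.random.default_rng(7)
    x = np.linspace(-1, 1, 80)
    y = 1.0 - 2.0 * x + 0.5 * x**3 + rng.normal(scale=0.05, size=x.size)
    x_val = np.linspace(-1, 1, 41)
    y_val = 1.0 - 2.0 * x_val + 0.5 * x_val**3 + rng.normal(scale=0.05, size=x_val.size)

    best_order, best_err = None, np.inf
    for order in range(1, 8):
        coeffs = np.polyfit(x, y, order)
        err = np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)
        if err < 0.95 * best_err:                 # keep only clearly better orders
            best_order, best_err = order, err
    print("selected order:", best_order)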

  9. Using a Chlorophyll Meter to Evaluate the Nitrogen Leaf Content in Flue-Cured Tobacco (Nicotiana tabacum L.

    Directory of Open Access Journals (Sweden)

    Fabio Castelli

    2009-06-01

    In flue-cured tobacco, N fertilizer is commonly applied at pre-planting and is very often applied again later as a growth starter. It is generally held that the efficiency of N-fertilizer use can be improved by evaluating leaf N status after transplanting and until the flowering stage. N use efficiency in this context refers not merely to yield but also to quality, while minimizing the negative effects on the environment. To investigate these aspects, we evaluated the capacity of a Minolta SPAD-502 chlorophyll meter to estimate N status in flue-cured tobacco. The aims were to verify whether a relationship exists between SPAD readings and leaf N content, and whether a single leaf, in a well-defined stalk position, could represent the nitrogen content of the whole plant. During 1995 and 1996, a pot experiment was conducted using two flue-cured tobacco varieties. SPAD values, total chlorophyll, total N content and leaf area were measured throughout the growing season on each odd leaf stalk position. SPAD values were well correlated with both total chlorophyll and total N leaf concentration, and the regression coefficients were higher when the relationships were calculated on a leaf-area basis. For both relationships, SPAD-total chlorophyll and SPAD-total N, the best fits were obtained with quadratic equations. One leaf stalk position alone is able to monitor the N status of the whole plant during the first six weeks after transplanting, without distinction of year and variety effects. The SPAD measurement of one leaf per plant throughout the vegetative growing season is therefore a valid tool to test the N status of the crop in a period when a required N supply is still effective.

  10. An adaptive two-stage analog/regression model for probabilistic prediction of small-scale precipitation in France

    Science.gov (United States)

    Chardon, Jérémy; Hingray, Benoit; Favre, Anne-Catherine

    2018-01-01

    Statistical downscaling models (SDMs) are often used to produce local weather scenarios from large-scale atmospheric information. SDMs include transfer functions which are based on a statistical link identified from observations between local weather and a set of large-scale predictors. As physical processes driving surface weather vary in time, the most relevant predictors and the regression link are likely to vary in time too. This is well known for precipitation for instance and the link is thus often estimated after some seasonal stratification of the data. In this study, we present a two-stage analog/regression model where the regression link is estimated from atmospheric analogs of the current prediction day. Atmospheric analogs are identified from fields of geopotential heights at 1000 and 500 hPa. For the regression stage, two generalized linear models are further used to model the probability of precipitation occurrence and the distribution of non-zero precipitation amounts, respectively. The two-stage model is evaluated for the probabilistic prediction of small-scale precipitation over France. It noticeably improves the skill of the prediction for both precipitation occurrence and amount. As the analog days vary from one prediction day to another, the atmospheric predictors selected in the regression stage and the value of the corresponding regression coefficients can vary from one prediction day to another. The model allows thus for a day-to-day adaptive and tailored downscaling. It can also reveal specific predictors for peculiar and non-frequent weather configurations.
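
    A compact sketch of the two-stage idea on simulated data: select atmospheric analog days of the prediction day by Euclidean distance on a predictor field, then fit an occurrence regression on those analogs only. The generalized linear model for non-zero amounts and the real geopotential predictors are omitted; all names and dimensions are invented.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(8)
    n_days, n_grid = 3000, 20
    fields = rng.normal(size=(n_days, n_grid))            # stand-in for geopotential fields
    occurrence = (rng.random(n_days) < 1.0 / (1.0 + np.exp(-fields[:, 0]))).astype(int)

    target = rng.normal(size=n_grid)                      # field of the prediction day
    dist = np.linalg.norm(fields - target, axis=1)
    analogs = np.argsort(dist)[:200]                      # 200 closest analog days

    occ_model = LogisticRegression().fit(fields[analogs, :2], occurrence[analogs])
    prob = occ_model.predict_proba(target[:2].reshape(1, -1))[0, 1]
    print("precipitation occurrence probability:", round(prob, 3))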

  11. Curing reactions of bismaleimide resins catalyzed by triphenylphosphine. High resolution solid-state 13C NMR study

    International Nuclear Information System (INIS)

    Shibahara, Sumio; Enoki, Takashi; Yamamoto, Takahisa; Motoyoshiya, Jiro; Hayashi, Sadao.

    1996-01-01

    The curing reactions of bismaleimide resins consisting of N,N'-4,4'-diphenylmethanebismaleimide (BMI) and o,o'-diallylbisphenol-A (DABA), in the presence of triphenylphosphine (TPP) as a catalyst, were investigated. DSC measurements showed that the catalytic effect of TPP on the curing reaction of BMI was greater in the presence of DABA than in its absence. To explore this curing reaction, N-phenylmaleimide (PMI) and o-allylphenol (AP) were selected as model compounds. The products of the PMI/TPP system were oligomers and polymers of PMI, whereas the main product of the PMI/AP/TPP system was the PMI trimer, which has the five-membered ring formed via the phosphonium ylide intermediate. In these model reactions, 13C NMR was found to be useful for distinguishing between trimerization and polymerization of PMI. On the basis of the results of the model reactions, the curing reactions of bismaleimide resins were investigated by high-resolution solid-state 13C NMR techniques. In the BMI/TPP system, maleimides polymerize above 175°C, but the polymerization does not proceed at 120°C. On the other hand, maleimides trimerize above 120°C in the presence of DABA and TPP. The mechanism of the trimerization is briefly discussed. (author)

  12. Logistic regression for dichotomized counts.

    Science.gov (United States)

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
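
    A simplified two-part sketch on simulated data: a logistic model for whether the count is positive plus an ordinary Poisson model on the positive counts. This illustrates the hurdle idea but not the shared-parameter formulation or the truncated count distribution used in the record.

    import numpy as np
    import statsmodels.api as sm

    # Simulated counts with extra zeros.
    rng = np.random.default_rng(9)
    n = 800
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    y = rng.poisson(np.exp(0.2 + 0.5 * x))
    y[rng.random(n) < 0.4] = 0

    logit_part = sm.Logit((y > 0).astype(int), X).fit(disp=False)    # any vs none
    positive = y > 0
    count_part = sm.GLM(y[positive], X[positive], family=sm.families.Poisson()).fit()
    print("logistic coefficients:", np.round(logit_part.params, 3))
    print("count coefficients   :", np.round(count_part.params, 3))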

  13. Logistic regression models for polymorphic and antagonistic pleiotropic gene action on human aging and longevity

    DEFF Research Database (Denmark)

    Tan, Qihua; Bathum, L; Christiansen, L

    2003-01-01

    In this paper, we apply logistic regression models to measure genetic association with human survival for highly polymorphic and pleiotropic genes. By modelling genotype frequency as a function of age, we introduce a logistic regression model with polytomous responses to handle the polymorphic...... situation. Genotype and allele-based parameterization can be used to investigate the modes of gene action and to reduce the number of parameters, so that the power is increased while the amount of multiple testing minimized. A binomial logistic regression model with fractional polynomials is used to capture...... the age-dependent or antagonistic pleiotropic effects. The models are applied to HFE genotype data to assess the effects on human longevity by different alleles and to detect if an age-dependent effect exists. Application has shown that these methods can serve as useful tools in searching for important...

  14. Effect of curing time on microstructure and mechanical strength ...

    Indian Academy of Sciences (India)

    The aim of this paper is to study the influence of curing time on the microstructure and mechanical strength development of alkali activated binders based on vitreous calcium aluminosilicate (VCAS). Mechanical strength of alkali activated mortars cured at 65 °C was assessed for different curing times (4–168 h) using 10 ...

  15. Curing potential of experimental resin composites with systematically varying amount of bioactive glass: Degree of conversion, light transmittance and depth of cure.

    Science.gov (United States)

    Par, Matej; Spanovic, Nika; Bjelovucic, Ruza; Skenderovic, Hrvoje; Gamulin, Ozren; Tarle, Zrinka

    2018-06-17

    The aim of this work was to investigate the curing potential of an experimental resin composite series with the systematically varying amount of bioactive glass 45S5 by evaluating the degree of conversion, light transmittance and depth of cure. Resin composites based on a Bis-GMA/TEGDMA resin with a total filler load of 70 wt% and a variable amount of bioactive glass (0-40 wt%) were prepared. The photoinitiator system was camphorquinone and ethyl-4-(dimethylamino) benzoate. The degree of conversion and light transmittance were measured by Raman spectroscopy and UV-vis spectroscopy, respectively. The depth of cure was evaluated according to the classical ISO 4049 test. The initial introduction of bioactive glass into the experimental series diminished the light transmittance while the further increase in the bioactive glass amount up to 40 wt% caused minor variations with no clear trend. The curing potential of the experimental composites was similar to or better than that of commercial resin composites. However, unsilanized bioactive glass fillers demonstrated the tendency to diminish both the maximum attainable conversion and the curing efficiency at depth. Experimental composite materials containing bioactive glass showed a clinically acceptable degree of conversion and depth of cure. The degree of conversion and depth of cure were diminished by bioactive glass fillers in a dose-dependent manner, although light transmittance was similar among all of the experimental composites containing 5-40 wt% of bioactive glass. Reduced curing potential caused by the bioactive glass has possible consequences on mechanical properties and biocompatibility. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Modeling daily soil temperature over diverse climate conditions in Iran—a comparison of multiple linear regression and support vector regression techniques

    Science.gov (United States)

    Delbari, Masoomeh; Sharifazari, Salman; Mohammadi, Ehsan

    2018-02-01

    The knowledge of soil temperature at different depths is important for the agricultural industry and for understanding climate change. The aim of this study is to evaluate the performance of a support vector regression (SVR)-based model in estimating daily soil temperature at 10, 30 and 100 cm depth under different climate conditions over Iran. The obtained results were compared to those obtained from a more classical multiple linear regression (MLR) model. The correlation sensitivity of the input combinations and the periodicity effect were also investigated. Climatic data used as inputs to the models were minimum and maximum air temperature, solar radiation, relative humidity, dew point, and atmospheric pressure (reduced to sea level), collected from five synoptic stations, Kerman, Ahvaz, Tabriz, Saghez, and Rasht, located respectively in hyper-arid, arid, semi-arid, Mediterranean, and hyper-humid climate conditions. According to the results, the performance of both MLR and SVR models was quite good at the surface layer, i.e., 10-cm depth. However, SVR performed better than MLR in estimating soil temperature at deeper layers, especially 100 cm depth. Moreover, both models performed better in humid climate conditions than in arid and hyper-arid areas. Further, adding a periodicity component into the modeling process considerably improved the models' performance, especially in the case of SVR.
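
    A small sketch comparing MLR and SVR on simulated meteorological inputs, assuming scikit-learn's LinearRegression and SVR; the variables and their relationship are invented and do not reflect the Iranian station data.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.svm import SVR
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    # Invented inputs standing in for tmin, tmax, radiation, humidity and pressure.
    rng = np.random.default_rng(10)
    X = rng.normal(size=(1000, 5))
    y = 10 + 3 * X[:, 0] + 2 * np.sin(X[:, 2]) + rng.normal(scale=1.0, size=1000)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    models = {"MLR": LinearRegression(),
              "SVR": make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))}
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
        print(name, "RMSE:", round(rmse, 3))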

  17. Fitting response plateau models via isotonic regression

    Directory of Open Access Journals (Sweden)

    Renata Pires Gonçalves

    2012-02-01

    Experiments of the dosage-response type are very common in the determination of levels of nutrients for optimal food balance, and they include the use of regression models to achieve this objective. Nevertheless, routine regression analysis generally uses a priori information about a possible relationship involving the response variable. Isotonic regression is a least-squares estimation method that generates estimates which preserve the ordering of the data. In the theory of isotonic regression this ordering information is essential, and it is expected to increase fitting efficiency. The objective of this work was to use an isotonic regression methodology as an alternative way of analyzing data on Zn deposition in the tibia of male birds of the Hubbard lineage. We considered plateau response models of quadratic polynomial and linear exponential forms. In addition to these models, we also proposed fitting a logarithmic model to the data, and the efficiency of the methodology was evaluated by Monte Carlo simulations considering different scenarios for the parametric values. The isotonization of the data yielded an improvement in all the fitting quality parameters evaluated. Among the models used, the logarithmic model presented estimates of the parameters most consistent with the values reported in the literature.
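
    A minimal example of an order-preserving (isotonic) fit to a plateau-shaped dose-response curve, assuming scikit-learn's IsotonicRegression; the dose levels and responses are invented.

    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    dose = np.array([0, 5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
    response = np.array([1.1, 2.3, 3.4, 4.2, 4.0, 4.5, 4.4, 4.6, 4.5])  # noisy plateau

    fitted = IsotonicRegression(increasing=True).fit_transform(dose, response)
    print(np.round(fitted, 2))        # non-decreasing estimates preserving the data order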

  18. Hard facts for radiation curing of elastomers

    International Nuclear Information System (INIS)

    Lyall, D.J.

    1984-01-01

    The subject is covered under the headings: introduction; outline of chemistry (differences between conventional and radiation curing); compounding; green strength; response of rubbers to electron beam treatment; electron beam cured applications:(a) wire and cable applications;(b) rubber tyre components;(c) heat shrinkable materials;(d) roofing materials. (U.K.)

  19. Adjusting for overdispersion in piecewise exponential regression models to estimate excess mortality rate in population-based research.

    Science.gov (United States)

    Luque-Fernandez, Miguel Angel; Belot, Aurélien; Quaresma, Manuela; Maringe, Camille; Coleman, Michel P; Rachet, Bernard

    2016-10-01

    In population-based cancer research, piecewise exponential regression models are used to derive adjusted estimates of excess mortality due to cancer using the Poisson generalized linear modelling framework. However, the assumption that the conditional mean and variance of the rate parameter given the set of covariates x_i are equal is strong and may fail to account for overdispersion arising from the variability of the rate parameter (the variance exceeds the mean). Using an empirical example, we aimed to describe simple methods to test and correct for overdispersion. We used a regression-based score test for overdispersion under the relative survival framework and proposed different approaches to correct for overdispersion, including quasi-likelihood, robust standard error estimation, negative binomial regression and flexible piecewise modelling. All piecewise exponential regression models showed the presence of significant inherent overdispersion. Flexible piecewise regression modelling, with either a quasi-likelihood or robust standard errors, was the best approach, as it deals with both overdispersion due to model misspecification and true or inherent overdispersion.
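
    A sketch in the spirit of a regression-based overdispersion check on simulated counts: fit a Poisson GLM and regress the Cameron-Trivedi auxiliary variable on the fitted means. This is a generic illustration, not the relative-survival score test of the record.

    import numpy as np
    import statsmodels.api as sm

    # Simulated overdispersed counts.
    rng = np.random.default_rng(11)
    n = 1000
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    mu_true = np.exp(0.5 + 0.4 * x)
    y = rng.negative_binomial(2.0, 2.0 / (2.0 + mu_true))

    poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    mu = poisson_fit.mu
    aux = ((y - mu) ** 2 - y) / mu               # Cameron-Trivedi auxiliary variable
    aux_fit = sm.OLS(aux, mu).fit()              # regression through the origin on mu
    print("overdispersion coefficient:", round(aux_fit.params[0], 3),
          " t-statistic:", round(aux_fit.tvalues[0], 2))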

  20. Endogenous glucose production from infancy to adulthood: a non-linear regression model

    NARCIS (Netherlands)

    Huidekoper, Hidde H.; Ackermans, Mariëtte T.; Ruiter, An F. C.; Sauerwein, Hans P.; Wijburg, Frits A.

    2014-01-01

    To construct a regression model for endogenous glucose production (EGP) as a function of age, and compare this with glucose supplementation using commonly used dextrose-based saline solutions at fluid maintenance rate in children. A model was constructed based on EGP data, as quantified by

  1. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    Science.gov (United States)

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but also handle under- or over-dispersed data sets as well. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.

  2. Electron Beam Curing of Polymer Matrix Composites - CRADA Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Janke, C. J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Howell, Dave [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Norris, Robert E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    1997-05-01

    The major cost driver in manufacturing polymer matrix composite (PMC) parts and structures, and one of the elements having the greatest effect on their quality and performance, is the standard thermal cure process. Thermal curing of PMCs requires long cure times and high energy consumption, creates residual thermal stresses in the part, produces volatile toxic by-products, and requires expensive tooling that is tolerant of the high cure temperatures.

  3. A perspective on "cure" for Rett syndrome.

    Science.gov (United States)

    Clarke, Angus John; Abdala Sheikh, Ana Paula

    2018-04-02

    The reversal of the Rett syndrome disease process in the Mecp2 mouse model of Guy et al. (2007) has motivated families and researchers to work on this condition. The reversibility in adult mice suggests that there is potentially much to be gained from rational treatments applied to patients of any age. However, it may be difficult to strike the right balance between enthusiasm on the one hand and realism on the other. One effect of this has been a fragmentation of the "Rett syndrome community" with some groups giving priority to work aimed at a cure while fewer resources are devoted to medical or therapy-based interventions to enhance the quality of life of affected patients or provide support for their families.Several possible therapeutic approaches are under development that, it is claimed and hoped, may lead to a "cure" for patients with Rett syndrome. While all have a rationale, there are potential obstacles to each being both safe and effective. Furthermore, any strategy that succeeded in restoring normal levels of MECP2 gene expression throughout the brain carries potential pitfalls, so that it will be of crucial importance to introduce any clinical trials of such therapies with great care.Expectations of families for a radical, rational treatment should not be inflated beyond a cautious optimism. This is particularly because affected patients with us now may not be able to reap the full benefits of a "cure". Thus, interventions aimed at enhancing the quality of life of affected patients should not be forgone and their importance should not be minimised.

  4. Geographically weighted negative binomial regression applied to zonal level safety performance models.

    Science.gov (United States)

    Gomes, Marcos José Timbó Lima; Cunto, Flávio; da Silva, Alan Ricardo

    2017-09-01

    Generalized Linear Models (GLM) with negative binomial distribution for errors, have been widely used to estimate safety at the level of transportation planning. The limited ability of this technique to take spatial effects into account can be overcome through the use of local models from spatial regression techniques, such as Geographically Weighted Poisson Regression (GWPR). Although GWPR is a system that deals with spatial dependency and heterogeneity and has already been used in some road safety studies at the planning level, it fails to account for the possible overdispersion that can be found in the observations on road-traffic crashes. Two approaches were adopted for the Geographically Weighted Negative Binomial Regression (GWNBR) model to allow discrete data to be modeled in a non-stationary form and to take note of the overdispersion of the data: the first examines the constant overdispersion for all the traffic zones and the second includes the variable for each spatial unit. This research conducts a comparative analysis between non-spatial global crash prediction models and spatial local GWPR and GWNBR at the level of traffic zones in Fortaleza/Brazil. A geographic database of 126 traffic zones was compiled from the available data on exposure, network characteristics, socioeconomic factors and land use. The models were calibrated by using the frequency of injury crashes as a dependent variable and the results showed that GWPR and GWNBR achieved a better performance than GLM for the average residuals and likelihood as well as reducing the spatial autocorrelation of the residuals, and the GWNBR model was more able to capture the spatial heterogeneity of the crash frequency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. UV curing by radical, cationic and concurrent radical/cationic polymerization

    International Nuclear Information System (INIS)

    Pappas, S.P.

    1984-01-01

    UV and EB curing represent complementary technologies with respective advantages and disadvantages. This paper deals with the design and evaluation of UV curable coatings to optimize cure rate and film properties. Topics included are state-of-the-art photoinitiator systems, light intensity effects, retardation of air-inhibition, adhesion, and amplification of photons for enhanced speed of cure

  6. Linear regression

    CERN Document Server

    Olive, David J

    2017-01-01

    This text covers both multiple linear regression and some experimental design models. The text uses the response plot to visualize the model and to detect outliers, does not assume that the error distribution has a known parametric distribution, develops prediction intervals that work when the error distribution is unknown, suggests bootstrap hypothesis tests that may be useful for inference after variable selection, and develops prediction regions and large sample theory for the multivariate linear regression model that has m response variables. A relationship between multivariate prediction regions and confidence regions provides a simple way to bootstrap confidence regions. These confidence regions often provide a practical method for testing hypotheses. There is also a chapter on generalized linear models and generalized additive models. There are many R functions to produce response and residual plots, to simulate prediction intervals and hypothesis tests, to detect outliers, and to choose response trans...

  7. Regression-based model of skin diffuse reflectance for skin color analysis

    Science.gov (United States)

    Tsumura, Norimichi; Kawazoe, Daisuke; Nakaguchi, Toshiya; Ojima, Nobutoshi; Miyake, Yoichi

    2008-11-01

    A simple regression-based model of skin diffuse reflectance is developed based on reflectance samples calculated by Monte Carlo simulation of light transport in a two-layered skin model. This reflectance model includes the values of spectral reflectance in the visible spectra for Japanese women. The modified Lambert Beer law holds in the proposed model with a modified mean free path length in non-linear density space. The averaged RMS and maximum errors of the proposed model were 1.1 and 3.1%, respectively, in the above range.

  8. Development of an empirical model of turbine efficiency using the Taylor expansion and regression analysis

    International Nuclear Information System (INIS)

    Fang, Xiande; Xu, Yu

    2011-01-01

    The empirical model of turbine efficiency is necessary for the control- and/or diagnosis-oriented simulation and useful for the simulation and analysis of dynamic performances of the turbine equipment and systems, such as air cycle refrigeration systems, power plants, turbine engines, and turbochargers. Existing empirical models of turbine efficiency are insufficient because there is no suitable form available for air cycle refrigeration turbines. This work performs a critical review of empirical models (called mean value models in some literature) of turbine efficiency and develops an empirical model in the desired form for air cycle refrigeration, the dominant cooling approach in aircraft environmental control systems. The Taylor series and regression analysis are used to build the model, with the Taylor series being used to expand functions with the polytropic exponent and the regression analysis to finalize the model. The measured data of a turbocharger turbine and two air cycle refrigeration turbines are used for the regression analysis. The proposed model is compact and able to present the turbine efficiency map. Its predictions agree with the measured data very well, with the corrected coefficient of determination R_c^2 ≥ 0.96 and the mean absolute percentage deviation = 1.19% for the three turbines. -- Highlights: → Performed a critical review of empirical models of turbine efficiency. → Developed an empirical model in the desired form for air cycle refrigeration, using the Taylor expansion and regression analysis. → Verified the method for developing the empirical model. → Verified the model.

  9. Study of oxygen inhibition effect on radiation curing

    International Nuclear Information System (INIS)

    Xiao Bin; Yang Xuemei; Zhao Pengji; Zeng Shuqing; Jiang Bo; Zhou Yong; Huang Wei; Zhou Youyi

    1995-01-01

    A Michael addition reaction product was used in research on the oxygen inhibition effect in radiation curing. The experimental results were assessed by gel content and the percentage of reacted double bonds. It was shown that 9% of the Michael addition product could increase the radiation curing rate by a factor of 1.2 at an EB dose of 30 kGy. This kind of formulation can clearly withstand the oxygen inhibition effect, and it therefore provides a foundation for the application of radiation curing under atmospheric conditions.

  10. Assessing the irradiance delivered from light-curing units in private dental offices in Jordan.

    Science.gov (United States)

    Maghaireh, Ghada A; Alzraikat, Hanan; Taha, Nessrin A

    2013-08-01

    The authors conducted a study to examine the irradiance from light-curing units (LCUs) used in dental offices in Jordan. Two of the authors visited 295 private dental offices (15 percent) in Jordan and collected the following information about the LCUs: age, type (quartz-tungsten-halogen or light-emitting diode), date of last maintenance, type of maintenance, last date of use, number of times used during the day, availability of a radiometer, exposure time for each resin-based composite increment, size of light-curing tips and presence of resin-based composite on the tips. The authors used a radiometer to measure the irradiance from the LCUs. They used linear regression with stepwise correlation for the statistical analysis. The authors set the minimum acceptable irradiance at 300 milliwatts/square centimeter. The mean irradiance of the 295 LCUs examined was 361 mW/cm(2), and 136 LCUs (46.1 percent) delivered an irradiance of less than 300 mW/cm(2). The unit's age, type and presence of resin-based composite on the light-curing tips had a significant effect on the irradiance (P ≤ .001). Only 37 of the 141 quartz-tungsten-halogen units (26.2 percent) and 122 of the 154 light-emitting diode units (79.2 percent) delivered at least 300 mW/cm(2). Resin contamination on the light-curing tips had a significant effect on the irradiance delivered. The irradiance from the LCUs decreased with use. Practical Implications. The irradiance from many of the units in this study was less than 300 mW/cm(2), which may affect the quality of resin-based composite restorations. Dentists should monitor the performance of the LCUs in their offices weekly.

  11. Curing and caring competences in the skills training of physiotherapy students.

    Science.gov (United States)

    Dahl-Michelsen, Tone

    2015-01-01

    This article explores the significance of curing and caring competences in physiotherapy education, as well as how curing and caring competences intersect within the professional training of physiotherapy students. The empirical data include participant observations and interviews with students attending skills training in the first year of a bachelor's degree program in Norway. Curing and caring are conceptualized as gender-coded competences. That is, curing and caring are viewed as historical and cultural constructions of masculinities and femininities within the physiotherapy profession, as well as performative actions. The findings illuminate the complexity of curing and caring competences in the skills training of physiotherapy students. Curing and caring are both binary and intertwined competences; however, whereas binary competences are mostly concerned with contextual frames, intertwined competences are mostly concerned with performative aspects. The findings also point to how female and male students attend to curing and caring competences in similar ways; thus, the possibilities of transcending traditional gender norms turn out to be significant in this context. The findings suggest that, although curing somehow remains hegemonic to caring, the future generation of physiotherapists seemingly will be able to use their skills for both caring and curing.

  12. Linear regression in astronomy. II

    Science.gov (United States)

    Feigelson, Eric D.; Babu, Gutti J.

    1992-01-01

    A wide variety of least-squares linear regression procedures used in observational astronomy, particularly investigations of the cosmic distance scale, are presented and discussed. The classes of linear models considered are (1) unweighted regression lines, with bootstrap and jackknife resampling; (2) regression solutions when measurement error, in one or both variables, dominates the scatter; (3) methods to apply a calibration line to new data; (4) truncated regression models, which apply to flux-limited data sets; and (5) censored regression models, which apply when nondetections are present. For the calibration problem we develop two new procedures: a formula for the intercept offset between two parallel data sets, which propagates slope errors from one regression to the other; and a generalization of the Working-Hotelling confidence bands to nonstandard least-squares lines. They can provide improved error analysis for Faber-Jackson, Tully-Fisher, and similar cosmic distance scale relations.

  13. Bayesian Bandwidth Selection for a Nonparametric Regression Model with Mixed Types of Regressors

    Directory of Open Access Journals (Sweden)

    Xibin Zhang

    2016-04-01

    This paper develops a sampling algorithm for bandwidth estimation in a nonparametric regression model with continuous and discrete regressors under an unknown error density. The error density is approximated by the kernel density estimator of the unobserved errors, while the regression function is estimated using the Nadaraya-Watson estimator admitting continuous and discrete regressors. We derive an approximate likelihood and posterior for the bandwidth parameters, followed by a sampling algorithm. Simulation results show that the proposed approach typically leads to better accuracy of the resulting estimates than cross-validation, particularly for smaller sample sizes. This bandwidth estimation approach is applied to a nonparametric regression model of the Australian All Ordinaries returns and to kernel density estimation of gross domestic product (GDP) growth rates among Organisation for Economic Co-operation and Development (OECD) and non-OECD countries.
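
    A bare-bones sketch of a Nadaraya-Watson estimator admitting one continuous and one discrete regressor is given below; it uses a Gaussian kernel for the continuous variable and an Aitchison-Aitken-style kernel for the discrete one, with bandwidths fixed by hand rather than estimated by the sampling algorithm developed in the paper. The data and bandwidth values are assumptions for illustration only.

    ```python
    # Nadaraya-Watson regression with one continuous and one discrete regressor;
    # bandwidths h and lam are fixed by hand here, purely for illustration.
    import numpy as np

    def nw_predict(x_cont, x_disc, y, x0_cont, x0_disc, h, lam, n_levels):
        # Gaussian kernel for the continuous regressor.
        k_cont = np.exp(-0.5 * ((x_cont - x0_cont) / h) ** 2)
        # Aitchison-Aitken-style kernel for the discrete regressor:
        # weight (1 - lam) when the levels match, lam / (n_levels - 1) otherwise.
        k_disc = np.where(x_disc == x0_disc, 1.0 - lam, lam / (n_levels - 1))
        w = k_cont * k_disc
        return np.sum(w * y) / np.sum(w)

    rng = np.random.default_rng(1)
    x_cont = rng.uniform(-2.0, 2.0, size=200)
    x_disc = rng.integers(0, 3, size=200)  # discrete regressor with 3 levels
    y = np.sin(x_cont) + 0.3 * x_disc + rng.normal(scale=0.2, size=200)

    print(nw_predict(x_cont, x_disc, y, x0_cont=0.5, x0_disc=1, h=0.3, lam=0.1, n_levels=3))
    ```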

  14. Multiresponse semiparametric regression for modelling the effect of regional socio-economic variables on the use of information technology

    Science.gov (United States)

    Wibowo, Wahyu; Wene, Chatrien; Budiantara, I. Nyoman; Permatasari, Erma Oktania

    2017-03-01

    Multiresponse semiparametric regression is a simultaneous-equation regression model that fuses parametric and nonparametric components: the model comprises several equations, and each equation has a parametric and a nonparametric component. The model used here takes a linear function as the parametric component and a truncated polynomial spline as the nonparametric component, so it can handle both linear and nonlinear relationships between the responses and the set of predictor variables. The aim of this paper is to demonstrate the application of this regression model to modelling the effect of regional socio-economic variables on the use of information technology. Specifically, the response variables are the percentage of households with internet access and the percentage of households with a personal computer, and the predictor variables are the percentage of literate people, the percentage of electrification, and the percentage of economic growth. Based on the identified relationships between the responses and the predictors, economic growth is treated as the nonparametric predictor and the others as parametric predictors. The results show that the multiresponse semiparametric regression model fits well, as indicated by a high coefficient of determination of 90 percent.
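
    A compact sketch of this kind of fit for a single response is shown below: linear parametric terms plus a truncated linear spline basis in the nonparametric predictor, estimated by least squares. The data, knot locations and variable roles are hypothetical stand-ins for the socio-economic variables named above.

    ```python
    # Semiparametric regression sketch: linear parametric terms plus a
    # truncated linear spline in one nonparametric predictor; synthetic data.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 300
    x1 = rng.uniform(0.0, 1.0, size=n)  # parametric predictor (e.g., literacy rate)
    x2 = rng.uniform(0.0, 1.0, size=n)  # parametric predictor (e.g., electrification)
    z = rng.uniform(0.0, 1.0, size=n)   # nonparametric predictor (e.g., economic growth)
    y = (1.0 + 2.0 * x1 + 0.5 * x2
         + np.where(z > 0.6, 4.0 * (z - 0.6), 0.0)
         + rng.normal(scale=0.1, size=n))

    knots = [0.25, 0.5, 0.75]
    # Design matrix: intercept, linear terms, z, and truncated-line terms (z - k)_+.
    X = np.column_stack([np.ones(n), x1, x2, z] + [np.maximum(z - k, 0.0) for k in knots])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    fitted = X @ beta
    r2 = 1.0 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
    print("coefficient of determination:", round(r2, 3))
    ```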

  15. A hydrologic regression sediment-yield model for two ungaged watershed outlet stations in Africa

    International Nuclear Information System (INIS)

    Moussa, O.M.; Smith, S.E.; Shrestha, R.L.

    1991-01-01

    A hydrologic regression sediment-yield model was established to determine the relationship between water discharge and suspended sediment discharge at the Blue Nile and the Atbara River outlet stations during the flood season. The model consisted of two main submodels: (1) a suspended sediment discharge model, which was used to determine suspended sediment discharge for each basin outlet; and (2) a sediment rating model, which related water discharge and suspended sediment discharge for each outlet station. Due to the absence of suspended sediment concentration measurements at or near the outlet stations, a minimum norm solution, which is based on the minimization of the unknowns rather than the residuals, was used to determine the suspended sediment discharges at the stations. In addition, the sediment rating submodel was regressed by using an observation equations procedure. Verification analyses on the model were carried out and the mean percentage errors were found to be +12.59 and -12.39, respectively, for the Blue Nile and Atbara. The hydrologic regression model was found to be most sensitive to the relative weight matrix, moderately sensitive to the mean water discharge ratio, and slightly sensitive to the concentration variation along the River Nile's course.
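
    For illustration only, a common form of sediment rating relation (not the authors' minimum-norm or observation-equations formulation) expresses suspended sediment discharge as a power law of water discharge, Qs = a·Qw^b, and fits it by linear regression in log space; the discharge values below are synthetic.

    ```python
    # Illustrative sediment rating curve Qs = a * Qw**b fitted in log space;
    # synthetic discharges, not data from the Blue Nile or Atbara stations.
    import numpy as np

    rng = np.random.default_rng(3)
    qw = rng.uniform(500.0, 8000.0, size=60)  # water discharge
    qs = 0.002 * qw ** 1.4 * np.exp(rng.normal(scale=0.2, size=60))  # sediment discharge

    b, log_a = np.polyfit(np.log(qw), np.log(qs), 1)
    a = np.exp(log_a)
    print(f"rating curve: Qs = {a:.4f} * Qw^{b:.2f}")
    ```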

  16. Arnica Tincture Cures Cutaneous Leishmaniasis in Golden Hamsters

    Directory of Open Access Journals (Sweden)

    Sara M. Robledo

    2018-01-01

    In search of potential therapeutic alternatives to existing treatments for cutaneous leishmaniasis, we have investigated the effect of Arnica tincture Ph. Eur. (a 70% hydroethanolic tincture prepared from flowerheads of Arnica montana L.) on the lesions caused by infection with Leishmania braziliensis in a golden hamster model. The animals were treated topically with a single daily dose of the preparation for 28 days. Subsequently, the healing process was monitored by recording the lesion size at intervals of 15 days up to day 90. Arnica tincture fully cured three of the five hamsters, while one animal showed an improvement and another suffered a relapse. This result was slightly better than that obtained with the positive control, meglumine antimonate, which cured two of five hamsters while the other three showed a relapse after 90 days. These results encourage further investigation of the potential of Arnica tincture in the treatment of cutaneous leishmaniasis.

  17. Understanding poisson regression.

    Science.gov (United States)

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods for analyzing count data have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is useful for analyzing count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
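
    The sketch below shows a Poisson regression fitted to synthetic count data, together with the negative binomial alternative mentioned above for overdispersed counts; the data are simulated and are not from the ENSPIRE study.

    ```python
    # Poisson regression and a negative binomial alternative for overdispersion;
    # synthetic count data, illustrative only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 500
    x = rng.normal(size=n)
    y = rng.poisson(np.exp(0.3 + 0.8 * x))  # counts with a log-linear mean

    X = sm.add_constant(x)
    poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print(poisson_fit.summary())

    # If the counts are overdispersed (variance exceeds the mean), a negative
    # binomial model relaxes the Poisson equal-mean-variance assumption.
    negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
    print(negbin_fit.params)
    ```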

  18. Effect of curing methods on the compressive strength of concrete

    African Journals Online (AJOL)

    High curing temperature (up to 212°F or 100°C) ... are affected by curing and application of the ... for concrete production, it is important to ... Concrete properties and durability are signif- ... Curing compounds are merely temporary coatings on.

  19. 7 CFR 30.36 - Class 1; flue-cured types and groups.

    Science.gov (United States)

    2010-01-01

    ...-cured, produced principally in the Piedmont sections of Virginia and North Carolina. (b) Type 11b. That... lying between the Piedmont and coastal plains regions of Virginia and North Carolina. (c) Type 12. That type of flue-cured tobacco commonly known as Eastern Flue-cured or Eastern Carolina Flue-cured...

  20. An adaptive two-stage analog/regression model for probabilistic prediction of small-scale precipitation in France

    Directory of Open Access Journals (Sweden)

    J. Chardon

    2018-01-01

    Statistical downscaling models (SDMs) are often used to produce local weather scenarios from large-scale atmospheric information. SDMs include transfer functions that are based on a statistical link, identified from observations, between local weather and a set of large-scale predictors. Because the physical processes driving surface weather vary in time, the most relevant predictors and the regression link are likely to vary in time too. This is well known for precipitation, for instance, and the link is thus often estimated after some seasonal stratification of the data. In this study, we present a two-stage analog/regression model in which the regression link is estimated from atmospheric analogs of the current prediction day. Atmospheric analogs are identified from fields of geopotential heights at 1000 and 500 hPa. For the regression stage, two generalized linear models are used to model the probability of precipitation occurrence and the distribution of non-zero precipitation amounts, respectively. The two-stage model is evaluated for the probabilistic prediction of small-scale precipitation over France and noticeably improves the skill of the prediction for both precipitation occurrence and amount. Because the analog days vary from one prediction day to another, the atmospheric predictors selected in the regression stage and the values of the corresponding regression coefficients can also vary from one prediction day to another. The model thus allows for day-to-day adaptive and tailored downscaling, and it can also reveal specific predictors for peculiar and infrequent weather configurations.
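
    A compact sketch of the regression stage described above is given below: a logistic GLM for the probability of precipitation occurrence and a gamma GLM with a log link for the non-zero amounts. The analog-selection step and the real predictors (geopotential height fields) are omitted; the single synthetic predictor and all coefficients are assumptions for illustration.

    ```python
    # Two-part regression stage for precipitation: logistic GLM for occurrence,
    # gamma GLM (log link) for positive amounts; synthetic data only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 1000
    z = rng.normal(size=n)                       # stand-in large-scale predictor
    p_occ = 1.0 / (1.0 + np.exp(0.5 - 1.2 * z))  # true occurrence probability
    occurred = rng.binomial(1, p_occ)
    amount = np.where(occurred == 1,
                      rng.gamma(shape=2.0, scale=np.exp(0.5 + 0.4 * z) / 2.0),
                      0.0)

    X = sm.add_constant(z)

    # Stage 1: probability of precipitation occurrence.
    occ_fit = sm.GLM(occurred, X, family=sm.families.Binomial()).fit()

    # Stage 2: distribution of non-zero precipitation amounts.
    wet = occurred == 1
    amt_fit = sm.GLM(amount[wet], X[wet],
                     family=sm.families.Gamma(link=sm.families.links.Log())).fit()

    print("occurrence coefficients:", occ_fit.params)
    print("amount coefficients:    ", amt_fit.params)
    ```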